I've never been involved in a physically abusive relationship, so my experience with this is quite limited. I (a woman), however, used to be a very angry and destructive teenager, and I remember hitting one of my friends (a man) at random times when he would annoy me or whenever I felt like it. I can't recall if I ever really hurt him, but I don't believe he ever thought to hit back. I can imagine the thoughts going through his head were along the lines of, "If I hit a girl, then I'm an asshole, I'm disgusting, and people would think horribly of me." Yet here I was, punching him in the stomach and trying to knee him in the groin whenever I felt like it. The idea that he might not have fought back because society/parents/friends told him he should never hit a woman disturbs me a little. I took advantage of someone without even realizing it: his personality, his weaknesses, and his gender.
I realize my mistakes now that I am older and not so angry or destructive, and labeling this thread "Male violence towards women is disgusting" strikes me as very gender-biased; it makes me cringe and shake my head. For me, it goes the other way as well: female violence towards men is disgusting. Why should either gender stoop to the level of someone who would hit another person "just because they can," "just to exude their power," or "just because the other person is weaker"?
__________________
"You must be the change you wish to see in the world." - Gandhi