Maybe it's me, and although I do agree that the ability to be open with your emotions with the one you love is important, does anyone else think that guys are being ''feminized''? I just see that over the years, men are expected less and less to be men, which moves us toward a gender-neutral society. Am I out of whack in thinking that men and women have certain roles, and that there's nothing wrong with a man being strong and silent and the woman being the emotional part of the equation?
__________________
If you've ever felt there was a reason to be afraid of the dark, you were right.