Personally I think it's a good thing. Like other posters said, you get desensitized a bit, but that doesn't diminish your appetite for real women. If anything, seeing the extremes makes you a little more comfortable with the small imperfections in real women.
Plus it evens out the market a bit.
Women wanted equality? They got it. Men respond to this new culture (as analog said) by taking better care of themselves (my dad often scoffs at me for caring about my hair and skin), and women get to be more sexually aggressive. I'd say it's a win-win situation.
