America may very well be evolving in its sexual ethics. As Christianity wanes, the stigma attached to sex tends to fall away, opening those without dogmatic ties to a larger understanding of what human sexuality can be. Consider how the average American now views marriage, as reflected in the number of divorces we instigate.
As the concept of God is removed from the reality of copulation, humankind will be freed to express the joys of sex unconfined by false morals, and to enjoy each other without senseless guilt. Granted, the physical dangers of disease and violence will remain, but hopefully we will deal with those in turn.
Hollywood tends to represent the direction society is heading in a macro sense. Maybe we'll all get laid more often — that might be the best thing to come out of Hollywood in decades.