I was reading this thread regarding Dove's "Real Women" campaign, and it reminded me of a rather disturbing trend I've been noticing lately.
Now, to kick things off, I'm going to get all my disclaimers out of the way first - I'm not racist or anything of the sort. I realize this may turn into a controversial thread, since it does deal with race indirectly, but please keep it civil, folks. I'm trying to look at the bigger picture from a societal point of view.
Also, I just threw these together in Photoshop real quick. A picture here or there may be misleading because of the lighting/flash or whatever, but by and large I think this holds true.
So, without further ado...
[image comparisons]
And it doesn't just apply to women... (I'm gonna leave Michael Jackson out of this, for obvious reasons.)
[more image comparisons]
Notice anything? As my ultra-subtle background conveys, oftentimes "beautiful" black people are becoming whiter and whiter - especially when they're being photographed for a magazine, appearing in a movie, etc. Granted, there are exceptions, but this seems to be the general rule.
Don't get me wrong, I think all the chicks pictured are pretty smokin' hot, but it surprised me that in the photos taken for a larger audience, they consistently show up lighter in color.
Now I ponder: what does this mean? Could it be that although America "accepts" black people in the entertainment industry as a whole, they still must go through the utter ridiculousness of making themselves appear whiter? If that's the case, has America as a whole really accepted black people - or are we just a bunch of closet bigots?
I'd go on, but I want to see where the discussion leads before I pipe up again - I look forward to the responses...