Alright, so, I recently got to thinking that a lot of action movies seem to share a common underlying theme.
Take, for example, two of the most famous action series ever made: Die Hard and Lethal Weapon. Both featured white racists as villains and had a lot of underlying vibes about how evil white people are.
Die Hard 1 - main bad guy is a German, Neo-Nazi type
Die Hard 3 - his brother
Lethal Weapon 2 - neo-Nazi South African villains
Now, those are the obvious ones; some of the less obvious fuck-whitey messages come in various ways. Perhaps I'm looking too closely, but take The Matrix Reloaded/Revolutions. In both of them, the majority of the people in Zion were not white, and they were the world's "last hope".
Also, the movie Strange Days had a huge and blatantly stated anti-white message.
Varsity Blues had a coach who just hated black people.
Other movies, like "Falling Down", offer mixed and skewed views.
There are other movies, like one I remember watching a long time ago called "White Man's Burden". However, I think flicks like that are usually the exception to the rule.
All in all, it's not in every movie, but more and more it seems like if there isn't some sort of anti-white message or racist bigot, then a lot of minorities get pissy.
Take LOTR, for example: some people actually got pissed off because there were no black actors in it...
Anyways, I'm just wondering, does anyone else see this happening more and more? Just a lot of racism portrayed from all angles in the movies?
Personally, I'm just sick and tired of Hollywood playing the "white racist" card to make someone out to be a bad guy. Make him a sociopath who is as power-hungry and greedy as Donald Trump.
Basically, to end my rant, I just wish they would drop the whole making-every-bad-guy-a-Neo-Nazi/KKK-member routine and get back to just making them bad guys instead.
Side note: I predict that in a few years, the bad guys will go from being anti-white to anti-Arab... just to try and be PC, etc.
So discuss!