American here, and this needs to be said. Not all Americans want to tell the world to fuck off. Some of us see the benefit of having neighbors you get along with. Some of us realize that, despite what some people believe, we cannot honestly say the USA is the best country in the world. It's time the world started to work together for the greater good.
I just don't understand how a country as big as the USA, composed of such a diverse geographical, ethnic, and political spectrum, can't see the value of the world working together. This country was founded on separate governmental entities working together to further the greater good. What happened to that ideal?