Americans don't have to pay attention to world affairs unless what some other country is doing directly affects America in a negative way. As the reigning superpower and the largest economy, America is where other countries have to come to do business. Knowing about world affairs isn't necessarily bad, but I think it's awfully vain of other countries to expect America to care about what they are doing. Until someone else is a threat to us, either economically or militarily, America should focus on America. That's why most other industrialized nations teach English in their schools; it's not to deal with the UK. If you want the US to take notice, threaten us (although you might not like the attention you receive). Otherwise, it's not necessary.