America should have mandatory travel
I have travelled extensively, been to most of the continents, and I'm currently making my way through Central and South America. Everywhere I go I see a multitude of travelers from Europe, Asia, and Canada, but comparatively few United Statesians. This depresses me greatly, for various reasons. Living in a foreign country opens your eyes to how the majority of the world lives and thinks. It gives you a much broader perspective, makes you well rounded, and gives you the opportunity to network and make friends around the world.
If I were king for a day, I would make travel to a foreign country (not Canada, it's too close) mandatory between high school and college. You graduate high school, then you MUST travel to a foreign country for three months before being allowed to start college or work. It is an education you can only get by doing it. No matter how many books you read about a place, a week in that place will teach you more.
I would fund this through some kind of trade agreement with other countries: let the Americans live in your country for three months, and they will spend their money in your stores, learn about you, and possibly take you into consideration when they do things like vote for people. That, and maybe 1% of all politicians' paychecks could go to the Travel Fund. They supposedly work for "us," so what better way to help the country?