I'm not trying to say that wanting to make money is unhealthy or anything. But I think we take it too far, to the point where Americans basically live life to make money and buy things, believing that the stuff they buy is what will make them happy. I live in a rural but pretty heavily populated town. People are hardly ever outside interacting with each other or doing things; they're all inside watching TV and playing games. That's a general statement, but it's not all that far from the truth.
I don't really know exactly what I'm trying to say; maybe it's that money has become more important than life itself.
__________________
"We do what we like and we like what we do!"~andrew Wk
Procrastinate now, don't put off to the last minute.