Quote:
Originally posted by MrSelfDestruct
The other possibility is that we develop technology much faster than our morality and ability to comprehend that technology can evolve, and at some point the system becomes unstable and we annihilate ourselves through malevolent intentions, unwillingness to accept the consequences of our actions, or sheer ignorance of what we are doing.
Scary T3: Rise of the Machines scenario.
I personally would like to believe that your first possible outcome will come true. A brighter tomorrow is always better than a gloomy today. That said, I think morality would evolve at about the same rate as technology advances. What scares me, though, is that we are developing too rapidly and using up natural resources too quickly; we may not kill ourselves with technology, but we might starve ourselves. (Yeah, take it with a grain of salt.)