04-03-2004, 09:34 PM | #1 (permalink) |
The sky calls to us ...
Super Moderator
Location: CT
|
Relative advancement of human abilities
Every few years, we come up with new ways to make ourselves live longer. Every few years, we come up with new, more horrific ways to kill ourselves and each other.
The way I see it, we have two possible outcomes. The first is that the evolution of morality and intelligence is able to keep up with and surpass that of technology, and we are able to survive without killing ourselves. The other possibility is that we develop technology much faster than our morality and our ability to comprehend that technology can evolve, and at some point the system becomes unstable and we annihilate ourselves through malevolent intentions, unwillingness to accept the consequences of our actions, or sheer ignorance of what we are doing. Where do you think we are headed, and do you think there is any way that it can change, for either better or worse? |
04-03-2004, 11:25 PM | #2 (permalink) |
Baltimoron
Location: Beeeeeautiful Bel Air, MD
|
It seems to me that the technology will always come first, and it will take a while for the morality to set in, but it eventually will.
It's just a fact that when we learn something new, it takes a while for us to discover the consequences of the new knowledge. Once we do, however, we tend to make as much of an effort as a society as possible to balance the knowledge with its consequences.

To give a fairly recent example: the atomic bomb. When the first bomb was dropped, it was just another weapon in the arsenal. In the wars up through Vietnam it was seriously considered as an option, though it was never needed. As time has gone by and the effects of nuclear weapons have become clear, it has become less and less of an option, to the point where it would take nothing less than a direct thermonuclear strike on a major US city for one to be used.

Another example is industrialization. When the Industrial Revolution was beginning, it was seen as a great advancement in society. People flocked to cities, factories were built all over the place, and resources were being turned into useful items. Now we have found that pollution, overcrowding, and serious depletion of resources have resulted from the industrialization of society, so we have taken steps to alleviate the consequences while keeping as much of the good as possible.

It's a gradual process of learning. In the future, the learning will likely come at an increasingly fast rate, so we'll be able to discover consequences sooner. So I believe humanity will always be able to keep up with the technologies it thinks of.
__________________
"Final thought: I just rented Michael Moore's Bowling for Columbine. Frankly, it was the worst sports movie I've ever seen." --Peter Schmuck, The (Baltimore) Sun |
04-04-2004, 09:11 AM | #3 (permalink) |
Sky Piercer
Location: Ireland
|
First of all, I have to be honest with you: I find that attempting to predict the (reasonably distant) future is one of the most pointless activities one can engage in.
With this in mind, take what I say (and what everyone else says) with a grain of salt. It is nothing more than an unfounded, unjustifiable opinion...

Obviously the future will bring with it good and bad, but I think that the good will far outweigh the bad, as it most certainly has done in the past. Although Luddites are very quick to point out the horrors of the atomic bomb and other such things, you would do well to realise how much good technology has done for you. It is a simple fact that the majority of us on this board would simply not be alive today if it were not for technology; we would have died from some illness as children. Also bear in mind what would have been considered "old age" many years ago. The idea of a person living to be 70 years old would have been simply unthinkable. In short, technology has saved vastly more lives than it has killed.

So extrapolating this trend into the future (which is all we can do in the business of "prediction"), I would see lives continuing to get longer (perhaps due to the ability to replace worn body parts), but also many new challenges that need to be overcome.

Will we eventually put an end to ourselves? I would tentatively suggest that we won't, though I would predict that sometime in the future there will be some kind of global war with massive casualties; I doubt that war will result in complete extinction. How will we eventually die? For me, the most likely event would be some sort of global epidemic of a highly evolved disease which is resistant to all of our technologies, and which spreads too quickly to give us time to research a way to stop it.
__________________
|
04-04-2004, 09:06 PM | #4 (permalink) | |
Comment or else!!
Location: Home sweet home
|
I personally would like to believe that your first possible outcome will be true. A brighter tomorrow is always better than a gloomy today. That said, I think morality would evolve at about the same rate as technology advances. What scares me, though, is that we are developing too rapidly and using up natural resources too quickly; we may not kill ourselves with technology, but we might starve ourselves. (Yeah, take it with a grain of salt)
__________________
Him: Ok, I have to ask, what do you believe? Me: Shit happens. |
|
04-04-2004, 09:45 PM | #5 (permalink) |
can't help but laugh
Location: dar al-harb
|
i propose that morality will never be able to keep pace with the advance of technology.
if making the moral choice is doing what is right in a given situation, then you must draw your sense of what is good and what is bad from pre-existing experience. if there is no history of experience to draw from in a new situation (a new application of technology), then morality doesn't really exist for that instance. the only thing we can do is map the new situation's likenesses onto a familiar situation and try to respond accordingly. but we never have a direct moral response to a specific technology until we have built up a base of experience-knowledge about it.
__________________
If you will not fight when your victory will be sure and not too costly, you may come to the moment when you will have to fight with all the odds against you and only a precarious chance for survival. There may even be a worse case. You may have to fight when there is no hope of victory, because it is better to perish than to live as slaves. ~ Winston Churchill |
04-08-2004, 05:21 AM | #6 (permalink) |
Addict
Location: Grey Britain
|
If we consider only these two options as possibilities, then statistically, on an undefined timescale, option two is a certainty. This is simply because every time we don't wipe ourselves out, we get another chance to do it, but if we wipe ourselves out even once, we don't get a chance to undo it. So as time tends to infinity, the probability of our having wiped ourselves out at some point approaches 1.
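To put the same point in symbols (a rough sketch; the fixed, independent per-era probability p is my simplifying assumption, just for illustration):

    P(still alive after n eras) = (1 - p)^n
    P(wiped out at some point within n eras) = 1 - (1 - p)^n

For any p > 0, (1 - p)^n shrinks toward 0 as n grows, so the second probability climbs toward 1 no matter how small p is. Strictly speaking, the argument only needs the per-era risk to stay above some fixed floor; if the risk itself shrank fast enough over time, long-run survival could remain possible.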
However, I don't think these are the only two possibilities. If we manage to stay alive long enough, we will all die of something far bigger than us, like being engulfed by the Sun. Also, I really don't think that our morality and intelligence have changed in the last ten thousand years or so. We have simply built on and shared our existing knowledge base to a greater and greater extent as the years have gone by.
__________________
"No one was behaving from very Buddhist motives. Then, thought Pigsy, he was hardly a Buddha, nor was he a monkey. Presently, he was a pig spirit changed into a little girl pretending to be a little boy to be offered to a water monster. It was all very simple to a pig spirit." |
04-25-2004, 01:54 PM | #8 (permalink) | ||
Wehret Den Anfängen!
Location: Ontario, Canada
|
In Iain Banks' "Culture" novels, society contained both hyper-intelligent "machines" and humans. In a way, the "machines" brought the humans along for the ride: they were "just like us", only more so, and the humans weren't all that much of a burden.
For instance, imagine a technology where you can build "bud" universes and then detach them from ours. Humans build some, populate them, and disconnect: now no one set of humans could wipe out all of them. Or, less dramatically, imagine we spread over a galaxy or two...
__________________
Last edited by JHVH : 10-29-4004 BC at 09:00 PM. Reason: Time for a rest. |
||
04-25-2004, 01:58 PM | #9 (permalink) |
* * *
|
I think of this as a non sequitur of a question. The idea of evolution doesn't really apply to the ability of individuals to make decisions. Since the advent of Mutually Assured Destruction, we have lived in a world that overtly maintains a tenuous balance between security and widespread destruction. This balance has always been hard to maintain, overtly or not, and it will continue to be hard as populations grow. Truly, your question is a question of scale, and perhaps of faith and luck. We are too new to these technologies of vast destruction to know where the future will lead us, because we haven't solved the political dilemmas that accompany them. Sadly, the future of the human race might merely be an issue of policy.
__________________
Innominate. |
04-28-2004, 11:10 PM | #10 (permalink) |
lost and found
Location: Berkeley
|
This has been a troubling and dark contemplation for me for some time now. First, there are a few assumptions that are always taken for granted--that man necessarily evolves better morality and foresight over time. There is no indicator of that on anything but a genetic timescale. There is no indicator that morality and foresight are consistent survival traits compared to, say, tireless cunning. Man is a cunning beast, not an intellectual. Being surrounded by smart people, as we are in this forum and, statistically, in our real lives, makes it easy to forget how generally brutish and coldly simple the human mind is.
Morality and foresight (or whatever the communal understanding is of the traits of a wholly successful civilization) are high-level mental decisions--the last part of the brain to be consulted, the part addressed only after instinct brings no answers. We are instinctive animals with less than 5% DNA differentiation from chimpanzees. We have our Shakespeares and Einsteins, but we also have our Ed Geins and Hillside Stranglers.

Anyway, I'm not really sober at the moment, so there's probably a hole or two in my logic. The short version of my opinion is that I am not optimistic about our long-term viability. Granted, this sentiment is biased towards personal experience, but I think the things I've done, read, watched, listened to, and have had done to me, both good and bad, are sufficiently generalizable.
__________________
"The idea that money doesn't buy you happiness is a lie put about by the rich, to stop the poor from killing them." -- Michael Caine |
05-17-2004, 11:03 AM | #11 (permalink) |
Upright
|
As technology evolves, so will its potential. The possible outcomes I see are:
A) Technology will assist in evolving morality and intelligence. Technology is evolving in all aspects, including technology for communication, learning, and so on--things that are essential to building a world based on morality and intelligence. Technology could also potentially bring about truly democratic systems in which every single citizen is able to vote and voice their opinions. There are other ways technology could have such a huge positive impact that it would change the world drastically.

B) People start using the potential of technology to manipulate and take advantage of each other, possibly leading to the mass killing of the human race. The chaos of the violence will kill off most humans; a few will probably survive, whether through heavy planning or just pure luck. The chaos will ultimately give way to order, and people will live on from there, with the chance to build a new system from scratch using the technology left from thousands of years of human work. |