07-29-2003, 11:27 PM | #1 (permalink) |
Addict
Location: The Land Down Under
|
Emergent Artificial Intelligence
Sorry about the length of this. I got carried away. But I do think it is a very interesting point to ponder.
BACKGROUND: Emergent AI is a form of AI where you give your system a bare minimum of information and a learning algorithm, and let it do its own thing. A neural network is a computing system that attempts to emulate the workings of a human brain, and is usually programmed with an emergent AI. Typically, one has a set of inputs which are all connected to every one of a set of parallel processing units, which are each in turn connected to every one of a set of outputs. There may or may not be a feedback loop.

Now consider this: given a sufficiently fast computer, it is not unreasonable to believe (Terminator III-spawned misgivings aside) that an emergent AI with a very sophisticated learning algorithm may become self-aware. From self-awareness, it is a short leap to a machine with wants and desires, and from there, a machine with something akin to emotion. What do you think the ramifications of this would be for humans? If a machine could truly pass the most exhaustive Turing test, and feel emotion, doesn't that take away a very great part of the uniqueness and specialness of life?

Bonus question: suppose we created several of these simulated beings, and gave them the capacity to reproduce and die. We give them a simulated world to live in, and let them go. I'm certain the first million attempts would die of starvation or something, but let's assume at least one attempt at this experiment learns to survive. Let them run for a half million years... I'd be surprised if they didn't develop something very close to feudalism, or even democracy. But more importantly, what would be the implications of such a simulation developing a moral society? What if those morals were the same as our own? What if they weren't? What if our sims became religious? What if that religion resembled one of our known ones? What if it didn't?

My point: if we could create a moral, religious society in the lab, would that signal the end of people thinking they were special?
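For the curious, the fully connected structure described above (every input feeding every hidden unit, every hidden unit feeding every output) can be sketched in a few lines. This is a minimal illustration, not a real learning system: the layer sizes, the tanh activation, and the random weights are all assumptions for the sake of the example, and a learning algorithm would still need to adjust those weights.

```python
import math
import random

# Minimal sketch of a fully connected feed-forward network:
# every input connects to every hidden unit, and every hidden
# unit connects to every output. No learning happens here;
# the weights are just initialized randomly.
random.seed(0)

N_IN, N_HID, N_OUT = 3, 4, 2  # illustrative layer sizes

w_in = [[random.uniform(-1, 1) for _ in range(N_IN)] for _ in range(N_HID)]
w_out = [[random.uniform(-1, 1) for _ in range(N_HID)] for _ in range(N_OUT)]

def forward(inputs):
    # Each unit takes a weighted sum of everything feeding into it,
    # squashed through tanh so activations stay in (-1, 1).
    hidden = [math.tanh(sum(w * x for w, x in zip(row, inputs))) for row in w_in]
    return [math.tanh(sum(w * h for w, h in zip(row, hidden))) for row in w_out]

print(forward([1.0, 0.5, -0.5]))  # two output activations in (-1, 1)
```

A feedback loop, as mentioned above, would just mean wiring some of those outputs back in as inputs on the next step.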
__________________
Strewth |
07-30-2003, 06:36 AM | #2 (permalink) |
Fucking Hostile
Location: Springford, ON, Canada
|
The trick is to boil down the learning process to an algorithm. Emergent AI seems to be the way to go, but it's that one step that has prevented it from really producing intelligence.
Bonus Q: Let's not assume that a more logical being would necessarily gravitate towards something as illogical as democracy or feudalism. Even if these are living beings with emotion, they are at their very core supremely logical beings. Would they find religion? I don't think so.
__________________
Get off your fuckin cross. We need the fuckin space to nail the next fool martyr. |
07-30-2003, 08:43 AM | #5 (permalink) |
Rookie
Location: Oxford, UK
|
TIO - could we rightly claim to have "created" it, if it merely emerged from its interactions with its environment and others?
eg my parents can claim to have "created" me (my body etc) but I don't think they "created" my mind. They produced the substrate, but the mind itself formed from interactions.
__________________
I can't understand why people are frightened of new ideas. I'm frightened of the old ones. -- John Cage (1912 - 1992) |
07-30-2003, 10:21 AM | #6 (permalink) |
Banned
Location: Autonomous Zone
|
There has been a ton of research into genetic algorithms that simulate evolution, with the species striving for a common goal. So far, there have been issues with the environment. How do you simulate the entire planet at the molecular level to provide the correct environment? Some have rejected that idea and simply made programs that duplicate again and again, but then there is the issue of natural selection and how to weed out the bad. Some have taken the intelligent design approach and done it by hand, but that is near impossible to do, unless you are God himself, as it takes at least a minute or two to decide if a specific incarnation is fit to reproduce.
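The "weed out the bad" step is usually automated with a fitness function rather than judged by hand. Here's a toy sketch of that idea, under purely illustrative assumptions: the genomes are bit strings, the fitness function just counts matches against a fixed target, and selection keeps the fitter half of each generation.

```python
import random

random.seed(0)
TARGET = [1] * 16  # the "common goal" the simulated species strives toward

def fitness(genome):
    # Automatic stand-in for deciding "by hand" whether an
    # incarnation is fit: count how close it is to the target.
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome, rate=0.05):
    # Each bit has a small chance of flipping when copied.
    return [1 - g if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(16)] for _ in range(30)]

for generation in range(100):
    # Natural selection: the fitter half survives and reproduces
    # (with mutation) to refill the population.
    population.sort(key=fitness, reverse=True)
    survivors = population[:15]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(15)]
    if fitness(population[0]) == len(TARGET):
        break

print(generation, fitness(population[0]))
```

Because the survivors are carried over unchanged, the best fitness never decreases, so the population climbs toward the target over the generations. The hard part the post is pointing at, of course, is that real evolution has no fixed TARGET to score against.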
One of the more interesting ways of doing this is to create a physical machine that can build a copy of itself, with minor mutations, given enough raw supplies. People are constantly trying to do this, but so far it hasn't been possible to make a machine that can make circuit boards. Anyway, there is no way a program will somehow become self-aware unless we program it in or evolution takes care of it. The reason animals are self-aware is because it benefits them in natural selection, and those that aren't self-aware die off quickly. As with all engineering, I think we should look to nature for an answer, as she's had a few billion years to think of it before us. |
Tags |
artificial, emergent, intelligence |
|
|