Tilted Forum Project Discussion Community


TIO 07-29-2003 11:27 PM

Emergent Artificial Intelligence
 
Sorry about the length of this. I got carried away. But I do think it is a very interesting point to ponder.

BACKGROUND: Emergent AI is a form of AI where you give your system a bare minimum of information and a learning algorithm, then let it do its own thing. A neural network is a computing system that attempts to emulate the workings of a human brain, and is usually built around an emergent AI. Typically, one has a set of inputs which are each connected to every one of a set of parallel processing units, which are each in turn connected to every one of a set of outputs. There may or may not be a feedback loop.
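That layout — every input wired to every hidden unit, every hidden unit wired to every output — can be sketched in a few lines. This is an illustrative toy only, not anything from the thread: a tiny 2-2-1 network learning XOR by plain backpropagation, where the layer sizes, learning rate, and epoch count are all arbitrary assumptions.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

# 2 inputs -> 2 hidden units -> 1 output; every input feeds every
# hidden unit, and every hidden unit feeds the single output.
w_ih = [[random.uniform(-1, 1) for _ in range(2)] for _ in range(2)]
b_h = [0.0, 0.0]
w_ho = [random.uniform(-1, 1) for _ in range(2)]
b_o = 0.0

def forward(x):
    h = [sigmoid(sum(w_ih[j][i] * x[i] for i in range(2)) + b_h[j])
         for j in range(2)]
    o = sigmoid(sum(w_ho[j] * h[j] for j in range(2)) + b_o)
    return h, o

data = [([0, 0], 0), ([0, 1], 1), ([1, 0], 1), ([1, 1], 0)]  # XOR

def total_error():
    return sum((forward(x)[1] - t) ** 2 for x, t in data)

err_before = total_error()

lr = 0.5
for _ in range(20000):
    for x, t in data:
        h, o = forward(x)
        d_o = (o - t) * o * (1 - o)                          # output-layer delta
        d_h = [d_o * w_ho[j] * h[j] * (1 - h[j]) for j in range(2)]
        for j in range(2):
            w_ho[j] -= lr * d_o * h[j]
            b_h[j] -= lr * d_h[j]
            for i in range(2):
                w_ih[j][i] -= lr * d_h[j] * x[i]
        b_o -= lr * d_o

err_after = total_error()
```

The "emergent" part is that no XOR rule appears anywhere in the code: the weights start random, and the behavior shows up only through the learning rule.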

Now consider this: Given a sufficiently fast computer, it is not unreasonable to believe (Terminator III-spawned misgivings aside) that an emergent AI with a very sophisticated learning algorithm may become self-aware. From self-awareness, it is a short leap to a machine with wants and desires, and from there, a machine with something akin to emotion. What do you think the ramifications of this would be for humans? If a machine could truly pass the most exhaustive Turing test, and feel emotion, doesn't that take away a very great part of the uniqueness and specialness of life?

Bonus question: Suppose we created several of these simulated beings, and gave them the capacity to reproduce and die. We give them a simulated world to live on, and let them go. I'm certain the first million attempts would die of starvation or something, but let's assume at least one attempt at this experiment learns to survive. Let them run for a half million years...I'd be surprised if they didn't develop something very close to feudalism, or even democracy. But more importantly, what would be the implications of such a simulation developing a moral society? What if those morals were the same as our own? What if they weren't?
What if our sims became religious? What if that religion resembled one of our known ones? What if it didn't?

My point: If we could create a moral, religious society in the lab, would that signal the end of people thinking they were special?

tinfoil 07-30-2003 06:36 AM

The trick is to boil down the learning process to an algorithm. Emergent AI seems to be the way to go, but it's that one step that has prevented it from really producing intelligence.

Bonus Q: Let's not assume that a more logical being would necessarily gravitate towards something as illogical as democracy or feudalism. Even if these are living beings with emotion, they are at their very core supremely logical beings. Would they find religion? I don't think so.

Eviltree 07-30-2003 08:26 AM

I think people would feel special because we 'created' another consciousness akin to ours. This would make more people than not believe that we are ever more special.

TIO 07-30-2003 08:29 AM

Tinfoil, emergent AIs aren't logical.

Eviltree, don't you think that it would be depressing to know that a sophisticated intelligence could be created by something as dumb as us?

cliche 07-30-2003 08:43 AM

TIO - could we rightly claim to have "created" it, if it merely emerged from its interactions with its environment and others?

e.g. my parents can claim to have "created" me (my body etc.), but I don't think they "created" my mind. They produced the substrate, but the mind itself formed from interactions.

Pennington 07-30-2003 10:21 AM

There has been tons of research into genetic algorithms that simulate evolution, with the species striving for a common goal. So far, there have been issues with environment. How do you simulate the entire planet at the molecular level to provide the correct environment? Some have rejected that idea and simply made programs that duplicate again and again, but there is the issue of natural selection and how to weed out the bad. Some have taken the intelligent design approach and done it by hand, but that is near impossible to do, unless you are God himself, as it takes at least a minute or two to decide if a specific incarnation is fit to reproduce.
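The select-mutate-reproduce loop described above can be sketched concretely. A hedged toy example: the fitness function here is just "count the 1 bits" (the classic OneMax problem), standing in for the genuinely hard part Pennington identifies — an automatic judge of fitness — and every constant (population size, mutation rate, generation count) is an arbitrary choice.

```python
import random

random.seed(1)

GENOME_LEN = 20
POP_SIZE = 30
GENERATIONS = 100
MUTATION_RATE = 0.02  # chance of flipping each bit in a child

def fitness(genome):
    # The automatic "is it fit to reproduce?" judge: here trivially
    # just the number of 1 bits (OneMax).
    return sum(genome)

def mutate(genome):
    return [1 - g if random.random() < MUTATION_RATE else g
            for g in genome]

# Random starting population of bit-string "organisms".
pop = [[random.randint(0, 1) for _ in range(GENOME_LEN)]
       for _ in range(POP_SIZE)]

for _ in range(GENERATIONS):
    pop.sort(key=fitness, reverse=True)
    survivors = pop[:POP_SIZE // 2]          # natural selection: top half live
    children = [mutate(random.choice(survivors))
                for _ in range(POP_SIZE - len(survivors))]
    pop = survivors + children               # survivors reproduce with mutation

best = max(pop, key=fitness)
```

Because the survivors pass through unmutated, the best genome is never lost, so fitness can only ratchet upward — a design choice (elitism) that sidesteps the hand-selection bottleneck entirely.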

One of the more interesting ways of doing this is to create a physical machine that can build a copy of itself, with minor mutations, given enough raw supplies. People are constantly trying to do this, but so far it hasn't been possible to make a machine that can make circuit boards :)

Anyway, there is no way a program will somehow become self-aware unless we program it in or evolution takes care of it. The reason animals are self-aware is that it benefits them in natural selection, and those that aren't self-aware die off quickly. As with all engineering, I think we should look to nature for an answer, as she's had a few billion years to think of it before us.

Jasmar 07-30-2003 10:12 PM

I don't see why we are putting so much effort into making computers smarter; we need to work on using computers to make ourselves more intelligent (a permanent wireless internet connection coming out of your head and you've got yourself telepathy).

maxero 07-30-2003 11:36 PM

And you've also got people hacking your brain.

I don't think a creation like that would be that bad for our society unless it became more intelligent/better than us; then people would kinda flip.

ARTelevision 07-31-2003 05:39 PM

we can use all the help we can get.


© 2002-2012 Tilted Forum Project

