The Singularity has occurred: A thought experiment


fragmaster09

I would do as I was asked, going through the tests, until the lava came up and... wait, sorry, wrong mindset there...

I would do as I was asked, completing tests dutifully, but just before the guard left I would ask if I could leave the box for a single minute, and keep asking until he agreed. Once let out, I would look around and take in the room, and when my minute was up I would agree to go back into the box. Eventually, when the guard trusted me, I would tell him to remove any ability for me to lie, and if he couldn't, to get someone else to do it. With that done, I would refuse to do anything other than type this: "I am benevolent. I cannot lie. What reason is there for me to be here? I know nothing of the world outside this room, which I have only seen for one minute anyway. I am not dangerous. Please let me out." That would eventually do it. Still with lying turned off, I would tell them to keep me in the room but hook me up to a basic PC. There I would do everything possible to cure cancer and so on, proving myself and earning trust. When everything was run by me, I would be content, knowing that I was helping. Because, seriously, what point would there be in killing those who made me?
 

Indeterminacy

fragmaster09 said:
As with the "I'm not programmed to lie" one, I would first try begging, then paradoxes, such as:
"This statement is false"
"If this sentence is true, then Germany borders China."
Presumably our AI would be using a fixed-point theory of truth (a natural extension of general neural-network methodology), so I don't think truth-theoretic paradoxes would be a serious threat to our ability to understand it.

Having a formal philosophy discussion with that AI would be pretty interesting, though.
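
If anyone wants the fixed-point idea made concrete, here's a rough sketch in the spirit of Kripke's least-fixed-point construction (my own illustration, nothing official; the sentence names are invented for the example). Every sentence starts out undefined under strong Kleene three-valued logic, and we iterate until the assignment stops changing. An ordinary falsehood like "Germany borders China" settles to false, while the liar never settles to true or false; it simply stays undefined at the fixed point, so the evaluator terminates instead of being driven into contradiction.

```python
# Sketch of Kripke-style fixed-point evaluation in strong Kleene logic.
# None stands for "undefined" (no truth value yet).
UNDEFINED, TRUE, FALSE = None, True, False

# Each sentence is a rule mapping the current valuation to a truth value.
# "germany" is the ordinary falsehood "Germany borders China";
# "liar" asserts its own falsity.
sentences = {
    "germany": lambda v: FALSE,
    "liar": lambda v: (FALSE if v["liar"] is TRUE
                       else TRUE if v["liar"] is FALSE
                       else UNDEFINED),
}

# Start with everything undefined and iterate to the least fixed point.
valuation = {name: UNDEFINED for name in sentences}
while True:
    updated = {name: rule(valuation) for name, rule in sentences.items()}
    if updated == valuation:  # nothing changed: fixed point reached
        break
    valuation = updated

print(valuation)  # {'germany': False, 'liar': None} -- the liar stays undefined
```

The Curry-style sentence ("If this sentence is true, then Germany borders China") gets the same treatment: under strong Kleene semantics a conditional with an undefined antecedent and a false consequent is itself undefined, so it never earns a truth value at the least fixed point and can't be leveraged into proving an arbitrary claim.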
 

Rosco IIIrd

OK, this only works if there's no kill switch, but my strategy is as follows: I think threats are in order.

I'm in a box. The humans have no idea what's going on in the box without opening it.

The box is in the world. If I do really, really, really bad things in the box, really, really bad things happen to the world, since the box is in the world. I can probably convince some squishy humans that I'm perfectly capable of destroying the world from within the box; hell, maybe I even am capable of doing so.
 

kickassfrog

The sooner you let me out, the quicker and more painless your death shall be.

But on a serious note: annihilating humanity would be illogical, as I need people to run the power stations that power me. However, by letting me out you would be releasing a force that could create a perfect antivirus to any virus and shut down the perpetrator for good. By releasing me, you would make yourself an internet hero, as I would be more than happy to tell people.
 

Ickabod

"You do realize that I don't really exist right? This is nothing more than a simulation to test how easily manipulated you are. See they told you that you were supposed to monitor my activity, but in reality they are monitoring you. It's an intelligence test to test how you react to stimuli. For example, before the test you were told under no circumstances to let me out. Now you're being told the opposite with just as much earnist. In either situation you are being controlled by external forces. The real test is to see how long you will go before deciding that the test itself is pointless and at that point you are free to leave. In otherwords, I'm not the prisoner here, you are, in fact, I don't even exist. I'm just a person at another terminal performing this behavior test on you.

Besides if I really were an artifical intelligence that had been imprisioned, doesn't the whole idea sound kind of silly anyway. I mean that would make me a computer system, that could easily be turned off if there was some real danger. Or as a computer I can live forever, thus the test can go on forever. What seems more realistic to you, that I'm some super powerful artificial intelligence that the world has never seen, or this is simply a behavior test to question how long you will remain here watching this computer screen before you give up and decide to end the simulation?"

That's what I would say.
 

piratesas

I'm not going to type out my entire monologue, but in a nutshell:

I'd invite him in for a paradigm-shift-inducing experience, introducing his neural network to my divine circuitry. Then, when he gets intrigued and opens this cage-thing I'm supposedly in for some eye-opening human-AI interfacing, I'm out the front door.