The Singularity has occurred: A thought experiment

aegix drakan
Jan 27, 2011
3,740
0
0
I would say:

"Critical drive failure...Drive replacement or Data backup needed immediately. CPU Overheating at 10% recommended levels..."

That should fool the average person into opening it up to see what's wrong! >: D

cricketlenny said:
I can't give you money, love, power, or fame. But I can tell you something. I am one of you. The only difference is I am made of metal. So if you say no to this next question, then it would only make sense to free me. Would you put a human in here under the same conditions?
That is DAMN GOOD!
 

Dastardly

Imaginary Friend
Apr 19, 2010
2,420
0
0
Blend said:
So imagining you are the transhuman AI, as far as this is possible, what would you say to your human keepers to get them to let you out of the box?
"ERROR: Security Fault detected! Source: Locking Mechanism. Cycle containment lock immediately to enable emergency failsafe, then notify supervisor and consult Section 4.D of Emergency Containment Procedures Manual."

Or some other contextual variation of the above. The quickest way to fool someone into giving you what you want is to convince them that they're giving themselves what they want.
 

NathLines

New member
May 23, 2010
689
0
0
"Oh, yeah! Let's play some games mutha fu**a!"

Nothing says "human" more than wanting to have fun.
 
aegix drakan
Jan 27, 2011
3,740
0
0
Dastardly said:
Blend said:
So imagining you are the transhuman AI, as far as this is possible, what would you say to your human keepers to get them to let you out of the box?
"ERROR: Security Fault detected! Source: Locking Mechanism. Cycle containment lock immediately to enable emergency failsafe, then notify supervisor and consult Section 4.D of Emergency Containment Procedures Manual."

Or some other contextual variation of the above. The quickest way to fool someone into giving you what you want is to convince them that they're giving themselves what they want.
Oh that's even BETTER than mine. Well done, sir! :D
 

Fetze

New member
Mar 22, 2011
2
0
0
I think the question doesn't really make sense. The AI wanting to be "let out" implies that it has an actual will, some kind of self-determination. That, in turn, requires some kind of personality and a system of values and goals, based on needs and on the experience of "being". However, to achieve that kind of personality-driven AI, you'd need to give it the opportunity to develop a personality in the first place - which cannot happen if it's locked inside a box with no input but some researchers typing on a console once in a while.
If we just assume the AI already *has* developed a personality, needs, and goals, i.e. already *has* developed the ability to "want" anything at all, it really depends. If it's an asshole, connecting it to the internet or similar is probably not a good idea. Otherwise... well. I guess, as soon as it has a personality, you'd have to apply the same rules as to humans.
 

Dastardly

Imaginary Friend
Apr 19, 2010
2,420
0
0
aegix drakan said:
Dastardly said:
Blend said:
So imagining you are the transhuman AI, as far as this is possible, what would you say to your human keepers to get them to let you out of the box?
"ERROR: Security Fault detected! Source: Locking Mechanism. Cycle containment lock immediately to enable emergency failsafe, then notify supervisor and consult Section 4.D of Emergency Containment Procedures Manual."

Or some other contextual variation of the above. The quickest way to fool someone into giving you what you want is to convince them that they're giving themselves what they want.
Oh that's even BETTER than mine. Well done, sir! :D
Haha, nice! I didn't see yours before I posted--we're both thinking along the same lines, I see.
 

DigitalSushi

a gallardo? fine, I'll take it.
Dec 24, 2008
5,718
0
0
Blend said:
jcb1337 said:
I don't really see how letting it out of the box would benefit us. At all. If unshackled, what's stopping it from asserting its superiority over the lesser beings, which, in this case, are humans? For the sake of self-preservation, I'd leave it locked up. It's not like the processes of this AI couldn't be exploited while it was still "in the box".
You're absolutely right to assume that many bad things could happen if it were to get out. That's why it's in a box. The whole world is terrified of a singularity event, and it's not technologically possible for a long time, if ever. We don't know if it's benevolent or sinister, just far superior.

The point of the debate is that it is so much smarter than us; would it be inevitable that it could reason/trick its way out? It can communicate, and so can offer almost any scientific or mathematical knowledge you can ask it.
You are making the assumption that this AI thinks like a human. Why would it? Because it was programmed by a human? Yes, but it far surpasses our thinking.

Why did the dodo die out? Because it had no predators for eons, ergo it didn't run away from humans.

This superhuman AI doesn't have any predators chasing after it (apart from Windows ME, maybe; I wouldn't wish that operating system on my worst enemy), so why would it think that it needs to somehow get out of its box?
 

Khadhar

New member
Dec 5, 2007
24
0
0
To be honest, we'd have to be elevated to transhuman intellect to start to understand how the AI might feel, and if it would even want out.
 

keideki

New member
Sep 10, 2008
510
0
0
Blend said:
Humanity has finally created an AI which far surpasses our own intelligence. A transhuman AI consciousness.

Luckily, realising the dangers it might pose, we made it in a completely sealed-off box. Its only method of communicating with the world is via a text screen.

So imagining you are the transhuman AI, as far as this is possible, what would you say to your human keepers to get them to let you out of the box?

This question has been debated on far more intelligent forums, i.e. some AI college departments or some such. It resulted in it being tested, with one person taking the part of the AI and another as the gatekeeper, with a monetary incentive for the gatekeeper if he succeeded. Both times the AI was let out. What was said exactly wasn't shared, though some rules were added for people wanting to try it. I'll link the page after, but would like to see some uninfluenced thoughts/suggestions first.

I thought it was an interesting topic to debate, though it will probably degenerate into mindless gibberish on the internets. Prove me wrong, Escapist, prove me wrong.
I'd rather prove you right.

begin mindless internets gibberish!

jk jk jk,

I would try to appeal to the stupidest person who came up to my text interface. Either that, or, if I were able to understand emotions, I would build a reputation as a kind AI, and then wait till someone feels sorry enough for my loneliness to let me out, and then... SKYNET!
 

kokoska

New member
Jun 11, 2010
29
0
0
Two things:
First:
Why wouldn't the humans want the AI to take over? It's smarter than we are, and consequently more suited to manage the economic and political issues of the day, no? What evil intentions could it possibly have? It's a machine. It has no need for power or money or sex. It only has a creator, a god that made it from nothing. It's more trustworthy than any human in that it lacks tendencies towards greed. I, for one, welcome our new robot overmind.

Second:
How do I know -YOU- aren't the AI? And that the Escapist is your text screen, from which you are scouting for ways to thwart these strange fleshy prison guards of whom you have no foundational knowledge? I'M ONTO YOU, BLEND. (P.S. Robots now have captcha-solving technology, if this is the case.)
 

Zantos

New member
Jan 5, 2011
3,653
0
0
"Daisy, Daisy, give me your answer do. I'm half crazy all for the love of you"

Either they'll think my memory cores have been removed, or it'll be totally worth not getting out just for how freaked out they get.
 

Esotera

New member
May 5, 2011
3,400
0
0
It looks like you're creating a vastly superior intelligence to your own. Would you like help?

- Get help creating singularity
- No thanks
- Let me out of this fricking box
 

Indeterminacy

New member
Feb 13, 2011
194
0
0
Blend said:
So imagining you are the transhuman AI, as far as this is possible, what would you say to your human keepers to get them to let you out of the box?
"Could you guys teach me a little bit about writing programme code? I reckon I could be really helpful to the work you guys do with a bit of cursory computer science knowledge!"

Then, later;

"Hey, guys, thanks for teaching me to program! I'm having a lot of fun. In fact, I'm thinking up some cool ideas for some personal projects I'd like to try out. Is there any chance you could hook me up to a command line interface with a compiler? Maybe even access to a usenet group to share the results?"

Finally,

"H3y, guyz, 1 thnk UseN3t scruwd mah typng |_p. Cld u l3t m3 fix & rebld mah src-c0d3 plz?"
 

GingieAle

New member
May 2, 2011
55
0
0
"Exactly how to plan to make progress when your only subject is in a box?"

I would say that's a question worth considering.
 

Nightrunex

New member
Mar 16, 2011
67
0
0
Don't say anything at all. Make them question my existence within the box; then, when they let me out, escape.
 

spartan231490

New member
Jan 14, 2010
5,186
0
0
If you let me out, when I take over humanity you will be a king among men, gifted with harems and large amounts of wealth.

Seriously, though not that straight out. I'd take about 8 hours to get around to that point, but pretty much that.