A human and an AI are facing death: who do you save?


happyninja42

Elite Member
Legacy
May 13, 2010
8,577
2,990
118
KyuubiNoKitsune-Hime said:
If the AI does not obey the three laws of robotics, then I will let it perish and save the human. The simple fact is that, in this case, if she ever decided humans were a threat, she could theoretically become a danger to the human race. That is not something I'd like to be responsible for.
The exact same outcome could happen with the human, you know. There isn't anything saying that the person, who isn't under any programming obligation to be good to humanity, couldn't be the greater evil in your example. So that's not really a good reason to save the human. xD For all you know, he's the next evil tyrant bent on world domination.
 

KyuubiNoKitsune-Hime

Lolita Style, The Best Style!
Jan 12, 2010
2,151
0
0
Happyninja42 said:
KyuubiNoKitsune-Hime said:
If the AI does not obey the three laws of robotics, then I will let it perish and save the human. The simple fact is that, in this case, if she ever decided humans were a threat, she could theoretically become a danger to the human race. That is not something I'd like to be responsible for.
The exact same outcome could happen with the human, you know. There isn't anything saying that the person, who isn't under any programming obligation to be good to humanity, couldn't be the greater evil in your example. So that's not really a good reason to save the human. xD For all you know, he's the next evil tyrant bent on world domination.
Yeah, but evil tyrants have capabilities and thought processes we understand. The AI may not be the same; it may be able to create and command an entire army of machines. The human could possibly do the same, but not to the same degree. While the AI may not be able to upload itself in an emergency, it could at some later point, and take control of the internet. With a human tyrant, at least we know what we're dealing with, and have some assurance of beating them, or at worst overthrowing them at a later point.

Edit: A further point is this: the AI will likely not need a planet we consider habitable, so one of its options would be to make the earth too hostile for organic life, thus exterminating humanity. That's something a human tyrant of even the most insane variety would never do, because the tyrant would be concerned with self-preservation.
 

happyninja42

Elite Member
Legacy
May 13, 2010
8,577
2,990
118
KyuubiNoKitsune-Hime said:
Happyninja42 said:
KyuubiNoKitsune-Hime said:
If the AI does not obey the three laws of robotics, then I will let it perish and save the human. The simple fact is that, in this case, if she ever decided humans were a threat, she could theoretically become a danger to the human race. That is not something I'd like to be responsible for.
The exact same outcome could happen with the human, you know. There isn't anything saying that the person, who isn't under any programming obligation to be good to humanity, couldn't be the greater evil in your example. So that's not really a good reason to save the human. xD For all you know, he's the next evil tyrant bent on world domination.
Yeah, but evil tyrants have capabilities and thought processes we understand. The AI may not be the same; it may be able to create and command an entire army of machines. The human could possibly do the same, but not to the same degree. While the AI may not be able to upload itself in an emergency, it could at some later point, and take control of the internet. With a human tyrant, at least we know what we're dealing with, and have some assurance of beating them, or at worst overthrowing them at a later point.

Edit: A further point is this: the AI will likely not need a planet we consider habitable, so one of its options would be to make the earth too hostile for organic life, thus exterminating humanity. That's something a human tyrant of even the most insane variety would never do, because the tyrant would be concerned with self-preservation.
That's a lot of assumptions about the capabilities of this AI to support your conclusion, but that's fine; this entire thing is a wild thought experiment anyway. I still say rocket jets, laser beams, and a grapple hook are the true answer to this question.
 

FPLOON

Your #1 Source for the Dino Porn
Jul 10, 2013
12,531
0
0
madwarper said:
The Electronic Afterlife
Oh no! Not the EA!! I don't think my feels could take thinking about the robot equivalent of heaven!

OT: They were dead to me the moment they both started having an affair behind my back... They thought they could get away with it, but they forgot that ever since I won the human lottery of the "dolla-dolla-billz" amount, I placed cameras all around our lovable home... And, as I monologue to them while their pending deaths approach, I forget to save myself and, in the cruelest irony since the controversial Row V Bot court trial, I die alongside them...

The last memory that flows through my mind is "Why didn't they invite me and make it a threesome?"
 

Rowan93

New member
Aug 25, 2011
485
0
0
Naturally, any being as sentient and sapient as a human is of equal moral worth, and since you haven't specified anything about the nature of my friendship with each individual, the issue is moot on that front.

The determining factor, then, is how much each can benefit other people in the world if they survive, which probably goes to the AI, but that depends on a lot of things left unstated: does "unique" mean this particular individual is unique, or that this is the only AI on Earth? Is she smarter than human, and if so, by how much? It's possible that if she survives she can take over the world and vastly improve on the shitshow human politicians have made of running the place. Or it's possible she's the science-fiction-standard "AIs are just autism-spectrum humans made of metal" kind of deal, and won't change the world any more than the human.

Although I guess if she is a standard science fiction robot buddy who's basically an autistic human, the choice is between saving a human friend who'll die in less than a century anyway and saving a robot friend who's functionally immortal outside of pyroclastic flows, so it still goes easily to the robot.
 

Johnny Impact

New member
Aug 6, 2008
1,528
0
0
Depends.

In the moment, I think I would just instinctively reach for the human.

If there were time to think, I'd save the 'Data from Star Trek: TNG' form of AI before the human. The AI is totally unique, the first and only one of its kind, a special result achieved only once by a now-dead supergenius, which no one has since understood or duplicated. The universe may never know another. Humans are a dime a dozen. Save an endangered species first.
 

Rowan93

New member
Aug 25, 2011
485
0
0
KyuubiNoKitsune-Hime said:
If the AI obeys the three laws of robotics, then she'll help me save the human, and save herself if she has the time.

If the AI does not obey the three laws of robotics, then I will let it perish and save the human. The simple fact is that, in this case, if she ever decided humans were a threat, she could theoretically become a danger to the human race. That is not something I'd like to be responsible for.

Either way the AI will perish: either it accepted my choice and saved the human, or saving the human ultimately protected the human race.
The robot and human are described as being "trapped" and needing your help; one would think this rules out either of them saving the other or themselves. And if she's described as your friend, that presumably rules out her being the sort of AI you'd want exterminated in case she poses a threat to the human race.
 

PlayerDos

New member
Nov 10, 2013
63
0
0
The human, because I'm not an edgy 15-year-old.

There's no way the AI would be unique; the second someone invented it, there'd be special hats you could download through an AppleAI store that change the AI's appearance or some shit, not to mention DLC for different voices and alarm clock settings.
 

DrOswald

New member
Apr 22, 2011
1,443
0
0
Assuming the AI is not some sort of unique prototype, I would choose the one I felt was more needed (for example, if the AI were a mother and the human just some surfer bum, save the AI). If that were a tie, I would choose whoever I liked more. If that tied too, it would probably come down to random chance. Probably whoever I was closest to.
 

Addendum_Forthcoming

Queen of the Edit
Feb 4, 2009
3,647
0
0
Neither of them... if death is certain, I would be terrified of the implications of said certainty, curl up in the foetal position, and start rocking myself slowly, sobbing, as the inevitable comes to take both of them.
 

Starik20X6

New member
Oct 28, 2009
1,685
0
0
If the AI has true sentience, then I suppose it really just becomes a question of which of two friends I would save. Determining the worth of a life should not be influenced by the physical vessel that carries it, whether that vessel is organic or manufactured. The robot and the human are both made of atoms after all.

balladbird said:
kris40k said:
Human.

An AI that can't shut off its pain sensors is flawed, and, being unable to replicate itself, it fails to meet a defining requirement of life.
I dunno... I would argue that the AI having pain sensors in the first place was a pretty big flaw!
A self-aware AI inhabiting a physical body would absolutely require pain sensors. Pain is the body's way of knowing something is wrong, or that you're approaching/exceeding its limitations. People born without the ability to feel pain can get pretty messed up by it [http://en.wikipedia.org/wiki/Congenital_insensitivity_to_pain], so it stands to reason a self-aware robot without such sensors would be breaking itself constantly, as it would have no idea of its own limits or the damage it was incurring by exceeding them. Granted, such feelings may not be thought of by you or me as 'pain' in the traditional sense, but then we get into the discussion of qualia and all that heavy philosophical stuff...
 

J Tyran

New member
Dec 15, 2011
2,407
0
0
I'm human, so I'd stand there looking shocked and panicking, and would probably die myself.