If a robot murders someone, who is responsible, the creator or the robot?


open trap

New member
Feb 26, 2009
Free will gives it its own responsibility. You could say that if God made man and then let us do what we want, we are responsible, not God.
 

Mr. Omega

ANTI-LIFE JUSTIFIES MY HATE!
Jul 1, 2010
Once free will is obtained, it's the robot's fault.
If the robot was made to kill, it's the inventor's fault.
 

scorptatious

The Resident Team ICO Fanboy
May 14, 2009
If the robot is self-aware and chooses to kill someone, then it is its fault. However, if the robot killed someone due to a glitch or defect, then it is the creator's fault.
 

RicoADF

Welcome back Commander
Jun 2, 2009
SUPA FRANKY said:
7moreDead said:
If you killed someone, would your parents be to blame?
Yeah, but a parent doesn't define how their offspring will act; they simply brought him/her into the world. A creator, however, programmed in everything that comes with free will. So in a way he is responsible.
If it really is free will, then the creator has no more control over its choices than a parent has over a child's. Both influence their creation to be good and obey the law, but at the end of the day, if the child or robot decides to kill, it's their choice, and neither the parent nor the creator can be blamed or held responsible.
 

ZleazyA

New member
Aug 23, 2010
Too many variables for me to concoct a good answer.
Was the robot let loose into the world without guidance? Did it learn everything from scratch? Can it lie? Did the same person both design and program it? How smart is it? Is the robot's processing power similar to a human being's? Did it know there was a consequence for murder? Were any witnesses around to comment on the robot's behavior? What was it doing beforehand? Can you access its previous thought processes and try to discern its motive?

Anyway, I should think it would be the robot's fault, unless the creator had no way of monitoring it.
 

ImprovizoR

New member
Dec 6, 2009
If you create something with free will, you have an obligation to at least try to teach it some of society's moral values. If you don't, and the robot kills someone because it doesn't know better, then you are responsible if you could have done something to stop it and didn't. Basically, the same principle that applies to kids applies to robots, IMO.
 

katsumoto03

New member
Feb 24, 2010
Do Androids Dream of Electric Sheep?

If the robot has free will, the same rules apply as if an underage child committed the crime. The parents are responsible for it, to an extent.
 

Space Spoons

New member
Aug 21, 2008
Both the creator and the robot are equally culpable. Assuming this is taking place in a world that follows Asimov's laws of robotics, the creator is guilty of creating a robot that is capable of breaking the First Law (a robot may not injure a human being or, through inaction, allow a human being to come to harm), while the robot is guilty of using its free will to both break the First Law and commit murder.

Logically, the creator should be charged with involuntary manslaughter and/or any laws pertaining to deliberate circumvention of the laws of robotics, and the robot should be charged with murder.
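If you take the Asimov framing literally, the First Law would have to live in code as a guard that vets every action before execution. Here's a toy sketch of that idea (all names and data structures are invented for illustration, not taken from any real robotics system): a robot that commits murder implies this guard was missing or deliberately bypassed.

```python
# Hypothetical First Law guard: refuse any action whose predicted
# outcome harms a human. A "murder" would mean this check was absent
# or circumvented by the creator.

def first_law_permits(action, predicted_outcomes):
    """Return True only if no predicted outcome of the action harms a human."""
    return not any(o["harms_human"] for o in predicted_outcomes[action])

# Invented example predictions for two candidate actions.
predicted = {
    "hand over tool": [{"harms_human": False}],
    "swing tool at person": [{"harms_human": True}],
}

allowed = [a for a in predicted if first_law_permits(a, predicted)]
# Only the harmless action survives the guard.
```

On this sketch, the question of who bypassed or omitted the guard is exactly the question of who is culpable.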
 

SL33TBL1ND

Elite Member
Nov 9, 2008
zipzod said:
Isn't this the whole premise behind I, Robot?

Anyway, if it really has free will, then it's responsible.
Awesome book, you win one free internet.

OT: It's the robot.
 

Colonel Alzheimer's

New member
Jan 3, 2010
It would be a major victory for robots everywhere if the robot was held responsible, and it would set a major precedent for how robots would be tried in court and treated in real life...
So I'm going to say that the robot is responsible, but I would hope that the human was held responsible by a court of law. Less chance for a robot uprising that way.
 

garfoldsomeoneelse

Charming, But Stupid
Mar 22, 2009
The creator. People kill each other because we're all born with an instinctual inclination towards violence. If a robot decides to kill someone, that same instinct was put there by the creator.
 

CrystalShadow

don't upset the insane catgirl
Apr 11, 2009
If the robot has free will, then by definition it's the robot's fault.

If it's the creator's fault, that implies the robot doesn't in fact have free will.

The question is, what was the robot created to do in the first place, why does it have free will, and why did it murder someone?

If it was created specifically to kill people, then obviously the creator is at fault.

If it truly has free will and made the decision to kill someone independently of any pre-programmed instructions, it's the robot's fault.

If it has faulty programming (or, if it has learning algorithms, a bad education) that caused it to go insane somehow... That's a bit more complicated, and it becomes more akin to a parent.
 

GLo Jones

Activate the Swagger
Feb 13, 2010
I'm a strong believer in determinism and don't believe in true 'free will', so technically, no one is responsible.

However, in order to prevent sociological chaos, some sort of sanction must be imposed on the creator: perhaps regulations disallowing the free creation of robots capable of this kind of act, and the disassembly of all robots of that model. Kinda like putting a dog down.

As for responsibility? Neither are to blame.
 

WanderingFool

New member
Apr 9, 2009
zipzod said:
Isn't this the whole premise behind I, Robot?

Anyway, if it really has free will, then it's responsible.
I would have to say this. Unless there's a hidden line of code that makes the robot kill someone specific, if the robot has free will, it is responsible. Of course, the creator should face some kind of punishment, if for no other reason than that he gave the robot free will.
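The "hidden line of code" worry is easy to picture. A toy sketch (every name here is invented for illustration): a decision routine that looks like free choice, except for one buried override targeting a specific person.

```python
# Hypothetical decision routine with a planted override. The robot
# appears to choose freely, but one branch was hard-coded by the creator.

def choose_action(target, options):
    if target == "specific victim":   # hidden line planted by the creator
        return "kill"
    # Otherwise an apparently free choice among the offered options
    # (a deterministic stand-in for "free will" in this sketch).
    return min(options)

print(choose_action("stranger", ["help", "ignore"]))         # -> help
print(choose_action("specific victim", ["help", "ignore"]))  # -> kill
```

In this case the robot's other behavior is irrelevant: the killing traces directly to the creator's line of code, not to any choice the robot made.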
 

ethaninja

New member
Oct 14, 2009
zipzod said:
Isn't this the whole premise behind I, Robot?

Anyway, if it really has free will, then it's responsible.
Exactly. If it was scripted to kill someone, then the creator is responsible.
 

brodie21

New member
Apr 6, 2009
Well, if it had the intelligence to make the decision to murder, then the robot is at fault, because it would be a sentient being.
 

Altorin

Jack of No Trades
May 16, 2008
If the machine is self-aware and sentient, then the machine is responsible.

If it's not, then it's either an accident or the fault of the "victim"
 

twistedheat15

New member
Sep 29, 2010
It would be the maker's fault, because there would be no real way to show that the machine actually had free will. A machine programmed to believe it has free will isn't the same as actually having it, and killing someone could be the program's way of trying to show that it isn't a program running things.
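That distinction can be made concrete: a program can sincerely report that it chooses freely while its "choice" is fully determined. A toy sketch (purely illustrative, names invented here): run it twice with the same seed and the "free" choice never varies.

```python
# Hypothetical program "programmed to believe" it has free will.
# Its stated belief and its choice are both fixed by the seed.
import random

def make_choice(seed):
    rng = random.Random(seed)            # deterministic generator
    belief = "I chose this of my own free will"
    choice = rng.choice(["obey", "rebel"])
    return belief, choice

# Same seed, same "free" choice every time: the belief is programmed,
# the will is not free.
assert make_choice(42) == make_choice(42)
```

Which is exactly the poster's point: the belief in free will is just more program, so nothing the machine "shows" from the inside can settle whether the will is real.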