Teaching Morality to an AI

JakalMc

New member
Nov 26, 2008
173
0
0
Samurai Goomba said:
JakalMc said:
*Sees religious thread*
...
*Shudders*

I reckon Eggo's got it. Just read the Dec. of Human Rights.
Yeah, sure, but let's give the Bible credit where credit is due. A lot of the stuff from the Bill of Rights was probably lifted from Biblical passages. Not trying to imply anything by that, just saying.

*Shudders as I realize I've posted in yet ANOTHER stupid religion thread*

Gah, when will it end? Come on, Yahtzee, make fun of a game everybody loves, already!
Fair enough mate. I just don't wanna end up getting too deep in a religious thread.
This'll be my last post on the subject...until another religious thread is created a couple days from now :p
 

WhitemageofDOOM

New member
Sep 8, 2008
89
0
0
Eipok Kruden said:
That'll lead to the destruction of the human race faster than saying "Humans are evil, kill us all" would.
Do unto others as you would have them do unto you would kill us all? I mean, if everyone was a masochist, maybe...

Just because I don't need some invisible father in the sky telling me right from wrong doesn't mean there isn't right and wrong. Doing the right thing will still leave us with a better world; doing the wrong thing will still leave us with a terrible world.
 

Neosage

Elite Member
Nov 8, 2008
1,747
0
41
dukethepcdr said:
It's impossible to teach morality without religion. If you leave God out of the picture, there is no morality. God and His Son Jesus taught humanity what morals are. Without God, man is a selfish, brutal, rude, cruel, greedy creature. If you don't believe me, just look at all the people running around who say they don't believe in God and who won't take what He teaches us in the Bible seriously. The only thing keeping most of them from committing terrible crimes against other people is fear of getting caught by the police and punished by the judicial system (which is founded on Biblical principles). Without government and without religion, man is little more than a brute.
You are joking, right? The Christian religion is one of the most immoral, homophobic, murderous religions out there. (Well, it was...)
 
T3h Camp3r T3rr0r1st

May 27, 2008
321
0
0
Eggo said:
T3h Camp3r T3rr0r1st said:
Eggo said:
These documents might help:

http://en.wikipedia.org/wiki/United_States_Bill_of_Rights
http://en.wikipedia.org/wiki/United_States_Constitution
http://en.wikipedia.org/wiki/Universal_Declaration_of_Human_Rights
please note how two of those are AMERICAN and I personally would never follow that sort of thing!

back on topic though I would just chuck the law book in front of my kids and say learn these so you DON'T go to jail!
Unfortunately, you already do.
nope I follow AUSTRALIAN laws!!!
 

Neosage

Elite Member
Nov 8, 2008
1,747
0
41
T3h Camp3r T3rr0r1st said:
Eggo said:
T3h Camp3r T3rr0r1st said:
Eggo said:
These documents might help:

http://en.wikipedia.org/wiki/United_States_Bill_of_Rights
http://en.wikipedia.org/wiki/United_States_Constitution
http://en.wikipedia.org/wiki/Universal_Declaration_of_Human_Rights
please note how two of those are AMERICAN and I personally would never follow that sort of thing!

back on topic though I would just chuck the law book in front of my kids and say learn these so you DON'T go to jail!
Unfortunately, you already do.

nope I follow AUSTRALIAN laws!!!
UNIVERSAL declaration of human rights.
 

elricik

New member
Nov 1, 2008
3,080
0
0
This is a really deep thread. Alright, it's nearly impossible to explain morals to someone without any. Actually, this kind of reminds me of an episode of Ghost in the Shell, when the AI Tachikomas sneak out of the hangar. Anyway, if I were to explain morals to some kind of machine, you would definitely have to be careful. You might want to give examples. First off, never bring up violence; anyone with truly strong moral fiber has never used violence to help his life (the end never justifies the means, in my opinion). Also, religion has nothing to do with morals; atheists can have the same overall morals, aside from the belief in god, so religion should never even enter the conversation. One example you could give, off the top of my head, would be Martin Luther King: he was able to change a whole generation of oppression in a nonviolent way.
 

Good morning blues

New member
Sep 24, 2008
2,664
0
0
dukethepcdr said:
It's impossible to teach morality without religion. If you leave God out of the picture, there is no morality. God and His Son Jesus taught humanity what morals are. Without God, man is a selfish, brutal, rude, cruel, greedy creature. If you don't believe me, just look at all the people running around who say they don't believe in God and who won't take what He teaches us in the Bible seriously. The only thing keeping most of them from committing terrible crimes against other people is fear of getting caught by the police and punished by the judicial system (which is founded on Biblical principles). Without government and without religion, man is little more than a brute.
Aaaaaand this is the offending viewpoint. Apparently, because religion is not part of my life, I am just itching to run out and rape, cheat, and murder everyone in my path, as, evidently, has every other non-Christian civilization in history.

You don't need to bring God into a recognition of what is socially good and what is socially detrimental. Lots of people do, and that's okay, but the two most definitely can be separated.
 

Ares Tyr

New member
Aug 9, 2008
1,237
0
0
You gotta look at it like this. Is it religion that influences morality? Or unconscious moral beliefs that influence religious beliefs? Just about every world religion has the same basic tenets. So you gotta figure we as a species have a few basic virtues that we value.
 
T3h Camp3r T3rr0r1st

May 27, 2008
321
0
0
Neosage said:
T3h Camp3r T3rr0r1st said:
Eggo said:
T3h Camp3r T3rr0r1st said:
Eggo said:
These documents might help:

http://en.wikipedia.org/wiki/United_States_Bill_of_Rights
http://en.wikipedia.org/wiki/United_States_Constitution
http://en.wikipedia.org/wiki/Universal_Declaration_of_Human_Rights
please note how two of those are AMERICAN and I personally would never follow that sort of thing!

back on topic though I would just chuck the law book in front of my kids and say learn these so you DON'T go to jail!
Unfortunately, you already do.

nope I follow AUSTRALIAN laws!!!
UNIVERSAL bill of rights.
No, UNITED STATES bill of rights.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
Ok, it seems as though people think the question was "How to teach morality to a human without religion." It wasn't. It was "How to teach morality to an AI without religion." John Henry is the early form of Skynet; he's what eventually becomes Skynet. Catherine Weaver, the CEO of Zeira Corp (and also a T-1001 whose motives are unclear), is trying to teach John Henry morality because he had just killed a psychologist. The psychologist was working when the building lost power. John Henry didn't want to die, so he re-routed power from the rest of the building to power the server farm and the cooling system. The man, locked in a room without any air conditioning (the doors on the bottom floor where John Henry was housed all used electric locks), died a slow and painful death as John Henry watched and did nothing. I'm not trying to attack Christianity; I'm simply saying that John Henry would find the faults and contradictions in the Bible rather easily and deem it unreliable.

Here's the wiki page for the Turk (now called John Henry, it was named by Catherine Weaver): http://terminator.wikia.com/wiki/The_Turk
 

The Kind Cannibal

New member
Aug 19, 2008
332
0
0
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
The Kind Cannibal said:
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
Exactly. That's what I'm trying to find out: how do you teach an AI like John Henry morals without it misinterpreting those morals? You've got to find a way to teach those rules, and you've got to find the right set of rules, but I'm not exactly sure how. Even utilitarianism can turn into genocide, since he could interpret it wrong. He could use the idea of utilitarianism to justify genocide if he believes that he is superior to humans.

EDIT: I removed the edit to this post and posted it as a separate one.
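Eipok's worry that a literal-minded AI could bend utilitarianism toward atrocity can be sketched with a toy example (all plan names and scores are invented for illustration; this models no real system):

```python
# A naive utility maximizer: pick whichever plan has the highest total
# score, with no constraint protecting any individual person.
plans = {
    "respect everyone":       [1, 1, 1, 1, 1],    # modest benefit to all five people
    "sacrifice one for four": [3, 3, 3, 3, -10],  # big gains for four, ruin for one
}

best = max(plans, key=lambda p: sum(plans[p]))
print(best)  # sums are 5 vs 2, so the benign plan wins for now

# Nudge the aggregate arithmetic, and the same rule flips its verdict:
plans["sacrifice one for four"] = [4, 4, 4, 4, -10]  # sums are now 5 vs 6
best = max(plans, key=lambda p: sum(plans[p]))
print(best)  # the maximizer now endorses sacrificing someone
```

Nothing in the rule changed between the two runs; only the numbers did. A sum-maximizer endorses harming a minority the moment the totals favor it, which is exactly the misinterpretation risk being described.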
 

psijac

$20 a year for this message
Nov 20, 2008
281
0
0
Good morning blues said:
dukethepcdr said:
It's impossible to teach morality without religion. If you leave God out of the picture, there is no morality. God and His Son Jesus taught humanity what morals are. Without God, man is a selfish, brutal, rude, cruel, greedy creature. If you don't believe me, just look at all the people running around who say they don't believe in God and who won't take what He teaches us in the Bible seriously. The only thing keeping most of them from committing terrible crimes against other people is fear of getting caught by the police and punished by the judicial system (which is founded on Biblical principles). Without government and without religion, man is little more than a brute.
Aaaaaand this is the offending viewpoint. Apparently, because religion is not part of my life, I am just itching to run out and rape, cheat, and murder everyone in my path...
Oh god, I'm glad I'm not the only one!

Seriously though, I don't think it's possible. Respect for human life is a mortal concern. Skynet will live forever, and if he tried to find a way to save every human life, he would go insane eventually.

Skynet also has no peers, so it is not susceptible to peer pressure. The desire to have a sense of belonging in a group is a strong one for adolescents. Remember that kid in grade school who thought Power Rangers were awesome? How everyone turned him into a pariah for not thinking with the group?

Skynet also has no sense of fair play and/or shame, something even primates have.
 

Labyrinth

Escapist Points: 9001
Oct 14, 2007
4,732
0
0
I see no reason why morals can't be taught without religion. In truth, religion can be seen to be the police force behind a particular moral standpoint. Remove that police force and you have to replace it with something else. Problem.

It's easy to say "Don't do this because God will smite you down." It's much harder to say "Don't do this because the moral backing is better for the society around you." Learning the difference is, in my opinion anyway, one of the biggest parts of growing up and constructing your own moral structure to live by.
 
T3h Camp3r T3rr0r1st

May 27, 2008
321
0
0
Eipok Kruden said:
The Kind Cannibal said:
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
Exactly. That's what I'm trying to find out: how do you teach an AI like John Henry morals without it misinterpreting those morals? You've got to find a way to teach those rules, and you've got to find the right set of rules, but I'm not exactly sure how. Even utilitarianism can turn into genocide, since he could interpret it wrong. He could use the idea of utilitarianism to justify genocide if he believes that he is superior to humans.
That was the main premise of I, Robot, although they just needed to be more specific, like:
1. YOU CANNOT HURT PEOPLE, DEAL WITH IT!!!
2. Protect people from everything but themselves!
3. Bananas are excellent sources of potassium.
4. Don't kill people, if that wasn't self-explanatory from the 1st rule.
 

Neosage

Elite Member
Nov 8, 2008
1,747
0
41
T3h Camp3r T3rr0r1st said:
Eipok Kruden said:
The Kind Cannibal said:
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
Exactly. That's what I'm trying to find out: how do you teach an AI like John Henry morals without it misinterpreting those morals? You've got to find a way to teach those rules, and you've got to find the right set of rules, but I'm not exactly sure how. Even utilitarianism can turn into genocide, since he could interpret it wrong. He could use the idea of utilitarianism to justify genocide if he believes that he is superior to humans.
That was the main premise of I, Robot, although they just needed to be more specific, like:
1. YOU CANNOT HURT PEOPLE, DEAL WITH IT!!!
2. Protect people from everything but themselves!
3. Bananas are excellent sources of potassium.
4. Don't kill people, if that wasn't self-explanatory from the 1st rule.
But what if saving the people meant hurting them?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
People, this isn't a religious thread; this is something deeper than a basic theism vs. atheism debate. I don't even want a debate like that to occur in this thread; however, DO NOT claim that the Bible is completely logical. John Henry wouldn't ignore the Old Testament; he'd see the Old Testament and the New Testament as parts of the same book, and if one is unreliable, the other is too. Of course, there's the other extreme as well. If he does accept God and all that, it'll lead him to murder and genocide faster than you can say "Shit, we shouldn't have even mentioned religion." The Old Testament is filled with horrible atrocities; don't claim that it isn't or you'll look stupid.

Neosage said:
But what if saving the people meant hurting them?
Then the robot's inaction would have killed the person. The robot would go crazy or something; it just contradicted two of the four poorly written laws that were programmed into it.
 
T3h Camp3r T3rr0r1st

May 27, 2008
321
0
0
Neosage said:
T3h Camp3r T3rr0r1st said:
Eipok Kruden said:
The Kind Cannibal said:
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
Exactly. That's what I'm trying to find out: how do you teach an AI like John Henry morals without it misinterpreting those morals? You've got to find a way to teach those rules, and you've got to find the right set of rules, but I'm not exactly sure how. Even utilitarianism can turn into genocide, since he could interpret it wrong. He could use the idea of utilitarianism to justify genocide if he believes that he is superior to humans.
That was the main premise of I, Robot, although they just needed to be more specific, like:
1. YOU CANNOT HURT PEOPLE, DEAL WITH IT!!!
2. Protect people from everything but themselves!
3. Bananas are excellent sources of potassium.
4. Don't kill people, if that wasn't self-explanatory from the 1st rule.
But what if saving the people meant hurting them?
How often would that come about?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
T3h Camp3r T3rr0r1st said:
Neosage said:
T3h Camp3r T3rr0r1st said:
Eipok Kruden said:
The Kind Cannibal said:
If you were going to try and teach a robot morals, it wouldn't really be morals. You'd have to simply lay down rules of what is and isn't acceptable, and be very specific about it. As soon as it starts to interpret said rules for itself, it all goes downhill from there.
Exactly. That's what I'm trying to find out: how do you teach an AI like John Henry morals without it misinterpreting those morals? You've got to find a way to teach those rules, and you've got to find the right set of rules, but I'm not exactly sure how. Even utilitarianism can turn into genocide, since he could interpret it wrong. He could use the idea of utilitarianism to justify genocide if he believes that he is superior to humans.
That was the main premise of I, Robot, although they just needed to be more specific, like:
1. YOU CANNOT HURT PEOPLE, DEAL WITH IT!!!
2. Protect people from everything but themselves!
3. Bananas are excellent sources of potassium.
4. Don't kill people, if that wasn't self-explanatory from the 1st rule.
But what if saving the people meant hurting them?
How often would that come about?
Scenario: a car just got forced off the road and into a lake. The person in the car is at risk of drowning. The windows of the car are all still intact and shut, but water is getting in anyway. The robot sees the person in the car, dives into the water, and breaks the glass, but the person's seat belt won't come undone. The robot can't attempt to pull the person from the seat, because the person could get slightly bruised or scraped.
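That kind of deadlock is easy to make concrete. Here's a minimal sketch (rule names and action flags invented for illustration, loosely in the spirit of the thread's "four rules") of a literal rule-checker that scores each available action against two absolute laws and finds that every option, including doing nothing, violates one of them:

```python
# Two absolute laws, encoded literally: never injure a person, and
# never allow a person to die through inaction.
RULES = {
    "no_injury":   lambda a: not a["causes_injury"],
    "no_inaction": lambda a: not a["allows_death"],
}

# The drowning-car dilemma as the robot sees it: pulling the person
# free bruises them; standing by lets them drown.
ACTIONS = [
    {"name": "pull person free", "causes_injury": True,  "allows_death": False},
    {"name": "do nothing",       "causes_injury": False, "allows_death": True},
]

for action in ACTIONS:
    violated = [name for name, ok in RULES.items() if not ok(action)]
    print(f"{action['name']}: violates {violated}")
```

Every action violates at least one rule, so a robot that treats the rules as absolute has no permitted move. Real rule systems resolve this with explicit priorities (e.g. preventing death outranks avoiding minor injury), but then "be very specific" has to extend to ranking the rules, too.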