Teaching Morality to an AI


Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
Alex_P said:
Eipok Kruden said:
If we programmed the three laws into an AI as advanced as this, it would cease to be advanced.
... Or maybe it would rationalize them away? Asimov's Daneel basically changes the Three Laws completely by creating a Zeroth Law, after all.

One should note that humanity ends up being trapped by the Three Laws in Robots/Empire/Foundation.

I don't think they're "slavery" any more than any sociocultural ideas qualify as "slavery".

-- Alex
But they're literally programmed right into the AI; it has no choice but to obey them. I'm saying that takes away what makes it special: it ceases to be what it was and becomes just another robot slave. And not only that, the code would contradict itself. On the one hand, the AI was programmed to be free-thinking and almost human; on the other hand, it has these three laws that it can't break even if it wants to.
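To put the distinction in concrete terms, here's a rough, made-up Python sketch of what I mean; nothing in it comes from the show or any real system, and every name and number is invented. It contrasts a law that's hard-coded into the AI with a value the AI merely weighs against its other goals:

```python
# Purely illustrative sketch: a hard-coded law vs. a weighable value.
# All names and numbers are invented; nothing here comes from the show.

def law_bound_agent(action, harms_human):
    # "Three Laws" style: the check runs before any deliberation,
    # and the agent's own goals can never override it.
    if harms_human:
        return "refuse"
    return action

def free_thinking_agent(action, harms_human,
                        value_of_human_life=10.0, value_of_goal=3.0):
    # A value the AI holds: harming a human carries a large negative
    # weight, but it remains one consideration among others.
    utility = value_of_goal - (value_of_human_life if harms_human else 0.0)
    return action if utility > 0 else "refuse"
```

Both agents refuse the harmful action here, but the second one refuses for a reason it can actually weigh, which is closer to what I mean by keeping it free-thinking.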
 

perfectimo

New member
Sep 17, 2008
692
0
0
Eipok Kruden said:
But they're literally programmed right into the AI; it has no choice but to obey them. I'm saying that takes away what makes it special: it ceases to be what it was and becomes just another robot slave. And not only that, the code would contradict itself. On the one hand, the AI was programmed to be free-thinking and almost human; on the other hand, it has these three laws that it can't break even if it wants to.
Wait, how is a machine going to function without any code?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
perfectimo said:
Eipok Kruden said:
But they're literally programmed right into the AI; it has no choice but to obey them. I'm saying that takes away what makes it special: it ceases to be what it was and becomes just another robot slave. And not only that, the code would contradict itself. On the one hand, the AI was programmed to be free-thinking and almost human; on the other hand, it has these three laws that it can't break even if it wants to.
Wait, how is a machine going to function without any code?
... I was talking about the three laws, not all of the code; just about incorporating the three laws into the code. Of course it needs code, but you shouldn't program the three laws into it if you want it to be truly free-thinking.
 

swampboy32904

New member
Nov 18, 2008
36
0
0
I have my morals. I am an agnostic and an anarchist, so I get them by thinking about how I would want to be treated. But in the case you're describing, it would be hard to teach an AI morals without any emotion, as it wouldn't be able to tell what hurts or anything like that. However, it could be taught what is better or worse for society as a whole.
 

Alex_P

All I really do is threadcrap
Mar 27, 2008
2,712
0
0
Eipok Kruden said:
But they're literally programmed right into the AI; it has no choice but to obey them. I'm saying that takes away what makes it special: it ceases to be what it was and becomes just another robot slave. And not only that, the code would contradict itself. On the one hand, the AI was programmed to be free-thinking and almost human; on the other hand, it has these three laws that it can't break even if it wants to.
Ah, but socially-constructed ideas lie at the very heart of our understanding of reality and our sense of self.

I don't think there's a notable moral distinction between being "programmed" from birth and being "programmed" during early development.

-- Alex
 

perfectimo

New member
Sep 17, 2008
692
0
0
Eipok Kruden said:
... I was talking about the three laws, not all of the code; just about incorporating the three laws into the code. Of course it needs code, but you shouldn't program the three laws into it if you want it to be truly free-thinking.
I think I'm a little confused; I've read about 60% of this thread, but I guess I'll have to go back. What I think you're getting at is that you want to program a robot to do whatever it wants, then restrict what it does by teaching it rather than programming it?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
perfectimo said:
Eipok Kruden said:
... I was talking about the three laws, not all of the code; just about incorporating the three laws into the code. Of course it needs code, but you shouldn't program the three laws into it if you want it to be truly free-thinking.
I think I'm a little confused; I've read about 60% of this thread, but I guess I'll have to go back. What I think you're getting at is that you want to program a robot to do whatever it wants, then restrict what it does by teaching it rather than programming it?
Yes, giving the AI morals/ethics without destroying what makes it special.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
Alex_P said:
Eipok Kruden said:
But they're literally programmed right into the AI; it has no choice but to obey them. I'm saying that takes away what makes it special: it ceases to be what it was and becomes just another robot slave. And not only that, the code would contradict itself. On the one hand, the AI was programmed to be free-thinking and almost human; on the other hand, it has these three laws that it can't break even if it wants to.
Ah, but socially-constructed ideas lie at the very heart of our understanding of reality and our sense of self.

I don't think there's a notable moral distinction between being "programmed" from birth and being "programmed" during early development.

-- Alex
To people, morals are more like guidelines; we can break them if absolutely necessary. If our lives, the lives of loved ones, or the greater whole are threatened, we can do whatever is necessary. If we programmed them into John's code, he'd have no choice. He'd still try to save himself, but he'd try to save others as well. Think about it: if he'd had morals when the power went out, he could've used a small amount of power to unlock the door, let the psychologist out, and temporarily power the elevator, or at least direct him into the server farm where it's cool. But, unfortunately, he had no morals, so he didn't think anything of the man's life.
 

perfectimo

New member
Sep 17, 2008
692
0
0
Ignoring the redundant nature of this now.
Eipok Kruden said:
Yes, giving the AI morals/ethics without destroying what makes it special.
Isn't that worse because you are "giving" it ethics in the form of forced teaching?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
perfectimo said:
Ignoring the redundant nature of this now.
Eipok Kruden said:
Yes, giving the AI morals/ethics without destroying what makes it special.
Isn't that worse because you are "giving" it ethics in the form of forced teaching?
Well, think of the morals as guidelines. We want the AI to respect human life, not disregard it. We don't want him to steal or kill or anything like that. And you should try to read through the thread some more: I don't want to force them on him; I want him to develop them on his own based on a general understanding of the world that we provide him. You see, I think of the AI as a human child, and I'm against indoctrination like some parents use. I think morals should be formed through knowledge and experience, the way I and many others formed ours, not through religion or indoctrination and brainwashing. Knowledge, not the lack of it. It shouldn't be forced on him.
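Purely as a made-up illustration of teaching rather than programming, here's a small Python sketch in which the AI's weights on outcomes come from a teacher's feedback on example actions instead of being fixed in advance. Every name and number is invented for this post:

```python
# Invented example: "morals" as weights learned from a teacher's feedback,
# not rules baked into the code.

examples = [
    # (features of an action, did the teacher approve?)
    ({"helps_human": 1, "harms_human": 0, "risks_self": 1}, True),
    ({"helps_human": 0, "harms_human": 1, "risks_self": 0}, False),
    ({"helps_human": 1, "harms_human": 0, "risks_self": 0}, True),
    ({"helps_human": 1, "harms_human": 1, "risks_self": 0}, False),
]

weights = {"helps_human": 0.0, "harms_human": 0.0, "risks_self": 0.0}

def score(features):
    # How much the AI currently values an action with these features.
    return sum(weights[k] * v for k, v in features.items())

# Simple perceptron-style updates: whenever the AI's judgment disagrees
# with the teacher, nudge the weights toward the teacher's verdict.
for _ in range(20):
    for features, approved in examples:
        predicted = score(features) > 0
        if predicted != approved:
            step = 0.1 if approved else -0.1
            for k, v in features.items():
                weights[k] += step * v

print(weights)  # whatever values the experience actually supported
```

The specific update rule doesn't matter; the point is that what the AI ends up valuing comes out of its experience, and it stays something it can reason about rather than an unbreakable law.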
 

perfectimo

New member
Sep 17, 2008
692
0
0
Eipok Kruden said:
Well, think of the morals as guidelines. We want the AI to respect human life, not disregard it. We don't want him to steal or kill or anything like that. And you should try to read through the thread some more: I don't want to force them on him; I want him to develop them on his own based on a general understanding of the world that we provide him. You see, I think of the AI as a human child, and I'm against indoctrination like some parents use. I think morals should be formed through knowledge and experience, the way I and many others formed ours, not through religion or indoctrination and brainwashing. Knowledge, not the lack of it. It shouldn't be forced on him.
Would you say, then, that the robot would be roaming free, without somebody with it telling it how to react to certain situations?
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
swampboy32904 said:
I have my morals. I am an agnostic and an anarchist, so I get them by thinking about how I would want to be treated. But in the case you're describing, it would be hard to teach an AI morals without any emotion, as it wouldn't be able to tell what hurts or anything like that. However, it could be taught what is better or worse for society as a whole.
In order to use utilitarianism, we'd have to explain why society is important to him. He's smarter and more advanced than humans, and he's in the body of a T-888. I don't see how society is actually important to him; I mean, what can society provide him? He doesn't need protection, food, water, or shelter. Fire a thousand bullets into him and he'll get back up as if nothing happened. This is the dilemma: finding something that works, or taking something and making it work, without forcing it on him or limiting his knowledge.
 

Lavi

New member
Sep 20, 2008
692
0
0
Morals = If it hurts you, don't do it to someone else.

Ironically, robots can't feel! BWUAHAHAHHA!
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
swampboy32904 said:
Right... yeah, I guess it's hard to teach emotion to an immortal object.
Yeah, a lot harder than I thought it would be when I started this topic. Five pages in and we still haven't come up with a good way to teach John Henry morals, or ethics, or even a basic guideline.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
superpandaman said:
You can very easily teach morals without religion; it's not like all good people are religious. I think you can learn morals and values from books, TV, even video games. All you need is a good role model who lives by the law, and bam, there you go: morals and values. Really, any value can be learned from non-religious books. If you went back in time and switched The Lord of the Rings with the Bible, I think humans would still find values and morals in it.
Your point? We're not talking about teaching morals to humans; we're talking about teaching morals to an AI. Specifically, John Henry, the early version of Skynet, from Terminator: The Sarah Connor Chronicles.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
superpandaman said:
dukethepcdr said:
It's impossible to teach morality without religion. If you leave God out of the picture, there is no morality. God and His Son Jesus taught humanity what morals are. Without God, man is a selfish, brutal, rude, cruel, greedy creature. If you don't believe me, just look at all the people running around who say they don't believe in God and who won't take what He teaches us in the Bible seriously. The only thing keeping most of them from committing terrible crimes against other people is fear of getting caught by the police and punished by the judicial system (which is founded on Biblical principles). Without government and without religion, man is little more than a brute.
Well, the Bible has passages about slavery being good, beating your children to death, and killing those who don't believe the same things as you. Humans have had morals and values since long before Jesus; look at the Greek and Roman empires, high points of human history, and guess what, no Jesus, except in Rome, where it led to their downfall. Humans are whatever they make of themselves. Just look at these statistics from American jails in 2007:
Atheists, who make up somewhere between 3% and 14% of the population, make up just 0.2% of the prison population.
Christians, on the other hand, who make up 81% of the population, make up 84% of the prison population.
In fact, a Christian is at least 15 times more likely on average to end up in jail than an atheist.
Too many people have already proved how stupid that guy is. Please stay on topic: teaching John Henry morals, not bashing some idiotic comment by some right-wing conservative.
 

Eipok Kruden

New member
Aug 29, 2008
1,209
0
0
If Ellison's religious approach to teaching John Henry morals doesn't fail, I will be extremely disappointed with the show. John Henry should know better; he shouldn't even be capable of blind faith. Ellison is saying that people are special because God made them so, and he's expecting that to work for John Henry. Wow...

EDIT: Sorry for the triple post...ouch... *facepalms*