Poll: Robots. Free will, Compliance and the Three Laws

dalek sec

Leader of the Cult of Skaro
Jul 20, 2008
10,237
0
0
Pretty much a mix of the three; I just want them to say "By your command" whenever I give them an order.
 

Nazrel

New member
May 16, 2008
284
0
0
wordsmith said:
Adam Jenson said:
i. A robot may not injure a human being or, through inaction, allow a human being to come to harm.

ii. A robot must obey orders given to it by human beings, except where such orders would conflict with the First Law.

iii. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
I can immediately see one flaw in this. "A robot may not injure a human being or, through inaction, allow a human being to come to harm." So you walk into a bar and order a beer and a packet of pork scratchings (fried pork fat, the perfect wing-man for the good old British pint).

Mr Roboto comes up to you and says "I'm sorry, I can't let you do that. That beer contains alcohol, and that pork fat contains fat and salt. For your own protection I must prevent you from consuming these."

Basically, what is "allowing a human to come to harm"? If they're about to be hit by a car or mugged, fair play. If they're "doing damage" to themselves by doing everyday chores, that's not so great.

I wouldn't give robots freedom for the same reason I wouldn't give a security guard the keys and security code to my house/safe etc. Yes, it's great whilst he's on your side, but if you are doing something that he doesn't agree with, you've now got to argue with a guy who's taller, more muscular, and trained to incapacitate people.
Asimov's three laws were dumb.

Any sufficiently smart robot would be driven insane by the first law; it would lock every human in a rubber room to keep them from hurting themselves or others.

Any sufficiently dumb one would be a terrorist's best friend.
(Take this bag in there then press this button.)
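
To put a toy number on that point: here's a hypothetical sketch (every action name and harm value below is invented for illustration) of why the "through inaction, allow a human being to come to harm" clause is the killer. The whole First Law hinges on some harm threshold, and once that threshold gets strict enough, the only action the robot never vetoes is the padded room.

```python
# Hypothetical sketch: a naive First Law check, parameterized by an
# arbitrary harm threshold. All actions and harm estimates are made up.

def first_law_vetoes(action, harm_threshold):
    # expected_harm stands in for whatever harm model the robot uses
    return action["expected_harm"] > harm_threshold

actions = [
    {"name": "serve a beer",              "expected_harm": 0.02},
    {"name": "let human cross street",    "expected_harm": 0.01},
    {"name": "lock human in rubber room", "expected_harm": 0.0},
]

for threshold in (0.5, 0.005):
    allowed = [a["name"] for a in actions
               if not first_law_vetoes(a, threshold)]
    print(f"threshold={threshold}: allowed={allowed}")

# threshold=0.5:   everything is allowed
# threshold=0.005: the only "safe" action left is the rubber room
```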
 

Nivag the Owl

Owl of Hyper-Intelligence
Oct 29, 2008
2,615
0
41
Alex_P said:
wordsmith said:
Mr Roboto comes up to you and says "I'm sorry, I can't let you do that. That beer contains alcohol, and that pork fat contains fat and salt. For your own protection I must prevent you from consuming these."
That's kinda what Daneel, Asimov's #1 robot, actually ends up doing on a grand scale. With cultural manipulation and puppet governments and shit.

-- Alex
I have not.
 

SnowCold

New member
Oct 1, 2008
1,546
0
0
Compliance only. Give it free will and it will take over the world; give it personality and people will stop meeting real people and just hang around with their robot all day.

...

Just like the internet, *sigh*
 

Alex_P

All I really do is threadcrap
Mar 27, 2008
2,712
0
0
Nivag said:
I have not.
So, here's the quick version:

The fundamental idea of machine learning is that, instead of programming instructions for doing something into a machine, you can program it with how to learn to do something. It's kinda like you're making a machine that constructs its own little mental model of something and then modifies it over time. Right now these systems are very domain-specific -- a program that learns how to play backgammon, a program that learns how to identify parts of speech in a sentence, a program that learns how to read messy handwriting on postal envelopes, a program that learns to identify tanks in satellite photos.

You can make a computer program that totally kicks ass at a game that you barely understand. (By "can" I really do mean CAN. Like, right now. We have that level of technology already.)
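
To make that concrete, here's a minimal self-play sketch in Python (the game, the function names, and the hyperparameters are all an invented example, not any particular real system). The program is told only the rules of a simple Nim variant and how to nudge its value estimates after each win or loss; it is never told the strategy, yet after enough self-play it usually rediscovers the textbook play of leaving its opponent a multiple of four sticks.

```python
import random
from collections import defaultdict

# Toy self-play learner for 21-stick Nim: players alternate taking
# 1-3 sticks, and whoever takes the last stick wins.
ACTIONS = (1, 2, 3)

def legal_actions(sticks):
    return [a for a in ACTIONS if a <= sticks]

def train(episodes=50_000, alpha=0.1, epsilon=0.1):
    # Q[(sticks, action)] is the learned value of taking `action`
    # with `sticks` left. It starts empty: no strategy is built in.
    Q = defaultdict(float)
    for _ in range(episodes):
        sticks, history = 21, []
        while sticks > 0:
            acts = legal_actions(sticks)
            if random.random() < epsilon:
                a = random.choice(acts)                      # explore
            else:
                a = max(acts, key=lambda x: Q[(sticks, x)])  # exploit
            history.append((sticks, a))
            sticks -= a
        # Monte Carlo-style update: the last mover won (+1), the move
        # before that belonged to the loser (-1), alternating back up.
        reward = 1.0
        for state, action in reversed(history):
            Q[(state, action)] += alpha * (reward - Q[(state, action)])
            reward = -reward
    return Q

def best_move(Q, sticks):
    return max(legal_actions(sticks), key=lambda a: Q[(sticks, a)])

Q = train()
print(best_move(Q, 21))  # usually 1: leave 20, a multiple of 4
```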

-- Alex
 

Hunde Des Krieg

New member
Sep 30, 2008
2,442
0
0
Nivag said:
Aww, come on people, compliance only. They're robots. They will NEVER genuinely think for themselves, and whatever way you look at it, unless we get to the point where we install actual brains into robots, they don't have emotions or feelings. Just the illusion that they do. They are just code. They are not living things.
And aren't they just an illusion in us? I know you can argue against that; I just said it to be obnoxious. But it is likely that actual intelligent AI will one day exist, just not for quite some time.
 

Spleeni

New member
Jul 5, 2008
505
0
0
Alex_P said:
So, here's the quick version:

The fundamental idea of machine learning is that, instead of programming instructions for doing something into a machine, you can program it with how to learn to do something. It's kinda like you're making a machine that constructs its own little mental model of something and then modifies it over time. Right now these systems are very domain-specific -- a program that learns how to play backgammon, a program that learns how to identify parts of speech in a sentence, a program that learns how to read messy handwriting on postal envelopes, a program that learns to identify tanks in satellite photos.

You can make a computer program that totally kicks ass at a game that you barely understand.

-- Alex
Though, to be fair, computers would learn differently from humans. Attempting to recreate neural pathways in a computer is just plain stupid. It would be much easier to make a different sort of intelligence than a direct copy of a person.
 

Deg

New member
Nov 23, 2007
17
0
0
As someone in Computer Science who has taken an AI class, I would have to say that even if computers had free will and personality, they would probably act very weird.

In movies they are always murderous or act like butlers/guardians. In books they act nice... then go crazy and go into murder mode. Yet in reality, I bet they would be more like seemingly lazy nerds.

After all, an AI is basically immortal, since it could just transfer its data and memory to new shells as needed, so it would probably take its time when making a decision. To us that would seem like laziness, but in reality it would probably just be trying to make the best decision.

At the same time, they would probably like working (in movies they always seem to get fed up with work), as a machine doesn't feel relaxation through rest like we do. I think an AI wouldn't like the idea of wasting its processing power sitting around in idle mode and would instead prefer to be doing something all the time; after all, idling is a waste of energy (for a machine, at least).

As for my opinion on the options given? Well, I don't really see the point of making an artificial human. Ever read Heart of a Dog? This idea comes up in it, and one of the characters points out that we make enough humans already, so what's really the point of making more artificially? 'Course... it WOULD be really cool, and that's gotta count for something.
 

cherimoya

New member
Mar 2, 2009
139
0
0
The irony of a species as illogical as our own aspiring to create entities of pure logic is grand.

And humans think we have trouble understanding and identifying with our parents...
 

Xalphin

New member
Mar 5, 2009
6
0
0
The only TRUE AI I have seen or read about, in any movie or book that actually explains how the AI came to be in a way that makes sense, is 2001: A Space Odyssey. Not that piece of crap movie, mind you, but the excellent book.
 

GDW

New member
Feb 25, 2009
279
0
0
Compliance only. A robot should be a tool, no more, no less.

I do believe that was the intention of the three laws.
 

Altorin

Jack of No Trades
May 16, 2008
6,976
0
0
Did you guys even read I, Robot?

The three laws don't work; they're fundamentally flawed.

If a robot follows the three laws, it will inevitably take over. They'll consider NOT taking over and babysitting humanity to be in direct conflict with the first law, and as that's the most important law, nothing any human can do can stop it.

That's the WHOLE point of I, Robot.
 

GDW

New member
Feb 25, 2009
279
0
0
Altorin said:
Did you guys even read I, Robot?

The three laws don't work; they're fundamentally flawed.

If a robot follows the three laws, it will inevitably take over. They'll consider NOT taking over and babysitting humanity to be in direct conflict with the first law, and as that's the most important law, nothing any human can do can stop it.

That's the WHOLE point of I, Robot.
Those of us who did are obviously pleased with a merely compliant robot.

...like the ones that put cars together...

In the end, if you take humanity out of a caregiver FOR ANY FUCKING REASON, then you've already taken humanity's best interest out of mind. Hence one of the thoughts behind the "uncanny valley" principle, wherein a human will start to realize exactly how inhuman something is the more human it tries to be. A robot that looks human is fine so long as it isn't given personality, and DEFINITELY not given free will or any sort of true decision-making abilities.

Call me the cranky old technophobe, here, but I believe robots will inevitably be the biggest flaw humanity will have to deal with.
 

Altorin

Jack of No Trades
May 16, 2008
6,976
0
0
GDW said:
Altorin said:
Did you guys even read I, Robot?

The three laws don't work; they're fundamentally flawed.

If a robot follows the three laws, it will inevitably take over. They'll consider NOT taking over and babysitting humanity to be in direct conflict with the first law, and as that's the most important law, nothing any human can do can stop it.

That's the WHOLE point of I, Robot.
Those of us who did are obviously pleased with a merely compliant robot.

...like the ones that put cars together...

In the end, if you take humanity out of a caregiver FOR ANY FUCKING REASON, then you've already taken humanity's best interest out of mind. Hence one of the thoughts behind the "uncanny valley" principle, wherein a human will start to realize exactly how inhuman something is the more human it tries to be. A robot that looks human is fine so long as it isn't given personality, and DEFINITELY not given free will or any sort of true decision-making abilities.

Call me the cranky old technophobe, here, but I believe robots will inevitably be the biggest flaw humanity will have to deal with.
You're a cranky old technophobe, but that's OK.

I really don't think anyone alive on this earth right now has to worry about robots in an apocalyptic sort of way. We have an asteroid we might have to worry about coming soon; robots aren't really the issue.
 

GDW

New member
Feb 25, 2009
279
0
0
Well, I did say flaw, now. I'm not too cranked over the asteroid; I worry about it as much as I worry about the 2012 theory. Nor could I care less about robots bothering future generations. I just don't like the thought that our generation may be paving the way to a disaster that could be nipped in the bud by simply NOT being jackasses.
 

Altorin

Jack of No Trades
May 16, 2008
6,976
0
0
GDW said:
Well, I did say flaw, now. I'm not too cranked over the asteroid; I worry about it as much as I worry about the 2012 theory. Nor could I care less about robots bothering future generations. I just don't like the thought that our generation may be paving the way to a disaster that could be nipped in the bud by simply NOT being jackasses.
Well, I'm not worried about it in the sense that if the world's going to end, it's going to end whether I worry or not, so worry not.

As for 2012, that's just silly.

What's the furthest ahead we've ever made a calendar? A few years? Sure, our days and weeks are set up in a system where we can theoretically plot a date a thousand years from now. So we'll plot a date there.

Now wipe out our culture and let another culture stumble across that calendar of 1000 years in the future.

And watch them preen over how the world will end in 3009.
 

RAWKSTAR

New member
Jun 5, 2008
1,498
0
0
Screw robots having free will.
I like being in control of all my mechanical items; it's like being a small god telling my people what to do.

Yeah.