I have been wondering about this for a while.
We have made missiles that are computer-operated to hit a target, and built guns that work with cameras. So have we already overstepped the rule? We have programmers working all the time making AIs that kill the players of games and adapt their tactics. (Imagine if you anger the AI you are fighting against and it fights back by hitting the real you with a missile.)
These are the Three Laws:
1. A robot may not injure a human being or, through inaction, allow a human being to come to harm.
2. A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.
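To make the precedence between the Laws concrete, here is a minimal sketch of them as an ordered rule check. This is my own illustration, not anything from Asimov; the `Action` fields are hypothetical labels for whatever a real system would have to determine.

```python
from dataclasses import dataclass

@dataclass
class Action:
    """Hypothetical description of what a robot is about to do."""
    harms_human: bool           # would this action injure a human?
    inaction_harms_human: bool  # would *not* acting let a human come to harm?
    ordered_by_human: bool      # was this action ordered by a human?
    self_destructive: bool      # would this action destroy the robot?

def permitted(action: Action) -> bool:
    # First Law: never injure a human, and never allow harm through inaction.
    if action.harms_human:
        return False
    if action.inaction_harms_human:
        return True  # must act to prevent harm, overriding the laws below
    # Second Law: obey human orders (First Law was already checked above).
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two.
    return not action.self_destructive

# A computer-guided missile strike fails the very first check,
# no matter who ordered it:
strike = Action(harms_human=True, inaction_harms_human=False,
                ordered_by_human=True, self_destructive=False)
print(permitted(strike))  # False -- it violates the First Law
```

The point of the sketch is the ordering: the First Law check runs before anything else, so a weapon built to injure humans fails it regardless of its orders.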
I would like to see your reasoning on this.