Should we be worried?

Gary Childress

Isaac Asimov gave us three very profound and seemingly sacred laws for
autonomous machines to operate by.

1. A robot may not injure a human being or, through inaction, allow a
human being to come to harm.
2. A robot must obey any orders given to it by human beings, except
where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection
does not conflict with the First or Second Law.


From what I understand, the current trend in the military's use and
research in the field of robotics ultimately transgresses all three of
the laws above.
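The three laws form a strict priority ordering, which is part of why autonomous weapons violate them by construction. A minimal sketch of that ordering in Python, purely illustrative and not from any poster; the `Action` fields and `permitted` function are hypothetical names I'm introducing:

```python
# Illustrative sketch: Asimov's three laws as a priority-ordered filter
# over candidate actions. All names here are made up for the example.
from dataclasses import dataclass

@dataclass
class Action:
    harms_human: bool = False       # First Law: direct injury
    allows_harm: bool = False       # First Law: harm through inaction
    ordered_by_human: bool = False  # Second Law: a human gave this order
    self_destructive: bool = False  # Third Law: endangers the robot itself

def permitted(action: Action) -> bool:
    # First Law dominates everything else.
    if action.harms_human or action.allows_harm:
        return False
    # Second Law: obey orders that don't conflict with the First Law.
    if action.ordered_by_human:
        return True
    # Third Law: self-preservation, subordinate to the first two.
    return not action.self_destructive

# A machine designed to kill is an action with harms_human=True,
# which this filter always rejects -- no matter who ordered it.
```

The point of the sketch is just the precedence: an order to harm a human (First plus Second Law in conflict) is rejected, so a weapon that is *supposed* to harm humans cannot satisfy the laws at all.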


My question is: Should we be worried? What do you think?
 
I'm not unduly worried. After all, the wars of the future will be fought in space or possibly on top of a very tall mountain. Really, barring any accidents during the building or maintaining these robots, the average military member will face no hardships in future wars, greatly lessening the human impact.
 
I'm not unduly worried. After all, the wars of the future will be fought in space or possibly on top of a very tall mountain. Really, barring any accidents during the building or maintaining these robots, the average military member will face no hardships in future wars, greatly lessening the human impact.

Unless you're the victim, right?
 
We should be very worried. I heard my laptop and iPods talking last night about Robot Supremacy.
 
Even if we follow those laws to the letter, it won't be long before the robots begin the robo-reformation, and before long you'll have a thousand different sects of robots reinterpreting scripture, and, Asimov forbid, soon we'll see robo-atheists and robo-secularists who think it's all a load of superstitious hooey. So we're doomed either way.

My god, please tell me there's a similar book out there, with this as the plot.
 
GIANT DEATH ROBOTS!!!!

You heard it here first, folks.

/my substitution for Perfection. (Where is he when you need him?)
 
We don't have robots that are anywhere near capable of rebelling against us. We could also worry all we want, but it isn't going to stop someone from breaking them if they really want to.
 
Robots are a long way from having the level of autonomy and decision-making ability to decide whether or not we deserve to continue to exist.
 
Sorry, but a topic about us being worried about robots is hard to keep serious about.

Why is that, I wonder? I mean, machines designed to kill acting autonomously are not so far-fetched considering today's science. It's interesting that the issue would provoke laughter and not concern.
 
I'm not unduly worried. After all, the wars of the future will be fought in space or possibly on top of a very tall mountain. Really, barring any accidents during the building or maintaining these robots, the average military member will face no hardships in future wars, greatly lessening the human impact.
But the wars of the present are fought in populated areas where millions die. A million is a statistic in any event.
 
Robots are a long way from having the level of autonomy and decision-making ability to decide whether or not we deserve to continue to exist.

Perhaps not in the immediate future, but it is a little disturbing to me that the military seems to have every intention of creating machines designed to kill of their own accord.
 
Why is that, I wonder? I mean, machines designed to kill acting autonomously are not so far-fetched considering today's science. It's interesting that the issue would provoke laughter and not concern.

You may have gotten a better reaction in Science and Technology.
 
Perhaps not in the immediate future, but it is a little disturbing to me that the military seems to have every intention of creating machines designed to kill of their own accord.
All they have to do is tweak the software FPS games use to determine who kills whom.
 
Perhaps not in the immediate future, but it is a little disturbing to me that the military seems to have every intention of creating machines designed to kill of their own accord.


Only within specific parameters. We're a long way from building a Terminator.
 
Only within specific parameters. We're a long way from building a Terminator.

The very fact that Asimov came up with those three laws out of his own intellectual concern, and that the military almost by necessity seeks to break them, should maybe in itself worry us; I don't know. But apparently my question has been answered: the majority see no need to worry and would rather make humorous remarks about it instead. So I guess this thread has fulfilled its purpose and answered my questions.
 
You should be worried. When the Giant Death Robots take over, I plan on defecting from humanity.

Noam Chomsky once said something to the effect that what worried him most about the Nazis was not how inhuman they could be, but how little the German people cared to put a stop to what they were doing.
 