Gary Childress
Student for and of life
Isaac Asimov gave us three profound and seemingly sacred laws for
autonomous machines to operate by.
1. A robot may not injure a human being or, through inaction, allow a
human being to come to harm.
2. A robot must obey any orders given to it by human beings, except
where such orders would conflict with the First Law.
3. A robot must protect its own existence as long as such protection
does not conflict with the First or Second Law.
From what I understand, the current trend in military robotics
research and deployment ultimately transgresses all three of the laws
above.
My question is: Should we be worried? What do you think?