Sunday, September 4, 2011

Ethics & Robotics

The Three Laws of Robotics (often shortened to The Three Laws or Three Laws) are a set of rules devised by the science fiction author Isaac Asimov and later expanded upon. The rules are introduced in his 1942 short story "Runaround", although they were foreshadowed in a few earlier stories. The Three Laws are:

- A robot may not injure a human being or, through inaction, allow a human being to come to harm.
- A robot must obey any orders given to it by human beings, except where such orders would conflict with the First Law.
- A robot must protect its own existence as long as such protection does not conflict with the First or Second Law.