One of the most famous pieces of writing about robots is sci-fi author Isaac Asimov's Three Laws of Robotics.
The laws state that a robot may not injure a human being or, through inaction, allow a human being to come to harm; that a robot must obey the orders given to it by human beings, except where such orders would conflict with the First Law; and that a robot must protect its own existence, as long as such protection does not conflict with the First or Second Laws.
Well, it seems such laws aren't going to remain science fiction for much longer. The British Standards Institution, the U.K.'s national standards body charged with creating technical standards and certification for various products and services, has just produced its first set of official ethics guidelines relating to robots.
“The expert committee responsible for this thought there was really a need for a set of guidelines, setting out the ethical principles surrounding how robots are used,” Dan Palmer, head of market development at BSI, told Digital Trends. “It’s an area of big public debate right now.”
The catchily named BS 8611 guidelines start by echoing Asimov's Three Laws, stating that: “Robots should not be designed solely or primarily to kill or harm humans.”
The document also takes aim at more complex issues of transparency, noting that: “It should be possible to find out who is responsible for any robot and its behavior.” There's even discussion of whether it's desirable for a robot to form an emotional bond with its users, an awareness of the possibility that robots could be racist and/or sexist in their conduct, and other contentious gray areas.
In all, it's an interesting attempt to start formalizing the way we deal with robots, and the way roboticists need to think about aspects of their work that extend beyond technical considerations. You can check it out here, although reading the BSI guidelines in full will set you back 158 pounds ($208). (Is that ethical?)
“Robots have been used in manufacturing for a long time,” Palmer said. “But what we’re seeing now are more robots interacting with people. For instance, there are cases in which robots are being used to give care to people. These are usages that we haven’t seen before — [which is where the need for guidelines comes in.]”