What we can learn from robot ethics

The British Standards Institution (BSI) recently released a new set of standards for the ethical design of robots and robotic devices. The standards highlight the growing need for guidelines on robotic safety, contact with human beings, robotic deception, addiction and possible sexism or racism exhibited by self-learning artificial intelligence (AI) systems.

When science fiction writer Isaac Asimov introduced the three laws of robotics in his 1942 short story Runaround, little did he know they would one day become a reality for a world filled with robots. From automated manufacturing plants, medical and pharmaceutical applications to military, agricultural and automotive systems, robots are everywhere in our modern world.

Asimov’s laws outline that a robot must not injure a human being, must obey humans and must protect its own existence, in that order of priority. The new BSI standard BS 8611 builds on Asimov’s laws and aims to help designers and manufacturers consider the ethical hazards of robots. The new standard states that “robots should not be designed solely or primarily to kill or harm humans; humans, not robots, are the responsible agents” and that “it should be possible to find out who is responsible for any robot or its behaviour”.

Here are the top three things industry can learn from the new standards for robot ethics:

1. Industry needs to focus on robot safety

In large part, the new standards cater for the rise of AI. Although this is one of the most exciting developments in robotics, it is not where the majority of robots are currently deployed. The industrial robot sector remains the principal driver of the general robotics market, recording a 33 per cent rise in 2015 according to a report by the International Federation of Robotics.

The new standards make it clear that a robot needs to be “safe, secure and fit for purpose”. As such, their introduction will have far-reaching implications, particularly for industrial users, where hazardous robotic environments can pose a serious risk of injury to human workers. Many of the deaths caused by industrial robots in recent years might have been avoided if their designs had incorporated smart algorithms and programming aware of human presence, as sketched below.
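To make this concrete, here is a minimal sketch of what human-presence-aware programming can look like. The zone distances, speed values and the select_speed helper are hypothetical placeholders rather than requirements of BS 8611 or any particular controller; real installations rely on certified safety sensors and controllers, governed by standards such as ISO 10218 and ISO/TS 15066.

```python
# Minimal sketch of a human-presence safety interlock (illustrative only).
# The zone thresholds and speed values below are hypothetical examples;
# real robot cells use certified safety hardware, not application code.

from dataclasses import dataclass


@dataclass
class SafetyZones:
    stop_m: float = 0.5   # human closer than this: protective stop
    slow_m: float = 1.5   # human closer than this: reduced speed


def select_speed(distance_to_human_m: float, zones: SafetyZones,
                 full_speed: float = 1.0, reduced_speed: float = 0.25) -> float:
    """Return a speed scaling factor based on the nearest detected human."""
    if distance_to_human_m < zones.stop_m:
        return 0.0            # protective stop
    if distance_to_human_m < zones.slow_m:
        return reduced_speed  # speed-and-separation monitoring
    return full_speed


# Example: a worker detected 1.2 m away triggers reduced speed.
print(select_speed(1.2, SafetyZones()))  # -> 0.25
```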

2. OEMs need to change the way they design robots for the future

To date, the majority of industrial robots have been used in traditional applications including factory automation, automotive, metalworking and electronics assembly. The rise of smaller and lighter robots capable of delivering higher payloads with very high accuracy means that original equipment manufacturers (OEMs) need to rethink their approach.

Industries including aerospace, pharmaceutical, food and medical manufacturing increasingly have specific requirements for each application. Whether it is aerospace robots that need lightweight, high-torque gears for the vacuum of space, pharmaceutical and food robots that need ingress protection for easy cleaning, or collaborative robots used for small electronics assembly alongside human workers, the traditional approach to robotics simply will not work.

For example, Harmonic Drive gears have been used on NASA’s Mars Rovers. Because the system was remotely controlled from Earth, it was vital that the gearing provide smooth, zero-backlash, repeatable movements with absolute accuracy in a lightweight design. Our strain wave gear has teeth that form the outside edge of a flex spline, leaving the central area to be bored into a hollow shaft so that data cables and other services can pass through, while allowing continuous rotation.
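As a rough illustration of why this geometry delivers such high torque density in a small package, the single-stage reduction ratio of a strain wave gear follows directly from the tooth counts of the flex spline and circular spline. The numbers below are generic examples, not the actual Mars Rover actuator specification.

```python
# Illustrative strain wave (harmonic) gear reduction ratio.
# Tooth counts are generic examples, not values from any specific actuator.

def strain_wave_reduction(flex_spline_teeth: int, circular_spline_teeth: int) -> float:
    """Reduction ratio with the circular spline fixed and the wave generator as input.

    The flex spline typically has two fewer teeth than the circular spline,
    which is what produces the very high single-stage reduction.
    """
    return flex_spline_teeth / (circular_spline_teeth - flex_spline_teeth)


# Example: a 200-tooth flex spline and 202-tooth circular spline give 100:1.
print(strain_wave_reduction(200, 202))  # -> 100.0
```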

By considering the variety of new applications and choosing the right components for each application, design engineers can successfully embrace the new wave of robot developments.

3. Humans will remain responsible for a robot’s actions

As robots become more sophisticated, concerns about traits such as deception, addiction, sexism and racism will only grow. Despite this, a responsible human being will always need to be accountable for a robot’s actions.

Although this raises many legal and ethical questions, we can reduce the risks by improving transparency in the robotics supply chain: choosing the right component manufacturer at the outset, setting high standards for the design engineering and product testing of new systems, and recycling obsolete systems more effectively.

By changing our approach to the way we design and use robots, engineers, manufacturers and users can embrace, rather than fear, robot ethics.
