Making Room for Robots


Long a trope of science fiction, service robots are making their way into modern-day life. Robotic technologies, including personal care robots, physical assistant robots, and mobile servant robots, are being created to assist humans with everyday activities. Robotic exoskeleton devices can help their users walk or climb stairs, while advanced robotic wheelchairs equipped with a robotic arm can help individuals eat, pick up a glass, open a door, and even shave.

As these service robots become more fully integrated into human society, questions arise regarding responsibility, liability, and accountability. For instance, what laws govern a robot’s actions?

On Wednesday, August 31, Eduard Fosch Villaronga spoke at Fordham Law School about bridging the gap between legal standards and advancing robotic technologies during a talk hosted by the Center on Law and Information Policy (CLIP). The audience comprised Fordham Law faculty and CLIP-affiliated student fellows.

According to Villaronga, current standards regulating personal care robots focus on the physical human-robot interaction but fail to recognize the legal and ethical concerns of other aspects of the user experience, such as cognitive safety, data protection matters, liability contexts, and privacy. Furthermore, the use of personal care robots requires the consent of users to relinquish a certain level of decision-making power to these semi-autonomous devices; however, there are no current legal standards to ensure this surrender of power does not compromise the dignity of the user.

A Ph.D. candidate in the Law, Science, and Technology Erasmus Mundus Program, Villaronga noted that one of the major challenges is that the definition of service robots is vague. Service robots are designed to help people, but there are numerous ways in which someone can interpret “help.” A vacuum or a blender, for example, could arguably satisfy the criterion of “improving the lives of humans.”

Villaronga proposed the following situation: A robotic wheelchair has been programmed to transport a person to the park, but along the way that person sees shoes she likes in a store window. While the person may want to stop and take a look, the robot will not allow her to do so. In this scenario, the human’s decision-making ability has been taken away, even though she may have been perfectly capable of making that decision.

When robots are given the authority to make decisions such as this, it becomes more difficult to determine who is ultimately responsible for the decisions. Is it the human or the robot? To take another example, a human outfitted with a robotic exoskeleton device steps on and breaks another person’s foot. The robot directed the human’s action, so does that mean the human was not in control? Do you sue the robot or the human?

Villaronga noted that these types of questions need to be examined in order to bridge the gap that exists between current legal standards and advancing robotic technologies.
