Roboethics: Is it OK to abuse, trust or make love to a robot?

Advances in artificial intelligence are blurring the line between humans and robots. As robots interact ever more closely with us, new ethical questions are emerging on issues ranging from violence to sex to privacy.

Poor bots

In late February, a video uploaded to YouTube by Boston Dynamics, an American robot developer, sparked controversy. “I felt a little sorry for the robot,” one viewer commented below the video. “Stop bullying robots!” another wrote. “I cried,” said another.

Viewers were apparently shocked by a scene in which a man knocks down a box being lifted by the company’s two-legged humanoid robot, and another in which he knocks the robot down from behind with a stick.

Many viewers, on the other hand, were impressed by the robot’s ability to right itself and stand back up after being knocked down.

Similar outrage erupted last year when the company posted a video in which a man kicked the company’s four-legged, dog-like robot.

Boston Dynamics did not intend to abuse its robots. The videos were meant to demonstrate the company’s technology, which allows robots to maintain their posture when obstructed.

Still, the videos caused an unintended stir because viewers reacted to the robots as if they were alive. Harming a robot is not a crime, yet the sight of lifelike machines being hurt clearly stirred strong emotions, a sign that robots could have a significant impact on human ethics.

Sex bots

“Robotic sex partners will be a commonplace,” one researcher predicted in response to a 2014 survey by the Pew Research Center, an influential American think tank, on the state of AI and robotics in 2025. The prediction is no figment of his imagination: the technology is already here.

Last fall, sex robots produced by U.S. company True Companion made headlines. What the company bills as the “world’s first sex robot” is a life-size female humanoid sold on its website for $6,995. Customers can choose from different personalities, such as “outgoing” and “reserved.” A male version will soon be available at the same price, according to the company’s website.

Although little is known about the company itself or how well the robots are selling, the product has drawn considerable opposition around the world.

Sex robots will only fuel more sexual desire, warned Kathleen Richardson, a researcher at De Montfort University in the U.K. who specializes in the ethics of robotics. She has launched a campaign website calling on people around the world to join her in opposing the development of sex robots, arguing that “the development of sex robots will further reduce human empathy that can only be developed by an experience of mutual relationship.”

The head of the company, however, argues that sex robots can help people who have lost their spouses, for example.

Robotic ethics

Engineers, too, are becoming aware of how the relationship between humans and robots is changing. In December 2014, the Japanese Society for Artificial Intelligence set up an ethics committee after its journal’s cover, which featured a female-looking cleaning robot, was criticized as “gender discrimination.”

The society now considers it necessary to study the relationship between robots and social ethics in earnest. It calls for scientific research into how to build and maintain a sound society, with studies conducted within accepted social norms.

The idea is to develop robots that can work closely with humans. But there are potential pitfalls.

“The collection and management of privacy-related information will be crucial,” warned Masahiro Kobayashi, a lawyer at Hanamizuki Law Office who is familiar with legal matters concerning robots.

One example is Pepper, a humanoid robot developed by SoftBank Group, a major Japanese telecommunications company. Equipped with artificial intelligence, Pepper perceives indicators of human emotion and analyzes them to improve its communication skills. Data obtained by Pepper, such as images, video footage and audio recordings, can be stored in the cloud and processed. Chairman and CEO Masayoshi Son hopes to see Pepper, which he calls “a robot with love,” in every household across Japan.

Crossing the line

“Even if it is not illegal, mishandling personal information could lead to problems,” Kobayashi said. “Companies should carefully explain” the issues to the public.

Japanese scientists learned this lesson in April 2014. Japan’s National Institute of Information and Communications Technology planned an experiment to analyze the flow of people in areas around Osaka Station using some 90 surveillance cameras. The aim was to test facial recognition technology that can track individuals in crowds. However, the plan met with harsh criticism from the public and the researchers were forced to change the study.

Unlike street surveillance cameras, humanoid robots can obtain personal information more easily because of their intimacy with users. SoftBank says it seeks prior consent for the handling of personal information after explaining the scope and purpose of the data gathering.

Ubic, a Japanese data analysis company, is also treading carefully as it prepares to launch its Kibiro domestic robot this year. “The robot keeps conversation records, and we are careful not to make customers feel that we are crossing the line,” the company said.

The relationship between humans and robots involves a range of difficult issues that cannot be easily dealt with under existing law. Science fiction writer Isaac Asimov once proposed the “three laws of robotics,” the first of which states: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” The issues before us today, however, are far more complicated.

In Japan, legal and robotics experts are preparing to set up a new academic institution devoted to the laws of robotics. At its preparatory meeting, Fumio Shimpo, a law professor at Keio University, proposed a tentative set of “eight new principles,” including “humanity first” and “secrecy.” “It is important to present guidelines on the laws and ethics of robotics to industry,” Shimpo said.

Humanity has overcome numerous ethical problems in order to adopt new technologies. It is time we debated the full extent of the ethics of robotics.

via Nikkei
