A Confucian Perspective on Human-Robot Interactions: Should Robots Have Rights?


As robots become more human-like, should they have rights just as humans do? HennyGe Wichers looks at a Confucian approach.

 

SwissCognitive GuestBlogger: HennyGe Wichers, PhD – “A Confucian Perspective on Human-Robot Interactions: Should Robots Have Rights?”


 

Illustration: toonsbymoonlight
A man hits a robot with a hockey stick, causing it to drop the box it is carrying. He taunts it, sliding the box just out of reach each time the robot tries to pick it up. Another man shoves an android hard; it stumbles, falls to the floor, and curls up into a ball. A mechanical dog scrambles to stay on its feet after someone kicks it as it tries to cross an icy parking lot.

I watch the video and listen to the narrator joke about robot revenge scenarios in some distant future. An audience laughs in the background, and I wonder, is this ok? According to a paper published in Communications of the ACM on May 24, 2023, it is.

The video, which uses clips of robot-maker Boston Dynamics testing their products, is seven years old. But now, with the incredible rate of progress in artificial intelligence (AI), we need to think seriously about how we will treat robots. Tae Wan Kim, Associate Professor of Business Ethics at Carnegie Mellon’s Tepper School of Business, illustrates: “Imagine you allow your children to treat humanoids in whatever manner they think is interesting and appropriate. But that will be bad for human learning and education, and the relationship with the robots too.”

Robots will increasingly look and behave like humans. “That raises the question of whether robots should have rights, or some other kind of moral standing”, the researcher elaborates.

He approaches the problem from the Chinese philosophy of Confucianism with co-author Alan Strudler, Professor of Legal Studies and Business Ethics at the Wharton School. Confucianism emphasises social harmony and uses rites where Western thinking has rights. Rights apply to individual freedoms, but rites are about relationships and relate to ceremonies, rituals, and etiquette.

The handshake provides an intuitive example. When I see you, I smile and extend my hand. You lean in and do the same. We shake hands in perfect and effortless coordination, neither leading nor following the other.

Through the lens of rites, we can think of people and robots as teams, where each is obliged to play their role. Kim notes, “Granting rights is not the only way to address the moral status of robots: Envisioning robots as rites bearers—not as rights bearers—could work better.”

The Golden Rule of Confucianism is treating others how we would like to be treated. Applied to human-robot interaction, the objective is a flourishing life for both humans and robots.

Robots thrive when they can perform their tasks properly. If the team from Boston Dynamics need to kick, shove, and taunt to achieve that goal, then that’s acceptable. But your teenager punching a humanoid teacher because they’re not interested in learning is clearly not ok.

Fig 1: Boston Dynamics testing a robot on an icy surface

In this article, we’ve assumed human-like robots that can respect others and express authentic and appropriate affection. Such conscious systems don’t exist today, but they soon may. There are two ways to think about robot consciousness: functional and phenomenal.

Functional consciousness is the ability to respond to stimuli and behave appropriately for the situation. Phenomenal consciousness is the subjective experience of being aware of oneself and of the world. Robots may never graduate to the latter, but functionally conscious robots could be possible in the not-too-distant future.

That’s why we need to think about how we interact with robots. “To the extent that we make robots in our image, if we don’t treat them well, as entities capable of participating in rites, we degrade ourselves,” warns Kim.

Humans can recognise and respect the boundaries of non-humans. We have formalised the rights of corporations, animals, rivers, and mountains. And a partnership with robots based on rites seems practical as well as gracious.

But that means programming computers to perform complex social interactions depending on context. Kim explains, “Artificial intelligence imitates human intelligence, so for robots to develop as rites bearers, they must be powered by a type of AI that can imitate humans’ capacity to recognize and execute team activities—and a machine can learn that ability in various ways.”

Still, we don’t know precisely how. The abilities of new Large Language Models (LLMs) like ChatGPT and Bard are impressive, but they simplify patterns and often overgeneralise. Today’s technology can’t yet support the kind of human-robot interaction described in this article.

We may need to go backwards to go forward. Good Old-Fashioned AI (GOFAI), popular in the 1980s, uses hand-coded symbolic rules to reason about the world, understand language, and play games. But GOFAI is easy to fool and requires a great deal of encoded knowledge, which is why it lost popularity to machine learning in the 1990s. Yet combining GOFAI and machine learning techniques could be the key to human-like robots.
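
To make that idea a little more concrete, here is a minimal sketch, in Python and with entirely hypothetical names, of how such a hybrid might fit together: a stand-in for a learned model recognises the social context, and hand-coded GOFAI-style rules then pick the behaviour a rites-bearing robot should perform. It illustrates the division of labour only; it is not the researchers’ implementation.

```python
# Toy neuro-symbolic sketch: a (stand-in) learned classifier recognises the
# social context, and explicit GOFAI-style rules map each context to the
# behaviour a "rites bearer" robot should perform. All names are hypothetical.

from dataclasses import dataclass

@dataclass
class Observation:
    """What the robot currently perceives (grossly simplified)."""
    human_gesture: str   # e.g. "extends_hand", "raises_hand", "kicks"
    location: str        # e.g. "classroom", "test_lab", "street"

def recognise_context(obs: Observation) -> str:
    """Stand-in for a learned model (e.g. a neural classifier) that maps
    raw perception to a social-context label."""
    if obs.human_gesture == "extends_hand":
        return "greeting"
    if obs.location == "classroom" and obs.human_gesture == "raises_hand":
        return "lesson"
    if obs.human_gesture == "kicks" and obs.location == "test_lab":
        return "stress_test"
    return "unknown"

# GOFAI-style rules: a curated, inspectable mapping from context to the
# rite-appropriate action. These would be written by hand, not learned.
RITE_RULES = {
    "greeting": "extend_hand_and_match_pace",
    "lesson": "pause_and_acknowledge_student",
    "stress_test": "recover_balance_and_continue_task",
    "unknown": "ask_for_clarification",
}

def choose_action(obs: Observation) -> str:
    """Combine the two parts: learned recognition, symbolic selection."""
    return RITE_RULES[recognise_context(obs)]

if __name__ == "__main__":
    print(choose_action(Observation("extends_hand", "office")))  # greeting rite
    print(choose_action(Observation("kicks", "test_lab")))       # keep working
```

The appeal of this split is that perception can be learned from data, while the rites themselves stay explicit and auditable, which is exactly where symbolic approaches are strong.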

Source: EurekAlert!

Journal link (with video): Communications of the ACM


About the Author:

HennyGe Wichers is a technology science writer and reporter. For her PhD, she researched misinformation in social networks. She now writes more broadly about artificial intelligence and its social impacts.

 

The post A Confucian Perspective on Human-Robot Interactions: Should Robots Have Rights? appeared first on SwissCognitive, World-Leading AI Network.