Do Robots Need A Story For Us To Trust Them?

Robots and other seemingly inanimate objects may appear to be just that, but that doesn’t stop us from ascribing human characteristics to them. For instance, research from Washington State University suggests that the perceived sex of a robot affects how, or even whether, we want to engage with it.

The study argues that people may actually be happier conversing with a robot in hospitality settings if the robot appears to be female rather than male. This was especially so when the robot was humanoid in appearance.

“People have a tendency to feel more comfort in being cared for by females because of existing gender stereotyping about service roles,” the authors explain. “That gender stereotype appears to transfer to robot interactions, and it is more amplified when the robots are more human-like.”

Robot origins

Research from Stanford Graduate School of Business suggests that this might extend to the “origins” of a robot. The study found that when we think about the people who create robots (and other technologies), we seem to regard the work performed by the robot as more authentic.

Traditionally we tend to view AI as less authentic than humans, but the researchers wanted to understand whether assigning a human origin story to technology could help reduce that authenticity gap.

“If you look at what drives purchases of consumers in advanced economies, it’s often not objective characteristics of products or services,” the authors explain. “It’s our interpretation of them, the meaning we derive. It matters a lot if we think something is authentic.”

This can be hugely powerful for companies, as consumers appear willing to pay more for goods and services they believe are authentic.

Artificial authenticity

The researchers tested the authenticity of AI technology in a range of scenarios, from recruitment to therapy. The work in each scenario was performed by a hypothetical AI agent, called Cyrill. In each scenario, Cyrill was given a backstory related to the work “he” did.

Building trust between humans and robots has been an ongoing area of research for some time now. For instance, research from the U.S. Army Combat Capabilities Development Command’s Army Research Laboratory suggests that human facial expressions could be crucial in establishing that trust, at least on the battlefield.

“We wanted to characterize and quantify factors that impact the emotional experience that humans have with trust in automated driving,” the researchers explain. “With this information, we want to develop a robust way to predict decision errors in automation use to eventually enable active, online mitigation strategies and effective calibration techniques when humans and agents are teaming in real-time.”

Suffice it to say, however, giving robots a human origin story is rather more straightforward than giving them human facial characteristics. It also appeared to have a stronger impact on the perceived authenticity of the robot. Indeed, this boost was found even when the origin story was deliberately tailored to be less humanlike.

Developing trust

The issue of developing trust with robots is becoming more pressing as our interactions with them become more frequent. While we may assume that the way in which we build such trusting relationships will inevitably differ from the approach taken with fellow humans, that may not be the case. Research from the University of Montreal suggests that the way we build trust with robots is very similar to the way we do so with humans.

The researchers conducted a trust game experiment in which human volunteers were asked to entrust a $10 endowment to a partner, who was either a human, a robot, or a robot acting on behalf of a human. It was in many ways a classic game theory setup: the human volunteer knew that gains were to be made, but trust would be key. The robots in the experiment were programmed to mimic the reciprocation behaviors of previous human players.
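The structure of such a trust game can be sketched in a few lines of code. This is a minimal illustration only: the multiplier, the payoff rules, and the sample reciprocation rates are assumptions based on the standard trust game format, not details reported from the Montreal study.

```python
import random

def trust_game_round(amount_sent, reciprocation_rate, multiplier=3):
    """One round of a standard trust game.

    The investor starts with a $10 endowment and sends some of it to a
    trustee. The transfer is multiplied (the 3x multiplier is a common
    convention, assumed here), and the trustee returns a fraction of
    the multiplied pot.
    """
    pot = amount_sent * multiplier
    returned = pot * reciprocation_rate
    investor_payoff = 10 - amount_sent + returned
    trustee_payoff = pot - returned
    return investor_payoff, trustee_payoff

# A "robot" partner that mimics reciprocation behavior sampled from
# previous human players, as the study describes. These rates are
# illustrative placeholders, not data from the experiment.
human_reciprocation_history = [0.4, 0.5, 0.45, 0.6, 0.5]

def robot_reciprocation():
    return random.choice(human_reciprocation_history)

investor, trustee = trust_game_round(10, robot_reciprocation())
```

Sending the full endowment maximizes the joint pot, but only pays off for the investor if the partner reciprocates — which is precisely why the game measures trust.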

It’s common in these kinds of games for decisions to quickly converge around mutually beneficial outcomes. In this experiment, a key factor was the emotional reaction of people following their interactions with robots versus humans.

The results suggest that people develop trust in robots much as they do in humans. People traditionally trust other humans both for monetary gain and to learn more about the other party, and a similar pattern emerged in their relations with the robots.

This is positive, especially as interactions between man and machine are becoming more frequent and taking place in more sensitive domains. Nonetheless, if we want to encourage trusting relationships to form, giving technology both a face and a backstory might not do any harm.
