Computer says: you’re fired

The British comedy show Little Britain had a running gag featuring a bank worker (played by David Walliams) who would plug every customer inquiry into her computer before typically returning the curt response: “the computer says no”.

The gag is a jab at the notion that so many of the decisions that affect our lives appear to be made according to algorithms rather than that most human of virtues – common sense.

Of course, that’s not to say that we prefer making all tough decisions ourselves, as a new study is only too happy to illustrate.  The study, headed by academics at the University of Bonn, looked at the particular issue of risky or difficult decisions.  Such situations can often leave one party or the other disappointed in the outcome, thus eroding the trust that is so important for subsequent decisions.

“Everyone knows that trust can be shattered in risky businesses,” explained Prof. Dr. Bernd Weber from the Center for Economics and Neuroscience (CENs) at the University of Bonn. “As a result, people are not all that eager to put their trust in others.” Scientists call this attitude “betrayal aversion” – people try to avoid being disappointed by potential breaches of trust.

The study explored the effect betrayal aversion had on some basic financial decisions: participants were promised real money for winning a game whilst their brain activity was monitored in an MRI scanner.  Participants were given the chance to offer their peers a small guaranteed sum of €1, or a higher amount (€6) that would be divided up between them.  This latter option came with a risk, of course: the player dividing the money could choose to give their colleague a sum smaller than the guaranteed €1.

The task of dividing the amount, however, could be done either by the other player or by a computer, albeit a computer that was programmed to give exactly the same amounts as the human would.

“So, from the point of view of winnings, there was no difference whether the other player or the machine divided the amount,” explained Prof. Weber. “And the subjects had explicitly been told so from the very start.”

Despite the winnings being equal either way, significantly more people were happy to place their trust in the computer than in their human partner.  When the computer option was on the table, some 63% of players went for the higher amount to be divided between the players rather than the smaller but guaranteed sum.  When the computer mediator wasn’t available, however, just 49% of participants went for the riskier option.

“These results show that more subjects prefer to leave risky decisions in which they may be betrayed to an impersonal device, thus avoiding the negative feeling that comes from having wrongly trusted a human,” said Prof. Weber, adding that a breach of trust committed by an impersonal computer was evidently less emotionally stressful than if it had been committed by a business partner.

The MRI scan showed that the predominant active area during the game was the frontal insula.

“This area of the brain is always involved when negative emotions such as pain, disappointment or fear are activated,” explained Prof. Weber. He added that the fact that the frontal insula was activated is a clear indication that negative emotions played an important role in these situations.

Now of course, financial decisions are inherently complex, and various studies have highlighted the erosion of trust caused by the relative anonymity of conducting business online.  Nevertheless, this research shows that in certain circumstances this anonymity can be a good thing, and can make bitter pills easier to swallow.
