I’ve written previously about the ability of AI technologies to detect empathy. That work builds on the burgeoning field of behavioral signal processing, which uses computational methods to help us make decisions about behavioral phenomena.
The researchers taught the algorithm to detect empathy within conversations by training it on real data from therapy sessions that tackled alcoholism and other addictions.
A simple layer of speech recognition allowed the system to automatically identify key phrases that signal the level of empathy in the speech. These included things like “do you think” and “it sounds like” for high empathy, or “you need to” and “during the past” for low empathy.
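The paper doesn’t spell out its full feature set or model, but the basic idea can be illustrated with a rough sketch: count simple phrase matches in a transcribed utterance and map the tally onto a coarse label. The phrase lists below mirror the examples above, while the scoring weights and labels are hypothetical rather than taken from the study.

```python
# A minimal sketch of keyword-based empathy scoring over a transcribed utterance.
# The phrase lists echo the examples in the text; the +1/-1 weights and the
# labels are hypothetical, not drawn from the original research.

HIGH_EMPATHY_PHRASES = ["do you think", "it sounds like"]
LOW_EMPATHY_PHRASES = ["you need to", "during the past"]


def empathy_score(utterance: str) -> int:
    """Return a crude score: +1 per high-empathy phrase, -1 per low-empathy phrase."""
    text = utterance.lower()
    score = sum(text.count(p) for p in HIGH_EMPATHY_PHRASES)
    score -= sum(text.count(p) for p in LOW_EMPATHY_PHRASES)
    return score


def label_utterance(utterance: str) -> str:
    """Map the raw score onto a coarse empathy label."""
    score = empathy_score(utterance)
    if score > 0:
        return "high empathy"
    if score < 0:
        return "low empathy"
    return "neutral"


if __name__ == "__main__":
    print(label_utterance("It sounds like this has been a hard week. Do you think that helped?"))
    # -> high empathy
    print(label_utterance("You need to cut back on your drinking."))
    # -> low empathy
```

A real system would almost certainly feed cues like these, alongside acoustic features, into a trained classifier rather than relying on hand-set counts, but the sketch captures the kind of signal being extracted.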
Empathetic machines
Of course, things become more interesting when we examine empathy not only from machine to human, but vice versa. A recent study by Businessolver found that over two-thirds of HR professionals believe AI technology can improve empathy in the workplace, but employees are not so sure.
A survey a few years ago explored how employees feel about the introduction of ‘robotic colleagues’. The researchers set out to determine whether cultural differences exist in the acceptance of robotic colleagues between German and American workers, but the responses showed a considerable degree of homogeneity.
For instance, over 60% of respondents could easily imagine being supported by a robotic colleague, and 21% even suggested such a change would be an improvement, largely due to the belief that a robot would be less error prone and more predictable in its behavior.
There was a strong appreciation of the so-called ‘uncanny valley’, however, with respondents revealing that they don’t want robots to start displaying emotions.
This is further reflected in the belief that whilst robots are great at routine tasks, more complex endeavors are beyond them. That is especially true of leadership, with very few respondents willing to consider a robot boss.
“A robot has no empathy for my family situation or other concerns that radiate into the job,” they said. “A machine cannot judge a man… and cannot serve as role model,” they continued.
Empathy for the machine
Things can go the other way too, of course. A recent study from academics at the University of Lincoln suggested that robots may need to be given more human-like flaws and foibles if they are to gain acceptance in our more personal lives.
There are positive signs of progress, however, with a recent study highlighting how robot carers in elderly care have been accepted, and even relied upon, by patients on a personal level.
Most of these explorations look at the relationship very much from the human’s point of view, but a recent study, published in Nature, looks at things from the other end and explores whether humans feel empathy for their robot companions.
The researchers, a Japanese team, believe they have found the first neurophysiological evidence of our ability to empathize with a robot in apparent pain, albeit at a slightly different level to that shown towards other humans.
The study saw EEG tests performed on a relatively small sample of adults who were shown pictures of either a human or a robotic hand in a painful situation.
The results were fascinating. Participants did show empathy towards the robot, but at a lower level than towards the humans in the pictures.
“The ascending phase of P3 (350-500 ms after the stimulus presentation) showed a positive shift in the observer for a human in pain in comparison with the no-pain condition, but not for a robot in perceived pain. Then, the difference between empathy toward humans and robots disappeared in the descending phase of P3 (500-650 ms)”, the authors say, “The positive shift of P3 is considered as reflecting the top-down process of empathy. Its beginning phase seems related to the process of perspective taking, as was shown in a previous study.”
As man and machine increasingly work in unison, it’s inevitable that we will attempt to gain a greater understanding of how this interaction might unfold. These studies give us a good starting point.