I’ve written a few times about the increasing capacity for machines to understand complex human emotions, whether it’s humor, empathy, sarcasm or even lies.
Being able to understand human emotions is likely to be of fundamental importance if machines are to be accepted as companions or equals alongside humans.
Effective communication
Whilst machines have become better at understanding the nuances of language, grasping things such as facial expressions has proved trickier.
A recent project is taking a somewhat roundabout route to this goal: rather than reading faces, researchers are tapping into participants' brainwaves and analyzing them directly.
The approach is certainly interesting, but not without significant challenges. Whilst we have a good understanding of how the brain displays traits such as concentration, the neurological footprint of emotions tends to be more complex.
Until recently the problem had stumped researchers, but a team from Shanghai Jiao Tong University believe they've found a way to detect our emotions accurately from our brain activity.
Reading our minds
The technique was tested on a number of subjects, and the researchers were able to consistently identify each subject's emotional state purely by analyzing their brainwaves.
They achieved this by creating a database that could be used to train the machine. It consisted of brain recordings from a number of participants as they watched a series of film clips, paired with the emotions each participant reported for each clip.
The experiment was repeated several times to build up an accurate picture of each participant's emotional arousal levels over time.
This data was then used to train an algorithm to look for common features in our brain activity when we experience particular emotions.
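The paper's own code isn't reproduced here, but the description above maps onto a standard supervised-learning recipe: represent each viewing session as a feature vector (EEG band power per channel is one common choice, though the exact features are an assumption on my part), label it with the reported emotion, and fit a classifier. Below is a minimal Python sketch using scikit-learn; the synthetic data, feature dimensions and two-class labels are invented stand-ins for the study's real recordings.

```python
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)

# Hypothetical stand-in for the real recordings: one feature vector per
# film clip (e.g. EEG band power per channel), labelled 0 = sad, 1 = happy.
n_clips, n_features = 200, 62 * 5  # 62 channels x 5 frequency bands (assumed)
X = rng.normal(size=(n_clips, n_features))
y = rng.integers(0, 2, size=n_clips)
X[y == 1] += 0.25  # inject a weak class difference so the demo has something to learn

# Standard supervised pipeline: scale the features, fit a linear classifier,
# and estimate accuracy with cross-validation across held-out clips.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")
```

Cross-validation matters here because the interesting question is whether the learned patterns hold up on recordings the model has never seen, which is essentially the stability claim the researchers make below.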
It transpired that the algorithm was indeed remarkably effective, distinguishing between happiness and sadness with an accuracy of around 80%.
“The performance of our emotion recognition system shows that the neural patterns are relatively stable within and between sessions,” the researchers say.
Suffice to say, whilst 80% is good, there is clearly a lot more progress to be made: in improving accuracy, in broadening the range of emotions the system can detect, and in generalizing to new subjects.
It’s certainly a fascinating starting point though, and further evidence that we’re beginning to develop machines capable of understanding the complexity of human emotions.