For instance, Canary Speech is an American company that monitors our speech patterns for signs of various diseases, from Alzheimer's to Parkinson's.
Now, researchers from the University of Nottingham have released a paper in which they use machine learning to analyze facial expressions and head movements to diagnose conditions such as autism and ADHD.
At present there is no simple, straightforward test for either autism or ADHD; clinicians typically rely on behavioral observation to make their assessment.
“These are frequently co-occurring conditions and the visual behaviours that come with them are similar,” the researchers note.
The researchers used machine learning to help spot some of these behaviors. The algorithm was trained on videos of 55 adults as they engaged with a number of stories. Stories were chosen because autistic people often struggle with the social and emotional subtleties inherent in them.
The participants included people already diagnosed with ADHD, autism, or both, as well as people with neither condition.
The system quickly learned to detect differences in how the groups responded to the stories. For instance, those with both autism and ADHD were found to be less likely to raise their eyebrows in response to surprising information.
The system also monitored head movements to gauge when the volunteers' attention waned. By combining these measures, the system was able to identify people with ADHD- or autism-like conditions with an accuracy of 96%.
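The paper's actual pipeline isn't detailed here, but the underlying idea of combining behavioral measures into a classifier can be sketched in miniature. The following is a hypothetical illustration only: the feature names (eyebrow-raise rate, head-movement variability), the synthetic data, and the simple nearest-centroid classifier are all assumptions for demonstration, not the researchers' method.

```python
# Illustrative sketch: classify participants from two behavioral features
# using a nearest-centroid rule. Feature names and data are invented.
import math
import random

def fit_centroids(samples, labels):
    """Compute the mean feature vector (centroid) for each class label."""
    sums, counts = {}, {}
    for x, y in zip(samples, labels):
        acc = sums.setdefault(y, [0.0] * len(x))
        for i, v in enumerate(x):
            acc[i] += v
        counts[y] = counts.get(y, 0) + 1
    return {y: [v / counts[y] for v in acc] for y, acc in sums.items()}

def predict(centroids, x):
    """Assign x to the class whose centroid is nearest (Euclidean distance)."""
    def dist(c):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(c, x)))
    return min(centroids, key=lambda label: dist(centroids[label]))

# Synthetic training data: [eyebrow-raise rate, head-movement variability].
random.seed(0)
data = [([random.gauss(0.8, 0.1), random.gauss(0.3, 0.1)], "group_a")
        for _ in range(20)]
data += [([random.gauss(0.4, 0.1), random.gauss(0.6, 0.1)], "group_b")
         for _ in range(20)]
features, labels = zip(*data)

centroids = fit_centroids(features, labels)
print(predict(centroids, [0.85, 0.25]))  # closest to group_a's centroid
```

In practice a study like this would extract far richer features from video (facial action units, gaze, head pose over time) and use a more capable model, but the structure — behavioral features in, diagnostic label out — is the same.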
Suffice it to say, it's unlikely that an algorithm will replace humans in the process just yet, but there are promising signs that such tools can support clinicians in diagnosing these kinds of conditions.
“We are creating diagnostic tools that will speed up the diagnosis in an existing practice, but we do not believe we can remove humans. Humans add ethics and moral values to the process,” the authors conclude.