Attempts to reduce the effects of autism on child development have been a hot topic for researchers for some time now, and with good cause. The condition is believed to affect around 1 in every 100 children, touching every aspect of their world.
A recent study suggests that progress is being made. It focuses on babies who are at risk of autism. Each of the participating children had an older sibling with autism, which raises the chances that the baby will have the condition too.
The experiment involved therapists giving parents feedback via video as they interacted with their children. The aim was to support parents and help them to understand and adapt to their children’s behavior and engagement.
The results suggest that the approach has merit, with the children who had received the intervention showing clearer improvements in behaviors typically associated with autism than their peers in the control group.
Suffice it to say, there isn’t a claim that the intervention prevents autism or anything of the sort, but it might just help reduce some of the behaviors that inhibit the child’s development.
Such interventions do appear to have merit, but they still require an early diagnosis of autism to allow parents to adapt their approach. One startup that aims to provide such early detection is California-based Cognoa.
The startup, which recently raised $11.6 million in funding, was originally developed at Harvard and Stanford’s medical schools. It consists of an assessment platform whereby parents can upload information and videos, with the company claiming that autism can be detected in children up to 13 months sooner than existing methods.
To date, the platform has been used by over 300,000 families across the US, and the company believes it could change the standard of care for children.
It isn’t the only project aiming to provide earlier detection, however. Earlier this year I wrote about a project from the University of Nottingham that uses AI to monitor facial expressions and head movements in order to diagnose conditions such as autism and ADHD.
There isn’t a simple and straightforward test for either autism or ADHD at the moment, with clinicians typically relying on observation to make their assessments.
“These are frequently co-occurring conditions and the visual behaviours that come with them are similar,” the researchers note.
The researchers used machine learning to help spot some of these behaviors. The algorithm was trained on videos of 55 adults as they engaged with a number of stories. Stories were chosen because autistic people often struggle with the social and emotional subtleties they contain.
The system quickly learned to detect differences in how the groups responded to the stories. For instance, those with both autism and ADHD were found to be less likely to raise their eyebrows in response to surprising information.
The system also monitored head movements to understand when the attention of the volunteers waned. By combining these measures, the system was capable of spotting people with ADHD- or autism-like conditions with an accuracy of 96%.
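The study’s actual model isn’t described here, but the general idea, reducing each video to a handful of behavioral summary features (eyebrow-raise rate, head-movement variability) and then classifying on the combination, can be sketched with a toy example. Every feature name, value, and threshold below is an invented illustration, not the Nottingham team’s actual method or data:

```python
# Toy sketch: combine two behavioral feature streams into a single
# nearest-centroid classifier. All data here is invented for illustration.

def extract_features(session):
    """Reduce a recorded session to two summary features:
    eyebrow-raise rate at surprising moments, and head-movement
    variance (a rough proxy for waning attention)."""
    return (session["eyebrow_raise_rate"], session["head_movement_var"])

def train_centroids(sessions, labels):
    """Compute the mean feature vector (centroid) for each label."""
    sums, counts = {}, {}
    for s, y in zip(sessions, labels):
        f = extract_features(s)
        a, b = sums.get(y, (0.0, 0.0))
        sums[y] = (a + f[0], b + f[1])
        counts[y] = counts.get(y, 0) + 1
    return {y: (a / counts[y], b / counts[y]) for y, (a, b) in sums.items()}

def classify(session, centroids):
    """Assign the label of the nearest centroid (squared Euclidean distance)."""
    f = extract_features(session)
    return min(
        centroids,
        key=lambda y: (f[0] - centroids[y][0]) ** 2 + (f[1] - centroids[y][1]) ** 2,
    )

# Invented training data: one group shows more eyebrow raises and
# steadier attention than the other.
train = [
    {"eyebrow_raise_rate": 0.8, "head_movement_var": 0.1},
    {"eyebrow_raise_rate": 0.7, "head_movement_var": 0.2},
    {"eyebrow_raise_rate": 0.2, "head_movement_var": 0.6},
    {"eyebrow_raise_rate": 0.3, "head_movement_var": 0.7},
]
labels = ["neurotypical", "neurotypical", "autism/adhd", "autism/adhd"]

centroids = train_centroids(train, labels)
print(classify({"eyebrow_raise_rate": 0.75, "head_movement_var": 0.15}, centroids))
```

The point of the sketch is the combination step: neither feature alone is decisive, but averaging labeled examples into per-group centroids lets a new session be scored against both behaviors at once.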
You’ve also got a number of fascinating projects that are hoping to support children with autism themselves. For instance, Autism Glass is a Stanford project that uses Google Glass to provide wearers with social cues, delivered via machine learning, to help them make sense of the situations they find themselves in.
The glasses, which feature an outward-facing camera, use machine learning to gauge the facial expressions of those the wearer encounters. They’re capable of providing the user with real-time cues to better understand their companion’s emotions.
The system can also monitor the level of eye contact that occurs, and make inferences about how the user is engaging with others as a result.
The glasses are connected to a smartphone app to allow the patient, their family and medical professionals to monitor their progress. There is even the capability to watch back the interactions as they unfolded.
Or you’ve got the robot developed at Imperial College London, where researchers have embarked on a four-year project, called DeEnigma, which aims to test whether robots can provide a fun way for children to learn about emotions.
Central to the project is a consistent means of teaching, with the use of robots allowing for the same facial expressions and gestures to be used in each interaction.
“Autism affects people in different ways. However, many struggle with understanding and conveying emotions, often preferring to shut out what they don’t understand. It is important to help them to understand how people convey their emotions so that they don’t find human interactions so confusing and that is why we think our project is so important,” the researchers say.
The project is believed to be the first of its kind, as no previous work had allowed robots to interact with children in real time. The team believes this spontaneity will be crucial to teaching each child effectively by making learning fun and exciting.
As you can see, a wide range of projects are tackling not only earlier detection but also better management of, and interaction with, autistic children. Hopefully projects such as these will go a long way towards a better life for those with the condition.