Nine-year-old Alex Cullenbine was diagnosed with autism spectrum disorder at five years old – but his mum, Donji, said she knew from 18 months that something was different.
He avoided making eye contact, was slow to develop speech and was overly sensitive to sounds.
Since his diagnosis, he had received behavioural therapy once or twice a week, but he remained gaze avoidant.
However, a few weeks after he started taking part in a small trial funded by the US government, at Stanford University, Donji noticed a difference in her son.
“He was starting to flick glances to my eyes and then dart his gaze away. At first it happened a few times a day, which was absolutely stunning. I remember catching my breath and almost doing a mental double-take.
“It gradually progressed to a handful of times per day, particularly when he wanted me to listen to him with my full undivided attention.”
People with autism can struggle to maintain eye contact and recognise emotions. The scheme Alex took part in taught him to recognise emotions using Google Glass smart glasses and an app.
The glasses had a camera to record his field of view as well as a small screen and speaker to give him visual and audio information.
As he interacted with other people, the app identified their emotions and told him via the glasses. He also used it as a game where he would have to guess the facial expression being shown by his mum or other adults.
He used the system three times a week for six weeks and Donji says it “transformed” how Alex felt about looking at faces.
“The glasses made a game of looking and gave him a key to decoding what he was seeing. He needed a translator, to make explicit what other children are able to figure out for themselves.
“Making it a game removed his anxiety over failure, made it fun, and motivated him to keep at it long enough to learn from it.
“At some point in the study, Alex said to me, ‘Mummy, I can read minds.’ Something had clicked and he now understood the benefit of looking at faces.”
The researchers who carried out the three-month study with 14 families say it is too early to know if the intervention works. The system didn’t help everybody in the trial and the study lacked a control group.
Those taking part were from the local area of Silicon Valley, where major technology companies are based, so they were much more likely to be tech-savvy. It also took commitment to wear the glasses, which Google had donated to the university, and to turn up for appointments.
The researchers, led by Dr Dennis Wall, are now working on a larger-scale study with a control group.
The autism charity Autistica also has words of caution.
“Overall, I think it’s healthy to remain sceptical about this sort of approach,” says Dr James Cusack, director of science at Autistica.
“Although we know that some autistic people struggle with emotion recognition, there is very little actual evidence that the challenges autistic people actually experience are simply about recognising emotions.
“We know that many autistic people already struggle to manage different types of sensory input and information.
“Adding another stream of information could be unhelpful and could ultimately negatively impact on autistic people’s ability to navigate the world around them.”
Donji says she is delighted with the changes she has seen in Alex.
“Alex looks at faces and eyes so often, I no longer keep count. He may glance at my eyes multiple times during a single conversation.
“His understanding is more than simple pattern recognition – he sometimes comments on my expressions if they aren’t what he expects.
“For example, when he thought my facial expression was too intent while looking at a Lego creation he handed me, he said, ‘Why aren’t you happy? You aren’t smiling.’”
The research findings are published in the Nature journal npj Digital Medicine.