The wide range of education experiments conducted at our SURFnet Google Glass competition is proof positive of Google Glass’ potential as a wearable. After a look at the UvA’s Fix My Vowels and the HAN University of Applied Sciences’ RobotPatient, this time the spotlight is on ‘Google Glass for Gemma’, a joint experiment of the University of Amsterdam (UvA) and Leiden University, aimed at helping Autism Spectrum Disorder (ASD) sufferers see themselves differently.
In this experiment, Google Glass helps young people with ASD become more socially confident.
Emotion recognition software
ASD is associated with insecurity and, at times, awkward social situations. Many sufferers find it difficult to read emotions in the facial expressions of others during a conversation, which in turn makes it hard to respond with an appropriate facial expression of their own. With the Google Glass application developed by the UvA and Leiden University, ASD sufferers can see their own facial expressions, have them evaluated and learn from the feedback they receive. The application uses emotion recognition software that reads expressions at various facial reference points, analyses them and tells the user whether the emotion fits the situation.
‘The idea behind “Google Glass for Gemma” is to enable an ASD sufferer, whom we call “Gemma” in this experiment, to practise at home with a wearable,’ says Annette Langedijk, Education & Research Department ICT consultant at UvA ICT Services. ‘At home, Gemma can put on the glasses and look in the mirror. Then she picks one of the pre-programmed video clips featuring everyday social situations. As Gemma focuses on the facial expressions in the clip, she responds with her own expression to the emotions she recognises. Google Glass photographs her reflection, and the software reads and analyses her facial expression. Gemma then receives targeted feedback, helping her learn appropriate responses and which facial expression fits a particular situation. This makes her more confident, because she knows how her face responds and what to expect in certain situations.’
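For readers curious what such a feedback loop might look like in code, here is a minimal sketch in TypeScript. It is an illustration only: the names (analyseExpression, practiseWith, Clip), the emotion categories and the canned scores are all assumptions, since the actual Gemma application is not public.

```typescript
// All names in this sketch are hypothetical; the real Gemma app is not public.

type Emotion = "happy" | "sad" | "angry" | "surprised" | "neutral";
type EmotionScores = Record<Emotion, number>; // confidence per emotion, 0..1

interface Clip {
  title: string;
  expectedResponse: Emotion; // the facial expression that fits the scene
}

// Placeholder for the real emotion recognition software, which analysed a
// photo of Gemma's reflection. Returns canned scores so the sketch runs.
async function analyseExpression(_photo: Uint8Array): Promise<EmotionScores> {
  return { happy: 0.7, sad: 0.05, angry: 0.05, surprised: 0.1, neutral: 0.1 };
}

// One pass through the practice loop: analyse Gemma's expression and compare
// it with the response the clip calls for.
async function practiseWith(clip: Clip, photo: Uint8Array): Promise<string> {
  const scores = await analyseExpression(photo);
  // Pick the emotion the software is most confident about.
  const [detected] = (Object.entries(scores) as [Emotion, number][])
    .sort((a, b) => b[1] - a[1])[0];
  return detected === clip.expectedResponse
    ? `Well done: a ${detected} expression fits "${clip.title}".`
    : `The clip calls for a ${clip.expectedResponse} face, but your ` +
      `expression read as ${detected}. Try again.`;
}

const clip: Clip = { title: "Receiving a gift", expectedResponse: "happy" };
practiseWith(clip, new Uint8Array()).then(console.log);
```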
A collaborative experiment
Annette Langedijk and Joffrey Hoijer (Information Strategist at Leiden University) decided to enter a behavioural-therapy project for young ASD sufferers in the Google Glass competition during a joint visit to the eScience symposium in Almere in October 2014. Langedijk had heard of the SURF competition and wanted to participate. Hoijer provided support to Leiden University’s Faculty of Psychology, where researchers are studying therapies for children with ASD. Two of the researchers on the team, Marcia Goddard and Gemma Zantinge, had been studying emotion recognition software for a while and wanted to find feasible ways to use it to improve the quality of life of ASD sufferers.
The parties pooled their expertise and requirements, which, combined with the existing emotion recognition software, made the experiment possible. ‘I came up with the idea to combine Google Glass and emotion recognition software to improve the lives of ASD sufferers,’ says Hoijer. ‘We got access to Google Glass by entering the competition through SURF. Then we had the wearable we needed to actually launch the experiment.’
That was the cue for researchers Marcia Goddard and Gemma Zantinge to devise a training method that would be compatible with the Google Glass application and help children with ASD in everyday life. According to Goddard, ‘We started by selecting clips of people experiencing strong emotions. The emotions ASD sufferers experience at a particular moment can be analysed with the emotion recognition software by having them wear Google Glass as they watch the clips. As they train, our team receives that data, which we use to track testing progress and learn about the interaction between brain activity and visible behaviour.’
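The data the team collects is not described in detail, so the following sketch of a per-trial record and a simple progress measure is purely an assumption about what such tracking might look like.

```typescript
// Hypothetical per-trial record; the study's real data format is not public.
interface TrialRecord {
  participantId: string;
  clipTitle: string;
  expectedEmotion: string; // the emotion the clip calls for
  detectedEmotion: string; // what the recognition software read
  confidence: number;      // recogniser confidence, 0..1
  recordedAt: Date;
}

// Progress over a training session: the share of trials where the detected
// expression matched the expected one.
function sessionAccuracy(trials: TrialRecord[]): number {
  if (trials.length === 0) return 0;
  const hits = trials.filter(t => t.detectedEmotion === t.expectedEmotion);
  return hits.length / trials.length;
}
```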
The emotion recognition software was developed by Theo Gevers (professor of Computer Vision at the UvA and CSO at Sightcorp) in collaboration with Sightcorp, a UvA spin-off. The software had previously been tested in mood analysis of theatre audiences, as well as in the Frans Hals Museum in Haarlem, where Gevers tested visitors’ ability to empathise with the emotions of the people depicted in the exhibited paintings.
Programming an app
Gevers’ software was a boon for Tom Kuipers, UvA senior developer and Gemma project programmer. ‘We used the existing API and Sightcorp’s facial recognition software, so we didn’t have to reinvent the wheel or build an application from scratch. We didn’t have the time anyway – we could only keep Google Glass for three months, so we really had to move. That’s also why we deliberately chose to build a hybrid app that would be compatible with all platforms and devices. The downside is that certain elements are not as deeply integrated into the device. Things like taking a photo unnoticed are therefore harder to achieve, which definitely creates more distraction than Gemma would like. The software is too bulky to run on the wearable itself, so the image is transmitted via the API and the server sends back an answer. Feedback is therefore not received in real time.’ Kuipers and his team hope to win the Google Glass and have all the time in the world to tweak the application.
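The round trip Kuipers describes translates roughly into the sketch below. The endpoint URL and response shape are placeholders, not Sightcorp’s actual API, so read this as an illustration of the architecture rather than working integration code.

```typescript
// Sketch of the round trip: the wearable cannot run the analysis itself, so
// a captured frame is posted to a server and the emotion scores come back
// over HTTP. Endpoint and response shape are hypothetical placeholders.

const ANALYSIS_ENDPOINT = "https://example.org/api/analyse"; // placeholder

interface AnalysisResponse {
  emotions: Record<string, number>; // e.g. { happy: 0.8, sad: 0.1, ... }
}

async function analyseFrame(jpeg: Blob): Promise<AnalysisResponse> {
  const form = new FormData();
  form.append("image", jpeg, "frame.jpg");

  // The network hop is the reason feedback does not arrive in real time.
  const res = await fetch(ANALYSIS_ENDPOINT, { method: "POST", body: form });
  if (!res.ok) throw new Error(`Analysis failed: HTTP ${res.status}`);
  return res.json() as Promise<AnalysisResponse>;
}
```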
The pre-set three-month period was a challenge in more ways than one. ‘The glasses had to be sent to the next entrant before we were ready,’ relates Langedijk. ‘So we couldn’t run all of the planned tests. Testing in practice, for instance, fell by the wayside: we weren’t able to test the application on actual ASD sufferers, only on our own researchers. You just need more time to perform all the steps involved in this kind of experiment: programming, training development, working with the various parties – and with the necessary focus. We did manage to control the emotion recognition software via the glasses, and we can recognise facial expressions and emotions via the mirror. We set out to develop a proof of concept, and we have one. We hope of course to conduct further experiments and proceed to actually testing the system on those who will ultimately use it.’
The experiment has also had valuable secondary benefits. Langedijk feels that experimenting and research are the key ways we acquire new knowledge. The UvA and Leiden University have clearly shown, for instance, that they have the expertise to take on such projects. It also inspires and encourages us to carry out more experiments and – often brief – studies, and to focus more on proofs of concept. The SURFnet Testbed project supports this wholeheartedly!
The race for Google Glass did not end with ‘Google Glass for Gemma’. We will soon publish our next blog, featuring the fourth and last participant, Utrecht University: a story about geosciences, digital maps and augmented reality in the field.
Want to have Sightcorp’s emotion recognition software read your facial expression? Go to: https://face.sightcorp.com/demo_basic/