
Look me in the eye: evaluating the accuracy of smartphone-based eye tracking for potential application in autism spectrum disorder research

Authors :
Florian Lipsmeier
Maximilian A.R. Strobl
Liliana Ramona Demenescu
Michael Lindemann
Christian Gossens
Maarten De Vos
Source :
BioMedical Engineering OnLine, Vol 18, Iss 1, Pp 1-12 (2019), BioMedical Engineering
Publication Year :
2019
Publisher :
BMC, 2019.

Abstract

Background
Avoidance of looking others in the eye is a characteristic symptom of Autism Spectrum Disorders (ASD), and it has been hypothesised that quantitative monitoring of gaze patterns could be useful for objectively evaluating treatments. However, tools to measure gaze behaviour on a regular basis and at a manageable cost are missing. In this paper, we investigated whether a smartphone-based tool could address this problem. Specifically, we assessed the accuracy with which iTracker, a state-of-the-art phone-based eye-tracking algorithm, can distinguish between gaze towards the eyes and gaze towards the mouth of a face displayed on the smartphone screen. This might allow mobile, longitudinal monitoring of gaze-aversion behaviour in ASD patients in the future.

Results
We simulated a smartphone application in which subjects were shown an image on the screen and their gaze was analysed using iTracker. We evaluated the accuracy of our set-up across three tasks in a cohort of 17 healthy volunteers. In the first two tasks, subjects were shown different-sized images of a face and asked to alternate their gaze between the eyes and the mouth. In the last task, participants were asked to trace out a circle on the screen with their eyes. We confirm that iTracker can recapitulate the true gaze patterns and capture the relative position of gaze correctly, even on a phone system different from the one it was trained on. Subject-specific bias can be corrected using an error model informed by the calibration data. We compare two calibration methods and observe that a linear model performs better than a previously proposed method based on support vector regression.

Conclusions
Under controlled conditions it is possible to reliably distinguish between gaze towards the eyes and gaze towards the mouth with a smartphone-based set-up. However, future research will be required to improve the robustness of the system to the roll angle of the phone and to the distance between the user and the screen, in order to allow deployment in a home setting. We conclude that a smartphone-based gaze-monitoring tool provides promising opportunities for more quantitative monitoring of ASD.

Electronic supplementary material: The online version of this article (10.1186/s12938-019-0670-1) contains supplementary material, which is available to authorized users.
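The abstract describes correcting subject-specific bias with a linear calibration model fitted from calibration data. The sketch below is a hypothetical illustration of that idea, not the authors' code: it fits a per-axis affine map from raw predicted gaze points to known on-screen calibration targets by least squares, then applies it to de-bias new predictions. All names and the synthetic bias are assumptions for illustration.

```python
import numpy as np

def fit_linear_calibration(predicted, true):
    """Least-squares affine map from predicted (N, 2) to true (N, 2) gaze points."""
    X = np.hstack([predicted, np.ones((predicted.shape[0], 1))])  # append bias column
    W, *_ = np.linalg.lstsq(X, true, rcond=None)                  # (3, 2) weight matrix
    return W

def apply_calibration(predicted, W):
    """Apply the fitted affine correction to predicted gaze points."""
    X = np.hstack([predicted, np.ones((predicted.shape[0], 1))])
    return X @ W

# Synthetic example: "tracker output" corrupted by a known scale and offset.
rng = np.random.default_rng(0)
true = rng.uniform(0, 10, size=(20, 2))           # calibration targets on screen (cm)
predicted = 1.1 * true + np.array([0.5, -0.3])    # biased tracker predictions

W = fit_linear_calibration(predicted, true)
corrected = apply_calibration(predicted, W)
print(np.allclose(corrected, true, atol=1e-6))    # affine bias is fully removed here
```

In practice the residual after such a correction would not vanish as it does for this noise-free synthetic example; the paper's comparison of this linear model against a support-vector-regression calibration concerns exactly how well each reduces that residual error.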

Details

Language :
English
Database :
OpenAIRE
Journal :
BioMedical Engineering OnLine, Vol 18, Iss 1, Pp 1-12 (2019), BioMedical Engineering
Accession number :
edsair.doi.dedup.....cba23a1a46c7a38672bcd40cd76acb0b