1. Eye Tracking Interaction on Unmodified Mobile VR Headsets Using the Selfie Camera
- Authors
George Alex Koulieris, Katerina Mania, and Panagiotis Drakopoulos
- Subjects
General Computer Science, Theoretical Computer Science, Computer Vision, Experimental and Cognitive Psychology, Human Factors, Eye Tracking, Input Method, Iris (anatomy), Pixel, Mobile VR, Headset, Lens (optics), Panorama, Artificial Intelligence, Mobile Device
- Abstract
Input methods for interaction in smartphone-based virtual and mixed reality (VR/MR) currently rely on uncomfortable head tracking to control a pointer on the screen. User fixations are a fast and natural input method for VR/MR interaction. Previously, eye tracking in mobile VR suffered from low accuracy, long processing times, and the need for hardware add-ons such as anti-reflective lens coatings and infrared emitters. We present an innovative mobile VR eye tracking methodology that uses only the eye images captured by the front-facing (selfie) camera through the headset’s lens, without any hardware modifications. Our system first enhances the low-contrast, poorly lit eye images by applying a pipeline of customised low-level image enhancements that suppress obtrusive lens reflections. We then propose an iris region-of-interest detection algorithm that runs only once; by reducing the iris search space, it increases iris tracking speed on mobile devices. We iteratively fit a customised geometric model to the iris to refine its coordinates. A thin bezel of light displayed at the top edge of the screen provides constant illumination, and a confidence metric estimates the probability of successful iris detection. Calibration and linear gaze mapping between the estimated iris centroid and physical pixels on the screen yield low-latency, real-time iris tracking. A formal study confirmed that our system’s accuracy is comparable to that of eye trackers in commercial VR headsets in the central part of the headset’s field of view. In a VR game, gaze-driven task completion was as fast as with head-tracked interaction, without the need for consecutive head motions. In a VR panorama viewer, users could successfully switch between panoramas using gaze. Published in: ACM Transactions on Applied Perception.
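The "calibration and linear gaze mapping" step described in the abstract can be sketched as a least-squares fit of an affine map from iris-centroid coordinates to screen pixels. The following is a minimal illustration of that idea, not the authors' implementation; the function names and the assumption of a purely affine (linear plus offset) mapping are ours.

```python
import numpy as np

def fit_gaze_mapping(iris_points, screen_points):
    """Fit an affine map from iris centroids to screen pixels.

    iris_points:   (N, 2) iris centroid positions collected during calibration
    screen_points: (N, 2) known on-screen target positions shown to the user

    Returns a (3, 2) matrix M such that [x, y, 1] @ M ~= screen coordinate.
    """
    iris_points = np.asarray(iris_points, dtype=float)
    screen_points = np.asarray(screen_points, dtype=float)
    # Augment with a column of ones so the fit includes a translation term.
    A = np.hstack([iris_points, np.ones((len(iris_points), 1))])
    # Least-squares solve A @ M ~= screen_points.
    M, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
    return M

def map_gaze(M, iris_xy):
    """Map a single iris centroid (x, y) to a screen coordinate."""
    x, y = iris_xy
    return np.array([x, y, 1.0]) @ M
```

In practice, calibration points would come from the user fixating a small grid of on-screen targets (e.g. 9 points); with at least three non-collinear pairs the affine parameters are determined, and extra points make the fit robust to noisy iris detections.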
- Published
- 2021