
Robust 3-D Gaze Estimation via Data Optimization and Saliency Aggregation for Mobile Eye-Tracking Systems.

Authors :
Liu, Meng
Li, Youfu
Liu, Hai
Source :
IEEE Transactions on Instrumentation & Measurement; 2021, Vol. 70, p1-10, 10p
Publication Year :
2021

Abstract

To precisely predict 3-D gaze points, mobile gaze-tracking systems must be calibrated for each subject before first use. However, traditional calibration methods normally require the user to stare at predefined targets in the scene, which is troublesome and time-consuming. In this study, we propose a novel method that removes explicit user calibration and achieves robust 3-D gaze estimation in a room-scale area. The proposed framework treats salient regions in the scene as possible 3-D locations of gaze points. To improve the efficiency of predicting 3-D gaze from visual saliency, a bag-of-words algorithm is adopted to eliminate redundant scene images based on their similarity. After this elimination, saliency maps are generated from the remaining scene images, and the geometric relationship between the scene and eye cameras is obtained by aggregating 3-D salient targets with the eye's visual directions. Finally, we compute the 3-D point of regard (PoR) using the 3-D structure of the scene. The experimental results indicate that our method enhances the reliability of saliency maps and achieves promising performance on 3-D gaze estimation across different subjects. [ABSTRACT FROM AUTHOR]
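The abstract does not give implementation details, but the redundancy-elimination step it describes (representing scene frames as bag-of-visual-words histograms and discarding frames that are too similar to ones already kept) can be sketched as follows. This is a minimal illustration, not the authors' code: the ORB detector, vocabulary size, similarity threshold, and function names are all assumptions made for the example.

```python
# Hypothetical sketch of bag-of-visual-words redundancy elimination for
# scene frames; parameters and design choices are illustrative, not the
# paper's actual implementation.
import cv2
import numpy as np
from sklearn.cluster import MiniBatchKMeans


def bow_histograms(frames, vocab_size=64):
    """Compute a normalized bag-of-visual-words histogram per grayscale frame."""
    orb = cv2.ORB_create()
    per_frame_desc = []
    for img in frames:
        _, desc = orb.detectAndCompute(img, None)
        per_frame_desc.append(desc if desc is not None else np.empty((0, 32), np.uint8))

    # Build the visual vocabulary by clustering all descriptors.
    all_desc = np.vstack([d for d in per_frame_desc if len(d)]).astype(np.float32)
    vocab = MiniBatchKMeans(n_clusters=vocab_size, random_state=0).fit(all_desc)

    hists = []
    for desc in per_frame_desc:
        h = np.zeros(vocab_size, np.float32)
        if len(desc):
            words = vocab.predict(desc.astype(np.float32))
            for w in words:
                h[w] += 1.0
            h /= h.sum()
        hists.append(h)
    return hists


def keep_distinct_frames(frames, sim_threshold=0.9):
    """Keep a frame only if its histogram is dissimilar to every kept frame."""
    hists = bow_histograms(frames)
    kept = []
    for i, h in enumerate(hists):
        similar = False
        for j in kept:
            denom = np.linalg.norm(h) * np.linalg.norm(hists[j]) + 1e-8
            if float(np.dot(h, hists[j])) / denom >= sim_threshold:
                similar = True
                break
        if not similar:
            kept.append(i)
    return [frames[i] for i in kept]
```

The surviving frames would then feed the saliency-map generation and the aggregation of 3-D salient targets with eye visual directions described in the abstract; those later stages are not sketched here.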

Subjects

GAZE
EYE tracking

Details

Language :
English
ISSN :
0018-9456
Volume :
70
Database :
Complementary Index
Journal :
IEEE Transactions on Instrumentation & Measurement
Publication Type :
Academic Journal
Accession number :
170415297
Full Text :
https://doi.org/10.1109/TIM.2021.3065437