Characterizing information access needs in gaze-adaptive augmented reality interfaces: implications for fast-paced and dynamic usage contexts.
- Source: Human-Computer Interaction; 2024, Vol. 39, Issue 5/6, p553-583, 31p
- Publication Year: 2024
Abstract
- Gaze-adaptive interfaces can enable intuitive hands-free augmented reality (AR) interaction, but unintentional selection (i.e., "Midas Touch") can have serious consequences during high-stakes real-world AR use. In the present study, we assessed how simulated gaze-adaptive AR interfaces, implementing single and dual gaze inputs, influence Soldiers' human performance and user experience (UX) in a fast-paced virtual reality marksmanship task. In Experiment 1, we investigated 1- and 2-stage dwell-based interfaces, finding that confirmatory dual gaze dwell input effectively reduced Midas Touch but also reduced task performance and UX compared to an always-on (AO) interface. In Experiment 2, we investigated gaze depth-based interfaces, finding similar negative impacts of confirmatory dwell on Midas Touch, task performance, and UX. Overall, compared to the AO interface, single gaze input interfaces (e.g., single dwell or gaze depth threshold) reduced viewing of task-irrelevant information and yielded similar task performance and UX despite being prone to Midas Touch. Broadly, our findings demonstrate that AR users performing fast-paced dynamic tasks can tolerate some unintentional activation of AR displays if reliable and rapid information access is maintained, and they point to the need to develop and refine gaze depth estimation algorithms and novel gaze depth-based interfaces that provide rapid access to AR display content. [ABSTRACT FROM AUTHOR]
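- To make the interface concepts concrete, the sketch below contrasts the 1-stage and 2-stage dwell schemes described in the abstract. This is an illustrative reconstruction in Python, not the authors' implementation: the dwell thresholds, the per-frame update loop, and the `confirm_widget` target name are all assumptions made for the example.

```python
# Illustrative sketch (assumed, not from the paper): 1-stage dwell
# selection vs. 2-stage confirmatory dwell selection for gaze input.
from typing import Optional


class DwellSelector:
    """1-stage dwell: activate a target after `dwell_s` seconds of
    uninterrupted fixation. Fast, but any sufficiently long glance
    triggers activation (the "Midas Touch" problem)."""

    def __init__(self, dwell_s: float = 0.5) -> None:  # threshold is an assumption
        self.dwell_s = dwell_s
        self._target: Optional[str] = None
        self._since = 0.0

    def update(self, gazed: Optional[str], now: float) -> Optional[str]:
        if gazed != self._target:            # gaze moved: restart the timer
            self._target, self._since = gazed, now
            return None
        if gazed is not None and now - self._since >= self.dwell_s:
            self._since = now                # reset so we don't re-fire every frame
            return gazed                     # selection event
        return None


class ConfirmDwellSelector:
    """2-stage dwell: a first dwell only *arms* a candidate target; a
    second dwell on a confirm widget commits it. This suppresses
    unintended activations at the cost of slower information access,
    the trade-off the study reports."""

    def __init__(self, dwell_s: float = 0.5, confirm_s: float = 0.3) -> None:
        self._arm = DwellSelector(dwell_s)
        self._confirm = DwellSelector(confirm_s)
        self._candidate: Optional[str] = None

    def update(self, gazed: Optional[str], now: float) -> Optional[str]:
        if self._candidate is None:
            self._candidate = self._arm.update(gazed, now)   # stage 1: arm
            return None
        # Stage 2: only a completed dwell on the confirm widget commits.
        if self._confirm.update(gazed, now) == "confirm_widget":
            committed, self._candidate = self._candidate, None
            return committed
        if gazed not in (self._candidate, "confirm_widget"):
            self._candidate = None           # gaze left both targets: disarm
        return None
```

- In this sketch, the extra confirmatory stage is exactly what trades Midas Touch resistance against selection speed, mirroring the performance and UX costs the experiments attribute to confirmatory dual gaze input.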
Details
- Language: English
- ISSN: 0737-0024
- Volume: 39
- Issue: 5/6
- Database: Complementary Index
- Journal: Human-Computer Interaction
- Publication Type: Academic Journal
- Accession Number: 179415594
- Full Text: https://doi.org/10.1080/07370024.2023.2260788