
Multi-scale gestural interaction for augmented reality

Authors :
Barrett Ens
Aaron Quigley
Hui-Shyong Yeo
Pourang Irani
Mark Billinghurst
University of St Andrews, School of Computer Science
SIGGRAPH Asia 2017 Mobile Graphics and Interactive Applications, Bangkok, Thailand, 27-30 November 2017
Source :
SIGGRAPH ASIA Mobile Graphics and Interactive Applications
Publication Year :
2017
Publisher :
ACM Press, 2017.

Abstract

We present a multi-scale gestural interface for augmented reality applications. Gestural interactions with virtual objects, such as pointing and grasping, can be convenient and intuitive; however, they are imprecise, socially awkward, and susceptible to fatigue. Our prototype application uses multiple sensors to detect gestures from both arm and hand motions (macro-scale) and finger gestures (micro-scale). Micro-gestures can provide precise input through a belt-worn sensor configuration, with the hand in a relaxed posture. We present an application that combines direct manipulation with micro-gestures for precise interaction, beyond the capabilities of direct manipulation alone.
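The abstract describes routing input at two scales: coarse arm and hand gestures for direct manipulation, and belt-worn finger micro-gestures for precise refinement. As a rough sketch only, not the authors' implementation, the Python below shows one way such a two-scale dispatcher could be structured; every name in it (GestureEvent, MultiScaleDispatcher, "thumb_swipe", nudge) is hypothetical.

```python
from dataclasses import dataclass
from enum import Enum, auto

# Hypothetical event model; names are illustrative, not the paper's API.
class Scale(Enum):
    MACRO = auto()  # arm/hand motion (e.g. from a head- or body-mounted sensor)
    MICRO = auto()  # finger micro-gestures from a belt-worn sensor

@dataclass
class GestureEvent:
    scale: Scale
    name: str      # e.g. "grasp", "release", "thumb_swipe"
    payload: dict  # sensor-specific data (target object, deltas, etc.)

class MultiScaleDispatcher:
    """Route events from two sensor streams to the appropriate handler.

    Macro events drive coarse direct manipulation (grabbing and releasing
    virtual objects); micro events refine the grabbed object precisely
    while the hand stays in a relaxed posture.
    """
    def __init__(self):
        self.selected_object = None

    def handle(self, event: GestureEvent) -> None:
        if event.scale is Scale.MACRO:
            self._handle_macro(event)
        else:
            self._handle_micro(event)

    def _handle_macro(self, event: GestureEvent) -> None:
        if event.name == "grasp":
            self.selected_object = event.payload.get("target")
        elif event.name == "release":
            self.selected_object = None

    def _handle_micro(self, event: GestureEvent) -> None:
        # Micro-gestures only refine an object already grabbed at macro scale.
        if self.selected_object is None:
            return
        if event.name == "thumb_swipe":
            dx, dy = event.payload.get("delta", (0.0, 0.0))
            # Scale down the delta so micro input gives fine-grained adjustment;
            # nudge() is a hypothetical method on the virtual object.
            self.selected_object.nudge(dx * 0.001, dy * 0.001)
```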

Details

Database :
OpenAIRE
Journal :
SIGGRAPH Asia 2017 Mobile Graphics & Interactive Applications (SA '17)
Accession number :
edsair.doi.dedup.....9cfdfc244fdbccd8155db480410c2a3f
Full Text :
https://doi.org/10.1145/3132787.3132808