1. A multimodal interaction system for big displays
- Authors
- Juan A. Besada, Ana Muñoz, Luca Bergesio, Ana M. Bernardos, and José R. Casar
- Subjects
Vocabulary, Service (systems architecture), Multimedia, Computer science, Control (management), Multimodal interaction, User experience design, Human–computer interaction, Gesture
Big displays and ultrawalls are increasingly present in today's environments (e.g. city spaces, buildings, transportation, teaching rooms, operation rooms, and convention centres), and they are widely used as tools for collaborative work, monitoring, and control in many other contexts. How to make interaction with big displays more natural and fluent is still an open challenge. This paper presents a system for multimodal interaction based on pointing and speech recognition. The system makes it possible for the user to control the big display through a combination of pointing gestures and a set of control commands built on a predefined vocabulary. The system has been prototyped and is being used in service demonstrations for several applications.
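The abstract's core idea is pairing a recognized spoken command (drawn from a predefined vocabulary) with a pointing gesture on the display. A minimal sketch of that fusion step is shown below; the vocabulary, event structure, and time-window matching are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

# Hypothetical command vocabulary; the paper's actual command set is not
# specified in this record.
VOCABULARY = {"open", "close", "zoom", "move"}

@dataclass
class PointingEvent:
    x: float          # normalized screen coordinate in [0, 1]
    y: float          # normalized screen coordinate in [0, 1]
    timestamp: float  # seconds

def fuse(command: str, pointings: list, speech_time: float,
         window: float = 1.0):
    """Pair a recognized command with the pointing event closest in time.

    Returns (command, x, y), or None if the command is outside the
    vocabulary or no pointing event falls within the time window.
    """
    if command not in VOCABULARY:
        return None
    candidates = [p for p in pointings
                  if abs(p.timestamp - speech_time) <= window]
    if not candidates:
        return None
    best = min(candidates, key=lambda p: abs(p.timestamp - speech_time))
    return (command, best.x, best.y)

# Example: the "zoom" command spoken at t=10.0 is paired with the
# pointing event at t=10.1, the closest one inside the 1-second window.
events = [PointingEvent(0.2, 0.8, 9.4), PointingEvent(0.6, 0.5, 10.1)]
print(fuse("zoom", events, speech_time=10.0))
```

The time-window matching here stands in for whatever synchronization scheme the real system uses; the point is only that each modality alone is ambiguous (a command without a target, a location without an action) and the fusion step resolves both.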
- Published
- 2017