
A Multiparadigm Approach to Integrate Gestures and Sound in the Modeling Framework

Authors :
Amaral, Vasco
Cicchetti, Antonio
Deshayes, Romuald
Publication Year :
2013

Abstract

One of the essential means of supporting Human-Machine Interaction is a (software) language, exploited to input commands and receive corresponding outputs in a well-defined manner. In the past, language creation and customization used to be accessible to software developers only. But today, as software applications gain more ubiquity, these features tend to become more accessible to application users themselves. However, current language development techniques are still based on traditional concepts of human-machine interaction, i.e. manipulating text and/or diagrams by means of more or less sophisticated keypads (e.g. mouse and keyboard).

In this paper we propose to enhance the typical approach for dealing with language-intensive applications by widening the available human-machine interactions to multiple modalities, including sounds, gestures, and their combination. In particular, we adopt a Multi-Paradigm Modelling approach in which the forms of interaction can be specified by means of appropriate modelling techniques. The aim is to provide more advanced human-machine interaction support for language-intensive applications.

Details

Database :
OAIster
Notes :
English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1233492304
Document Type :
Electronic Resource