
The Mamem Project - A Dataset For Multimodal Human-Computer Interaction Using Biosignals And Eye Tracking Information

Authors :
Nikolopoulos, Spiros
Georgiadis, Kostas
Kalaganis, Fotis
Liaros, Georgios
Lazarou, Ioulietta
Adam, Katerina
Papazoglou-Chalikias, Anastasios
Chatzilari, Elisavet
Oikonomou, Vangelis P.
Petrantonakis, Panagiotis C.
Kompatsiaris, I.
Kumar, Chandan
Menges, Raphael
Staab, Steffen
Müller, Daniel
Sengupta, Korok
Bostantjopoulou, Sevasti
Katsarou, Zoe
Zeilig, Gabi
Plotnik, Meir
Gottlieb, Amihai
Fountoukidou, Sofia
Ham, Jaap
Athanasiou, Dimitrios
Mariakaki, Agnes
Comanducci, Dario
Sabatini, Eduardo
Nistico, Walter
Plank, Markus
Publication Year :
2017
Publisher :
Zenodo, 2017.

Abstract

This report presents a dataset combining multimodal biosignals and eye-tracking information gathered within a human-computer interaction framework. The dataset was developed as part of the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart-rate) signals, along with demographic, clinical, and behavioral data, collected from 36 individuals (18 able-bodied and 18 motor-impaired). Data were collected during interaction with a specifically designed interface for web browsing and multimedia content manipulation, and during imaginary-movement tasks. Alongside these data, we also include evaluation reports from both the subjects and the experimenters concerning the experimental procedure and the collected dataset. We believe that the presented dataset will contribute to the development and evaluation of modern human-computer interaction systems and foster the reintegration of people with severe motor impairments into society.
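To make the dataset's composition concrete, the sketch below models one way the modalities and cohort described in the abstract might be organized in code. The `SubjectRecord` class, its field names, and the subject-ID scheme are hypothetical illustrations, not the dataset's actual file format or API.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class SubjectRecord:
    """One participant's multimodal recordings (hypothetical layout)."""
    subject_id: str
    motor_impaired: bool                                   # cohort: 18 able-bodied, 18 motor-impaired
    eeg: List[float] = field(default_factory=list)         # EEG samples
    gaze: List[Tuple[float, float]] = field(default_factory=list)  # eye-tracking (x, y) points
    gsr: List[float] = field(default_factory=list)         # galvanic skin response
    heart_rate: List[float] = field(default_factory=list)  # heart-rate samples

def cohort_summary(records):
    """Count able-bodied vs. motor-impaired participants."""
    impaired = sum(1 for r in records if r.motor_impaired)
    return {"motor_impaired": impaired, "able_bodied": len(records) - impaired}

# 36 participants split evenly, matching the cohort described in the abstract
records = [SubjectRecord(f"S{i:02d}", motor_impaired=(i >= 18)) for i in range(36)]
print(cohort_summary(records))  # {'motor_impaired': 18, 'able_bodied': 18}
```

Grouping each subject's signals in one record mirrors the per-individual collection protocol; per-modality arrays could instead be stored separately if the signals are sampled at different rates.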

Details

Database :
OpenAIRE
Accession number :
edsair.doi.dedup.....14c48f0aee961ba84bf86f0299b5e9d3
Full Text :
https://doi.org/10.5281/zenodo.834154