1. The MAMEM Project - A Dataset for Multimodal Human-Computer Interaction Using Biosignals and Eye Tracking Information
- Author
Nikolopoulos, Spiros, Georgiadis, Kostas, Kalaganis, Fotis, Liaros, Georgios, Lazarou, Ioulietta, Adam, Katerina, Papazoglou-Chalikias, Anastasios, Chatzilari, Elisavet, Oikonomou, Vangelis P., Petrantonakis, Panagiotis C., Kompatsiaris, I., Kumar, Chandan, Menges, Raphael, Staab, Steffen, Müller, Daniel, Sengupta, Korok, Bostantjopoulou, Sevasti, Katsarou, Zoe, Zeilig, Gabi, Plotnin, Meir, Gottlieb, Amihai, Fountoukidou, Sofia, Ham, Jaap, Athanasiou, Dimitrios, Mariakaki, Agnes, Comanducci, Dario, Sabatini, Eduardo, Nistico, Walter, and Plank, Markus
- Subjects
BCI, brain-computer interfaces, biosignals, eye tracking, multimodal, human-computer interaction, EEG, GSR, HR, GazeTheWeb, training, hands-free interaction, people with disabilities
- Abstract
In this report, we present a dataset combining multimodal biosignals and eye-tracking information gathered under a human-computer interaction framework. The dataset was developed within the MAMEM project, which aims to endow people with motor disabilities with the ability to edit and author multimedia content through mental commands and gaze activity. The dataset includes EEG, eye-tracking, and physiological (GSR and heart rate) signals, along with demographic, clinical, and behavioral data collected from 36 individuals (18 able-bodied and 18 motor-impaired). Data were collected during interaction with specifically designed interfaces for web browsing and multimedia content manipulation, and during imaginary movement tasks. Alongside these data, we also include evaluation reports from both the subjects and the experimenters regarding the experimental procedure and the collected dataset. We believe that the presented dataset will contribute to the development and evaluation of modern human-computer interaction systems that would foster the reintegration of people with severe motor impairments into society.
- Published
- 2017