
Adolescents Environmental Emotion Perception by Integrating EEG and Eye Movements

Authors:
Yuanyuan Su
Wenchao Li
Ning Bi
Zhao Lv
Source:
Frontiers in Neurorobotics, Vol 13 (2019)
Publication Year:
2019
Publisher:
Frontiers Media S.A., 2019.

Abstract

Giving a robot the ability to perceive emotion in its environment can improve human-robot interaction (HRI), thereby facilitating more human-like communication. To achieve emotion recognition in different built environments for adolescents, we propose a multi-modal emotion intensity perception method that integrates electroencephalography (EEG) and eye movement information. Specifically, we first develop a new stimulus video selection method based on the computation of normalized arousal and valence scores from participants' subjective feedback. Then, we establish a valence perception sub-model and an arousal perception sub-model by collecting and analyzing emotional EEG and eye movement signals, respectively. We employ this dual recognition method to perceive emotional intensities synchronously in two dimensions. In the laboratory environment, the best recognition accuracies of the modality fusion for the arousal and valence dimensions are 72.8% and 69.3%, respectively. The experimental results validate the feasibility of the proposed multi-modal emotion recognition method for environmental emotion intensity perception. This promising tool not only achieves more accurate emotion perception for HRI systems but also provides an alternative approach to quantitatively assessing environmental psychology.
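The stimulus-selection step described above relies on normalized arousal and valence scores computed from participants' subjective ratings. The record does not reproduce the paper's exact formula, so the min-max normalization sketched below is only an illustrative assumption; the function name, the 1-9 rating scale, and the example values are hypothetical.

```python
import numpy as np


def normalized_scores(ratings):
    """Min-max normalize per-video mean subjective ratings to [0, 1].

    `ratings` holds one mean arousal (or valence) rating per candidate
    stimulus video. This normalization scheme is an assumption; the
    paper's actual computation may differ.
    """
    r = np.asarray(ratings, dtype=float)
    return (r - r.min()) / (r.max() - r.min())


# Hypothetical mean arousal ratings for five candidate videos (1-9 scale).
arousal = normalized_scores([3.2, 7.5, 5.1, 8.0, 2.4])
# Videos whose normalized arousal/valence scores are most extreme could
# then be selected as stimuli for the two emotion dimensions.
```

Scaling both dimensions to a common [0, 1] range makes arousal and valence scores directly comparable when ranking candidate videos.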

Details

Language:
English
ISSN:
1662-5218
Volume:
13
Database:
Directory of Open Access Journals
Journal:
Frontiers in Neurorobotics
Publication Type:
Academic Journal
Accession number:
edsdoj.7ca6aae437db417e8bcc7526d7e1aa5d
Document Type:
article
Full Text:
https://doi.org/10.3389/fnbot.2019.00046