Multimodal Affective States Recognition Based on Multiscale CNNs and Biologically Inspired Decision Fusion Model
- Source :
- IEEE Transactions on Affective Computing 14 (2023) 1391-1403
- Publication Year :
- 2019
-
Abstract
- Encouraging progress has been made in recent years on affective state recognition models based on single-modality signals such as electroencephalogram (EEG) signals or peripheral physiological signals. However, affective state recognition methods based on multimodal physiological signals have not yet been thoroughly explored. Here we propose Multiscale Convolutional Neural Networks (Multiscale CNNs) and a biologically inspired decision fusion model for multimodal affective state recognition. First, the raw signals are pre-processed with baseline signals. Then, the High Scale CNN and Low Scale CNN in the Multiscale CNNs predict the probability of each affective state for the EEG signal and for each peripheral physiological signal, respectively. Finally, the fusion model computes the reliability of each single-modality signal from the Euclidean distance between the class labels and the classification probabilities output by the Multiscale CNNs; the decision is made from the more reliable modality while the information from the other modalities is retained. We use this model to classify four affective states in the arousal-valence plane on the DEAP and AMIGOS datasets. The results show that the fusion model significantly improves recognition accuracy compared with single-modality results, achieving 98.52% on DEAP and 99.89% on AMIGOS.
- Comment: 21 pages, 11 figures, 8 tables
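The abstract's fusion rule (reliability from the Euclidean distance between class labels and predicted probabilities, then deciding with the more reliable modality while retaining the other) can be sketched as follows. The exact reliability formula and the weighting scheme are not given in the abstract, so the inverse-distance reliability and the normalized weights below are assumptions for illustration only:

```python
import numpy as np

def reliability(probs):
    # Assumed instantiation: reliability is the inverse of the Euclidean
    # distance between the predicted probability vector and the nearest
    # one-hot class label. A confident, near-one-hot prediction yields a
    # small distance and hence a large reliability.
    probs = np.asarray(probs, dtype=float)
    one_hots = np.eye(len(probs))
    dists = np.linalg.norm(one_hots - probs, axis=1)
    return 1.0 / (1e-8 + dists.min())

def fuse(eeg_probs, peripheral_probs):
    # Decide with the more reliable modality while retaining the other
    # modality's information as a weighted term (hypothetical weighting).
    r_eeg = reliability(eeg_probs)
    r_per = reliability(peripheral_probs)
    w = r_eeg / (r_eeg + r_per)
    fused = w * np.asarray(eeg_probs) + (1 - w) * np.asarray(peripheral_probs)
    return int(np.argmax(fused))
```

In this sketch, a sharp EEG prediction like `[0.9, 0.05, 0.03, 0.02]` dominates a flat peripheral prediction like `[0.3, 0.3, 0.2, 0.2]`, so the fused decision follows the EEG modality.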
Details
- Database :
- arXiv
- Journal :
- IEEE Transactions on Affective Computing 14 (2023) 1391-1403
- Publication Type :
- Report
- Accession number :
- edsarx.1911.12918
- Document Type :
- Working Paper
- Full Text :
- https://doi.org/10.1109/TAFFC.2021.3093923