
Decoding dynamic affective responses to naturalistic videos with shared neural patterns

Authors :
Chan, H.Y. (Hang-Yee)
Smidts, A. (Ale)
Schoots, V.C. (Vincent)
Sanfey, A.G. (Alan)
Boksem, M.A.S. (Maarten)
Publication Year :
2020

Abstract

This study explored the feasibility of using shared neural patterns from brief affective episodes (viewing affective pictures) to decode extended, dynamic affective sequences in a naturalistic experience (watching movie-trailers). Twenty-eight participants viewed pictures from the International Affective Picture System (IAPS) and, in a separate session, watched various movie-trailers. We first located voxels in the bilateral lateral occipital cortex (LOC) responsive to affective picture categories by GLM analysis, then performed between-subject hyperalignment on the LOC voxels based on their responses during movie-trailer watching. After hyperalignment, we trained between-subject machine learning classifiers on the affective pictures, and used these classifiers to decode the affective states of an out-of-sample participant both during picture viewing and during movie-trailer watching. Within participants, the neural classifiers identified the valence and arousal categories of pictures and tracked self-reported valence and arousal during video watching. In aggregate, the neural classifiers produced valence and arousal time series that tracked the dynamic ratings of the movie-trailers obtained from a separate sample. Our findings
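The between-subject decoding described above hinges on hyperalignment: rotating each participant's voxel response patterns into a shared space so that a classifier trained on some participants transfers to an out-of-sample one. The sketch below illustrates the core idea with an orthogonal Procrustes alignment on synthetic data; it is not the authors' implementation, and the noise level, reference-subject choice, and variable names are illustrative assumptions.

```python
import numpy as np

def procrustes_align(source, target):
    """Orthogonal rotation R minimizing ||source @ R - target||_F."""
    u, _, vt = np.linalg.svd(source.T @ target)
    return u @ vt

rng = np.random.default_rng(0)
n_time, n_vox, n_subj = 200, 50, 4  # illustrative sizes, not the study's

# Simulate a shared stimulus-driven response, seen by each subject
# through an idiosyncratic rotation of voxel space plus noise.
shared = rng.standard_normal((n_time, n_vox))
rotations = [np.linalg.qr(rng.standard_normal((n_vox, n_vox)))[0]
             for _ in range(n_subj)]
subjects = [shared @ R + 0.1 * rng.standard_normal((n_time, n_vox))
            for R in rotations]

# Hyperalign every subject to the first subject's voxel space.
reference = subjects[0]
aligned = [s @ procrustes_align(s, reference) for s in subjects]

# Alignment should greatly increase similarity to the reference.
before = np.mean([np.corrcoef(s.ravel(), reference.ravel())[0, 1]
                  for s in subjects[1:]])
after = np.mean([np.corrcoef(a.ravel(), reference.ravel())[0, 1]
                 for a in aligned[1:]])
```

In the study's setup, the alignment transforms are estimated from responses to one stimulus set (movie-trailer watching) and then applied before training and testing classifiers on another (affective pictures), which is what makes leave-one-subject-out decoding feasible.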

Details

Database :
OAIster
Notes :
application/pdf, NeuroImage, English
Publication Type :
Electronic Resource
Accession number :
edsoai.on1143370663
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1016/j.neuroimage.2020.116618