1. A multi-componential analysis of emotions during complex learning with an intelligent multi-agent system.
- Author
- Harley, Jason M., Bouchet, François, Hussain, M. Sazzad, Azevedo, Roger, and Calvo, Rafael
- Subjects
- BIOMETRY, EMOTIONS, FACIAL expression, LEARNING, SELF-evaluation, VIDEO recording, UNDERGRADUATES, DESCRIPTIVE statistics
- Abstract
This paper presents an evaluation of the synchronization of three emotional measurement methods (automatic facial expression recognition, self-report, and electrodermal activity) and their agreement regarding learners’ emotions. Data were collected from 67 undergraduates enrolled at a North American university who learned about a complex science topic while interacting with MetaTutor, a multi-agent computerized learning environment. Videos of learners’ facial expressions captured with a webcam were analyzed using automatic facial recognition software (FaceReader 5.0). Learners’ physiological arousal was recorded using Affectiva’s Q-Sensor 2.0 electrodermal activity measurement bracelet. Learners self-reported their experience of 19 different emotional states on five occasions during the learning session; these reports served as markers to synchronize the data from FaceReader and Q-Sensor. We found high agreement between the facial and self-report data (75.6%), but low agreement between each of these and the Q-Sensor data, suggesting that a tightly coupled relationship does not always exist between emotional response components. [ABSTRACT FROM AUTHOR]
- Published
- 2015