Enhancing Human–Robot Interaction: Development of Multimodal Robotic Assistant for User Emotion Recognition
- Authors
Sergio Garcia, Francisco Gomez-Donoso, and Miguel Cazorla
- Subjects
human–robot interaction, multimodal emotion recognition, social robotics, humanoid robots, natural language processing
- Abstract
This paper presents a study on enhancing human–robot interaction (HRI) through multimodal emotional recognition within social robotics. Using the humanoid robot Pepper as a testbed, we integrate visual, auditory, and textual analysis to improve emotion recognition accuracy and contextual understanding. The proposed framework combines pretrained neural networks with fine-tuning techniques tailored to specific users, demonstrating that high accuracy in emotion recognition can be achieved by adapting the models to the individual emotional expressions of each user. This approach addresses the inherent variability in emotional expression across individuals, making it feasible to deploy personalized emotion recognition systems. Our experiments validate the effectiveness of this methodology, achieving high precision in multimodal emotion recognition through fine-tuning, while maintaining adaptability in real-world scenarios. These enhancements significantly improve Pepper’s interactive and empathetic capabilities, allowing it to engage more naturally with users in assistive, educational, and healthcare settings. This study not only advances the field of HRI but also provides a reproducible framework for integrating multimodal emotion recognition into commercial humanoid robots, bridging the gap between research prototypes and practical applications.
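The abstract describes fusing visual, auditory, and textual signals into a single emotion prediction. The paper does not publish its code, so the following is only an illustrative sketch of one common approach, late fusion by weighted averaging of per-modality class probabilities; the function names, weights, and labels are hypothetical, not taken from the paper.

```python
# Illustrative late-fusion sketch; names and weights are hypothetical,
# not the authors' published implementation.

def fuse_modalities(visual, audio, text, weights=(0.4, 0.3, 0.3)):
    """Weighted average of per-modality class-probability vectors."""
    fused = [
        weights[0] * v + weights[1] * a + weights[2] * t
        for v, a, t in zip(visual, audio, text)
    ]
    total = sum(fused)
    return [p / total for p in fused]  # renormalize to a distribution

def predict_emotion(probs, labels):
    """Return the label with the highest fused probability."""
    return max(zip(probs, labels))[1]

# Toy example: each list is a probability vector over the same labels.
labels = ["happy", "sad", "neutral"]
visual = [0.7, 0.1, 0.2]   # e.g. from a facial-expression model
audio  = [0.5, 0.3, 0.2]   # e.g. from a speech-prosody model
text   = [0.6, 0.2, 0.2]   # e.g. from a transcript sentiment model

fused = fuse_modalities(visual, audio, text)
# "happy" dominates in all three modalities, so the fused prediction agrees.
```

Per-user fine-tuning, as the abstract describes it, would correspond to adapting the individual modality models (or the fusion weights) on examples of that user's own expressions rather than changing this fusion step.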
- Published
- 2024