
SynthoGestures: A Novel Framework for Synthetic Dynamic Hand Gesture Generation for Driving Scenarios

Authors :
Gomaa, Amr
Zitt, Robin
Reyes, Guillermo
Krüger, Antonio
Publication Year :
2023

Abstract

Creating a diverse and comprehensive dataset of hand gestures for dynamic human-machine interfaces in the automotive domain can be challenging and time-consuming. To overcome this challenge, we propose using synthetic gesture datasets generated by virtual 3D models. Our framework utilizes Unreal Engine to synthesize realistic hand gestures, offering customization options and reducing the risk of overfitting. Multiple variants, including gesture speed, performance, and hand shape, are generated to improve generalizability. In addition, we simulate different camera locations and types, such as RGB, infrared, and depth cameras, without incurring the additional time and cost of obtaining these cameras. Experimental results demonstrate that our proposed framework, SynthoGestures (https://github.com/amrgomaaelhady/SynthoGestures), improves gesture recognition accuracy and can replace or augment real-hand datasets. By saving time and effort in dataset creation, our tool accelerates the development of gesture recognition systems for automotive applications.

Comment: Shorter versions accepted as an AutomotiveUI 2023 Work-in-Progress paper and a UIST 2023 Poster paper.

Details

Database :
OAIster
Publication Type :
Electronic Resource
Accession number :
edsoai.on1438478052
Document Type :
Electronic Resource
Full Text :
https://doi.org/10.1145/3581961.3609889