A Music-Driven Deep Generative Adversarial Model for Guzheng Playing Animation
- Source :
- IEEE Transactions on Visualization and Computer Graphics. 29:1400-1414
- Publication Year :
- 2023
- Publisher :
- Institute of Electrical and Electronics Engineers (IEEE), 2023.
-
Abstract
- To date, relatively few efforts have been made on the automatic generation of musical instrument playing animations. This problem is challenging due to the intrinsically complex, temporal relationship between music and human motion, as well as the lack of high-quality music-playing motion datasets. In this paper, we propose a fully automatic, deep learning based framework to synthesize realistic upper-body animations from novel guzheng music input. Specifically, based on a recorded audiovisual motion capture dataset, we carefully design a generative adversarial network (GAN) based approach to capture the temporal relationship between the music and the human motion data. In this process, data augmentation is employed to improve the generalization of our approach to a variety of guzheng music inputs. Through extensive objective and subjective experiments, we show that our method can generate visually plausible guzheng-playing animations that are well synchronized with the input guzheng music, and that it significantly outperforms state-of-the-art methods. In addition, through an ablation study, we validate the contributions of the carefully designed modules in our framework.
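To make the abstract's core idea concrete, below is a minimal, illustrative sketch of a conditional GAN that maps audio features (e.g. mel-spectrogram frames) to pose vectors, using linear generator/critic weights, a least-squares GAN loss, and an L2 reconstruction term as a stabilizer. All dimensions, losses, and architecture choices here are assumptions for illustration only, not the paper's actual model.

```python
import numpy as np

rng = np.random.default_rng(0)
AUDIO_DIM, POSE_DIM, N, BATCH = 16, 8, 256, 32  # hypothetical feature sizes

# Toy "dataset": pose is a fixed linear function of the audio features + noise.
W_true = rng.normal(size=(AUDIO_DIM, POSE_DIM))
music = rng.normal(size=(N, AUDIO_DIM))
motion = music @ W_true + 0.05 * rng.normal(size=(N, POSE_DIM))

G = rng.normal(scale=0.1, size=(AUDIO_DIM, POSE_DIM))      # generator weights
D = rng.normal(scale=0.1, size=(AUDIO_DIM + POSE_DIM, 1))  # conditional critic

def recon_error(G):
    """Mean squared error between generated and recorded poses."""
    return float(np.mean((music @ G - motion) ** 2))

err_initial = recon_error(G)
lr = 1e-3
for step in range(2000):
    idx = rng.integers(0, N, size=BATCH)
    x, y = music[idx], motion[idx]
    y_fake = x @ G

    # Least-squares GAN critic: push real (music, motion) pairs toward 1,
    # generated pairs toward 0.
    real_in, fake_in = np.hstack([x, y]), np.hstack([x, y_fake])
    d_real, d_fake = real_in @ D, fake_in @ D
    grad_D = (real_in.T @ (2 * (d_real - 1)) + fake_in.T @ (2 * d_fake)) / BATCH
    D -= lr * grad_D

    # Generator: fool the critic, plus an L2 reconstruction term
    # (a common stabilizer in conditional GANs; an assumption here).
    d_fake = fake_in @ D
    g_adv = 2 * (d_fake - 1) @ D[AUDIO_DIM:].T  # dL_adv / dy_fake
    g_rec = 2 * (y_fake - y)                    # dL_rec / dy_fake
    G -= lr * x.T @ (0.1 * g_adv + g_rec) / BATCH

err_final = recon_error(G)
```

In practice, music-to-motion models of this kind use recurrent or convolutional generators over feature windows rather than a single linear map; the sketch only shows the adversarial pairing of music features with motion data that the abstract describes.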
- Subjects :
- Computer science
Generalization
Process (engineering)
Deep learning
Musical instrument
Animation
Computer Graphics and Computer-Aided Design
Motion capture
Motion (physics)
Human–computer interaction
Signal Processing
Computer Vision and Pattern Recognition
Artificial intelligence
Software
Generative grammar
Details
- ISSN :
- 2160-9306 and 1077-2626
- Volume :
- 29
- Database :
- OpenAIRE
- Journal :
- IEEE Transactions on Visualization and Computer Graphics
- Accession number :
- edsair.doi.dedup.....2209fcd14a59877e4bf1ac27e0e5d749