
Beat-It: Beat-Synchronized Multi-Condition 3D Dance Generation

Authors:
Huang, Zikai
Xu, Xuemiao
Xu, Cheng
Zhang, Huaidong
Zheng, Chenxi
Qin, Jing
He, Shengfeng
Publication Year:
2024

Abstract

Dance, as an art form, fundamentally hinges on precise synchronization with musical beats. However, achieving aesthetically pleasing dance sequences from music is challenging, with existing methods often falling short in controllability and beat alignment. To address these shortcomings, this paper introduces Beat-It, a novel framework for beat-specific, key pose-guided dance generation. Unlike prior approaches, Beat-It uniquely integrates explicit beat awareness and key pose guidance, effectively resolving two main issues: the misalignment of generated dance motions with musical beats, and the inability to map key poses to specific beats, critical for practical choreography. Our approach disentangles beat conditions from music using a nearest beat distance representation and employs a hierarchical multi-condition fusion mechanism. This mechanism seamlessly integrates key poses, beats, and music features, mitigating condition conflicts and offering rich, multi-conditioned guidance for dance generation. Additionally, a specially designed beat alignment loss ensures the generated dance movements remain in sync with the designated beats. Extensive experiments confirm Beat-It's superiority over existing state-of-the-art methods in terms of beat alignment and motion controllability.

Comment: ECCV 2024
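The abstract names two mechanisms, a per-frame "nearest beat distance" beat condition and a beat alignment loss, without giving their formulas here. The sketch below is only an illustration of how such components are commonly realized, not the authors' published implementation; the function names, input shapes, and the Gaussian-shaped penalty are assumptions.

```python
# Minimal sketch, assuming: frame_times and beat_times are timestamps in
# seconds, joint_positions has shape (T, J, 3), and "nearest beat distance"
# means each frame's absolute time offset to the closest musical beat.
# All names here are hypothetical, not taken from the Beat-It codebase.
import numpy as np

def nearest_beat_distance(frame_times, beat_times):
    """Per-frame distance to the closest musical beat (assumed encoding)."""
    frame_times = np.asarray(frame_times, dtype=float)[:, None]   # (T, 1)
    beat_times = np.asarray(beat_times, dtype=float)[None, :]     # (1, B)
    return np.abs(frame_times - beat_times).min(axis=1)           # (T,)

def beat_alignment_loss(joint_positions, frame_times, beat_times, sigma=0.1):
    """Soft penalty encouraging kinematic beats (local minima of joint speed)
    to land on the designated musical beats; a stand-in for the paper's loss."""
    # mean joint speed per frame transition: (T-1,)
    vel = np.linalg.norm(np.diff(joint_positions, axis=0), axis=-1).mean(axis=-1)
    # kinematic beats: frames where the speed reaches a local minimum
    kinematic = np.where((vel[1:-1] < vel[:-2]) & (vel[1:-1] < vel[2:]))[0] + 1
    if len(kinematic) == 0:
        return 0.0
    dist = nearest_beat_distance(np.asarray(frame_times)[kinematic], beat_times)
    # penalty grows smoothly as kinematic beats drift away from musical beats
    return float(np.mean(1.0 - np.exp(-dist ** 2 / (2 * sigma ** 2))))
```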

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2407.07554
Document Type:
Working Paper