
RSMT: Real-time Stylized Motion Transition for Characters

Authors:
Tang, Xiangjun
Wu, Linjun
Wang, He
Hu, Bo
Gong, Xu
Liao, Yuchen
Li, Songnan
Kou, Qilong
Jin, Xiaogang
Source:
SIGGRAPH 2023 Conference Proceedings
Publication Year:
2023

Abstract

Styled online in-between motion generation has important application scenarios in computer animation and games. Its core challenge lies in the need to satisfy four critical requirements simultaneously: generation speed, motion quality, style diversity, and synthesis controllability. While the first two requirements demand a delicate balance between simple, fast models and the learning capacity needed for generation quality, the latter two are rarely investigated together in existing methods, which largely focus on either control without style or uncontrolled stylized motion. To this end, we propose a Real-time Stylized Motion Transition method (RSMT) that achieves all of the aforementioned goals. Our method consists of two critical, independent components: a general motion manifold model and a style motion sampler. The former acts as a high-quality motion source, and the latter synthesizes styled motions on the fly under control signals. Since both components can be trained separately on different datasets, our method provides great flexibility, requires less data, and generalizes well when no or few samples are available for unseen styles. Through exhaustive evaluation, our method proves to be fast, high-quality, versatile, and controllable. The code and data are available at https://github.com/yuyujunjun/RSMT-Realtime-Stylized-Motion-Transition.
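The two-component design described above — an independently trained motion source consumed by a separately trained, control-conditioned sampler — can be illustrated with a minimal conceptual sketch. This is not the paper's implementation (see the linked repository for that); every class, dimension, and interface below is an illustrative assumption.

```python
# Conceptual sketch of a two-component pipeline: a motion manifold model
# decodes latent codes into poses, while an independent style sampler
# produces those codes on the fly from control and style inputs.
# All names, shapes, and math here are hypothetical, not RSMT's actual design.
import numpy as np


class MotionManifold:
    """Stand-in for the general motion manifold model: maps latent
    codes to plausible pose frames (the high-quality motion source)."""

    def __init__(self, latent_dim=8, pose_dim=16, seed=0):
        rng = np.random.default_rng(seed)
        # A fixed linear "decoder" substitutes for a trained network.
        self.decoder = 0.1 * rng.standard_normal((latent_dim, pose_dim))

    def decode(self, z):
        # (frames, latent_dim) -> (frames, pose_dim)
        return z @ self.decoder


class StyleSampler:
    """Stand-in for the style motion sampler: emits a latent trajectory
    conditioned on a control signal and a style embedding."""

    def __init__(self, latent_dim=8):
        self.latent_dim = latent_dim

    def sample(self, n_frames, style_vec, control):
        # Interpolate toward the control target, then add a style offset.
        progress = np.linspace(0.0, 1.0, n_frames)[:, None]
        z = progress * control[None, : self.latent_dim]
        return z + 0.05 * style_vec[None, : self.latent_dim]


# Because the components are independent, they could be "trained"
# (here, constructed) separately and composed only at inference time.
manifold = MotionManifold()
sampler = StyleSampler()

style = np.ones(8)        # hypothetical style embedding
control = np.arange(8.0)  # hypothetical control signal (e.g. target keyframe code)

latents = sampler.sample(n_frames=30, style_vec=style, control=control)
motion = manifold.decode(latents)
print(motion.shape)  # (30, 16): 30 in-between frames of a 16-D pose
```

The point of the sketch is the interface boundary: swapping in a different `StyleSampler` (a new style, or few-shot adaptation) requires no change to the `MotionManifold`, which mirrors the flexibility and data-efficiency claims in the abstract.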

Details

Database:
arXiv
Journal:
SIGGRAPH 2023 Conference Proceedings
Publication Type:
Report
Accession number:
edsarx.2306.11970
Document Type:
Working Paper
Full Text:
https://doi.org/10.1145/3588432.3591514