
Learning spectral transform for 3D human motion prediction.

Authors:
Kim, Boeun
Choi, Jin Young
Source:
Computer Vision & Image Understanding; Oct 2022, Vol. 223
Publication Year:
2022

Abstract

In existing motion prediction methods that use graph convolutional networks, motion sequences are transformed into a spectral domain, and future motions are predicted through graph spectral filtering of the transformed spectral sequences. However, because the conventional spectral transform uses a predetermined spectral basis, prediction does not work well for aperiodic or complicated motions. To overcome this problem, we propose a method that learns the spectral-domain transform from the motion sequences in the training dataset. To this end, two methods are explored: one learns the frequency of each spectral basis, and the other learns the values of the basis functions directly. Through experiments on the representative 3D human motion benchmarks, H3.6M and CMU Mocap, we demonstrate that both proposed methods consistently outperform the baseline; in particular, directly learning the basis functions outperforms the state-of-the-art methods. We also demonstrate that the proposed method yields realistic predictions, even for aperiodic and complicated action categories.

• Proposes a novel GCN-based motion prediction method that learns the spectral transform.
• Tackles the limitation of fixed spectral basis functions in conventional methods.
• Automatically learns the spectral basis without manually setting the number of basis functions.
• Generates realistic predictions for aperiodic and dynamic action categories.
• Sets a new best performance in 3D human motion prediction on H3.6M and CMU Mocap.

[ABSTRACT FROM AUTHOR]
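The record contains no code, but the core idea described in the abstract, replacing the fixed spectral basis (typically the DCT) used by prior GCN-based predictors with a basis learned end-to-end, can be sketched briefly. Below is a minimal PyTorch sketch under stated assumptions: the names LearnableSpectralTransform, dct_basis, and n_frames are illustrative inventions, the basis is initialized to the orthonormal DCT-II, and the inverse transform is approximated by the basis transpose. This is not the authors' implementation, only an illustration of the "directly learning the basis function" variant.

```python
import math

import torch
import torch.nn as nn


def dct_basis(n: int) -> torch.Tensor:
    """Orthonormal DCT-II basis matrix (n x n); row k is the k-th basis vector."""
    t = torch.arange(n, dtype=torch.float32)
    k = t.unsqueeze(1)  # (n, 1) frequency index
    basis = torch.cos(math.pi * (2 * t + 1) * k / (2 * n))
    basis[0] *= 1.0 / math.sqrt(2)
    return basis * math.sqrt(2.0 / n)


class LearnableSpectralTransform(nn.Module):
    """Temporal spectral transform whose basis is learned end-to-end.

    The basis is initialized to the DCT (the predetermined transform used in
    conventional GCN-based predictors) and then updated by gradient descent,
    so it can adapt to aperiodic or complicated motions.
    """

    def __init__(self, n_frames: int):
        super().__init__()
        self.basis = nn.Parameter(dct_basis(n_frames))  # (n_frames, n_frames)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, joints, n_frames) joint trajectories over time.
        return x @ self.basis.t()  # project each trajectory onto the learned basis

    def inverse(self, c: torch.Tensor) -> torch.Tensor:
        # Approximate inverse via the (initially orthonormal) basis transpose.
        return c @ self.basis


if __name__ == "__main__":
    lst = LearnableSpectralTransform(n_frames=10)
    x = torch.randn(8, 66, 10)   # batch of 8; 22 joints x 3 coords; 10 frames
    coeffs = lst(x)              # spectral coefficients to be filtered by a GCN
    x_rec = lst.inverse(coeffs)  # map filtered coefficients back to time domain
    print(coeffs.shape, x_rec.shape)
```

In a full predictor, the coefficients produced by forward() would pass through graph convolution layers before being mapped back to the time domain. Initializing from the DCT means the model starts from the conventional fixed-basis behavior and deviates from it only where the training data demands.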

Details

Language:
English
ISSN:
1077-3142
Volume:
223
Database:
Supplemental Index
Journal:
Computer Vision & Image Understanding
Publication Type:
Academic Journal
Accession Number:
159289459
Full Text:
https://doi.org/10.1016/j.cviu.2022.103548