
Dynamic Compositional Graph Convolutional Network for Efficient Composite Human Motion Prediction

Authors :
Zhang, Wanying
Zhao, Shen
Meng, Fanyang
Wu, Songtao
Liu, Mengyuan
Source :
Proceedings of the 31st ACM International Conference on Multimedia, October 2023, Pages 2856-2864
Publication Year :
2023

Abstract

With potential applications in fields such as intelligent surveillance and human-robot interaction, human motion prediction has become a hot research topic and has achieved considerable success, especially with the recent Graph Convolutional Network (GCN). Current human motion prediction work usually focuses on predicting motions for atomic actions. Observing that atomic actions can happen at the same time and thereby form composite actions, we propose the composite human motion prediction task. To handle this task, we first present a Composite Action Generation (CAG) module that generates synthetic composite actions for training, avoiding the laborious work of collecting composite action samples. Moreover, we alleviate the increased model complexity that composite actions would otherwise demand by presenting a Dynamic Compositional Graph Convolutional Network (DC-GCN). Extensive experiments on the Human3.6M dataset and our newly collected CHAMP dataset consistently verify the efficiency of our DC-GCN method, which achieves state-of-the-art motion prediction accuracy while requiring little extra computational cost compared to traditional GCN-based human motion prediction methods.
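For readers unfamiliar with GCN-based motion prediction, the sketch below illustrates the general idea the abstract builds on: treating the skeleton as a graph of joints and propagating joint features along its edges. It is a minimal, hypothetical example and is not the paper's DC-GCN or CAG module; the toy skeleton, adjacency, feature sizes, and random weights are illustrative assumptions only.

```python
import numpy as np

# Minimal sketch of one graph-convolution step over a skeleton graph,
# in the spirit of GCN-based human motion prediction.  NOT the paper's
# DC-GCN: the joint graph and weights below are illustrative placeholders.

num_joints = 5            # toy skeleton: hip, spine, head, left arm, right arm
feat_in, feat_out = 3, 3  # 3-D joint coordinates in and out

# Hypothetical adjacency over joints (1 = connected), row-normalized so
# each joint aggregates an average of itself and its neighbours.
A = np.array([
    [1, 1, 0, 0, 0],   # hip   - spine
    [1, 1, 1, 1, 1],   # spine - hip, head, both arms
    [0, 1, 1, 0, 0],   # head  - spine
    [0, 1, 0, 1, 0],   # left arm  - spine
    [0, 1, 0, 0, 1],   # right arm - spine
], dtype=float)
A = A / A.sum(axis=1, keepdims=True)

rng = np.random.default_rng(0)
W = rng.standard_normal((feat_in, feat_out)) * 0.1  # learnable weights (random here)
X = rng.standard_normal((num_joints, feat_in))      # current pose (joint coordinates)

# One propagation step: mix features along the joint graph, then transform.
# A full predictor stacks several such layers and typically adds a residual.
H = A @ X @ W
X_pred = X + H           # crude residual "next pose" estimate
print(X_pred.shape)      # (5, 3)
```

In practice such layers are stacked and trained end to end; the paper's contribution lies in how the graph and its composition are handled dynamically for composite actions, which this toy example does not attempt to reproduce.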

Details

Database :
arXiv
Journal :
Proceedings of the 31st ACM International Conference on Multimedia, October 2023, Pages 2856-2864
Publication Type :
Report
Accession number :
edsarx.2311.13781
Document Type :
Working Paper
Full Text :
https://doi.org/10.1145/3581783.3612532