
PREF: Predictability Regularized Neural Motion Fields

Authors:
Song, Liangchen
Gong, Xuan
Planche, Benjamin
Zheng, Meng
Doermann, David
Yuan, Junsong
Chen, Terrence
Wu, Ziyan
Publication Year:
2022

Abstract

Knowing the 3D motions in a dynamic scene is essential to many vision applications. Recent progress has mainly focused on estimating the motion of specific elements such as humans. In this paper, we leverage a neural motion field to estimate the motion of all points in a multiview setting. Modeling the motion of a dynamic scene from multiview data is challenging due to ambiguities between points of similar color and points with time-varying color. We propose to regularize the estimated motion to be predictable: if the motion from previous frames is known, then the motion in the near future should be predictable. We therefore introduce a predictability regularization by first conditioning the estimated motion on latent embeddings, and then adopting a predictor network that enforces predictability on those embeddings. The proposed framework, PREF (Predictability REgularized Fields), achieves results on par with or better than state-of-the-art neural motion field-based dynamic scene representation methods, while requiring no prior knowledge of the scene.

Comment: Accepted at ECCV 2022 (oral). Paper + supplementary material
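To make the regularization idea concrete, below is a minimal PyTorch-style sketch, not the authors' implementation: a motion field conditioned on learnable per-frame latent embeddings, plus a small predictor network trained to forecast each embedding from its predecessors, with the prediction error used as a regularizer. All class names, dimensions, history length, and the loss formulation are illustrative assumptions.

```python
# Minimal sketch (illustrative only) of predictability-regularized motion embeddings.
import torch
import torch.nn as nn

class MotionField(nn.Module):
    """Maps a 3D point plus a per-frame latent embedding to a 3D motion offset."""
    def __init__(self, embed_dim=64, hidden=128):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3 + embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3),
        )

    def forward(self, xyz, frame_embed):
        # xyz: (N, 3); frame_embed: (embed_dim,) broadcast to all points.
        e = frame_embed.expand(xyz.shape[0], -1)
        return self.net(torch.cat([xyz, e], dim=-1))

class EmbeddingPredictor(nn.Module):
    """Predicts the embedding of frame t from the embeddings of the previous frames."""
    def __init__(self, embed_dim=64, history=3, hidden=128):
        super().__init__()
        self.history = history
        self.net = nn.Sequential(
            nn.Linear(history * embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, embed_dim),
        )

    def forward(self, past_embeds):
        # past_embeds: (history, embed_dim) flattened into one vector.
        return self.net(past_embeds.reshape(-1))

def predictability_loss(embeds, predictor, history=3):
    """Penalize embeddings that the predictor cannot forecast from their past."""
    losses = []
    for t in range(history, embeds.shape[0]):
        pred = predictor(embeds[t - history:t])
        losses.append(((pred - embeds[t]) ** 2).mean())
    return torch.stack(losses).mean()

# Learnable per-frame embeddings for T frames (assumed setup).
T, D = 10, 64
frame_embeds = nn.Parameter(torch.randn(T, D) * 0.01)
motion_field = MotionField(embed_dim=D)
predictor = EmbeddingPredictor(embed_dim=D, history=3)

xyz = torch.rand(1024, 3)
offsets = motion_field(xyz, frame_embeds[5])        # motion of sampled points at frame 5
reg = predictability_loss(frame_embeds, predictor)  # predictability regularization term
```

In training, such a regularization term would be added with some weight to the usual reconstruction loss of the dynamic-scene representation, encouraging the per-frame embeddings (and hence the estimated motion) to evolve predictably over time.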

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2209.10691
Document Type:
Working Paper