
Distribution and Depth-Aware Transformers for 3D Human Mesh Recovery

Authors :
Bright, Jerrin
Balaji, Bavesh
Prakash, Harish
Chen, Yuhao
Clausi, David A.
Zelek, John
Publication Year :
2024

Abstract

Precise Human Mesh Recovery (HMR) from in-the-wild data is a formidable challenge, often hindered by depth ambiguity and reduced precision. Existing works resort to either pose priors or multi-modal data such as multi-view or point-cloud information, yet they often overlook the scene-depth cues inherently present in a single image. Moreover, achieving robust HMR on out-of-distribution (OOD) data is exceedingly challenging due to inherent variations in pose, shape, and depth; consequently, understanding the underlying distribution becomes a vital subproblem in modeling human forms. Motivated by the need for unambiguous and robust human modeling, we introduce Distribution- and Depth-Aware Human Mesh Recovery (D2A-HMR), an end-to-end transformer architecture designed to minimize the disparity between distributions and to incorporate scene depth by leveraging prior depth information. Our approach demonstrates superior performance in handling OOD data in certain scenarios while consistently achieving competitive results against state-of-the-art HMR methods on controlled datasets.

Comment: Submitted to the 21st International Conference on Robots and Vision (CRV'24), Guelph, Ontario, Canada
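The abstract mentions minimizing the disparity between feature distributions but the record contains no implementation details. As a purely illustrative sketch (not the authors' method), one common way to quantify such a disparity is the squared Maximum Mean Discrepancy (MMD) between two batches of feature vectors; all names below are hypothetical:

```python
import numpy as np

def rbf_kernel(x, y, sigma=1.0):
    # Pairwise RBF kernel values between rows of x (m, d) and y (n, d).
    d2 = (np.sum(x**2, axis=1)[:, None]
          + np.sum(y**2, axis=1)[None, :]
          - 2.0 * x @ y.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2(source, target, sigma=1.0):
    """Biased estimate of squared MMD between two samples (always >= 0)."""
    k_ss = rbf_kernel(source, source, sigma).mean()
    k_tt = rbf_kernel(target, target, sigma).mean()
    k_st = rbf_kernel(source, target, sigma).mean()
    return k_ss + k_tt - 2.0 * k_st

rng = np.random.default_rng(0)
# Matched distributions yield a small MMD; a mean-shifted one scores higher.
same = mmd2(rng.normal(size=(256, 8)), rng.normal(size=(256, 8)))
shifted = mmd2(rng.normal(size=(256, 8)), rng.normal(loc=2.0, size=(256, 8)))
```

In a training loop, a term like this could be added to the loss to pull the model's feature distribution toward a reference distribution; whether D2A-HMR uses MMD, an adversarial critic, or another divergence is not stated in this record.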

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2403.09063
Document Type :
Working Paper