
Strand-accurate multi-view facial hair reconstruction and tracking.

Authors :
Li, Hanchao
Liu, Xinguo
Source :
Visual Computer. Jul 2024, Vol. 40 Issue 7, p4713-4724. 12p.
Publication Year :
2024

Abstract

Accurate modeling of facial hair is crucial not only for reconstructing a clean-shaven face but also for enhancing realism when creating digital human avatars. Previous methods have primarily focused on reconstructing sparse facial hairs from static scans, and more recently, efforts have been made to track facial performances. However, there is still room for improvement in reconstructing thick hairs. In this paper, we address the challenges of facial hair reconstruction and tracking, enhancing the realism and detail of facial hair in digital human avatars. For facial hair reconstruction, we propose a method that combines line-based multi-view stereo with line segment matching to recover a dense hair point cloud. From this point cloud, hair strands are extracted using the forward Euler method. For tracking, we introduce a space-time optimization method that registers the reference facial hair strands to subsequent frames, taking into account both the global shape and the motion between frames. After capturing the facial hair structures, we refine the underlying skin meshes by replacing the noisy hair region points with strand roots. We conducted experiments on various examples captured under different systems, demonstrating that our facial hair capture method outperforms previous methods in terms of accuracy and completeness. Our method provides a comprehensive and accurate solution for capturing and modeling facial hair in various facial performance scenarios. [ABSTRACT FROM AUTHOR]
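The abstract mentions extracting hair strands from the dense point cloud with the forward Euler method. The paper itself is not reproduced here, so the sketch below is only an illustration of that general idea, not the authors' implementation: it assumes the reconstructed point cloud carries a per-point unit growth direction, and the function name, parameters, and use of a SciPy KD-tree for neighborhood lookups are all hypothetical choices.

```python
import numpy as np
from scipy.spatial import cKDTree

def trace_strand(root, points, directions, step=0.5, radius=1.5, max_steps=200):
    """Grow one strand from `root` by forward Euler integration through
    a point cloud with per-point growth directions (illustrative sketch).

    points     : (N, 3) hair point cloud positions
    directions : (N, 3) unit growth direction per point
    step       : integration step size (same units as the point cloud)
    radius     : neighborhood radius for the local direction lookup
    """
    tree = cKDTree(points)
    strand = [np.asarray(root, dtype=float)]
    prev_dir = None
    for _ in range(max_steps):
        x = strand[-1]
        idx = tree.query_ball_point(x, radius)
        if not idx:                      # left the captured hair volume: stop
            break
        d = directions[idx].mean(axis=0)
        n = np.linalg.norm(d)
        if n < 1e-8:                     # no coherent local direction: stop
            break
        d /= n
        # Orientation fields are sign-ambiguous; keep growth consistent.
        if prev_dir is not None and np.dot(d, prev_dir) < 0:
            d = -d
        strand.append(x + step * d)      # forward Euler step
        prev_dir = d
    return np.vstack(strand)
```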

Details

Language :
English
ISSN :
0178-2789
Volume :
40
Issue :
7
Database :
Academic Search Index
Journal :
Visual Computer
Publication Type :
Academic Journal
Accession number :
178276408
Full Text :
https://doi.org/10.1007/s00371-024-03465-5