Conditional-GAN-Based Face Inpainting Approaches With Symmetry and View-Degree Utilization

Authors :
Tzung-Pei Hong
Jin-Hang Wu
Ja-Hwung Su
Tang-Kai Yin
Source :
IEEE Access, Vol 12, Pp 87467-87478 (2024)
Publication Year :
2024
Publisher :
IEEE, 2024.

Abstract

Image inpainting restores corrupted regions of an image and has become an active topic in computer vision. Face inpainting, a subfield of image inpainting, comprises image-editing algorithms that smoothly reconstruct missing facial regions. Face inpainting is more challenging than general image inpainting because it requires richer facial structure information. Although a number of past studies have addressed face inpainting using face segmentation, face edges, and face topology, they ignore important information such as geometric and symmetric properties. Based on these ideas, this paper proposes a two-stage face inpainting method called CGAN (Conditional Generative Adversarial Network), which integrates face landmarks with a Generative Adversarial Network (GAN). In the first stage, the face landmarks are predicted as the condition, providing the GAN with important geometric and symmetric information; the key idea in this stage is to dynamically adjust the loss by the proposed view degree. In the second stage, the masked face image and the corresponding face landmarks are fed to the GAN as conditions, and the missing regions are inpainted by the proposed CGAN. To demonstrate the effectiveness of the proposed method, evaluations were conducted on real datasets. The experimental results show that the proposed method predicts better face landmarks by exploiting geometric structure and symmetry, and that the proposed CGAN accordingly reconstructs the missing regions better than the compared methods.
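The abstract's idea of weighting the landmark loss by a "view degree" can be illustrated with a minimal sketch. The paper's actual definitions are not given here, so the function names, the symmetry-based frontalness proxy, and the weighting formula below are all assumptions for illustration only, not the authors' method.

```python
import numpy as np

def view_degree(landmarks):
    """Hypothetical frontalness proxy in (0, 1]: 1 when the landmark
    cloud is horizontally balanced (near-frontal), smaller for more
    profile-like views. `landmarks` is an (N, 2) array of (x, y)."""
    xs = np.asarray(landmarks, dtype=float)[:, 0]
    center = np.median(xs)          # stand-in for a facial midline
    left = center - xs.min()        # horizontal span left of midline
    right = xs.max() - center       # horizontal span right of midline
    eps = 1e-8                      # guard against degenerate spans
    return (min(left, right) + eps) / (max(left, right) + eps)

def weighted_landmark_loss(pred, gt):
    """Mean-squared landmark error scaled by the ground truth's view
    degree: frontal faces (strong symmetry cues) keep full weight,
    profile faces are down-weighted. The weighting is an assumption."""
    pred = np.asarray(pred, dtype=float)
    gt = np.asarray(gt, dtype=float)
    mse = np.mean((pred - gt) ** 2)
    return view_degree(gt) * mse
```

Under this sketch, a mirror-symmetric landmark set yields a view degree near 1 (full loss weight), while a laterally skewed set yields a smaller weight, so the landmark predictor is penalized less where symmetry cues are weak.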

Details

Language :
English
ISSN :
2169-3536
Volume :
12
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.893e4f76a06d49a0a46475713f78e672
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2024.3417442