
Privileged Prior Information Distillation for Image Matting

Authors:
Lyu, Cheng
Xie, Jiake
Xu, Bo
Lu, Cheng
Huang, Han
Huang, Xin
Wu, Ming
Zhang, Chuang
Tang, Yong
Publication Year: 2022

Abstract

The performance of trimap-free image matting methods is limited when decoupling the deterministic and undetermined regions, especially in scenes where foregrounds are semantically ambiguous, chromaless, or of high transmittance. In this paper, we propose a novel framework named Privileged Prior Information Distillation for Image Matting (PPID-IM) that effectively transfers privileged, environment-aware prior information to improve the performance of students on hard foregrounds. The prior information of the trimap regulates only the teacher model during training and is not fed into the student network during actual inference. To achieve effective privileged cross-modality (i.e., trimap and RGB) information distillation, we introduce a Cross-Level Semantic Distillation (CLSD) module that reinforces the trimap-free students with more knowledgeable semantic representations and environment-aware information. We also propose an Attention-Guided Local Distillation module that efficiently transfers privileged local attributes from the trimap-based teacher to trimap-free students to guide local-region optimization. Extensive experiments demonstrate the effectiveness and superiority of our PPID framework on the task of image matting. In addition, our trimap-free IndexNet-PPID surpasses other competing state-of-the-art methods by a large margin, especially in scenarios with chromaless, weak-texture, or irregular objects.

Comment: 15 pages, 7 figures
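The record carries no code, so the following is a minimal, hypothetical PyTorch sketch of the privileged-distillation setup the abstract describes: a teacher that sees RGB plus trimap (the privileged input) and a student that sees RGB only, with the student trained to match the teacher's features. The toy encoder, module names, and both loss forms are assumptions for illustration, not the authors' released implementation.

```python
# Hypothetical sketch of privileged prior distillation for matting.
# The teacher receives RGB + trimap (4 channels); the student receives
# RGB only (3 channels) and is trained to mimic the teacher's features.
import torch
import torch.nn as nn
import torch.nn.functional as F


def tiny_encoder(in_channels: int) -> nn.Sequential:
    """Toy stand-in for a matting backbone (e.g. an IndexNet-style encoder)."""
    return nn.Sequential(
        nn.Conv2d(in_channels, 32, 3, padding=1), nn.ReLU(inplace=True),
        nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
    )


teacher = tiny_encoder(in_channels=4)   # privileged input: RGB + trimap
student = tiny_encoder(in_channels=3)   # deployment input: RGB only

rgb = torch.randn(2, 3, 64, 64)
trimap = torch.randint(0, 3, (2, 1, 64, 64)).float() / 2.0  # 0 / 0.5 / 1

with torch.no_grad():                   # no gradients reach the teacher here
    f_t = teacher(torch.cat([rgb, trimap], dim=1))
f_s = student(rgb)

# Cross-level semantic distillation (assumed form): match the student's
# features to the teacher's environment-aware features.
loss_sem = F.mse_loss(f_s, f_t)

# Attention-guided local distillation (assumed form): weight the feature
# error by a spatial attention map derived from the teacher, so that
# salient local regions dominate the transfer.
attn = f_t.abs().mean(dim=1, keepdim=True)
attn = attn / (attn.sum(dim=(2, 3), keepdim=True) + 1e-6)
loss_local = (attn * (f_s - f_t).pow(2)).sum(dim=(2, 3)).mean()

loss = loss_sem + loss_local            # plus the usual matting losses
loss.backward()
```

At inference the trimap branch is simply dropped: only the student runs on RGB, which is what makes the trimap a privileged, training-time-only input in this scheme.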

Details

Database: arXiv
Publication Type: Report
Accession number: edsarx.2211.14036
Document Type: Working Paper