
One-Shot Generative Domain Adaptation

Authors :
Yang, Ceyuan
Shen, Yujun
Zhang, Zhiyi
Xu, Yinghao
Zhu, Jiapeng
Wu, Zhirong
Zhou, Bolei
Publication Year :
2021

Abstract

This work aims at transferring a Generative Adversarial Network (GAN) pre-trained on one image domain to a new domain using as few as just one target image. The main challenge is that, under such limited supervision, it is extremely difficult to synthesize photo-realistic and highly diverse images while still acquiring representative characteristics of the target. Unlike existing approaches that adopt a vanilla fine-tuning strategy, we introduce two lightweight modules into the generator and the discriminator, respectively. Concretely, we attach an attribute adaptor to the generator while freezing its original parameters, through which the generator can reuse its prior knowledge to the largest extent and hence maintain synthesis quality and diversity. We then equip the well-learned discriminator backbone with an attribute classifier to ensure that the generator captures the appropriate characteristics from the reference image. Furthermore, considering the poor diversity of the training data (i.e., as few as only one image), we propose to also constrain the diversity of the generative domain during training, alleviating the optimization difficulty. Our approach brings appealing results under various settings, substantially surpassing state-of-the-art alternatives, especially in terms of synthesis diversity. Notably, our method works well even with large domain gaps, and robustly converges within a few minutes for each experiment.

Comment: Technical Report
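The core idea in the abstract, keeping the pre-trained generator frozen and training only a small adaptor on top of it, can be illustrated with a minimal sketch. Everything below is hypothetical: the class names (`FrozenGenerator`, `AttributeAdaptor`), the toy latent-space arithmetic, and the scale-and-shift form of the adaptor are illustrative assumptions, not the authors' actual architecture or code.

```python
# Hypothetical sketch of the frozen-generator-plus-adaptor idea.
# The pre-trained generator's weights never change; only the tiny
# adaptor (a per-dimension scale and shift on the latent code) would
# be updated when tuning toward the single target image.

class FrozenGenerator:
    """Stand-in for a pre-trained GAN generator with frozen weights."""

    def __init__(self, weight):
        self.weight = weight  # frozen: never updated during adaptation

    def __call__(self, z):
        # Toy "synthesis": elementwise product of latent and weights.
        return [w * x for w, x in zip(self.weight, z)]


class AttributeAdaptor:
    """Lightweight trainable module: scale-and-shift on the latent code."""

    def __init__(self, dim):
        self.scale = [1.0] * dim  # trainable
        self.shift = [0.0] * dim  # trainable

    def __call__(self, z):
        return [s * x + b for s, x, b in zip(self.scale, z, self.shift)]


def adapted_generate(generator, adaptor, z):
    # During adaptation, only the adaptor's parameters would receive
    # gradient updates; the generator is used purely as a frozen prior.
    return generator(adaptor(z))
```

Because the adaptor starts as the identity (scale 1, shift 0), the adapted model initially reproduces the source-domain generator exactly, and adaptation only perturbs the few adaptor parameters, which is one plausible reading of why the method preserves quality and diversity.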

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2111.09876
Document Type :
Working Paper