
Generating Target Image-Label Pairs for Unsupervised Domain Adaptation

Authors :
Li, Rui
Cao, Wenming
Wu, Si
Wong, Hau-San
Source :
IEEE Transactions on Image Processing; 2020, Vol. 29, p7997-8011, 15p
Publication Year :
2020

Abstract

Deep learning has demonstrated impressive success across a variety of machine learning problems. However, its performance often suffers when the training and test sets follow different distributions, owing to domain shift. Most current domain adaptation methods minimize the discrepancy between the source and target domains by aligning their marginal distributions, without considering class-level matching; consequently, data from different classes may end up close together after mapping. To address this issue, we propose an unsupervised domain adaptation method that generates image-label pairs in the target domain: the model is augmented with the generated target pairs and thereby achieves class-level transfer. Specifically, we integrate a generative adversarial network (GAN) with the model predictor, where a generator fed with labels produces the corresponding target-domain images under a well-designed semantic loss. Whereas previous methods focus on discrepancy reduction across domains, i.e., image-to-image translation, our model focuses on semantic preservation during image generation. The approach is straightforward yet effective for unsupervised domain adaptation. Without using any target-domain labels in any of the experiments, we demonstrate the validity of our approach by presenting plausible generated target image-label pairs. In addition, the proposed method achieves the best or comparable performance on multiple unsupervised domain adaptation benchmarks covering image classification and semantic segmentation. [ABSTRACT FROM AUTHOR]
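The abstract describes conditioning a generator on class labels and scoring the generated target images with a semantic loss, so that an image only scores well if a classifier still recognizes the label it was generated from. The following is a minimal NumPy sketch of that semantic-consistency idea, not the paper's actual architecture: the linear "generator", the "classifier", and all dimensions (`NOISE_DIM`, `IMG_DIM`, `NUM_CLASSES`) are hypothetical stand-ins.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy dimensions (not taken from the paper).
NUM_CLASSES, NOISE_DIM, IMG_DIM = 3, 8, 16

# Toy "generator": maps (noise, one-hot label) -> fake target-domain image.
W_g = rng.normal(size=(NOISE_DIM + NUM_CLASSES, IMG_DIM))

def generate(z, y_onehot):
    return np.tanh(np.concatenate([z, y_onehot], axis=1) @ W_g)

# Toy "classifier" standing in for the model predictor.
W_c = rng.normal(size=(IMG_DIM, NUM_CLASSES))

def classify(x):
    logits = x @ W_c
    e = np.exp(logits - logits.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)  # softmax probabilities

def semantic_loss(y_onehot, probs):
    # Cross-entropy between the conditioning label and the classifier's
    # prediction on the generated image: it is low only when the
    # generated image still carries the semantics of its label.
    return -np.mean(np.sum(y_onehot * np.log(probs + 1e-9), axis=1))

batch = 4
labels = rng.integers(0, NUM_CLASSES, size=batch)
y = np.eye(NUM_CLASSES)[labels]            # one-hot conditioning labels
z = rng.normal(size=(batch, NOISE_DIM))    # noise input
fake = generate(z, y)                      # generated "target" image-label pairs
loss = semantic_loss(y, classify(fake))
print(float(loss))
```

In a full GAN setup this term would be minimized jointly with the adversarial loss, pushing the generator to emit target-style images that remain classifiable as their conditioning label.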

Details

Language :
English
ISSN :
1057-7149
Volume :
29
Database :
Complementary Index
Journal :
IEEE Transactions on Image Processing
Publication Type :
Academic Journal
Accession number :
170078540
Full Text :
https://doi.org/10.1109/TIP.2020.3009853