
Weighted Correlation Embedding Learning for Domain Adaptation.

Authors:
Lu, Yuwu
Zhu, Qi
Zhang, Bob
Lai, Zhihui
Li, Xuelong
Source:
IEEE Transactions on Image Processing; 2022, Vol. 31, p5303-5316, 14p
Publication Year:
2022

Abstract

Domain adaptation leverages rich knowledge from a related source domain so that it can be used to perform tasks in a target domain. Because more knowledge can be obtained under relaxed conditions, domain adaptation methods have been widely used in pattern recognition and image classification. However, most existing domain adaptation methods only consider how to minimize the distribution discrepancy between the source and target domains, which neglects what should be transferred for a specific task and leads to negative transfer caused by distribution outliers. To address these problems, in this paper we propose a novel domain adaptation method called weighted correlation embedding learning (WCEL) for image classification. In the WCEL approach, correlation learning, graph embedding, and sample reweighting are seamlessly integrated into a unified learning model. Specifically, we extract maximally correlated features from the source and target domains for image classification tasks. In addition, two graphs are designed to preserve the discriminant information between interclass samples and the neighborhood relations among intraclass samples. Furthermore, to prevent the negative transfer problem, we develop an efficient sample reweighting strategy that predicts the target samples with different confidence levels. To verify the performance of the proposed method in image classification, extensive experiments were conducted on several benchmark databases, verifying the superiority of the WCEL method over other state-of-the-art domain adaptation algorithms. [ABSTRACT FROM AUTHOR]
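For a concrete feel of the three ingredients the abstract names, the sketch below is a rough, simplified illustration only: it uses CCA as a stand-in for the paper's correlation learning term, plain k-NN graphs for the interclass/intraclass structure, and pseudo-label confidence as the sample weights. All function names, data, and parameter choices here are illustrative assumptions, not the authors' released algorithm.

# Illustrative sketch (not the authors' code): correlation-based embedding,
# interclass/intraclass graphs, and confidence-based target reweighting.
import numpy as np
from sklearn.cross_decomposition import CCA
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Toy source/target data with the same feature dimension; the target
# distribution is shifted to mimic domain discrepancy.
Xs = rng.normal(size=(80, 20))
ys = rng.integers(0, 3, size=80)
Xt = rng.normal(loc=0.5, size=(80, 20))

# 1) Correlated embedding: project both domains into a shared subspace that
#    maximizes cross-domain correlation (CCA stands in for the paper's
#    correlation learning component).
cca = CCA(n_components=5)
cca.fit(Xs, Xt)
Zs, Zt = cca.transform(Xs, Xt)

# 2) Graphs on the source embedding: an intraclass k-NN graph whose
#    neighborhood relations should be preserved, and an interclass graph
#    whose connections should be pushed apart.
def class_graphs(Z, y, k=5):
    d = np.linalg.norm(Z[:, None] - Z[None, :], axis=-1)
    W_intra = np.zeros_like(d)
    W_inter = np.zeros_like(d)
    for i in range(len(Z)):
        order = np.argsort(d[i])[1:k + 1]      # k nearest neighbors, self excluded
        same = order[y[order] == y[i]]
        diff = order[y[order] != y[i]]
        W_intra[i, same] = 1.0
        W_inter[i, diff] = 1.0
    return W_intra, W_inter

W_intra, W_inter = class_graphs(Zs, ys)

# 3) Sample reweighting: pseudo-label the target samples and weight each one by
#    its prediction confidence, down-weighting likely outliers to limit
#    negative transfer.
clf = KNeighborsClassifier(n_neighbors=5).fit(Zs, ys)
proba = clf.predict_proba(Zt)
pseudo_labels = proba.argmax(axis=1)
weights = proba.max(axis=1)                    # confidence weight in (0, 1]

print("intraclass edges:", int(W_intra.sum()),
      "interclass edges:", int(W_inter.sum()))
print("mean target confidence weight:", round(float(weights.mean()), 3))

In the paper these pieces are optimized jointly in one objective; the sketch only computes each ingredient separately to show what kind of quantity it contributes.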

Details

Language:
English
ISSN:
1057-7149
Volume:
31
Database:
Complementary Index
Journal:
IEEE Transactions on Image Processing
Publication Type:
Academic Journal
Accession Number:
170077332
Full Text:
https://doi.org/10.1109/TIP.2022.3193758