
Mutual-weighted feature disentanglement for unsupervised domain adaptation.

Authors :
Wang, Shanshan
Xiao, Qian
Wang, Keyang
Yang, Xun
Zhang, Xingyi
Source :
Multimedia Systems. Dec 2024, Vol. 30, Issue 6, p1-15. 15p.
Publication Year :
2024

Abstract

Unsupervised domain adaptation (UDA) aims to reduce the distribution discrepancy across domains, enabling the transfer of knowledge from a labeled source domain to an unlabeled target domain. Most current UDA methods focus on extracting domain-invariant representations to reduce the gap between domains. However, this singular emphasis on domain invariance may inadvertently discard domain-relevant information, which can lead to negative transfer. Moreover, current adversarial DA methods assign uniform weights to all samples, overlooking the distinct influence that variations or noise within the source domain data may exert on adversarial performance. To address these challenges, we propose a novel method called Mutual-weighted Feature Disentanglement for Unsupervised Domain Adaptation (MWFD). Specifically, we decouple domain-invariant features from domain-specific features, then use the entropy of the classifier to rebalance the weights of the domain discriminator, while simultaneously adjusting the weights of the classifier using the domain entropy of the domain discriminator to reduce domain discrepancies. Finally, to obtain more discriminative features, we adopt a self-supervised contrastive learning framework that pulls positive sample pairs closer together while pushing negative sample pairs apart, enhancing the discriminability of the model on the target domain. Extensive experiments on five benchmark datasets demonstrate that our model outperforms several state-of-the-art domain adaptation methods. [ABSTRACT FROM AUTHOR]
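The abstract describes a mutual weighting loop between the label classifier and the domain discriminator: classifier entropy re-weights the discriminator's per-sample adversarial loss, and the discriminator's domain entropy re-weights the classification loss. The sketch below is a minimal PyTorch illustration of that idea only; the module sizes, entropy-based weighting functions, and dummy tensors are assumptions for the example, not the authors' released implementation.

```python
# Minimal sketch of the mutual entropy-weighting idea described in the abstract.
# All module names, shapes, and weighting functions are illustrative assumptions,
# not the MWFD authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F

def entropy(probs, eps=1e-8):
    """Shannon entropy per sample; low entropy means a confident prediction."""
    return -(probs * (probs + eps).log()).sum(dim=1)

# Hypothetical label classifier and domain discriminator over disentangled
# domain-invariant features.
feat_dim, num_classes, batch = 256, 31, 8
classifier = nn.Linear(feat_dim, num_classes)
discriminator = nn.Linear(feat_dim, 2)

# Dummy domain-invariant features for a source batch and a target batch.
f_src = torch.randn(batch, feat_dim)
f_tgt = torch.randn(batch, feat_dim)
y_src = torch.randint(0, num_classes, (batch,))

# --- classifier entropy re-weights the domain discriminator ---
# Samples the classifier is confident about contribute more to the
# adversarial alignment loss (one possible choice of weighting).
f_all = torch.cat([f_src, f_tgt])
cls_probs = F.softmax(classifier(f_all), dim=1)
w_disc = 1.0 + torch.exp(-entropy(cls_probs))
d_logits = discriminator(f_all)
d_labels = torch.cat([torch.zeros(batch), torch.ones(batch)]).long()
loss_disc = (w_disc.detach() *
             F.cross_entropy(d_logits, d_labels, reduction='none')).mean()

# --- discriminator's domain entropy re-weights the classifier ---
# Domain-ambiguous (well-aligned) source samples get larger classification weight.
d_probs_src = F.softmax(discriminator(f_src), dim=1)
w_cls = 1.0 + entropy(d_probs_src)
loss_cls = (w_cls.detach() *
            F.cross_entropy(classifier(f_src), y_src, reduction='none')).mean()

print(loss_disc.item(), loss_cls.item())
```

The contrastive component mentioned at the end of the abstract would typically be an InfoNCE/NT-Xent-style loss over positive and negative feature pairs; it is omitted from the sketch, which covers only the mutual weighting step.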

Details

Language :
English
ISSN :
0942-4962
Volume :
30
Issue :
6
Database :
Academic Search Index
Journal :
Multimedia Systems
Publication Type :
Academic Journal
Accession number :
180131719
Full Text :
https://doi.org/10.1007/s00530-024-01477-8