
Adaptive Mutual Learning for Unsupervised Domain Adaptation

Authors :
Zhou, Lihua
Xiao, Siying
Ye, Mao
Zhu, Xiatian
Li, Shuaifeng
Source :
IEEE Transactions on Circuits and Systems for Video Technology; November 2023, Vol. 33 Issue: 11 p6622-6634, 13p
Publication Year :
2023

Abstract

Unsupervised domain adaptation aims to transfer knowledge from a labeled source domain to an unlabeled target domain. Semi-supervised methods based on the mean-teacher framework are among the mainstream approaches. By enforcing consistency constraints, it is hoped that the teacher network will distill useful source-domain knowledge into the student network. In practice, however, negative transfer often emerges because the teacher network is not guaranteed to always outperform the student network. To address this limitation, a novel Adaptive Mutual Learning (AML) strategy is proposed in this paper. Specifically, given a target sample, the network with the worse prediction is optimized by pushing its prediction close to the better prediction, in the spirit of traditional knowledge distillation. Conversely, the network with the better prediction is further refined by requiring its prediction to stay away from the worse prediction, which can be regarded conceptually as reverse knowledge distillation. In this way, the two networks learn from each other according to their respective performance. At the inference phase, the averaged output of the two networks is taken as the final prediction. Experimental results demonstrate that our AML achieves competitive results.
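
The following is a minimal sketch of the adaptive mutual-learning signal described in the abstract, not the paper's actual implementation. It assumes PyTorch, that the "better" prediction is judged by its softmax confidence, and that both the pull (distillation) and push (reverse distillation) terms are KL divergences; the function name adaptive_mutual_loss and the weight lambda_push are illustrative only.

# Sketch of the adaptive mutual-learning idea: the worse network is pulled
# toward the better prediction, the better network is pushed away from the
# worse one. Assumptions: confidence decides which prediction is "better",
# KL divergence is used for both terms, lambda_push is a hypothetical weight.
import torch
import torch.nn.functional as F

def adaptive_mutual_loss(logits_a, logits_b, lambda_push=0.1):
    """Return (pull, push) losses for two networks on a target batch."""
    p_a = F.softmax(logits_a, dim=-1)
    p_b = F.softmax(logits_b, dim=-1)

    # Judge quality by mean maximum class confidence (an assumption;
    # the paper may use a different criterion).
    conf_a = p_a.max(dim=-1).values.mean()
    conf_b = p_b.max(dim=-1).values.mean()

    if conf_a >= conf_b:
        better, worse = p_a, p_b
        log_better, log_worse = F.log_softmax(logits_a, dim=-1), F.log_softmax(logits_b, dim=-1)
    else:
        better, worse = p_b, p_a
        log_better, log_worse = F.log_softmax(logits_b, dim=-1), F.log_softmax(logits_a, dim=-1)

    # Worse network: pull its prediction toward the better one (traditional distillation).
    pull = F.kl_div(log_worse, better.detach(), reduction="batchmean")
    # Better network: push its prediction away from the worse one (reverse distillation),
    # implemented here as a negatively weighted KL term.
    push = -lambda_push * F.kl_div(log_better, worse.detach(), reduction="batchmean")
    return pull, push

At inference, the final prediction would simply average the two networks' softmax outputs, e.g. (p_a + p_b) / 2, as the abstract states.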

Details

Language :
English
ISSN :
1051-8215 and 1558-2205
Volume :
33
Issue :
11
Database :
Supplemental Index
Journal :
IEEE Transactions on Circuits and Systems for Video Technology
Publication Type :
Periodical
Accession number :
ejs64405144
Full Text :
https://doi.org/10.1109/TCSVT.2023.3265853