
RAIN: RegulArization on Input and Network for Black-Box Domain Adaptation

Authors :
Peng, Qucheng
Ding, Zhengming
Lyu, Lingjuan
Sun, Lichao
Chen, Chen
Source :
International Joint Conferences on Artificial Intelligence 32 (2023) 4118-4126
Publication Year :
2022

Abstract

Source-free domain adaptation transfers a source-trained model to the target domain without exposing the source data, aiming to dispel concerns about data privacy and security. However, this paradigm still risks data leakage through adversarial attacks on the source model. The Black-Box setting therefore permits access only to the outputs of the source model, but it suffers even more severely from overfitting to the source domain because the source model's weights are unseen. In this paper, we propose a novel approach named RAIN (RegulArization on Input and Network) for Black-Box domain adaptation that applies both input-level and network-level regularization. At the input level, we design a new data augmentation technique, Phase MixUp, which highlights task-relevant objects in the interpolations, thus enhancing input-level regularization and class consistency for target models. At the network level, we develop a Subnetwork Distillation mechanism that transfers knowledge from the target subnetwork to the full target network via knowledge distillation, alleviating overfitting to the source domain by learning diverse target representations. Extensive experiments show that our method achieves state-of-the-art performance on several cross-domain benchmarks under both single- and multi-source black-box domain adaptation.

Comment: Accepted by IJCAI 2023
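The abstract does not specify how Phase MixUp is computed; a common way to exploit the Fourier phase spectrum (widely taken to carry an image's semantic content) is to interpolate amplitude spectra while preserving one image's phase. The sketch below illustrates that general idea; the function name, mixing scheme, and choice of which phase to keep are assumptions for illustration, not the paper's exact method.

```python
import numpy as np

def phase_mixup(x1, x2, lam=0.5):
    """Illustrative phase-preserving mixup (assumed scheme, not the
    paper's exact formulation): mix the amplitude spectra of two
    images while keeping x1's phase, so the semantic layout of x1
    is highlighted in the interpolation.

    x1, x2: real arrays of the same shape (H, W); lam in [0, 1]."""
    f1, f2 = np.fft.fft2(x1), np.fft.fft2(x2)
    amp = lam * np.abs(f1) + (1.0 - lam) * np.abs(f2)  # interpolate amplitudes
    phase = np.angle(f1)                               # retain x1's phase
    mixed = np.fft.ifft2(amp * np.exp(1j * phase))     # recombine and invert
    return np.real(mixed)
```

Note that with `lam=1.0` the amplitude and phase are both taken from `x1`, so the original image is recovered up to floating-point error, which is a useful sanity check for any implementation of this kind.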
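The Subnetwork Distillation mechanism relies on standard knowledge distillation between two sets of predictions. A minimal sketch of that objective, a temperature-scaled KL divergence, is given below; treating the subnetwork's outputs as the teacher signal and the full network's as the student is an assumption for illustration, and the function names are hypothetical.

```python
import numpy as np

def softmax(z, T=1.0):
    """Temperature-scaled softmax over the last axis."""
    z = z / T
    z = z - z.max(axis=-1, keepdims=True)  # numerical stability
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def distill_loss(student_logits, teacher_logits, T=2.0):
    """Standard knowledge-distillation loss: KL(teacher || student)
    at temperature T, scaled by T^2. Here the 'teacher' stands in
    for the target subnetwork and the 'student' for the full target
    network (an assumed role assignment, not taken from the paper)."""
    p = softmax(teacher_logits, T)
    q = softmax(student_logits, T)
    kl = (p * (np.log(p + 1e-12) - np.log(q + 1e-12))).sum(axis=-1)
    return float(kl.mean() * T * T)
```

The loss is zero when the two sets of logits agree and positive otherwise, so minimizing it pulls the full network's predictions toward those of the subnetwork.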

Details

Database :
arXiv
Journal :
International Joint Conferences on Artificial Intelligence 32 (2023) 4118-4126
Publication Type :
Report
Accession number :
edsarx.2208.10531
Document Type :
Working Paper
Full Text :
https://doi.org/10.24963/ijcai.2023/458