Reweighted Mixup for Subpopulation Shift

Authors:
Han, Zongbo
Liang, Zhipeng
Yang, Fan
Liu, Liu
Li, Lanqing
Bian, Yatao
Zhao, Peilin
Hu, Qinghua
Wu, Bingzhe
Zhang, Changqing
Yao, Jianhua
Publication Year:
2023

Abstract

Subpopulation shift, in which the training and test distributions contain the same subpopulation groups but in different proportions, arises widely in real-world applications. Ignoring it may lead to significant performance degradation and fairness concerns. Importance reweighting is a classical and effective way to handle subpopulation shift. However, recent studies have recognized that most of these approaches fail to improve performance, especially when applied to over-parameterized neural networks that are capable of fitting all training samples. In this work, we propose a simple yet practical framework, called reweighted mixup (RMIX), which mitigates the overfitting issue in over-parameterized models by applying importance weighting to the "mixed" samples. By leveraging reweighting within mixup, RMIX lets the model explore the vicinal space of minority samples more thoroughly, yielding a model that is more robust to subpopulation shift. When subpopulation memberships are unknown, RMIX is equipped with training-trajectory-based uncertainty estimation to flexibly characterize the subpopulation distribution. We also provide theoretical analysis verifying that RMIX achieves tighter generalization bounds than prior works. Further, we conduct extensive empirical studies across a wide range of tasks to validate the effectiveness of the proposed method.

Comment: Journal version of arXiv:2209.08928
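Below is a minimal, illustrative PyTorch sketch of the core idea the abstract describes: standard mixup with the per-sample loss additionally scaled by importance weights. The function name, the Beta-distributed mixing coefficient, and the way the two partners' weights are combined are assumptions for illustration; the paper's exact formulation (including how weights are derived from training-trajectory uncertainty when group labels are unknown) may differ.

```python
import torch
import torch.nn.functional as F

def reweighted_mixup_loss(model, x, y, weights, alpha=1.0):
    """Hypothetical sketch of reweighted mixup, not the authors' exact
    implementation. `weights` are per-sample importance weights, e.g.
    inverse group frequencies, or scores estimated from training-trajectory
    uncertainty when group labels are unknown."""
    # Draw a mixing coefficient from Beta(alpha, alpha), as in standard mixup.
    lam = torch.distributions.Beta(alpha, alpha).sample().item()

    # Pair each sample with a random partner from the same batch.
    idx = torch.randperm(x.size(0), device=x.device)

    # Convex combination of inputs.
    x_mix = lam * x + (1.0 - lam) * x[idx]
    logits = model(x_mix)

    # Per-sample losses against both constituent labels.
    loss_a = F.cross_entropy(logits, y, reduction="none")
    loss_b = F.cross_entropy(logits, y[idx], reduction="none")

    # Importance-weight the *mixed* samples: combine the two partners'
    # weights with the same coefficient, so mixed samples involving
    # minority-group points are up-weighted.
    w_mix = lam * weights + (1.0 - lam) * weights[idx]
    return (w_mix * (lam * loss_a + (1.0 - lam) * loss_b)).mean()
```

Training would then proceed as usual, calling this loss once per batch; up-weighting mixed samples that involve minority points is what lets the model explore their vicinal space more, as the abstract notes.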

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2304.04148
Document Type:
Working Paper