
Jo-SRC: A Contrastive Approach for Combating Noisy Labels

Authors:
Yao, Yazhou
Sun, Zeren
Zhang, Chuanyi
Shen, Fumin
Wu, Qi
Zhang, Jian
Tang, Zhenmin
Publication Year:
2021

Abstract

Due to the memorization effect in Deep Neural Networks (DNNs), training with noisy labels usually results in inferior model performance. Existing state-of-the-art methods primarily adopt a sample selection strategy, which selects small-loss samples for subsequent training. However, prior literature tends to perform sample selection within each mini-batch, neglecting the imbalance of noise ratios across different mini-batches. Moreover, the valuable knowledge within high-loss samples is wasted. To this end, we propose a noise-robust approach named Jo-SRC (Joint Sample Selection and Model Regularization based on Consistency). Specifically, we train the network in a contrastive learning manner. Predictions from two different views of each sample are used to estimate its "likelihood" of being clean or out-of-distribution. Furthermore, we propose a joint loss that improves model generalization by introducing consistency regularization. Extensive experiments have validated the superiority of our approach over existing state-of-the-art methods.

Comment: Accepted by IEEE Conference on Computer Vision and Pattern Recognition (CVPR), 2021.
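The abstract's core idea, estimating how likely a sample's label is clean from the agreement between predictions on two views, can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function names (`js_divergence`, `clean_likelihood`) and the specific scoring rule (1 minus the normalized Jensen-Shannon divergence between the mean two-view prediction and the one-hot label) are assumptions for demonstration.

```python
import numpy as np

def js_divergence(p, q, eps=1e-12):
    # Jensen-Shannon divergence between two discrete distributions.
    # Symmetric and bounded by log(2) when using the natural logarithm.
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p, q = p / p.sum(), q / q.sum()
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a * np.log(a / b))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def clean_likelihood(pred_view1, pred_view2, label, num_classes):
    # Hypothetical "cleanness" score: average the softmax predictions from
    # two augmented views, then measure agreement with the given label as
    # 1 - JSD(mean_pred, one_hot) / log(2), yielding a value in [0, 1].
    one_hot = np.eye(num_classes)[label]
    mean_pred = 0.5 * (np.asarray(pred_view1) + np.asarray(pred_view2))
    return 1.0 - js_divergence(mean_pred, one_hot) / np.log(2)

# A confident, label-consistent prediction scores high; a prediction that
# contradicts the label scores low, flagging the label as likely noisy.
score_clean = clean_likelihood([0.9, 0.05, 0.05], [0.85, 0.1, 0.05], label=0, num_classes=3)
score_noisy = clean_likelihood([0.9, 0.05, 0.05], [0.85, 0.1, 0.05], label=1, num_classes=3)
```

Samples scoring high would be kept as clean, while low scorers would be treated as label-noisy or out-of-distribution rather than simply discarded, in line with the abstract's goal of not wasting high-loss samples.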

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.2103.13029
Document Type:
Working Paper