
DualMix: Unleashing the Potential of Data Augmentation for Online Class-Incremental Learning

Authors:
Fan, Yunfeng
Xu, Wenchao
Wang, Haozhao
Zhu, Jiaqi
Wang, Junxiao
Guo, Song
Publication Year:
2023

Abstract

Online Class-Incremental (OCI) learning has sparked new approaches to expand previously trained model knowledge from sequentially arriving data streams with new classes. Unfortunately, OCI learning can suffer from catastrophic forgetting (CF), as the decision boundaries for old classes can become inaccurate when perturbed by new ones. Existing work has applied data augmentation (DA) to alleviate model forgetting, but the role of DA in OCI has not been well understood so far. In this paper, we theoretically show that augmented samples with lower correlation to the original data are more effective in preventing forgetting. However, aggressive augmentation may also reduce the consistency between data and their corresponding labels, which motivates us to exploit proper DA to boost OCI performance and prevent the CF problem. We propose the Enhanced Mixup (EnMix) method, which mixes the augmented samples and their labels simultaneously and is shown to enhance sample diversity while maintaining strong consistency with the corresponding labels. Further, to solve the class imbalance problem, we design an Adaptive Mixup (AdpMix) method that calibrates the decision boundaries by mixing samples from both old and new classes and dynamically adjusting the label mixing ratio. Our approach is demonstrated to be effective on several benchmark datasets through extensive experiments, and it is shown to be compatible with other replay-based techniques.

Comment: 10 pages, 7 figures and 3 tables
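To make the mixing idea concrete, below is a minimal sketch of a mixup-style step in the spirit of EnMix, written in PyTorch. It assumes standard mixup conventions: the inputs are already-augmented samples, pairs are formed by a random permutation of the batch, and the mixing ratio is drawn from a Beta distribution. The function name, the alpha parameter, and the pairing scheme are illustrative assumptions, not the paper's exact recipe.

```python
import torch
import torch.nn.functional as F


def enmix_style_batch(x, y, num_classes, alpha=1.0):
    """Mix augmented samples and their labels with a shared ratio (illustrative sketch).

    x: (B, ...) batch of already-augmented inputs.
    y: (B,) integer class labels.
    The Beta(alpha, alpha) ratio and permutation pairing are standard mixup
    choices assumed here; the paper's AdpMix additionally adapts the label
    mixing ratio across old and new classes, which is not modeled below.
    """
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    perm = torch.randperm(x.size(0))

    # Mix the samples and their one-hot labels with the same coefficient,
    # so the soft labels stay consistent with the mixed inputs.
    mixed_x = lam * x + (1.0 - lam) * x[perm]
    y_onehot = F.one_hot(y, num_classes).float()
    mixed_y = lam * y_onehot + (1.0 - lam) * y_onehot[perm]
    return mixed_x, mixed_y
```

In a replay-based OCI loop, such a step would typically be applied to a batch drawn from both the memory buffer (old classes) and the incoming stream (new classes), with the mixed soft labels used in a cross-entropy-style loss.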

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2303.07864
Document Type:
Working Paper