
Decision Boundary-aware Knowledge Consolidation Generates Better Instance-Incremental Learner

Authors:
Nie, Qiang
Fu, Weifu
Lin, Yuhuan
Li, Jialin
Zhou, Yifeng
Liu, Yong
Zhu, Lei
Wang, Chengjie
Publication Year:
2024

Abstract

Instance-incremental learning (IIL) focuses on learning continually with data of the same classes. Compared to class-incremental learning (CIL), IIL is seldom explored because it suffers less from catastrophic forgetting (CF). However, beyond retaining knowledge, in real-world deployment scenarios where the class space is always predefined, continual and cost-effective model promotion despite the potential unavailability of previous data is a more essential demand. Therefore, we first define a new and more practical IIL setting: promoting the model's performance, in addition to resisting CF, with only new observations. Two issues must be tackled in this new IIL setting: 1) the notorious catastrophic forgetting, because there is no access to old data, and 2) broadening the existing decision boundary to new observations, because of concept drift. To tackle these problems, our key insight is to moderately broaden the decision boundary to failure cases while retaining the old boundary. Hence, we propose a novel decision boundary-aware distillation method that consolidates knowledge into the teacher to ease the student's learning of new knowledge. We also establish benchmarks on the existing datasets CIFAR-100 and ImageNet. Notably, extensive experiments demonstrate that the teacher model can be a better incremental learner than the student model, which overturns previous knowledge distillation-based methods that treat the student as the main role.

Comment: 14 pages
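The abstract gives no implementation details, but the general idea it describes, a student distilled against a teacher that continually consolidates the student's knowledge, might look like the following minimal PyTorch sketch. The EMA consolidation rule, the loss weighting (`alpha`, `tau`), and all function names here are illustrative assumptions, not the paper's actual method; in particular, the boundary-aware weighting of failure cases mentioned in the abstract is not reproduced.

```python
# Illustrative sketch only: teacher-consolidated distillation for incremental
# learning. The consolidation scheme (EMA) and loss form are assumptions.
import torch
import torch.nn.functional as F


def ema_consolidate(teacher, student, momentum=0.99):
    """Consolidate student knowledge into the teacher by an exponential
    moving average of parameters (an assumed consolidation rule)."""
    with torch.no_grad():
        for t_param, s_param in zip(teacher.parameters(), student.parameters()):
            t_param.mul_(momentum).add_(s_param, alpha=1.0 - momentum)


def distillation_step(student, teacher, x, y, optimizer, alpha=0.5, tau=2.0):
    """One training step on new observations: cross-entropy on the new data
    plus a KL distillation term that discourages drifting from the teacher's
    (old) decision boundary."""
    student_logits = student(x)
    with torch.no_grad():
        teacher_logits = teacher(x)

    # Supervised loss on the new observations.
    ce = F.cross_entropy(student_logits, y)

    # Standard temperature-scaled distillation loss toward the teacher.
    kd = F.kl_div(
        F.log_softmax(student_logits / tau, dim=1),
        F.softmax(teacher_logits / tau, dim=1),
        reduction="batchmean",
    ) * tau * tau

    loss = (1 - alpha) * ce + alpha * kd
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()

    # Consolidate the student's newly acquired knowledge into the teacher.
    ema_consolidate(teacher, student)
    return loss.item()
```

Under this sketch, the consolidated teacher, rather than the student, would be the model kept for deployment, which is consistent with the abstract's finding that the teacher can be the better incremental learner.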

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2406.03065
Document Type:
Working Paper