
Auxiliary Classifiers Improve Stability and Efficiency in Continual Learning

Authors:
Szatkowski, Filip
Yang, Fei
Twardowski, Bartłomiej
Trzciński, Tomasz
van de Weijer, Joost
Publication Year: 2024

Abstract

Continual learning is crucial for applications in dynamic environments, where machine learning models must adapt to changing data distributions while retaining knowledge of previous tasks. Despite significant advancements, catastrophic forgetting, where performance on earlier tasks degrades as new information is learned, remains a key challenge. In this work, we investigate the stability of intermediate neural network layers during continual learning and explore how auxiliary classifiers (ACs) can leverage this stability to improve performance. We show that early network layers remain more stable during learning, particularly for older tasks, and that ACs applied to these layers can outperform standard classifiers on past tasks. By integrating ACs into several continual learning algorithms, we demonstrate consistent and significant performance improvements on standard benchmarks. Additionally, we explore dynamic inference, showing that AC-augmented continual learning methods can reduce computational costs by up to 60% while maintaining or exceeding the accuracy of standard methods. Our findings suggest that ACs offer a promising avenue for enhancing continual learning models, providing both improved performance and the ability to adapt the network computation in environments where such flexibility might be required.
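To make the mechanism concrete, below is a minimal, hypothetical PyTorch sketch of a network with auxiliary classifiers at intermediate layers and confidence-based dynamic inference (early exit). The backbone, head design, class count, and exit threshold are illustrative assumptions for this sketch, not the authors' implementation.

import torch
import torch.nn as nn


class ACNetwork(nn.Module):
    """A small backbone with an auxiliary classifier (AC) after each block."""

    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.blocks = nn.ModuleList([
            nn.Sequential(nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
            nn.Sequential(nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2)),
        ])
        # One auxiliary classifier per block: pooled features -> class logits.
        self.aux_heads = nn.ModuleList([
            nn.Sequential(nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(c, num_classes))
            for c in (16, 32, 64)
        ])

    def forward(self, x):
        # Training: return logits from every head so all of them can be
        # supervised jointly (e.g., a sum of per-head cross-entropy losses).
        logits = []
        for block, head in zip(self.blocks, self.aux_heads):
            x = block(x)
            logits.append(head(x))
        return logits

    @torch.no_grad()
    def dynamic_inference(self, x, threshold: float = 0.9):
        # Early exit for a single sample (batch size 1): stop at the first
        # head whose maximum softmax probability exceeds the threshold.
        for block, head in zip(self.blocks, self.aux_heads):
            x = block(x)
            probs = head(x).softmax(dim=-1)
            conf, pred = probs.max(dim=-1)
            if conf.item() >= threshold:
                return pred
        return pred  # fall back to the last (deepest) head's prediction


model = ACNetwork()
sample = torch.randn(1, 3, 32, 32)
print(model.dynamic_inference(sample))

At training time all heads would be optimized jointly; at test time, easy inputs exit at a shallow head and skip the deeper blocks, which is the kind of compute saving the abstract refers to.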

Details

Database: arXiv
Publication Type: Report
Accession Number: edsarx.2403.07404
Document Type: Working Paper