
Active Continual Learning: On Balancing Knowledge Retention and Learnability

Authors:
Vu, Thuy-Trang
Khadivi, Shahram
Ghorbanali, Mahsa
Phung, Dinh
Haffari, Gholamreza
Publication Year:
2023

Abstract

Acquiring new knowledge without forgetting what has been learned in a sequence of tasks is the central focus of continual learning (CL). While tasks arrive sequentially, their training data are often prepared and annotated independently, turning CL into a sequence of incoming supervised learning tasks. This paper considers the under-explored problem of active continual learning (ACL) over a sequence of active learning (AL) tasks, where each incoming task provides a pool of unlabelled data and an annotation budget. We investigate the effectiveness of, and the interplay between, several AL and CL algorithms in the domain-, class- and task-incremental scenarios. Our experiments reveal the trade-off between the two contrasting goals of CL and AL: not forgetting old knowledge and quickly learning new knowledge, respectively. While conditioning the AL query strategy on the annotations collected for previous tasks improves task performance in the domain- and task-incremental settings, our proposed forgetting-learning profile reveals a gap in balancing the effects of AL and CL in the class-incremental scenario.
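The ACL setting described above can be sketched as a simple loop: for each incoming task, an AL query strategy spends the annotation budget on the most informative unlabelled examples, and the model is then updated continually on the accumulated annotations. The sketch below is a minimal illustration, not the paper's method; the function names (`entropy_score`, `select_queries`, `active_continual_learning`) and the choice of entropy-based uncertainty sampling are assumptions for illustration only.

```python
import math

def entropy_score(probs):
    """Predictive entropy: higher means the model is more uncertain."""
    return -sum(p * math.log(p + 1e-12) for p in probs)

def select_queries(pool, predict, budget):
    """Pick the `budget` most uncertain unlabelled examples from the pool."""
    ranked = sorted(pool, key=lambda x: entropy_score(predict(x)), reverse=True)
    return ranked[:budget]

def active_continual_learning(task_pools, predict, train_step, budget):
    """Sequentially annotate each task's pool within its budget, then train."""
    labelled = []
    for pool in task_pools:           # tasks arrive one after another
        queries = select_queries(pool, predict, budget)
        labelled.extend(queries)      # annotations accumulate across tasks
        train_step(labelled)          # continual-learning update of the model
    return labelled
```

Conditioning the query strategy on earlier tasks, as the abstract discusses, would correspond to letting `select_queries` also see the previously collected `labelled` set rather than scoring the current pool in isolation.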

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2305.03923
Document Type:
Working Paper