
Continual Named Entity Recognition without Catastrophic Forgetting

Authors:
Zhang, Duzhen
Cong, Wei
Dong, Jiahua
Yu, Yahan
Chen, Xiuyi
Zhang, Yonggang
Fang, Zhen
Publication Year:
2023

Abstract

Continual Named Entity Recognition (CNER) is a burgeoning area that involves updating an existing model by incorporating new entity types sequentially. Like other continual learning settings, however, CNER is severely afflicted by catastrophic forgetting. The issue is intensified in CNER because old entity types from previous steps are collapsed into the non-entity type at each step, leading to what is known as the semantic shift problem of the non-entity type. In this paper, we introduce a pooled feature distillation loss that skillfully navigates the trade-off between retaining knowledge of old entity types and acquiring new ones, thereby mitigating catastrophic forgetting more effectively. Additionally, we develop confidence-based pseudo-labeling for the non-entity type, i.e., predicting entity types with the old model to handle the semantic shift of the non-entity type. Following the pseudo-labeling process, we suggest an adaptive re-weighting type-balanced learning strategy to handle the issue of biased type distribution. We carried out comprehensive experiments on ten CNER settings using three different datasets. The results show that our method significantly outperforms prior state-of-the-art approaches, with average improvements of 6.3% and 8.0% in Micro and Macro F1 scores, respectively.

Comment: Accepted by EMNLP 2023 main conference as a long paper
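The confidence-based pseudo-labeling idea described above can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the function name, the threshold `tau`, and the convention that the non-entity type is index 0 are all assumptions made for the example. The old model's softmax outputs are used to relabel current-step non-entity tokens as old entity types when the prediction is sufficiently confident.

```python
import numpy as np

def pseudo_label_non_entity(old_probs, labels, non_entity=0, tau=0.7):
    """Hypothetical sketch of confidence-based pseudo-labeling.

    old_probs: (T, C) softmax outputs of the old model over old types.
    labels: (T,) current-step labels, in which tokens of old entity
            types appear as the non-entity type.
    Returns a copy of `labels` where confident old-model predictions
    replace the non-entity label.
    """
    preds = old_probs.argmax(axis=1)   # old model's predicted type per token
    conf = old_probs.max(axis=1)       # confidence of that prediction
    out = labels.copy()
    # Relabel only non-entity tokens that the old model confidently
    # assigns to a (non-background) old entity type.
    mask = (labels == non_entity) & (preds != non_entity) & (conf >= tau)
    out[mask] = preds[mask]
    return out
```

After this step, the resulting label distribution is typically still biased toward the non-entity type, which is what motivates the adaptive re-weighting type-balanced learning strategy the abstract mentions next.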

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2310.14541
Document Type:
Working Paper