1. Named Entity Recognition Model Based on k-best Viterbi Decoupling Knowledge Distillation
- Authors
ZHAO Honglei, TANG Huanling, ZHANG Yu, SUN Xueyuan, LU Mingyu
- Subjects
named entity recognition (NER), knowledge distillation, k-best Viterbi decoding, data augmentation, Electronic computers. Computer science, QA75.5-76.95
- Abstract
Knowledge distillation is a general approach to improving the performance of named entity recognition (NER) models. However, the classical knowledge distillation loss function couples its terms, which leads to poor logit distillation. To decouple these terms and effectively improve logit distillation, this paper proposes k-best Viterbi decoupling knowledge distillation (kvDKD), which combines decoupled knowledge distillation with k-best Viterbi decoding to improve computational efficiency and model performance. Additionally, deep-learning-based NER readily introduces noise during data augmentation. Therefore, a data augmentation method combining data filtering with an entity rebalancing algorithm is proposed, aiming to reduce noise introduced from the original dataset and to mitigate mislabeled data, thereby improving data quality and reducing overfitting. Based on these methods, a novel named entity recognition model, NER-kvDKD (named entity recognition model based on k-best Viterbi decoupling knowledge distillation), is proposed. Comparative experiments on the MSRA, Resume, Weibo, CLUENER and CoNLL-2003 datasets show that the proposed method improves the generalization ability of the model and effectively improves student model performance.
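The decoupled logit-distillation loss the abstract refers to can be illustrated with a short sketch. Below is a minimal PyTorch version of a decoupled KD objective in the style of DKD, which splits the classical KD KL term into a target-class part (TCKD) and a non-target-class part (NCKD) that can be weighted independently. The function name `dkd_loss` and the hyperparameters `alpha`, `beta` and `temperature` are illustrative assumptions, not the paper's released code, and the paper's k-best Viterbi decoding over the sequence-labeling output is not shown here.

```python
import torch
import torch.nn.functional as F

def dkd_loss(student_logits, teacher_logits, target,
             alpha=1.0, beta=8.0, temperature=4.0):
    """Sketch of a decoupled KD loss (DKD-style): the vanilla KD KL term is
    split into TCKD (target class) and NCKD (non-target classes), weighted
    independently by alpha and beta instead of being coupled together."""
    num_classes = student_logits.size(-1)
    gt_mask = F.one_hot(target, num_classes).bool()

    p_s = F.softmax(student_logits / temperature, dim=-1)
    p_t = F.softmax(teacher_logits / temperature, dim=-1)

    # TCKD: binary distributions [p(gold), 1 - p(gold)] for student/teacher.
    ps_gold = (p_s * gt_mask).sum(-1, keepdim=True)
    pt_gold = (p_t * gt_mask).sum(-1, keepdim=True)
    b_s = torch.cat([ps_gold, 1.0 - ps_gold], dim=-1).clamp_min(1e-8)
    b_t = torch.cat([pt_gold, 1.0 - pt_gold], dim=-1).clamp_min(1e-8)
    tckd = F.kl_div(b_s.log(), b_t, reduction="batchmean") * temperature ** 2

    # NCKD: KL over the non-gold classes, renormalized by pushing the gold
    # logit far down before the softmax so it gets (numerically) zero mass.
    s_non = F.log_softmax(student_logits / temperature - 1000.0 * gt_mask, dim=-1)
    t_non = F.softmax(teacher_logits / temperature - 1000.0 * gt_mask, dim=-1)
    nckd = F.kl_div(s_non, t_non, reduction="batchmean") * temperature ** 2

    return alpha * tckd + beta * nckd

# Toy usage: 4 tokens, 5 NER label classes (hypothetical shapes).
s = torch.randn(4, 5)
t = torch.randn(4, 5)
y = torch.tensor([0, 2, 1, 4])
print(dkd_loss(s, t, y))
```

Weighting NCKD separately (here via `beta`) is what the decoupling buys: in vanilla KD the non-target term is implicitly scaled by the teacher's confidence on the gold label, which suppresses it exactly when the teacher is confident.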
- Published
2024