
Classifier-adaptation knowledge distillation framework for relation extraction and event detection with imbalanced data.

Authors:
Song, Dandan
Xu, Jing
Pang, Jinhui
Huang, Heyan
Source:
Information Sciences. Sep 2021, Vol. 573, p222-238. 17p.
Publication Year:
2021

Abstract

Fundamental information extraction tasks, such as relation extraction and event detection, suffer from a data imbalance problem. To alleviate this problem, existing methods rely mostly on well-designed loss functions to reduce the negative influence of imbalanced data. However, this approach requires additional hyper-parameters and limits scalability. Furthermore, these methods benefit only specific tasks and do not provide a unified framework across relation extraction and event detection. In this paper, a Classifier-Adaptation Knowledge Distillation (CAKD) framework is proposed to address these issues and thereby improve relation extraction and event detection performance. The framework first exploits sentence-level identification information shared by relation extraction and event detection, which reduces identification errors caused by the data imbalance problem without relying on additional hyper-parameters. A teacher network then uses this sentence-level identification information to guide the baseline model's training by sharing its classifier. Like an instructor, the shared classifier improves the baseline model's ability to extract this sentence-level identification information from raw texts, thus benefiting overall performance. Experiments were conducted on both relation extraction and event detection using the Text Analysis Conference Relation Extraction Dataset (TACRED) and the Automatic Content Extraction (ACE) 2005 English dataset, respectively. The results demonstrate the effectiveness of the proposed framework. [ABSTRACT FROM AUTHOR]
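To make the classifier-sharing idea in the abstract concrete, the sketch below shows one plausible reading: a teacher network is trained on sentence-level identification (whether a sentence contains any relation or event at all), and the baseline model reuses that classifier head alongside its own task-specific head, so its features are pulled toward the teacher's identification space. This is a minimal PyTorch sketch, not the authors' implementation; the encoder choice, layer sizes, the loss weight alpha, and the decision to freeze the shared head are all illustrative assumptions.

import torch
import torch.nn as nn
import torch.nn.functional as F

class Teacher(nn.Module):
    # Learns sentence-level identification (e.g., "does this sentence
    # contain any relation/event?") and exposes its classifier head.
    def __init__(self, input_size=128, hidden=256):
        super().__init__()
        self.encoder = nn.GRU(input_size=input_size, hidden_size=hidden,
                              batch_first=True)
        self.classifier = nn.Linear(hidden, 2)  # head later shared with the baseline

    def forward(self, x):
        _, h = self.encoder(x)          # h: (num_layers, batch, hidden)
        return self.classifier(h[-1])   # sentence-level identification logits

class Baseline(nn.Module):
    # Baseline extractor whose sentence-level head is the teacher's
    # classifier; freezing it is an assumption made for this sketch.
    def __init__(self, shared_classifier, input_size=128, hidden=256,
                 num_labels=42):
        super().__init__()
        self.encoder = nn.GRU(input_size=input_size, hidden_size=hidden,
                              batch_first=True)
        self.task_head = nn.Linear(hidden, num_labels)  # relation/event types
        self.shared_classifier = shared_classifier
        for p in self.shared_classifier.parameters():
            p.requires_grad_(False)

    def forward(self, x):
        _, h = self.encoder(x)
        feat = h[-1]
        return self.task_head(feat), self.shared_classifier(feat)

def train_step(model, batch_x, task_labels, sent_labels, optimizer, alpha=0.5):
    # One training step: task loss plus guidance from the shared classifier;
    # alpha is an illustrative weight, not a value from the paper.
    task_logits, ident_logits = model(batch_x)
    loss = (F.cross_entropy(task_logits, task_labels)
            + alpha * F.cross_entropy(ident_logits, sent_labels))
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Usage: train the teacher on sentence-level identification labels first,
# then hand its classifier to the baseline model.
#   teacher = Teacher()
#   # ... train teacher ...
#   baseline = Baseline(teacher.classifier)
#   opt = torch.optim.Adam(p for p in baseline.parameters() if p.requires_grad)

Sharing the head (rather than distilling soft targets into a separate head) means the baseline's encoder must produce features the teacher's classifier already understands, which is one way to read the abstract's claim that the classifier acts "like an instructor" without introducing extra hyper-parameters beyond the loss weight.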

Details

Language:
English
ISSN:
0020-0255
Volume:
573
Database:
Academic Search Index
Journal:
Information Sciences
Publication Type:
Periodical
Accession number:
151980092
Full Text:
https://doi.org/10.1016/j.ins.2021.05.045