A recollect-tuning method for entity and relation extraction.
- Source :
- Expert Systems with Applications. Jul 2024, Vol. 245, p. N.PAG-N.PAG. 1p.
- Publication Year :
- 2024
Abstract
- Fine-tuning and mask-tuning (or prompt tuning) are two approaches to constructing deep neural networks for entity and relation extraction. Fine-tuning-based models optimize the network with a task-relevant objective, in which pre-trained language models (PLMs) mainly serve as external resources for word embedding. In mask-tuning models, the network is optimized with the same pre-training objective as the PLM and directly outputs verbalized entity type representations, which is effective at exploiting the latent knowledge of PLMs. In this paper, we propose a recollect-tuning approach, which jointly exploits the mechanisms of both fine-tuning and mask-tuning. Recollect-tuning iteratively masks tokens in a candidate entity span, and classification is based on both the masked token representation and the entity span representation, analogous to making a decision from incomplete information. During training, the deep network is optimized with a task-relevant objective, which strengthens the semantic representation of each entity span. This is effective for learning entity noise-invariant features and for taking full advantage of the latent knowledge of PLMs. Our method is evaluated on three public benchmarks (the ACE 2004, ACE 2005 and SciERC datasets) for the entity and relation extraction task. The results show significant improvements on both tasks, outperforming the state of the art on ACE04, ACE05 and SciERC by +0.4%, +0.6%, and +0.5%, respectively. • A unified framework combines fine-tuning and mask-tuning for NER and RE. • The proposed recollect-tuning approach can learn noise-invariant entity features. • A grouped mask attention is designed to model the inter-span token interaction. [ABSTRACT FROM AUTHOR]
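The iterative span masking described in the abstract can be sketched as follows. This is a minimal illustration only, not the authors' implementation: the function name `mask_span_tokens`, the `MASK` placeholder, and the example sentence are all hypothetical, and the downstream classification over masked and span representations is omitted.

```python
# Hypothetical sketch: iteratively mask each token inside a candidate
# entity span, producing one masked copy of the sentence per token.
# In recollect-tuning as the abstract describes it, each masked copy
# would feed a PLM, and classification would use both the masked-token
# representation and the span representation.

MASK = "[MASK]"  # illustrative placeholder symbol

def mask_span_tokens(tokens, span_start, span_end):
    """Yield (position, masked_copy) pairs, masking one span token
    at a time over the half-open range [span_start, span_end)."""
    for i in range(span_start, span_end):
        masked = list(tokens)
        masked[i] = MASK
        yield i, masked

tokens = ["The", "ACE", "2005", "corpus", "covers", "news", "text"]
# Candidate entity span "ACE 2005 corpus": indices 1..4 (end-exclusive)
for idx, masked in mask_span_tokens(tokens, 1, 4):
    print(idx, " ".join(masked))
```

Each masked variant exposes the model to an "incomplete" view of the span, which is what lets the training objective reward noise-invariant span features.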
- Subjects :
- *ARTIFICIAL neural networks
*LANGUAGE models
*PSEUDOPOTENTIAL method
Details
- Language :
- English
- ISSN :
- 09574174
- Volume :
- 245
- Database :
- Academic Search Index
- Journal :
- Expert Systems with Applications
- Publication Type :
- Academic Journal
- Accession number :
- 176151934
- Full Text :
- https://doi.org/10.1016/j.eswa.2023.123000