
A novel joint extraction model based on cross-attention mechanism and global pointer using context shield window.

Authors :
Zhai, Zhengwei
Fan, Rongli
Huang, Jie
Xiong, Neal
Zhang, Lijuan
Wan, Jian
Zhang, Lei
Source :
Computer Speech & Language, Aug 2024, Vol. 87.
Publication Year :
2024

Abstract

Relational triple extraction is a critical step in knowledge graph construction. Compared to pipeline-based extraction, joint extraction has attracted growing attention because it makes better use of entity and relation information without introducing error propagation. The challenge for joint extraction, however, lies in handling overlapping triples. Existing approaches rely on sequential steps or multiple modules, which tend to accumulate errors and suffer interference from redundant information. In this study, we propose an innovative joint extraction model that combines a cross-attention mechanism with a global pointer equipped with a context shield window. Specifically, our method first feeds text into a pre-trained RoBERTa model to produce word vector representations. These embeddings are then passed through a modified cross-attention layer together with entity-type embeddings to compensate for missing entity-type information. Next, we employ the global pointer to recast the extraction task as a quintuple extraction problem, which resolves the issue of overlapping triples. Notably, we design a context shield window on the global pointer that restricts entity identification to a limited range during entity extraction. Finally, adversarial training is added during training to improve the model's robustness against malicious samples. Our approach outperforms mainstream models, achieving strong results on three publicly available datasets.

• The overlapping-triple problem is solved by employing a global pointer.
• A context shield window is designed to eliminate redundant information.
• Entity-type information is merged in through a dedicated entity-type cross-attention module.
• Adversarial training is added to improve generalization ability.

[ABSTRACT FROM AUTHOR]
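The entity-type fusion step can be illustrated with a short sketch. The record does not spell out the paper's modified cross-attention layer, so the design below — tokens querying a small table of entity-type embeddings through standard multi-head cross-attention, with a residual fusion — is only one plausible reading; the class name `EntityTypeCrossAttention` and the parameters `num_types` and `heads` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class EntityTypeCrossAttention(nn.Module):
    """Fuse entity-type information into token representations via
    cross-attention (a minimal sketch, not the paper's exact layer)."""

    def __init__(self, hidden=768, num_types=4, heads=8):
        super().__init__()
        self.type_emb = nn.Embedding(num_types, hidden)
        self.attn = nn.MultiheadAttention(hidden, heads, batch_first=True)

    def forward(self, h):
        # h: (batch, seq_len, hidden) RoBERTa token outputs.
        # Queries are tokens; keys/values are entity-type embeddings,
        # so each token gathers the type information it is missing.
        types = self.type_emb.weight.unsqueeze(0).expand(h.size(0), -1, -1)
        fused, _ = self.attn(query=h, key=types, value=types)
        return h + fused  # residual fusion of type information
```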
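The global pointer with a context shield window can likewise be sketched. Common global pointer heads (in the style of Su et al.'s GlobalPointer) score every candidate (start, end) span with a bilinear form; the shield window described in the abstract plausibly masks spans outside a limited range. The `window` parameter and the masking rule below are assumptions for illustration, not the paper's exact formulation.

```python
import torch
import torch.nn as nn

def global_pointer_scores(h, q_proj, k_proj, window=10):
    """Score every (start, end) span with a global-pointer-style head.

    h:       (batch, seq_len, hidden) token representations
    q_proj,
    k_proj:  linear layers mapping hidden -> head_dim
    window:  context shield window width -- spans wider than this are
             masked out, so entities are sought within a limited range
             (illustrative; the paper's windowing may differ).
    """
    q = q_proj(h)                                # (batch, seq_len, head_dim)
    k = k_proj(h)                                # (batch, seq_len, head_dim)
    scores = torch.einsum("bmd,bnd->bmn", q, k)  # span scores s(i, j)

    seq_len = h.size(1)
    i = torch.arange(seq_len).unsqueeze(1)       # start index
    j = torch.arange(seq_len).unsqueeze(0)       # end index
    # Mask invalid spans: end before start, or span wider than the window.
    invalid = (j < i) | (j - i >= window)
    return scores.masked_fill(invalid.to(h.device), float("-inf"))

# Usage with random inputs:
hidden, head_dim = 768, 64
q_proj, k_proj = nn.Linear(hidden, head_dim), nn.Linear(hidden, head_dim)
h = torch.randn(2, 50, hidden)
scores = global_pointer_scores(h, q_proj, k_proj, window=10)
```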
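Finally, the abstract mentions adversarial training without naming a variant. FGM-style perturbation of the embedding weights is a common choice for this kind of model, so the step below assumes FGM; `loss_fn`, `batch`, `epsilon`, and the `emb_name` filter are all illustrative.

```python
import torch

def fgm_adversarial_step(model, loss_fn, batch, epsilon=1.0,
                         emb_name="embeddings.word_embeddings"):
    """One FGM-style adversarial training step (a sketch under the
    assumption that the paper uses an FGM-like embedding perturbation)."""
    loss = loss_fn(model(**batch))
    loss.backward()                      # gradients on clean inputs

    # Perturb the embedding weights along the gradient direction.
    backup = {}
    for name, param in model.named_parameters():
        if emb_name in name and param.grad is not None:
            backup[name] = param.data.clone()
            norm = param.grad.norm()
            if norm > 0:
                param.data.add_(epsilon * param.grad / norm)

    adv_loss = loss_fn(model(**batch))   # loss on perturbed embeddings
    adv_loss.backward()                  # accumulate adversarial gradients

    # Restore the original embedding weights before the optimizer step.
    for name, param in model.named_parameters():
        if name in backup:
            param.data = backup[name]
```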

Details

Language :
English
ISSN :
0885-2308
Volume :
87
Database :
Academic Search Index
Journal :
Computer Speech & Language
Publication Type :
Academic Journal
Accession number :
177037999
Full Text :
https://doi.org/10.1016/j.csl.2024.101643