
Zero-shot relation triplet extraction as Next-Sentence Prediction.

Authors :
Liao, Wenxiong
Liu, Zhengliang
Zhang, Yiyang
Huang, Xiaoke
Liu, Ninghao
Liu, Tianming
Li, Quanzheng
Li, Xiang
Cai, Hongmin
Source :
Knowledge-Based Systems. Nov 2024, Vol. 304.
Publication Year :
2024

Abstract

Zero-shot relation triplet extraction (ZeroRTE) endeavors to extract relation triplets from a test set using a model trained on a training set with disjoint relations from the test set. Current ZeroRTE approaches primarily rely on two strategies: 1) combining pre-trained language models to generate additional training samples; 2) adding a large number of parameters that require training from scratch on top of a pre-trained language model. However, the former approach does not ensure the quality of generated samples, and the latter often struggles to generalize to unseen relations in the test set, particularly when the training set is small. In this paper, we introduce a novel method, Next Sentence Prediction for Relation Triplet Extraction (NSP-RTE), abstracting ZeroRTE as a higher-level next sentence prediction (NSP) task to enhance its generalization ability to unseen relation categories. NSP-RTE integrates modules for relation recognition, entity detection, and triplet classification, leveraging pre-trained BERT models with fewer parameters requiring training from scratch, while eliminating the need for additional sample generation. Our experiments on the FewRel and Wiki-ZSL datasets demonstrate that NSP-RTE, with its simple and efficient design, significantly outperforms previous methods.

• This paper transforms the zero-shot relation triplet extraction task into a next-sentence prediction problem for enhanced generalization.
• The proposed method directly addresses the zero-shot relation triplet extraction task based on contextual semantics, eliminating the need for additional sample synthesis.
• Experimental results on publicly available datasets demonstrate that the proposed method significantly outperforms previous methods.

[ABSTRACT FROM AUTHOR]
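
Note (illustrative, not from the article): the minimal Python sketch below shows one way a zero-shot relation decision can be framed as next-sentence prediction with an off-the-shelf BERT NSP head, in the spirit of the abstract's description. The relation labels, example sentence, and verbalization template are assumptions for illustration; this is not the authors' NSP-RTE implementation and omits their entity detection and triplet classification modules.

import torch
from transformers import BertTokenizer, BertForNextSentencePrediction

# Pre-trained BERT with its next-sentence-prediction head; no task-specific
# parameters are trained in this sketch.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForNextSentencePrediction.from_pretrained("bert-base-uncased")
model.eval()

sentence = "Marie Curie was born in Warsaw."                     # assumed example input
candidate_relations = ["place of birth", "employer", "spouse"]   # assumed unseen relation labels

scores = {}
with torch.no_grad():
    for relation in candidate_relations:
        # Verbalize each candidate relation as a "next sentence" hypothesis.
        hypothesis = f"This sentence expresses the relation: {relation}."
        enc = tokenizer(sentence, hypothesis, return_tensors="pt")
        logits = model(**enc).logits  # shape (1, 2); index 0 = "is next sentence"
        scores[relation] = torch.softmax(logits, dim=-1)[0, 0].item()

best = max(scores, key=scores.get)
print(f"Predicted relation: {best} (NSP score {scores[best]:.3f})")

Because the relation label enters only through the verbalized hypothesis text, the same scoring procedure applies to relation categories never seen during training, which is the generalization property the NSP framing is meant to provide.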

Details

Language :
English
ISSN :
0950-7051
Volume :
304
Database :
Academic Search Index
Journal :
Knowledge-Based Systems
Publication Type :
Academic Journal
Accession number :
180797902
Full Text :
https://doi.org/10.1016/j.knosys.2024.112507