BDCore: Bidirectional Decoding with Co-graph Representation for Joint Entity and Relation Extraction.
- Source :
- Knowledge-Based Systems, Jun 2024, Vol. 294.
- Publication Year :
- 2024
Abstract
- Relation extraction has become a crucial step in the automatic construction of Knowledge Graphs (KGs). Recently, researchers have leveraged Sequence-to-Sequence (Seq2Seq) models for Joint Entity and Relation Extraction (JERE). Nevertheless, traditional decoding methods generate the target sequence incrementally from left to right, with no ability to revise earlier predictions once errors occur; this limitation becomes evident when decoding errors arise before the current decoding step. Furthermore, triplets originating from the same sentence exhibit strong correlations among their relations, which previous work has overlooked. In this paper, we propose Bidirectional Decoding with Co-graph representation (BDCore) to address these issues. Specifically, we first introduce a backward decoder that decodes the target sequence in reverse order. The forward decoder then uses two attention mechanisms to simultaneously consider the hidden states of the encoder and the backward decoder, so that backward decoding information helps alleviate the negative impact of forward decoding errors. In addition, we construct a relation co-occurrence graph (Co-graph) and exploit a Graph Convolutional Network (GCN) to capture relation correlations. Extensive experiments demonstrate the benefits of the proposed bidirectional decoding and co-graph representation for relation extraction. Compared to previous methods, our approach significantly outperforms the baselines on the NYT benchmark.
- • Seq2Seq approaches disregard forward decoding errors and relation co-occurrence.
- • A GCN efficiently learns node representations of the relation co-occurrence graph.
- • Bidirectional decoding relieves the negative impact of forward decoding errors.
- • Two attention mechanisms simultaneously consider backward and forward information. [ABSTRACT FROM AUTHOR]
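To make the bidirectional decoding idea concrete, here is a minimal PyTorch sketch of a forward-decoder step that attends over both the encoder states and a precomputed backward (right-to-left) decoding pass. Everything here (the module name `DualAttentionDecoderStep`, single-head attention, the `fuse` projection, the hidden size) is an assumption for illustration; only the idea of combining encoder and backward-decoder contexts comes from the abstract.

```python
import torch
import torch.nn as nn

class DualAttentionDecoderStep(nn.Module):
    """One forward-decoder step with two attention mechanisms:
    one over encoder states, one over backward-decoder states.
    Hypothetical sketch, not the paper's exact architecture."""
    def __init__(self, hidden: int):
        super().__init__()
        self.enc_attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)
        self.bwd_attn = nn.MultiheadAttention(hidden, num_heads=1, batch_first=True)
        self.fuse = nn.Linear(2 * hidden, hidden)

    def forward(self, dec_state, enc_states, bwd_states):
        # dec_state:  (B, 1, H) current forward-decoder hidden state
        # enc_states: (B, S, H) encoder outputs over the source sentence
        # bwd_states: (B, T, H) outputs of the full right-to-left decoding pass
        c_enc, _ = self.enc_attn(dec_state, enc_states, enc_states)
        c_bwd, _ = self.bwd_attn(dec_state, bwd_states, bwd_states)
        # Fusing both contexts lets backward information compensate
        # for errors accumulated in the left-to-right pass.
        return torch.tanh(self.fuse(torch.cat([c_enc, c_bwd], dim=-1)))

step = DualAttentionDecoderStep(hidden=64)
out = step(torch.randn(2, 1, 64), torch.randn(2, 9, 64), torch.randn(2, 7, 64))
print(out.shape)  # torch.Size([2, 1, 64])
```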
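The Co-graph component admits a similarly compact sketch: a single GCN layer over a relation co-occurrence adjacency matrix, where `adj[i, j]` counts how often relation types i and j label triplets from the same sentence. The adjacency construction and the symmetric normalization with self-loops are standard-GCN assumptions on my part, not details taken from the paper.

```python
import torch
import torch.nn as nn

class CoGraphGCN(nn.Module):
    """One GCN layer over a relation co-occurrence graph (Co-graph).
    Hypothetical sketch; embedding size and normalization are assumptions."""
    def __init__(self, num_relations: int, dim: int):
        super().__init__()
        self.embed = nn.Embedding(num_relations, dim)   # initial relation features
        self.weight = nn.Linear(dim, dim, bias=False)

    def forward(self, adj):
        # adj: (R, R) co-occurrence counts between relation types.
        a = adj + torch.eye(adj.size(0))                # add self-loops
        d = a.sum(dim=1).rsqrt()                        # D^{-1/2} (row sums >= 1)
        a_norm = d.unsqueeze(1) * a * d.unsqueeze(0)    # D^{-1/2} A D^{-1/2}
        h = self.embed.weight                           # (R, dim)
        return torch.relu(a_norm @ self.weight(h))      # correlation-aware relation reps

# Toy Co-graph over 3 relation types (counts are made up):
adj = torch.tensor([[0., 3., 1.],
                    [3., 0., 0.],
                    [1., 0., 0.]])
rel_repr = CoGraphGCN(num_relations=3, dim=16)(adj)  # shape (3, 16)
```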
- Subjects :
- *DECODERS & decoding
- *KNOWLEDGE graphs
- *RESEARCH personnel
- *FORECASTING
Details
- Language :
- English
- ISSN :
- 09507051
- Volume :
- 294
- Database :
- Academic Search Index
- Journal :
- Knowledge-Based Systems
- Publication Type :
- Academic Journal
- Accession number :
- 177088957
- Full Text :
- https://doi.org/10.1016/j.knosys.2024.111781