A pattern-aware self-attention network for distant supervised relation extraction.
- Source :
- Information Sciences. Jan 2022, Vol. 584, p. 269-279. 11p.
- Publication Year :
- 2022
Abstract
- Distant supervised relation extraction is an efficient strategy for finding relational facts in unstructured text without labeled training data. A recent paradigm for developing relation extractors uses pre-trained Transformer language models to produce high-quality sentence representations. However, because the original Transformer is weak at capturing local dependencies and phrasal structures, existing Transformer-based methods cannot identify the various relational patterns in sentences. To address this issue, we propose a novel distant supervised relation extraction model, which employs a specifically designed pattern-aware self-attention network to automatically discover relational patterns for pre-trained Transformers in an end-to-end manner. Specifically, the proposed method assumes that the correlation between two adjacent tokens reflects the probability that they belong to the same pattern. Based on this assumption, a novel self-attention network is designed to generate the probability distribution of all patterns in a sentence. This probability distribution is then applied as a constraint in the first Transformer layer to encourage its attention heads to follow the relational pattern structures. As a result, fine-grained pattern information is enhanced in the pre-trained Transformer without losing global dependencies. Extensive experimental results on two popular benchmark datasets demonstrate that our model performs better than state-of-the-art baselines. [ABSTRACT FROM AUTHOR]
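The abstract describes the mechanism only at a high level. The sketch below is one plausible, minimal PyTorch reading of it: adjacent-token correlation is turned into a "same pattern" probability, accumulated into a span-level log-probability, and added as a soft bias to first-layer attention scores. The class name, the cosine-similarity proxy for token correlation, the sigmoid link, and all hyperparameters are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class PatternAwareSelfAttention(nn.Module):
    # Biases every attention head toward spans of adjacent tokens that are
    # likely to belong to the same relational pattern (a soft constraint).
    def __init__(self, d_model=768, n_heads=12):
        super().__init__()
        self.n_heads, self.d_head = n_heads, d_model // n_heads
        self.qkv = nn.Linear(d_model, 3 * d_model)
        self.out = nn.Linear(d_model, d_model)

    def forward(self, x):  # x: (batch, seq, d_model)
        b, s, d = x.shape
        # Assumed proxy: correlation of adjacent tokens ~ P(same pattern).
        stay = torch.sigmoid(F.cosine_similarity(x[:, :-1], x[:, 1:], dim=-1))
        # Cumulative log-probabilities; bias[i, j] is the log-probability
        # that every adjacent pair between positions i and j "stays" in
        # one pattern, so attention inside a pattern span is boosted.
        cum = F.pad(torch.cumsum(torch.log(stay + 1e-9), dim=-1), (1, 0))
        bias = -(cum.unsqueeze(2) - cum.unsqueeze(1)).abs()  # (b, s, s)

        q, k, v = self.qkv(x).chunk(3, dim=-1)
        shape = (b, s, self.n_heads, self.d_head)
        q, k, v = (t.view(shape).transpose(1, 2) for t in (q, k, v))
        scores = q @ k.transpose(-2, -1) / self.d_head ** 0.5
        attn = (scores + bias.unsqueeze(1)).softmax(dim=-1)
        ctx = (attn @ v).transpose(1, 2).reshape(b, s, d)
        return self.out(ctx)

# Example: encode a batch of 2 sentences of 16 tokens.
layer = PatternAwareSelfAttention()
out = layer(torch.randn(2, 16, 768))  # -> (2, 16, 768)

Adding the bias in log space keeps the constraint soft: heads can still attend across pattern boundaries when the content signal is strong, which is consistent with the abstract's claim that global dependencies are preserved.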
- Subjects :
- *DISTRIBUTION (Probability theory)
- *PROBABILITY theory
Details
- Language :
- English
- ISSN :
- 0020-0255
- Volume :
- 584
- Database :
- Academic Search Index
- Journal :
- Information Sciences
- Publication Type :
- Periodical
- Accession number :
- 154049274
- Full Text :
- https://doi.org/10.1016/j.ins.2021.10.047