
Distant supervision for relation extraction with hierarchical selective attention.

Authors :
Zhou, Peng
Xu, Jiaming
Qi, Zhenyu
Bao, Hongyun
Chen, Zhineng
Xu, Bo
Source :
Neural Networks. Dec2018, Vol. 108, p240-247. 8p.
Publication Year :
2018

Abstract

Distantly supervised relation extraction is an important task in the field of natural language processing. Most state-of-the-art methods have two main shortcomings. One is that they take all sentences mentioning an entity pair as input, which results in a large computational cost, even though a few of the most relevant sentences are enough to recognize the relation of an entity pair. To tackle these problems, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects the most relevant sentences by applying coarse sentence-level attention over all sentences of an entity pair, then employs word-level attention to construct sentence representations, and finally applies fine sentence-level attention to aggregate these sentence representations. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most existing methods. [ABSTRACT FROM AUTHOR]
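The three-stage pipeline described in the abstract (coarse sentence selection, word-level attention, fine sentence-level aggregation) can be sketched roughly as follows. This is a minimal NumPy illustration, not the authors' model: it assumes pre-computed word embeddings, uses simple dot-product attention against a single relation query vector, and omits the learned encoder and trained parameters of the actual network. All names and shapes here are hypothetical.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def hierarchical_selective_attention(sentences, query, k=2):
    """Sketch of hierarchical selective attention (hypothetical simplification).

    sentences: list of (n_words_i, d) word-embedding arrays, one per sentence
               mentioning the entity pair.
    query:     (d,) relation query vector (learned in the real model).
    k:         number of sentences kept by the coarse selection stage.
    """
    # Stage 1 -- coarse sentence-level attention: score each sentence by the
    # similarity of its mean word vector to the query, keep only the top-k.
    coarse_scores = np.array([s.mean(axis=0) @ query for s in sentences])
    top = np.argsort(coarse_scores)[-k:]
    selected = [sentences[i] for i in top]

    # Stage 2 -- word-level attention: build each sentence representation as
    # a query-weighted average of its word vectors.
    reps = []
    for s in selected:
        w = softmax(s @ query)     # (n_words,)
        reps.append(w @ s)         # (d,)
    reps = np.stack(reps)          # (k, d)

    # Stage 3 -- fine sentence-level attention: aggregate the k sentence
    # representations into one entity-pair representation.
    a = softmax(reps @ query)      # (k,)
    return a @ reps                # (d,)

# Toy usage with random embeddings.
rng = np.random.default_rng(0)
d = 8
sents = [rng.normal(size=(n, d)) for n in (5, 3, 7)]
q = rng.normal(size=d)
rep = hierarchical_selective_attention(sents, q, k=2)  # shape (d,)
```

Note the computational point made in the abstract: stages 2 and 3 operate only on the k selected sentences, so the expensive per-word attention is never computed for sentences discarded by the coarse pass.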

Details

Language :
English
ISSN :
0893-6080
Volume :
108
Database :
Academic Search Index
Journal :
Neural Networks
Publication Type :
Academic Journal
Accession number :
133047556
Full Text :
https://doi.org/10.1016/j.neunet.2018.08.016