Distant supervision for relation extraction with hierarchical selective attention.
- Author
-
Zhou, Peng, Xu, Jiaming, Qi, Zhenyu, Bao, Hongyun, Chen, Zhineng, and Xu, Bo
- Subjects
- NATURAL language processing, ARTIFICIAL intelligence, COMPUTATIONAL linguistics, SEMANTIC computing, SEMANTICS
- Abstract
Abstract Distant supervised relation extraction is an important task in the field of natural language processing. There are two main shortcomings for most state-of-the-art methods. One is that they take all sentences of an entity pair as input, which would result in a large computational cost. But in fact, few of most relevant sentences are enough to recognize the relation of an entity pair. To tackle these problems, we propose a novel hierarchical selective attention network for relation extraction under distant supervision. Our model first selects most relevant sentences by taking coarse sentence-level attention on all sentences of an entity pair and then employs word-level attention to construct sentence representations and fine sentence-level attention to aggregate these sentence representations. Experimental results on a widely used dataset demonstrate that our method performs significantly better than most of existing methods. [ABSTRACT FROM AUTHOR]
- Published
- 2018