
Syntactically Aware Neural Architectures for Definition Extraction

Authors:
Luis Espinosa Anke
Steven Schockaert
Source:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), NAACL-HLT 2018

Abstract

Automatically identifying definitional knowledge in text corpora (Definition Extraction or DE) is an important task with direct applications in, among others, Automatic Glossary Generation, Taxonomy Learning, Question Answering and Semantic Search. It is generally cast as a binary classification problem between definitional and non-definitional sentences. In this paper we present a set of neural architectures combining Convolutional and Recurrent Neural Networks, which are further enriched by incorporating linguistic information via syntactic dependencies. Our experimental results on the task of sentence classification, on two benchmark DE datasets (one generic, one domain-specific), show that these models obtain consistent state-of-the-art results. Furthermore, we demonstrate that models trained on clean Wikipedia-like definitions can be successfully applied to noisier domain-specific corpora.
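To illustrate the general family of models the abstract describes, the sketch below shows a binary sentence classifier that combines a 1-D convolution with a bidirectional LSTM, in PyTorch. It is not the authors' exact architecture: the hyperparameters, the class name and the suggestion of initialising embeddings from dependency-based word vectors are illustrative assumptions made for this example.

```python
# Minimal sketch (not the authors' exact model): word embeddings are passed
# through a 1-D convolution and a BiLSTM, and a single sigmoid unit predicts
# whether the sentence is definitional. All sizes are assumed for illustration.
import torch
import torch.nn as nn

class CnnBlstmDefinitionClassifier(nn.Module):
    def __init__(self, vocab_size=20000, emb_dim=100, n_filters=100,
                 kernel_size=3, hidden_dim=128):
        super().__init__()
        # Embeddings could be initialised from dependency-based word vectors
        # to inject syntactic information (an assumption of this sketch).
        self.embedding = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.conv = nn.Conv1d(emb_dim, n_filters, kernel_size, padding=1)
        self.lstm = nn.LSTM(n_filters, hidden_dim, batch_first=True,
                            bidirectional=True)
        self.out = nn.Linear(2 * hidden_dim, 1)

    def forward(self, token_ids):
        # token_ids: (batch, seq_len) integer word indices, 0 = padding
        x = self.embedding(token_ids)            # (batch, seq_len, emb_dim)
        x = x.transpose(1, 2)                    # (batch, emb_dim, seq_len)
        x = torch.relu(self.conv(x))             # (batch, n_filters, seq_len)
        x = x.transpose(1, 2)                    # (batch, seq_len, n_filters)
        _, (h_n, _) = self.lstm(x)               # h_n: (2, batch, hidden_dim)
        h = torch.cat([h_n[0], h_n[1]], dim=-1)  # final forward/backward states
        return torch.sigmoid(self.out(h)).squeeze(-1)  # P(definitional)

# Usage: score a batch of two padded sentences of length 12.
model = CnnBlstmDefinitionClassifier()
dummy_batch = torch.randint(1, 20000, (2, 12))
print(model(dummy_batch))  # two probabilities in [0, 1]
```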

Details

Language:
English
Database:
OpenAIRE
Journal:
Proceedings of the 2018 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 2 (Short Papers), NAACL-HLT 2018
Accession number:
edsair.doi.dedup.....bb55e6378511e30a5fbfa980056b0c1e