
Modeling Content and Context with Deep Relational Learning

Authors:
Dan Goldwasser
Maria Leonor Pacheco
Source:
Transactions of the Association for Computational Linguistics, 9:100-119
Publication Year:
2021
Publisher:
MIT Press - Journals, 2021.

Abstract

Building models for realistic natural language tasks requires dealing with long texts and accounting for complicated structural dependencies. Neural-symbolic representations have emerged as a way to combine the reasoning capabilities of symbolic methods with the expressiveness of neural networks. However, most existing frameworks for combining neural and symbolic representations were designed for classic relational learning tasks that operate over a universe of symbolic entities and relations. In this paper, we present DRaiL, an open-source declarative framework for specifying deep relational models, designed to support a variety of NLP scenarios. Our framework supports easy integration with expressive language encoders, and provides an interface to study the interactions between representation, inference, and learning.

TACL pre-MIT Press version.
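The abstract describes DRaiL only at a high level, so the following is a minimal sketch, in plain PyTorch, of the general neuro-symbolic pattern the paper targets: a declarative rule (here, Sentence(s) => HasLabel(s, y)) whose potential is computed by a neural scorer over an encoder's output, followed by an inference step over the resulting scores. All names below (RuleScorer, map_inference, the EmbeddingBag stand-in encoder) are hypothetical illustrations and are not DRaiL's actual API.

import torch
import torch.nn as nn

class RuleScorer(nn.Module):
    """Scores groundings of one rule, e.g. Sentence(s) => HasLabel(s, y)."""
    def __init__(self, input_dim: int, num_labels: int):
        super().__init__()
        self.classifier = nn.Linear(input_dim, num_labels)

    def forward(self, encoding: torch.Tensor) -> torch.Tensor:
        # One potential per candidate label; inference chooses among them.
        return self.classifier(encoding)

def map_inference(potentials: torch.Tensor) -> torch.Tensor:
    # Trivial MAP inference: independently pick the highest-scoring label
    # per grounding. A full framework would instead solve a joint inference
    # problem (e.g. an ILP) over many interdependent rules.
    return potentials.argmax(dim=-1)

# Usage: pretend `encoder` maps text to vectors (a BERT-style encoder in
# practice); an EmbeddingBag over token ids is a cheap stand-in here.
encoder = nn.EmbeddingBag(num_embeddings=1000, embedding_dim=64)
scorer = RuleScorer(input_dim=64, num_labels=3)

token_ids = torch.randint(0, 1000, (4, 12))   # 4 sentences, 12 tokens each
potentials = scorer(encoder(token_ids))       # rule potentials per label
print(map_inference(potentials))              # predicted label per sentence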

Details

ISSN:
2307-387X
Volume:
9
Database:
OpenAIRE
Journal:
Transactions of the Association for Computational Linguistics
Accession number:
edsair.doi.dedup.....ab118d2aedaf8502e998ccf2da63f5dc
Full Text:
https://doi.org/10.1162/tacl_a_00357