Modeling Content and Context with Deep Relational Learning
- Authors
Dan Goldwasser and Maria Leonor Pacheco
- Subjects
Computation and Language (cs.CL), Artificial Intelligence (cs.AI), Statistical relational learning, Artificial neural networks, Inference, Natural language
- Abstract
Building models for realistic natural language tasks requires dealing with long texts and accounting for complicated structural dependencies. Neural-symbolic representations have emerged as a way to combine the reasoning capabilities of symbolic methods with the expressiveness of neural networks. However, most existing frameworks for combining neural and symbolic representations have been designed for classic relational learning tasks that work over a universe of symbolic entities and relations. In this paper, we present DRaiL, an open-source declarative framework for specifying deep relational models, designed to support a variety of NLP scenarios. Our framework supports easy integration with expressive language encoders, and provides an interface to study the interactions between representation, inference and learning.
- Comments
TACL pre-MIT Press version
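To make the abstract's idea concrete, here is a minimal, hypothetical sketch of the neural-symbolic pattern the paper describes: a declarative rule template (e.g., "HasText(x) => HasLabel(x, y)") whose groundings are scored by a neural encoder, with prediction cast as inference over those scores. The names `RuleScorer` and `map_inference` and the toy bag-of-words encoder are illustrative assumptions, not DRaiL's actual API.

```python
# Illustrative sketch (assumed names, not DRaiL's API): one declarative
# rule template is scored by a neural encoder; prediction is inference
# over the scored groundings.
import torch
import torch.nn as nn


class RuleScorer(nn.Module):
    """Scores the groundings of one rule template with a neural encoder."""

    def __init__(self, vocab_size: int, hidden: int, num_labels: int):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.EmbeddingBag(vocab_size, hidden),  # toy stand-in for an expressive text encoder
            nn.ReLU(),
            nn.Linear(hidden, num_labels),        # one score per candidate label grounding
        )

    def forward(self, token_ids: torch.Tensor) -> torch.Tensor:
        # token_ids: (batch, seq_len) -> scores: (batch, num_labels)
        return self.encoder(token_ids)


def map_inference(scores: torch.Tensor) -> torch.Tensor:
    # Trivial independent inference: pick the highest-scoring grounding
    # per instance. A full framework would instead solve a joint program
    # (e.g., an ILP) over the groundings of *all* rules, which is where
    # the structural dependencies between decisions come in.
    return scores.argmax(dim=-1)


if __name__ == "__main__":
    scorer = RuleScorer(vocab_size=100, hidden=16, num_labels=3)
    batch = torch.randint(0, 100, (2, 8))   # two texts, eight token ids each
    print(map_inference(scorer(batch)))     # predicted label per text
```

The design point this sketch is meant to surface: the symbolic side fixes *what* is being decided (the rule template and its groundings), while the neural side supplies *how well* each grounding holds, so encoders, inference procedures, and learning regimes can be swapped independently.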
- Published
- 2021