Modeling Content and Context with Deep Relational Learning
- Source :
- Transactions of the Association for Computational Linguistics. 9:100-119
- Publication Year :
- 2021
- Publisher :
- MIT Press - Journals, 2021.
Abstract
- Building models for realistic natural language tasks requires dealing with long texts and accounting for complicated structural dependencies. Neural-symbolic representations have emerged as a way to combine the reasoning capabilities of symbolic methods with the expressiveness of neural networks. However, most of the existing frameworks for combining neural and symbolic representations have been designed for classic relational learning tasks that work over a universe of symbolic entities and relations. In this paper, we present DRaiL, an open-source declarative framework for specifying deep relational models, designed to support a variety of NLP scenarios. Our framework supports easy integration with expressive language encoders, and provides an interface to study the interactions between representation, inference and learning. (TACL pre-MIT Press version.)
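- Note: the sketch below is not DRaiL's actual interface. It is a minimal, hypothetical PyTorch illustration of the general neural-symbolic pattern the abstract describes, where declarative rules have predicates scored by neural text encoders and combined with soft logic; all names (TinyEncoder, the toy Agree/Stance rule, the random token ids) are invented for illustration.

    # Illustrative sketch only -- NOT DRaiL's API. Shows neural predicate scorers
    # feeding a soft-logic rule:  Agree(a, b) & Stance(a, s)  =>  Stance(b, s)
    import torch
    import torch.nn as nn

    class TinyEncoder(nn.Module):
        """Maps a bag of token ids to the probability that a predicate holds."""
        def __init__(self, vocab_size=1000, dim=32):
            super().__init__()
            self.emb = nn.EmbeddingBag(vocab_size, dim)  # mean-pooled embeddings
            self.clf = nn.Linear(dim, 1)

        def forward(self, token_ids):
            # token_ids: LongTensor of shape (1, seq_len)
            return torch.sigmoid(self.clf(self.emb(token_ids))).squeeze()

    agree_scorer = TinyEncoder()   # scores Agree(a, b) from the concatenated texts
    stance_scorer = TinyEncoder()  # scores Stance(a, s) from text a

    a_ids = torch.randint(0, 1000, (1, 12))  # stand-in for an encoded text a
    b_ids = torch.randint(0, 1000, (1, 12))  # stand-in for an encoded text b

    p_agree = agree_scorer(torch.cat([a_ids, b_ids], dim=1))  # P(Agree(a, b))
    p_stance_a = stance_scorer(a_ids)                         # P(Stance(a, s))

    # Lukasiewicz soft conjunction of the rule body, max(0, x + y - 1); under the
    # rule, the head Stance(b, s) must hold to at least this degree.
    body = torch.clamp(p_agree + p_stance_a - 1.0, min=0.0)
    print(f"soft rule-body score: {body.item():.3f}")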
- Subjects :
- FOS: Computer and information sciences
Linguistics and Language
Computer Science - Computation and Language (cs.CL)
Artificial neural network
Computer Science - Artificial Intelligence (cs.AI)
Computer science
Communication
Statistical relational learning
Inference
Context (language use)
Computer Science Applications
Human-Computer Interaction
Artificial Intelligence
Natural language
Details
- ISSN :
- 2307-387X
- Volume :
- 9
- Database :
- OpenAIRE
- Journal :
- Transactions of the Association for Computational Linguistics
- Accession number :
- edsair.doi.dedup.....ab118d2aedaf8502e998ccf2da63f5dc
- Full Text :
- https://doi.org/10.1162/tacl_a_00357