
Learning to refer informatively by amortizing pragmatic reasoning

Authors:
White, Julia
Mu, Jesse
Goodman, Noah D.

Publication Year:
2010

Abstract

A hallmark of human language is the ability to effectively and efficiently convey contextually relevant information. One theory for how humans reason about language is presented in the Rational Speech Acts (RSA) framework, which captures pragmatic phenomena via a process of recursive social reasoning (Goodman & Frank, 2016). However, RSA represents ideal reasoning in an unconstrained setting. We explore the idea that speakers might learn to amortize the cost of RSA computation over time by directly optimizing for successful communication with an internal listener model. In simulations with grounded neural speakers and listeners across two communication game datasets representing synthetic and human-generated data, we find that our amortized model is able to quickly generate language that is effective and concise across a range of contexts, without the need for explicit pragmatic reasoning.

Comment: Accepted to CogSci 2020
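The recursive social reasoning the abstract refers to can be illustrated with a minimal sketch of one RSA recursion step in a toy reference game. This is a hypothetical example for exposition, not the authors' implementation: the objects, utterances, uniform prior, and zero utterance cost are all assumptions chosen for simplicity.

```python
import math

# Toy reference game: three objects, four candidate utterances.
objects = ["blue_square", "blue_circle", "green_square"]
utterances = ["blue", "green", "square", "circle"]

def literal(u, o):
    """Truth-conditional lexicon: does utterance u literally apply to object o?"""
    return 1.0 if u in o else 0.0

def normalize(scores):
    total = sum(scores.values())
    return {k: v / total for k, v in scores.items()}

def L0(u):
    """Literal listener: L0(o | u) proportional to [[u]](o), uniform object prior."""
    return normalize({o: literal(u, o) for o in objects})

def S1(o, alpha=1.0):
    """Pragmatic speaker: S1(u | o) proportional to exp(alpha * log L0(o | u)),
    with utterance cost set to zero for simplicity."""
    scores = {}
    for u in utterances:
        l0 = L0(u).get(o, 0.0)
        scores[u] = math.exp(alpha * math.log(l0)) if l0 > 0 else 0.0
    return normalize(scores)

def L1(u):
    """Pragmatic listener: L1(o | u) proportional to S1(u | o), uniform prior."""
    return normalize({o: S1(o).get(u, 0.0) for o in objects})
```

In this sketch, hearing "blue" leads the pragmatic listener L1 to favor the blue square over the blue circle, because a speaker referring to the circle would more likely have said "circle". The paper's amortized approach replaces this explicit recursion with a learned neural speaker trained against an internal listener model.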

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.2006.00418
Document Type:
Working Paper