
Calibrate your listeners! Robust communication-based training for pragmatic speakers

Authors :
Wang, Rose E.
White, Julia
Mu, Jesse
Goodman, Noah D.
Publication Year :
2021

Abstract

To be good conversational partners, natural language processing (NLP) systems should be trained to produce contextually useful utterances. Prior work has investigated training NLP systems with communication-based objectives, where a neural listener stands in as a communication partner. However, these systems commonly suffer from semantic drift where the learned language diverges radically from natural language. We propose a method that uses a population of neural listeners to regularize speaker training. We first show that language drift originates from the poor uncertainty calibration of a neural listener, which makes high-certainty predictions on novel sentences. We explore ensemble- and dropout-based populations of listeners and find that the former results in better uncertainty quantification. We evaluate both population-based objectives on reference games, and show that the ensemble method with better calibration enables the speaker to generate pragmatic utterances while scaling to a large vocabulary and generalizing to new games and listeners.

Comment: Findings of EMNLP 2021. Code: https://github.com/rosewang2008/calibrate_your_listeners
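
The core mechanism the abstract describes, averaging the predictions of an ensemble of listeners to obtain better-calibrated rewards for speaker training, can be illustrated with a short sketch. This is not the authors' implementation (see the linked repository for that); the class names, ensemble size, and loss form below are hypothetical stand-ins for illustration only.

```python
# Minimal sketch of ensemble-listener calibration for speaker training.
# Hypothetical names throughout; not the paper's actual code.

import torch
import torch.nn as nn

NUM_LISTENERS = 5  # ensemble size (hypothetical choice)

class Listener(nn.Module):
    """Toy listener: scores each candidate referent given an utterance encoding."""
    def __init__(self, utt_dim: int, obj_dim: int):
        super().__init__()
        self.utt_proj = nn.Linear(utt_dim, obj_dim)

    def forward(self, utt_emb: torch.Tensor, objs: torch.Tensor) -> torch.Tensor:
        # utt_emb: (batch, utt_dim); objs: (batch, n_objs, obj_dim)
        query = self.utt_proj(utt_emb).unsqueeze(-1)   # (batch, obj_dim, 1)
        logits = torch.bmm(objs, query).squeeze(-1)    # (batch, n_objs)
        return logits

def ensemble_listener_probs(listeners, utt_emb, objs):
    """Average per-listener distributions over referents.

    Averaging probabilities (not logits) across independently trained
    listeners is the standard deep-ensemble recipe for calibrated
    uncertainty; a dropout-based population would instead run one
    listener multiple times with dropout active.
    """
    probs = torch.stack(
        [torch.softmax(l(utt_emb, objs), dim=-1) for l in listeners]
    )                                                  # (K, batch, n_objs)
    return probs.mean(dim=0)                           # (batch, n_objs)

def speaker_loss(listeners, utt_emb, objs, target_idx):
    """Speaker objective (sketch): negative log-probability that the
    calibrated ensemble listener picks the intended target referent."""
    p = ensemble_listener_probs(listeners, utt_emb, objs)
    target_p = p.gather(-1, target_idx.unsqueeze(-1)).squeeze(-1)
    return -torch.log(target_p + 1e-9).mean()
```

Under this reading, a poorly calibrated single listener can assign high probability to the target even for drifted, unnatural utterances, whereas an ensemble tends to disagree on such inputs, lowering the averaged reward and discouraging drift.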

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2110.05422
Document Type :
Working Paper