
Improving Text Generation with Student-Forcing Optimal Transport

Authors :
Wang, Guoyin
Li, Chunyuan
Li, Jianqiao
Fu, Hao
Lin, Yuh-Chen
Chen, Liqun
Zhang, Yizhe
Tao, Chenyang
Zhang, Ruiyi
Wang, Wenlin
Shen, Dinghan
Yang, Qian
Carin, Lawrence
Publication Year :
2020

Abstract

Neural language models are often trained with maximum likelihood estimation (MLE), where the next word is generated conditioned on the ground-truth word tokens. During testing, however, the model is instead conditioned on previously generated tokens, resulting in what is termed exposure bias. To reduce this gap between training and testing, we propose using optimal transport (OT) to match the sequences generated in these two modes. An extension is further proposed to improve the OT learning, based on the structural and contextual information of the text sequences. The effectiveness of the proposed method is validated on machine translation, text summarization, and text generation tasks.

Comment: To appear at EMNLP 2020
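The core idea of the abstract can be illustrated with a minimal sketch: treat the teacher-forced and free-running (student-forced) token representations as two point clouds, and compute an entropy-regularized OT distance between them via Sinkhorn iterations. Everything below is an assumption for illustration only (uniform position weights, cosine cost, random embeddings standing in for decoder states); the paper's actual formulation, including its structural and contextual extensions, is not reproduced here.

```python
import numpy as np

def sinkhorn_ot(cost, reg=0.1, n_iters=200):
    """Entropy-regularized OT distance between uniform distributions
    over sequence positions, given a pairwise cost matrix (Sinkhorn)."""
    n, m = cost.shape
    a = np.full(n, 1.0 / n)  # uniform weight per teacher-forced position
    b = np.full(m, 1.0 / m)  # uniform weight per free-running position
    K = np.exp(-cost / reg)  # Gibbs kernel
    u = np.ones(n)
    for _ in range(n_iters):  # alternating scaling updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]  # transport plan
    return float(np.sum(P * cost))

def cosine_cost(X, Y):
    """Pairwise cosine distance between two sets of embeddings."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Yn = Y / np.linalg.norm(Y, axis=1, keepdims=True)
    return 1.0 - Xn @ Yn.T

# Toy stand-ins for decoder-state sequences from the two decoding modes.
rng = np.random.default_rng(0)
teacher = rng.normal(size=(5, 8))  # conditioned on ground-truth tokens
student = rng.normal(size=(6, 8))  # conditioned on model's own outputs

ot_loss = sinkhorn_ot(cosine_cost(teacher, student))
```

In a training setup, a differentiable version of this quantity would be added to the MLE objective so that gradients pull the free-running sequence toward the teacher-forced one.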

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2010.05994
Document Type :
Working Paper