
Learning Implicit Text Generation via Feature Matching

Authors :
Pierre L. Dognin
Vijil Chenthamarakshan
Youssef Mroueh
Payel Das
Inkit Padhi
Ke Bai
Cicero Nogueira dos Santos
Source :
ACL
Publication Year :
2020
Publisher :
Association for Computational Linguistics, 2020.

Abstract

Generative feature matching network (GFMN) is an approach for training implicit generative models for images by performing moment matching on features from pre-trained neural networks. In this paper, we present new GFMN formulations that are effective for sequential data. Our experimental results show the effectiveness of the proposed method, SeqGFMN, for three distinct generation tasks in English: unconditional text generation, class-conditional text generation, and unsupervised text style transfer. SeqGFMN is stable to train and outperforms various adversarial approaches for text generation and text style transfer.

Comment: ACL 2020
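The core idea described in the abstract is to train a generator by matching feature moments (e.g., means and variances) computed with a fixed, pre-trained feature extractor, rather than with an adversarial discriminator. The snippet below is a minimal sketch of such a moment-matching objective, assuming a PyTorch-style setup; the names `feature_extractor` and `generator`, and all sizes, are illustrative placeholders and not the paper's actual code.

```python
# Minimal sketch of a feature (moment) matching objective in the spirit of
# GFMN/SeqGFMN: the generator is trained to match the mean and variance of
# features produced by a frozen, pre-trained feature extractor.
import torch
import torch.nn as nn


def feature_matching_loss(real_feats: torch.Tensor,
                          fake_feats: torch.Tensor) -> torch.Tensor:
    """Squared difference of first and second feature moments across a batch."""
    mean_diff = real_feats.mean(dim=0) - fake_feats.mean(dim=0)
    var_diff = real_feats.var(dim=0) - fake_feats.var(dim=0)
    return mean_diff.pow(2).sum() + var_diff.pow(2).sum()


# Illustrative stand-ins: the feature extractor is pre-trained and kept frozen;
# only the generator receives gradients from the moment-matching loss.
feature_extractor = nn.Sequential(nn.Linear(32, 64), nn.ReLU(), nn.Linear(64, 128))
for p in feature_extractor.parameters():
    p.requires_grad_(False)

generator = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 32))
optimizer = torch.optim.Adam(generator.parameters(), lr=1e-4)

real_batch = torch.randn(256, 32)   # stand-in for encoded real text samples
noise = torch.randn(256, 16)        # latent noise fed to the generator

fake_batch = generator(noise)
loss = feature_matching_loss(feature_extractor(real_batch),
                             feature_extractor(fake_batch))

optimizer.zero_grad()
loss.backward()
optimizer.step()
```

Because the feature extractor is fixed, there is no adversarial min-max game, which is one way to read the abstract's claim that SeqGFMN is stable to train compared with adversarial text-generation approaches.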

Details

Database :
OpenAIRE
Journal :
Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
Accession number :
edsair.doi.dedup.....9eeb0ef236fb1c54f88ec10db1a63657
Full Text :
https://doi.org/10.18653/v1/2020.acl-main.354