Learning Implicit Text Generation via Feature Matching
- Authors
Pierre L. Dognin, Vijil Chenthamarakshan, Youssef Mroueh, Payel Das, Inkit Padhi, Ke Bai, and Cicero Nogueira dos Santos
- Subjects
Computer Science - Machine Learning (cs.LG), Computer Science - Computation and Language (cs.CL), text generation, feature matching, moment matching, artificial neural networks, machine learning
- Abstract
The generative feature matching network (GFMN) is an approach for training implicit generative models of images by performing moment matching on features extracted from pre-trained neural networks. In this paper, we present new GFMN formulations that are effective for sequential data. Our experimental results show the effectiveness of the proposed method, SeqGFMN, on three distinct generation tasks in English: unconditional text generation, class-conditional text generation, and unsupervised text style transfer. SeqGFMN is stable to train and outperforms various adversarial approaches to text generation and text style transfer.
- Comment
- ACL 2020
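The moment matching idea underlying GFMN can be illustrated with a minimal sketch: penalize the distance between the first and second moments of fixed, pre-trained features computed on real versus generated batches. The function name, the NumPy formulation, and the use of diagonal second moments are our own illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def feature_matching_loss(real_feats, gen_feats):
    # Moment-matching loss: squared distance between the per-dimension
    # means and (diagonal) variances of two feature batches, each of
    # shape (batch_size, feature_dim). In GFMN-style training, these
    # features would come from a fixed pre-trained encoder.
    mu_r, mu_g = real_feats.mean(axis=0), gen_feats.mean(axis=0)
    var_r, var_g = real_feats.var(axis=0), gen_feats.var(axis=0)
    return np.sum((mu_r - mu_g) ** 2) + np.sum((var_r - var_g) ** 2)

# Identical feature batches match exactly, so the loss is zero.
feats = np.random.randn(32, 8)
print(feature_matching_loss(feats, feats))  # → 0.0
```

Because the pre-trained feature extractor is held fixed, minimizing this loss with respect to the generator avoids the adversarial min-max game, which is one reason such training tends to be stable.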
- Published
- 2020