Learning Implicit Text Generation via Feature Matching
- Source :
- ACL
- Publication Year :
- 2020
- Publisher :
- Association for Computational Linguistics, 2020.
-
Abstract
- A generative feature matching network (GFMN) is an approach for training implicit generative models for images by performing moment matching on features from pre-trained neural networks. In this paper, we present new GFMN formulations that are effective for sequential data. Our experimental results show the effectiveness of the proposed method, SeqGFMN, for three distinct generation tasks in English: unconditional text generation, class-conditional text generation, and unsupervised text style transfer. SeqGFMN is stable to train and outperforms various adversarial approaches for text generation and text style transfer.
- Comment: ACL 2020
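The moment-matching objective mentioned in the abstract can be illustrated with a minimal sketch: features of real and generated samples are taken from a fixed, pre-trained extractor, and the generator is trained to match their statistics. The function name and the choice of matching means and diagonal variances are assumptions for illustration, not the paper's actual implementation.

```python
import numpy as np

def feature_matching_loss(real_feats: np.ndarray, fake_feats: np.ndarray) -> float:
    """Squared difference between first moments (means) and diagonal
    second moments (variances) of two feature batches, each of shape
    (batch_size, feature_dim). Features are assumed to come from a
    frozen pre-trained network."""
    mean_diff = real_feats.mean(axis=0) - fake_feats.mean(axis=0)
    var_diff = real_feats.var(axis=0) - fake_feats.var(axis=0)
    return float(np.sum(mean_diff ** 2) + np.sum(var_diff ** 2))
```

In a training loop, this loss would be minimized with respect to the generator's parameters only; the feature extractor stays frozen, which is what makes the objective stable compared with adversarial training.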
- Subjects :
- FOS: Computer and information sciences
Computer Science - Machine Learning
Machine Learning (cs.LG)
Computer Science - Computation and Language
Computation and Language (cs.CL)
Machine learning
Artificial intelligence
Artificial neural network
Matching (statistics)
Moment (mathematics)
Feature matching
Text generation
Generative grammar
Computer science
Details
- Database :
- OpenAIRE
- Journal :
- Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics
- Accession number :
- edsair.doi.dedup.....9eeb0ef236fb1c54f88ec10db1a63657
- Full Text :
- https://doi.org/10.18653/v1/2020.acl-main.354