GTAE: Graph-Transformer based Auto-Encoders for Linguistic-Constrained Text Style Transfer
- Publication Year : 2021
Abstract
- Non-parallel text style transfer has attracted increasing research interest in recent years. Despite successes in transferring style within the encoder-decoder framework, current approaches still struggle to preserve the content, and even the logic, of the original sentences, mainly due to the large unconstrained model space or oversimplified assumptions about the latent embedding space. Since language is an intelligent product of humans, governed by grammar and limited by nature to a rule-based model space, alleviating this problem requires reconciling the model capacity of deep neural networks with the intrinsic constraints imposed by human linguistic rules. To this end, we propose a method called Graph Transformer based Auto Encoder (GTAE), which models a sentence as a linguistic graph and performs feature extraction and style transfer at the graph level, to maximally retain the content and linguistic structure of the original sentences. Quantitative results on three non-parallel text style transfer tasks show that our model outperforms state-of-the-art methods in content preservation while achieving comparable performance on transfer accuracy and sentence naturalness.
- Comment : The first two authors contributed equally. Code: https://github.com/SenZHANG-GitHub/graph-text-style-transfer ; benchmark: https://github.com/ykshi/text-style-transfer-benchmark
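The abstract's key idea is performing attention over a sentence's linguistic graph rather than over all token pairs. As a minimal sketch of that mechanism (not the authors' implementation — the function name, single-head formulation, and the toy dependency adjacency are illustrative assumptions), self-attention can be restricted to graph neighbors by masking the score matrix with the adjacency matrix:

```python
import numpy as np

def graph_attention(X, A):
    """Single-head self-attention restricted to graph edges.

    X: (n, d) node (token) features.
    A: (n, n) adjacency matrix of the linguistic graph,
       with self-loops (A[i, i] = 1) so each node attends to itself.
    Returns updated (n, d) node features.
    """
    d = X.shape[1]
    scores = X @ X.T / np.sqrt(d)            # scaled dot-product scores
    scores = np.where(A > 0, scores, -1e9)   # mask out non-neighbor pairs
    # numerically stable softmax over each row's neighborhood
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w = w / w.sum(axis=1, keepdims=True)
    return w @ X                             # aggregate neighbor features

# Toy example: 4 tokens, an (assumed) dependency edge between tokens 0 and 1,
# plus self-loops everywhere.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
A = np.eye(4)
A[0, 1] = A[1, 0] = 1
out = graph_attention(X, A)
```

Because token 2 has only a self-loop, its output row equals its input row exactly; tokens 0 and 1 mix each other's features. A full graph transformer would add learned query/key/value projections, multiple heads, and residual connections on top of this masking idea.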
Details
- Database : OAIster
- Publication Type : Electronic Resource
- Accession number : edsoai.on1269526404
- Document Type : Electronic Resource