
An Augmented Transformer Architecture for Natural Language Generation Tasks

Authors:
Li, Hailiang
Wang, Adele Y. C.
Liu, Yang
Tang, Du
Lei, Zhibin
Li, Wenye
Publication Year:
2019

Abstract

Transformer-based neural networks have shown significant advantages in most evaluations of natural language processing and other sequence-to-sequence tasks, owing to the inherent strengths of the architecture. Although the main architecture of the Transformer has been continuously explored, little attention has been paid to the positional encoding module. In this paper, we enhance the sinusoidal positional encoding algorithm by maximizing the variances between encoded consecutive positions to obtain an additional performance gain. Furthermore, we propose an augmented Transformer architecture encoded with additional linguistic knowledge, such as Part-of-Speech (POS) tags, to boost performance on natural language generation tasks, e.g., automatic translation and summarization. Experiments show that the proposed architecture consistently attains superior results compared to the vanilla Transformer.

Comment: This paper will appear in the ICDM MLCS 2019 workshop.
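
For context, below is a minimal sketch of the baseline sinusoidal positional encoding from "Attention Is All You Need", which the paper builds on. The abstract does not detail the variance-maximizing enhancement or the exact POS-injection mechanism, so the `add_pos_tag_embeddings` helper, the `tag_table`, and the tag-set size are hypothetical illustrations of one plausible scheme, not the authors' method.

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Baseline sinusoidal positional encoding (Vaswani et al., 2017).

    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    Assumes an even d_model.
    """
    positions = np.arange(max_len)[:, np.newaxis]           # shape (max_len, 1)
    div_terms = np.exp(np.arange(0, d_model, 2) * -(np.log(10000.0) / d_model))
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(positions * div_terms)             # even dimensions
    pe[:, 1::2] = np.cos(positions * div_terms)             # odd dimensions
    return pe

def add_pos_tag_embeddings(token_emb: np.ndarray,
                           pos_tag_ids: np.ndarray,
                           tag_table: np.ndarray) -> np.ndarray:
    """Hypothetical POS augmentation: add a per-token POS-tag embedding,
    looked up from tag_table, to the token embeddings."""
    return token_emb + tag_table[pos_tag_ids]

# Usage sketch: 10 tokens, model width 512, an assumed 45-entry POS tag set.
rng = np.random.default_rng(0)
d_model, seq_len, num_tags = 512, 10, 45
token_emb = rng.normal(size=(seq_len, d_model))
tag_table = rng.normal(scale=0.02, size=(num_tags, d_model))  # stand-in for a learned table
pos_tag_ids = rng.integers(0, num_tags, size=seq_len)

x = add_pos_tag_embeddings(token_emb, pos_tag_ids, tag_table)
x = x + sinusoidal_positional_encoding(seq_len, d_model)      # inject positions
```

In a trained model the tag table would be learned jointly with the token embeddings rather than sampled randomly as it is here.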

Details

Database:
arXiv
Publication Type:
Report
Accession Number:
edsarx.1910.13634
Document Type:
Working Paper