
Improving neural machine translation using gated state network and focal adaptive attention network.

Authors:
Huang, Li
Chen, Wenyu
Liu, Yuguo
Zhang, He
Qu, Hong
Source:
Neural Computing & Applications. Dec 2021, Vol. 33, Issue 23, p15955-15967. 13p.
Publication Year:
2021

Abstract

The currently predominant token-to-token attention mechanism has demonstrated its ability to capture word dependencies in neural machine translation. However, this mechanism treats a sequence as a bag of word tokens and computes similarity between tokens without considering their intrinsic interactions. In this paper, we argue that such an attention mechanism may miss the opportunity to take advantage of state information accumulated across multiple time steps. Thus, we propose a Gated State Network, which manipulates the flow of state information with sequential characteristics. We also incorporate a Focal Adaptive Attention Network, which uses a Gaussian distribution to concentrate the attention distribution around a predicted focal position and its neighborhood. Experimental results on the WMT'14 English–German and WMT'17 Chinese–English translation tasks demonstrate the effectiveness of the proposed approach. [ABSTRACT FROM AUTHOR]
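The abstract's core idea of focusing attention with a Gaussian can be illustrated with a minimal sketch. This is not the authors' implementation: the function name `focal_attention`, the single-head setup, and the way `focal_pos` and `sigma` are supplied as fixed inputs (rather than predicted by a network, as the paper describes) are all assumptions for illustration. The sketch adds a Gaussian log-space bias, centered on a focal position, to the usual scaled dot-product scores before the softmax, so attention mass concentrates around that position and its neighborhood.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def focal_attention(q, k, v, focal_pos, sigma):
    """Hypothetical sketch of Gaussian-focused attention.

    q: (d,) query vector; k, v: (n, d) key/value matrices.
    focal_pos: assumed focal position (in the paper this is predicted,
    not given); sigma: width of the Gaussian focus window.
    Returns the attended value vector and the attention weights.
    """
    n, d = k.shape
    scores = k @ q / np.sqrt(d)                    # token-to-token similarity
    positions = np.arange(n)
    # Gaussian bias in log space: penalizes positions far from focal_pos,
    # concentrating the attention distribution around its neighborhood.
    bias = -((positions - focal_pos) ** 2) / (2.0 * sigma ** 2)
    weights = softmax(scores + bias)
    return weights @ v, weights
```

With a zero query (all similarity scores equal), the weights reduce to a pure Gaussian over positions, peaking exactly at `focal_pos`; with informative scores, content similarity and positional focus are combined.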

Details

Language:
English
ISSN:
0941-0643
Volume:
33
Issue:
23
Database:
Academic Search Index
Journal:
Neural Computing & Applications
Publication Type:
Academic Journal
Accession Number:
153416072
Full Text:
https://doi.org/10.1007/s00521-021-06444-2