
Research on the LSTM Mongolian and Chinese machine translation based on morpheme encoding.

Authors :
Qing-dao-er-ji, Ren
Su, Yi La
Liu, Wan Wan
Source :
Neural Computing & Applications; Jan 2020, Vol. 32, Issue 1, p41-49, 9p
Publication Year :
2020

Abstract

The neural machine translation model based on long short-term memory (LSTM) has become mainstream in machine translation thanks to its encoder–decoder structure and its capacity for mining semantics. However, there are few studies on Mongolian–Chinese neural machine translation that use LSTM. This paper mainly studies the preprocessing of a Mongolian–Chinese bilingual corpus and the construction of an LSTM model based on Mongolian morpheme encoding. In the corpus preprocessing stage, the paper presents a hybrid algorithm for building the word segmentation module: unannotated sequences are processed semantically and labeled by a combination of a gated recurrent unit (GRU) and a conditional random field (CRF). To learn more grammatical and semantic knowledge from the Mongolian corpus, in the model construction stage the paper presents an LSTM neural network based on morpheme encoding as the encoder, and constructs an LSTM decoder to predict the Chinese output. Experimental comparisons on sentences of different lengths show that the model improves translation performance on long-range dependency problems. [ABSTRACT FROM AUTHOR]
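The abstract describes an encoder–decoder pipeline: an LSTM encoder consumes Mongolian morpheme IDs, and an LSTM decoder emits Chinese tokens. The sketch below is a minimal, generic illustration of that pattern in plain NumPy, not the authors' model: the vocabulary sizes, random weights, and the `translate` helper are all assumptions made for the example, and no training is performed.

```python
import numpy as np

def lstm_step(x, h, c, W, U, b):
    """One LSTM step: gates computed from input x and previous state (h, c)."""
    z = W @ x + U @ h + b                # stacked pre-activations for i, f, o, g
    H = h.size
    i = 1 / (1 + np.exp(-z[:H]))         # input gate
    f = 1 / (1 + np.exp(-z[H:2 * H]))    # forget gate
    o = 1 / (1 + np.exp(-z[2 * H:3 * H]))  # output gate
    g = np.tanh(z[3 * H:])               # candidate cell state
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

# Toy sizes and untrained random weights (assumptions for illustration only).
rng = np.random.default_rng(0)
V, E, H = 12, 8, 16                      # vocab, embedding, hidden sizes
emb = rng.normal(scale=0.1, size=(V, E))
We, Ue, be = rng.normal(scale=0.1, size=(4 * H, E)), rng.normal(scale=0.1, size=(4 * H, H)), np.zeros(4 * H)
Wd, Ud, bd = rng.normal(scale=0.1, size=(4 * H, E)), rng.normal(scale=0.1, size=(4 * H, H)), np.zeros(4 * H)
Wout = rng.normal(scale=0.1, size=(V, H))

def translate(morpheme_ids, max_len=5, bos=0):
    # Encoder: fold source morpheme IDs into a final (h, c) summary vector.
    h, c = np.zeros(H), np.zeros(H)
    for m in morpheme_ids:
        h, c = lstm_step(emb[m], h, c, We, Ue, be)
    # Decoder: seeded with the encoder state, greedily emit target token IDs.
    out, tok = [], bos
    for _ in range(max_len):
        h, c = lstm_step(emb[tok], h, c, Wd, Ud, bd)
        tok = int(np.argmax(Wout @ h))
        out.append(tok)
    return out

print(translate([3, 5, 7]))              # five target-token IDs from the toy model
```

Because the weights are random, the emitted IDs are meaningless; the point is only the data flow: morpheme embeddings → encoder LSTM state → decoder LSTM → per-step argmax over the target vocabulary, which is where the paper's morpheme-level encoding would plug in.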

Details

Language :
English
ISSN :
0941-0643
Volume :
32
Issue :
1
Database :
Complementary Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
141168393
Full Text :
https://doi.org/10.1007/s00521-018-3741-5