
NEWLSTM: An Optimized Long Short-Term Memory Language Model for Sequence Prediction

Authors :
Qing Wang
Rong-Qun Peng
Jia-Qiang Wang
Zhi Li
Han-Bing Qu
Source :
IEEE Access, Vol. 8, pp. 65395-65401 (2020)
Publication Year :
2020
Publisher :
IEEE, 2020.

Abstract

The long short-term memory (LSTM) model, trained on the general language modeling task, overcomes the vanishing-gradient bottleneck of the traditional recurrent neural network (RNN) and performs well on many natural language processing tasks. Although LSTM effectively alleviates the vanishing gradient problem of the RNN, information is still substantially lost over long-distance transmission, which limits its practical use. In this paper, we propose a new model called NEWLSTM, which improves on LSTM by mitigating both its large parameter count and the vanishing gradient. The NEWLSTM model directly correlates the cell state with the current input: the traditional LSTM's input gate and forget gate are merged and some components are removed, which reduces the number of parameters, simplifies computation, and shortens iteration time. The resulting neural network model captures the relationships among input sequences to predict language sequences. Experimental results on multiple test sets show that the new model is simpler than the traditional LSTM and its variants, has better overall stability, and better handles the sparse-word problem.
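
The abstract does not give NEWLSTM's exact update equations, so the following is only a minimal sketch of the general idea it describes: merging the input and forget gates so that one gate's worth of parameters is removed and the cell state is updated directly from the current input. The class name CoupledGateLSTMCell and all parameter choices below are hypothetical, not the authors' implementation.

```python
# Hypothetical sketch of a coupled-gate LSTM cell (not the authors' exact NEWLSTM
# equations, which the abstract does not spell out). The input gate is tied to the
# forget gate as i_t = 1 - f_t, so only three gate blocks are parameterized instead
# of the standard LSTM's four.
import torch
import torch.nn as nn


class CoupledGateLSTMCell(nn.Module):
    def __init__(self, input_size: int, hidden_size: int):
        super().__init__()
        # Forget gate, candidate update, and output gate share one linear map
        # over [x_t, h_{t-1}]; the input gate is derived, not learned separately.
        self.gates = nn.Linear(input_size + hidden_size, 3 * hidden_size)
        self.hidden_size = hidden_size

    def forward(self, x, state):
        h_prev, c_prev = state
        z = self.gates(torch.cat([x, h_prev], dim=-1))
        f_pre, g_pre, o_pre = z.chunk(3, dim=-1)
        f = torch.sigmoid(f_pre)      # forget gate
        i = 1.0 - f                   # input gate coupled to the forget gate
        g = torch.tanh(g_pre)         # candidate cell update from current input
        o = torch.sigmoid(o_pre)      # output gate
        c = f * c_prev + i * g        # new cell state
        h = o * torch.tanh(c)         # new hidden state
        return h, (h, c)
```

Compared with a standard LSTM cell, this coupling removes one gate's weight matrix and bias, which is one plausible way to obtain the smaller parameter count and reduced iteration time that the abstract reports.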

Details

Language :
English
ISSN :
2169-3536
Volume :
8
Database :
Directory of Open Access Journals
Journal :
IEEE Access
Publication Type :
Academic Journal
Accession number :
edsdoj.40145191bf5a453789be0965038d14c0
Document Type :
article
Full Text :
https://doi.org/10.1109/ACCESS.2020.2985418