ELSTM: An improved long short‐term memory network language model for sequence learning.
- Source :
- Expert Systems. Jun2024, Vol. 41 Issue 6, p1-9. 9p.
- Publication Year :
- 2024
Abstract
- The gated structure of the long short‐term memory (LSTM) network alleviates the vanishing‐ and exploding‐gradient problems of the recurrent neural network (RNN), and it has therefore received widespread attention in sequence‐learning tasks such as text analysis. Although LSTM handles long‐range dependencies well, information loss often occurs over long‐distance transmission. We propose a new model, ELSTM, motivated by the computational complexity and gradient dispersion of the traditional LSTM. The model simplifies the input gate of the LSTM, reducing time complexity by removing some components, and improves the output gate. Introducing an exponential linear unit (ELU) activation layer alleviates the gradient‐dispersion problem. Compared with several existing models on language‐sequence prediction, the new model greatly reduces running time and lowers perplexity, showing good performance. [ABSTRACT FROM AUTHOR]
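
The abstract does not give the exact ELSTM equations, but the described changes (a simplified input gate and an ELU layer before the output gate) can be illustrated with a minimal NumPy sketch. This is an assumption‐laden illustration, not the authors' published formulation: it assumes the input gate is coupled to the forget gate (i = 1 − f), which removes one weight matrix, and that ELU replaces tanh on the cell state at the output.

```python
import numpy as np

def elu(x, alpha=1.0):
    # Exponential linear unit: x for x > 0, alpha * (exp(x) - 1) otherwise.
    return np.where(x > 0, x, alpha * (np.exp(x) - 1.0))

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def elstm_step(x, h_prev, c_prev, W_f, W_c, W_o, b_f, b_c, b_o):
    """One step of a hypothetical ELSTM-style cell (illustrative only).

    Assumptions not confirmed by the abstract: the input gate is coupled
    to the forget gate (i = 1 - f), dropping its separate weights, and an
    ELU layer is applied to the cell state before the output gate.
    """
    z = np.concatenate([h_prev, x])       # joint input [h_{t-1}; x_t]
    f = sigmoid(W_f @ z + b_f)            # forget gate
    c_tilde = np.tanh(W_c @ z + b_c)      # candidate cell state
    c = f * c_prev + (1.0 - f) * c_tilde  # coupled input gate saves one matrix
    o = sigmoid(W_o @ z + b_o)            # output gate
    h = o * elu(c)                        # ELU in place of tanh at the output
    return h, c
```

Coupling the gates removes one matrix multiplication per step, which matches the abstract's claim of reduced time complexity, and ELU's non-saturating positive branch is the usual argument for mitigating gradient dispersion.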
Details
- Language :
- English
- ISSN :
- 0266-4720
- Volume :
- 41
- Issue :
- 6
- Database :
- Academic Search Index
- Journal :
- Expert Systems
- Publication Type :
- Academic Journal
- Accession number :
- 176989653
- Full Text :
- https://doi.org/10.1111/exsy.13211