
NOA-LSTM: An efficient LSTM cell architecture for time series forecasting.

Authors :
Yadav, Hemant
Thakkar, Amit
Source :
Expert Systems with Applications, Vol. 238, Part F, March 2024.
Publication Year :
2024

Abstract

The application of machine learning and deep learning techniques to time series forecasting has gained significant attention in recent years. Numerous efforts have been devoted to automating forecasting with state-of-the-art neural networks, and the Long Short-Term Memory (LSTM) recurrent neural network has emerged as the central building block in most of this work. Although LSTM was originally introduced in 1997 for sequence modeling, subsequent refinements have focused primarily on language learning tasks. These refinements introduced several computational mechanisms within the LSTM cell, including the forget gate, input gate, and output gate. In this study, we investigate the impact of each computational component in isolation to analyze its effect on time series forecasting tasks. Our experiments use the Jena weather dataset and the Appliance Energy Usage time series for evaluation. The experimental results reveal that variations of the LSTM cell outperform the most widely used LSTM cell format in terms of error rate and training time. Specifically, the variations identified in this paper demonstrate superior generalization and yield lower forecasting errors. [ABSTRACT FROM AUTHOR]
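The abstract describes ablating the forget, input, and output gates of a standard LSTM cell one at a time. The sketch below (not the authors' code; the exact NOA-LSTM variants are not specified in this record) illustrates the kind of ablation being described, using the standard LSTM cell equations with flags that replace individual gates by a constant 1. The class name, flag names, and toy example are assumptions made for illustration only.

import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class AblatableLSTMCell:
    """Standard LSTM cell; any gate can be disabled (replaced by 1) for ablation."""
    def __init__(self, input_size, hidden_size,
                 use_forget_gate=True, use_input_gate=True, use_output_gate=True,
                 seed=0):
        rng = np.random.default_rng(seed)
        self.use_forget_gate = use_forget_gate
        self.use_input_gate = use_input_gate
        self.use_output_gate = use_output_gate
        # One weight matrix per gate/candidate, acting on [x_t, h_{t-1}].
        def W():
            return rng.normal(0.0, 0.1, (hidden_size, input_size + hidden_size))
        self.Wf, self.Wi, self.Wo, self.Wc = W(), W(), W(), W()
        self.bf = np.zeros(hidden_size)
        self.bi = np.zeros(hidden_size)
        self.bo = np.zeros(hidden_size)
        self.bc = np.zeros(hidden_size)

    def step(self, x_t, h_prev, c_prev):
        z = np.concatenate([x_t, h_prev])
        # Candidate cell state (always present).
        c_tilde = np.tanh(self.Wc @ z + self.bc)
        # Each gate is replaced by a constant 1 when ablated.
        f = sigmoid(self.Wf @ z + self.bf) if self.use_forget_gate else 1.0
        i = sigmoid(self.Wi @ z + self.bi) if self.use_input_gate else 1.0
        o = sigmoid(self.Wo @ z + self.bo) if self.use_output_gate else 1.0
        c_t = f * c_prev + i * c_tilde
        h_t = o * np.tanh(c_t)
        return h_t, c_t

# Example: run one hypothetical variant (output gate removed) over a toy sequence.
cell = AblatableLSTMCell(input_size=4, hidden_size=8, use_output_gate=False)
h, c = np.zeros(8), np.zeros(8)
for x_t in np.random.default_rng(1).normal(size=(10, 4)):
    h, c = cell.step(x_t, h, c)
print(h.shape)  # (8,)

Dropping a gate removes its weight matrix from the forward and backward pass, which is consistent with the abstract's claim that some variants train faster than the full cell; the specific variants and their error results are reported only in the full article.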

Details

Language :
English
ISSN :
09574174
Volume :
238
Database :
Academic Search Index
Journal :
Expert Systems with Applications
Publication Type :
Academic Journal
Accession number :
173694084
Full Text :
https://doi.org/10.1016/j.eswa.2023.122333