A Temporal Window Attention-Based Window-Dependent Long Short-Term Memory Network for Multivariate Time Series Prediction.
- Source :
- Entropy. Jan 2023, Vol. 25, Issue 1, p10. 15p.
- Publication Year :
- 2023
Abstract
- Multivariate time series prediction models operate on input sequences of a specific window length. However, capturing the complex and nonlinear interdependencies within each temporal window remains challenging. Typical attention mechanisms assign a weight to each variable at the same time step, or to the features of each previous time step, to capture spatio-temporal correlations. However, they fail to directly extract, from each time step, the relevant features that affect future values, and thus cannot learn the spatio-temporal pattern from a global perspective. To this end, a temporal window attention-based window-dependent long short-term memory network (TWA-WDLSTM) is proposed to enhance the temporal dependencies, which exploits the encoder–decoder framework. In the encoder, we design a temporal window attention mechanism to select relevant exogenous series in a temporal window. Furthermore, we introduce a window-dependent long short-term memory network (WDLSTM) to encode the input sequences in a temporal window into a feature representation and to capture very long-term dependencies. In the decoder, we use WDLSTM to generate the prediction values. We applied our model to four real-world datasets and compared it with a variety of state-of-the-art models. The experimental results suggest that TWA-WDLSTM outperforms the comparison models. In addition, the temporal window attention mechanism has good interpretability: we can observe which variables contribute to the future value. [ABSTRACT FROM AUTHOR]
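- The abstract's core idea (scoring every time step and variable of a temporal window jointly, rather than per-step or per-variable) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation; the function name, the elementwise scoring rule, and the query vector are illustrative assumptions. The key property shown is that the softmax normalizes over the entire window at once, so attention weights reflect a global view of the window:

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over a flat array."""
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def temporal_window_attention(window, query):
    """Illustrative sketch (not the paper's exact formulation):
    score every (time step, variable) entry of a temporal window
    against a query vector, normalize the scores over the WHOLE
    window, and re-weight the window before encoding.

    window: (T, n) array of exogenous series in one temporal window
    query:  (n,) vector, e.g. derived from a previous hidden state
    """
    scores = window * query                          # (T, n) relevance scores
    weights = softmax(scores.ravel()).reshape(window.shape)
    return weights * window, weights                 # weights sum to 1 globally

rng = np.random.default_rng(0)
window = rng.normal(size=(5, 3))   # T=5 time steps, n=3 exogenous variables
query = rng.normal(size=3)
weighted, w = temporal_window_attention(window, query)
```

Because the weights are normalized across all T×n entries rather than within a single time step, one can read off which variable at which time step the model deems most relevant, which is the interpretability property the abstract highlights.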
- Subjects :
- *TIME series analysis
*FORECASTING
*PREDICTION models
*TIME-varying networks
Details
- Language :
- English
- ISSN :
- 1099-4300
- Volume :
- 25
- Issue :
- 1
- Database :
- Academic Search Index
- Journal :
- Entropy
- Publication Type :
- Academic Journal
- Accession number :
- 161479997
- Full Text :
- https://doi.org/10.3390/e25010010