MultiWaveNet: A long time series forecasting framework based on multi-scale analysis and multi-channel feature fusion.
- Source :
- Expert Systems with Applications, Oct 2024, Vol. 251.
- Publication Year :
- 2024
Abstract
- Long time series forecasting is widely used in areas such as power dispatch, traffic control, and weather forecasting. The patterns of seasonality and trend in long time series are often complex, especially when they appear at different time scales. Existing methods typically focus on only one scale or select scales at random, which leads to a significant loss of valuable information. Additionally, current methods often transform multi-channel data into a single-channel format, ignoring interactions and complex relationships between channels. The paper proposes MultiWaveNet, a novel long time series forecasting framework that addresses the seasonal and trend components separately. For the seasonal component, the framework uses multi-scale wavelet decomposition to generate subseries at multiple scales. A learnable optimization factor is introduced to separate the high-frequency components that remain mixed into the low-frequency subseries after wavelet decomposition. To reduce information redundancy and model complexity, the paper develops a wavelet-domain sampling encoder built around a single Transformer encoder, which models long-term dependencies while preserving feature-extraction quality. For the trend component, unlike previous work, channel weights are adjusted according to channel importance, so that more influential channels contribute more and the limitations of channel-independent processing are mitigated. The paper performs extensive experiments on nine standard datasets, demonstrating that MultiWaveNet is the most competitive method.
- • A learnable optimization factor eliminates high-frequency information mixed into the low-frequency subseries.
- • A wavelet-domain sampling encoder incorporating only one Transformer encoder is developed.
- • An attention-based channel-aware module is designed. [ABSTRACT FROM AUTHOR]
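The abstract describes two ingredients that can be illustrated in miniature: multi-scale wavelet decomposition of each channel into subseries, and importance-based weighting of channels before fusion. The sketch below is illustrative only and is not the authors' implementation; the function names (multiscale_decompose, channel_weights), the choice of the db4 wavelet, three decomposition levels, and the energy-based softmax weights are all assumptions standing in for the paper's learned components. It uses PyWavelets for the discrete wavelet transform.

```python
# Illustrative sketch only -- not the MultiWaveNet code. Assumes a series of
# shape (time, channels); pywt provides the multi-level wavelet decomposition,
# and a softmax over per-channel energy stands in for the paper's learned,
# attention-based channel-aware weighting.
import numpy as np
import pywt

def multiscale_decompose(x, wavelet="db4", level=3):
    """Decompose each channel into one approximation (trend-like) subseries
    and `level` detail (seasonal-like) subseries via the discrete wavelet transform."""
    return [pywt.wavedec(x[:, c], wavelet, level=level)   # [cA_level, cD_level, ..., cD_1]
            for c in range(x.shape[1])]

def channel_weights(x):
    """Toy channel-importance scores: softmax over per-channel signal energy.
    MultiWaveNet learns these weights; here they are hand-crafted for illustration."""
    energy = (x ** 2).mean(axis=0)            # one scalar per channel
    e = np.exp(energy - energy.max())
    return e / e.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    series = rng.standard_normal((512, 7))    # 512 time steps, 7 channels
    coeffs = multiscale_decompose(series)     # per-channel multi-scale subseries
    w = channel_weights(series)               # importance weight per channel
    fused = series @ w                        # importance-weighted channel fusion
    print(len(coeffs[0]), w.round(3), fused.shape)
```

In the paper these pieces are combined with a learnable optimization factor that cleans residual high-frequency content out of the low-frequency subseries and with a single Transformer encoder operating in the wavelet domain; the sketch only shows the decomposition and weighting steps in isolation.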
Details
- Language :
- English
- ISSN :
- 09574174
- Volume :
- 251
- Database :
- Academic Search Index
- Journal :
- Expert Systems with Applications
- Publication Type :
- Academic Journal
- Accession number :
- 177514327
- Full Text :
- https://doi.org/10.1016/j.eswa.2024.124088