
A novel approach to ultra-short-term multi-step wind power predictions based on encoder–decoder architecture in natural language processing.

Authors :
Wang, Lei
He, Yigang
Li, Lie
Liu, Xiaoyan
Zhao, Yingying
Source :
Journal of Cleaner Production. Jun 2022, Vol. 354.
Publication Year :
2022

Abstract

Accurate wind power predictions (WPPs) are highly significant to the safe, stable, and economic operation of power systems. Reported encoder–decoder architectures have demonstrated clear advantages over traditional methods in multi-step WPP tasks. However, these frameworks still have defects, namely insufficient information-mining ability and low computational efficiency. To address these shortcomings, this study proposed three improved encoder–decoder architectures drawn from natural language processing for multi-step WPP: the sequence-to-sequence bidirectional gated recurrent unit (SBIGRU), the attention-based sequence-to-sequence Bi-GRU (ASBIGRU), and the Transformer. Data, including numerical weather predictions (NWP) and wind power measurements, from 12 wind farms located in 12 different regions of China were used to validate the proposed models. The correlations between the datasets from the multiple wind farms were analyzed with Pearson's correlation coefficient to demonstrate the feasibility of the proposed models even without considering spatial correlations. An effective strategy combining manual experience with machine grid search was adopted to determine the hyper-parameters that optimize the performance of the proposed models. The prediction accuracies and computational efficiencies of the reported and proposed models were compared experimentally. For prediction accuracy, the experimental results showed that, compared with existing models, the Transformer, ASBIGRU, and SBIGRU reduced the root mean square error by 3.21%, 1.06%, and 0.88% in 16-step-ahead predictions, respectively. For computational efficiency, the training time of the existing model at one wind farm was 3.57 times that of the Transformer. These results confirm that the Transformer performs better in terms of both prediction accuracy and computational efficiency, and illustrate its potential for large-scale wind farm applications.

• Novel encoder–decoder architectures from NLP are used for multi-step wind power forecasting.
• Modeling results are validated with historical wind power and NWP data.
• Hyper-parameters are determined by combining manual experience and machine grid searches.
• The Transformer has clear advantages in prediction accuracy and computational efficiency.

[ABSTRACT FROM AUTHOR]
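Illustration (not from the article): the sketch below shows, in PyTorch, what a sequence-to-sequence bidirectional-GRU forecaster in the spirit of SBIGRU could look like. It is not the authors' code, and the tensor shapes (6 NWP features, a 32-step input window, a 16-step-ahead output) are assumptions chosen only for the example.

# Minimal sketch of a Bi-GRU encoder-decoder for multi-step forecasting.
# Assumed input: (batch, in_steps, n_features); output: (batch, 16) power values.
import torch
import torch.nn as nn

class Seq2SeqBiGRU(nn.Module):
    def __init__(self, n_features: int, hidden: int = 64, out_steps: int = 16):
        super().__init__()
        self.out_steps = out_steps
        # Bidirectional GRU encoder reads the input window in both directions.
        self.encoder = nn.GRU(n_features, hidden, batch_first=True, bidirectional=True)
        # Unidirectional GRU cell rolled out autoregressively for each prediction step.
        self.decoder = nn.GRUCell(1, 2 * hidden)
        self.proj = nn.Linear(2 * hidden, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        _, h = self.encoder(x)                   # h: (2, batch, hidden)
        state = torch.cat([h[0], h[1]], dim=-1)  # merge forward/backward states
        step_in = x.new_zeros(x.size(0), 1)      # zero "start token" for the decoder
        outputs = []
        for _ in range(self.out_steps):
            state = self.decoder(step_in, state)
            step_in = self.proj(state)           # next-step power estimate
            outputs.append(step_in)
        return torch.cat(outputs, dim=1)         # (batch, out_steps)

# Example usage on random data (6 assumed NWP features, 32-step history).
model = Seq2SeqBiGRU(n_features=6)
pred = model(torch.randn(8, 32, 6))              # -> shape (8, 16)

In this sketch the decoder is rolled out step by step for 16 predictions; the attention-based (ASBIGRU) and Transformer variants described in the abstract augment or replace this recurrence with attention over the encoded sequence.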

Details

Language :
English
ISSN :
09596526
Volume :
354
Database :
Academic Search Index
Journal :
Journal of Cleaner Production
Publication Type :
Academic Journal
Accession number :
156650250
Full Text :
https://doi.org/10.1016/j.jclepro.2022.131723