Simultaneous multistep transformer architecture for model predictive control.
- Source :
- Computers & Chemical Engineering, Oct. 2023, Vol. 178.
- Publication Year :
- 2023
Abstract
- Transformer neural networks have revolutionized natural language processing, in part by mitigating the vanishing-gradient problem. This study applies Transformer models to time-series forecasting and customizes them for simultaneous multistep-ahead prediction in surrogate model predictive control (MPC). The proposed method improves control performance and computational efficiency compared to LSTM-based MPC and to one-step-ahead prediction models built on both LSTM and Transformer networks. The study introduces three key contributions: (1) a new MPC system based on a Transformer time-series architecture, (2) a training method that enables multistep-ahead prediction for time-series machine learning models, and (3) validation of the improved computation time of multistep-ahead Transformer MPC relative to one-step-ahead LSTM networks. Case studies demonstrate a fifteen-fold improvement in computational speed over one-step-ahead LSTM, although the improvement varies with MPC factors such as the lookback window and prediction horizon.
• Transformer and LSTM models compared in model predictive control.
• New multistep Transformer architecture proposed for MPC.
• Transformer MPC outperforms LSTM-based MPC with a 15x speedup.
• Practical implementation on a fluidized-bed gold ore roaster.
[ABSTRACT FROM AUTHOR]
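The speedup claimed in the abstract comes from replacing a recursive one-step-ahead rollout (one model call per horizon step) with a single simultaneous multistep-ahead prediction. A minimal sketch of that contrast, assuming a simple linear map as a stand-in for the paper's trained Transformer or LSTM surrogate (the window sizes, weights, and function names are illustrative, not the authors' implementation):

```python
import numpy as np

# Hypothetical stand-in surrogate: a linear map replaces the trained network.
rng = np.random.default_rng(0)
LOOKBACK, HORIZON = 8, 5                            # lookback window, prediction horizon
W_one = rng.standard_normal((LOOKBACK, 1))          # one-step-ahead surrogate weights
W_multi = rng.standard_normal((LOOKBACK, HORIZON))  # multistep-ahead surrogate weights

def predict_one_step(window):
    """One-step surrogate: predicts only the next value from the lookback window."""
    return window @ W_one                           # shape (1,)

def rollout_recursive(window, horizon):
    """One-step-ahead rollout: re-invokes the model at every horizon step,
    feeding each prediction back into the sliding window."""
    window = window.copy()
    preds = []
    for _ in range(horizon):
        y = predict_one_step(window)
        preds.append(y[0])
        window = np.concatenate([window[1:], y])    # slide window forward one step
    return np.array(preds)                          # costs `horizon` model calls

def rollout_multistep(window):
    """Simultaneous multistep surrogate: one call yields the whole horizon."""
    return window @ W_multi                         # shape (HORIZON,), a single call

x = rng.standard_normal(LOOKBACK)
print(rollout_recursive(x, HORIZON).shape)          # (5,) via 5 model calls
print(rollout_multistep(x).shape)                   # (5,) via 1 model call
```

Inside an MPC optimizer that evaluates the surrogate many times per control step, collapsing the horizon into one forward pass is what yields the order-of-magnitude timing gains reported in the case studies.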
- Subjects :
- *MACHINE learning
*PREDICTION models
*GOLD ores
Details
- Language :
- English
- ISSN :
- 0098-1354
- Volume :
- 178
- Database :
- Academic Search Index
- Journal :
- Computers & Chemical Engineering
- Publication Type :
- Academic Journal
- Accession number :
- 171953946
- Full Text :
- https://doi.org/10.1016/j.compchemeng.2023.108396