
Distributional Deep Reinforcement Learning-Based Emergency Frequency Control.

Authors :
Xie, Jian
Sun, Wei
Source :
IEEE Transactions on Power Systems; Jul2022, Vol. 37 Issue 4, p2720-2730, 11p
Publication Year :
2022

Abstract

Emergency frequency control is one of the most critical approaches to maintaining power system stability after major disturbances. With the increasing number of grid-connected renewable energy sources, existing model-based methods of frequency control are facing challenges in computational speed and scalability for large-scale systems. In this paper, the emergency frequency control problem is formulated as a Markov Decision Process and solved through a novel distributional deep reinforcement learning (DRL) method, namely the distributional soft actor critic (DSAC) method. Compared with other reinforcement learning methods that estimate only the mean value, the proposed DSAC model estimates the full distribution of returns. This advancement provides the agent with richer information, yielding a faster and more stable learning process and improved frequency control performance. Simulation results on the IEEE 39-bus and IEEE 118-bus systems demonstrate the effectiveness and robustness of the proposed models, as well as their advantage over other state-of-the-art DRL algorithms. [ABSTRACT FROM AUTHOR]
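As an illustration only (not taken from the paper), the sketch below shows one common way a distributional critic of the kind the abstract describes can be realized: a quantile-regression network that outputs N quantiles of the return instead of a single Q-value, trained with the quantile Huber loss used in distributional RL. The PyTorch framework and the names QuantileCritic and quantile_huber_loss are assumptions for this sketch, not the authors' implementation.

```python
# Minimal sketch (assumed PyTorch implementation, not the authors' code) of a
# distributional critic for soft actor-critic: the network outputs N quantiles
# of the return distribution rather than a single expected Q-value.
import torch
import torch.nn as nn

class QuantileCritic(nn.Module):
    """Maps (state, action) to N quantile estimates of the return."""
    def __init__(self, state_dim, action_dim, n_quantiles=32, hidden=256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(state_dim + action_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, n_quantiles),
        )
        # Quantile fraction midpoints tau_i = (i + 0.5) / N.
        taus = (torch.arange(n_quantiles, dtype=torch.float32) + 0.5) / n_quantiles
        self.register_buffer("taus", taus)

    def forward(self, state, action):
        # Returns a (batch, N) tensor of quantile estimates.
        return self.net(torch.cat([state, action], dim=-1))

def quantile_huber_loss(pred, target, taus, kappa=1.0):
    """Quantile-regression Huber loss between predicted and target quantiles."""
    # Pairwise TD errors, shape (batch, N_target, N_pred).
    td = target.unsqueeze(2) - pred.unsqueeze(1)
    huber = torch.where(td.abs() <= kappa,
                        0.5 * td.pow(2),
                        kappa * (td.abs() - 0.5 * kappa))
    # Asymmetric weighting by |tau - 1{td < 0}| gives the quantile regression.
    weight = (taus.view(1, 1, -1) - (td.detach() < 0).float()).abs()
    return (weight * huber / kappa).mean()

if __name__ == "__main__":
    # The mean over quantiles recovers the expected return used by the actor.
    critic = QuantileCritic(state_dim=8, action_dim=2)
    s, a = torch.randn(4, 8), torch.randn(4, 2)
    quantiles = critic(s, a)          # (4, 32) return distribution estimate
    q_value = quantiles.mean(dim=-1)  # scalar Q used for policy improvement
    print(q_value.shape)
```

In this style of critic, the extra distributional information (spread, tail risk of the return) is what the abstract credits with faster, more stable learning compared to mean-only value estimation.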

Details

Language :
English
ISSN :
0885-8950
Volume :
37
Issue :
4
Database :
Complementary Index
Journal :
IEEE Transactions on Power Systems
Publication Type :
Academic Journal
Accession number :
157552002
Full Text :
https://doi.org/10.1109/TPWRS.2021.3130413