
Markov Decision Process-based Resilience Enhancement for Distribution Systems: An Approximate Dynamic Programming Approach

Authors:
Wang, Chong
Ju, Ping
Lei, Shunbo
Wang, Zhaoyu
Hou, Yunhe
Publication Year:
2019

Abstract

Failures in distribution systems caused by extreme weather events directly result in consumer outages. This paper therefore proposes a state-based decision-making model that mitigates loss of load to improve distribution system resilience throughout an unfolding event. The sequentially uncertain system states, e.g., the on/off states of feeder lines, driven by the unfolding event are modeled as Markov states, and the transition probabilities between Markov states are determined by the component failures the event causes. A recursive optimization model based on Markov decision processes (MDP) is developed to take state-based actions, i.e., system reconfiguration, at each decision time. To overcome the curse of dimensionality caused by the enormous numbers of states and actions, an approximate dynamic programming (ADP) approach based on post-decision states and iterative value updates is used to solve the proposed MDP-based model. The IEEE 33-bus and IEEE 123-bus test systems are used to validate the proposed model.
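The post-decision-state ADP idea in the abstract can be illustrated with a deliberately tiny toy model. The sketch below is a hypothetical simplification, not the paper's actual formulation: two feeder lines that are each on (1) or off (0), a reconfiguration action that restores at most one off line per step, independent per-line failure probabilities standing in for the weather-driven transitions, and the count of off lines as a proxy for loss of load. It learns a lookup-table value for each post-decision state (the line statuses after the action but before the random failures) by smoothed stochastic-approximation updates along simulated trajectories.

```python
import random

P_FAIL = 0.3   # assumed per-step failure probability of a live line
HORIZON = 5    # number of decision stages in the unfolding event
ALPHA = 0.1    # stochastic-approximation step size
STATES = [(a, b) for a in (0, 1) for b in (0, 1)]

def actions(state):
    """Feasible reconfigurations: do nothing, or switch one off line back on."""
    acts = [None]
    for i, s in enumerate(state):
        if s == 0:
            acts.append(i)
    return acts

def apply_action(state, act):
    """Post-decision state: line statuses after reconfiguration, before failures."""
    if act is None:
        return state
    s = list(state)
    s[act] = 1
    return tuple(s)

def sample_failures(post_state, rng):
    """Exogenous Markov transition: each live line fails independently."""
    return tuple(0 if (s == 1 and rng.random() < P_FAIL) else s
                 for s in post_state)

def train(iterations=2000, seed=0):
    """Approximate value iteration around post-decision states."""
    rng = random.Random(seed)
    # V[t][s] = estimated cost-to-go of post-decision state s at stage t.
    V = [{s: 0.0 for s in STATES} for _ in range(HORIZON + 1)]
    for _ in range(iterations):
        state = (1, 1)
        for t in range(HORIZON):
            # Greedy action against the current post-decision value function.
            best = min(actions(state),
                       key=lambda a: V[t][apply_action(state, a)])
            post = apply_action(state, best)
            nxt = sample_failures(post, rng)
            cost = sum(1 for s in nxt if s == 0)  # loss-of-load proxy
            # Sampled downstream cost, smoothed into the table.
            target = cost + min(V[t + 1][apply_action(nxt, a)]
                                for a in actions(nxt))
            V[t][post] = (1 - ALPHA) * V[t][post] + ALPHA * target
            state = nxt
    return V
```

Placing the expectation over failures *after* the action is what makes the post-decision formulation attractive here: the greedy step minimizes a deterministic lookup rather than an expectation over all weather outcomes, which is how the paper's ADP approach sidesteps the dimensionality of the full MDP.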

Details

Database:
arXiv
Publication Type:
Report
Accession number:
edsarx.1904.00627
Document Type:
Working Paper