
Demand Response in HEMSs Using DRL and the Impact of Its Various Configurations and Environmental Changes.

Authors :
Amer, Aya
Shaban, Khaled
Massoud, Ahmed
Source :
Energies (19961073); Nov2022, Vol. 15 Issue 21, p8235, 20p
Publication Year :
2022

Abstract

With smart grid advances, enormous amounts of data are made available, enabling the training of machine learning algorithms such as deep reinforcement learning (DRL). Recent research has utilized DRL to obtain optimal solutions for complex real-time optimization problems, including demand response (DR), where traditional methods fail to meet timing and complexity requirements. Although DRL has shown good performance for particular use cases, most studies do not report the impacts of various DRL settings. This paper studies DRL performance when addressing DR in home energy management systems (HEMSs). The trade-offs of various DRL configurations and how they influence the performance of the HEMS are investigated. The main elements that affect DRL model training are identified, including state-action pairs, the reward function, and hyperparameters. Various representations of these elements are analyzed to characterize their impact. In addition, different environmental changes and scenarios are considered to analyze the model's scalability and adaptability. The findings elucidate the adequacy of DRL to address HEMS challenges: when appropriately configured, it successfully schedules from 73% to 98% of the appliances in different simulation scenarios and reduces the electricity cost by 19% to 47%. [ABSTRACT FROM AUTHOR]
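The abstract identifies state-action pairs, the reward function, and hyperparameters (e.g., learning rate, discount factor, exploration rate) as the key elements of DRL training for DR. The minimal sketch below illustrates that framing with tabular Q-learning, a simplified stand-in for the paper's DRL agent: the state is (hour, appliance-already-run flag), the actions are defer or run, and the reward is the negative time-of-use price, with a penalty if the appliance never runs. The price vector, window length, and penalty value are all hypothetical, not taken from the paper.

```python
import random

# Hypothetical time-of-use prices over a 6-hour scheduling window (cents/kWh).
PRICES = [30, 25, 10, 15, 40, 35]
N_HOURS = len(PRICES)

def train(episodes=5000, alpha=0.1, gamma=0.95, eps=0.1, seed=0):
    """Tabular Q-learning for scheduling one shiftable appliance.

    alpha, gamma, and eps are the hyperparameters the paper's analysis
    highlights; their values here are illustrative defaults.
    """
    rng = random.Random(seed)
    # Q[(hour, done)][action]; actions: 0 = defer, 1 = run the appliance now.
    Q = {(h, d): [0.0, 0.0] for h in range(N_HOURS + 1) for d in (0, 1)}
    for _ in range(episodes):
        hour, done = 0, 0
        while hour < N_HOURS:
            state = (hour, done)
            # Epsilon-greedy exploration.
            if rng.random() < eps:
                a = rng.randrange(2)
            else:
                a = max((0, 1), key=lambda x: Q[state][x])
            if a == 1 and done == 0:
                reward, next_done = -float(PRICES[hour]), 1
            else:
                reward, next_done = 0.0, done
            # Penalize reaching the end of the window without running.
            if hour == N_HOURS - 1 and next_done == 0:
                reward -= 100.0
            next_state = (hour + 1, next_done)
            # Standard Q-learning update.
            Q[state][a] += alpha * (reward + gamma * max(Q[next_state]) - Q[state][a])
            hour, done = hour + 1, next_done
    return Q

def greedy_schedule(Q):
    """Return the hour at which the greedy policy runs the appliance."""
    hour, done = 0, 0
    while hour < N_HOURS:
        if done == 0 and Q[(hour, done)][1] >= Q[(hour, done)][0]:
            return hour
        hour += 1
    return None
```

With these illustrative prices, the learned greedy policy defers until the cheapest hour (index 2, 10 cents/kWh) and runs there. The paper's actual agents use deep networks over richer state representations; this tabular version only makes the state/action/reward/hyperparameter decomposition concrete.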

Details

Language :
English
ISSN :
19961073
Volume :
15
Issue :
21
Database :
Complementary Index
Journal :
Energies (19961073)
Publication Type :
Academic Journal
Accession number :
160146899
Full Text :
https://doi.org/10.3390/en15218235