Deep reinforcement learning for optimal rescue path planning in uncertain and complex urban pluvial flood scenarios
- Author
- Li, Xiaoyan, Liang, Xuedong, Wang, Xia, Wang, Rong, Shu, Lingli, and Xu, Wentao
- Subjects
- Reinforcement learning, Flood warning systems, Floods, Artificial intelligence, Smart cities, Natural disasters
- Abstract
- An urban pluvial flood is a devastating, costly natural disaster that requires effective rescue path planning to mitigate the loss of lives and property. The inherent uncertainty and complexity of the risks associated with urban flooding limit the ability to plan optimal rescue paths that prioritize both timeliness and safety. This study addresses the challenge by proposing an innovative assessment methodology that outputs a risk value and a probability, representing safety and timeliness, for each passable area while simulating real-world flood scenarios. The paper further develops a pioneering path-planning algorithm based on deep reinforcement learning, incorporating improved stochastic reward exploitation and heterogeneous reward exploration mechanisms, to operate in a simulated rescue path-planning scenario with uncertainty and complexity. According to the findings, the proposed algorithm outperforms current state-of-the-art algorithms in convergence to the optimal path, sampling completeness, and running efficiency. The study contributes to theoretical progress on urban pluvial flood rescue, deep reinforcement learning, risk assessment, and decision intelligence, while offering practical implications for smart cities, emergency management, and the use of artificial intelligence to optimize real-world problems.
• Urban pluvial flood rescue path planning using deep reinforcement learning.
• Probability-based risk assessment methodology for urban pluvial flood rescue.
• Environmental simulation based on real-world rescue scenarios.
• Improved exploitation and exploration mechanisms for stochastic and heterogeneous rewards.
• Expanding deep reinforcement learning to solve real-world problems.
- Published
- 2023
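
The abstract above describes the approach only at a high level, so the following is a minimal, hypothetical sketch of the general idea: a grid in which each passable area carries a risk value (safety) and a passage probability (timeliness under uncertainty), and a reinforcement-learning agent learns a rescue path that trades the two off. A tabular Q-learning agent stands in for the paper's deep reinforcement learning algorithm, and every grid size, reward weight, and probability below is an assumption made for illustration, not taken from the paper.

```python
# Illustrative sketch only: a tabular Q-learning stand-in for the paper's deep RL
# planner, on a hypothetical grid where each cell carries a flood-risk value and
# a passage probability. All names and numbers are assumptions, not the authors'
# implementation.
import random

GRID = 6                                        # hypothetical 6x6 urban grid
START, GOAL = (0, 0), (5, 5)
ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]    # up, down, left, right

random.seed(0)
# Per-cell risk value (safety) and passage probability (timeliness/uncertainty).
risk = {(r, c): random.uniform(0.0, 1.0) for r in range(GRID) for c in range(GRID)}
p_pass = {(r, c): random.uniform(0.6, 1.0) for r in range(GRID) for c in range(GRID)}
risk[GOAL] = 0.0

def step(state, action):
    """Apply an action; movement succeeds only with the target cell's passage probability."""
    r, c = state[0] + action[0], state[1] + action[1]
    if not (0 <= r < GRID and 0 <= c < GRID):
        return state, -1.0, False               # bumped the boundary: small penalty
    nxt = (r, c)
    if random.random() > p_pass[nxt]:
        return state, -1.0, False               # area temporarily impassable (stochastic)
    # Reward trades off timeliness (step cost) against safety (risk penalty).
    reward = -1.0 - 5.0 * risk[nxt]
    if nxt == GOAL:
        return nxt, reward + 50.0, True
    return nxt, reward, False

# Standard epsilon-greedy tabular Q-learning loop.
Q = {((r, c), a): 0.0 for r in range(GRID) for c in range(GRID) for a in range(len(ACTIONS))}
alpha, gamma, eps = 0.1, 0.95, 0.2
for episode in range(3000):
    s, done, t = START, False, 0
    while not done and t < 200:
        a = random.randrange(len(ACTIONS)) if random.random() < eps \
            else max(range(len(ACTIONS)), key=lambda i: Q[(s, i)])
        s2, rwd, done = step(s, ACTIONS[a])
        best_next = max(Q[(s2, i)] for i in range(len(ACTIONS)))
        Q[(s, a)] += alpha * (rwd + gamma * best_next - Q[(s, a)])
        s, t = s2, t + 1

# Greedy rollout of the learned policy gives one candidate rescue path.
s, path = START, [START]
for _ in range(50):
    a = max(range(len(ACTIONS)), key=lambda i: Q[(s, i)])
    s, _, done = step(s, ACTIONS[a])
    path.append(s)
    if done:
        break
print(path)
```

Running the script prints one greedy rollout from the start cell toward the goal; in the paper's setting, a deep network would replace the Q-table and the per-area risk values and probabilities would come from the proposed assessment methodology rather than random draws.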