Key technologies based on UAV SLAM vision in disaster response.
- Author
- Zhang, Weilin
- Abstract
With the increase in frequency and intensity of natural disasters induced by extreme weather events worldwide, threats to human life and property are growing. Effective use of unmanned aerial vehicle (UAV) vision techniques for rapid disaster assessment and relief is of great value but still faces many challenges. The purpose of this study is to provide a comprehensive review of the application status, technological progress, and open problems of UAV vision technology in disaster response. Through an extensive survey of the relevant literature, this paper compares and analyzes the advantages and disadvantages of key UAV visual techniques for disaster response. The mainstream algorithms and prospective improvements of several critical technologies are summarized, including image stabilization, target recognition, and three-dimensional reconstruction. Image stabilization compensates for UAV motion to obtain clear, stable video, which is crucial for further analysis; both mechanical and digital methods have been explored. Target recognition enables UAVs to automatically detect disaster-related entities such as survivors, dangerous areas, and blocked roads; deep learning methods are now dominant. Three-dimensional reconstruction builds precise environment models, providing essential data for assessment and planning; multi-view methods improve accuracy. In addition, this paper discusses the trends and open challenges facing UAV vision applications in complex disaster environments, such as robustness, real-time performance, energy efficiency, and data security. Integrating multiple visual modules into a unified system is an important direction. This study provides valuable insights to promote the engineering development and field deployment of UAV vision technology for disaster response, and its analysis and summaries serve as references for future research, development, and applications in this domain. [ABSTRACT FROM AUTHOR]
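The digital stabilization idea summarized in the abstract — estimate inter-frame motion, smooth the accumulated camera trajectory, and warp each frame toward the smoothed path — can be sketched minimally as below. The phase-correlation motion estimator, moving-average smoother, and integer `np.roll` warp are illustrative assumptions for this sketch, not the specific algorithms the review surveys.

```python
import numpy as np

def estimate_shift(a, b):
    """Estimate the integer (dy, dx) translation of frame b relative to
    frame a via phase correlation (normalized cross-power spectrum)."""
    F = np.conj(np.fft.fft2(a)) * np.fft.fft2(b)
    F /= np.abs(F) + 1e-9
    corr = np.abs(np.fft.ifft2(F))
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks in the far half of the spectrum correspond to negative shifts.
    h, w = a.shape
    if dy > h // 2:
        dy -= h
    if dx > w // 2:
        dx -= w
    return dy, dx

def stabilize(frames, window=5):
    """Digital stabilization sketch: accumulate the inter-frame motion
    path, smooth it with a moving average, and shift each frame so that
    only the smoothed (intended) motion remains."""
    shifts = [(0, 0)]
    for prev, cur in zip(frames, frames[1:]):
        dy, dx = estimate_shift(prev, cur)
        py, px = shifts[-1]
        shifts.append((py + dy, px + dx))
    path = np.array(shifts, dtype=float)
    kernel = np.ones(window) / window
    smooth = np.column_stack(
        [np.convolve(path[:, i], kernel, mode="same") for i in (0, 1)]
    )
    out = []
    for frame, raw, want in zip(frames, path, smooth):
        cy, cx = np.round(want - raw).astype(int)
        # Integer circular shift stands in for a proper sub-pixel warp.
        out.append(np.roll(frame, (cy, cx), axis=(0, 1)))
    return out, path
```

Production systems replace each stage with stronger components (feature tracking or optical flow for motion, homography models, trajectory filtering), but the estimate/smooth/compensate structure is the same.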
- Published
- 2024
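As a small illustration of the multi-view principle the abstract credits with improving reconstruction accuracy, the classic two-view linear (DLT) triangulation step can be sketched as follows. The camera matrices and 3-D point in the usage test are hypothetical examples, not data from the paper.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3-D point from two views.

    P1, P2: 3x4 camera projection matrices.
    x1, x2: (u, v) image coordinates of the same point in each view.
    Each observation contributes two linear constraints on the
    homogeneous point X; the solution is the null vector of the
    stacked system, taken from the SVD."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # de-homogenize
```

With more than two views, the extra rows over-determine the system and the least-squares solution averages out per-view noise, which is one reason multi-view reconstruction is more accurate than two-view reconstruction.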