Adopting the YOLOv4 Architecture for Low-Latency Multispectral Pedestrian Detection in Autonomous Driving
- Source :
- Sensors, Vol 22, Iss 3, p 1082 (2022)
- Publication Year :
- 2022
- Publisher :
- MDPI AG, 2022.
Abstract
- Detecting pedestrians is a safety-critical task in autonomous driving, and the decision to avoid a person has to be made with minimal latency. Multispectral approaches that combine RGB and thermal images are researched extensively, as they make it possible to gain robustness under varying illumination and weather conditions. State-of-the-art solutions employing deep neural networks offer high pedestrian detection accuracy. However, the literature lacks works that evaluate multispectral pedestrian detection with respect to its feasibility in obstacle avoidance scenarios, taking into account the motion of the vehicle. Therefore, we investigated the latest version of the real-time neural network detector architecture You Only Look Once (YOLOv4) and demonstrated that this detector can be adapted to multispectral pedestrian detection. It can achieve accuracy on par with the state of the art while being highly computationally efficient, thereby supporting low-latency decision making. The results achieved on the KAIST dataset were evaluated from the perspective of automotive applications, where low latency and a low number of false negatives are critical parameters. Among the evaluated architectures, the middle fusion variant of YOLOv4-Tiny achieved the best trade-off between accuracy and computational efficiency.
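- The middle fusion idea mentioned in the abstract can be illustrated with a minimal sketch: RGB and thermal inputs pass through separate shallow stems, and their feature maps are concatenated part-way through the network before a shared trunk that would feed the detection head. This is not the authors' implementation; the layer sizes, module names (MidFusionBackbone, rgb_stem, thermal_stem), and the choice of PyTorch are all illustrative assumptions.

```python
# Minimal sketch of mid-level (middle) fusion for RGB + thermal inputs.
# Assumed structure for illustration only, not the paper's YOLOv4-Tiny layers.
import torch
import torch.nn as nn


class MidFusionBackbone(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Modality-specific stems (RGB has 3 channels, thermal has 1)
        self.rgb_stem = nn.Sequential(
            nn.Conv2d(3, channels, kernel_size=3, stride=2, padding=1),
            nn.LeakyReLU(0.1),
        )
        self.thermal_stem = nn.Sequential(
            nn.Conv2d(1, channels, kernel_size=3, stride=2, padding=1),
            nn.LeakyReLU(0.1),
        )
        # Shared trunk after fusion; a full detector would continue to the YOLO head
        self.shared = nn.Sequential(
            nn.Conv2d(2 * channels, 2 * channels, kernel_size=3, stride=2, padding=1),
            nn.LeakyReLU(0.1),
        )

    def forward(self, rgb: torch.Tensor, thermal: torch.Tensor) -> torch.Tensor:
        f_rgb = self.rgb_stem(rgb)                      # RGB features
        f_thermal = self.thermal_stem(thermal)          # thermal features
        fused = torch.cat([f_rgb, f_thermal], dim=1)    # middle fusion by channel concatenation
        return self.shared(fused)


if __name__ == "__main__":
    model = MidFusionBackbone()
    rgb = torch.randn(1, 3, 416, 416)      # RGB frame
    thermal = torch.randn(1, 1, 416, 416)  # spatially aligned thermal frame
    print(model(rgb, thermal).shape)       # torch.Size([1, 128, 104, 104])
```

- In contrast to early fusion (stacking raw RGB and thermal channels at the input) and late fusion (merging detections from two full networks), this middle fusion sketch shares most of the network after the concatenation point, which is what keeps the computational cost close to a single-stream detector.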
Details
- Language :
- English
- ISSN :
- 1424-8220
- Volume :
- 22
- Issue :
- 3
- Database :
- Directory of Open Access Journals
- Journal :
- Sensors
- Publication Type :
- Academic Journal
- Accession number :
- edsdoj.31562fd949a44dfa9f35b27836d12bea
- Document Type :
- article
- Full Text :
- https://doi.org/10.3390/s22031082