
A Deep-Learning-Based Vehicle Detection Approach for Insufficient and Nighttime Illumination Conditions

Authors :
Yen-Lin Chen
Hong-Yi Liang
Chao-Wei Yu
Jian-Yi Wu
Xiu-Zhi Chen
Ho Kwan Leung
Source :
Applied Sciences, Volume 9, Issue 22, p 4769 (2019)
Publication Year :
2019
Publisher :
MDPI AG, 2019.

Abstract

Most object detection models cannot achieve satisfactory performance under nighttime and other insufficient illumination conditions, which may be attributable to how data sets are collected and labeled. Public data sets collected for object detection are usually photographed under sufficient ambient lighting, and their labeling conventions typically focus on clear objects while ignoring blurry and occluded ones. Consequently, the detection performance of traditional vehicle detection techniques is limited in nighttime environments without sufficient illumination. When objects occupy only a small number of pixels and crucial features appear infrequently, traditional convolutional neural networks (CNNs) may suffer from serious information loss because of the fixed number of convolutional operations. This study presents solutions for data collection and a labeling convention for nighttime data that handle various situations, including in-vehicle detection. Moreover, the study proposes a specifically optimized system based on the Faster region-based CNN (Faster R-CNN) model. The system processes 500 × 375-pixel images at 16 frames per second and achieved a mean average precision (mAP) of 0.8497 on our validation segment involving urban nighttime and extremely inadequate lighting conditions. The experimental results demonstrate that the proposed methods achieve high detection performance in various nighttime environments, such as urban nighttime conditions with insufficient illumination and extremely dark conditions with nearly no lighting. The proposed system outperforms the original methods, which attain an mAP of only approximately 0.2.
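
For orientation only, the sketch below shows what Faster R-CNN inference on a single nighttime frame looks like with an off-the-shelf torchvision detector. It is not the authors' optimized system, which collects and relabels nighttime data and fine-tunes the detector; the COCO-pretrained model, the image path night_frame.jpg, and the 0.5 score threshold are assumptions for illustration, and torchvision >= 0.13 is assumed for the weights argument.

```python
# Minimal, assumption-laden sketch: run a COCO-pretrained Faster R-CNN
# (ResNet-50 FPN backbone) on one nighttime image with torchvision.
# The paper's system would instead use a model fine-tuned on nighttime
# vehicle data with its own labeling convention.
import torch
import torchvision
from torchvision.io import read_image
from torchvision.transforms.functional import convert_image_dtype

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

# Load a frame (hypothetical file name) and convert to float CxHxW in [0, 1].
image = convert_image_dtype(read_image("night_frame.jpg"), torch.float)

with torch.no_grad():
    prediction = model([image])[0]  # dict with "boxes", "labels", "scores"

# Keep detections above an arbitrary confidence threshold of 0.5.
keep = prediction["scores"] > 0.5
print(prediction["boxes"][keep], prediction["labels"][keep])
```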

Details

ISSN :
2076-3417
Volume :
9
Database :
OpenAIRE
Journal :
Applied Sciences
Accession number :
edsair.doi.dedup.....adaf98a814f63986a8c79efc3e50fffe