
All-Day Object Detection and Recognition for Blind Zones of Vehicles Using Deep Learning.

Authors :
Chia, Tsorng-Lin
Liu, Pei-June
Huang, Ping-Sheng
Source :
International Journal of Pattern Recognition & Artificial Intelligence. Jan 2024, Vol. 38 Issue 1, p1-34. 34p.
Publication Year :
2024

Abstract

Neglecting to perceive surrounding traffic conditions has long been a major cause of traffic accidents, and inattention to blind spots is the most important factor while driving. Existing solutions face several problems: they rely on expensive equipment, misclassify target object types, are unsuitable for nighttime use, and incorrectly determine whether a target object is in the blind zone. This paper aims to improve driving perception by developing an all-day object detection and recognition system with more accurate performance for blind zones. The proposed method uses a general-purpose camera as its single input and a two-stage cascaded deep network architecture for object detection and recognition. First, a style-conversion process transforms daytime and nighttime images of differing brightness into images of consistent brightness. Then the objects in the visual blind zones are detected and identified, significantly improving detection accuracy. Because of the diversity and complexity of Taiwan's road conditions, public databases cannot effectively meet local model-training needs; we therefore built a training data set from available nighttime images. Experimental results show that the proposed method achieves promising performance in all-day object detection and recognition for blind zones. [ABSTRACT FROM AUTHOR]
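The cascade described above can be sketched in minimal form. This is an illustrative outline only, not the authors' implementation: the paper's first stage is a learned style-conversion network, for which a simple gamma correction toward a target mean brightness stands in here, and the second stage is a deep detection/recognition network, for which a trivial threshold "detector" stands in. All function names and parameters are hypothetical.

```python
import math

def normalize_brightness(pixels, target_mean=0.5):
    """Stage-1 stand-in (hypothetical): push an image with values in
    [0, 1] toward a target mean brightness via gamma correction, so
    day and night inputs reach a consistent brightness."""
    mean = sum(pixels) / len(pixels)
    # Choose gamma so that mean**gamma is roughly target_mean,
    # clamped for numerical stability on very dark/bright frames.
    gamma = math.log(target_mean) / math.log(max(mean, 1e-6))
    gamma = min(max(gamma, 0.2), 5.0)
    return [p ** gamma for p in pixels]

def detect_objects(pixels, threshold=0.6):
    """Stage-2 stand-in (hypothetical): return indices of bright
    pixels as mock 'detections' in the blind-zone image."""
    return [i for i, p in enumerate(pixels) if p > threshold]

def blind_zone_pipeline(pixels):
    """Two-stage cascade: brightness/style normalization first,
    then detection on the normalized image."""
    return detect_objects(normalize_brightness(pixels))

# A dark "nighttime" frame: only the brightest pixel survives detection
# after normalization.
print(blind_zone_pipeline([0.1, 0.05, 0.2, 0.9]))  # → [3]
```

The design point the sketch illustrates is ordering: normalizing brightness before detection lets a single downstream detector serve both day and night inputs, which is the rationale the abstract gives for the cascade.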

Details

Language :
English
ISSN :
0218-0014
Volume :
38
Issue :
1
Database :
Academic Search Index
Journal :
International Journal of Pattern Recognition & Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
175724997
Full Text :
https://doi.org/10.1142/S0218001423500350