
Convolutional neural networks for computer vision-based detection and recognition of dumpsters.

Authors :
Ramírez, Iván
Cuesta-Infante, Alfredo
Pantrigo, Juan J.
Montemayor, Antonio S.
Moreno, José Luis
Alonso, Valvanera
Anguita, Gema
Palombarani, Luciano
Source :
Neural Computing & Applications. Sep2020, Vol. 32 Issue 17, p13203-13211. 9p.
Publication Year :
2020

Abstract

In this paper, we propose a twofold methodology for visual detection and recognition of different types of city dumpsters, with minimal human labeling of the image data set. First, we carry out transfer learning using the Google Inception-v3 convolutional neural network, which is retrained with only a small subset of labeled images out of the whole data set. This first classifier is then improved by semi-supervised learning, retraining for two more rounds, each one increasing the number of labeled images but without human supervision. We use a data set of 27,624 labeled images of dumpsters provided by Ecoembes, a Spanish nonprofit organization that cares for the environment through recycling and the eco-design of packaging in Spain. Such a data set presents a number of challenges. As in other outdoor visual tasks, there are occluding objects such as vehicles, pedestrians and street furniture, as well as other dumpsters whenever they are placed in groups. In addition, dumpsters have different degrees of deterioration, which may affect their shape and color. Finally, 35% of the images are classified according to the capacity of the container, a feature that is hard to assess from a single snapshot. Since the data set is fully labeled, we can compare our approach both against a baseline case, doing only the transfer learning with a minimal set of labeled images and no incremental retraining, and against the best case, using all the labels. The experiments show that the proposed system achieves an accuracy of 88%, whereas the best case reaches 93%. In other words, the proposed method attains 94% of the best performance. [ABSTRACT FROM AUTHOR]
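
The pipeline described in the abstract, transfer learning from Inception-v3 on a small labeled subset followed by two rounds of pseudo-label retraining, could look roughly like the sketch below in TensorFlow/Keras. This is not the authors' code: the directory names, class count, epochs and the 0.9 confidence threshold are illustrative assumptions, not details taken from the paper.

# Minimal sketch of the two-stage approach, assuming TensorFlow/Keras.
import numpy as np
import tensorflow as tf

IMG_SIZE = (299, 299)        # Inception-v3 native input resolution
NUM_CLASSES = 4              # assumed number of dumpster categories
CONFIDENCE = 0.9             # assumed pseudo-label acceptance threshold

def build_classifier():
    # Transfer learning: frozen ImageNet Inception-v3 backbone, new softmax head.
    base = tf.keras.applications.InceptionV3(
        weights="imagenet", include_top=False, pooling="avg",
        input_shape=IMG_SIZE + (3,))
    base.trainable = False
    inputs = tf.keras.Input(shape=IMG_SIZE + (3,))
    x = tf.keras.applications.inception_v3.preprocess_input(inputs)
    x = base(x, training=False)
    outputs = tf.keras.layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Hypothetical directory layout: a small human-labeled subset and a large unlabeled pool.
labeled_ds = tf.keras.utils.image_dataset_from_directory(
    "dumpsters/labeled_subset", image_size=IMG_SIZE, batch_size=32)
unlabeled_ds = tf.keras.utils.image_dataset_from_directory(
    "dumpsters/unlabeled_pool", labels=None, image_size=IMG_SIZE, batch_size=32)

model = build_classifier()
model.fit(labeled_ds, epochs=10)   # baseline: transfer learning on the small subset only

# Two semi-supervised rounds: pseudo-label the unlabeled pool with the current model,
# keep only confident predictions, and retrain on the enlarged training set.
for _ in range(2):
    images, pseudo_labels = [], []
    for batch in unlabeled_ds:
        probs = model.predict(batch, verbose=0)
        keep = np.max(probs, axis=1) >= CONFIDENCE
        images.append(batch.numpy()[keep])
        pseudo_labels.append(np.argmax(probs, axis=1)[keep].astype("int32"))
    pseudo_ds = tf.data.Dataset.from_tensor_slices(
        (np.concatenate(images), np.concatenate(pseudo_labels))).batch(32)
    model.fit(labeled_ds.concatenate(pseudo_ds), epochs=5)

In this sketch the baseline reported in the abstract corresponds to stopping after the first fit call, while the proposed method continues with the two pseudo-labeling rounds; the best case would instead train the same head on the fully labeled data set.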

Details

Language :
English
ISSN :
09410643
Volume :
32
Issue :
17
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
145259038
Full Text :
https://doi.org/10.1007/s00521-018-3390-8