
Part-based deformable object detection with a single sketch

Authors :
Sreyasee Das Bhattacharjee
Anurag Mittal
School of Electrical and Electronic Engineering
Source :
Computer Vision and Image Understanding. 139:73-87
Publication Year :
2015
Publisher :
Elsevier BV, 2015.

Abstract

Highlights:
- A contour-based object detection scheme that uses a single sketch as the input model.
- An automatic part decomposition method segments the given model sketch into parts.
- A multi-stage, coarse-to-fine, locally affine-invariant part-based matching strategy.
- First, a PS-based framework roughly identifies candidate locations; a detailed contour tracing then evaluates these initial detections more thoroughly.

Object detection using shape is interesting since it is well known that humans can recognize an object simply from its shape. Shape-based methods therefore hold great promise for handling a large amount of shape variation with a compact representation. In this paper, we present a new algorithm for object detection that uses a single, reasonably good sketch as a reference to build a model for the object. The method hierarchically segments the given sketch into parts using an automatic algorithm and estimates a different affine transformation for each part while matching. A Hough-style voting scheme collects evidence for the object from the leaves to the root of the part decomposition tree for robust detection. Missing edge segments, clutter and generic object deformations are handled by flexibly following the contour paths in the edge image that resemble the model contours. Efficient data structures and a two-stage matching approach help yield an efficient and robust system. Experiments on the ETHZ dataset and several other popular image datasets show promising results compared to the state of the art. A new dataset of real-life hand-drawn sketches for all the object categories in the ETHZ dataset is also used for evaluation.
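As a rough illustration of the leaves-to-root voting idea described in the abstract (and not the authors' implementation), the minimal Python sketch below merges per-part candidate detections up a hypothetical part-decomposition tree into a coarse spatial accumulator. The PartNode structure, the grid size, and the (x, y, score) candidate format are assumptions; the per-part affine estimation, the PS-based coarse search, and the contour-tracing verification stage are omitted, and each candidate is assumed to already vote for a common object reference point.

```python
from collections import defaultdict
from dataclasses import dataclass, field


@dataclass
class PartNode:
    """A node of a (hypothetical) part-decomposition tree built from the sketch."""
    name: str
    candidates: list = field(default_factory=list)  # (x, y, score) leaf detections
    children: list = field(default_factory=list)    # empty for leaf parts


def vote(node, grid=16):
    """Leaves-to-root Hough-style aggregation over the part tree.

    Leaf parts cast their candidate scores into a coarse spatial grid;
    internal nodes merge their children's accumulators, so consistent
    part detections reinforce the same object hypothesis.
    """
    acc = defaultdict(float)
    if not node.children:                      # leaf: cast votes
        for x, y, score in node.candidates:
            acc[(x // grid, y // grid)] += score
    else:                                      # internal node: merge children
        for child in node.children:
            for cell, s in vote(child, grid).items():
                acc[cell] += s
    return acc


def detect(root, top_k=5):
    """Coarse stage: rank grid cells by accumulated votes."""
    acc = vote(root)
    return sorted(acc.items(), key=lambda kv: -kv[1])[:top_k]


# Toy usage: two leaf parts whose candidate detections agree near (40, 40).
left = PartNode("part_A", candidates=[(38, 41, 0.9), (120, 15, 0.4)])
right = PartNode("part_B", candidates=[(42, 39, 0.8)])
root = PartNode("object", children=[left, right])
print(detect(root))  # the cell containing (40, 40) accumulates the highest vote
```

Restricting the accumulator to a coarse grid and passing only the top-ranked hypotheses to a more detailed check mirrors the two-stage, coarse-to-fine strategy the abstract describes.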

Details

ISSN :
1077-3142
Volume :
139
Database :
OpenAIRE
Journal :
Computer Vision and Image Understanding
Accession number :
edsair.doi.dedup.....61248c93ad4a706c8024029135e1170f