HOG based multi-stage object detection and pose recognition for service robot
- Source: ICARCV
- Publication Year: 2010
- Publisher: IEEE, 2010.
Abstract
- This paper develops a HOG-based multi-stage approach to object detection and pose recognition for service robots. The approach combines the merits of multi-class and bi-class HOG-based detectors in a three-stage algorithm with low computational cost. In the first stage, a multi-class classifier with coarse features estimates the orientation of a potential target object in the image; in the second stage, a bi-class detector for the detected orientation, using intermediate-level features, filters out most false positives; and in the third stage, a bi-class detector for the detected orientation, using fine features, achieves accurate detection with a low false-positive rate. The training of the multi-class and bi-class SVMs with their respective feature levels is described. Experiments in real-world environments show that the proposed method is much more accurate than a detection method that uses only the multi-class detector, and much more efficient than one that runs a bi-class detector for every possible orientation. The approach also works well in scenarios where a SIFT-based detector may fail. The method achieves real-time object detection, localization, and pose recognition on a P4 2.4 GHz PC.
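- As a rough illustration of the cascade described in the abstract, the Python sketch below wires coarse, intermediate, and fine HOG features into the three stages. The window size, cell sizes, and classifier objects (pose_svm, gate_svms, verify_svms) are hypothetical placeholders: the abstract does not give the paper's actual parameters or training setup, so this assumes OpenCV's HOGDescriptor for features and pre-trained scikit-learn-style SVM classifiers.

```python
# Sketch of the three-stage HOG cascade from the abstract (hypothetical
# parameters; the paper's exact window/cell sizes and SVM models are not given).
import cv2


def make_hog(cell):
    """HOG descriptor for a 64x128 window; the cell size sets feature coarseness."""
    return cv2.HOGDescriptor((64, 128),              # window size (w, h)
                             (2 * cell, 2 * cell),   # block size
                             (cell, cell),           # block stride
                             (cell, cell),           # cell size
                             9)                      # orientation bins

# Coarse, intermediate, and fine feature extractors for stages 1, 2, and 3.
hog_coarse, hog_mid, hog_fine = make_hog(16), make_hog(8), make_hog(4)


def classify_window(window, pose_svm, gate_svms, verify_svms):
    """Run one candidate window through the three-stage cascade.

    pose_svm    -- multi-class SVM on coarse features -> orientation label
    gate_svms   -- per-orientation bi-class SVMs on intermediate features
    verify_svms -- per-orientation bi-class SVMs on fine features
    All classifiers are assumed to be trained beforehand (e.g. sklearn SVC).
    """
    window = cv2.resize(window, (64, 128))

    # Stage 1: estimate the object's orientation from coarse features.
    pose = int(pose_svm.predict(hog_coarse.compute(window).reshape(1, -1))[0])

    # Stage 2: a cheap orientation-specific check rejects most false positives.
    if gate_svms[pose].predict(hog_mid.compute(window).reshape(1, -1))[0] != 1:
        return None

    # Stage 3: fine-feature verification keeps the false-positive rate low.
    if verify_svms[pose].predict(hog_fine.compute(window).reshape(1, -1))[0] != 1:
        return None

    return pose  # detection accepted; the label doubles as the recognized pose
```

- In this sketch the multi-class classifier is evaluated on every candidate window, while the more expensive bi-class detectors run only for the single orientation it proposes, which is the source of the efficiency gain the abstract claims over running a bi-class detector for every possible orientation.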
- Subjects:
- Computer science
3D single-object recognition
Detector
Feature extraction
Cognitive neuroscience of visual object recognition
Scale-invariant feature transform
Pattern recognition
Object detection
Computer vision
Viola–Jones object detection framework
Artificial intelligence
Pose
Details
- Database: OpenAIRE
- Journal: 2010 11th International Conference on Control Automation Robotics & Vision
- Accession number: edsair.doi...........811337083b62b8fd13a1cfbe07c1e9d0
- Full Text: https://doi.org/10.1109/icarcv.2010.5707916