Gesture spotting and recognition for human-robot interaction
- Source :
- IEEE Transactions on Robotics, April 2007, Vol. 23, Issue 2, p. 256, 15 pp.
- Publication Year :
- 2007
-
Abstract
- Visual interpretation of gestures can be useful in accomplishing natural human-robot interaction (HRI). Previous HRI research has focused on issues such as hand gestures, sign language, and command gesture recognition. For HRI to operate naturally, automatic recognition of whole-body gestures is required. This presents a challenging problem, because describing and modeling meaningful gesture patterns from whole-body motion is a complex task. This paper presents a new method for recognizing whole-body key gestures in HRI. A human subject is first described by a set of features encoding the angular relationship between a dozen body parts in 3-D. Each feature vector is then mapped to a codeword of hidden Markov models. To spot key gestures accurately, a sophisticated method of designing a transition gesture model is proposed. To reduce the number of states in the transition gesture model, a model-reduction technique that merges similar states based on data-dependent statistics and relative entropy is used. The experimental results demonstrate that the proposed method can be efficient and effective in HRI for automatic recognition of whole-body key gestures from motion sequences.
  Index Terms: Gesture spotting, hidden Markov model (HMM), human-robot interaction (HRI), mobile robot, transition gesture model, whole-body gesture recognition.
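The state-merging step described in the abstract can be sketched as follows. This is an illustrative reduction over discrete HMM emission distributions using symmetric relative entropy (KL divergence), not the paper's exact procedure; the function names, the greedy merge strategy, and the threshold value are all assumptions for illustration.

```python
import math

def kl_divergence(p, q):
    # KL(p || q) for discrete distributions;
    # assumes q[i] > 0 wherever p[i] > 0.
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def symmetric_kl(p, q):
    # Symmetrized relative entropy, a common similarity
    # measure between two HMM states' emission distributions.
    return kl_divergence(p, q) + kl_divergence(q, p)

def merge_similar_states(emissions, threshold):
    """Greedily merge states whose emission distributions are
    closer than `threshold` under symmetric relative entropy.
    Merged states are represented by the average distribution."""
    states = [list(e) for e in emissions]
    merged = True
    while merged:
        merged = False
        for i in range(len(states)):
            for j in range(i + 1, len(states)):
                if symmetric_kl(states[i], states[j]) < threshold:
                    # Replace the pair with its average distribution,
                    # then restart the scan over the reduced state set.
                    avg = [(a + b) / 2 for a, b in zip(states[i], states[j])]
                    del states[j]
                    states[i] = avg
                    merged = True
                    break
            if merged:
                break
    return states
```

For example, two states with emission distributions [0.5, 0.5] and [0.49, 0.51] are nearly identical under symmetric KL and would be merged, while a state with [0.9, 0.1] would be kept separate.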
- Subjects :
- Human-machine systems -- Research
Mobile robots -- Analysis
Details
- Language :
- English
- ISSN :
- 1552-3098
- Volume :
- 23
- Issue :
- 2
- Database :
- Gale General OneFile
- Journal :
- IEEE Transactions on Robotics
- Publication Type :
- Academic Journal
- Accession number :
- edsgcl.163333958