
Human action recognition based on kinematic similarity in real time.

Authors :
Wu, Qingqiang
Xu, Guanghua
Chen, Longting
Luo, Ailing
Zhang, Sicong
Source :
PLoS ONE. 10/26/2017, Vol. 12 Issue 10, p1-15. 15p.
Publication Year :
2017

Abstract

Human action recognition using 3D pose data has gained growing interest in the fields of computer-robot interfaces and pattern recognition since hardware for capturing human pose became available. In this paper, we propose a fast, simple, and powerful method of human action recognition based on human kinematic similarity. The key to this method is that the action descriptor consists of joint positions, angular velocities, and angular accelerations, which accommodates different individual body sizes and eliminates complex normalization. The angular parameters of joints within a short sliding time window (approximately 5 frames) around the current frame are used to express each pose frame of a human action sequence. Moreover, three modified KNN (k-nearest-neighbors) classifiers are employed in our method: one for obtaining the confidence of every frame in the training step, one for estimating the frame label of each descriptor, and one for classifying actions. Additionally estimating each frame's time label makes it possible to handle single input frames, so the approach can be applied to difficult, unsegmented sequences. The proposed method is efficient and runs in real time. The research also shows that many public datasets are irregularly segmented, and a simple method is provided to regularize them. The approach is tested on challenging datasets such as MSR-Action3D, MSRDailyActivity3D, and UTD-MHAD. The results indicate that our method achieves higher accuracy. [ABSTRACT FROM AUTHOR]
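
For readers who want a concrete picture of the descriptor, the Python sketch below is a hypothetical illustration under assumed inputs (per-frame joint positions and joint angles at a known frame rate). It estimates angular velocity and acceleration by finite differences over a roughly 5-frame window and feeds the resulting per-frame descriptors to an off-the-shelf k-nearest-neighbors classifier; it is not the authors' implementation and stands in loosely for their three modified KNN classifiers.

```python
# Hypothetical sketch, not the authors' implementation: per-frame kinematic
# descriptors (joint positions + angular velocity + angular acceleration
# estimated over a short sliding window) classified with a plain k-NN.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def frame_descriptors(joint_positions, joint_angles, fps=30.0, window=5):
    """joint_positions: (T, J, 3) array; joint_angles: (T, J) array in radians.

    Returns a (T, 5J) array: one descriptor per frame, built from the raw
    joint positions plus angular velocity and acceleration obtained by
    finite differences of angles smoothed over a ~5-frame window.
    """
    dt = 1.0 / fps
    kernel = np.ones(window) / window
    # Centered moving average along the time axis reduces sensor jitter
    # before differencing (assumed preprocessing, not specified in the paper).
    smoothed = np.apply_along_axis(
        lambda a: np.convolve(a, kernel, mode="same"), 0, joint_angles)
    ang_vel = np.gradient(smoothed, dt, axis=0)   # (T, J) rad/s
    ang_acc = np.gradient(ang_vel, dt, axis=0)    # (T, J) rad/s^2
    pos_flat = joint_positions.reshape(len(joint_positions), -1)  # (T, 3J)
    return np.concatenate([pos_flat, ang_vel, ang_acc], axis=1)

# Usage sketch: label a single incoming frame with a frame-level k-NN.
# X_train = np.vstack([frame_descriptors(p, a) for p, a in training_sequences])
# clf = KNeighborsClassifier(n_neighbors=5).fit(X_train, y_train_frame_labels)
# label = clf.predict(frame_descriptors(pos_seq, ang_seq)[[t]])
```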

Details

Language :
English
ISSN :
19326203
Volume :
12
Issue :
10
Database :
Academic Search Index
Journal :
PLoS ONE
Publication Type :
Academic Journal
Accession number :
125889940
Full Text :
https://doi.org/10.1371/journal.pone.0185719