
A Novel Method of Human Joint Prediction in an Occlusion Scene by Using Low-Cost Motion Capture Technique.

Authors :
Niu, Jianwei
Wang, Xiai
Wang, Dan
Ran, Linghua
Source :
Sensors (14248220). Feb 2020, Vol. 20, Issue 4, p1119. 1p.
Publication Year :
2020

Abstract

Microsoft Kinect, a low-cost motion capture device, has huge potential in applications that require machine vision, such as human-robot interaction, home-based rehabilitation and clinical assessment. The Kinect sensor can track 25 key three-dimensional (3D) "skeleton" joints on the human body at 30 frames per second, and the skeleton data often have acceptable accuracy. However, the skeleton data obtained from the sensor sometimes exhibit a high level of jitter due to noise and estimation error. This jitter is worse when there is occlusion or when a subject moves slightly out of the sensor's field of view for a short period of time. Therefore, this paper proposes a novel approach to simultaneously handle the noise and errors in the skeleton data derived from Kinect. First, we apply a classification step to divide the skeleton data into noisy data and erroneous data. We then use a Kalman filter to smooth the noisy data and correct the erroneous data. We performed an occlusion experiment to demonstrate the effectiveness of our algorithm. The proposed method outperforms existing techniques, such as the moving mean filter and the traditional Kalman filter. The experimental results show an improvement in accuracy of at least 58.7%, 47.5% and 22.5% compared to the original Kinect data, the moving mean filter and the traditional Kalman filter, respectively. Our method provides a new perspective on Kinect data processing and a solid data foundation for subsequent research that utilizes Kinect. [ABSTRACT FROM AUTHOR]
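The two-stage idea described in the abstract (classify each incoming sample as noisy or erroneous, then smooth or correct it with a Kalman filter) can be sketched for a single joint coordinate as below. This is a minimal illustration only: the jump threshold, the noise/process variances, and the constant-position state model are all assumptions for the sketch; the abstract does not specify the authors' actual classification rule or filter parameters.

```python
def kalman_smooth(measurements, jump_threshold=0.15,
                  process_var=1e-4, noise_var=1e-2):
    """Smooth a 1-D joint-coordinate track (e.g. one Kinect joint axis, metres).

    Each sample is first classified: a jump larger than `jump_threshold`
    from the current estimate is treated as an erroneous sample (e.g. an
    occlusion spike) and replaced by the filter's prediction; otherwise the
    sample is treated as noisy and blended in via the Kalman gain.
    All thresholds and variances here are illustrative assumptions.
    """
    x = measurements[0]   # state estimate (position)
    p = 1.0               # estimate variance
    smoothed = [x]
    for z in measurements[1:]:
        # Predict step (constant-position model: only variance grows).
        p = p + process_var
        if abs(z - x) > jump_threshold:
            # Classified as erroneous: discard the measurement,
            # keep the predicted value.
            smoothed.append(x)
            continue
        # Classified as noisy: standard Kalman update.
        k = p / (p + noise_var)   # Kalman gain
        x = x + k * (z - x)
        p = (1 - k) * p
        smoothed.append(x)
    return smoothed


# Example: a track with one occlusion spike at index 3.
track = [0.50, 0.51, 0.49, 1.80, 0.52]
print(kalman_smooth(track))
```

In a full skeleton pipeline this filter would run independently per joint and per axis (25 joints x 3 coordinates at 30 fps); the classification step is what keeps a single occlusion spike from dragging the whole estimate away, which a plain Kalman update would not prevent.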

Details

Language :
English
ISSN :
14248220
Volume :
20
Issue :
4
Database :
Academic Search Index
Journal :
Sensors (14248220)
Publication Type :
Academic Journal
Accession number :
142170323
Full Text :
https://doi.org/10.3390/s20041119