
Spatial Transformer Network on Skeleton-based Gait Recognition

Authors :
Zhang, Cun
Chen, Xing-Peng
Han, Guo-Qiang
Liu, Xiang-Jie
Publication Year :
2022

Abstract

Skeleton-based gait recognition models usually suffer from a robustness problem: Rank-1 accuracy drops from about 90% in normal walking cases to about 70% in walking-with-coats cases. In this work, we propose a state-of-the-art robust skeleton-based gait recognition model, Gait-TR, which combines spatial transformer frameworks with temporal convolutional networks. Gait-TR achieves substantial improvements over other skeleton-based gait models, with higher accuracy and better robustness, on the well-known gait dataset CASIA-B. In particular, in walking-with-coats cases, Gait-TR reaches a 90% Rank-1 accuracy, higher than the best result of silhouette-based models, which usually have higher accuracy than skeleton-based gait recognition models. Moreover, our experiments on CASIA-B show that the spatial transformer extracts gait features from the human skeleton better than the widely used graph convolutional network.
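
To make the described architecture concrete, the following is a minimal sketch, assuming a PyTorch implementation, of the general idea the abstract names: a spatial transformer (self-attention over skeleton joints within each frame) followed by a temporal convolutional block along the frame axis. The class names, layer sizes, joint count, and identity count below are illustrative assumptions, not the authors' released Gait-TR code.

    # A minimal sketch, assuming PyTorch; shapes and hyperparameters are illustrative.
    import torch
    import torch.nn as nn

    class SpatialTransformerBlock(nn.Module):
        """Self-attention over the skeleton joints within each frame."""
        def __init__(self, channels: int, num_heads: int = 4):
            super().__init__()
            self.attn = nn.MultiheadAttention(channels, num_heads, batch_first=True)
            self.norm = nn.LayerNorm(channels)

        def forward(self, x):
            # x: (batch * frames, joints, channels); attention mixes joint features
            attn_out, _ = self.attn(x, x, x)
            return self.norm(x + attn_out)

    class TemporalConvBlock(nn.Module):
        """1-D convolution along the time axis for each joint."""
        def __init__(self, channels: int, kernel_size: int = 9):
            super().__init__()
            self.conv = nn.Conv1d(channels, channels, kernel_size, padding=kernel_size // 2)
            self.norm = nn.BatchNorm1d(channels)
            self.relu = nn.ReLU()

        def forward(self, x):
            # x: (batch * joints, channels, frames); residual temporal convolution
            return self.relu(self.norm(self.conv(x)) + x)

    class GaitTRSketch(nn.Module):
        """Spatial transformer + temporal convolution over skeleton sequences."""
        def __init__(self, in_channels: int = 3, channels: int = 64,
                     num_joints: int = 17, num_ids: int = 74):
            super().__init__()
            self.embed = nn.Linear(in_channels, channels)
            self.spatial = SpatialTransformerBlock(channels)
            self.temporal = TemporalConvBlock(channels)
            self.head = nn.Linear(channels, num_ids)

        def forward(self, x):
            # x: (batch, frames, joints, in_channels), e.g. 2-D joint coordinates + confidence
            b, t, j, _ = x.shape
            x = self.embed(x)                               # (b, t, j, channels)
            x = self.spatial(x.reshape(b * t, j, -1))       # attention over joints
            x = x.reshape(b, t, j, -1).permute(0, 2, 3, 1)  # (b, j, channels, t)
            x = self.temporal(x.reshape(b * j, -1, t))      # convolution over frames
            x = x.reshape(b, j, -1, t).mean(dim=(1, 3))     # pool joints and frames
            return self.head(x)                             # identity logits

    if __name__ == "__main__":
        model = GaitTRSketch()
        clip = torch.randn(2, 60, 17, 3)  # 2 sequences, 60 frames, 17 joints, (x, y, conf)
        print(model(clip).shape)          # torch.Size([2, 74])

A usage note: the sketch pools over joints and frames before a linear identity head; in practice, gait recognition models typically train with identification or metric-learning losses and compare embeddings across walking conditions, but those details are beyond this illustrative example.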

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2204.03873
Document Type :
Working Paper