
Collaborative Feature Learning for Gait Recognition Under Cloth Changes.

Authors :
Yao, Lingxiang
Kusakunniran, Worapan
Wu, Qiang
Xu, Jingsong
Zhang, Jian
Source :
IEEE Transactions on Circuits & Systems for Video Technology. Jun 2022, Vol. 32, Issue 6, p3615-3629. 15p.
Publication Year :
2022

Abstract

Since gait can be used to identify individuals at a distance without their interaction or cooperation, many gait recognition methods have been proposed in recent years. However, most of these methods degrade in the real-world scenario of clothing changes. In this paper, a more efficient gait recognition method is therefore proposed to address the problem of clothing variation. First, part-based gait features are formulated from two different perspectives: the separated body parts that are more robust to clothing changes, and the estimated human skeleton key-point regions. Formulating such features is reasonable for cloth-changing gait recognition, because both perspectives are less vulnerable to clothing changes. Given that each feature has its own advantages and disadvantages, a more efficient gait feature is generated in this paper by assembling the two features together. Moreover, since local features are more discriminative than global features, more attention is paid to local short-range features. Also, unlike most existing methods, the estimated key-point features are treated as a set of word embeddings, and a transformer encoder is used to learn the dependencies among correlated key points. The robustness and effectiveness of the proposed method are verified by experiments on CASIA Gait Dataset B, on which it achieves state-of-the-art performance. [ABSTRACT FROM AUTHOR]
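The abstract describes treating per-key-point features as a sequence of word embeddings processed by a transformer encoder. The following is a minimal, hypothetical PyTorch sketch of that idea only, not the authors' implementation; the number of key points (17), the feature dimension (64), and the pooling choice are illustrative assumptions.

    # Sketch (not the authors' code): per-key-point feature vectors are used
    # as a sequence of "word embeddings" and passed through a transformer
    # encoder so self-attention can model dependencies among key points.
    import torch
    import torch.nn as nn

    class KeyPointTransformer(nn.Module):
        def __init__(self, num_keypoints=17, feat_dim=64, num_heads=4, num_layers=2):
            super().__init__()
            # One learnable positional embedding per key point, so the encoder
            # can tell, e.g., knee features apart from shoulder features.
            self.pos_embed = nn.Parameter(torch.zeros(1, num_keypoints, feat_dim))
            encoder_layer = nn.TransformerEncoderLayer(
                d_model=feat_dim, nhead=num_heads, batch_first=True)
            self.encoder = nn.TransformerEncoder(encoder_layer, num_layers=num_layers)

        def forward(self, keypoint_feats):
            # keypoint_feats: (batch, num_keypoints, feat_dim), one embedding per
            # estimated key-point region, analogous to a sentence of tokens.
            x = keypoint_feats + self.pos_embed
            x = self.encoder(x)      # self-attention over the key-point "tokens"
            return x.mean(dim=1)     # pooled descriptor for this perspective

    # Usage: a batch of 8 samples, 17 key points, 64-d features per key point.
    feats = torch.randn(8, 17, 64)
    model = KeyPointTransformer()
    print(model(feats).shape)        # torch.Size([8, 64])

In the paper's pipeline this key-point descriptor would be combined with the body-part-based feature; the pooling and fusion steps above are assumptions made for the sake of a self-contained example.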

Details

Language :
English
ISSN :
1051-8215
Volume :
32
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Circuits & Systems for Video Technology
Publication Type :
Academic Journal
Accession number :
157258475
Full Text :
https://doi.org/10.1109/TCSVT.2021.3112564