Feedback weight convolutional neural network for gait recognition
- Source :
- Journal of Visual Communication and Image Representation. 55:424-432
- Publication Year :
- 2018
- Publisher :
- Elsevier BV, 2018.
Abstract
- Gait recognition is an active research topic. In this paper, we propose combining deep features and hand-crafted representations into a globally trainable deep model. Specifically, a set of deep feature vectors is first extracted by a pre-trained CNN model from the input sequences. Then, a kernel function with respect to the fully connected vector is trained as the guiding weight for the respective receptive fields of the input sequences, and the hand-crafted features are extracted based on this guiding weight. Finally, the hand-crafted features and the deep features are combined in a unified deep network to complete the classification. The resulting gait descriptor, termed the deep convolutional location weight descriptor (DLWD), effectively reveals the importance of different body parts to gait recognition accuracy. Experiments on two gait data sets (CASIA-B and OU-ISIR) show that our method outperforms existing methods for gait recognition.
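The abstract's pipeline (CNN features → guiding weights → weighted hand-crafted features → fused descriptor) can be sketched as follows. This is a hedged toy illustration, not the paper's implementation: the CNN is replaced by a random-feature stub, the "kernel function" by a simple softmax over feature energy, and the hand-crafted descriptor by a per-frame silhouette mean; all function names and dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def extract_deep_features(frames, dim=256):
    # Stand-in for a pre-trained CNN (hypothetical): one deep
    # feature vector per frame of the gait sequence.
    return rng.standard_normal((len(frames), dim))

def guiding_weights(deep_features):
    # Sketch of the learned kernel on the fully connected vector:
    # here a softmax over per-frame feature energy, yielding one
    # weight per receptive field / frame (sums to 1).
    energy = (deep_features ** 2).sum(axis=1)
    e = np.exp(energy - energy.max())
    return e / e.sum()

def hand_crafted_features(frames, weights):
    # A simple hand-crafted descriptor (column-wise silhouette mean),
    # re-weighted by the CNN-derived guiding weights.
    base = np.stack([f.mean(axis=0) for f in frames])  # (T, W)
    return base * weights[:, None]

# Toy gait sequence: 8 binary silhouette frames of size 64x44.
frames = [rng.integers(0, 2, size=(64, 44)).astype(float) for _ in range(8)]

deep = extract_deep_features(frames)      # (8, 256)
w = guiding_weights(deep)                 # (8,), sums to 1
hand = hand_crafted_features(frames, w)   # (8, 44)

# Fuse deep and weighted hand-crafted features into one descriptor
# that a classification head would consume.
descriptor = np.concatenate([deep.mean(axis=0), hand.mean(axis=0)])
print(descriptor.shape)  # (300,)
```

In the actual DLWD model the two feature streams are trained jointly end-to-end; this sketch only shows the data flow between the stages the abstract names.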
- Subjects :
- Computer science
  Pattern recognition
  Convolutional neural network
  Feature vector
  Receptive field
  Gait (human)
  Signal Processing
  Media Technology
  Computer Vision and Pattern Recognition
  Artificial intelligence
  Electrical and Electronic Engineering
Details
- ISSN :
- 1047-3203
- Volume :
- 55
- Database :
- OpenAIRE
- Journal :
- Journal of Visual Communication and Image Representation
- Accession number :
- edsair.doi...........1aa5d468991fd02c0250eb0a1631931e
- Full Text :
- https://doi.org/10.1016/j.jvcir.2018.06.019