
Convolutional transformer network for fine-grained action recognition.

Authors :
Ma, Yujun
Wang, Ruili
Zong, Ming
Ji, Wanting
Wang, Yi
Ye, Baoliu
Source :
Neurocomputing, Feb 2024, Vol. 569.
Publication Year :
2024

Abstract

Fine-grained action recognition is one of the critical problems in video processing; it aims to recognize visually similar actions involving subtle interactions between humans and objects. Inspired by the remarkable performance of the Transformer in natural language processing, the Transformer has been applied to the fine-grained action recognition task. However, the Transformer needs abundant training data and extra supervision to achieve results comparable to those of convolutional neural networks (CNNs). To address these issues, we propose a Convolutional Transformer Network (CTN), which integrates the merits of CNNs (e.g., weight sharing, locality, and capturing low-level features in videos) with the benefits of Transformers (e.g., dynamic attention and learning long-range dependencies). In this paper, we propose two modifications to the original Transformer: (i) a video-to-tokens module that extracts tokens from spatial-temporal features computed by 3D convolutions, instead of embedding tokens directly from raw input video clips; (ii) a complete replacement of the linear mapping in the multi-head self-attention layer with a depth-wise convolutional mapping, which applies a depth-wise separable convolution to the embedded token maps. With these two modifications, our approach can extract effective spatial-temporal features from videos and process the long token sequences that videos produce. Experimental results demonstrate that our proposed CTN achieves state-of-the-art accuracy on two fine-grained action recognition datasets (i.e., Epic-Kitchens and Diving48) with only a small increase in computation.
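Since the abstract describes the two modifications only at a high level, the following is a minimal PyTorch sketch of how such modules might look. All module names, kernel sizes, strides, and dimensions here are illustrative assumptions, not the authors' actual implementation.

```python
import torch
import torch.nn as nn

class VideoToTokens(nn.Module):
    """Hypothetical sketch of modification (i): a small 3D-CNN stem extracts
    spatial-temporal features from the clip, and those feature maps are
    flattened into tokens, rather than embedding raw video patches directly."""
    def __init__(self, in_ch=3, embed_dim=96):
        super().__init__()
        self.stem = nn.Sequential(
            nn.Conv3d(in_ch, embed_dim // 2, kernel_size=3,
                      stride=(1, 2, 2), padding=1),
            nn.BatchNorm3d(embed_dim // 2),
            nn.ReLU(inplace=True),
            nn.Conv3d(embed_dim // 2, embed_dim, kernel_size=3,
                      stride=(2, 2, 2), padding=1),
        )

    def forward(self, x):                      # x: (B, C, T, H, W)
        f = self.stem(x)                       # (B, D, T', H', W')
        B, D, T, H, W = f.shape
        tokens = f.flatten(2).transpose(1, 2)  # (B, T'*H'*W', D)
        return tokens, (T, H, W)

class DepthwiseConvProjection(nn.Module):
    """Hypothetical sketch of modification (ii): tokens are reshaped back onto
    their spatial-temporal grid and projected with a depth-wise separable 3D
    convolution (depth-wise conv followed by a 1x1x1 point-wise conv) in place
    of the usual linear Q/K/V mapping in multi-head self-attention."""
    def __init__(self, dim, kernel_size=3):
        super().__init__()
        self.depthwise = nn.Conv3d(dim, dim, kernel_size,
                                   padding=kernel_size // 2, groups=dim)
        self.pointwise = nn.Conv3d(dim, dim, kernel_size=1)

    def forward(self, tokens, thw):            # tokens: (B, N, D)
        B, N, D = tokens.shape
        T, H, W = thw
        x = tokens.transpose(1, 2).reshape(B, D, T, H, W)
        x = self.pointwise(self.depthwise(x))  # depth-wise separable conv
        return x.flatten(2).transpose(1, 2)    # back to (B, N, D)

# Usage sketch: tokenize a clip, then compute one attention projection (e.g. Q).
clip = torch.randn(2, 3, 8, 112, 112)          # (batch, channels, frames, H, W)
tokens, thw = VideoToTokens()(clip)
q = DepthwiseConvProjection(96)(tokens, thw)
```

Under this reading, the convolutional stem supplies the locality and low-level feature extraction attributed to CNNs, while the depth-wise separable projection keeps the per-token cost low even for the long token sequences that video clips produce.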

Details

Language :
English
ISSN :
0925-2312
Volume :
569
Database :
Academic Search Index
Journal :
Neurocomputing
Publication Type :
Academic Journal
Accession number :
174470013
Full Text :
https://doi.org/10.1016/j.neucom.2023.127027