
MFCANN: A feature diversification framework based on local and global attention for human activity recognition.

Authors :
Yang, Zhixuan
Li, Kewen
Huang, Zongchao
Source :
Engineering Applications of Artificial Intelligence, Jul 2024, Part B, Vol. 133.
Publication Year :
2024

Abstract

Human activity recognition (HAR) is a crucial detection technique extensively employed in contexts demanding accurate identification of human actions. Mainstream HAR approaches currently rely on sequential data generated by wearable sensors, focusing primarily on extracting feature representations that align with different human activities. However, complex and diverse human activities involve many kinds of features, and the features emphasized vary substantially across activities, making accurate recognition challenging for current artificial intelligence methods. To address this challenge, this paper introduces a network framework called the Multi-Feature Combining Attention Neural Network (MFCANN). The framework incorporates diverse feature extraction together with both local and global feature attention mechanisms, and is composed of stacked Multi-Feature Combining Attention Blocks (MFCAB) that we design. In contrast to traditional convolutional methods, MFCAB stacks multiple convolutional components in parallel, enabling the extraction of a broader range of features from human activity data. In addition, we propose the Intra-Module Attention Block (Intra-MAB) and the Inter-Module Attention Block (Inter-MAB), which simultaneously attend to local fine-grained features within module feature maps and to global distinguishing features across module feature maps, yielding more targeted feature learning and reinforcing the network's ability to distinguish between different human activities. Extensive experimental results demonstrate that the proposed MFCANN outperforms current mainstream deep learning algorithms on HAR tasks, achieving recognition accuracies of 0.9813, 0.9324, and 0.9930 on the UCI-HAR, USC-HAD, and RealWorld datasets respectively. We release our code at https://github.com/RangerHeart/MFCANN. [ABSTRACT FROM AUTHOR]
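To make the abstract's architecture concrete, the following is a minimal NumPy sketch (not the authors' implementation; see their repository for that) of the general idea: several convolutional branches with different kernel sizes run in parallel over a sensor window, an "intra-module" attention weights time steps within each branch, and an "inter-module" attention weights the branches against one another before fusion. All function names, shapes, and kernel sizes here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, w):
    # 'valid' 1-D convolution of a (T, C) signal with a (K, C, F) kernel -> (T-K+1, F)
    T, C = x.shape
    K, _, F = w.shape
    out = np.zeros((T - K + 1, F))
    for t in range(T - K + 1):
        out[t] = np.tensordot(x[t:t + K], w, axes=([0, 1], [0, 1]))
    return out

def softmax(z, axis=-1):
    z = z - z.max(axis=axis, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

def multi_branch_attention(x, kernels):
    # Parallel branches with different kernel widths extract diverse features
    # (the role played by MFCAB's parallel convolutional components).
    T = min(x.shape[0] - w.shape[0] + 1 for w in kernels)
    branches = []
    for w in kernels:
        f = conv1d(x, w)[:T]                            # (T, F)
        # "Intra-module" attention: weight time steps within one branch.
        a = softmax(f.mean(axis=1))                     # (T,)
        branches.append((a[:, None] * f).sum(axis=0))   # (F,)
    B = np.stack(branches)                              # (num_branches, F)
    # "Inter-module" attention: weight the branches against each other.
    g = softmax(B.mean(axis=1))                         # (num_branches,)
    return (g[:, None] * B).sum(axis=0)                 # fused (F,) descriptor

# Toy sensor window: 64 time steps, 3 channels (e.g. accelerometer x/y/z).
x = rng.standard_normal((64, 3))
kernels = [rng.standard_normal((k, 3, 8)) * 0.1 for k in (3, 5, 7)]
feat = multi_branch_attention(x, kernels)
print(feat.shape)  # (8,)
```

The fused descriptor would then feed a classifier head; in a real model the attention scores would be learned projections rather than the simple channel means used here.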

Details

Language :
English
ISSN :
0952-1976
Volume :
133
Database :
Academic Search Index
Journal :
Engineering Applications of Artificial Intelligence
Publication Type :
Academic Journal
Accession number :
177604145
Full Text :
https://doi.org/10.1016/j.engappai.2024.108110