
A Novel Lightweight Human Activity Recognition Method Via L-CTCN.

Authors :
Ding, Xue
Li, Zhiwei
Yu, Jinyang
Xie, Weiliang
Li, Xiao
Jiang, Ting
Source :
Sensors (14248220). Dec 2023, Vol. 23, Issue 24, p9681. 18p.
Publication Year :
2023

Abstract

Wi-Fi-based human activity recognition has attracted significant attention. Deep learning methods are widely used to achieve feature representation and activity sensing. While more learnable parameters in a neural network model lead to richer feature extraction, they also result in significant resource consumption, rendering the model unsuitable for lightweight Internet of Things (IoT) devices. Furthermore, sensing performance heavily relies on the quality and quantity of training data, the collection of which is time-consuming and labor-intensive. Therefore, there is a need to explore methods that reduce the dependence on the quality and quantity of the dataset while ensuring recognition performance and decreasing model complexity to adapt to ubiquitous lightweight IoT devices. In this paper, we propose a novel Lightweight-Complex Temporal Convolution Network (L-CTCN) for human activity recognition. Specifically, this approach effectively combines complex convolution with a Temporal Convolution Network (TCN). Complex convolution can extract richer information from limited raw complex data, reducing the reliance on the quality and quantity of training samples. Based on the designed TCN framework with 1D convolutions and residual blocks, the proposed model achieves lightweight human activity recognition. Extensive experiments verify the effectiveness of the proposed method: it reaches an average recognition accuracy of 96.6% with a parameter size of only 0.17 M, and it performs well under low sampling rates and with few subcarriers and training samples. [ABSTRACT FROM AUTHOR]
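The sketch below illustrates, under stated assumptions, how the two ideas named in the abstract could fit together in PyTorch: a complex convolution realized as two real 1D convolutions over the real and imaginary parts of the CSI, wrapped in TCN-style residual blocks with growing dilation. The class names (ComplexConv1d, TemporalBlock, LCTCN), channel widths, and hyperparameters are illustrative assumptions, not the authors' released code or exact architecture.

```python
# Minimal sketch of a complex-convolution TCN, assuming a PyTorch setting.
# All names and sizes here are hypothetical; the paper's actual model may differ.
import torch
import torch.nn as nn


class ComplexConv1d(nn.Module):
    """Complex 1D convolution over CSI split into real/imaginary channels.

    For input z = x_r + j*x_i and kernel W = W_r + j*W_i, the output is
    (W_r*x_r - W_i*x_i) + j*(W_r*x_i + W_i*x_r), built from two real convs.
    """

    def __init__(self, in_ch, out_ch, kernel_size, dilation=1):
        super().__init__()
        pad = (kernel_size - 1) * dilation  # causal padding, trimmed in forward()
        self.conv_r = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, dilation=dilation)
        self.conv_i = nn.Conv1d(in_ch, out_ch, kernel_size, padding=pad, dilation=dilation)
        self.pad = pad

    def forward(self, x_r, x_i):
        real = self.conv_r(x_r) - self.conv_i(x_i)
        imag = self.conv_r(x_i) + self.conv_i(x_r)
        if self.pad:  # drop the future-looking tail to keep the conv causal
            real, imag = real[..., :-self.pad], imag[..., :-self.pad]
        return real, imag


class TemporalBlock(nn.Module):
    """TCN-style residual block built around the complex convolution above."""

    def __init__(self, in_ch, out_ch, kernel_size, dilation):
        super().__init__()
        self.conv = ComplexConv1d(in_ch, out_ch, kernel_size, dilation)
        self.act = nn.ReLU()
        # 1x1 conv matches channel counts for the residual path when needed.
        self.down = nn.Conv1d(in_ch, out_ch, 1) if in_ch != out_ch else nn.Identity()

    def forward(self, x_r, x_i):
        y_r, y_i = self.conv(x_r, x_i)
        y_r, y_i = self.act(y_r), self.act(y_i)
        return y_r + self.down(x_r), y_i + self.down(x_i)  # residual connection


class LCTCN(nn.Module):
    """Stacked temporal blocks with dilation 1, 2, 4, ... and a linear classifier."""

    def __init__(self, n_subcarriers, n_classes, channels=(32, 32, 32), kernel_size=3):
        super().__init__()
        blocks, in_ch = [], n_subcarriers
        for i, out_ch in enumerate(channels):
            blocks.append(TemporalBlock(in_ch, out_ch, kernel_size, dilation=2 ** i))
            in_ch = out_ch
        self.blocks = nn.ModuleList(blocks)
        self.fc = nn.Linear(2 * in_ch, n_classes)  # concatenated real/imag features

    def forward(self, x_r, x_i):  # each input: (batch, subcarriers, time)
        for blk in self.blocks:
            x_r, x_i = blk(x_r, x_i)
        feat = torch.cat([x_r.mean(dim=-1), x_i.mean(dim=-1)], dim=-1)
        return self.fc(feat)


if __name__ == "__main__":
    model = LCTCN(n_subcarriers=30, n_classes=6)
    print(sum(p.numel() for p in model.parameters()))  # rough parameter count
    logits = model(torch.randn(4, 30, 128), torch.randn(4, 30, 128))
    print(logits.shape)  # torch.Size([4, 6])
```

With three narrow blocks of 1D convolutions, the parameter count stays in the tens of thousands, which is consistent with the lightweight, sub-megabyte budget the abstract reports (0.17 M parameters), though the exact figure depends on channel widths and kernel sizes not given here.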

Details

Language :
English
ISSN :
14248220
Volume :
23
Issue :
24
Database :
Academic Search Index
Journal :
Sensors (14248220)
Publication Type :
Academic Journal
Accession number :
174463232
Full Text :
https://doi.org/10.3390/s23249681