
Bi-LSTM Network for Multimodal Continuous Human Activity Recognition and Fall Detection.

Authors :
Li, Haobo
Shrestha, Aman
Heidari, Hadi
Le Kernec, Julien
Fioranelli, Francesco
Source :
IEEE Sensors Journal; Feb2020, Vol. 20 Issue 3, p1191-1201, 11p
Publication Year :
2020

Abstract

This paper presents a framework based on a multi-layer bi-LSTM (bidirectional Long Short-Term Memory) network for multimodal sensor fusion to sense and classify patterns of daily activities and high-risk events such as falls. The data collected in this work are continuous activity streams from FMCW radar and three wearable inertial sensors on the wrist, waist, and ankle. Each activity has a variable duration in the data stream, so transitions between activities can happen at random times within the stream, without resorting to conventional fixed-duration snapshots. The proposed bi-LSTM implements soft feature fusion between wearable-sensor and radar data, as well as two robust hard-fusion methods using the confusion matrices of both sensors. A novel hybrid fusion scheme is then proposed that combines soft and hard fusion to push classification performance to approximately 96% accuracy in identifying continuous activities and fall events. These fusion schemes, implemented with the proposed bi-LSTM network, are compared with the conventional sliding-window approach, and all are validated with a realistic "leave one participant out" (L1PO) method (i.e., testing on subjects unknown to the classifier). The developed hybrid-fusion approach is capable of stabilizing classification performance across participants, reducing accuracy variance by up to 18.1% and increasing minimum (worst-case) accuracy by up to 16.2%. [ABSTRACT FROM AUTHOR]
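The soft feature fusion the abstract describes, i.e. concatenating per-frame radar and inertial features before feeding them to a bidirectional LSTM, can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensions, the number of activity classes, and the plain-numpy LSTM with random untrained weights are all assumptions made for compactness (a real system would use a trained multi-layer network in a deep-learning framework).

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h, c, W, U, b):
    """One LSTM step; gate pre-activations stacked as [input, forget, cell, output]."""
    H = h.shape[0]
    z = W @ x + U @ h + b               # shape (4H,)
    i = sigmoid(z[:H])                  # input gate
    f = sigmoid(z[H:2 * H])             # forget gate
    g = np.tanh(z[2 * H:3 * H])         # candidate cell state
    o = sigmoid(z[3 * H:])              # output gate
    c_new = f * c + i * g
    h_new = o * np.tanh(c_new)
    return h_new, c_new

def lstm_pass(X, W, U, b, H):
    """Run an LSTM over a (T, D) sequence, returning all (T, H) hidden states."""
    h, c = np.zeros(H), np.zeros(H)
    outs = []
    for x in X:
        h, c = lstm_step(x, h, c, W, U, b)
        outs.append(h)
    return np.array(outs)

def bilstm(X, params_fwd, params_bwd, H):
    """Bidirectional pass: forward states concatenated with time-reversed backward states."""
    fwd = lstm_pass(X, *params_fwd, H)
    bwd = lstm_pass(X[::-1], *params_bwd, H)[::-1]
    return np.concatenate([fwd, bwd], axis=1)  # shape (T, 2H)

rng = np.random.default_rng(0)
T = 8                                        # frames in this snippet of the stream
radar = rng.standard_normal((T, 6))          # hypothetical per-frame radar features
inertial = rng.standard_normal((T, 9))       # hypothetical wrist/waist/ankle features
fused = np.concatenate([radar, inertial], axis=1)  # soft feature fusion: (T, 15)

D, H = fused.shape[1], 16
def init_params():
    # Random, untrained weights purely for illustration
    return (0.1 * rng.standard_normal((4 * H, D)),
            0.1 * rng.standard_normal((4 * H, H)),
            np.zeros(4 * H))

out = bilstm(fused, init_params(), init_params(), H)       # (T, 2H) fused states
logits = out[-1] @ (0.1 * rng.standard_normal((2 * H, 5))) # 5 hypothetical classes
print(out.shape)   # (8, 32)
```

Concatenating the forward and backward hidden states is what lets each frame's representation draw on both past and future context, which is why bidirectional recurrence suits continuous streams where activity transitions occur at unknown times.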

Details

Language :
English
ISSN :
1530-437X
Volume :
20
Issue :
3
Database :
Complementary Index
Journal :
IEEE Sensors Journal
Publication Type :
Academic Journal
Accession number :
141381908
Full Text :
https://doi.org/10.1109/JSEN.2019.2946095