1. Analysis of Near-Fall Detection Method Utilizing Dynamic Motion Images and Transfer Learning
- Author
Jung-Yeon Kim, Nab Mat, Chomyong Kim, Awais Khan, Hyo-Wook Gil, Jiwon Lyu, Euyhyun Chung, Kwang Seock Kim, Seob Jeon, and Yunyoung Nam
- Subjects
Near-fall detection, CNN, dynamic image, rank pooling, fusion, transfer learning, Electrical engineering. Electronics. Nuclear engineering, TK1-9971
- Abstract
This study explores a model for detecting fall, non-fall, and near-fall events, since frequent near-falls are closely associated with a heightened risk of falling, and detecting them can lead to more accurate fall prediction. However, near-falls exhibit movement patterns similar to actual falls, making it challenging to distinguish between near-fall events and falls. We investigated the detection of fall-related activities, including falls, near-falls, and non-falls, using dynamic motion images derived from video clips. Two primary classification approaches were compared: a vanilla convolutional neural network (CNN) model, and a transfer learning approach that uses InceptionV3 and DenseNet201 models as feature extractors and trains conventional machine learning classifiers such as support vector machine (SVM), K-nearest neighbors, decision tree, random forest, and adaptive boosting models. For binary classification of fall and non-fall events, the vanilla CNN model achieved a high accuracy of 97.89%, compared to a maximum accuracy of 95.54% for the transfer learning approach. In contrast, the transfer learning approach, which fused features from InceptionV3 and DenseNet201 and fed them into machine learning classifiers, achieved an accuracy of up to 90.14% on the three-class classification of fall, non-fall, and near-fall events. The findings underscore the model's robustness in detecting various fall-related activities, highlighting its potential for improving safety in at-risk populations.
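The abstract describes the transfer learning approach only at a high level, so the following is a minimal sketch of how such a pipeline is commonly assembled, not the authors' code: pretrained InceptionV3 and DenseNet201 backbones act as frozen feature extractors over dynamic motion images, their pooled features are concatenated (fused), and a conventional classifier (an SVM here) is trained on top. The image size, preprocessing, variable names, and classifier hyperparameters are assumptions.

```python
# Hedged sketch: fused InceptionV3 + DenseNet201 features feeding an SVM.
import numpy as np
import tensorflow as tf
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

IMG_SIZE = (224, 224)  # assumed dynamic-image resolution

# Frozen ImageNet backbones with global average pooling as feature extractors.
inception = tf.keras.applications.InceptionV3(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,), pooling="avg")
densenet = tf.keras.applications.DenseNet201(
    include_top=False, weights="imagenet", input_shape=IMG_SIZE + (3,), pooling="avg")
inception.trainable = False
densenet.trainable = False

def extract_fused_features(images: np.ndarray) -> np.ndarray:
    """Concatenate InceptionV3 and DenseNet201 embeddings for each dynamic image."""
    f1 = inception.predict(
        tf.keras.applications.inception_v3.preprocess_input(images.copy()), verbose=0)
    f2 = densenet.predict(
        tf.keras.applications.densenet.preprocess_input(images.copy()), verbose=0)
    return np.concatenate([f1, f2], axis=1)

# X_train/X_test: (N, 224, 224, 3) dynamic motion images as float arrays;
# y_train/y_test: labels in {fall, near-fall, non-fall}. These are hypothetical
# variables standing in for the dataset described in the abstract.
# clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
# clf.fit(extract_fused_features(X_train), y_train)
# print(clf.score(extract_fused_features(X_test), y_test))
```

The same extracted feature matrix could be passed to the other classifiers mentioned in the abstract (K-nearest neighbors, decision tree, random forest, AdaBoost) by swapping the final scikit-learn estimator.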
- Published
2025