
A Comparative Study on Classifying Human Activities Using Classical Machine and Deep Learning Methods

Authors :
Ferhat Bozkurt
Source :
Arabian Journal for Science and Engineering. 47:1507-1521
Publication Year :
2021
Publisher :
Springer Science and Business Media LLC, 2021.

Abstract

Prediction of human physical activities has become a necessity for many applications that accompany the development of wearable and portable hardware such as smartwatches and smartphones. The task of Human Activity Recognition (HAR) is to recognize human physical activities, e.g., walking, sitting, and running, using data collected from sensors such as accelerometers and gyroscopes. Owing to the rapid development of ubiquitous computing technology in recent years, HAR is commonly applied in smart systems, such as smartphones, to understand a user's behavior and provide assistance. Thus, predicting daily activities such as standing, walking, and sitting has become a popular topic in machine and deep learning. The aim of this study is to predict the user's activities based on context information gathered by sensors such as gyroscopes and accelerometers. The classification algorithms studied extract features from training data and learn a classification model based on those features to predict activity. In this paper, various classical machine learning and deep learning techniques are studied and compared for human activity recognition. A comparative analysis is performed between the techniques in order to select the classifier with the best recognition performance. Experimental results show that the established Deep Neural Network (DNN) model achieved an accuracy of up to 96.81% and a mean absolute error as low as 0.03 on the publicly available UCI-HAR dataset, the best performance among the classification methods evaluated in this study for predicting human activity.
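The abstract does not include implementation details. As a rough illustration of the kind of DNN classifier it describes, the following is a minimal sketch assuming the 561-dimensional precomputed feature vectors and six activity classes of the UCI-HAR dataset; the layer sizes, dropout rate, and training hyperparameters are illustrative assumptions, not values taken from the paper.

```python
# Minimal sketch of a dense DNN classifier for UCI-HAR-style data.
# Assumptions (not from the paper): 561 precomputed features per window,
# 6 activity classes, and arbitrary layer sizes/hyperparameters.
import numpy as np
from tensorflow import keras

NUM_FEATURES = 561   # UCI-HAR ships 561 engineered features per window
NUM_CLASSES = 6      # walking, upstairs, downstairs, sitting, standing, laying

def build_model() -> keras.Model:
    model = keras.Sequential([
        keras.layers.Input(shape=(NUM_FEATURES,)),
        keras.layers.Dense(256, activation="relu"),
        keras.layers.Dropout(0.3),
        keras.layers.Dense(128, activation="relu"),
        keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    # Placeholder random data standing in for the dataset's
    # X_train.txt / y_train.txt files.
    X = np.random.rand(1000, NUM_FEATURES).astype("float32")
    y = np.random.randint(0, NUM_CLASSES, size=1000)
    model = build_model()
    model.fit(X, y, epochs=5, batch_size=64, validation_split=0.2)
```

In practice one would load the dataset's provided train/test split and report test-set accuracy, which is how a figure such as the reported 96.81% would be obtained.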

Details

ISSN :
2191-4281 and 2193-567X
Volume :
47
Database :
OpenAIRE
Journal :
Arabian Journal for Science and Engineering
Accession number :
edsair.doi...........34522c174f33400aac4c2a84dc6d459d
Full Text :
https://doi.org/10.1007/s13369-021-06008-5