
A novel framework for intelligent surveillance system based on abnormal human activity detection in academic environments

Authors :
Malek Al-Nawashi
Mohamad Saraee
Obaida M. Al-Hazaimeh
Source :
Neural Computing & Applications
Publication Year :
2016
Publisher :
Springer, 2016.

Abstract

Abnormal activity detection plays a crucial role in surveillance applications, and a surveillance system that can perform robustly in an academic environment has become an urgent need. In this paper, we propose a novel framework for an automatic real-time video-based surveillance system which can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment. To develop our system, we divided the work into three phases: a preprocessing phase, an abnormal human activity detection phase, and a content-based image retrieval phase. For moving object detection, we used the temporal-differencing algorithm and then located the motion regions using the Gaussian function. Furthermore, a shape model based on the OMEGA equation was used as a filter for the detected objects (i.e., human and non-human). For object activity analysis, we evaluated and analyzed the human activities of the detected objects. We classified the human activities into two groups, normal activities and abnormal activities, using a support vector machine. The system then provides an automatic warning in case of abnormal human activities. It also embeds a method to retrieve the detected object from the database for object recognition and identification using content-based image retrieval. Finally, a software-based simulation using MATLAB was performed, and the results of the conducted experiments showed an excellent surveillance system that can simultaneously perform tracking, semantic scene learning, and abnormality detection in an academic environment with no human intervention.
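The abstract outlines a pipeline of temporal differencing, Gaussian smoothing of the motion map, a shape-based human filter, and SVM-based normal/abnormal classification. The following is a minimal Python/OpenCV sketch of that pipeline, offered only as an illustration: the authors' actual system was implemented in MATLAB, and the feature vectors, thresholds, and the crude `looks_human` stand-in for the OMEGA shape filter here are assumptions, not the paper's code.

```python
# Illustrative sketch (not the authors' MATLAB implementation): temporal
# differencing for motion detection, Gaussian smoothing to localise motion
# regions, and an SVM separating "normal" vs. "abnormal" activity features.
import cv2
import numpy as np
from sklearn.svm import SVC

def motion_regions(prev_frame, curr_frame, thresh=25):
    """Temporal differencing followed by Gaussian smoothing."""
    prev_gray = cv2.cvtColor(prev_frame, cv2.COLOR_BGR2GRAY)
    curr_gray = cv2.cvtColor(curr_frame, cv2.COLOR_BGR2GRAY)
    diff = cv2.absdiff(curr_gray, prev_gray)        # temporal difference
    diff = cv2.GaussianBlur(diff, (5, 5), 0)        # smooth the motion map
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > 500]

def looks_human(box, min_aspect=1.2):
    """Crude stand-in for the OMEGA-based shape filter: keep upright
    bounding boxes whose height clearly exceeds their width."""
    x, y, w, h = box
    return h / max(w, 1) >= min_aspect

# Hypothetical training data: rows are activity feature vectors
# (e.g. speed and trajectory statistics); labels 0 = normal, 1 = abnormal.
X_train = np.random.rand(100, 4)
y_train = np.random.randint(0, 2, size=100)
clf = SVC(kernel="rbf").fit(X_train, y_train)

def classify_activity(features):
    """Flag an abnormal activity when the SVM predicts class 1."""
    return "abnormal" if clf.predict([features])[0] == 1 else "normal"
```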

Details

Language :
English
ISSN :
0941-0643
Database :
OpenAIRE
Journal :
Neural Computing & Applications
Accession number :
edsair.doi.dedup.....0a85810de41d6030adec00e2a3782e78