
Convolutional neural network–based person tracking using overhead views.

Authors :
Ahmad, Misbah
Ahmed, Imran
Khan, Fakhri Alam
Qayum, Fawad
Aljuaid, Hanan
Source :
International Journal of Distributed Sensor Networks; Jun2020, Vol. 16 Issue 6, p1-12, 12p
Publication Year :
2020

Abstract

In video surveillance, person tracking is considered a challenging task. Numerous computer vision, machine learning, and deep learning–based techniques have been developed in recent years, the majority of which are based on frontal-view images or video sequences. The advancement of convolutional neural networks has reformed the way objects are tracked: the network layers of convolutional neural network models, trained on large numbers of images or video sequences, improve the speed and accuracy of object tracking. In this work, the generalization performance of existing pre-trained deep learning models is investigated for overhead-view person detection and tracking under different experimental conditions. The object tracking method Generic Object Tracking Using Regression Networks (GOTURN), which has yielded outstanding tracking results in recent years, is explored for person tracking using overhead views. The work mainly focuses on overhead-view person tracking using a Faster region-based convolutional neural network (Faster R-CNN) in combination with the GOTURN architecture: the person is first detected in overhead-view video sequences and then tracked with the GOTURN tracking algorithm. The Faster R-CNN detection model achieved a true detection rate ranging from 90% to 93% with a minimum false detection rate of up to 0.5%. The GOTURN tracking algorithm achieved similar results, with a success rate ranging from 90% to 94%. Finally, the results are discussed along with future directions. [ABSTRACT FROM AUTHOR]
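The pipeline the abstract describes is detect-then-track: a detector (Faster R-CNN in the paper) localizes the person in an overhead frame, a tracker (GOTURN in the paper) follows that box through subsequent frames, and tracking quality is reported as a success rate. A minimal sketch of the scoring side, assuming the common intersection-over-union (IoU) overlap criterion — all function names and the toy boxes below are illustrative, not the authors' code:

```python
# Hypothetical sketch: score a tracker's per-frame boxes against ground
# truth using IoU, the style of metric behind success rates like the
# paper's reported 90-94%. Boxes are (x1, y1, x2, y2) pixel corners.

def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes."""
    x1 = max(box_a[0], box_b[0])
    y1 = max(box_a[1], box_b[1])
    x2 = min(box_a[2], box_b[2])
    y2 = min(box_a[3], box_b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union else 0.0

def success_rate(predicted, ground_truth, threshold=0.5):
    """Fraction of frames where the tracked box overlaps ground truth
    by at least `threshold` IoU."""
    hits = sum(1 for p, g in zip(predicted, ground_truth)
               if iou(p, g) >= threshold)
    return hits / len(ground_truth)

# Toy example: three frames of tracker output vs. ground-truth boxes.
pred = [(10, 10, 50, 50), (12, 11, 52, 51), (80, 80, 120, 120)]
gt   = [(10, 10, 50, 50), (10, 10, 50, 50), (10, 10, 50, 50)]
print(success_rate(pred, gt))  # 2 of 3 frames overlap -> 0.666...
```

In practice the detection and tracking stages themselves would come from a deep learning framework (e.g. a pre-trained Faster R-CNN for detection and a GOTURN regression network for tracking, as in the paper); only the evaluation arithmetic is shown here.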

Details

Language :
English
ISSN :
1550-1329
Volume :
16
Issue :
6
Database :
Complementary Index
Journal :
International Journal of Distributed Sensor Networks
Publication Type :
Academic Journal
Accession number :
144323906
Full Text :
https://doi.org/10.1177/1550147720934738