
SiTGRU: Single-Tunnelled Gated Recurrent Unit for Abnormality Detection

Authors :
Fanta, Habtamu
Shao, Zhiwen
Ma, Lizhuang
Source :
Information Sciences 524 (2020) 15-32
Publication Year :
2020

Abstract

Abnormality detection is a challenging task because it depends on a specific context and must cope with the unconstrained variability of practical scenarios. In recent years, it has benefited from the powerful features learnt by deep neural networks and from handcrafted features specialized for abnormality detectors. However, these high-complexity approaches are still limited in handling long-term sequential data (e.g., videos), and the features they learn do not thoroughly capture useful information. Recurrent Neural Networks (RNNs) have been shown to deal robustly with temporal data in long-term sequences. In this paper, we propose a novel version of the Gated Recurrent Unit (GRU), called the Single Tunnelled GRU, for abnormality detection. In particular, the Single Tunnelled GRU discards the heavily weighted reset gate from GRU cells, which overlooks the importance of past content by favouring only the current input, to obtain an optimized single-gated cell model. Moreover, we substitute the hyperbolic tangent activation in standard GRUs with sigmoid activation, as the former suffers from performance loss in deeper networks. Empirical results show that our proposed optimized GRU model outperforms standard GRU and Long Short-Term Memory (LSTM) networks on most metrics for detection and generalization tasks on the CUHK Avenue and UCSD datasets. The model is also computationally efficient, with reduced training and testing time over standard RNNs.

Comment: 14 pages, 11 figures, 13 tables; accepted for publication in Information Sciences
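
The abstract describes two changes to the standard GRU cell: the reset gate is removed, and the tanh candidate activation is replaced by a sigmoid. The following NumPy sketch illustrates a cell of that form; the class name SiTGRUCell, the weight names, and the use of the standard GRU update rule to combine the previous and candidate states are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np


def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))


class SiTGRUCell:
    """Single-gate GRU-style cell: no reset gate, sigmoid candidate activation.

    Illustrative sketch only; parameter names and the combination rule follow
    the standard GRU convention and are assumptions, not the paper's code.
    """

    def __init__(self, input_size, hidden_size, seed=0):
        rng = np.random.default_rng(seed)
        s = 1.0 / np.sqrt(hidden_size)
        # Update-gate parameters (the single remaining gate).
        self.W_z = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_z = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_z = np.zeros(hidden_size)
        # Candidate-state parameters (no reset gate applied to h_prev).
        self.W_h = rng.uniform(-s, s, (hidden_size, input_size))
        self.U_h = rng.uniform(-s, s, (hidden_size, hidden_size))
        self.b_h = np.zeros(hidden_size)

    def step(self, x_t, h_prev):
        # Update gate, as in a standard GRU.
        z_t = sigmoid(self.W_z @ x_t + self.U_z @ h_prev + self.b_z)
        # Candidate state uses sigmoid instead of tanh and the full h_prev,
        # since the reset gate has been discarded.
        h_tilde = sigmoid(self.W_h @ x_t + self.U_h @ h_prev + self.b_h)
        # Convex combination of previous and candidate states.
        return (1.0 - z_t) * h_prev + z_t * h_tilde


if __name__ == "__main__":
    cell = SiTGRUCell(input_size=8, hidden_size=4)
    h = np.zeros(4)
    # Run the cell over a toy sequence of 5 random frames/feature vectors.
    for x in np.random.default_rng(1).normal(size=(5, 8)):
        h = cell.step(x, h)
    print(h)
```

Compared with a standard GRU step, this removes the reset-gate matrices and the elementwise reset of h_prev, which is the source of the reduced parameter count and the shorter training/testing time reported in the abstract.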

Details

Database :
arXiv
Journal :
Information Sciences 524 (2020) 15-32
Publication Type :
Report
Accession number :
edsarx.2003.13528
Document Type :
Working Paper
Full Text :
https://doi.org/10.1016/j.ins.2020.03.034