
Visual-Attention-Based Background Modeling for Detecting Infrequently Moving Objects.

Authors :
Yuewei Lin
Yan Tong
Yu Cao
Youjie Zhou
Song Wang
Source :
IEEE Transactions on Circuits & Systems for Video Technology. Jun 2017, Vol. 27 Issue 6, p1208-1221. 14p.
Publication Year :
2017

Abstract

Motion is one of the most important cues to separate foreground objects from the background in a video. With a stationary camera, it is usually assumed that the background is static while the foreground objects are moving most of the time. In practice, however, the foreground objects may show infrequent motions, such as abandoned objects and sleeping persons, while the background may contain frequent local motions, such as waving trees and/or grass. Such complexities may prevent the existing background subtraction algorithms from correctly identifying the foreground objects. In this paper, we propose a new approach that can detect foreground objects with frequent and/or infrequent motions. Specifically, we use a visual-attention mechanism to infer a complete background from a subset of frames and then propagate it to the other frames for accurate background subtraction. Furthermore, we develop a feature-matching-based local motion stabilization algorithm to identify frequent local motions in the background and reduce false positives in the detected foreground. The proposed approach is fully unsupervised, without using any supervised learning for object detection and tracking. Extensive experiments on a large number of videos have demonstrated that the proposed approach outperforms state-of-the-art motion detection and background subtraction methods.
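The abstract describes the pipeline only at a high level. As a rough illustration of the general setting, the sketch below builds a background model from a subset of frames and suppresses frequently moving background pixels before subtraction. It is not the authors' algorithm: the median-based background, the frame-differencing motion-frequency heuristic, and all thresholds (change_thresh, freq_thresh, diff_thresh) are assumptions standing in for the paper's visual-attention-based frame selection and feature-matching-based local motion stabilization.

```python
import numpy as np

def build_background(frames, indices):
    # Per-pixel temporal median over a selected subset of frames.
    # (In the paper the subset comes from a visual-attention mechanism;
    # here the caller simply passes the frame indices.)
    subset = np.stack([frames[i] for i in indices]).astype(np.int16)
    return np.median(subset, axis=0).astype(np.int16)

def frequent_motion_mask(frames, change_thresh=25, freq_thresh=0.3):
    # Mark pixels that change between consecutive frames in a large fraction
    # of the video (e.g., waving trees/grass). This simple frame-differencing
    # heuristic stands in for the paper's feature-matching-based local motion
    # stabilization.
    changes = np.stack([
        np.abs(frames[i + 1].astype(np.int16) - frames[i].astype(np.int16)) > change_thresh
        for i in range(len(frames) - 1)
    ])
    return changes.mean(axis=0) > freq_thresh

def foreground(frame, background, dynamic_mask, diff_thresh=25):
    # Foreground = pixels far from the background model, excluding
    # regions flagged as frequently moving background.
    diff = np.abs(frame.astype(np.int16) - background) > diff_thresh
    return diff & ~dynamic_mask

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic 30-frame grayscale video, 120x160 pixels.
    frames = [np.full((120, 160), 80, dtype=np.uint8) for _ in range(30)]
    for f in frames:                    # "waving grass": top rows flicker every frame
        f[:20, :] = 80 + rng.integers(-60, 60, size=(20, 160))
    for f in frames[10:]:               # "abandoned object": appears once, then never moves
        f[40:60, 70:90] = 200
    bg = build_background(frames, indices=range(10))  # frames before the object appears
    dyn = frequent_motion_mask(frames)
    fg = foreground(frames[-1], bg, dyn)
    print("foreground pixels in last frame:", int(fg.sum()))  # roughly the 20x20 object
```

Because the abandoned object changes only once, its frame-to-frame change frequency stays low and it survives the mask, while the flickering "grass" region is suppressed as dynamic background.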

Details

Language :
English
ISSN :
1051-8215
Volume :
27
Issue :
6
Database :
Academic Search Index
Journal :
IEEE Transactions on Circuits & Systems for Video Technology
Publication Type :
Academic Journal
Accession number :
127950120
Full Text :
https://doi.org/10.1109/TCSVT.2016.2527258