
Deformable Object Tracking with Gated Fusion

Authors:
Liu, Wenxi
Song, Yibing
Chen, Dengsheng
He, Shengfeng
Yu, Yuanlong
Yan, Tao
Hancke, Gerhard P.
Lau, Rynson W. H.
Source:
IEEE Transactions on Image Processing, 2019
Publication Year:
2018

Abstract

The tracking-by-detection framework has received growing attention through its integration with Convolutional Neural Networks (CNNs). Existing tracking-by-detection methods, however, fail to track objects with severe appearance variations. This is because the traditional convolution operation is performed on a fixed grid and thus may not find the correct response when the object changes pose or undergoes varying environmental conditions. In this paper, we propose a deformable convolution layer to enrich the target appearance representations in the tracking-by-detection framework. We aim to capture the target appearance variations via deformable convolution, which adaptively enhances the original features. In addition, we propose a gated fusion scheme to control how the variations captured by the deformable convolution affect the original appearance. The enriched feature representation obtained through deformable convolution helps the CNN classifier discriminate the target object from the background. Extensive experiments on standard benchmarks show that the proposed tracker performs favorably against state-of-the-art methods.
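
The abstract describes combining a deformable convolution branch with the original CNN features through a learned gate. Below is a minimal sketch of that idea in PyTorch, assuming torchvision's DeformConv2d; the module names, channel sizes, offset predictor, and sigmoid gate are illustrative assumptions and not the authors' released implementation.

```python
import torch
import torch.nn as nn
from torchvision.ops import DeformConv2d


class GatedDeformableFusion(nn.Module):
    """Sketch: fuse original features with deformation-aware features via a gate."""

    def __init__(self, channels, kernel_size=3):
        super().__init__()
        pad = kernel_size // 2
        # Predicts 2D sampling offsets (x, y) for every kernel position.
        self.offset_conv = nn.Conv2d(channels, 2 * kernel_size * kernel_size,
                                     kernel_size, padding=pad)
        # Deformable convolution re-samples features at the predicted offsets,
        # capturing pose and appearance variations of the target.
        self.deform_conv = DeformConv2d(channels, channels, kernel_size,
                                        padding=pad)
        # Gate deciding, per location and channel, how much of the
        # deformation-aware features to mix into the original features.
        self.gate_conv = nn.Conv2d(2 * channels, channels, kernel_size=1)

    def forward(self, x):
        offsets = self.offset_conv(x)
        deformed = self.deform_conv(x, offsets)
        gate = torch.sigmoid(self.gate_conv(torch.cat([x, deformed], dim=1)))
        # Gated fusion: keep the original appearance and add gated variations.
        return x + gate * deformed


if __name__ == "__main__":
    feat = torch.randn(1, 64, 32, 32)        # CNN feature map of a search region
    fused = GatedDeformableFusion(64)(feat)
    print(fused.shape)                        # torch.Size([1, 64, 32, 32])
```

The residual-plus-gate form is one plausible reading of "control how the variations ... affect the original appearance"; the paper's actual fusion operator may differ.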

Details

Database:
arXiv
Journal:
IEEE Transactions on Image Processing, 2019
Publication Type:
Report
Accession number:
edsarx.1809.10417
Document Type:
Working Paper
Full Text:
https://doi.org/10.1109/TIP.2019.2902784