
Target-aware transformer tracking with hard occlusion instance generation.

Authors :
Xiao D
Wei Z
Zhang G
Source :
Frontiers in neurorobotics [Front Neurorobot] 2024 Jan 10; Vol. 17, pp. 1323188. Date of Electronic Publication: 2024 Jan 10 (Print Publication: 2023).
Publication Year :
2024

Abstract

Visual tracking is a crucial task in computer vision that has been applied in diverse fields. Recently, the transformer architecture has been widely applied in visual tracking and has replaced the Siamese structure as the mainstream framework. Although transformer-based trackers have demonstrated remarkable accuracy in general circumstances, their performance in occluded scenes remains unsatisfactory, primarily because they cannot recognize incomplete target appearance information when the target is occluded. To address this issue, we propose a novel transformer tracking approach referred to as TATT, which integrates a target-aware transformer network and a hard occlusion instance generation module. The target-aware transformer network uses an encoder-decoder structure to facilitate interaction between template and search features, extracting target information from the template feature to enhance the unoccluded parts of the target in the search features. It can directly predict the boundary between the target region and the background to generate tracking results. The hard occlusion instance generation module employs multiple image similarity calculation methods to select the image patch in a video sequence that is most similar to the target and generates an occlusion instance mimicking real scenes, without adding an extra network. Experiments on five benchmarks, including LaSOT, TrackingNet, GOT-10k, OTB100, and UAV123, demonstrate that our tracker achieves promising performance while running at approximately 41 fps on GPU. Specifically, our tracker achieves the highest AUC scores of 65.5% and 61.2% in partial and full occlusion evaluations on LaSOT, respectively.

Competing Interests: The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

(Copyright © 2024 Xiao, Wei and Zhang.)
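The occlusion-generation idea in the abstract can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes a single similarity metric (normalized cross-correlation) where the paper combines multiple unspecified metrics, and the function names (`select_occluder`, `occlude`) are hypothetical.

```python
import numpy as np

def select_occluder(template, candidates):
    """Pick the candidate patch most similar to the target template.

    Similarity here is normalized cross-correlation (NCC), one plausible
    choice; the paper uses several similarity measures not detailed in
    the abstract.
    """
    t = template.astype(np.float64).ravel()
    t = (t - t.mean()) / (t.std() + 1e-8)  # zero-mean, unit-variance
    best_idx, best_score = -1, -np.inf
    for i, patch in enumerate(candidates):
        p = patch.astype(np.float64).ravel()
        p = (p - p.mean()) / (p.std() + 1e-8)
        score = float(np.dot(t, p)) / t.size  # NCC in [-1, 1]
        if score > best_score:
            best_idx, best_score = i, score
    return best_idx, best_score

def occlude(frame, occluder, top_left):
    """Paste the selected patch over the target region to mimic a
    real-scene occlusion, producing a hard training instance."""
    out = frame.copy()
    y, x = top_left
    h, w = occluder.shape[:2]
    out[y:y + h, x:x + w] = occluder
    return out
```

Because the occluder is cropped from the same video and chosen for its similarity to the target, the synthetic occlusion is harder to discount than a random patch, which is the point of the module: training instances that resemble genuine occlusions without an extra generator network.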

Details

Language :
English
ISSN :
1662-5218
Volume :
17
Database :
MEDLINE
Journal :
Frontiers in neurorobotics
Publication Type :
Academic Journal
Accession number :
38268505
Full Text :
https://doi.org/10.3389/fnbot.2023.1323188