
Reverse Attention-Based Residual Network for Salient Object Detection.

Authors :
Chen, Shuhan
Tan, Xiuli
Wang, Ben
Lu, Huchuan
Hu, Xuelong
Fu, Yun
Source :
IEEE Transactions on Image Processing; 2020, Vol. 29, p3763-3776, 14p
Publication Year :
2020

Abstract

Benefiting from the rapid development of deep convolutional neural networks, especially fully convolutional networks (FCNs), remarkable progress has been achieved in salient object detection recently. Nevertheless, it remains challenging for these FCN-based methods to generate high-resolution saliency maps, and their heavy model weights make them unsuitable for subsequent applications. In this paper, we propose a compact and efficient deep network with high accuracy for salient object detection. First, we propose two strategies for the initial prediction: one is a newly designed multi-scale context module, and the other is the incorporation of hand-crafted saliency priors. Second, we employ residual learning to refine the prediction progressively by learning only the residual in each side-output, which can be achieved with few convolutional parameters and therefore leads to high compactness and efficiency. Finally, we design a novel top-down reverse attention block to guide the above side-output residual learning. Specifically, the currently predicted salient regions are used to erase the corresponding side-output features, so that the missing object parts and details can be efficiently learned from the unerased regions, resulting in more complete detection and higher accuracy. Extensive experimental results on seven benchmark datasets demonstrate that the proposed network performs favorably against state-of-the-art approaches and shows advantages in simplicity, compactness, and efficiency. [ABSTRACT FROM AUTHOR]
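To make the core mechanism concrete, below is a minimal PyTorch sketch of one refinement stage combining reverse attention with side-output residual learning, as described in the abstract. The module name `ReverseAttentionRefine`, the channel sizes, and the two-layer residual head are illustrative assumptions, not the authors' released architecture; only the erase-then-refine logic follows the paper's description.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ReverseAttentionRefine(nn.Module):
    """One top-down refinement stage: reverse attention + residual learning.

    A sketch of the idea from the abstract; layer sizes and the exact
    residual head are hypothetical.
    """

    def __init__(self, in_channels: int, mid_channels: int = 64):
        super().__init__()
        # Lightweight residual head: only a few convolutional parameters
        # per side-output, which is what keeps the network compact.
        self.residual_head = nn.Sequential(
            nn.Conv2d(in_channels, mid_channels, 3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid_channels, 1, 3, padding=1),
        )

    def forward(self, side_feature: torch.Tensor,
                coarse_logits: torch.Tensor) -> torch.Tensor:
        # Upsample the coarser prediction to this side-output's resolution.
        coarse = F.interpolate(coarse_logits, size=side_feature.shape[2:],
                               mode='bilinear', align_corners=False)
        # Reverse attention: 1 - sigmoid(prediction) weights regions NOT yet
        # predicted salient, "erasing" already-detected regions from the
        # side-output feature.
        reverse_attn = 1.0 - torch.sigmoid(coarse)
        erased = side_feature * reverse_attn
        # Learn only the residual from the unerased regions and add it to
        # the upsampled coarse prediction.
        residual = self.residual_head(erased)
        return coarse + residual
```

In a full top-down pass, the initial prediction (from the multi-scale context module and saliency priors) would be refined through a stack of such stages, one per side-output, with each stage's refined logits feeding the next finer stage.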

Details

Language :
English
ISSN :
1057-7149
Volume :
29
Database :
Complementary Index
Journal :
IEEE Transactions on Image Processing
Publication Type :
Academic Journal
Accession number :
170078234
Full Text :
https://doi.org/10.1109/TIP.2020.2965989