
Layer-Output Guided Complementary Attention Learning for Image Defocus Blur Detection.

Authors :
Li, Jinxing
Fan, Dandan
Yang, Lingxiao
Gu, Shuhang
Lu, Guangming
Xu, Yong
Zhang, David
Source :
IEEE Transactions on Image Processing. 2021, Vol. 30, p3748-3763. 16p.
Publication Year :
2021

Abstract

Defocus blur detection (DBD), which has been widely applied to various fields, aims to detect out-of-focus or in-focus pixels in a single image. Although deep-learning-based methods applied to DBD have outperformed hand-crafted-feature-based methods, their performance still falls short of requirements. In this paper, a novel network is established for DBD. Unlike existing methods, which only learn the projection from the in-focus part to the ground truth, both in-focus and out-of-focus pixels, which are completely and symmetrically complementary, are taken into account. Specifically, two symmetric branches are designed to jointly estimate the probabilities of focus and defocus pixels, respectively. Owing to their complementary constraint, each layer in one branch is modulated by an attention obtained from the other branch, effectively learning detailed information that may be ignored by a single branch. The feature maps from these two branches are then passed through a unique fusion block to simultaneously obtain the two-channel output measured by a complementary loss. Additionally, instead of estimating only one binary map from a specific layer, each layer is encouraged to estimate the ground truth so as to guide the binary-map estimation in its linked shallower layer, followed by a top-to-bottom combination strategy that gradually exploits global and local information. Experimental results on publicly released datasets demonstrate that our proposed method remarkably outperforms state-of-the-art algorithms. [ABSTRACT FROM AUTHOR]
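The abstract describes a two-branch architecture with cross-branch attention and a fusion block that yields a two-channel output. Below is a minimal PyTorch sketch of that idea, not the authors' implementation: the module names (Branch, ComplementaryDBD), the single-scale features, and the specific form of the cross-branch attention (re-weighting one branch's features by the complement of the other branch's prediction) are assumptions made here for illustration, and the layer-output-guided top-to-bottom combination is omitted.

```python
# Hypothetical sketch of the complementary two-branch idea from the abstract.
# All names and the attention form are assumptions, not the paper's design.
import torch
import torch.nn as nn


class Branch(nn.Module):
    """One of the two symmetric branches (focus or defocus)."""

    def __init__(self, in_ch=3, feat_ch=32):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(feat_ch, feat_ch, 3, padding=1), nn.ReLU(inplace=True),
        )
        self.head = nn.Conv2d(feat_ch, 1, 1)  # per-pixel probability map

    def forward(self, x, cross_attention=None):
        f = self.features(x)
        if cross_attention is not None:
            # Attention derived from the complementary branch re-weights this
            # branch's features, recovering detail missed by a single branch.
            f = f * cross_attention
        return f, torch.sigmoid(self.head(f))


class ComplementaryDBD(nn.Module):
    """Two symmetric branches plus a fusion block with a 2-channel output."""

    def __init__(self, feat_ch=32):
        super().__init__()
        self.focus_branch = Branch(feat_ch=feat_ch)
        self.defocus_branch = Branch(feat_ch=feat_ch)
        self.fusion = nn.Conv2d(2 * feat_ch, 2, 1)  # two-channel output

    def forward(self, x):
        # First pass: each branch produces an initial probability map.
        _, p_focus = self.focus_branch(x)
        _, p_defocus = self.defocus_branch(x)
        # Second pass: each branch is modulated by attention from the other
        # branch (complementary constraint: in-focus = 1 - out-of-focus).
        f_focus, p_focus = self.focus_branch(x, cross_attention=1.0 - p_defocus)
        f_defocus, p_defocus = self.defocus_branch(x, cross_attention=1.0 - p_focus)
        # Fusion block combines both feature maps into a two-channel map
        # (channel 0: in-focus, channel 1: out-of-focus), to be trained
        # with a complementary loss.
        out = self.fusion(torch.cat([f_focus, f_defocus], dim=1))
        return out, p_focus, p_defocus


if __name__ == "__main__":
    model = ComplementaryDBD()
    image = torch.randn(1, 3, 128, 128)
    out, p_focus, p_defocus = model(image)
    print(out.shape)  # torch.Size([1, 2, 128, 128])
```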

Details

Language :
English
ISSN :
1057-7149
Volume :
30
Database :
Academic Search Index
Journal :
IEEE Transactions on Image Processing
Publication Type :
Academic Journal
Accession number :
170077735
Full Text :
https://doi.org/10.1109/TIP.2021.3065171