
Guided Attention Inference Network.

Authors :
Li, Kunpeng
Wu, Ziyan
Peng, Kuan-Chuan
Ernst, Jan
Fu, Yun
Source :
IEEE Transactions on Pattern Analysis & Machine Intelligence. Dec2020, Vol. 42 Issue 12, p2996-3010. 15p.
Publication Year :
2020

Abstract

With only coarse labels, weakly supervised learning typically uses top-down attention maps generated by back-propagating gradients as priors for tasks such as object localization and semantic segmentation. While these attention maps are intuitive and informative explanations of deep neural networks, there is no effective mechanism to manipulate the network's attention during the learning process. In this paper, we address three shortcomings of previous approaches to modeling such attention maps in one common framework. First, we make attention maps a natural and explicit component of the training pipeline so that they are end-to-end trainable. Second, we provide self-guidance directly on these maps by exploring supervision from the network itself to improve them towards specific target tasks. Lastly, we propose a design that seamlessly bridges the gap between using weak supervision and extra supervision when available. Despite its simplicity, experiments on the semantic segmentation task demonstrate the effectiveness of our methods. In addition, the proposed framework provides a way not only to explain the focus of the learner but also to feed back direct guidance towards specific tasks. Under mild assumptions, our method can also be understood as a plug-in to existing convolutional neural networks to improve their generalization performance. [ABSTRACT FROM AUTHOR]
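The abstract's two key ingredients, a gradient-derived top-down attention map and a soft mask used to supervise the network's own attention, can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation; the function names, the use of precomputed gradients, and the masking parameters `omega` and `sigma` are assumptions made for clarity.

```python
import numpy as np

def attention_map(features, grads):
    """Grad-CAM-style top-down attention map (a sketch, not the paper's code).
    features: (C, H, W) conv activations for one image.
    grads:    (C, H, W) gradients of the target class score w.r.t. features
              (assumed to be precomputed here)."""
    weights = grads.mean(axis=(1, 2))                 # global-average-pool the gradients
    cam = (weights[:, None, None] * features).sum(axis=0)
    cam = np.maximum(cam, 0.0)                        # keep positively contributing regions
    return cam / (cam.max() + 1e-8)                   # normalize to [0, 1]

def soft_mask(image, cam, omega=10.0, sigma=0.5):
    """Soft-mask the image where attention is high. A self-guidance loss can
    then penalize a high class score on the masked image, pushing attention
    to cover the whole object. omega/sigma are illustrative gate parameters."""
    gate = 1.0 / (1.0 + np.exp(-omega * (cam - sigma)))  # sharpened sigmoid in [0, 1]
    return image * (1.0 - gate)[None, :, :]              # suppress attended pixels
```

In a full training loop the masked image would be fed through the same network, and the class score on it minimized jointly with the ordinary classification loss, which is what makes the attention maps end-to-end trainable rather than post-hoc explanations.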

Details

Language :
English
ISSN :
01628828
Volume :
42
Issue :
12
Database :
Academic Search Index
Journal :
IEEE Transactions on Pattern Analysis & Machine Intelligence
Publication Type :
Academic Journal
Accession number :
146892158
Full Text :
https://doi.org/10.1109/TPAMI.2019.2921543