ADF‐Net: Attention‐guided deep feature decomposition network for infrared and visible image fusion

Authors :
Sen Shen
Taotao Zhang
Haidi Dong
ShengZhi Yuan
Min Li
RenKai Xiao
Xiaohui Zhang
Source :
IET Image Processing, Vol. 18, Iss. 10, pp. 2774-2787 (2024)
Publication Year :
2024
Publisher :
Wiley, 2024.

Abstract

To make full use of the complementary features of infrared and visible images and thereby enhance information acquisition, widely used image fusion algorithms must contend with challenges such as information loss and image blurring. In response, the authors propose a dual‐branch deep hierarchical fusion network (ADF‐Net) guided by an attention mechanism. First, an attention convolution module extracts the shallow features of the image. Next, a dual‐branch deep decomposition feature extractor is introduced, wherein the transformer encoder block (TEB) employs long‐range attention to process low‐frequency global features, while the CNN encoder block (CEB) extracts high‐frequency local information. Finally, a global fusion layer based on the TEB and a local fusion layer based on the CEB produce the fused image through the decoder. Multiple experiments demonstrate that ADF‐Net, trained and tested with a two‐stage strategy and an appropriate loss function, excels in various aspects.
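The low-/high-frequency decomposition-and-fusion idea described in the abstract can be illustrated with a minimal NumPy sketch. This is a hedged stand-in, not the authors' method: a simple box-blur low-pass replaces the learned TEB branch, the blur residual replaces the CEB branch, averaging stands in for the global fusion layer, and a max-absolute rule stands in for the local fusion layer.

```python
import numpy as np

def decompose(img, k=5):
    """Split an image into low-frequency (box-blurred) and high-frequency
    (residual) parts. A crude stand-in for the learned TEB/CEB extractors."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    low = np.empty_like(img, dtype=float)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            low[i, j] = padded[i:i + k, j:j + k].mean()  # box filter
    high = img - low  # residual carries edges and fine texture
    return low, high

def fuse(ir, vis):
    """Fuse an infrared and a visible image of the same shape."""
    ir_low, ir_high = decompose(ir)
    vis_low, vis_high = decompose(vis)
    # Global (low-frequency) content: average the two bases.
    fused_low = 0.5 * (ir_low + vis_low)
    # Local (high-frequency) detail: keep whichever response is stronger.
    fused_high = np.where(np.abs(ir_high) >= np.abs(vis_high),
                          ir_high, vis_high)
    return fused_low + fused_high
```

Note the decomposition is exactly invertible (`low + high` reconstructs the input), which is what makes the per-band fusion rules safe to recombine by simple addition.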

Details

Language :
English
ISSN :
1751-9667 and 1751-9659
Volume :
18
Issue :
10
Database :
Directory of Open Access Journals
Journal :
IET Image Processing
Publication Type :
Academic Journal
Accession number :
edsdoj.6105f03331e4406ebcc44b420d6e0398
Document Type :
Article
Full Text :
https://doi.org/10.1049/ipr2.13134