1. Multi-scale convolutional neural networks and saliency weight maps for infrared and visible image fusion.
- Authors
-
Yang, Chenxuan, He, Yunan, Sun, Ce, Chen, Bingkun, Cao, Jie, Wang, Yongtian, and Hao, Qun
- Subjects
-
*IMAGE fusion, *ARTIFICIAL neural networks, *INFRARED image converters, *IMAGE processing, *DEEP learning
- Abstract
• An image fusion method is proposed that effectively highlights infrared target features and retains visible visual information.
• Infrared targets are analyzed based on the saliency weight map.
• The target contour and background texture are refined by a weight-refinement method based on guided filtering.
• A multi-layer fusion strategy fuses saliency weight maps at multiple scales.

Image fusion combines multiple images of the same scene to produce a single, more informative image; infrared and visible image fusion is an important branch of the field. To tackle the issues of diminished luminosity of the infrared target, inconspicuous target features, and blurred texture in the fused image, this paper introduces a novel and effective fusion framework that merges multi-scale Convolutional Neural Networks (CNNs) with saliency weight maps. First, the method measures the source image features to estimate an initial saliency weight map. Then, the initial weight map is segmented and optimized with a guided filter before being further processed by the CNN. Next, a trained Siamese convolutional network solves the two key problems of activity measurement and weight assignment. Meanwhile, a multi-layer fusion strategy is designed to retain the luminance of the infrared target and the texture information of the visible background. Finally, the fusion coefficients are adjusted adaptively using saliency. Experimental results show that the method outperforms state-of-the-art algorithms in both subjective visual quality and objective evaluation. [ABSTRACT FROM AUTHOR]
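The saliency-weighted fusion idea summarized above can be illustrated with a minimal sketch. This is not the paper's actual pipeline: the saliency measure here is a toy global-contrast estimate, the smoothing is a plain box filter standing in for guided-filter refinement, and there is no CNN stage; `saliency_weight`, `box_smooth`, and `fuse` are hypothetical helper names introduced for illustration only.

```python
import numpy as np

def saliency_weight(ir: np.ndarray) -> np.ndarray:
    """Toy saliency estimate: deviation from the global mean intensity,
    normalized to [0, 1]. A stand-in for the paper's saliency measure."""
    sal = np.abs(ir - ir.mean())
    rng = sal.max() - sal.min()
    return (sal - sal.min()) / rng if rng > 0 else np.zeros_like(sal)

def box_smooth(w: np.ndarray, r: int = 1) -> np.ndarray:
    """Crude box-filter smoothing of the weight map; a placeholder for
    the guided-filter weight refinement used in the paper."""
    pad = np.pad(w, r, mode="edge")
    out = np.zeros_like(w)
    k = 2 * r + 1
    for dy in range(k):
        for dx in range(k):
            out += pad[dy:dy + w.shape[0], dx:dx + w.shape[1]]
    return out / (k * k)

def fuse(ir: np.ndarray, vis: np.ndarray) -> np.ndarray:
    """Pixel-wise weighted fusion: salient (bright) infrared regions
    receive higher weight, the rest is taken from the visible image."""
    w = box_smooth(saliency_weight(ir))
    return w * ir + (1.0 - w) * vis
```

With both inputs scaled to [0, 1], the fused output stays in [0, 1], and a bright infrared target raises the fused intensity around its location while the visible background dominates elsewhere.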
- Published
- 2024