
RepDehazeNet: Dual subnets image dehazing network based on structural re-parameterization.

Authors :
Luo, Xiaozhong
Zhong, Han
Lu, Junjie
Meng, Chen
Han, Xu
Source :
Computers & Graphics. Feb 2024, Vol. 118, p. 71–79. 9p.
Publication Year :
2024

Abstract

Image dehazing has advanced rapidly in recent years, and several deep learning techniques perform remarkably well on homogeneous dehazing. Nonetheless, most current approaches are formulated for homogeneous haze, an assumption that often fails in real-world scenes because haze dispersion is uncertain. In this paper, we propose a dehazing model named RepDehazeNet that combines a structural re-parameterization encoder–decoder subnet with a full-resolution attention subnet. Specifically, the structural re-parameterization idea is introduced into the encoder–decoder subnet to strengthen feature extraction for dehazing and to improve inference speed. RepDehazeNet is compared with seven state-of-the-art (SOTA) models on different datasets in terms of PSNR, SSIM, parameter count, and inference time. Compared with DW-GAN, RepDehazeNet uses 2.7 million fewer parameters, improves inference speed by 90.3%, and achieves a 0.5 dB higher PSNR on the NH-Haze2021 dataset. The experimental results demonstrate that RepDehazeNet effectively improves both real-time performance and dehazing accuracy on synthetic and nonhomogeneous haze images.

• Structural re-parameterization dehazing network: outstanding performance at faster speed.
• Replacing Tanh with ReLU leads to better results.
• Transfer learning addresses the problem of insufficient samples.
• The dual-subnet method proves highly effective on datasets of different scales. [ABSTRACT FROM AUTHOR]
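The abstract's speed claim rests on structural re-parameterization: a block that is trained with several parallel branches can be algebraically collapsed into a single convolution at inference time, since convolution is linear in its kernel and bias. The sketch below illustrates this idea in the common RepVGG style (fusing parallel 3×3 and 1×1 branches); it is a minimal pure-Python illustration, not the authors' actual RepDehazeNet layers, whose exact branch structure is not given in this abstract.

```python
def pad_1x1_to_3x3(k1x1):
    """Embed a scalar 1x1 kernel at the center of a zero 3x3 kernel."""
    k = [[0.0] * 3 for _ in range(3)]
    k[1][1] = k1x1
    return k

def fuse_branches(k3x3, k1x1, b3x3, b1x1):
    """Merge parallel 3x3 and 1x1 conv branches into one 3x3 kernel + bias.

    Valid because convolution is linear: conv(x, A) + conv(x, B) == conv(x, A + B).
    """
    padded = pad_1x1_to_3x3(k1x1)
    fused = [[k3x3[i][j] + padded[i][j] for j in range(3)] for i in range(3)]
    return fused, b3x3 + b1x1

def conv3x3(image, kernel, bias):
    """Valid (no-padding) 3x3 cross-correlation on a 2-D list of floats."""
    h, w = len(image), len(image[0])
    out = []
    for i in range(h - 2):
        row = []
        for j in range(w - 2):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(3) for dj in range(3))
            row.append(s + bias)
        out.append(row)
    return out

if __name__ == "__main__":
    img = [[float(i * 4 + j) for j in range(4)] for i in range(4)]
    k3 = [[0.1, 0.0, -0.1], [0.2, 0.5, 0.2], [-0.1, 0.0, 0.1]]
    k1, b3, b1 = 0.3, 0.05, -0.02

    # Training-time view: run both branches, then add their outputs.
    out_3x3 = conv3x3(img, k3, b3)
    out_1x1 = conv3x3(img, pad_1x1_to_3x3(k1), b1)
    summed = [[a + b for a, b in zip(r1, r2)] for r1, r2 in zip(out_3x3, out_1x1)]

    # Inference-time view: one fused convolution reproduces the same output.
    fk, fb = fuse_branches(k3, k1, b3, b1)
    fused_out = conv3x3(img, fk, fb)
    assert all(abs(a - b) < 1e-9
               for r1, r2 in zip(summed, fused_out)
               for a, b in zip(r1, r2))
```

After fusion, the inference graph runs one convolution per block instead of several, which is the mechanism behind the parameter and latency savings the abstract reports.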

Details

Language :
English
ISSN :
0097-8493
Volume :
118
Database :
Academic Search Index
Journal :
Computers & Graphics
Publication Type :
Academic Journal
Accession number :
176247053
Full Text :
https://doi.org/10.1016/j.cag.2023.12.001