
Rain Streak Removal From Light Field Images.

Authors :
Ding, Yuyang
Li, Mingyue
Yan, Tao
Zhang, Fan
Liu, Yuan
Lau, Rynson W. H.
Source :
IEEE Transactions on Circuits & Systems for Video Technology; Feb2022, Vol. 32 Issue 2, p467-482, 16p
Publication Year :
2022

Abstract

Rain is a common weather condition that can seriously degrade the performance of outdoor computer vision systems, such as surveillance and autonomous navigation. Rain streaks may exhibit diverse appearances in captured images, depending on their distances from the camera. For example, sparse rain streaks near the camera lens may appear as continuous and translucent strips, while distant, densely accumulated rain streaks look more like fog and mist. Existing rain removal methods are mainly based on a single input image. However, from a single image it is difficult to estimate a reliable depth map for rain removal. A light field image (LFI) records abundant structural and texture information of the target scene by capturing multi-perspective sub-aperture views with a single exposure. With an LFI, it is easier to estimate depth maps, and rain streak locations across sub-aperture views are highly correlated. We observe that rain streaks usually have different slopes and/or chromatic values, compared with the background scene, along the epipolar plane images (EPIs) of an LFI. Thus, we propose to make use of 3D EPIs to detect rain streaks and restore the background. To this end, we propose a novel GAN architecture to remove rain streaks from an LFI. Our method takes as input a 3D EPI, i.e., a stack of sub-aperture views along the same row of a rainy LFI. It first estimates the disparity maps for the 3D EPI by utilizing an auto-encoder based depth estimation sub-network. The disparity maps, concatenated with the input sub-aperture views, are then fed into a non-local residual block, and two branched autoencoder sub-networks are used to extract rain streaks and recover rain-free sub-aperture views. Extensive experiments conducted on both synthetic (real-world-like) LFIs and real-world LFIs demonstrate the effectiveness of our method. [ABSTRACT FROM AUTHOR]
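To make the "3D EPI" input concrete, the following is a minimal sketch (not the authors' code) of how the stack of sub-aperture views along one angular row, and the classical 2D EPI slices within it, can be formed from a light field array. The 5D indexing convention `(u, v, y, x, c)` and the function names are assumptions for illustration only.

```python
import numpy as np

def make_3d_epi(light_field: np.ndarray, row: int) -> np.ndarray:
    """Stack all sub-aperture views along angular row `row`.

    light_field: array of shape (U, V, H, W, C), where (u, v) index the
    angular (sub-aperture) grid and (y, x, c) the spatial pixels/colors.
    Returns an array of shape (U, H, W, C) -- the "3D EPI" the abstract
    describes as the network input.
    """
    return light_field[:, row]  # fix angular row v = row, vary u

def make_2d_epi(epi_3d: np.ndarray, y: int) -> np.ndarray:
    """Extract the 2D epipolar-plane image at spatial row `y`.

    On this (U, W, C) slice, a static scene point traces a line whose
    slope is set by its disparity; rain streaks typically break this
    pattern with different slopes and/or chromatic values, which is the
    cue the abstract exploits for detection.
    """
    return epi_3d[:, y]

# Hypothetical usage with a 7x7-view light field of 32x48 RGB views:
lf = np.zeros((7, 7, 32, 48, 3), dtype=np.float32)
epi3d = make_3d_epi(lf, row=3)   # shape (7, 32, 48, 3)
epi2d = make_2d_epi(epi3d, y=10) # shape (7, 48, 3)
```

The 2D EPI is just a slice of the 3D stack; feeding the whole stack lets the network see both the spatial context of each view and the cross-view line structure at once.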

Details

Language :
English
ISSN :
10518215
Volume :
32
Issue :
2
Database :
Complementary Index
Journal :
IEEE Transactions on Circuits & Systems for Video Technology
Publication Type :
Academic Journal
Accession number :
155108595
Full Text :
https://doi.org/10.1109/TCSVT.2021.3063853