
Depth-guided deep filtering network for efficient single image bokeh rendering.

Authors :
Chen, Quan
Zheng, Bolun
Zhou, Xiaofei
Huang, Aiai
Sun, Yaoqi
Chen, Chuqiao
Yan, Chenggang
Yuan, Shanxin
Source :
Neural Computing & Applications. Oct2023, Vol. 35 Issue 28, p20869-20887. 19p.
Publication Year :
2023

Abstract

The bokeh effect is commonly used to highlight the main subject of an image. Limited by their small sensors, smartphone cameras are less sensitive to depth information and cannot directly produce a bokeh effect as pleasing as that of digital single-lens reflex cameras. To address this problem, a depth-guided deep filtering network, called DDFN, is proposed in this study. Specifically, a focused region detection block is designed to detect salient areas, and a depth estimation block is introduced to estimate depth maps from full-focus images. Further, combining the depth maps and focused features, an adaptive rendering block is proposed to synthesize the bokeh effect with adaptive cross 1-D filters. Both quantitative and qualitative evaluations on public datasets demonstrate that the proposed model performs favorably against state-of-the-art methods in rendering quality while incurring lower computational cost, e.g., 24.07 dB PSNR on the EBB! dataset and a 0.45 s inference time for a 512 × 768 image on a Snapdragon 865 mobile processor. [ABSTRACT FROM AUTHOR]
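The abstract does not specify how the adaptive cross 1-D filters are constructed, but the general idea of depth-guided separable filtering can be illustrated with a minimal sketch: a horizontal 1-D pass followed by a vertical 1-D pass, with a per-pixel kernel width scaled by estimated depth and the detected in-focus region left sharp. All names here (`cross_1d_bokeh`, `max_radius`) and the box-filter choice are assumptions for illustration, not the paper's learned filters:

```python
import numpy as np

def cross_1d_bokeh(image, depth, focus_mask, max_radius=5):
    """Toy depth-guided bokeh via cross 1-D (horizontal + vertical) box filters.

    image: (H, W) float array (single channel for simplicity)
    depth: (H, W) array in [0, 1]; larger = farther from the focal plane
    focus_mask: (H, W) bool array marking the detected in-focus region
    """
    # Per-pixel blur radius grows with depth (assumed linear mapping).
    radius = np.rint(depth * max_radius).astype(int)
    radius[focus_mask] = 0  # keep the focused region unblurred

    def blur_axis(img, axis):
        out = np.empty_like(img)
        n = img.shape[axis]
        for idx in np.ndindex(img.shape):
            r = radius[idx]
            lo = max(idx[axis] - r, 0)
            hi = min(idx[axis] + r + 1, n)
            window = list(idx)
            window[axis] = slice(lo, hi)
            out[idx] = img[tuple(window)].mean()
        return out

    # Cross 1-D filtering: horizontal pass, then vertical pass.
    return blur_axis(blur_axis(image, axis=1), axis=0)
```

Separable 1-D passes are what make this kind of rendering cheap enough for mobile processors: two O(k) passes replace one O(k²) 2-D convolution per pixel.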

Details

Language :
English
ISSN :
09410643
Volume :
35
Issue :
28
Database :
Academic Search Index
Journal :
Neural Computing & Applications
Publication Type :
Academic Journal
Accession number :
170899851
Full Text :
https://doi.org/10.1007/s00521-023-08852-y