
Real-Time Neural Light Field on Mobile Devices

Authors :
Cao, Junli
Wang, Huan
Chemerys, Pavlo
Shakhrai, Vladislav
Hu, Ju
Fu, Yun
Makoviichuk, Denys
Tulyakov, Sergey
Ren, Jian
Publication Year :
2022

Abstract

Recent efforts in Neural Radiance Fields (NeRF) have shown impressive results on novel view synthesis by utilizing implicit neural representations to represent 3D scenes. Because of volumetric rendering, however, NeRF inference is extremely slow, limiting its use on resource-constrained hardware such as mobile devices. Many works have been proposed to reduce the latency of running NeRF models, but most still require a high-end GPU for acceleration or extra storage memory, neither of which is available on mobile devices. Another emerging direction utilizes the neural light field (NeLF) for speedup, as only one forward pass per ray is needed to predict the pixel color. Nevertheless, to reach rendering quality similar to NeRF, the NeLF network is designed with intensive computation, which is not mobile-friendly. In this work, we propose an efficient network that runs in real time on mobile devices for neural rendering. We follow the NeLF setting to train our network. Unlike existing works, we introduce a novel network architecture that runs efficiently on mobile devices with low latency and small size, i.e., saving $15\times \sim 24\times$ storage compared with MobileNeRF. Our model achieves high-resolution generation while maintaining real-time inference for both synthetic and real-world scenes on mobile devices, e.g., $18.04$ ms (iPhone 13) to render one $1008\times756$ image of a real 3D scene. Additionally, we achieve image quality similar to NeRF and better than MobileNeRF (PSNR $26.15$ vs. $25.91$ on the real-world forward-facing dataset).

Comment: CVPR 2023. Project page: https://snap-research.github.io/MobileR2L/ Code: https://github.com/snap-research/MobileR2L/
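To make the efficiency argument in the abstract concrete, the following is a minimal, hypothetical PyTorch sketch (not the architecture proposed in the paper) contrasting a NeLF, which maps a ray directly to a color in one forward pass, with NeRF-style volume rendering, which queries the network at many sample points along each ray. All class names, layer sizes, and sampling parameters below are illustrative assumptions.

import torch
import torch.nn as nn

# Hypothetical toy models (not the paper's architecture), used only to contrast
# the two rendering paradigms described in the abstract.

class TinyNeLF(nn.Module):
    # Maps a ray (origin + direction, 6-D input) directly to an RGB color:
    # one forward pass per ray.
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(6, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 3), nn.Sigmoid(),
        )

    def forward(self, rays):        # rays: (N, 6)
        return self.net(rays)       # colors: (N, 3)

class TinyNeRF(nn.Module):
    # Maps a 3-D point to (RGB, density); rendering one ray requires querying
    # the network at many points and alpha-compositing the results.
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(3, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 4),
        )

    def render_ray(self, origin, direction, n_samples=64, near=2.0, far=6.0):
        t = torch.linspace(near, far, n_samples)
        points = origin + t[:, None] * direction       # (n_samples, 3)
        raw = self.net(points)                         # n_samples network queries per ray
        rgb = torch.sigmoid(raw[:, :3])
        sigma = torch.relu(raw[:, 3])
        delta = (far - near) / (n_samples - 1)
        alpha = 1.0 - torch.exp(-sigma * delta)        # per-sample opacity
        trans = torch.cumprod(
            torch.cat([torch.ones(1), 1.0 - alpha + 1e-10])[:-1], dim=0
        )                                              # accumulated transmittance
        weights = alpha * trans
        return (weights[:, None] * rgb).sum(dim=0)     # composited color, shape (3,)

if __name__ == "__main__":
    nelf, nerf = TinyNeLF(), TinyNeRF()
    ray = torch.randn(6)
    color_fast = nelf(ray.unsqueeze(0))                # 1 network evaluation
    color_slow = nerf.render_ray(ray[:3], ray[3:])     # n_samples network evaluations
    print(color_fast.shape, color_slow.shape)          # (1, 3) and (3,)

For a single ray, the NeLF model performs one network evaluation, whereas the NeRF-style model performs n_samples evaluations before compositing; this per-ray cost gap is what the NeLF formulation removes, and the paper's contribution is making such a per-ray network small and fast enough for mobile hardware.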

Details

Database :
arXiv
Publication Type :
Report
Accession number :
edsarx.2212.08057
Document Type :
Working Paper