
Long-Range Augmented Reality with Dynamic Occlusion Rendering.

Authors :
Sizintsev, Mikhail
Mithun, Niluthpol Chowdhury
Chiu, Han-Pang
Samarasekera, Supun
Kumar, Rakesh
Source :
IEEE Transactions on Visualization & Computer Graphics; Nov 2021, Vol. 27 Issue 11, p4236-4244, 9p
Publication Year :
2021

Abstract

Proper occlusion-based rendering is very important to achieving realism in all indoor and outdoor Augmented Reality (AR) applications. This paper addresses the problem of fast and accurate dynamic occlusion reasoning by real objects in the scene for large-scale outdoor AR applications. Conceptually, proper occlusion reasoning requires an estimate of depth for every point in the augmented scene, which is technically hard to achieve for outdoor scenarios, especially in the presence of moving objects. We propose a method to detect real objects in the scene and automatically infer their depth without explicit detailed scene modeling and without depth sensing (e.g., without sensors such as 3D LiDAR). Specifically, we employ instance segmentation of color image data to detect real dynamic objects in the scene and use either a top-down terrain elevation model or a deep learning-based monocular depth estimation model to infer their metric distance from the camera for proper occlusion reasoning in real time. The realized solution is implemented in a low-latency real-time framework for video-see-through AR and is directly extendable to optical-see-through AR. We minimize latency in depth reasoning and occlusion rendering by performing semantic object tracking and prediction in video frames.
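The core idea described in the abstract, comparing a per-object metric depth estimate against the depth of the virtual content to decide which pixels the real object should occlude, can be illustrated with a minimal sketch. This is not the authors' implementation; the per-instance masks and depths are assumed to come from an instance-segmentation network plus either the terrain elevation lookup or a monocular depth model, all of which are stand-ins here.

```python
# Minimal sketch (assumptions, not the paper's code) of depth-based occlusion
# masking: real objects whose inferred metric depth is smaller than the depth
# of the rendered virtual content occlude it.
import numpy as np

def occlusion_mask(frame_shape, instances, virtual_depth_m):
    """Return a boolean mask of pixels where real objects should occlude
    the virtual content.

    instances       -- list of (binary_mask, metric_depth_m) pairs, one per
                       detected dynamic object (depth may be scalar or per-pixel)
    virtual_depth_m -- per-pixel depth of the rendered virtual content (meters)
    """
    mask = np.zeros(frame_shape, dtype=bool)
    for obj_mask, obj_depth_m in instances:
        # The real object hides the virtual content wherever its inferred
        # depth is closer to the camera than the virtual object's depth.
        mask |= obj_mask & (obj_depth_m < virtual_depth_m)
    return mask

# Example usage with synthetic stand-ins for real model outputs:
h, w = 480, 640
person_mask = np.zeros((h, w), dtype=bool)
person_mask[100:400, 200:300] = True      # detected dynamic object (e.g., a person)
person_depth = 12.0                        # inferred ~12 m from the camera
virtual_depth = np.full((h, w), 30.0)      # virtual annotation placed at ~30 m

occ = occlusion_mask((h, w), [(person_mask, person_depth)], virtual_depth)
# Pixels in `occ` would be composited with the camera image on top of the
# virtual content, producing correct occlusion of the far annotation.
```

In a real pipeline the mask and depth would be predicted ahead of render time via the tracking and prediction step mentioned in the abstract, so the compositing itself adds little latency.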

Details

Language :
English
ISSN :
1077-2626
Volume :
27
Issue :
11
Database :
Complementary Index
Journal :
IEEE Transactions on Visualization & Computer Graphics
Publication Type :
Academic Journal
Accession number :
153764259
Full Text :
https://doi.org/10.1109/TVCG.2021.3106434