
Event-based Stereo Depth Estimation from Ego-motion using Ray Density Fusion

Authors:
Ghosh, Suman
Gallego, Guillermo
Source:
2nd International Ego4D Workshop at ECCV 2022
Publication Year:
2022

Abstract

Event cameras are bio-inspired sensors that mimic the human retina by responding to brightness changes in the scene. They produce asynchronous, spike-based output at microsecond resolution, offering advantages over traditional cameras such as high dynamic range, low motion blur, and power efficiency. Most event-based stereo methods exploit the camera's high temporal resolution and the simultaneity of events across cameras to establish matches and estimate depth. By contrast, this work investigates how to estimate depth from stereo event cameras without explicit data association, by fusing back-projected ray densities, and demonstrates its effectiveness on head-mounted camera data recorded in an egocentric fashion. Code and video are available at https://github.com/tub-rip/dvs_mcemvs

Comment: 6 pages, 3 figures, project page: https://github.com/tub-rip/dvs_mcemvs
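
The core idea of the abstract — depth without explicit event matching, by fusing back-projected ray densities — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes rectified stereo, a reference-view volume of fronto-parallel depth planes, and harmonic-mean fusion (one of the fusion choices explored in the MC-EMVS line of work); all function names and parameters here are invented for the example.

```python
import numpy as np

def ray_density(events, fx, baseline, depths, H, W):
    """Back-project each event's viewing ray through a stack of depth
    planes in a reference view and count ray crossings, producing a
    ray-density volume (DSI) of shape (H, W, D)."""
    dsi = np.zeros((H, W, len(depths)))
    for x, y in events:
        for k, z in enumerate(depths):
            # Rectified stereo assumption: a ray sampled at depth z
            # reprojects into the reference view shifted by the
            # disparity fx * baseline / z.
            xr = int(round(x + fx * baseline / z))
            if 0 <= xr < W:
                dsi[y, xr, k] += 1.0
    return dsi

def fuse_harmonic(dsi_a, dsi_b, eps=1e-6):
    """Fuse two ray-density volumes via harmonic mean: a voxel scores
    high only if rays from *both* cameras pass through it, which
    stands in for explicit event-to-event data association."""
    return 2.0 / (1.0 / (dsi_a + eps) + 1.0 / (dsi_b + eps))

def depth_map(dsi, depths, min_density=1.0):
    """Per pixel, pick the depth plane of maximum fused density and
    mask pixels where the cameras' rays never agree."""
    depth = np.asarray(depths, dtype=float)[dsi.argmax(axis=2)]
    depth[dsi.max(axis=2) < min_density] = np.nan
    return depth
```

For a synthetic point at 2 m (fx = 100, 10 cm baseline, so 5 px disparity), an event at x = 20 in the reference camera and x = 15 in the second camera yields ray densities that only coincide on the 2 m plane, so the fused volume peaks there:

```python
depths = [1.0, 2.0, 4.0]
dl = ray_density([(20, 5)], 100.0, 0.0, depths, 10, 40)  # left = reference
dr = ray_density([(15, 5)], 100.0, 0.1, depths, 10, 40)  # 10 cm baseline
dm = depth_map(fuse_harmonic(dl, dr), depths)            # dm[5, 20] == 2.0
```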

Details

Database:
arXiv
Journal:
2nd International Ego4D Workshop at ECCV 2022
Publication Type:
Report
Accession number:
edsarx.2210.08927
Document Type:
Working Paper