EV-MGRFlowNet: Motion-Guided Recurrent Network for Unsupervised Event-Based Optical Flow With Hybrid Motion-Compensation Loss

Authors :
Zhuang, Hao
Fang, Zheng
Huang, Xinjie
Hou, Kuanxu
Kong, Delei
Hu, Chenming
Source :
IEEE Transactions on Instrumentation and Measurement; 2024, Vol. 73, Issue 1, pp. 1-15, 15p
Publication Year :
2024

Abstract

Event cameras offer promising properties, such as high temporal resolution and high dynamic range. These benefits have been applied to many machine vision tasks, especially optical flow estimation. Most existing event-based works use deep learning to estimate optical flow; however, their networks have not fully exploited prior hidden states and motion flows. In addition, their supervision strategies have not fully leveraged the geometric constraints of event data to unlock the potential of the networks. In this article, we propose EV-MGRFlowNet, an unsupervised event-based optical flow estimation pipeline with motion-guided recurrent networks using a hybrid motion-compensation loss (HMC-Loss). First, we propose a feature-enhanced recurrent encoder (FER-Encoder) that fully exploits prior hidden states to obtain multilevel motion features. Then, we propose a flow-guided decoder (FG-Decoder) to integrate prior motion flows. Finally, we design an HMC-Loss to strengthen geometric constraints for more accurate alignment of events. Experimental results show that our method outperforms the current state-of-the-art (SOTA) method on the Multi Vehicle Stereo Event Camera (MVSEC) dataset, with an average reduction of approximately 22.71% in average endpoint error (AEE). To our knowledge, our method ranks first among unsupervised learning-based methods.
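The abstract does not give the exact form of the HMC-Loss, but motion-compensation objectives for unsupervised event-based flow commonly penalize misalignment of flow-warped events, e.g. via a per-pixel average-timestamp image whose energy is low when events are well aligned. The sketch below is a minimal toy illustration of that general idea in NumPy; the function name, event layout `(x, y, t)`, and loss form are assumptions for illustration, not the paper's actual loss.

```python
import numpy as np

def motion_compensation_loss(events, flow, t_ref=0.0, shape=(8, 8)):
    """Toy average-timestamp motion-compensation loss.

    events: (N, 3) array of (x, y, t) events.
    flow:   (H, W, 2) per-pixel flow in pixels per unit time.
    Each event is warped to the reference time t_ref along its flow
    vector. Accurate flow stacks events from the same scene edge onto
    the same pixel, giving a low-energy timestamp image, so smaller
    loss indicates better-aligned events. (Illustrative only; the
    paper's HMC-Loss combines several geometric terms.)
    """
    H, W = shape
    t_sum = np.zeros(shape)
    count = np.zeros(shape)
    for x, y, t in events:
        u, v = flow[int(y), int(x)]
        # warp the event to t_ref along its flow vector
        xw = int(round(x + (t_ref - t) * u))
        yw = int(round(y + (t_ref - t) * v))
        if 0 <= xw < W and 0 <= yw < H:
            t_sum[yw, xw] += t - t_ref
            count[yw, xw] += 1
    # per-pixel average timestamp of the warped events
    avg_t = np.where(count > 0, t_sum / np.maximum(count, 1), 0.0)
    return float(np.sum(avg_t ** 2))  # smaller = better alignment
```

For example, two events generated by the same edge moving at 1 px per unit time score a lower loss under the correct flow than under zero flow, which is the gradient signal an unsupervised pipeline like this one trains on.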

Details

Language :
English
ISSN :
0018-9456 and 1557-9662
Volume :
73
Issue :
1
Database :
Supplemental Index
Journal :
IEEE Transactions on Instrumentation and Measurement
Publication Type :
Periodical
Accession number :
ejs65710419
Full Text :
https://doi.org/10.1109/TIM.2024.3365160