Towards Anytime Optical Flow Estimation with Event Cameras.

Journal: Sensors (Basel, Switzerland)
Published Date:

Abstract

Event cameras respond to changes in log-brightness at the millisecond level, making them ideal for optical flow estimation. However, existing event-camera datasets provide only low-frame-rate ground-truth optical flow, limiting the research potential of event-driven optical flow. To address this challenge, we introduce a low-latency event representation, the Unified Voxel Grid (UVG), and propose EVA-Flow, an EVent-based Anytime Flow estimation network that produces high-frame-rate event optical flow with only low-frame-rate optical flow ground truth for supervision. Furthermore, we propose the Rectified Flow Warp Loss (RFWL) for the unsupervised assessment of intermediate optical flow. Comprehensive experiments on MVSEC, DSEC, and our EVA-FlowSet demonstrate that EVA-Flow achieves competitive performance, super-low latency (5 ms), time-dense motion estimation (200 Hz), and strong generalization.
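To make the "event representation" idea concrete, the sketch below shows one common way of turning an asynchronous event stream (x, y, timestamp, polarity) into a time-binned voxel grid that a network can consume. This is a generic illustration under our own assumptions, not the paper's exact Unified Voxel Grid; the function name, arguments, and bilinear time weighting are illustrative choices.

```python
import numpy as np

def events_to_voxel_grid(xs, ys, ts, ps, num_bins, height, width):
    """Illustrative sketch (not the paper's UVG): accumulate events
    (x, y, timestamp, polarity) into a (num_bins, H, W) voxel grid,
    spreading each event over its two nearest temporal bins."""
    voxel = np.zeros((num_bins, height, width), dtype=np.float32)
    if len(ts) == 0:
        return voxel

    # Normalize timestamps into the bin coordinate range [0, num_bins - 1].
    t_norm = (ts - ts[0]) / max(ts[-1] - ts[0], 1e-9) * (num_bins - 1)
    t0 = np.floor(t_norm).astype(int)
    frac = t_norm - t0                      # distance to the lower bin
    pol = np.where(ps > 0, 1.0, -1.0)       # signed polarity contribution

    # Bilinear weighting in time: each event contributes to bins t0 and t0+1.
    for bin_offset, weight in ((0, 1.0 - frac), (1, frac)):
        b = t0 + bin_offset
        valid = b < num_bins
        np.add.at(voxel,
                  (b[valid], ys[valid], xs[valid]),
                  pol[valid] * weight[valid])
    return voxel
```

Feeding such bins to the network one at a time (rather than waiting for a full window of events) is what makes low-latency, high-frame-rate flow prediction possible in principle.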

Authors

  • Yaozu Ye
    State Key Laboratory of Extreme Photonics and Instrumentation, College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, China.
  • Hao Shi
    College of Computer Science and Technology, Taiyuan University of Science and Technology, Taiyuan, China.
  • Kailun Yang
    Institute for Anthropomatics and Robotics, Karlsruhe Institute of Technology, 76131 Karlsruhe, Germany.
  • Ze Wang
    School of Traditional Chinese Materia Medica, Tianjin University of Traditional Chinese Medicine, 312 Anshan West Road, Nankai District, Tianjin 300193, China.
  • Xiaoting Yin
    State Key Laboratory of Extreme Photonics and Instrumentation, College of Optical Science and Engineering, Zhejiang University, Hangzhou 310027, China.
  • Lei Sun
    Department of Biological Engineering, Utah State University, 4105 Old Main Hill, Logan, UT 84322-4105, USA.
  • Yaonan Wang
  • Kaiwei Wang
    College of Optical Science and Engineering, Zhejiang University, Hangzhou, China.
