Microsaccade-inspired event camera for robotics.

Journal: Science Robotics
PMID:

Abstract

Neuromorphic vision sensors, or event cameras, have made visual perception with extremely low reaction time possible, opening new avenues for high-dynamic robotics applications. An event camera's output depends on both motion and texture. However, the camera fails to capture object edges that are parallel to the camera motion. This is a problem intrinsic to the sensor and therefore challenging to solve algorithmically. Human vision deals with perceptual fading through the active mechanism of small involuntary eye movements, the most prominent being microsaccades. By constantly and slightly moving the eyes during fixation, microsaccades can substantially maintain texture stability and persistence. Inspired by microsaccades, we designed an event-based perception system capable of simultaneously maintaining low reaction time and stable texture. In this design, a rotating wedge prism was mounted in front of the aperture of an event camera to redirect light and trigger events. The geometrical optics of the rotating wedge prism allows for algorithmic compensation of the additional rotational motion, resulting in a stable texture appearance and high informational output independent of external motion. The hardware device and software solution are integrated into a system, which we call the artificial microsaccade-enhanced event camera (AMI-EV). Benchmark comparisons validated the superior data quality of AMI-EV recordings in scenarios where both standard cameras and event cameras fail to deliver. Various real-world experiments demonstrated the system's potential to facilitate robotics perception for both low-level and high-level vision tasks.
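The compensation idea in the abstract can be illustrated with a minimal sketch. This is not the authors' implementation: it assumes a thin-prism model in which a wedge with deviation angle delta, spinning at a known angular velocity omega with phase phi0 (all taken as calibrated quantities), sweeps the image over a circle of radius r = f * tan(delta) pixels, so subtracting that circular offset at each event's timestamp recovers motion-stable coordinates.

```python
import math

def compensate_events(events, f_px, delta_rad, omega, phi0=0.0):
    """Remove the circular image shift induced by a rotating wedge prism.

    Assumed thin-prism model: the prism deviates the optical axis by a
    fixed angle delta_rad; spinning it at angular velocity omega (rad/s)
    translates the image along a circle of radius r = f_px * tan(delta_rad)
    pixels. Subtracting that offset, evaluated at each event's timestamp,
    yields coordinates independent of the prism's self-motion.

    events: iterable of (x, y, t, polarity) tuples.
    """
    r = f_px * math.tan(delta_rad)
    compensated = []
    for x, y, t, p in events:
        phase = omega * t + phi0  # prism orientation at event time
        compensated.append((x - r * math.cos(phase),
                            y - r * math.sin(phase), t, p))
    return compensated
```

Under this model, events triggered by a static scene point trace the circle as the prism spins, and compensation collapses them back onto a single image location.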

Authors

  • Botao He
    Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
  • Ze Wang
    College of Optical Science and Engineering, Zhejiang University, Hangzhou, China.
  • Yuan Zhou
    Zhejiang University, Hangzhou, China.
  • Jingxi Chen
    Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
  • Chahat Deep Singh
    Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
  • Haojia Li
    Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Hong Kong, China.
  • Yuman Gao
    State Key Laboratory of Industrial Control Technology, Zhejiang University, Hangzhou, China.
  • Shaojie Shen
    Department of Electronic and Computer Engineering, Hong Kong University of Science and Technology, Hong Kong, China.
  • Kaiwei Wang
    College of Optical Science and Engineering, Zhejiang University, Hangzhou, China.
  • Yanjun Cao
    Huzhou Institute of Zhejiang University, Huzhou, China.
  • Chao Xu
    State Key Laboratory of Industrial Control Technology, Zhejiang University, Hangzhou, China.
  • Yiannis Aloimonos
    Department of Computer Science, University of Maryland, College Park, MD 20742, USA.
  • Fei Gao
    State Key Laboratory of Industrial Control Technology, Zhejiang University, Hangzhou, China.
  • Cornelia Fermüller
    Department of Computer Science, University of Maryland, College Park, MD 20742, USA.