Brain-inspired multimodal hybrid neural network for robot place recognition.

Journal: Science Robotics

Abstract

Place recognition is an essential spatial intelligence capability for robots to understand and navigate the world. However, recognizing places in natural environments remains challenging for robots because of resource limitations and changing environments. In contrast, humans and animals can robustly and efficiently recognize hundreds of thousands of places under different conditions. Here, we report a brain-inspired general place recognition system, dubbed NeuroGPR, that enables robots to recognize places by mimicking the neural mechanisms of multimodal sensing, encoding, and computing through a continuum of space and time. Our system consists of a multimodal hybrid neural network (MHNN) that encodes and integrates multimodal cues from both conventional and neuromorphic sensors. Specifically, to encode different sensory cues, we built neural networks of spatial view cells, place cells, head direction cells, and time cells. To integrate these cues, we designed a multiscale liquid state machine that can process and fuse multimodal information effectively and asynchronously using diverse neuronal dynamics and bioinspired inhibitory circuits. We deployed the MHNN on Tianjic, a hybrid neuromorphic chip, and integrated it into a quadruped robot. Our results show that NeuroGPR achieves better performance than conventional and existing biologically inspired approaches, exhibiting robustness to diverse environmental uncertainties, including perceptual aliasing, motion blur, and changes in lighting or weather. Running NeuroGPR as an overall multi-neural-network workload on Tianjic showcases its advantages, with 10.5 times lower latency and 43.6% lower power consumption than the commonly used mobile robot processor Jetson Xavier NX.
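To make the liquid state machine idea concrete, the sketch below simulates a tiny spiking reservoir of leaky integrate-and-fire neurons with a fraction of inhibitory units, in the spirit of the multiscale LSM with inhibitory circuits described above. This is a minimal illustration only, not the paper's implementation: the network sizes, weight scales, time constant `tau`, and the function name `lif_reservoir` are all assumptions made for this example.

```python
import numpy as np

def lif_reservoir(spikes_in, n_neurons=100, frac_inhibitory=0.2,
                  tau=20.0, v_thresh=1.0, seed=0):
    """Toy liquid state machine reservoir (illustrative, not the paper's model).

    spikes_in : (T, n_in) array of binary input spike trains.
    Returns a (T, n_neurons) array of reservoir spikes, whose time-varying
    state could serve as a fused feature for a downstream readout.
    """
    rng = np.random.default_rng(seed)
    T, n_in = spikes_in.shape
    n_inh = int(frac_inhibitory * n_neurons)

    # Random input and recurrent weights; the first n_inh neurons project
    # only negative (inhibitory) recurrent weights, a Dale-like constraint.
    w_in = rng.normal(0.0, 0.5, (n_in, n_neurons))
    w_rec = np.abs(rng.normal(0.0, 0.1, (n_neurons, n_neurons)))
    sign = np.ones(n_neurons)
    sign[:n_inh] = -1.0
    w_rec *= sign[:, None]
    np.fill_diagonal(w_rec, 0.0)  # no self-connections

    v = np.zeros(n_neurons)            # membrane potentials
    prev = np.zeros(n_neurons)         # spikes from the previous step
    out = np.zeros((T, n_neurons))
    decay = np.exp(-1.0 / tau)         # leak factor per time step

    for t in range(T):
        # Leaky integration of feedforward input and recurrent feedback
        v = decay * v + spikes_in[t] @ w_in + prev @ w_rec
        spiked = v >= v_thresh
        out[t] = spiked
        v[spiked] = 0.0                # reset neurons that fired
        prev = out[t]
    return out
```

A "multiscale" variant in the paper's sense might run several such reservoirs with different `tau` values over the same multimodal spike streams and concatenate their states, so that fast and slow sensory dynamics are both retained for the place-recognition readout.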

Authors

  • Fangwen Yu
    Faculty of Information Engineering, China University of Geosciences and National Engineering Research Center for Geographic Information System, Wuhan, 430074, China.
  • Yujie Wu
    Institute of Agricultural Products Processing, Jiangsu Academy of Agricultural Sciences, Nanjing, 210014, PR China.
  • Songchen Ma
    Center for Brain-Inspired Computing Research (CBICR), Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China.
  • Mingkun Xu
    Department of Precision Instrument, Tsinghua University, Beijing, 100084, China; Center for Brain Inspired Computing Research, Tsinghua University, Beijing, 100084, China; Beijing Innovation Center for Future Chip, Beijing, 100084, China.
  • Hongyi Li
    State Key Laboratory of Robotics, Shenyang Institute of Automation, University of Chinese Academy of Sciences, Shenyang, Liaoning, P. R. China.
  • Huanyu Qu
    Center for Brain-Inspired Computing Research (CBICR), Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China.
  • Chenhang Song
    Center for Brain-Inspired Computing Research (CBICR), Beijing Innovation Center for Future Chip, Optical Memory National Engineering Research Center, Department of Precision Instrument, Tsinghua University, Beijing 100084, China.
  • Taoyi Wang
    School of Software, Beijing Institute of Technology, Beijing 100081, China.
  • Rong Zhao
    Pinggu District Center for Disease Control and Prevention, Beijing 101200, China.
  • Luping Shi
Center for Brain-Inspired Computing Research (CBICR), Department of Precision Instrument, Tsinghua University, Beijing 100084, China.