Towards parameter-free attentional spiking neural networks.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

Brain-inspired spiking neural networks (SNNs) are increasingly explored for their potential in spatiotemporal information modeling and their energy efficiency on emerging neuromorphic hardware. Recent works incorporate attention modules into SNNs, greatly enhancing their ability to handle sequential data. However, these parameterized attention modules impose a heavy memory burden, a resource that is tightly constrained on neuromorphic chips. To address this issue, we propose a parameter-free attention (PfA) mechanism that establishes a parameter-free linear space to bolster feature representation. The proposed PfA approach can be seamlessly integrated into spiking neurons, improving performance without adding any parameters. Experimental results on the SHD, BAE-TIDIGITS, SSC, DVS-Gesture, DVS-Cifar10, Cifar10, and Cifar100 datasets demonstrate competitive or superior classification accuracy compared with other state-of-the-art models. Furthermore, our model exhibits stronger noise robustness than both conventional SNNs and SNNs with parameterized attention mechanisms. Our code is available at https://github.com/sunpengfei1122/PfA-SNN.
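To make the idea of attention without learnable parameters concrete, the sketch below shows a generic energy-based, parameter-free attention (in the style of SimAM) gating the input current of a leaky integrate-and-fire neuron. This is an illustrative assumption, not the authors' PfA implementation; the function names, the LIF dynamics, and the choice of statistic-based weighting are all hypothetical stand-ins chosen only to show how attention can be computed purely from feature statistics.

```python
import numpy as np

def parameter_free_attention(x, lam=1e-4):
    """Hypothetical parameter-free attention (not the paper's PfA).

    Weights are derived purely from per-sample feature statistics,
    so no learnable parameters are introduced.
    """
    mu = x.mean(axis=-1, keepdims=True)            # feature mean
    var = x.var(axis=-1, keepdims=True)            # feature variance
    energy = (x - mu) ** 2 / (4.0 * (var + lam)) + 0.5
    return x * (1.0 / (1.0 + np.exp(-energy)))     # sigmoid-gated features

def lif_step(v, x, tau=2.0, v_th=1.0):
    """One leaky integrate-and-fire step with attention-modulated input."""
    i_in = parameter_free_attention(x)             # reweight input current
    v = v + (i_in - v) / tau                       # leaky integration
    spikes = (v >= v_th).astype(x.dtype)           # fire at threshold
    v = v * (1.0 - spikes)                         # hard reset after a spike
    return v, spikes

# Usage: a batch of 4 samples with 16 feature channels
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 16))
v = np.zeros_like(x)
v, s = lif_step(v, x)
```

Because the attention weights come from the data itself rather than trained projections, memory cost stays at the level of a plain SNN, which is the constraint the abstract highlights for neuromorphic chips.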

Authors

  • Pengfei Sun
    Department of Information Technology, WAVES Research Group, Ghent University, Gent, Belgium.
  • Jibin Wu
  • Paul Devos
    WAVES Research Group, Department of Information Technology, Ghent University, 4 Technologiepark 126, Zwijnaarde, 9052 Ghent, Belgium.
  • Dick Botteldooren
    WAVES Research Group, Faculty of Engineering and Architecture, Ghent University, Technologiepark 126, 9052 Gent, Belgium.