Leveraging Spiking Deep Neural Networks to Understand the Neural Mechanisms Underlying Selective Attention.

Journal: Journal of Cognitive Neuroscience
Published Date:

Abstract

Spatial attention enhances sensory processing of goal-relevant information and improves perceptual sensitivity. Yet, the specific neural mechanisms underlying the effects of spatial attention on performance are still contested. Here, we examine different attention mechanisms in spiking deep convolutional neural networks. We directly contrast the effects of precision (internal noise suppression) and of two different gain modulation mechanisms on performance in a visual search task with complex real-world images. Unlike standard artificial neurons, biological neurons have saturating activation functions, which permit attentional gain to be implemented either on a neuron's input or on its outgoing connection. We show that modulating the outgoing connection is most effective in selectively enhancing information processing: it redistributes spiking activity and introduces additional task-relevant information, as shown by representational similarity analyses. Precision produced only minor attentional effects on performance. Our results, which mirror empirical findings, show that it is possible to adjudicate between attention mechanisms using more biologically realistic models and natural stimuli.
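
The abstract contrasts gain applied to a neuron's input with gain applied to its outgoing connection, plus precision as suppression of internal noise. The sketch below is a minimal illustration of that distinction for a single rate-coded unit with a saturating (sigmoid) nonlinearity; it is not the paper's spiking model, and the function and parameter names (neuron_response, beta, gamma, noise_sd) are hypothetical.

```python
import numpy as np

def saturating_activation(x):
    """Sigmoid nonlinearity standing in for a biological neuron's
    saturating rate response (illustrative stand-in, not the paper's model)."""
    return 1.0 / (1.0 + np.exp(-x))

def neuron_response(drive, beta=1.0, gamma=1.0, noise_sd=0.1, rng=None):
    """Response of one unit under the three manipulations sketched here:
    - beta:     input gain, applied to the drive *before* the saturating nonlinearity
    - gamma:    output gain, applied *after* it, i.e., scaling the outgoing
                connection to downstream units
    - noise_sd: internal noise; 'precision' corresponds to shrinking it."""
    rng = np.random.default_rng() if rng is None else rng
    noise = rng.normal(0.0, noise_sd, size=np.shape(drive))
    activation = saturating_activation(beta * drive + noise)
    return gamma * activation

drive = np.linspace(-3.0, 3.0, 7)
baseline    = neuron_response(drive)
input_gain  = neuron_response(drive, beta=2.0)     # limited headroom near saturation
output_gain = neuron_response(drive, gamma=2.0)    # rescales what is passed downstream
precision   = neuron_response(drive, noise_sd=0.01)
```

Because the nonlinearity saturates, increasing the input gain beta mostly compresses responses against the ceiling, whereas scaling the outgoing connection gamma rescales what is transmitted downstream regardless of saturation; this is the intuition behind contrasting the two gain mechanisms.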

Authors

  • Lynn K A Sörensen
    University of Amsterdam, The Netherlands.
  • Davide Zambrano
    Centrum Wiskunde & Informatica, Amsterdam, The Netherlands.
  • Heleen A Slagter
    Vrije Universiteit Amsterdam, The Netherlands.
  • Sander M Bohte
    Machine Learning Group, Centrum Wiskunde & Informatica, Amsterdam, The Netherlands.
  • H Steven Scholte
    Department of Brain & Cognition, University of Amsterdam, The Netherlands. Electronic address: h.s.scholte@uva.nl.