Structural plasticity on an accelerated analog neuromorphic hardware system.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

In computational neuroscience, as well as in machine learning, neuromorphic devices promise an accelerated and scalable alternative to neural network simulations. Their neural connectivity and synaptic capacity depend on their specific design choices, but are always intrinsically limited. Here, we present a strategy to achieve structural plasticity that optimizes resource allocation under these constraints by constantly rewiring the pre- and postsynaptic partners while keeping the neuronal fan-in constant and the connectome sparse. In particular, we implemented this algorithm on the analog neuromorphic system BrainScaleS-2. It was executed on a custom embedded digital processor located on chip, accompanying the mixed-signal substrate of spiking neurons and synapse circuits. We evaluated our implementation in a simple supervised learning scenario, showing its ability to optimize the network topology with respect to the nature of its training data, as well as its overall computational efficiency.
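
The core idea of the abstract — prune weak synapses and regrow new ones so that each neuron's fan-in stays constant and the connectome stays sparse — can be illustrated with a minimal sketch. This is not the authors' on-chip implementation; the pruning threshold, the silent initial weight of regrown synapses, and the uniform choice of new presynaptic partners are illustrative assumptions.

```python
import random

def rewire(weights, num_inputs, threshold=0.1, rng=random):
    """One structural-plasticity step for a single neuron (illustrative sketch).

    `weights` maps presynaptic-partner index -> synaptic weight. Each synapse
    whose weight stayed below `threshold` is pruned, and its slot is refilled
    with a connection to a randomly chosen, currently unconnected partner, so
    the fan-in (number of entries) never changes.
    """
    fan_in = len(weights)
    # Prune synapses that failed to grow a significant weight.
    pruned = [pre for pre, w in weights.items() if abs(w) < threshold]
    for pre in pruned:
        del weights[pre]
    # Regrow: pick unconnected partners at random, one per pruned synapse.
    free = [p for p in range(num_inputs) if p not in weights]
    for _ in pruned:
        new_pre = rng.choice(free)
        free.remove(new_pre)
        weights[new_pre] = 0.0  # new synapses start silent and must prove useful
    assert len(weights) == fan_in  # fan-in is conserved by construction
    return weights
```

Repeating this step during training lets the sparse connectivity migrate toward informative inputs while the hardware synapse budget stays fixed.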

Authors

  • Sebastian Billaudelle
    Kirchhoff-Institute for Physics, Heidelberg University, Germany. Electronic address: sebastian.billaudelle@kip.uni-heidelberg.de.
  • Benjamin Cramer
    Kirchhoff-Institute for Physics, Heidelberg University, Germany. Electronic address: benjamin.cramer@kip.uni-heidelberg.de.
  • Mihai A Petrovici
    Kirchhoff-Institute for Physics, Heidelberg University, Im Neuenheimer Feld 227, D-69120 Heidelberg, Germany; Department of Physiology, University of Bern, Bühlplatz 5, CH-3012 Bern, Switzerland. Electronic address: petrovici@pyl.unibe.ch.
  • Korbinian Schreiber
    Kirchhoff-Institute for Physics, Heidelberg University, Germany.
  • David Kappel
    Institute for Theoretical Computer Science, Graz University of Technology, 8010 Graz, Austria.
  • Johannes Schemmel
  • Karlheinz Meier