Stabilizing sequence learning in stochastic spiking networks with GABA-Modulated STDP.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

Cortical networks are capable of unsupervised learning and spontaneous replay of complex temporal sequences. Endowing artificial spiking neural networks with similar learning abilities remains a challenge. In particular, it is unresolved how different plasticity rules can contribute to both learning and the maintenance of network stability during learning. Here we introduce a biologically inspired form of GABA-Modulated Spike Timing-Dependent Plasticity (GMS) and demonstrate its ability to permit stable learning of complex temporal sequences including natural language in recurrent spiking neural networks. Motivated by biological findings, GMS utilizes the momentary level of inhibition onto excitatory cells to adjust both the magnitude and sign of Spike Timing-Dependent Plasticity (STDP) of connections between excitatory cells. In particular, high levels of inhibition in the network cause depression of excitatory-to-excitatory connections. We demonstrate the effectiveness of this mechanism during several sequence learning experiments with character- and token-based text inputs as well as visual input sequences. We show that GMS maintains stability during learning and spontaneous replay and permits the network to form a clustered hierarchical representation of its input sequences. Overall, we provide a biologically inspired model of unsupervised learning of complex sequences in recurrent spiking neural networks.
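The mechanism described in the abstract can be sketched as a simple weight-update rule. The following is a minimal, hypothetical Python illustration, not the authors' implementation: the threshold `theta`, the amplitudes `a_plus`/`a_minus`, the time constants, and the specific linear modulation function are all assumptions. It only captures the qualitative behavior stated above, namely that the momentary inhibition level scales and can flip the sign of the STDP update, and that high inhibition produces depression of excitatory-to-excitatory connections regardless of spike timing.

```python
import math

def gms_update(delta_t, inhibition, theta=1.0,
               a_plus=0.01, a_minus=0.012,
               tau_plus=20.0, tau_minus=20.0):
    """Sketch of a GABA-modulated STDP (GMS) update for one E-to-E synapse.

    delta_t    : post minus pre spike time (ms); positive = causal pairing.
    inhibition : momentary inhibitory input onto the postsynaptic cell.
    theta      : hypothetical inhibition level at which the rule flips sign.
    Returns the weight change (positive = potentiation, negative = depression).
    """
    # Classical exponential STDP kernel.
    if delta_t > 0:
        kernel = a_plus * math.exp(-delta_t / tau_plus)   # causal: potentiation
    else:
        kernel = -a_minus * math.exp(delta_t / tau_minus)  # anti-causal: depression

    # Assumed modulation: positive below theta, negative above it.
    mod = 1.0 - inhibition / theta

    if mod >= 0:
        # Low/moderate inhibition: classical STDP, scaled down as inhibition rises.
        return mod * kernel
    # High inhibition: depression for any spike timing, as stated in the abstract.
    return mod * abs(kernel)
```

With these assumed parameters, a causal pairing under low inhibition potentiates, while the same pairing under inhibition above `theta` depresses, so runaway excitation during replay would be counteracted.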

Authors

  • Marius Vieth
    Frankfurt Institute for Advanced Studies, Frankfurt am Main, Germany. Electronic address: vieth@fias.uni-frankfurt.de.
  • Jochen Triesch
    Frankfurt Institute for Advanced Studies, Johann Wolfgang Goethe University, Frankfurt am Main, Germany.