Timescale separation in recurrent neural networks.

Journal: Neural Computation

Abstract

Supervised learning in recurrent neural networks involves two interacting processes: the dynamics of the neuron activities, from which gradient information is estimated, and the dynamics induced on the connection parameters by these estimates. A problem such algorithms must address is how to balance the relative rates of these two processes, so that accurate sensitivity estimates are obtained while synaptic modification still proceeds at a rate sufficient for learning. We show how to calculate a sufficient timescale separation between these two processes for a class of contracting neural networks.
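
To make the two-process picture concrete, here is a minimal sketch, not the paper's construction: a tanh network whose contracting fast dynamics settle to a fixed point, an Almeida-Pineda-style adjoint iteration supplying the sensitivity estimate, and a slow weight update. The relaxation length T_fast and step size eta are hypothetical parameters standing in for the timescale separation the abstract refers to.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8

# Small recurrent weights keep the dynamics contracting, so the fast
# process below relaxes to a unique fixed point x* = tanh(W x* + b).
W = 0.1 * rng.standard_normal((n, n))
b = 0.1 * rng.standard_normal(n)
x_target = np.tanh(rng.standard_normal(n))  # hypothetical supervised target

T_fast = 200   # fast timescale: relaxation steps per parameter update
eta = 0.05     # slow timescale: small step size on the weights
x = np.zeros(n)

for k in range(300):
    # Fast process: let the neuron activities settle before the weights move.
    for _ in range(T_fast):
        x = np.tanh(W @ x + b)

    e = x - x_target    # error at the (approximate) fixed point
    d = 1.0 - x ** 2    # tanh'(W x + b) evaluated at the fixed point

    # Fast adjoint process (Almeida-Pineda style): iterate
    # m <- e + W^T (d * m); it converges because the dynamics contract,
    # giving the sensitivity m = (I - (diag(d) W)^T)^{-1} e.
    m = np.zeros(n)
    for _ in range(T_fast):
        m = e + W.T @ (d * m)

    # Slow process: the gradient of 0.5 * ||x* - x_target||^2 with
    # respect to W at the fixed point is outer(d * m, x*).
    W -= eta * np.outer(d * m, x)

print("final error:", np.linalg.norm(x - x_target))
```

In this sketch the separation is enforced by brute force: T_fast inner iterations per weight update. The paper's question is how large that separation must be; if the weights moved after every relaxation step, the sensitivity estimate m would be computed at activities far from the fixed point and the update direction would be unreliable.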

Authors

  • Thomas Flynn
Graduate Center, City University of New York, New York, NY 10016, U.S.A.
    tflynn@gradcenter.cuny.edu