Recurrent networks with soft-thresholding nonlinearities for lightweight coding.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

A long-standing and influential hypothesis in neural information processing is that early sensory networks adapt themselves to produce efficient codes of afferent inputs. Here, we show how a nonlinear recurrent network provides an optimal solution for the efficient coding of an afferent input and its history. We specifically consider the problem of producing lightweight codes, ones that minimize both ℓ1 and ℓ2 constraints on sparsity and energy, respectively. When embedded in a linear coding paradigm, this problem results in a non-smooth convex optimization problem. We employ a proximal gradient descent technique to derive the solution, showing that the optimal code is realized through a recurrent network endowed with a nonlinear soft-thresholding operator. Training of the network connection weights is readily achieved through gradient-based local learning. If such learning is assumed to occur on a slower time-scale than the (faster) recurrent dynamics, then the network as a whole converges to an optimal set of codes and weights via what is, in effect, an alternating minimization procedure. Our results show how the addition of thresholding nonlinearities to a recurrent network may enable the production of lightweight, history-sensitive encoding schemes.
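
As a concrete illustration of the encoding step the abstract describes, below is a minimal sketch of proximal gradient descent (ISTA) on an elastic-net-style objective, 0.5*||x - A r||^2 + lam1*||r||_1 + 0.5*lam2*||r||^2, whose iterates unroll into a recurrent network with a soft-thresholding nonlinearity. All names and parameter values (lightweight_code, A, lam1, lam2, n_iters) are illustrative assumptions, not the paper's notation or its reference implementation.

    import numpy as np

    def soft_threshold(z, theta):
        # Proximal operator of theta * ||.||_1: shrink each entry toward zero.
        return np.sign(z) * np.maximum(np.abs(z) - theta, 0.0)

    def lightweight_code(x, A, lam1=0.1, lam2=0.1, n_iters=200):
        # Proximal gradient (ISTA) on
        #   0.5*||x - A@r||^2 + lam1*||r||_1 + 0.5*lam2*||r||^2.
        # The iteration has the form of a recurrent network: a fixed
        # feedforward drive b, recurrent weights W, and a pointwise
        # soft-thresholding nonlinearity.
        n = A.shape[1]
        eta = 1.0 / (np.linalg.norm(A, 2) ** 2 + lam2)      # step size 1/L of the smooth part
        W = np.eye(n) - eta * (A.T @ A + lam2 * np.eye(n))  # recurrent (lateral) weights
        b = eta * (A.T @ x)                                 # feedforward input drive
        r = np.zeros(n)
        for _ in range(n_iters):
            r = soft_threshold(W @ r + b, eta * lam1)       # one recurrent time step
        return r

Setting the step size to the reciprocal of the smooth term's Lipschitz constant (largest singular value of A squared, plus lam2) guarantees ISTA converges; the ℓ1 term yields sparsity through the threshold, while the ℓ2 term shrinks the code's energy.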

Authors

  • MohammadMehdi Kafashan
    Department of Electrical and Systems Engineering, Washington University in St. Louis, MO 63130, USA.
  • ShiNung Ching