A new mechanical approach to handle generalized Hopfield neural networks.

Journal: Neural Networks: the official journal of the International Neural Network Society

Abstract

We propose a modification of the cost function of the Hopfield model whose salient features shine in its Taylor expansion and result in more than pairwise interactions with alternating signs, suggesting a unified framework for dealing with both deep learning and network pruning. In our analysis, we heavily rely on the Hamilton-Jacobi correspondence relating the statistical model to a mechanical system. In this picture, our model is nothing but the relativistic extension of the original Hopfield model (whose cost function is a quadratic form in the Mattis magnetization and plays the role of the non-relativistic counterpart, i.e., the classical limit). We focus on the low-storage regime and solve the model analytically by taking advantage of the mechanical analogy, thus obtaining a complete characterization of the free energy and the associated self-consistency equations in the thermodynamic limit. Further, on the numerical side, we test the performance of our proposal with extensive Monte Carlo simulations, showing that the stability of spurious states (which limits the capabilities of the standard Hebbian construction) is considerably reduced owing to the presence of unlearning contributions that prune them massively.
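
The mechanism can be illustrated with a worked expansion. As a minimal sketch, assuming (this explicit form is suggested by the relativistic analogy and is not spelled out in the abstract) that for a single stored pattern the proposed cost function scales as $-N\sqrt{1+m^{2}}$ in the Mattis magnetization $m$, while the classical Hopfield cost is the quadratic form $-N\,m^{2}/2$, the Taylor expansion reads

  $H_{\mathrm{rel}}(m) \;=\; -N\sqrt{1+m^{2}} \;=\; -N\Big(1 \;+\; \tfrac{1}{2}m^{2} \;-\; \tfrac{1}{8}m^{4} \;+\; \tfrac{1}{16}m^{6} \;-\; \dots\Big),$
  $H_{\mathrm{class}}(m) \;=\; -N\,\tfrac{1}{2}m^{2}.$

Under this assumption, the quadratic term reproduces the pairwise Hebbian interactions of the standard model, while the higher even powers enter with alternating signs: the negative quartic term acts as a higher-order, unlearning-like contribution, consistent with the destabilization of spurious states reported in the abstract.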

Authors

  • Adriano Barra
    Dipartimento di Fisica, Sapienza Università di Roma, P.le A. Moro 2, 00185, Roma, Italy.
  • Matteo Beccaria
    Dipartimento di Matematica e Fisica Ennio De Giorgi, Università del Salento, Lecce, Italy; INFN, Istituto Nazionale di Fisica Nucleare, Sezione di Lecce, Italy. Electronic address: matteo.beccaria@le.infn.it.
  • Alberto Fachechi
    Dipartimento di Matematica e Fisica Ennio De Giorgi, Università del Salento, Lecce, Italy; INFN, Istituto Nazionale di Fisica Nucleare, Sezione di Lecce, Italy. Electronic address: alberto.fachechi@le.infn.it.