State-Space Representations of Deep Neural Networks.

Journal: Neural Computation

Abstract

This letter deals with neural networks as dynamical systems governed by finite difference equations. It shows that the introduction of k-many skip connections into network architectures, such as residual networks and additive dense networks, defines kth order dynamical equations on the layer-wise transformations. Closed-form solutions for the state-space representations of general kth order additive dense networks, where the concatenation operation is replaced by addition, as well as kth order smooth networks, are found. The developed provision endows deep neural networks with an algebraic structure. Furthermore, it is shown that imposing kth order smoothness on network architectures with d-many nodes per layer increases the state-space dimension by a multiple of k, and so the effective embedding dimension of the data manifold by the neural network is k·d-many dimensions. It follows that network architectures of these types reduce the number of parameters needed to maintain the same embedding dimension by a factor of k^2 when compared to an equivalent first-order, residual network. Numerical simulations and experiments on CIFAR-10, SVHN, and MNIST have been conducted to help understand the developed theory and the efficacy of the proposed concepts.
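To make the dynamical-systems view concrete, below is a minimal NumPy sketch (not the paper's implementation) assuming the simple additive update x_{t+1} = f_t(x_t) + sum_{i=0}^{k-1} x_{t-i}, in which k-many additive skip connections couple each state to its k most recent predecessors and thus define a kth order finite difference equation; the function names and the choice of a dense tanh layer are illustrative, and the paper's exact skip coefficients may differ. Setting k = 1 recovers the familiar residual update x_{t+1} = x_t + f_t(x_t).

    import numpy as np

    rng = np.random.default_rng(0)

    def f(x, W, b):
        # Layer-wise transformation f_t (hypothetical choice: dense layer with tanh).
        return np.tanh(W @ x + b)

    def forward_kth_order(x0, params, k):
        # Each layer adds skip connections from its k most recent predecessor
        # states, coupling x_{t+1} to x_t, ..., x_{t-k+1}: a kth order finite
        # difference equation on the hidden states. With k = 1 this reduces to
        # a residual block, x_{t+1} = x_t + f_t(x_t).
        states = [x0]
        for W, b in params:
            skips = sum(states[-k:])        # k-many additive skip connections
            states.append(skips + f(states[-1], W, b))
        return states

    d, depth, k = 8, 5, 3
    params = [(0.1 * rng.standard_normal((d, d)), np.zeros(d)) for _ in range(depth)]
    states = forward_kth_order(rng.standard_normal(d), params, k)
    print(len(states), states[-1].shape)    # 6 (8,)

The k^2 parameter saving claimed in the abstract can be read off from this setup: matching the embedding dimension k·d with a first-order residual network requires k·d-many nodes per layer, and dense-layer parameter counts scale quadratically in layer width, so (k·d)^2 versus d^2 gives the factor of k^2.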

Authors

  • Michael Hauser
    Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A. mikebenh@gmail.com.
  • Sean Gunn
    Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A. sug375@psu.edu.
  • Samer Saab
    Department of Electrical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A. sys5880@psu.edu.
  • Asok Ray
    Department of Mechanical Engineering, Pennsylvania State University, University Park, PA 16802, U.S.A.