An effective SteinGLM initialization scheme for training multi-layer feedforward sigmoidal neural networks.

Journal: Neural networks : the official journal of the International Neural Network Society

Abstract

Network initialization is a critical first step in training neural networks. In this paper, we propose a novel network initialization scheme based on the celebrated Stein's identity. By viewing multi-layer feedforward sigmoidal neural networks as cascades of multi-index models, the projection weights to the first hidden layer are initialized using eigenvectors of the cross-moment matrix between the input's second-order score function and the response. The input data are then forward-propagated to the next layer, and this procedure is repeated until all the hidden layers are initialized. Finally, the weights for the output layer are initialized by generalized linear modeling. Extensive numerical results show that the proposed SteinGLM method is much faster and more accurate than other popular methods commonly used for training neural networks.
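The layer-wise procedure described above can be sketched in a minimal, illustrative form. The sketch below assumes (approximately) standard-normal inputs, for which the second-order score function simplifies to S2(x) = x xᵀ − I; the function names, the toy data, and the use of ordinary least squares in place of a full GLM fit for the output layer are all assumptions for illustration, not the authors' exact algorithm.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def stein_init_layer(X, y, n_hidden):
    """Initialize one hidden layer's projection weights via Stein's identity.

    Assumes roughly standard-normal inputs, so the second-order score
    function is S2(x) = x x^T - I. The cross-moment matrix
    M = E[y * S2(x)] is estimated empirically, and its leading
    eigenvectors (by eigenvalue magnitude) serve as projection weights.
    """
    n, d = X.shape
    # Empirical cross-moment matrix: mean of y_i * (x_i x_i^T - I)
    M = (X * y[:, None]).T @ X / n - y.mean() * np.eye(d)
    eigvals, eigvecs = np.linalg.eigh(M)
    order = np.argsort(-np.abs(eigvals))
    return eigvecs[:, order[:n_hidden]]  # shape (d, n_hidden)

# Toy single-index data; a nonzero intercept keeps the second-order
# cross-moment from vanishing under the symmetric sigmoid.
rng = np.random.default_rng(0)
X = rng.standard_normal((2000, 5))
w_true = np.array([1.0, -1.0, 0.5, 0.0, 0.0])
y = sigmoid(X @ w_true + 0.5) + 0.1 * rng.standard_normal(2000)

# Step 1: initialize the first hidden layer from the cross-moment matrix.
W1 = stein_init_layer(X, y, n_hidden=3)

# Step 2: forward-propagate; deeper layers would repeat step 1 on H.
H = sigmoid(X @ W1)

# Step 3: initialize the output layer by a (here, least-squares) GLM fit.
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(len(H)), H]), y, rcond=None)
print(W1.shape, beta.shape)
```

For deeper networks, the same `stein_init_layer` call would simply be applied again with `H` in place of `X`, layer by layer, before the final GLM step.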

Authors

  • Zebin Yang
    Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam Road, Hong Kong.
  • Hengtao Zhang
    Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam Road, Hong Kong.
  • Agus Sudjianto
    Corporate Model Risk, Wells Fargo, USA.
  • Aijun Zhang
Department of Statistics and Actuarial Science, The University of Hong Kong, Pokfulam Road, Hong Kong.