A lightweight and gradient-stable neural layer.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

To enhance the resource efficiency and deployability of neural networks, we propose a neural-layer architecture based on Householder weighting and absolute-value activating, called the Householder-absolute neural layer, or simply Han-layer. Compared to a fully connected layer with d neurons and d outputs, a Han-layer reduces the number of parameters and the corresponding computational complexity from O(d²) to O(d). The Han-layer structure guarantees that the Jacobian of the layer function is always orthogonal, thus ensuring gradient stability (i.e., freedom from vanishing or exploding gradients) for any sub-network composed of Han-layers. Extensive numerical experiments show that Han-layers can strategically replace fully connected (FC) layers, reducing the number of model parameters while maintaining or even improving generalization performance. We also showcase the capabilities of the Han-layer architecture on a few small stylized models and discuss its current limitations.
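
As a concrete illustration of the construction described in the abstract, below is a minimal PyTorch sketch of a Han-layer. It assumes one trainable Householder reflector vector per layer, consistent with the stated O(d) parameter count; the class name HanLayer and the random initialization are illustrative choices, not the authors' reference implementation.

    import torch
    import torch.nn as nn

    class HanLayer(nn.Module):
        """Householder-absolute (Han) layer: y = |Hx|, where
        H = I - 2vv^T/(v^T v) is a Householder reflection matrix."""

        def __init__(self, d):
            super().__init__()
            # One trainable reflector vector: O(d) parameters,
            # versus O(d^2) for a dense d-by-d weight matrix.
            self.v = nn.Parameter(torch.randn(d))

        def forward(self, x):
            # Apply Hx = x - 2v(v^T x)/(v^T v) without ever forming
            # the d-by-d matrix H; cost is O(d) per sample.
            v = self.v
            coeff = 2.0 * (x @ v) / (v @ v)        # shape: (batch,)
            hx = x - coeff.unsqueeze(-1) * v       # shape: (batch, d)
            # Absolute-value activation: its Jacobian is diag(sign(hx)),
            # which is orthogonal, so the composed layer Jacobian stays
            # orthogonal almost everywhere (hence gradient stability).
            return hx.abs()

For instance, replacing a 64-by-64 FC layer (4096 weights plus biases) with HanLayer(64) leaves only 64 trainable parameters, which is the O(d²)-to-O(d) reduction cited above.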

Authors

  • Yueyao Yu
School of Science and Engineering, The Chinese University of Hong Kong, Shenzhen, China; Shenzhen Research Institute of Big Data, China.
  • Yin Zhang
Department of Radiation Oncology, Rutgers Cancer Institute of New Jersey, Rutgers Robert Wood Johnson Medical School, New Brunswick, NJ, United States.