Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem.

Journal: Neural Networks: The Official Journal of the International Neural Network Society

Abstract

We prove a theorem on the approximation of multivariate functions by deep ReLU networks for which the curse of dimensionality is lessened. Our theorem is based on a constructive proof of the Kolmogorov-Arnold superposition theorem and on a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
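As a rough illustration of the building block behind such results (not the paper's construction), a one-hidden-layer ReLU network can exactly represent any piecewise-linear interpolant of a univariate continuous function, which is how the univariate outer functions of a Kolmogorov-Arnold superposition can be approximated; the helper names below are hypothetical:

```python
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

def pl_relu_net(g, m):
    """ReLU-network evaluation of the piecewise-linear interpolant
    of g on [0, 1] with m equal subintervals (m hidden ReLU units)."""
    knots = np.linspace(0.0, 1.0, m + 1)
    vals = g(knots)
    slopes = np.diff(vals) / np.diff(knots)  # slope on each piece
    # Hidden-unit weights: first slope, then the slope changes at knots.
    weights = np.concatenate(([slopes[0]], np.diff(slopes)))
    bias = vals[0]
    def net(t):
        t = np.asarray(t, dtype=float)
        hidden = relu(t[..., None] - knots[:-1])  # one unit per knot
        return bias + hidden @ weights
    return net

g = lambda t: np.sin(2 * np.pi * t)
net = pl_relu_net(g, 64)
ts = np.linspace(0.0, 1.0, 2001)
err = np.max(np.abs(net(ts) - g(ts)))
print(f"max error with 64 ReLU units: {err:.2e}")
```

For a smooth target, the error of this width-m network decays like O(1/m^2), consistent with classical piecewise-linear interpolation bounds.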

Authors

  • Hadrien Montanelli
    Department of Applied Physics and Applied Mathematics, Columbia University, NY, United States. Electronic address: hadrien.montanelli@gmail.com.
  • Haizhao Yang
    Department of Mathematics, National University of Singapore, Singapore. Electronic address: haizhao@nus.edu.sg.