Error bounds for deep ReLU networks using the Kolmogorov-Arnold superposition theorem.
Journal:
Neural networks : the official journal of the International Neural Network Society
Published Date:
May 26, 2020
Abstract
We prove a theorem concerning the approximation of multivariate functions by deep ReLU networks, for which the curse of dimensionality is lessened. Our theorem is based on a constructive proof of the Kolmogorov-Arnold superposition theorem, and on a subset of multivariate continuous functions whose outer superposition functions can be efficiently approximated by deep ReLU networks.
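For context, the classical Kolmogorov-Arnold superposition theorem expresses any continuous multivariate function on the unit cube through sums and compositions of univariate functions. The sketch below states this standard form in LaTeX; it is given only as background, since the abstract does not specify the particular constructive version of the theorem or the subset of functions the paper works with.

% Classical Kolmogorov-Arnold superposition: every continuous
% f : [0,1]^n -> R can be written with 2n+1 continuous outer
% functions \Phi_q and univariate inner functions \phi_{q,p}.
\[
  f(x_1, \dots, x_n) \;=\; \sum_{q=0}^{2n} \Phi_q\!\left( \sum_{p=1}^{n} \phi_{q,p}(x_p) \right)
\]

In this decomposition, the inner functions \phi_{q,p} depend only on the dimension n and not on f, so the difficulty of approximating f is concentrated in the outer functions \Phi_q, which is the part the abstract indicates deep ReLU networks can approximate efficiently for the considered function class.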