CSIAM Trans. Appl. Math., 1 (2020), pp. 387-440.
Published online: 2020-09
We develop Banach spaces for ReLU neural networks of finite depth $L$ and
infinite width. The spaces contain all finite fully connected $L$-layer networks and their $L^2$-limiting objects under bounds on the natural path-norm. Under this norm, the
unit ball in the space for $L$-layer networks has low Rademacher complexity and thus
favorable generalization properties. Functions in these spaces can be approximated by
multi-layer neural networks with dimension-independent convergence rates.
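For concreteness, one standard form of the path-norm of a finite fully connected ReLU network $f(x) = W^{(L)}\sigma(W^{(L-1)}\cdots\sigma(W^{(1)}x))$, with $\sigma(t)=\max(t,0)$ applied entrywise, sums the products of absolute weights along all input-output paths (the notation here is illustrative and may differ from the paper's; biases are omitted but can be absorbed by augmenting the input):
$$
\|f\|_{\mathrm{path}} \;=\; \sum_{i_0,\dots,i_L} \bigl|W^{(L)}_{i_L i_{L-1}}\bigr|\cdots\bigl|W^{(1)}_{i_1 i_0}\bigr| \;=\; \bigl\|\,|W^{(L)}|\,|W^{(L-1)}|\cdots|W^{(1)}|\,\bigr\|_{1},
$$
where $|W|$ denotes the entrywise absolute value of a matrix.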
The key to this work is a new way of representing functions as expectations, motivated by the structure of multi-layer neural networks. This representation allows us to define a new class of continuous models for machine learning. We show that the gradient flow defined in this way is the natural continuous analog of the gradient descent dynamics for the associated multi-layer neural networks, and that the path-norm increases at most polynomially under this continuous gradient flow.
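As a simple illustration of such an expectation-based representation, in the two-layer case (notation ours, not necessarily the paper's) an infinitely wide network can be written as
$$
f(x) \;=\; \mathbb{E}_{(a,w)\sim\pi}\bigl[a\,\sigma(w^{\top}x)\bigr],
$$
where $\pi$ is a probability distribution over the weights of a single hidden unit; deeper models are obtained by composing layers of this form, and a continuous analog of the path-norm is then given by moment bounds such as $\mathbb{E}_{(a,w)\sim\pi}[|a|\,\|w\|_1]$.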