Volume 1, Issue 3
On the Banach Spaces Associated with Multi-Layer ReLU Networks: Function Representation, Approximation Theory and Gradient Descent Dynamics

Weinan E & Stephan Wojtowytsch

CSIAM Trans. Appl. Math., 1 (2020), pp. 387-440.

Published online: 2020-09

  • Abstract

We develop Banach spaces for ReLU neural networks of finite depth $L$ and infinite width. The spaces contain all finite fully connected $L$-layer networks and their $L^2$-limiting objects under bounds on the natural path-norm. Under this norm, the unit ball in the space for $L$-layer networks has low Rademacher complexity and thus favorable generalization properties. Functions in these spaces can be approximated by multi-layer neural networks with dimension-independent convergence rates.
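To fix ideas, here is a sketch of the path-norm in the finite-width case; the notation is ours (with biases absorbed by appending a constant coordinate to the input), and the paper's exact normalization may differ. For a finite fully connected $L$-layer network $f(x) = W^L\sigma\!\left(W^{L-1}\cdots\,\sigma(W^1 x)\right)$, where the ReLU activation $\sigma(z)=\max\{z,0\}$ is applied entrywise, the path-norm sums, over all paths from an input unit to the output, the products of the absolute values of the weights along the path:
$$\|f\|_{\mathrm{path}} \;=\; \sum_{i_0,\,i_1,\,\dots,\,i_L} \bigl|W^L_{i_L i_{L-1}}\bigr|\cdots\bigl|W^2_{i_2 i_1}\bigr|\,\bigl|W^1_{i_1 i_0}\bigr|.$$
Bounding this quantity controls the Rademacher complexity of the corresponding function class independently of the width, which is what underlies the generalization claim above.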
The key to this work is a new way of representing functions as certain expectations, motivated by multi-layer neural networks. This representation allows us to define a new class of continuous models for machine learning. We show that the gradient flow defined in this way is the natural continuous analog of the gradient descent dynamics for the associated multi-layer neural networks, and that the path-norm increases at most polynomially under this continuous gradient flow.
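As a hedged illustration of the expectation representation in the shallowest case: for $L=2$ (one hidden layer), it reduces to the Barron-space form from the authors' earlier work, in which a network of arbitrary width is replaced by a distribution $\pi$ over single-neuron parameters; conventions for the norm vary slightly across papers:
$$f(x) \;=\; \mathbb{E}_{(a,w,b)\sim\pi}\bigl[a\,\sigma(w^\top x + b)\bigr], \qquad \|f\|_{\mathcal{B}} \;=\; \inf_{\pi}\,\mathbb{E}_{(a,w,b)\sim\pi}\bigl[\,|a|\,\bigl(\|w\|_1 + |b|\bigr)\bigr],$$
where the infimum runs over all distributions $\pi$ representing $f$; a finite-width two-layer network corresponds to taking $\pi$ to be an empirical measure over its hidden units. The deeper spaces studied in the paper iterate this idea layer by layer via the index representation mentioned in the keywords below.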

  • AMS Subject Headings

68T07, 46E15, 26B35, 35Q68, 34A12, 26B40

  • Copyright

© Global Science Press

  • BibTeX
@Article{CSIAM-AM-1-387,
  author  = {E, Weinan and Wojtowytsch, Stephan},
  title   = {On the Banach Spaces Associated with Multi-Layer ReLU Networks: Function Representation, Approximation Theory and Gradient Descent Dynamics},
  journal = {CSIAM Transactions on Applied Mathematics},
  year    = {2020},
  volume  = {1},
  number  = {3},
  pages   = {387--440},
  issn    = {2708-0579},
  doi     = {10.4208/csiam-am.20-211},
  url     = {http://global-sci.org/intro/article_detail/csiam-am/18324.html}
}
  • RIS

TY  - JOUR
T1  - On the Banach Spaces Associated with Multi-Layer ReLU Networks: Function Representation, Approximation Theory and Gradient Descent Dynamics
AU  - E, Weinan
AU  - Wojtowytsch, Stephan
JO  - CSIAM Transactions on Applied Mathematics
VL  - 1
IS  - 3
SP  - 387
EP  - 440
PY  - 2020
DA  - 2020/09
SN  - 2708-0579
DO  - 10.4208/csiam-am.20-211
UR  - https://global-sci.org/intro/article_detail/csiam-am/18324.html
KW  - Barron space
KW  - multi-layer space
KW  - deep neural network
KW  - representations of functions
KW  - machine learning
KW  - infinitely wide network
KW  - ReLU activation
KW  - Banach space
KW  - path-norm
KW  - continuous gradient descent dynamics
KW  - index representation
ER  -

  • TXT

E, Weinan and Wojtowytsch, Stephan. (2020). On the Banach Spaces Associated with Multi-Layer ReLU Networks: Function Representation, Approximation Theory and Gradient Descent Dynamics. CSIAM Transactions on Applied Mathematics. 1 (3). 387-440. doi:10.4208/csiam-am.20-211