J. Mach. Learn., 1 (2022), pp. 342-372.

Published online: 2022-12

Category: Theory

[*An open-access article; the PDF is free to any online user.*]


In this paper, we construct a neural network to approximate functionals, i.e., maps from infinite-dimensional spaces to finite-dimensional spaces. The approximation error of the neural network is $\mathcal{O}(1/\sqrt{m})$, where $m$ is the size of the network. In other words, the error bound does not depend on the dimensionality of the input, only on the number of nodes in the network. The key idea of the approximation is to define a Barron space of functionals.
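The setting above can be illustrated with a minimal numerical sketch (not the paper's construction): a functional such as $F(u)=\int_0^1 u(x)^2\,dx$ is approximated by a two-layer ReLU network of width $m$ acting on a grid discretization of $u$. The specific input distribution, grid size, and random-feature fitting below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 64    # number of grid points discretizing [0, 1]
m = 512   # network width (number of hidden nodes)
x = np.linspace(0.0, 1.0, d)

def sample_inputs(n):
    # Illustrative input functions: u(x) = a*sin(2*pi*x) + b*cos(2*pi*x) + c
    a, b, c = rng.normal(size=(3, n, 1))
    return a * np.sin(2 * np.pi * x) + b * np.cos(2 * np.pi * x) + c

def F(U):
    # Target functional: grid (Riemann) approximation of  ∫_0^1 u(x)^2 dx
    return np.mean(U**2, axis=1)

U_train, U_test = sample_inputs(2000), sample_inputs(500)

# Two-layer network with a fixed random first layer (random features);
# only the outer coefficients are fitted, by least squares.
W = rng.normal(size=(d, m)) / np.sqrt(d)
b = rng.normal(size=m)
phi = lambda U: np.maximum(U @ W + b, 0.0)  # ReLU hidden layer

coef, *_ = np.linalg.lstsq(phi(U_train), F(U_train), rcond=None)
err = np.sqrt(np.mean((phi(U_test) @ coef - F(U_test)) ** 2))
print(f"test RMSE with m={m} nodes: {err:.4f}")
```

Increasing `m` and rerunning shows the test error shrinking, consistent in spirit with a rate controlled by the network size rather than the input dimension.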


*Journal of Machine Learning*, *1*(4), 342-372. doi:10.4208/jml.221018