Volume 2, Issue 3
Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections

Jian-Feng Cai, Dong Li, Jiaze Sun & Ke Wang

CSIAM Trans. Appl. Math., 2 (2021), pp. 532-550.

Published online: 2021-08

  • Abstract

Random projections can efficiently perform dimension reduction for datasets with nonlinear low-dimensional structures. One well-known example is that random matrices embed sparse vectors into a low-dimensional subspace nearly isometrically, a fact known as the restricted isometry property in compressed sensing. In this paper, we explore applications of random projections in deep neural networks. We characterize the expressive power of fully connected neural networks when the input data are sparse vectors or lie on a low-dimensional smooth manifold. We prove that the number of neurons required to approximate a Lipschitz function to a prescribed precision depends on the sparsity or the dimension of the manifold, and only weakly on the dimension of the input vector. The key to our proof is that random projections stably embed the set of sparse vectors, or a low-dimensional smooth manifold, into a low-dimensional subspace. Based on this fact, we also propose new neural network models in which, at each layer, the input is first projected onto a low-dimensional subspace by a random projection, and then the standard linear connection and nonlinear activation are applied. In this way, the number of parameters in the neural network is significantly reduced, so training can be accelerated without much performance loss.
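The two ideas in the abstract can be sketched in a few lines of NumPy: a Gaussian random matrix nearly preserves the norm of a sparse vector (the restricted isometry property), and a "random projection layer" applies that fixed projection before a trainable linear map and activation. This is only an illustrative sketch, not the paper's construction; the dimensions `d`, `m`, `s`, `h` below are arbitrary choices for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (not from the paper):
d, m = 1000, 64          # ambient input dimension, projected dimension
s = 5                    # sparsity of the input vector
h = 32                   # hidden width of the layer

# Gaussian random projection with entries N(0, 1/m): for s-sparse x,
# ||A x|| is close to ||x|| with high probability (restricted isometry).
A = rng.normal(0.0, 1.0 / np.sqrt(m), size=(m, d))

# A random s-sparse vector; compare its norm before and after projection.
x = np.zeros(d)
idx = rng.choice(d, size=s, replace=False)
x[idx] = rng.normal(size=s)
distortion = np.linalg.norm(A @ x) / np.linalg.norm(x)  # close to 1

# Random projection layer: project with the FIXED matrix A, then apply a
# trainable linear connection and a ReLU.  Only W (h x m) and b are
# trained, so the parameter count scales with m rather than with d.
W = rng.normal(size=(h, m)) * np.sqrt(2.0 / m)
b = np.zeros(h)

def rp_layer(v):
    """Random projection, then linear connection, then ReLU activation."""
    return np.maximum(W @ (A @ v) + b, 0.0)

y = rp_layer(x)
```

A plain fully connected layer of the same width would need an h-by-d weight matrix (32,000 trainable entries here); the random projection layer trains only h-by-m plus the bias (2,080 entries), which is the parameter reduction the abstract refers to.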

  • AMS Subject Headings

68T05, 41A30

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex
  • RIS
  • TXT
@Article{CSIAM-AM-2-532,
  author  = {Cai, Jian-Feng and Li, Dong and Sun, Jiaze and Wang, Ke},
  title   = {Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections},
  journal = {CSIAM Transactions on Applied Mathematics},
  year    = {2021},
  volume  = {2},
  number  = {3},
  pages   = {532--550},
  issn    = {2708-0579},
  doi     = {10.4208/csiam-am.SO-2020-0004},
  url     = {http://global-sci.org/intro/article_detail/csiam-am/19449.html}
}
TY  - JOUR
T1  - Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections
AU  - Cai, Jian-Feng
AU  - Li, Dong
AU  - Sun, Jiaze
AU  - Wang, Ke
JO  - CSIAM Transactions on Applied Mathematics
VL  - 2
IS  - 3
SP  - 532
EP  - 550
PY  - 2021
DA  - 2021/08
SN  - 2708-0579
DO  - 10.4208/csiam-am.SO-2020-0004
UR  - https://global-sci.org/intro/article_detail/csiam-am/19449.html
KW  - Neural network
KW  - approximation
KW  - random projection
ER  -

Jian-Feng Cai, Dong Li, Jiaze Sun & Ke Wang. (2021). Enhanced Expressive Power and Fast Training of Neural Networks by Random Projections. CSIAM Transactions on Applied Mathematics. 2 (3). 532-550. doi:10.4208/csiam-am.SO-2020-0004