Volume 14, Issue 4
Approximation and Generalization of DeepONets for Learning Operators Arising from a Class of Singularly Perturbed Problems

Ting Du, Zhongyi Huang & Ye Li

East Asian J. Appl. Math., 14 (2024), pp. 841-873.

Published online: 2024-09

  • Abstract

Singularly perturbed problems present inherent difficulty due to the presence of thin layers in their solutions. To overcome this difficulty, we propose using deep operator networks (DeepONets), a method previously shown to be effective in approximating nonlinear operators between infinite-dimensional Banach spaces. In this paper, we demonstrate for the first time the application of DeepONets to one-dimensional singularly perturbed problems, achieving promising results that suggest their potential as a robust tool for solving this class of problems. We analyze the convergence rate of the approximation error incurred by the operator networks in approximating the solution operator, and examine the generalization gap and empirical risk, all of which are shown to converge uniformly with respect to the perturbation parameter. By using Shishkin mesh points as the locations at which the loss function is evaluated, we conduct several numerical experiments that further support the effectiveness of operator networks in capturing the singular layer behavior.
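
For readers who want to see the moving parts concretely, the minimal NumPy sketch below (not taken from the paper; the layer widths, the Shishkin transition constant 2*eps*ln N, the layer location at x = 1, and the toy input function are illustrative assumptions) shows the two ingredients the abstract refers to: an unstacked DeepONet that approximates G(u)(y) as a sum of branch coefficients times trunk basis values, and a piecewise-uniform Shishkin mesh whose points serve as the evaluation locations of the loss.

import numpy as np

rng = np.random.default_rng(0)

def init_mlp(sizes):
    # Random initialisation for a small fully connected network.
    return [(rng.normal(0.0, np.sqrt(1.0 / fan_in), (fan_in, fan_out)),
             np.zeros(fan_out))
            for fan_in, fan_out in zip(sizes[:-1], sizes[1:])]

def mlp(params, x):
    # tanh hidden layers, linear output layer.
    for W, b in params[:-1]:
        x = np.tanh(x @ W + b)
    W, b = params[-1]
    return x @ W + b

def deeponet_forward(branch, trunk, u_sensors, y):
    # Unstacked DeepONet: G(u)(y) is approximated by sum_k b_k(u) * t_k(y).
    b_k = mlp(branch, u_sensors[None, :])   # (1, p) coefficients from the input function
    t_k = mlp(trunk, y)                     # (n, p) trunk basis at the query points
    return (b_k * t_k).sum(axis=1)          # (n,) predicted solution values

def shishkin_mesh(eps, N):
    # Piecewise-uniform Shishkin mesh on [0, 1], assuming a boundary layer at x = 1;
    # the transition point tau = min(1/2, 2*eps*ln N) is one standard choice.
    tau = min(0.5, 2.0 * eps * np.log(N))
    coarse = np.linspace(0.0, 1.0 - tau, N // 2 + 1)
    fine = np.linspace(1.0 - tau, 1.0, N // 2 + 1)
    return np.concatenate([coarse, fine[1:]])

# Toy usage: m sensor values of the input function, p-dimensional latent basis.
m, p, N, eps = 50, 40, 64, 1e-4
branch = init_mlp([m, 64, 64, p])
trunk = init_mlp([1, 64, 64, p])
mesh = shishkin_mesh(eps, N)                          # loss-function locations
u_sensors = np.sin(np.pi * np.linspace(0.0, 1.0, m))  # example input function
pred = deeponet_forward(branch, trunk, u_sensors, mesh[:, None])
target = np.zeros_like(pred)                          # placeholder reference solution
empirical_risk = np.mean((pred - target) ** 2)        # mean-squared loss on the Shishkin mesh

In a full training loop one would minimize this empirical risk over the branch and trunk parameters; the point of the Shishkin mesh is that half of its nodes are clustered inside the layer region of width O(eps*ln N), so the loss still resolves the layer even for very small eps.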

  • AMS Subject Headings

65M10, 78A48

  • Copyright

COPYRIGHT: © Global Science Press

  • BibTex

@Article{EAJAM-14-841,
  author  = {Du, Ting and Huang, Zhongyi and Li, Ye},
  title   = {Approximation and Generalization of DeepONets for Learning Operators Arising from a Class of Singularly Perturbed Problems},
  journal = {East Asian Journal on Applied Mathematics},
  year    = {2024},
  volume  = {14},
  number  = {4},
  pages   = {841--873},
  issn    = {2079-7370},
  doi     = {10.4208/eajam.2023-128.051023},
  url     = {http://global-sci.org/intro/article_detail/eajam/23440.html}
}

  • RIS

TY  - JOUR
T1  - Approximation and Generalization of DeepONets for Learning Operators Arising from a Class of Singularly Perturbed Problems
AU  - Du, Ting
AU  - Huang, Zhongyi
AU  - Li, Ye
JO  - East Asian Journal on Applied Mathematics
VL  - 14
IS  - 4
SP  - 841
EP  - 873
PY  - 2024
DA  - 2024/09
SN  - 2079-7370
DO  - 10.4208/eajam.2023-128.051023
UR  - https://global-sci.org/intro/article_detail/eajam/23440.html
KW  - Deep operator network
KW  - singularly perturbed problem
KW  - Shishkin mesh
KW  - uniform convergence
ER  -

  • TXT

Ting Du, Zhongyi Huang & Ye Li. (2024). Approximation and Generalization of DeepONets for Learning Operators Arising from a Class of Singularly Perturbed Problems. East Asian Journal on Applied Mathematics. 14 (4). 841-873. doi:10.4208/eajam.2023-128.051023