Commun. Comput. Phys., 33 (2023), pp. 596-627.
Published online: 2023-03
With the remarkable empirical success of neural networks across diverse scientific disciplines, rigorous error and convergence analyses are being developed and enriched. However, there has been little theoretical work on neural networks for solving interface problems. In this paper, we perform a convergence analysis of physics-informed neural networks (PINNs) for solving second-order elliptic interface problems. Specifically, we consider PINNs with domain decomposition techniques and introduce gradient-enhanced strategies on the interfaces to deal with boundary and interface jump conditions. We show that the sequence of neural networks obtained by minimizing a Lipschitz regularized loss function converges to the unique solution of the interface problem in $H^2$ as the number of samples increases. Numerical experiments are provided to support our theoretical analysis.
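For orientation, a standard model problem of the class treated in the abstract can be written as follows. This is an illustrative formulation only; the symbols $a_i$, $g_D$, $g_N$, and the specific assumptions on the coefficients and data are generic and may differ from the paper's precise setting. Here $\Omega$ is split by an interface $\Gamma$ into subdomains $\Omega_1$ and $\Omega_2$, and $[\cdot]$ denotes the jump across $\Gamma$:

```latex
\begin{aligned}
-\nabla \cdot \bigl(a_i \nabla u\bigr) &= f && \text{in } \Omega_i, \quad i = 1, 2,\\
[u] &= g_D && \text{on } \Gamma, \\
\Bigl[\, a \,\frac{\partial u}{\partial n} \,\Bigr] &= g_N && \text{on } \Gamma, \\
u &= h && \text{on } \partial\Omega .
\end{aligned}
```

In a domain-decomposed PINN approach of the kind described, one typically trains a separate network per subdomain and minimizes a composite loss: the PDE residuals sampled in each $\Omega_i$, plus penalty terms enforcing the boundary condition on $\partial\Omega$ and the two jump conditions on $\Gamma$; the gradient-enhanced strategy adds derivative terms of the interface conditions to the loss so that the jumps are matched more accurately.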