TY - JOUR
T1 - Frame Invariance and Scalability of Neural Operators for Partial Differential Equations
AU - Zafar, Muhammad I.
AU - Han, Jiequn
AU - Zhou, Xu-Hui
AU - Xiao, Heng
JO - Communications in Computational Physics
VL - 32
IS - 2
SP - 336
EP - 363
PY - 2022
DA - 2022/08
DO - https://doi.org/10.4208/cicp.OA-2021-0256
UR - https://global-sci.org/intro/article_detail/cicp/20861.html
KW - Neural operators, graph neural networks, constitutive modeling, inverse modeling, deep learning
AB -
Partial differential equations (PDEs) play a dominant role in the mathematical modeling of many complex dynamical processes. Solving these PDEs often incurs prohibitively high computational costs, especially when multiple evaluations must be made for different parameters or conditions. After training, neural operators can provide PDE solutions significantly faster than traditional PDE solvers. In this work, the invariance properties and computational complexity of two neural operators are examined for the transport PDE of a scalar quantity. The neural operator based on the graph kernel network (GKN) operates on graph-structured data to incorporate nonlocal dependencies. Here we propose a modified formulation of GKN to achieve frame invariance. The vector cloud neural network (VCNN) is an alternative neural operator with embedded frame invariance that operates on point-cloud data. The GKN-based neural operator demonstrates slightly better predictive performance than VCNN. However, GKN incurs an excessively high computational cost that grows quadratically with the number of discretized objects, compared to a linear increase for VCNN.