U.S. Department of Energy
Office of Scientific and Technical Information

When Do Extended Physics-Informed Neural Networks (XPINNs) Improve Generalization?

Journal Article · SIAM Journal on Scientific Computing
DOI: https://doi.org/10.1137/21m1447039 · OSTI ID: 2282018
 [1];  [2];  [2];  [3]
  1. National University of Singapore (Singapore); Brown University
  2. Brown University, Providence, RI (United States)
  3. National University of Singapore (Singapore)

Physics-informed neural networks (PINNs) have become a popular choice for solving high-dimensional partial differential equations (PDEs) due to their excellent approximation power and generalization ability. Recently, extended PINNs (XPINNs), based on domain decomposition methods, have attracted considerable attention due to their effectiveness in modeling multiscale and multiphysics problems and their amenability to parallelization. However, theoretical understanding of their convergence and generalization properties remains unexplored. In this study, we take an initial step towards understanding how and when XPINNs outperform PINNs. Specifically, for general multilayer PINNs and XPINNs, we first provide a prior generalization bound via the complexity of the target functions in the PDE problem and a posterior generalization bound via the posterior matrix norms of the networks after optimization. Moreover, based on our bounds, we analyze the conditions under which XPINNs improve generalization. Concretely, our theory shows that the key building block of XPINN, namely, the domain decomposition, introduces a tradeoff for generalization. On the one hand, XPINNs decompose the complex PDE solution into several simple parts, which decreases the complexity needed to learn each part and boosts generalization. On the other hand, decomposition leads to less training data being available in each subdomain, and hence such a model is typically prone to overfitting and may become less generalizable. Empirically, we choose five PDEs to show when XPINNs perform better than, similar to, or worse than PINNs, hence demonstrating and justifying our new theory.
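The first side of the tradeoff described in the abstract can be illustrated with a toy least-squares stand-in. Note the assumptions: linear polynomial fits play the role of the sub-networks, and a piecewise-linear target plays the role of the PDE solution; this is an illustrative sketch, not the paper's actual PINN/XPINN experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Target function: globally "complex" for a single line (it has a kink
# at x = 0.5), but each half is exactly linear -- i.e., decomposition
# turns one hard part into two simple parts.
f = lambda x: np.abs(x - 0.5)

x_train = rng.uniform(0.0, 1.0, 64)
y_train = f(x_train)
x_test = np.linspace(0.0, 1.0, 501)

# "PINN" analogue: a single degree-1 fit over the whole domain [0, 1].
global_fit = np.polyval(np.polyfit(x_train, y_train, 1), x_test)
err_global = np.max(np.abs(global_fit - f(x_test)))

# "XPINN" analogue: decompose at x = 0.5 and fit each subdomain
# separately; each sub-model only sees the training points that fall
# inside its own subdomain (the data-scarcity side of the tradeoff).
pred = np.empty_like(x_test)
for lo, hi in [(0.0, 0.5), (0.5, 1.0)]:
    tr = (x_train >= lo) & (x_train <= hi)
    te = (x_test >= lo) & (x_test <= hi)
    pred[te] = np.polyval(np.polyfit(x_train[tr], y_train[tr], 1), x_test[te])
err_decomposed = np.max(np.abs(pred - f(x_test)))

print(f"global max error: {err_global:.3f}, "
      f"decomposed max error: {err_decomposed:.2e}")
```

Here decomposition wins because each part is strictly simpler than the whole; the abstract's opposite regime (overfitting from too few points per subdomain) would appear if each sub-model were given high capacity relative to its reduced share of the training data.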

Research Organization:
Brown University, Providence, RI (United States)
Sponsoring Organization:
USDOE; OSD/AFOSR MURI
Grant/Contract Number:
SC0019453
OSTI ID:
2282018
Journal Information:
SIAM Journal on Scientific Computing, Vol. 44, Issue 5; ISSN 1064-8275
Publisher:
Society for Industrial and Applied Mathematics (SIAM)
Country of Publication:
United States
Language:
English


Similar Records

Extended Physics-Informed Neural Networks (XPINNs): A Generalized Space-Time Domain Decomposition Based Deep Learning Framework for Nonlinear Partial Differential Equations
Journal Article · November 2020 · Communications in Computational Physics · OSTI ID: 2282003

Parallel physics-informed neural networks via domain decomposition
Journal Article · September 2021 · Journal of Computational Physics · OSTI ID: 1977268