U.S. Department of Energy
Office of Scientific and Technical Information

When and why PINNs fail to train: A neural tangent kernel perspective

Journal Article · Journal of Computational Physics
Physics-informed neural networks (PINNs) have recently received considerable attention thanks to their flexibility in tackling a wide range of forward and inverse problems involving partial differential equations. However, despite their notable empirical success, little is known about how such constrained neural networks behave during training via gradient descent, and even less about why such models sometimes fail to train at all. In this work, we investigate these questions through the lens of the neural tangent kernel (NTK), a kernel that captures the behavior of fully-connected neural networks in the infinite-width limit during training via gradient descent. Specifically, we derive the NTK of PINNs and prove that, under appropriate conditions, it converges to a deterministic kernel that remains constant during training in the infinite-width limit. This allows us to analyze the training dynamics of PINNs through their limiting NTK and reveal a remarkable discrepancy in the convergence rates of the different loss components that contribute to the total training error. To address this fundamental pathology, we propose a novel gradient descent algorithm that utilizes the eigenvalues of the NTK to adaptively calibrate the convergence rate of the total training error. Finally, we perform a series of numerical experiments to verify the correctness of our theory and the practical effectiveness of the proposed algorithm.
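The adaptive calibration described in the abstract can be sketched as follows: each loss term is rescaled by a weight derived from the trace (sum of eigenvalues) of the corresponding diagonal block of the NTK, so that the boundary and PDE-residual terms converge at comparable rates. This is a minimal illustrative sketch, not the paper's exact algorithm; the Jacobian matrices below are random stand-ins with hypothetical shapes, whereas in a real PINN they would come from automatic differentiation of the network outputs and PDE residuals with respect to the parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in Jacobians of the boundary term (J_u) and the PDE-residual term
# (J_r) with respect to the network parameters. Shapes (num_points,
# num_params) and the scale factor are purely illustrative; the larger
# entries of J_r mimic a "stiffer" residual block.
J_u = rng.normal(size=(50, 200))
J_r = 10.0 * rng.normal(size=(100, 200))

# Diagonal NTK blocks: K_uu = J_u J_u^T and K_rr = J_r J_r^T.
K_uu = J_u @ J_u.T
K_rr = J_r @ J_r.T

# Trace-based weights: each term is scaled so both contribute the same
# effective NTK trace, balancing their convergence rates.
total_trace = np.trace(K_uu) + np.trace(K_rr)
lam_u = total_trace / np.trace(K_uu)
lam_r = total_trace / np.trace(K_rr)

# The weighted training loss would then be
#   lam_u * L_boundary + lam_r * L_residual,
# with the weights recomputed periodically as the NTK evolves.
print(lam_u, lam_r)  # the stiffer residual block receives the smaller weight
```

By construction, `lam_u * tr(K_uu) == lam_r * tr(K_rr)`, so the dominant block no longer drowns out the slower-converging one during gradient descent.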
Research Organization:
Raytheon Technologies Corporation, Waltham, MA (United States); University of Pennsylvania, Philadelphia, PA (United States)
Sponsoring Organization:
Air Force Office of Scientific Research (AFOSR); USDOE Advanced Research Projects Agency - Energy (ARPA-E)
Grant/Contract Number:
AR0001201; SC0019116
OSTI ID:
1977272
Journal Information:
Journal of Computational Physics, Vol. 449, Issue C; ISSN 0021-9991
Publisher:
Elsevier
Country of Publication:
United States
Language:
English


Similar Records

On the eigenvector bias of Fourier feature networks: From regression to solving multi-scale PDEs with physics-informed neural networks
Journal Article · June 2021 · Computer Methods in Applied Mechanics and Engineering · OSTI ID: 1976965

Infinite neural network quantum states: entanglement and training dynamics
Journal Article · June 2023 · Machine Learning: Science and Technology · OSTI ID: 2425540

Feature learning and generalization in deep networks with orthogonal weights
Journal Article · August 2025 · Machine Learning: Science and Technology · OSTI ID: 2575064