Learning functional priors and posteriors from data and physics
Journal Article · Journal of Computational Physics
- Author Affiliations:
- Brown Univ., Providence, RI (United States)
- Xiamen Univ. (China)
- Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
- Brown Univ., Providence, RI (United States); Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
In this work, we develop a new Bayesian framework based on deep neural networks that can extrapolate in space-time using historical data and quantify uncertainties arising from both noisy and gappy data in physical problems. Specifically, the proposed approach has two stages: (1) prior learning and (2) posterior estimation. In the first stage, we employ physics-informed Generative Adversarial Networks (PI-GANs) to learn a functional prior, either from a prescribed function distribution, e.g., a Gaussian process, or from historical data and physics. In the second stage, we employ the Hamiltonian Monte Carlo (HMC) method to estimate the posterior in the latent space of the PI-GAN. In addition, we use two different approaches to encode the physics: (1) automatic differentiation, as used in physics-informed neural networks (PINNs), for scenarios with explicitly known partial differential equations (PDEs), and (2) operator regression using the deep operator network (DeepONet) for PDE-agnostic scenarios. We then test the proposed method on (1) meta-learning for one-dimensional regression and forward/inverse PDE problems (combined with PINNs); (2) PDE-agnostic physical problems (combined with DeepONet), e.g., fractional diffusion as well as saturated stochastic (100-dimensional) flows in heterogeneous porous media; and (3) spatiotemporal regression problems, i.e., inference of a marine riser displacement field using experimental data from the Norwegian Deepwater Programme (NDP). The results demonstrate that the proposed approach can provide accurate predictions as well as uncertainty quantification given very limited scattered and noisy data, since historical data, when available, provide informative priors. In summary, the proposed method is capable of learning flexible functional priors, e.g., both Gaussian and non-Gaussian processes, and can be readily extended to big-data problems by enabling mini-batch training using stochastic HMC or normalizing flows, since the latent space is typically low-dimensional.
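As a concrete illustration of the second stage, the sketch below shows how posterior estimation in a low-dimensional latent space can be carried out with plain HMC given scattered, noisy observations. This is not the authors' implementation: the `generator` function is a hypothetical stand-in (a fixed random-feature expansion) for the trained PI-GAN generator, and the latent dimension, noise scale `sigma`, and HMC settings are assumed values for illustration only.

```python
# Minimal sketch (not the paper's code): stage-2 posterior sampling with HMC
# over the latent vector z of a pre-trained functional prior u(x) = G(z, x).
import numpy as np

rng = np.random.default_rng(0)

# --- Hypothetical stand-in for the trained PI-GAN generator ----------------
d_z = 8                                     # latent dimension (assumed small)
freqs = rng.normal(size=d_z)                # fixed "decoder" parameters
phases = rng.uniform(0.0, 2.0 * np.pi, d_z)

def generator(z, x):
    """Toy functional prior: u(x) = sum_k z_k * cos(freqs_k * x + phases_k)."""
    basis = np.cos(np.outer(x, freqs) + phases)     # shape (n_x, d_z)
    return basis @ z

# --- Scattered, noisy observations (synthetic) ------------------------------
x_obs = rng.uniform(-1.0, 1.0, 20)
z_true = rng.normal(size=d_z)
sigma = 0.05                                 # assumed observation noise scale
u_obs = generator(z_true, x_obs) + sigma * rng.normal(size=x_obs.size)

# --- Log posterior over the latent vector z (standard normal prior on z) ----
def log_post_and_grad(z):
    basis = np.cos(np.outer(x_obs, freqs) + phases)
    resid = basis @ z - u_obs
    logp = -0.5 * z @ z - 0.5 * resid @ resid / sigma**2
    grad = -z - basis.T @ resid / sigma**2
    return logp, grad

# --- Plain HMC with a leapfrog integrator -----------------------------------
def hmc(z0, n_samples=2000, n_leap=20, eps=0.02):
    samples, z = [], z0.copy()
    logp, grad = log_post_and_grad(z)
    for _ in range(n_samples):
        p = rng.normal(size=d_z)                    # resample momentum
        z_new, p_new, g = z.copy(), p.copy(), grad.copy()
        h0 = -logp + 0.5 * p @ p                    # initial Hamiltonian
        for _ in range(n_leap):                     # leapfrog steps
            p_new += 0.5 * eps * g
            z_new += eps * p_new
            logp_new, g = log_post_and_grad(z_new)
            p_new += 0.5 * eps * g
        h1 = -logp_new + 0.5 * p_new @ p_new
        if np.log(rng.uniform()) < h0 - h1:         # Metropolis accept/reject
            z, logp, grad = z_new, logp_new, g
        samples.append(z.copy())
    return np.array(samples)

samples = hmc(np.zeros(d_z))
# Posterior predictive mean/std of u(x) on a test grid: the UQ output.
x_test = np.linspace(-1.0, 1.0, 50)
u_draws = np.array([generator(z, x_test) for z in samples[500:]])
print("predictive std (mean over grid):", u_draws.std(0).mean())
```

Because the latent space is low-dimensional, the same loop extends naturally to mini-batched observations with stochastic-gradient HMC, which is the big-data extension mentioned in the abstract.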
- Research Organization:
- Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
- Sponsoring Organization:
- National Institutes of Health (NIH); US Air Force Office of Scientific Research (AFOSR); USDOE
- Grant/Contract Number:
- AC05-76RL01830; SC0019453
- OSTI ID:
- 1897198
- Alternate ID(s):
- OSTI ID: 1846502
- Report Number(s):
- PNNL-SA-178723
- Journal Information:
- Journal of Computational Physics, Vol. 457; ISSN 0021-9991
- Publisher:
- Elsevier
- Country of Publication:
- United States
- Language:
- English
Similar Records
B-PINNs: Bayesian Physics-informed Neural Networks for Forward and Inverse PDE Problems with Noisy Data
Journal Article · Jan 14, 2021 · Journal of Computational Physics · OSTI ID: 1763962
B-PINNs: Bayesian physics-informed neural networks for forward and inverse PDE problems with noisy data
Journal Article · Oct 14, 2020 · Journal of Computational Physics · OSTI ID: 2282008
B-DeepONet: An enhanced Bayesian DeepONet for solving noisy parametric PDEs using accelerated replica exchange SGLD
Journal Article · Oct 13, 2022 · Journal of Computational Physics · OSTI ID: 2421766