Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations
Abstract
Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas:
1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces.
2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and nonparametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for nonparametric density estimation using high-dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions.
3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We developed the mathematical tools to address these challenges in the context of extreme-scale problems.
4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges.
Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT Austin's own 10-petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.
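The randomized maximum likelihood (RML) idea in area 1 can be illustrated in the simplest setting: each posterior sample is obtained by solving one optimization problem in which both the data and the prior mean are randomly perturbed. The sketch below is a minimal linear-Gaussian toy, not the project's method; the dense matrix `A`, the dimensions, and the noise levels are hypothetical stand-ins for the report's PDE-based parameter-to-observable maps, and in the linear-Gaussian case the per-sample "optimization" reduces to one linear solve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear parameter-to-observable map (a stand-in for a PDE solve).
n_param, n_obs = 50, 20
A = rng.standard_normal((n_obs, n_param)) / np.sqrt(n_param)

# Synthetic data from a "true" parameter, with additive Gaussian noise.
sigma_noise, sigma_prior = 0.1, 1.0          # noise std, prior std (zero prior mean)
m_true = rng.standard_normal(n_param)
d = A @ m_true + sigma_noise * rng.standard_normal(n_obs)

def rml_sample(n_samples=500):
    """Randomized maximum likelihood: each sample minimizes a data-misfit plus
    prior-misfit objective in which the data and the prior mean are perturbed
    by draws from the noise and prior distributions.  For a linear map with
    Gaussian noise and prior, this reproduces the exact Gaussian posterior."""
    H = A.T @ A / sigma_noise**2 + np.eye(n_param) / sigma_prior**2  # Gauss-Newton Hessian
    samples = []
    for _ in range(n_samples):
        d_pert = d + sigma_noise * rng.standard_normal(n_obs)        # perturb data
        m_pert = sigma_prior * rng.standard_normal(n_param)          # perturb prior mean
        rhs = A.T @ d_pert / sigma_noise**2 + m_pert / sigma_prior**2
        samples.append(np.linalg.solve(H, rhs))  # one "optimization" per sample
    return np.array(samples)

samples = rml_sample()
# Exact posterior mean for comparison: H m_post = A^T d / sigma_noise^2.
H = A.T @ A / sigma_noise**2 + np.eye(n_param) / sigma_prior**2
m_post = np.linalg.solve(H, A.T @ d / sigma_noise**2)
print(np.linalg.norm(samples.mean(axis=0) - m_post))
```

For nonlinear PDE-based maps the per-sample solve becomes a full PDE-constrained optimization problem, which is where the structure-exploiting solvers described in the abstract come in.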
 Authors:
 Biros, George
 Univ. of Texas, Austin, TX (United States)
 Publication Date:
 January 2018
 Research Org.:
 Univ. of Texas, Austin, TX (United States)
 Sponsoring Org.:
 USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC21)
 OSTI Identifier:
 1416727
 Report Number(s):
 Final Report: DOE-UT Austin-SC0010518
 DOE Contract Number:
 SC0010518
 Resource Type:
 Technical Report
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING
Citation Formats
Biros, George. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations. United States: N. p., 2018. Web. doi:10.2172/1416727.
Biros, George. Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations. United States. doi:10.2172/1416727.
Biros, George. "Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations". United States. doi:10.2172/1416727. https://www.osti.gov/servlets/purl/1416727.
@article{osti_1416727,
title = {Extreme-Scale Bayesian Inference for Uncertainty Quantification of Complex Simulations},
author = {Biros, George},
abstractNote = {Uncertainty quantification (UQ)—that is, quantifying uncertainties in complex mathematical models and their large-scale computational implementations—is widely viewed as one of the outstanding challenges facing the field of CS&E over the coming decade. The EUREKA project set out to address the most difficult class of UQ problems: those for which both the underlying PDE model and the uncertain parameters are of extreme scale. In the project we worked on these extreme-scale challenges in the following four areas: 1. Scalable parallel algorithms for sampling and characterizing the posterior distribution that exploit the structure of the underlying PDEs and parameter-to-observable map. These include structure-exploiting versions of the randomized maximum likelihood method, which aims to overcome the intractability of employing conventional MCMC methods for solving extreme-scale Bayesian inversion problems by appealing to and adapting ideas from large-scale PDE-constrained optimization, which have been very successful at exploring high-dimensional spaces. 2. Scalable parallel algorithms for construction of prior and likelihood functions based on learning methods and nonparametric density estimation. Constructing problem-specific priors remains a critical challenge in Bayesian inference, and more so in high dimensions. Another challenge is the construction of likelihood functions that capture unmodeled couplings between observations and parameters. We created parallel algorithms for nonparametric density estimation using high-dimensional N-body methods and combined them with supervised learning techniques for the construction of priors and likelihood functions. 3. Bayesian inadequacy models, which augment physics models with stochastic models that represent their imperfections. The success of the Bayesian inference framework depends on the ability to represent the uncertainty due to imperfections of the mathematical model of the phenomena of interest. This is a central challenge in UQ, especially for large-scale models. We developed the mathematical tools to address these challenges in the context of extreme-scale problems. 4. Parallel scalable algorithms for Bayesian optimal experimental design (OED). Bayesian inversion yields quantified uncertainties in the model parameters, which can be propagated forward through the model to yield uncertainty in outputs of interest. This opens the way for designing new experiments to reduce the uncertainties in the model parameters and model predictions. Such experimental design problems have been intractable for large-scale problems using conventional methods; we created OED algorithms that exploit the structure of the PDE model and the parameter-to-output map to overcome these challenges. Parallel algorithms for these four problems were created, analyzed, prototyped, implemented, tuned, and scaled up for leading-edge supercomputers, including UT Austin's own 10-petaflops Stampede system, ANL's Mira system, and ORNL's Titan system. While our focus was on fundamental mathematical/computational methods and algorithms, we assessed our methods on model problems derived from several DOE mission applications, including multiscale mechanics and ice sheet dynamics.},
doi = {10.2172/1416727},
journal = {},
number = {},
volume = {},
place = {United States},
year = {2018},
month = {1}
}