An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions
Abstract
Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.
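The core importance-sampling step the abstract describes, drawing from a Gaussian mixture proposal and reweighting by the (possibly multimodal) target density, can be sketched as follows. This is a minimal illustration, not the paper's algorithm: the bimodal target, the mixture components, and the sample size are invented for demonstration, and no PC surrogate is involved.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unnormalized bimodal target: equal mixture of Gaussian kernels at +-2.
def log_target(x):
    return np.logaddexp(-0.5 * (x - 2.0) ** 2, -0.5 * (x + 2.0) ** 2)

# GM proposal: two components, slightly wider than the target modes they cover.
means = np.array([2.0, -2.0])
sigmas = np.array([1.5, 1.5])
weights = np.array([0.5, 0.5])

def log_proposal(x):
    comp = (-0.5 * ((x[:, None] - means) / sigmas) ** 2
            - np.log(sigmas * np.sqrt(2.0 * np.pi)))
    return np.log(np.exp(comp) @ weights)

# Draw n samples from the mixture proposal.
n = 20000
k = rng.choice(2, size=n, p=weights)
x = rng.normal(means[k], sigmas[k])

# Self-normalized importance weights (target is unnormalized, so normalize weights).
logw = log_target(x) - log_proposal(x)
w = np.exp(logw - logw.max())
w /= w.sum()

post_mean = np.sum(w * x)      # should be near 0 by symmetry of the target
ess = 1.0 / np.sum(w ** 2)     # effective sample size; near n when the proposal fits
```

A high effective sample size relative to `n` is the practical signal that the GM proposal matches the multimodal target well; the adaptive scheme in the paper refines the proposal to drive this up.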
 Authors:
 Li, Weixuan [Pacific Northwest National Lab. (PNNL), Richland, WA (United States)]
 Lin, Guang [Purdue University, West Lafayette, IN (United States)]
 Publication Date:
 March 2015
 Research Org.:
 Pacific Northwest National Lab. (PNNL), Richland, WA (United States)
 Sponsoring Org.:
 USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC21)
 OSTI Identifier:
 1182897
 Alternate Identifier(s):
 OSTI ID: 1367756
 Report Number(s):
 PNNL-SA-105408
Journal ID: ISSN 0021-9991; KJ0401000
 Grant/Contract Number:
 DMS-1115887; AC05-76RL01830
 Resource Type:
 Journal Article: Accepted Manuscript
 Journal Name:
 Journal of Computational Physics
 Additional Journal Information:
 Journal Volume: 294; Journal Issue: C; Journal ID: ISSN 0021-9991
 Publisher:
 Elsevier
 Country of Publication:
 United States
 Language:
 English
 Subject:
 72 PHYSICS OF ELEMENTARY PARTICLES AND FIELDS; inverse modeling, uncertainty reduction, adaptive sampling, Gaussian mixture, mixture of polynomial chaos expansions
Citation Formats
Li, Weixuan, and Lin, Guang. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions. United States: N. p., 2015.
Web. doi:10.1016/j.jcp.2015.03.047.
Li, Weixuan, & Lin, Guang. An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions. United States. doi:10.1016/j.jcp.2015.03.047.
Li, Weixuan, and Lin, Guang. 2015.
"An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions". United States.
doi:10.1016/j.jcp.2015.03.047. https://www.osti.gov/servlets/purl/1182897.
@article{osti_1182897,
title = {An adaptive importance sampling algorithm for Bayesian inversion with multimodal distributions},
author = {Li, Weixuan and Lin, Guang},
abstractNote = {Parametric uncertainties are encountered in the simulations of many physical systems, and may be reduced by an inverse modeling procedure that calibrates the simulation results to observations on the real system being simulated. Following Bayes’ rule, a general approach for inverse modeling problems is to sample from the posterior distribution of the uncertain model parameters given the observations. However, the large number of repetitive forward simulations required in the sampling process could pose a prohibitive computational burden. This difficulty is particularly challenging when the posterior is multimodal. We present in this paper an adaptive importance sampling algorithm to tackle these challenges. Two essential ingredients of the algorithm are: 1) a Gaussian mixture (GM) model adaptively constructed as the proposal distribution to approximate the possibly multimodal target posterior, and 2) a mixture of polynomial chaos (PC) expansions, built according to the GM proposal, as a surrogate model to alleviate the computational burden caused by computationally demanding forward model evaluations. In three illustrative examples, the proposed adaptive importance sampling algorithm demonstrates its capabilities of automatically finding a GM proposal with an appropriate number of modes for the specific problem under study, and obtaining a sample that accurately and efficiently represents the posterior with a limited number of forward simulations.},
doi = {10.1016/j.jcp.2015.03.047},
journal = {Journal of Computational Physics},
number = {C},
volume = 294,
place = {United States},
year = 2015,
month = 3
}

Adaptive annealed importance sampling for multimodal posterior exploration and model selection with application to extrasolar planet detection
We describe an algorithm that can adaptively provide mixture summaries of multimodal posterior distributions. The parameter space of the involved posteriors ranges in size from a few dimensions to dozens of dimensions. This work was motivated by an astrophysical problem called extrasolar planet (exoplanet) detection, wherein the computation of stochastic integrals that are required for Bayesian model comparison is challenging. The difficulty comes from the highly nonlinear models that lead to multimodal posterior distributions. We resort to importance sampling (IS) to estimate the integrals, and thus translate the problem to be how to find a parametric approximation of the posterior. …
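The stochastic integrals mentioned here are Bayesian evidence integrals, Z = ∫ p(d | θ) p(θ) dθ, which importance sampling estimates as an average of weighted likelihood evaluations. A hedged one-dimensional sketch, using a conjugate Gaussian setup invented so the exact answer is known in closed form, not the cited paper's annealed scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

def log_gauss(x, mu, sig):
    """Log density of a univariate normal N(mu, sig^2)."""
    return -0.5 * ((x - mu) / sig) ** 2 - np.log(sig * np.sqrt(2.0 * np.pi))

d = 1.0  # hypothetical observation; prior N(0,1), likelihood N(d; theta, 1)

# Proposal centered between prior and likelihood (a stand-in for an adapted proposal).
theta = rng.normal(0.5, 1.0, 50000)

# Importance weights w = likelihood * prior / proposal, computed in log space.
logw = (log_gauss(d, theta, 1.0)
        + log_gauss(theta, 0.0, 1.0)
        - log_gauss(theta, 0.5, 1.0))
Z_hat = np.exp(logw).mean()

# Conjugacy gives the exact evidence: Z = N(d; 0, sqrt(2)).
Z_true = np.exp(log_gauss(d, 0.0, np.sqrt(2.0)))
```

The same estimator applies with a mixture proposal when the posterior is multimodal; the variance of `logw` is what a good parametric approximation of the posterior keeps small.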
An adaptive Gaussian process-based method for efficient Bayesian experimental design in groundwater contaminant source identification problems: ADAPTIVE GAUSSIAN PROCESS-BASED INVERSION
Surrogate models are commonly used in Bayesian approaches such as Markov Chain Monte Carlo (MCMC) to avoid repetitive CPU-demanding model evaluations. However, the approximation error of a surrogate may lead to biased estimations of the posterior distribution. This bias can be corrected by constructing a very accurate surrogate or implementing MCMC in a two-stage manner. Since the two-stage MCMC requires extra original model evaluations, the computational cost is still high. If the information of measurement is incorporated, a locally accurate approximation of the original model can be adaptively constructed with low computational cost. Based on this idea, we propose a …
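The two-stage (delayed-acceptance) MCMC idea mentioned above, screening proposals with a cheap surrogate before paying for an exact model evaluation, can be sketched as follows. The target, the surrogate's bias, and the tuning constants are invented for illustration; the "exact" model here is a closed-form density standing in for an expensive simulation.

```python
import numpy as np

rng = np.random.default_rng(2)

def log_exact(x):
    """Stand-in for the expensive exact posterior density: N(0, 1)."""
    return -0.5 * x ** 2

def log_surrogate(x):
    """Cheap surrogate with a deliberate small bias in the mode location."""
    return -0.5 * (x - 0.1) ** 2

n = 20000
x, lp_exact = 0.0, log_exact(0.0)
expensive_calls = 0
samples = []
for _ in range(n):
    y = x + rng.normal(0.0, 1.0)  # symmetric random-walk proposal
    ls_x, ls_y = log_surrogate(x), log_surrogate(y)
    # Stage 1: screen with the surrogate only (no expensive call on rejection).
    if rng.random() < min(1.0, np.exp(ls_y - ls_x)):
        # Stage 2: correct with the exact model; the surrogate ratio is
        # divided back out so the chain still targets the exact posterior.
        expensive_calls += 1
        lp_y = log_exact(y)
        if rng.random() < min(1.0, np.exp(lp_y - lp_exact + ls_x - ls_y)):
            x, lp_exact = y, lp_y
    samples.append(x)

post = np.array(samples[5000:])  # discard burn-in
```

The stage-2 correction keeps the stationary distribution exact despite the biased surrogate, while every stage-1 rejection saves one expensive model run.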
Adaptive Importance Sampling with a Rapidly Varying Importance Function
It is well known that zero-variance Monte Carlo solutions are possible if an exact importance function is available to bias the random walks. Monte Carlo can be used to estimate the importance function. This estimated importance function then can be used to bias a subsequent Monte Carlo calculation that estimates an even better importance function; this iterative process is called adaptive importance sampling. To obtain the importance function, one can expand the importance function in a basis such as the Legendre polynomials and make Monte Carlo estimates of the expansion coefficients. For simple problems, Legendre expansions of order 10 to 15 …
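The coefficient-estimation step described here, Monte Carlo estimates of Legendre expansion coefficients, can be illustrated on a function whose expansion is known in closed form: f(x) = x² = (1/3)P₀ + (2/3)P₂. This toy sketch uses uniform sampling on [-1, 1] and is not the iterative adaptive scheme of the cited work:

```python
import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(3)

def f(x):
    # Test function with known expansion: x^2 = (1/3) P0 + (2/3) P2.
    return x ** 2

# Uniform Monte Carlo samples on [-1, 1].
m = 200000
x = rng.uniform(-1.0, 1.0, m)

# c_k = (2k+1)/2 * integral_{-1}^{1} f(x) P_k(x) dx,
# with the integral estimated as 2 * (sample mean of f * P_k).
coeffs = []
for k in range(4):
    Pk = legendre.legval(x, [0.0] * k + [1.0])  # evaluate P_k at the samples
    coeffs.append((2 * k + 1) / 2 * 2.0 * np.mean(f(x) * Pk))
```

Orthogonality of the Legendre basis is what makes each coefficient a simple expectation; in the adaptive scheme, the estimated expansion then biases the next round of random walks.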