Iterative Importance Sampling Algorithms for Parameter Estimation
Abstract
In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near-perfect scaling with the number of cores on high-performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is a challenging task. Several sampling algorithms have been proposed over the past years that take an iterative approach to constructing a proposal distribution. We investigate the applicability of such algorithms by applying them to two realistic and challenging test problems, one in subsurface flow, and one in combustion modeling. More specifically, we implement importance sampling algorithms that iterate over the mean and covariance matrix of Gaussian or multivariate t proposal distributions. Our implementation leverages massively parallel computers, and we present strategies to initialize the iterations using “coarse” MCMC runs or Gaussian mixture models.
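The iteration the abstract describes can be sketched as follows. This is a minimal, hedged illustration of the general idea (adaptive importance sampling with a Gaussian proposal whose mean and covariance are updated from self-normalized importance weights), not the paper's implementation; the toy log-posterior `log_target` and all function names are illustrative stand-ins for the subsurface-flow and combustion models studied in the paper.

```python
import numpy as np

def log_target(x):
    # Illustrative unnormalized log-posterior: zero-mean Gaussian with a
    # correlated covariance. In practice this would involve a model run.
    cov = np.array([[2.0, 1.0], [1.0, 2.0]])
    return -0.5 * x @ np.linalg.solve(cov, x)

def iterative_is(log_target, dim=2, n_samples=2000, n_iters=20, seed=0):
    """Iterate over the mean and covariance of a Gaussian proposal."""
    rng = np.random.default_rng(seed)
    mean = np.zeros(dim)        # initial proposal mean
    cov = 4.0 * np.eye(dim)     # deliberately over-dispersed initial covariance
    for _ in range(n_iters):
        # Samples are independent, so this loop body parallelizes naturally.
        samples = rng.multivariate_normal(mean, cov, size=n_samples)
        # Log-density of the current Gaussian proposal at each sample.
        diff = samples - mean
        prec = np.linalg.inv(cov)
        _, logdet = np.linalg.slogdet(cov)
        log_q = -0.5 * (np.einsum('ij,jk,ik->i', diff, prec, diff)
                        + logdet + dim * np.log(2.0 * np.pi))
        # Self-normalized importance weights (stabilized by the max shift).
        log_w = np.array([log_target(x) for x in samples]) - log_q
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        # Update proposal moments from the weighted ensemble.
        mean = w @ samples
        diff = samples - mean
        cov = (w[:, None] * diff).T @ diff
    return mean, cov, w

mean, cov, w = iterative_is(log_target)
ess = 1.0 / np.sum(w ** 2)  # effective sample size: a proposal-quality diagnostic
```

After a few iterations the proposal moments approach those of the target, and the effective sample size approaches the number of samples; a degenerating effective sample size would instead signal a poor proposal, which motivates the careful initialization strategies (coarse MCMC runs, Gaussian mixture models) discussed in the paper.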
 Authors:
 Morzfeld, Matthias; Day, Marcus S.; Grout, Ray W.; Heng Pau, George Shu; Finsterle, Stefan A.; Bell, John B.
 Univ. of Arizona, Tucson, AZ (United States)
 Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
 National Renewable Energy Lab. (NREL), Golden, CO (United States)
 Publication Date:
 March 2018
 Research Org.:
 Lawrence Berkeley National Lab. (LBNL), Berkeley, CA (United States)
 Sponsoring Org.:
 USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC21); Alfred P. Sloan Foundation
 OSTI Identifier:
 1525280
 Grant/Contract Number:
 AC02-05CH11231; DMS-1619630
 Resource Type:
 Accepted Manuscript
 Journal Name:
 SIAM Journal on Scientific Computing
 Additional Journal Information:
 Journal Volume: 40; Journal Issue: 2; Journal ID: ISSN 1064-8275
 Publisher:
 SIAM
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING; Importance sampling; parameter estimation; Bayesian inverse problem
Citation Formats
Morzfeld, Matthias, Day, Marcus S., Grout, Ray W., Heng Pau, George Shu, Finsterle, Stefan A., and Bell, John B. Iterative Importance Sampling Algorithms for Parameter Estimation. United States: N. p., 2018.
Web. doi:10.1137/16M1088417.
Morzfeld, Matthias, Day, Marcus S., Grout, Ray W., Heng Pau, George Shu, Finsterle, Stefan A., & Bell, John B. Iterative Importance Sampling Algorithms for Parameter Estimation. United States. doi:10.1137/16M1088417.
Morzfeld, Matthias, Day, Marcus S., Grout, Ray W., Heng Pau, George Shu, Finsterle, Stefan A., and Bell, John B. 2018.
"Iterative Importance Sampling Algorithms for Parameter Estimation". United States. doi:10.1137/16M1088417. https://www.osti.gov/servlets/purl/1525280.
@article{osti_1525280,
title = {Iterative Importance Sampling Algorithms for Parameter Estimation},
author = {Morzfeld, Matthias and Day, Marcus S. and Grout, Ray W. and Heng Pau, George Shu and Finsterle, Stefan A. and Bell, John B.},
abstractNote = {In parameter estimation problems one computes a posterior distribution over uncertain parameters defined jointly by a prior distribution, a model, and noisy data. Markov chain Monte Carlo (MCMC) is often used for the numerical solution of such problems. An alternative to MCMC is importance sampling, which can exhibit near-perfect scaling with the number of cores on high-performance computing systems because samples are drawn independently. However, finding a suitable proposal distribution is a challenging task. Several sampling algorithms have been proposed over the past years that take an iterative approach to constructing a proposal distribution. We investigate the applicability of such algorithms by applying them to two realistic and challenging test problems, one in subsurface flow, and one in combustion modeling. More specifically, we implement importance sampling algorithms that iterate over the mean and covariance matrix of Gaussian or multivariate t proposal distributions. Our implementation leverages massively parallel computers, and we present strategies to initialize the iterations using ``coarse'' MCMC runs or Gaussian mixture models.},
doi = {10.1137/16M1088417},
journal = {SIAM Journal on Scientific Computing},
number = 2,
volume = 40,
place = {United States},
year = {2018},
month = {3}
}