Dimension-independent likelihood-informed MCMC
Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.
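The operator-weighted proposals described in the abstract generalize dimension-robust function-space samplers such as the preconditioned Crank–Nicolson (pCN) proposal. As an illustrative sketch of the dimension-independence idea only (this is not the authors' DILI algorithm; the sampler, the toy likelihood, and all parameter names below are assumptions made for the example), here is a minimal pCN sampler whose acceptance rule involves only the likelihood:

```python
import numpy as np

def pcn_mcmc(neg_log_like, dim, n_steps, beta=0.2, seed=0):
    """Preconditioned Crank-Nicolson MCMC for a target with a
    standard Gaussian prior N(0, I).  Because the proposal is
    reversible with respect to the prior, the prior term cancels
    and the acceptance probability depends only on the likelihood,
    so the acceptance rate does not collapse as `dim` grows."""
    rng = np.random.default_rng(seed)
    u = rng.standard_normal(dim)          # start from a prior draw
    phi_u = neg_log_like(u)
    samples = []
    accepts = 0
    for _ in range(n_steps):
        # pCN proposal: prior-preserving AR(1) move
        v = np.sqrt(1.0 - beta**2) * u + beta * rng.standard_normal(dim)
        phi_v = neg_log_like(v)
        # Accept/reject using only the likelihood ratio
        if np.log(rng.uniform()) < phi_u - phi_v:
            u, phi_u = v, phi_v
            accepts += 1
        samples.append(u.copy())
    return np.array(samples), accepts / n_steps

# Toy likelihood (an assumption for this sketch): observe the
# mean of u with Gaussian noise of standard deviation 0.1
def neg_log_like(u):
    return 0.5 * (u.mean() - 1.0) ** 2 / 0.1 ** 2

samples, acc = pcn_mcmc(neg_log_like, dim=200, n_steps=2000)
```

Doubling `dim` here leaves the acceptance rate essentially unchanged, which is the discretization-invariance property that the paper's operator-weighted proposals retain while additionally adapting, via Hessian information, to the directions in which the likelihood actually constrains the posterior.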
 Authors:
 Cui, Tiangang ^{[1]};
 Law, Kody J. H. ^{[2]};
 Marzouk, Youssef M. ^{[1]}
 ^{[1]} Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
 ^{[2]} Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Publication Date:
 October 2015
 Grant/Contract Number:
 AC05-00OR22725; SC0009297
 Type:
 Accepted Manuscript
 Journal Name:
 Journal of Computational Physics
 Additional Journal Information:
 Journal Volume: 304; Journal Issue: C; Journal ID: ISSN 0021-9991
 Publisher:
 Elsevier
 Research Org:
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Sponsoring Org:
 USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC-21)
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING; Markov chain Monte Carlo; likelihood-informed subspace; infinite-dimensional inverse problems; Langevin SDE; conditioned diffusion
 OSTI Identifier:
 1324173
 Alternate Identifier(s):
 OSTI ID: 1359277
Cui, Tiangang, Law, Kody J. H., and Marzouk, Youssef M. "Dimension-independent likelihood-informed MCMC." Journal of Computational Physics 304 (2015). doi:10.1016/j.jcp.2015.10.008. https://www.osti.gov/servlets/purl/1324173.
@article{osti_1324173,
title = {Dimension-independent likelihood-informed MCMC},
author = {Cui, Tiangang and Law, Kody J. H. and Marzouk, Youssef M.},
abstractNote = {Many Bayesian inference problems require exploring the posterior distribution of high-dimensional parameters that represent the discretization of an underlying function. Our work introduces a family of Markov chain Monte Carlo (MCMC) samplers that can adapt to the particular structure of a posterior distribution over functions. There are two distinct lines of research that intersect in the methods we develop here. First, we introduce a general class of operator-weighted proposal distributions that are well defined on function space, such that the performance of the resulting MCMC samplers is independent of the discretization of the function. Second, by exploiting local Hessian information and any associated low-dimensional structure in the change from prior to posterior distributions, we develop an inhomogeneous discretization scheme for the Langevin stochastic differential equation that yields operator-weighted proposals adapted to the non-Gaussian structure of the posterior. The resulting dimension-independent and likelihood-informed (DILI) MCMC samplers may be useful for a large class of high-dimensional problems where the target probability measure has a density with respect to a Gaussian reference measure. Finally, we use two nonlinear inverse problems to demonstrate the efficiency of these DILI samplers: an elliptic PDE coefficient inverse problem and path reconstruction in a conditioned diffusion.},
doi = {10.1016/j.jcp.2015.10.008},
journal = {Journal of Computational Physics},
number = {C},
volume = {304},
place = {United States},
year = {2015},
month = {10}
}