Multilevel sequential Monte Carlo samplers
Abstract
Here, we study the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias with stepsize $h_L$. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort needed to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretisation levels $\infty > h_0 > h_1 > \cdots > h_L$. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that, under appropriate assumptions, the attractive property of a reduction in the computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context.
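The telescoping identity above can be sketched in code. The following is a minimal, hedged Python illustration of an MLMC estimator: `sample_level` and the toy integrand stand in for a coupled numerical solver, and the bias model `h_l * z` is an illustrative assumption, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model: estimate E[f(X_h)], where X_h is a numerical
# approximation at step size h_l = h0 * 2**(-l).  The function name
# `sample_level` and the toy integrand sin(z) + h*z are illustrative.

def sample_level(l, coupled, n, h0=0.5):
    """Draw n samples of f(X_{h_l}); if coupled, return the increment
    f(X_{h_l}) - f(X_{h_{l-1}}) driven by the SAME randomness z, which
    is what makes the correction variance decay with the level."""
    h_f = h0 * 2.0 ** (-l)
    z = rng.standard_normal(n)
    fine = np.sin(z) + h_f * z          # discretisation bias decays with h_l (toy)
    if not coupled:
        return fine
    h_c = h0 * 2.0 ** (-(l - 1))
    coarse = np.sin(z) + h_c * z        # same z -> strongly correlated with fine
    return fine - coarse

def mlmc_estimate(L, n_per_level):
    # Telescoping identity:
    #   E[f_L] = E[f_0] + sum_{l=1}^{L} E[f_l - f_{l-1}]
    est = sample_level(0, False, n_per_level[0]).mean()
    for l in range(1, L + 1):
        est += sample_level(l, True, n_per_level[l]).mean()
    return est
```

Because the coupled corrections have small variance, far fewer samples are needed on the fine (expensive) levels than on the coarse ones, which is the source of the cost reduction discussed in the abstract.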
 Authors:
 Univ. College London, London (United Kingdom). Dept. of Statistical Science
 National Univ. of Singapore (Singapore). Dept. of Statistics & Applied Probability
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Computer Science and Mathematics Division
 King Abdullah Univ. of Science and Technology, Thuwal (Saudi Arabia)
 Publication Date:
 2016-08-24
 Research Org.:
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Sponsoring Org.:
 USDOE Laboratory Directed Research and Development (LDRD) Program
 OSTI Identifier:
 1302922
 Grant/Contract Number:
 AC05-00OR22725; R-155-000-143-112
 Resource Type:
 Journal Article: Accepted Manuscript
 Journal Name:
 Stochastic Processes and Their Applications
 Additional Journal Information:
 Journal Name: Stochastic Processes and Their Applications; Journal ID: ISSN 0304-4149
 Publisher:
 Elsevier
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING; 77 NANOSCIENCE AND NANOTECHNOLOGY; multilevel Monte Carlo; sequential Monte Carlo; Bayesian inverse problems
Citation Formats
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, and Zhou, Yan. Multilevel sequential Monte Carlo samplers. United States: N. p., 2016.
Web. doi:10.1016/j.spa.2016.08.004.
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, & Zhou, Yan. Multilevel sequential Monte Carlo samplers. United States. doi:10.1016/j.spa.2016.08.004.
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, and Zhou, Yan. 2016.
"Multilevel sequential Monte Carlo samplers". United States.
doi:10.1016/j.spa.2016.08.004. https://www.osti.gov/servlets/purl/1302922.
@article{osti_1302922,
title = {Multilevel sequential Monte Carlo samplers},
author = {Beskos, Alexandros and Jasra, Ajay and Law, Kody and Tempone, Raul and Zhou, Yan},
abstractNote = {Here, we study the approximation of expectations w.r.t. probability distributions associated to the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias with stepsize $h_L$. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In the context of this problem, it is known that the multilevel Monte Carlo (MLMC) method can reduce the amount of computational effort needed to estimate expectations, for a given level of error. This is achieved via a telescoping identity associated to a Monte Carlo approximation of a sequence of probability distributions with discretisation levels $\infty > h_0 > h_1 > \cdots > h_L$. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that, under appropriate assumptions, the attractive property of a reduction in the computational effort to estimate expectations, for a given level of error, can be maintained within the SMC context.},
doi = {10.1016/j.spa.2016.08.004},
journal = {Stochastic Processes and Their Applications},
place = {United States},
year = {2016},
month = {aug}
}

This article considers the sequential Monte Carlo (SMC) approximation of ratios of normalizing constants associated to posterior distributions which in principle rely on continuum models. Therefore, the Monte Carlo estimation error and the discrete approximation error must be balanced. A multilevel strategy is utilized to substantially reduce the cost to obtain a given error level in the approximation as compared to standard estimators. Two estimators are considered and relative variance bounds are given. The theoretical results are numerically illustrated for two Bayesian inverse problems arising from elliptic partial differential equations (PDEs). The examples involve the inversion of observations of the […]

Multilevel sequential Monte Carlo: Mean square error bounds under verifiable conditions
We consider the multilevel sequential Monte Carlo (MLSMC) method of Beskos et al. (Stoch. Proc. Appl. [to appear]). This technique is designed to approximate expectations w.r.t. probability laws associated to a discretization, for instance in the context of inverse problems, where one discretizes the solution of a partial differential equation. The MLSMC approach is especially useful when independent, coupled sampling is not possible. Beskos et al. show that for MLSMC the computational effort to achieve a given error can be less than that of independent sampling. In this article we significantly weaken the assumptions of Beskos et al., extending the proofs to […]
Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers
Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. These are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved implicit sampling schemes at a relatively small additional cost. Here, computational experiments confirm the theory and show that symmetrization is effective for small noise sampling problems. 
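The weighted-sample idea described above can be illustrated with plain self-normalised importance sampling; this is not the implicit sampler itself, and the Gaussian target and proposal below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Independent, weighted samples in the spirit described above: draw
# proposals, attach importance weights, self-normalise.  The helper
# name `weighted_samples` and the 1-D Gaussian setup are illustrative.

def weighted_samples(n, target_logpdf, prop_mean, prop_std):
    """Draw n proposals from N(prop_mean, prop_std^2) and return them
    with self-normalised importance weights w ∝ p(x) / q(x)."""
    x = prop_mean + prop_std * rng.standard_normal(n)
    logq = -0.5 * ((x - prop_mean) / prop_std) ** 2 - np.log(prop_std)
    logw = target_logpdf(x) - logq
    w = np.exp(logw - logw.max())       # subtract max for numerical stability
    return x, w / w.sum()

# Toy target: N(2, 1), known only up to a normalising constant.
target = lambda x: -0.5 * (x - 2.0) ** 2
x, w = weighted_samples(100000, target, 0.0, 3.0)
post_mean = np.sum(w * x)               # weighted estimate of E[X]
```

The weights correct for the mismatch between proposal and target; the symmetrization studied in the cited article aims to reduce the variance of exactly these weights in the small-noise regime.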
Multilevel Monte Carlo for two phase flow and Buckley–Leverett transport in random heterogeneous porous media
Monte Carlo (MC) is a well-known method for quantifying uncertainty arising, for example, in subsurface flow problems. Although robust and easy to implement, MC suffers from slow convergence. Extending MC by means of multigrid techniques yields the multilevel Monte Carlo (MLMC) method. MLMC has proven to greatly accelerate MC for several applications, including stochastic ordinary differential equations in finance, elliptic stochastic partial differential equations and also hyperbolic problems. In this study, MLMC is combined with a streamline-based solver to assess uncertain two-phase flow and Buckley–Leverett transport in random heterogeneous porous media. The performance of MLMC is compared to […]
Stabilized multilevel Monte Carlo method for stiff stochastic differential equations
A multilevel Monte Carlo (MLMC) method for mean-square stable stochastic differential equations with multiple scales is proposed. For such problems, which we call stiff, the performance of MLMC methods based on classical explicit methods deteriorates because the time-step restriction needed to resolve the fastest scales prevents exploiting all the levels of the MLMC approach. We show that by switching to explicit stabilized stochastic methods and balancing the stabilization procedure simultaneously with the hierarchical sampling strategy of MLMC methods, the computational cost for stiff systems is significantly reduced, while keeping the computational algorithm fully explicit and easy […]