Multilevel sequential Monte Carlo samplers
Abstract
Here, we study the approximation of expectations with respect to probability distributions associated with the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias with step-size level h _{L}. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In this context, it is known that the multilevel Monte Carlo (MLMC) method can reduce the computational effort required to estimate expectations for a given level of error. This is achieved via a telescoping identity associated with a Monte Carlo approximation of a sequence of probability distributions with discretisation levels $$\infty > h_0 > h_1 > \cdots > h_L$$. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that, under appropriate assumptions, the attractive property of a reduced computational effort to estimate expectations for a given level of error can be maintained within the SMC context.
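The telescoping identity the abstract refers to writes the finest-level expectation as a sum of level increments, E[g_L] = E[g_0] + Σ_{l=1}^{L} E[g_l − g_{l−1}], each estimated with coupled draws. The following is a minimal toy sketch of that estimator, not the paper's SMC method: `sample_level`, its artificial O(h_l) bias term, and the uniform-variable coupling are all illustrative assumptions.

```python
import random

def sample_level(l, u):
    """Level-l approximation of a toy quantity of interest g(u) = u**2,
    with an artificial O(h_l) bias term (h_l = 2**-l) standing in for a
    PDE discretisation error. Hypothetical model for illustration only."""
    h = 2.0 ** (-l)
    return u ** 2 + h * u

def mlmc_estimate(L, samples_per_level, seed=0):
    """Telescoping MLMC estimator of E[g_L]:
       E[g_L] = E[g_0] + sum_{l=1}^{L} E[g_l - g_{l-1}].
    The same uniform draw u drives both levels of each increment, so the
    increment variance shrinks as the levels get closer."""
    rng = random.Random(seed)
    total = 0.0
    for l in range(L + 1):
        n = samples_per_level[l]
        acc = 0.0
        for _ in range(n):
            u = rng.random()                      # common random number
            fine = sample_level(l, u)
            coarse = sample_level(l - 1, u) if l > 0 else 0.0
            acc += fine - coarse                  # level-l increment
        total += acc / n
    return total
```

With L = 6 this estimates E[u^2] = 1/3 up to the O(h_6) bias; in a real MLMC schedule fewer samples are allocated to fine levels, since the increment variances decay, and this allocation is the source of the cost reduction.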
 Authors:
 Univ. College London, London (United Kingdom). Dept. of Statistical Science
 National Univ. of Singapore (Singapore). Dept. of Statistics & Applied Probability
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Computer Science and Mathematics Division
 King Abdullah Univ. of Science and Technology, Thuwal (Saudi Arabia)
 Publication Date:
 2016-08
 Research Org.:
 Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
 Sponsoring Org.:
 USDOE Laboratory Directed Research and Development (LDRD) Program
 OSTI Identifier:
 1302922
 Grant/Contract Number:
 AC05-00OR22725; R-155-000-143-112
 Resource Type:
 Journal Article: Accepted Manuscript
 Journal Name:
 Stochastic Processes and Their Applications
 Additional Journal Information:
 Journal Name: Stochastic Processes and Their Applications; Journal ID: ISSN 0304-4149
 Publisher:
 Elsevier
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING; 77 NANOSCIENCE AND NANOTECHNOLOGY; multilevel Monte Carlo; sequential Monte Carlo; Bayesian inverse problems
Citation Formats
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, and Zhou, Yan. Multilevel sequential Monte Carlo samplers. United States: N. p., 2016.
Web. doi:10.1016/j.spa.2016.08.004.
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, & Zhou, Yan. Multilevel sequential Monte Carlo samplers. United States. doi:10.1016/j.spa.2016.08.004.
Beskos, Alexandros, Jasra, Ajay, Law, Kody, Tempone, Raul, and Zhou, Yan. 2016.
"Multilevel sequential Monte Carlo samplers". United States.
doi:10.1016/j.spa.2016.08.004. https://www.osti.gov/servlets/purl/1302922.
@article{osti_1302922,
title = {Multilevel sequential Monte Carlo samplers},
author = {Beskos, Alexandros and Jasra, Ajay and Law, Kody and Tempone, Raul and Zhou, Yan},
abstractNote = {Here, we study the approximation of expectations with respect to probability distributions associated with the solution of partial differential equations (PDEs); this scenario appears routinely in Bayesian inverse problems. In practice, one often has to solve the associated PDE numerically, using, for instance, finite element methods, leading to a discretisation bias with step-size level h_L. In addition, the expectation cannot be computed analytically and one often resorts to Monte Carlo methods. In this context, it is known that the multilevel Monte Carlo (MLMC) method can reduce the computational effort required to estimate expectations for a given level of error. This is achieved via a telescoping identity associated with a Monte Carlo approximation of a sequence of probability distributions with discretisation levels $\infty > h_0 > h_1 > \cdots > h_L$. In many practical problems of interest, one cannot achieve i.i.d. sampling from the associated sequence of probability distributions. A sequential Monte Carlo (SMC) version of the MLMC method is introduced to deal with this problem. In conclusion, it is shown that, under appropriate assumptions, the attractive property of a reduced computational effort to estimate expectations for a given level of error can be maintained within the SMC context.},
doi = {10.1016/j.spa.2016.08.004},
journal = {Stochastic Processes and Their Applications},
place = {United States},
year = 2016,
month = 8
}

This article considers the sequential Monte Carlo (SMC) approximation of ratios of normalizing constants associated with posterior distributions which in principle rely on continuum models. Therefore, the Monte Carlo estimation error and the discrete approximation error must be balanced. A multilevel strategy is utilized to substantially reduce the cost of obtaining a given error level in the approximation as compared to standard estimators. Two estimators are considered and relative variance bounds are given. The theoretical results are numerically illustrated for two Bayesian inverse problems arising from elliptic partial differential equations (PDEs). The examples involve the inversion of observations of the …

Multilevel sequential Monte Carlo: Mean square error bounds under verifiable conditions
We consider the multilevel sequential Monte Carlo (MLSMC) method of Beskos et al. (Stoch. Proc. Appl., to appear). This technique is designed to approximate expectations w.r.t. probability laws associated with a discretization; for instance, in the context of inverse problems, one discretizes the solution of a partial differential equation. The MLSMC approach is especially useful when independent, coupled sampling is not possible. Beskos et al. show that for MLSMC the computational effort to achieve a given error can be less than that of independent sampling. In this article we significantly weaken the assumptions of Beskos et al., extending the proofs to …
Small-Noise Analysis and Symmetrization of Implicit Monte Carlo Samplers
Implicit samplers are algorithms for producing independent, weighted samples from multivariate probability distributions. These are often applied in Bayesian data assimilation algorithms. We use Laplace asymptotic expansions to analyze two implicit samplers in the small noise regime. Our analysis suggests a symmetrization of the algorithms that leads to improved implicit sampling schemes at a relatively small additional cost. Here, computational experiments confirm the theory and show that symmetrization is effective for small noise sampling problems. 
Multilevel Monte Carlo simulation of Coulomb collisions
We present a new (for plasma physics) highly efficient multilevel Monte Carlo numerical method for simulating Coulomb collisions. The method separates and optimally minimizes the finite-time-step and finite-sampling errors inherent in the Langevin representation of the Landau–Fokker–Planck equation. It does so by combining multiple solutions to the underlying equations with varying numbers of time steps. For a desired level of accuracy ε, the computational cost of the method is O(ε ^{–2}) or O(ε ^{–2}(ln ε) ^{2}), depending on the underlying discretization, Milstein or Euler–Maruyama respectively. This is to be contrasted with a cost of O(ε ^{–3}) for direct simulation Monte Carlo …
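The cost savings described above rest on coupling a fine and a coarse time discretization of the same stochastic path, so that their difference has small variance. A minimal sketch of such a coupled Euler–Maruyama pair, with both levels driven by the same Brownian increments, is shown below; the drift and diffusion functions and step counts are illustrative assumptions, not taken from the cited work.

```python
import math
import random

def coupled_euler_pair(a, b, x0, T, n_coarse, rng):
    """One coupled draw of (fine, coarse) Euler-Maruyama endpoints for
    dX = a(X) dt + b(X) dW on [0, T]. The fine path takes 2 * n_coarse
    steps and the coarse path n_coarse steps, driven by the same Brownian
    increments, so the level correction fine - coarse has small variance."""
    h_c = T / n_coarse          # coarse step size
    h_f = h_c / 2.0             # fine step size
    xf = xc = x0
    for _ in range(n_coarse):
        dw1 = rng.gauss(0.0, math.sqrt(h_f))
        dw2 = rng.gauss(0.0, math.sqrt(h_f))
        # two fine steps with the individual increments
        xf += a(xf) * h_f + b(xf) * dw1
        xf += a(xf) * h_f + b(xf) * dw2
        # one coarse step with the summed increment
        xc += a(xc) * h_c + b(xc) * (dw1 + dw2)
    return xf, xc
```

For a toy geometric-Brownian-motion model, the sample variance of the differences fine − coarse is orders of magnitude below the variance of the fine endpoints themselves, which is exactly what lets MLMC put most of its samples on cheap coarse levels.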
A continuation multilevel Monte Carlo algorithm
Here, we propose a novel Continuation Multilevel Monte Carlo (CMLMC) algorithm for the weak approximation of stochastic models. The CMLMC algorithm solves the given approximation problem for a sequence of decreasing tolerances, ending when the required error tolerance is satisfied. CMLMC assumes discretization hierarchies that are defined a priori for each level and are geometrically refined across levels. Moreover, the actual choice of computational work across levels is based on parametric models for the average cost per sample and the corresponding variance and weak error. These parameters are calibrated using Bayesian estimation, taking particular notice of the deepest levels of …