Accelerating multilevel Markov Chain Monte Carlo using machine learning models
- Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
This work presents an efficient approach for accelerating multilevel Markov Chain Monte Carlo (MCMC) sampling for large-scale problems using low-fidelity machine learning models. While conventional techniques for large-scale Bayesian inference often substitute computationally expensive high-fidelity models with machine learning models, thereby introducing approximation errors, our approach offers a computationally efficient alternative by augmenting high-fidelity models with low-fidelity ones within a hierarchical framework. The multilevel approach uses the low-fidelity machine learning model (MLM) for inexpensive evaluation of proposed samples, thereby improving the acceptance rate of samples at the high-fidelity level. The hierarchy in our multilevel algorithm is derived from a geometric multigrid hierarchy, and an MLM is used to accelerate sampling on the coarsest level. Training a machine learning model only for the coarsest level significantly reduces the computational cost of generating training data and training the model. We present an MCMC algorithm that accelerates coarsest-level sampling using the MLM while accounting for the approximation error it introduces. We provide theoretical proofs of detailed balance and demonstrate that our multilevel approach constitutes a consistent MCMC algorithm. Additionally, we derive an expression for the cost reduction attributable to the machine learning model to facilitate cost analysis of the hierarchical sampling algorithm. The technique is demonstrated on a standard benchmark inference problem in groundwater flow, where we estimate the probability density of a quantity of interest using a four-level MCMC algorithm. The proposed algorithm accelerates multilevel sampling by a factor of two while achieving accuracy similar to that of the standard multilevel algorithm.
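The surrogate-screened acceptance step described in the abstract follows the general pattern of two-stage (delayed-acceptance) MCMC: a cheap model filters proposals before the expensive model is consulted, and a second-stage correction preserves detailed balance with respect to the high-fidelity posterior. The sketch below is a minimal two-level illustration of that pattern in Python, not the paper's four-level algorithm or its groundwater benchmark; the forward models, prior, observation, and step size are hypothetical stand-ins.

```python
import numpy as np

# Hypothetical toy setup (not the paper's groundwater problem):
# infer a scalar parameter theta from one noisy observation y_obs.
def fine_model(theta):
    return np.sin(theta) + 0.1 * theta**2   # stand-in for the expensive high-fidelity solver

def coarse_model(theta):
    return np.sin(theta)                     # stand-in for the cheap ML surrogate (has model error)

y_obs, sigma = 0.9, 0.2                      # synthetic data and noise level

def log_post(theta, model):
    # Gaussian likelihood with a standard-normal prior on theta
    return -0.5 * ((y_obs - model(theta)) / sigma) ** 2 - 0.5 * theta ** 2

def two_level_da_mcmc(n_samples, theta0=0.0, step=0.5, rng=np.random.default_rng(0)):
    """Two-stage (delayed-acceptance) MCMC: proposals are screened by the
    surrogate first; only survivors are evaluated with the fine model."""
    theta = theta0
    lp_fine = log_post(theta, fine_model)
    lp_coarse = log_post(theta, coarse_model)
    chain = []
    for _ in range(n_samples):
        prop = theta + step * rng.standard_normal()
        lp_c_prop = log_post(prop, coarse_model)
        # Stage 1: accept/reject using the surrogate only (no fine solve needed).
        if np.log(rng.uniform()) < lp_c_prop - lp_coarse:
            lp_f_prop = log_post(prop, fine_model)
            # Stage 2: correct for the surrogate's error so that the fine-level
            # posterior is the invariant distribution (detailed balance holds).
            log_alpha2 = (lp_f_prop - lp_fine) - (lp_c_prop - lp_coarse)
            if np.log(rng.uniform()) < log_alpha2:
                theta, lp_fine, lp_coarse = prop, lp_f_prop, lp_c_prop
        chain.append(theta)
    return np.array(chain)

samples = two_level_da_mcmc(5000)
print("posterior mean estimate:", samples.mean())
```

In the setting of the paper, the coarse model would be the MLM trained on the coarsest grid of the geometric multigrid hierarchy, and further levels would be stacked by repeating the correction step; it is the second-stage ratio that accounts for the surrogate's approximation error.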
- Research Organization:
- Lawrence Livermore National Laboratory (LLNL), Livermore, CA (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA); USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR)
- Grant/Contract Number:
- AC52-07NA27344
- OSTI ID:
- 2566341
- Report Number(s):
- LLNL-JRNL-862759; 1095298
- Journal Information:
- Physica Scripta, Vol. 100, Issue 5; ISSN 0031-8949
- Publisher:
- IOP Publishing
- Country of Publication:
- United States
- Language:
- English
Similar Records
Multilevel Hierarchical Decomposition of Finite Element White Noise with Application to Multilevel Markov Chain Monte Carlo
Journal Article · June 2021 · SIAM Journal on Scientific Computing · OSTI ID: 1843111

Accelerating Markov Chain Monte Carlo sampling with diffusion models
Journal Article · December 2023 · Computer Physics Communications · OSTI ID: 2281802

Context-aware learning of hierarchies of low-fidelity models for multi-fidelity uncertainty quantification
Journal Article · January 2023 · Computer Methods in Applied Mechanics and Engineering · OSTI ID: 2424931