OSTI.GOV | U.S. Department of Energy, Office of Scientific and Technical Information

Title: Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference

Authors: Chen, Peng; Zabaras, Nicholas; Bilionis, Ilias
Publication Date: March 2015
Sponsoring Org.:
OSTI Identifier:
Grant/Contract Number:
Resource Type: Journal Article: Publisher's Accepted Manuscript
Journal Name: Journal of Computational Physics
Additional Journal Information: Journal Volume: 284; Journal Issue: C; Related Information: CHORUS Timestamp: 2016-09-04 19:01:05; Journal ID: ISSN 0021-9991
Country of Publication: United States

Citation Formats

Chen, Peng, Zabaras, Nicholas, and Bilionis, Ilias. Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference. United States: N. p., 2015. Web. doi:10.1016/
Chen, Peng, Zabaras, Nicholas, & Bilionis, Ilias. Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference. United States. doi:10.1016/
Chen, Peng, Zabaras, Nicholas, and Bilionis, Ilias. 2015. "Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference". United States. doi:10.1016/
@article{chen_zabaras_bilionis_2015,
  title = {Uncertainty propagation using infinite mixture of Gaussian processes and variational Bayesian inference},
  author = {Chen, Peng and Zabaras, Nicholas and Bilionis, Ilias},
  abstractNote = {},
  doi = {10.1016/},
  journal = {Journal of Computational Physics},
  number = {C},
  volume = {284},
  place = {United States},
  year = {2015},
  month = {3}
}

Journal Article: Free Publicly Available Full Text (Publisher's Version of Record at doi:10.1016/)

Citation Metrics:
Cited by: 9 works
Citation information provided by Web of Science

  • Abstract not provided.
  • Uncertainty quantification (UQ) tasks, such as model calibration, uncertainty propagation, and optimization under uncertainty, typically require several thousand evaluations of the underlying computer codes. To cope with the cost of simulations, one replaces the real response surface with a cheap surrogate based, e.g., on polynomial chaos expansions, neural networks, support vector machines, or Gaussian processes (GP). However, the number of simulations required to learn a generic multivariate response grows exponentially as the input dimension increases. This curse of dimensionality can only be addressed if the response exhibits some special structure that can be discovered and exploited. A wide range of physical responses exhibit a special structure known as an active subspace (AS). An AS is a linear manifold of the stochastic space characterized by maximal response variation. The idea is that one should first identify this low-dimensional manifold, project the high-dimensional input onto it, and then link the projection to the output. If the dimensionality of the AS is low enough, then learning the link function is a much easier problem than the original problem of learning a high-dimensional function. The classic approach to discovering the AS requires gradient information, a fact that severely limits its applicability. Furthermore, and partly because of its reliance on gradients, it is not able to handle noisy observations. The latter is an essential trait if one wants to be able to propagate uncertainty through stochastic simulators, e.g., through molecular dynamics codes. In this work, we develop a probabilistic version of AS which is gradient-free and robust to observational noise. Our approach relies on a novel Gaussian process regression with built-in dimensionality reduction. In particular, the AS is represented as an orthogonal projection matrix that serves as yet another covariance function hyper-parameter to be estimated from the data. To train the model, we design a two-step maximum likelihood optimization procedure that ensures the orthogonality of the projection matrix by exploiting recent results on the Stiefel manifold, i.e., the manifold of matrices with orthogonal columns. An additional benefit of our probabilistic formulation is that it allows us to select the dimensionality of the AS via the Bayesian information criterion. We validate our approach by showing that it can discover the right AS in synthetic examples without gradient information using both noiseless and noisy observations. We demonstrate that our method is able to discover the same AS as the classical approach in a challenging one-hundred-dimensional problem involving an elliptic stochastic partial differential equation with random conductivity. Finally, we use our approach to study the effect of geometric and material uncertainties on the propagation of solitary waves in a one-dimensional granular system. (A minimal code sketch of a GP covariance with a built-in orthogonal projection follows this list.)
  • A unified, Bayesian inference of midplane electron temperature and density profiles using both Thomson scattering (TS) and interferometric data is presented. Beyond the Bayesian nature of the analysis, novel features of the inference are the use of a Gaussian process prior to infer a mollification length-scale of the inferred profiles and the use of Gauss-Laguerre quadratures to directly calculate the depolarisation term associated with the TS forward model. Results are presented from an application of the method to data from the high resolution TS system on the Mega-Ampere Spherical Tokamak, along with a comparison to profiles coming from the standard analysis carried out on that system. (A small Gauss-Laguerre quadrature example follows this list.)
  • In this study, a Bayesian, non-stationary Gaussian Process (GP) method for the inference of the soft X-ray emissivity distribution, along with its associated uncertainties, has been developed. For the investigation of equilibrium conditions and fast magnetohydrodynamic behavior in nuclear fusion plasmas, it is important to infer spatially resolved soft X-ray profiles, especially in the plasma center, from a limited number of noisy line-integral measurements. For this ill-posed inversion problem, Bayesian probability theory can provide a posterior probability distribution over all possible solutions under given model assumptions. Specifically, the use of a non-stationary GP to model the emission allows the model to adapt to the varying length scales of the underlying diffusion process. In contrast to other conventional methods, the prior regularization is expressed in probabilistic form, which enhances the capability for uncertainty analysis; as a consequence, scientists who are concerned about the reliability of their results will benefit from it. Under the assumption of normally distributed noise, the posterior distribution evaluated at a discrete number of points becomes a multivariate normal distribution whose mean and covariance are analytically available, making inversions and the calculation of uncertainty fast (this closed-form posterior is sketched in code after this list). Additionally, the hyper-parameters embedded in the model assumptions can be optimized through a Bayesian Occam's razor formalism, which automatically adjusts the model complexity. This method is shown to produce convincing reconstructions and good agreement with independently calculated results from the Maximum Entropy and Equilibrium-Based Iterative Tomography Algorithm methods.
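
The second item above describes a Gaussian process whose covariance acts on a low-dimensional orthogonal projection of the input. The following is a minimal, hypothetical sketch of that structure only, not the authors' implementation; the function names (orthonormalize, active_subspace_kernel), the RBF kernel form, and the toy data are all assumptions made for illustration.

import numpy as np

def orthonormalize(W):
    # Map an arbitrary matrix to one with orthonormal columns (a point on the Stiefel manifold) via QR.
    Q, _ = np.linalg.qr(W)
    return Q

def active_subspace_kernel(X1, X2, W, lengthscale=1.0, variance=1.0):
    # RBF covariance evaluated on the projected inputs X @ W, i.e. a GP with built-in dimensionality reduction.
    Z1, Z2 = X1 @ W, X2 @ W
    sq = ((Z1[:, None, :] - Z2[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * sq / lengthscale ** 2)

# Toy usage: a 10-dimensional input whose response varies along a single hidden direction.
rng = np.random.default_rng(0)
D, d, N = 10, 1, 50
W = orthonormalize(rng.standard_normal((D, d)))           # candidate projection (a covariance hyper-parameter)
X = rng.standard_normal((N, D))
y = np.sin(X @ W).ravel() + 0.05 * rng.standard_normal(N)

K = active_subspace_kernel(X, X, W) + 1e-4 * np.eye(N)    # small nugget standing in for observation noise
alpha = np.linalg.solve(K, y)                             # weights of the GP posterior mean
print(alpha[:3])

In the paper's setting, W itself would be estimated by the two-step maximum likelihood procedure on the Stiefel manifold; here it is simply fixed for brevity.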
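
The Thomson-scattering item above mentions Gauss-Laguerre quadrature for evaluating the depolarisation term of the forward model. The snippet below only illustrates the quadrature rule itself using numpy's laggauss; the integrand g is a stand-in and is not taken from the paper.

import numpy as np
from numpy.polynomial.laguerre import laggauss

# Gauss-Laguerre quadrature approximates integrals of the form
#   int_0^inf exp(-x) g(x) dx  ~  sum_i w_i g(x_i)
nodes, weights = laggauss(32)

def g(x):
    # Placeholder integrand; the actual depolarisation integrand of the TS forward model is not reproduced here.
    return 1.0 / (1.0 + x) ** 2

approx = np.sum(weights * g(nodes))
print(approx)   # the exact value of int_0^inf exp(-x) / (1 + x)^2 dx is approximately 0.4037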
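
The soft X-ray item above notes that, for linear (line-integral) measurements with Gaussian noise, the GP posterior on a discrete grid is multivariate normal with closed-form mean and covariance. The sketch below implements that standard identity under assumed placeholders (the response matrix R, the grid, and the stationary prior kernel are illustrative, not from the paper); it is not the non-stationary GP tomography code itself.

import numpy as np

def gp_linear_inverse_posterior(R, y, K_prior, sigma_noise):
    # Closed-form GP posterior for linear measurements y = R f + noise,
    # with prior f ~ N(0, K_prior) and i.i.d. Gaussian noise of standard deviation sigma_noise.
    S = R @ K_prior @ R.T + sigma_noise ** 2 * np.eye(len(y))   # marginal covariance of the data
    G = K_prior @ R.T @ np.linalg.inv(S)                        # gain matrix
    mean_post = G @ y
    cov_post = K_prior - G @ R @ K_prior
    return mean_post, cov_post

# Toy usage: 5 line integrals of an emissivity profile discretized on 30 grid points.
rng = np.random.default_rng(1)
grid = np.linspace(0.0, 1.0, 30)
K_prior = np.exp(-0.5 * (grid[:, None] - grid[None, :]) ** 2 / 0.1 ** 2)   # stationary RBF prior, for illustration only
R = rng.uniform(size=(5, 30)) / 30.0                                       # placeholder response (geometry) matrix
f_true = np.exp(-0.5 * (grid - 0.5) ** 2 / 0.05 ** 2)
y = R @ f_true + 0.01 * rng.standard_normal(5)

mean_post, cov_post = gp_linear_inverse_posterior(R, y, K_prior, sigma_noise=0.01)
print(mean_post.shape, np.sqrt(np.diag(cov_post)).max())

In the paper the prior kernel is non-stationary and its hyper-parameters are chosen by evidence maximization (Bayesian Occam's razor); everything is fixed here to keep the sketch short.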