DOE PAGES: U.S. Department of Energy
Office of Scientific and Technical Information


Title: Entropy-based closure for probabilistic learning on manifolds

Abstract

In a recent paper, the authors proposed a general methodology for probabilistic learning on manifolds. The method was used to generate numerical samples that are statistically consistent with an existing dataset construed as a realization of a non-Gaussian random vector. The manifold structure is learned using diffusion maps, and the statistical sample generation is accomplished using a projected Itô stochastic differential equation. This probabilistic learning approach has been extended to polynomial chaos representations of databases on manifolds and to probabilistic nonconvex constrained optimization with a fixed budget of function evaluations. The methodology introduces an isotropic-diffusion kernel with hyperparameter ε, which until now has been chosen more or less arbitrarily. In this paper, we propose a selection criterion for identifying an optimal value of ε, based on a maximum entropy argument. The result is a comprehensive, closed, probabilistic model for characterizing datasets with hidden constraints. The entropy argument ensures that, of all models satisfying the specified constraints, the one selected is the most uncertain beyond those constraints. Applications are presented for several databases.
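To make the two quantities named in the abstract concrete, the sketch below illustrates (i) the isotropic Gaussian kernel commonly used in diffusion-maps constructions, k_ε(x, x') = exp(-||x - x'||² / (4ε)), and (ii) a crude Monte Carlo plug-in estimate of the Shannon entropy of the associated kernel-density model. This is only an illustration under those assumptions, not the authors' algorithm; the function names, the kernel normalization, and the toy dataset are choices made for the example.

import numpy as np

def isotropic_diffusion_kernel(X, eps):
    # Gaussian kernel matrix K_ij = exp(-||x_i - x_j||^2 / (4*eps)) for the rows of X.
    sq_dists = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (4.0 * eps))

def entropy_estimate(X, eps):
    # Monte Carlo plug-in estimate of the Shannon entropy -E[log p] of the Gaussian
    # kernel-density model with per-component variance 2*eps, evaluated at the data.
    N, n = X.shape
    K = isotropic_diffusion_kernel(X, eps)
    np.fill_diagonal(K, 0.0)                     # leave-one-out: drop the self-term
    norm = (4.0 * np.pi * eps) ** (n / 2.0)      # normalizing constant of N(0, 2*eps*I)
    p_hat = K.sum(axis=1) / ((N - 1) * norm)     # density estimate at each sample
    return -np.mean(np.log(p_hat + 1e-300))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((200, 3))            # toy dataset standing in for a database
    for eps in (0.1, 0.5, 1.0, 2.0):
        print(f"eps = {eps:4.1f}   entropy estimate = {entropy_estimate(X, eps):.3f}")

Note that for an unconstrained kernel-density model this plug-in entropy simply grows with ε; as the abstract states, the selection in the paper is over models satisfying the specified constraints, which is what makes the maximum-entropy choice of ε well posed.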

Authors:
 Soize, C. [1];  Ghanem, R. [2];  Safta, C. [3];  Huan, X. [3];  Vane, Z. P. [3];  Oefelein, J. [3];  Lacaze, G. [3];  Najm, H. N. [3];  Tang, Q. [4];  Chen, X. [4]
  1. Univ. Paris, Marne-La-Vallee (France)
  2. Univ. of Southern California, Los Angeles, CA (United States)
  3. Sandia National Lab. (SNL-CA), Livermore, CA (United States)
  4. Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
Publication Date:
March 27, 2019
Research Org.:
Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
Sponsoring Org.:
USDOE National Nuclear Security Administration (NNSA); Defense Advanced Research Projects Agency (DARPA)
OSTI Identifier:
1526161
Alternate Identifier(s):
OSTI ID: 1529282
Report Number(s):
LLNL-JRNL-744191; SAND-2018-0462J
Journal ID: ISSN 0021-9991; 899147
Grant/Contract Number:  
AC52-07NA27344; AC04-94AL85000
Resource Type:
Accepted Manuscript
Journal Name:
Journal of Computational Physics
Additional Journal Information:
Journal Volume: 388; Journal Issue: C; Journal ID: ISSN 0021-9991
Publisher:
Elsevier
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING

Citation Formats

Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., and Chen, X. Entropy-based closure for probabilistic learning on manifolds. United States: N. p., 2019. Web. doi:10.1016/j.jcp.2018.12.029.
Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., & Chen, X. Entropy-based closure for probabilistic learning on manifolds. United States. doi:10.1016/j.jcp.2018.12.029.
Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., and Chen, X. Wed Mar 27, 2019. "Entropy-based closure for probabilistic learning on manifolds". United States. doi:10.1016/j.jcp.2018.12.029.
@article{osti_1526161,
title = {Entropy-based closure for probabilistic learning on manifolds},
author = {Soize, C. and Ghanem, R. and Safta, C. and Huan, X. and Vane, Z. P. and Oefelein, J. and Lacaze, G. and Najm, H. N. and Tang, Q. and Chen, X.},
abstractNote = {In a recent paper, the authors proposed a general methodology for probabilistic learning on manifolds. The method was used to generate numerical samples that are statistically consistent with an existing dataset construed as a realization of a non-Gaussian random vector. The manifold structure is learned using diffusion maps, and the statistical sample generation is accomplished using a projected Itô stochastic differential equation. This probabilistic learning approach has been extended to polynomial chaos representations of databases on manifolds and to probabilistic nonconvex constrained optimization with a fixed budget of function evaluations. The methodology introduces an isotropic-diffusion kernel with hyperparameter ε, which until now has been chosen more or less arbitrarily. In this paper, we propose a selection criterion for identifying an optimal value of ε, based on a maximum entropy argument. The result is a comprehensive, closed, probabilistic model for characterizing datasets with hidden constraints. The entropy argument ensures that, of all models satisfying the specified constraints, the one selected is the most uncertain beyond those constraints. Applications are presented for several databases.},
doi = {10.1016/j.jcp.2018.12.029},
journal = {Journal of Computational Physics},
number = {C},
volume = {388},
place = {United States},
year = {2019},
month = {3}
}

Journal Article:
Free Publicly Available Full Text
This content will become publicly available on March 27, 2020
Publisher's Version of Record
