Entropy-based closure for probabilistic learning on manifolds
Abstract
In a recent paper, the authors proposed a general methodology for probabilistic learning on manifolds. The method was used to generate numerical samples that are statistically consistent with an existing dataset construed as a realization of a non-Gaussian random vector. The manifold structure is learned using diffusion manifolds, and the statistical sample generation is accomplished using a projected Itô stochastic differential equation. This probabilistic learning approach has been extended to polynomial chaos representations of databases on manifolds and to probabilistic nonconvex constrained optimization with a fixed budget of function evaluations. The methodology introduces an isotropic-diffusion kernel with hyperparameter ε. Currently, ε is more or less arbitrarily chosen. In this paper, we propose a selection criterion for identifying an optimal value of ε, based on a maximum-entropy argument. The result is a comprehensive, closed, probabilistic model for characterizing datasets with hidden constraints. The entropy argument ensures that, out of all possible models, the one selected is the most uncertain beyond any specified constraints. Applications are presented for several databases.
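The abstract describes an isotropic-diffusion kernel whose smoothing parameter ε is selected by scanning candidate values and keeping the one whose resulting model has maximum entropy. The sketch below illustrates only that selection pattern, not the paper's actual method: the sampler (a single random-walk step on the data graph plus Gaussian jitter) is a toy stand-in for the projected-ISDE generator, and the resubstitution kernel-density entropy estimate is likewise an illustrative choice. All function names here are hypothetical.

```python
import numpy as np

def diffusion_kernel(X, eps):
    """Isotropic Gaussian kernel k(x, x') = exp(-|x - x'|^2 / (4 eps))."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (4.0 * eps))

def generate_samples(X, eps, m, rng):
    """Toy stand-in for the paper's sampler: take one step of the random
    walk defined by the row-normalized kernel, then add small jitter."""
    K = diffusion_kernel(X, eps)
    P = K / K.sum(axis=1, keepdims=True)          # transition probabilities
    start = rng.integers(0, len(X), size=m)
    nxt = np.array([rng.choice(len(X), p=P[i]) for i in start])
    return X[nxt] + 0.1 * rng.normal(size=(m, X.shape[1]))

def entropy_estimate(S, h=0.3):
    """Resubstitution entropy estimate -mean(log p_hat) under a Gaussian
    KDE with fixed bandwidth h (kept independent of eps for fairness)."""
    n, dim = S.shape
    d2 = ((S[:, None, :] - S[None, :, :]) ** 2).sum(axis=-1)
    p = np.exp(-d2 / (2 * h * h)).sum(axis=1) / (n * (2 * np.pi * h * h) ** (dim / 2))
    return -np.log(p).mean()

def select_eps(X, grid, m=200, rng=None):
    """Maximum-entropy selection: keep the eps whose samples are most uncertain."""
    rng = rng or np.random.default_rng(0)
    ents = [entropy_estimate(generate_samples(X, e, m, rng)) for e in grid]
    return grid[int(np.argmax(ents))]

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))                       # synthetic dataset
grid = np.geomspace(0.05, 5.0, 6)                  # candidate eps values
eps_star = select_eps(X, grid, rng=rng)
```

With this toy generator the entropy curve need not exhibit the interior maximum reported in the paper; the snippet only shows the grid-scan-and-argmax structure of an entropy-based selection of ε.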
 Authors:

 Univ. Paris, Marne-la-Vallée (France)
 Univ. of Southern California, Los Angeles, CA (United States)
 Sandia National Lab. (SNLCA), Livermore, CA (United States)
 Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)
 Publication Date:
 Research Org.:
 Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Sandia National Lab. (SNLNM), Albuquerque, NM (United States)
 Sponsoring Org.:
 USDOE National Nuclear Security Administration (NNSA); Defense Advanced Research Projects Agency (DARPA)
 OSTI Identifier:
 1526161
 Alternate Identifier(s):
 OSTI ID: 1529282
 Report Number(s):
 LLNL-JRNL-744191; SAND2018-0462J
Journal ID: ISSN 0021-9991; 899147
 Grant/Contract Number:
 AC5207NA27344; AC0494AL85000
 Resource Type:
 Accepted Manuscript
 Journal Name:
 Journal of Computational Physics
 Additional Journal Information:
 Journal Volume: 388; Journal Issue: C; Journal ID: ISSN 0021-9991
 Publisher:
 Elsevier
 Country of Publication:
 United States
 Language:
 English
 Subject:
 97 MATHEMATICS AND COMPUTING
Citation Formats
Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., and Chen, X. Entropy-based closure for probabilistic learning on manifolds. United States: N. p., 2019.
Web. doi:10.1016/j.jcp.2018.12.029.
Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., & Chen, X. Entropy-based closure for probabilistic learning on manifolds. United States. doi:10.1016/j.jcp.2018.12.029.
Soize, C., Ghanem, R., Safta, C., Huan, X., Vane, Z. P., Oefelein, J., Lacaze, G., Najm, H. N., Tang, Q., and Chen, X. Wed .
"Entropy-based closure for probabilistic learning on manifolds". United States. doi:10.1016/j.jcp.2018.12.029. https://www.osti.gov/servlets/purl/1526161.
@article{osti_1526161,
title = {Entropy-based closure for probabilistic learning on manifolds},
author = {Soize, C. and Ghanem, R. and Safta, C. and Huan, X. and Vane, Z. P. and Oefelein, J. and Lacaze, G. and Najm, H. N. and Tang, Q. and Chen, X.},
abstractNote = {In a recent paper, the authors proposed a general methodology for probabilistic learning on manifolds. The method was used to generate numerical samples that are statistically consistent with an existing dataset construed as a realization of a non-Gaussian random vector. The manifold structure is learned using diffusion manifolds, and the statistical sample generation is accomplished using a projected Itô stochastic differential equation. This probabilistic learning approach has been extended to polynomial chaos representations of databases on manifolds and to probabilistic nonconvex constrained optimization with a fixed budget of function evaluations. The methodology introduces an isotropic-diffusion kernel with hyperparameter ε. Currently, ε is more or less arbitrarily chosen. In this paper, we propose a selection criterion for identifying an optimal value of ε, based on a maximum-entropy argument. The result is a comprehensive, closed, probabilistic model for characterizing datasets with hidden constraints. The entropy argument ensures that, out of all possible models, the one selected is the most uncertain beyond any specified constraints. Applications are presented for several databases.},
doi = {10.1016/j.jcp.2018.12.029},
journal = {Journal of Computational Physics},
number = {C},
volume = 388,
place = {United States},
year = {2019},
month = {3}
}