Uncertainty Quantification via Stable Distribution Propagation
Journal Article · Published as a conference paper at ICLR 2024
We propose a new approach for propagating stable probability distributions through neural networks. Our method is based on local linearization, which we show to be an optimal approximation in terms of total variation distance for the ReLU non-linearity. This allows propagating Gaussian and Cauchy input uncertainties through neural networks to quantify their output uncertainties. To demonstrate the utility of propagating distributions, we apply the proposed method to predicting calibrated confidence intervals and selective prediction on out-of-distribution data. The results demonstrate a broad applicability of propagating distributions and show the advantages of our method over other approaches such as moment matching.
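The core idea described in the abstract — locally linearizing each non-linearity so that a Gaussian input stays Gaussian through the network — can be illustrated with a minimal sketch. An affine layer transforms a Gaussian exactly (mean `W @ mu + b`, covariance `W @ cov @ W.T`); for ReLU, linearizing at the input mean uses the Jacobian `1[mu > 0]`. This is an illustrative toy, not the paper's implementation, and the function names (`propagate_linear`, `propagate_relu`) are invented for this example:

```python
import numpy as np

def propagate_linear(mu, cov, W, b):
    """Exact Gaussian propagation through an affine layer y = W x + b."""
    return W @ mu + b, W @ cov @ W.T

def propagate_relu(mu, cov):
    """Local linearization of ReLU at the input mean:
    ReLU(x) ≈ ReLU(mu) + 1[mu > 0] * (x - mu),
    so the output stays Gaussian (Gaussians are closed under affine maps)."""
    mask = (mu > 0).astype(float)   # ReLU derivative evaluated at the mean
    J = np.diag(mask)               # local (diagonal) Jacobian
    return np.maximum(mu, 0.0), J @ cov @ J.T

# Toy example: one affine layer followed by ReLU.
rng = np.random.default_rng(0)
W1, b1 = rng.standard_normal((3, 2)), np.zeros(3)
mu, cov = np.array([0.5, -1.0]), 0.1 * np.eye(2)
mu, cov = propagate_linear(mu, cov, W1, b1)
mu, cov = propagate_relu(mu, cov)
print(mu, np.diag(cov))
```

Note how a coordinate whose mean is negative is clamped to zero and its variance is zeroed out by the linearization, while positive-mean coordinates pass their uncertainty through unchanged; the same closed-form update applies to Cauchy inputs because stable distributions are also closed under affine maps.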
- Research Organization:
- SLAC National Accelerator Laboratory (SLAC), Menlo Park, CA (United States). SUNCAT Center for Interface Science and Catalysis
- Sponsoring Organization:
- USDOE Office of Science (SC)
- Grant/Contract Number:
- AC02-76SF00515
- OSTI ID:
- 3017340
- Journal Information:
- Published as a conference paper at ICLR 2024
- Country of Publication:
- United States
- Language:
- English
Similar Records
- Uncertainty Quantification and Sensitivity Analysis of Low-Dimensional Manifold via Co-Kurtosis PCA in Combustion Modeling
- Uncertainty quantification for neural network potential foundation models
  Technical Report · September 2024 · OSTI ID: 2462986
- Uncertainty quantification for neural network potential foundation models
  Journal Article · April 2025 · npj Computational Materials · OSTI ID: 2562728