Stochastic gradient descent for optimization for nuclear systems
Journal Article · Scientific Reports
- Univ. of Tennessee, Knoxville, TN (United States)
The use of gradient descent methods for optimizing k-eigenvalue nuclear systems has been shown to be useful in the past, but the use of k-eigenvalue gradients has proved computationally challenging due to their stochastic nature. ADAM is a gradient descent method designed to account for gradients with a stochastic nature. This analysis uses purpose-built challenge problems to verify whether ADAM is a suitable tool for optimizing k-eigenvalue nuclear systems. ADAM successfully optimizes nuclear systems using the gradients of k-eigenvalue problems despite their stochastic nature and uncertainty. Furthermore, the results demonstrate that low-compute-time, high-variance estimates of the gradient lead to better performance on the optimization challenge problems tested here.
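The abstract's point is that ADAM damps gradient noise through exponential moving averages, so even cheap, high-variance gradient estimates can drive the optimization. A minimal sketch of the standard ADAM update applied to a noisy gradient oracle is shown below; the quadratic surrogate objective and noise level are illustrative stand-ins, not the paper's k-eigenvalue model.

```python
import random

def adam_minimize(grad_fn, x0, steps=2000, lr=0.05,
                  beta1=0.9, beta2=0.999, eps=1e-8):
    """Standard ADAM: bias-corrected exponential moving averages of the
    gradient (m) and its square (v) smooth out per-step noise."""
    x, m, v = x0, 0.0, 0.0
    for t in range(1, steps + 1):
        g = grad_fn(x)                    # noisy gradient estimate
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)      # bias correction
        v_hat = v / (1 - beta2 ** t)
        x -= lr * m_hat / (v_hat ** 0.5 + eps)
    return x

# Hypothetical surrogate: minimize f(x) = (x - 3)^2 with a high-variance
# gradient oracle, standing in for a stochastic (e.g. Monte Carlo)
# k-eigenvalue gradient estimate.
random.seed(0)
noisy_grad = lambda x: 2 * (x - 3) + random.gauss(0.0, 2.0)
x_star = adam_minimize(noisy_grad, x0=0.0)
```

Despite per-sample gradient noise comparable in magnitude to the true gradient near the optimum, the iterate settles close to the minimizer at x = 3, which is the behavior the paper exploits when it favors cheap, noisy gradient estimates.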
- Research Organization:
- Univ. of California, Oakland, CA (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA)
- Grant/Contract Number:
- NA0000979; NA0003180
- OSTI ID:
- 2417878
- Journal Information:
- Scientific Reports, Vol. 13, Issue 1; ISSN 2045-2322
- Publisher:
- Nature Publishing Group
- Country of Publication:
- United States
- Language:
- English
Similar Records
- A Stochastic Gradient Descent Approach for Stochastic Optimal Control · Journal Article · August 2020 · East Asian Journal on Applied Mathematics · OSTI ID: 1814258
- Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems · Journal Article · March 2020 · Foundations of Data Science · OSTI ID: 1632071
- Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent · Journal Article · October 2023 · Communications on Applied Mathematics and Computation · OSTI ID: 2007675