Stochastic gradient descent for optimization of nuclear systems
- Univ. of Tennessee, Knoxville, TN (United States)
The use of gradient descent methods for optimizing k-eigenvalue nuclear systems has been shown to be useful in the past, but k-eigenvalue gradients have proved computationally challenging because they are stochastic estimates. ADAM is a gradient descent method designed to handle gradients of a stochastic nature. This analysis uses constructed challenge problems to verify whether ADAM is a suitable tool for optimizing k-eigenvalue nuclear systems. ADAM successfully optimizes nuclear systems using the gradients of k-eigenvalue problems despite their stochasticity and uncertainty. Furthermore, it is clearly demonstrated that low-compute-time, high-variance estimates of the gradient lead to better performance on the optimization challenge problems tested here.
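The abstract's central point is that ADAM tolerates noisy gradient estimates such as those produced by Monte Carlo k-eigenvalue calculations. As an illustration only, the sketch below implements the standard ADAM update loop on a toy quadratic objective whose gradient is corrupted by Gaussian noise, standing in for Monte Carlo uncertainty; the objective, noise model, and all parameter values are assumptions for demonstration and are not taken from the paper.

```python
import numpy as np

def adam_minimize(grad_fn, x0, lr=0.1, beta1=0.9, beta2=0.999,
                  eps=1e-8, steps=500):
    """Standard ADAM update loop applied to a stochastic gradient oracle."""
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first-moment (mean) estimate of the gradient
    v = np.zeros_like(x)  # second-moment (uncentered variance) estimate
    for t in range(1, steps + 1):
        g = grad_fn(x)
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)  # bias correction for zero init
        v_hat = v / (1 - beta2 ** t)
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x

# Hypothetical stand-in for a stochastic k-eigenvalue gradient: the exact
# gradient of f(x) = ||x - target||^2 plus Gaussian noise mimicking the
# statistical uncertainty of a Monte Carlo gradient estimate.
rng = np.random.default_rng(0)
target = np.array([2.0, -1.0])
noisy_grad = lambda x: 2.0 * (x - target) + rng.normal(scale=1.0, size=x.shape)

x_opt = adam_minimize(noisy_grad, x0=[0.0, 0.0])
```

Because the second-moment estimate `v` scales each step by the recent gradient magnitude, persistent noise inflates `v` and shrinks the effective step, which is why high-variance gradient estimates remain usable.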
- Research Organization:
- Univ. of California, Oakland, CA (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA)
- Grant/Contract Number:
- NA0000979; NA0003180
- OSTI ID:
- 2417878
- Journal Information:
- Scientific Reports, Vol. 13, Issue 1; ISSN 2045-2322
- Publisher:
- Nature Publishing Group
- Country of Publication:
- United States
- Language:
- English
Similar Records
Stochastic gradient descent algorithm for stochastic optimization in solving analytic continuation problems
Convergence of Hyperbolic Neural Networks Under Riemannian Stochastic Gradient Descent