Multiobjective Hyperparameter Optimization for Deep Learning Interatomic Potential Training Using NSGA-II
Deep neural network (DNN) potentials are an emerging tool for simulating dynamical atomistic systems, promising quantum mechanical accuracy at speedups of 10,000×. As with other DNN methods, the hyperparameters used during training can make a substantial difference in model accuracy, and optimal settings vary with the dataset. To enable rapid tuning of hyperparameters for DNN potential training, we developed a scalable multiobjective evolutionary optimization algorithm for supercomputers and tested it on the Summit system at the Oak Ridge Leadership Computing Facility (OLCF). A multiobjective approach is required because the potential is defined by two coupled learned quantities: the energy and the forces. Using a large-scale implementation of the NSGA-II algorithm adapted for training DNN potentials, we discovered several Pareto-optimal hyperparameter combinations, including the best choices of activation function, learning-rate scaling scheme, and pairing of the two radial cutoffs used in the three-dimensional descriptor function.
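The abstract does not include code, so the following is a minimal, self-contained Python sketch of the NSGA-II mechanics it describes: non-dominated sorting, crowding-distance ranking, and elitist selection over two objectives standing in for energy and force errors. The hyperparameter names, bounds, and the toy `evaluate` function are hypothetical placeholders, not taken from the paper; in the actual workflow each evaluation would correspond to training a DNN potential and measuring its validation errors.

```python
import random

# Hypothetical search space (names and bounds are illustrative, not from the paper).
BOUNDS = {
    "log10_lr":  (-5.0, -2.0),  # log10 of the learning rate
    "cutoff_2b": (3.0, 8.0),    # first radial cutoff (angstrom)
    "cutoff_3b": (2.0, 6.0),    # second radial cutoff (angstrom)
}
KEYS = list(BOUNDS)

def evaluate(x):
    """Stand-in for the expensive step: train a DNN potential with
    hyperparameters `x` and return (energy error, force error).
    A toy analytic function is used here so the sketch runs as-is."""
    e = (x["log10_lr"] + 3.5) ** 2 + 0.1 * x["cutoff_2b"]
    f = (x["cutoff_3b"] - 4.0) ** 2 + 0.1 / (x["cutoff_2b"] - 2.9)
    return (e, f)

def dominates(a, b):
    """a Pareto-dominates b: no worse in every objective, not identical."""
    return all(ai <= bi for ai, bi in zip(a, b)) and a != b

def nondominated_sort(objs):
    """Peel the population into Pareto fronts, best front first."""
    fronts, remaining = [], set(range(len(objs)))
    while remaining:
        front = [i for i in remaining
                 if not any(dominates(objs[j], objs[i]) for j in remaining)]
        fronts.append(front)
        remaining -= set(front)
    return fronts

def crowding(front, objs):
    """Crowding distance within one front (larger = less crowded)."""
    dist = {i: 0.0 for i in front}
    for m in range(2):  # two objectives: energy error, force error
        order = sorted(front, key=lambda i: objs[i][m])
        dist[order[0]] = dist[order[-1]] = float("inf")  # keep the extremes
        span = objs[order[-1]][m] - objs[order[0]][m] or 1.0
        for k in range(1, len(order) - 1):
            dist[order[k]] += (objs[order[k + 1]][m] - objs[order[k - 1]][m]) / span
    return dist

def tournament(pop):
    """Binary tournament; `pop` is kept ordered best-first after selection,
    so the lower index wins (effectively random in the first generation)."""
    i, j = random.randrange(len(pop)), random.randrange(len(pop))
    return pop[min(i, j)]

def make_child(p, q):
    """Uniform crossover plus Gaussian mutation, clipped to bounds
    (the canonical algorithm uses SBX/polynomial operators instead)."""
    child = {}
    for k in KEYS:
        lo, hi = BOUNDS[k]
        v = random.choice((p[k], q[k])) + random.gauss(0.0, 0.1 * (hi - lo))
        child[k] = min(hi, max(lo, v))
    return child

def nsga2(pop_size=24, generations=30):
    pop = [{k: random.uniform(*BOUNDS[k]) for k in KEYS} for _ in range(pop_size)]
    for _ in range(generations):
        children = [make_child(tournament(pop), tournament(pop))
                    for _ in range(pop_size)]
        union = pop + children
        objs = [evaluate(x) for x in union]    # in real use, cache or parallelize
        nxt = []
        for front in nondominated_sort(objs):  # elitist environmental selection
            if len(nxt) + len(front) <= pop_size:
                nxt += front
            else:  # fill the last slots by descending crowding distance
                d = crowding(front, objs)
                nxt += sorted(front, key=lambda i: -d[i])[:pop_size - len(nxt)]
                break
        pop = [union[i] for i in nxt]
    objs = [evaluate(x) for x in pop]
    return [(pop[i], objs[i]) for i in nondominated_sort(objs)[0]]

if __name__ == "__main__":
    random.seed(0)
    for x, f in nsga2():
        print({k: round(v, 3) for k, v in x.items()},
              "->", tuple(round(v, 4) for v in f))
```

In a setup like this the expensive step is `evaluate`; the sorting and selection logic stays cheap, so a supercomputer-scale implementation such as the one described above would presumably parallelize the training runs within each generation across nodes.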
- Research Organization: Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization: USDOE
- DOE Contract Number: AC05-00OR22725
- OSTI ID: 1996670
- Country of Publication: United States
- Language: English
Similar Records
Improving Deep Neural Networks’ Training for Image Classification With Nonlinear Conjugate Gradient-Style Adaptive Momentum
NSGA-PINN: A Multi-Objective Optimization Method for Physics-Informed Neural Network Training