Evolving Deep Networks Using HPC
- ORNL, Oak Ridge
- Fermilab
- Santa Maria U., Valparaiso
While many deep learning networks that produce outstanding results on natural image datasets have been studied and published, natural images make up only a fraction of the data to which deep learning can be applied. Other data types, such as text, audio, and arrays of sensors, have characteristics very different from natural images. Because these “best” networks for natural images have been discovered largely through experimentation and cannot be proven optimal on any theoretical basis, there is no reason to believe they are the optimal networks for these drastically different datasets. Hyperparameter search is therefore often a critical step when applying deep learning to a new problem. In this work we present an evolutionary approach to searching the space of network hyperparameters and construction that can scale to 18,000 nodes. We apply this approach to datasets of varying types and characteristics and demonstrate the ability to rapidly find the best hyperparameters, enabling practitioners to iterate quickly between idea and result.
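The abstract does not spell out the evolutionary algorithm used, but the general pattern it describes (a population of candidate network configurations, evaluated and refined over generations) can be sketched as follows. All names, the hyperparameter space, and the fitness function below are illustrative assumptions, not details from the paper; in the real setting each fitness evaluation would train a candidate network, with evaluations distributed across HPC nodes.

```python
import random

# Hypothetical hyperparameter space; the actual search space in the
# paper (layer structure, kernel sizes, etc.) is not given in this record.
SPACE = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "num_layers": [2, 4, 8],
    "kernel_size": [3, 5, 7],
}

def random_individual(rng):
    """Sample one candidate configuration uniformly from SPACE."""
    return {k: rng.choice(v) for k, v in SPACE.items()}

def mutate(ind, rng, rate=0.3):
    """Copy a configuration, re-sampling each hyperparameter with prob. rate."""
    child = dict(ind)
    for k, choices in SPACE.items():
        if rng.random() < rate:
            child[k] = rng.choice(choices)
    return child

def evolve(fitness, generations=10, pop_size=8, seed=0):
    """Simple truncation-selection evolutionary search: keep the best
    half of the population each generation and refill with mutants."""
    rng = random.Random(seed)
    pop = [random_individual(rng) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        pop = survivors + [mutate(rng.choice(survivors), rng)
                           for _ in range(pop_size - len(survivors))]
    return max(pop, key=fitness)

# Stand-in fitness peaked at (1e-3, 4, 5); in practice this would be
# validation accuracy after training the candidate network.
def toy_fitness(ind):
    return (-abs(ind["learning_rate"] - 1e-3)
            - abs(ind["num_layers"] - 4)
            - abs(ind["kernel_size"] - 5))

best = evolve(toy_fitness)
```

The key property exploited at scale is that every fitness evaluation within a generation is independent, so a population can be farmed out across thousands of nodes at once.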
- Research Organization:
- Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States). Oak Ridge Leadership Computing Facility (OLCF); Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)
- Sponsoring Organization:
- USDOE Office of Science (SC), High Energy Physics (HEP)
- DOE Contract Number:
- AC02-07CH11359
- OSTI ID:
- 1414394
- Report Number(s):
- FERMILAB-CONF-17-567-CD-ND; 1644275
- Country of Publication:
- United States
- Language:
- English