DOE PAGES, U.S. Department of Energy
Office of Scientific and Technical Information

Title: HyperSpace: Distributed Bayesian Hyperparameter Optimization

Abstract

As machine learning models continue to increase in complexity, so does the potential number of free model parameters commonly known as hyperparameters. While there has been considerable progress toward finding optimal configurations of these hyperparameters, many optimization procedures are treated as black boxes. We believe optimization methods should not only return a set of optimized hyperparameters, but also give insight into the effects of model hyperparameter settings. To this end, we present HyperSpace, a parallel implementation of Bayesian sequential model-based optimization. HyperSpace leverages high performance computing (HPC) resources to better understand unknown, potentially non-convex hyperparameter search spaces. We show that it is possible to learn the dependencies between model hyperparameters through the optimization process. By partitioning large search spaces and running many optimization procedures in parallel, we also show that it is possible to discover families of good hyperparameter settings over a variety of models including unsupervised clustering, regression, and classification tasks.
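The abstract's core idea is to partition a large hyperparameter search space into subregions and run an independent sequential Bayesian optimization in each one, in parallel. Below is a minimal sketch of that idea, not the HyperSpace API itself: it assumes scikit-optimize's gp_minimize as the sequential optimizer and uses Python's multiprocessing in place of HPC/MPI ranks; the objective function, search space, and overlap fraction are hypothetical stand-ins chosen for illustration.

from itertools import product
from multiprocessing import Pool

from skopt import gp_minimize  # Gaussian-process-based sequential Bayesian optimization


def objective(params):
    """Toy stand-in for a model-training-and-validation run."""
    learning_rate, num_layers = params
    return (learning_rate - 0.01) ** 2 + (num_layers - 4) ** 2


def split_dimension(low, high, overlap=0.25):
    """Split one interval into two overlapping halves."""
    mid = (low + high) / 2.0
    pad = (high - low) * overlap / 2.0
    return [(low, mid + pad), (mid - pad, high)]


def optimize_subspace(dimensions, n_calls=25, seed=0):
    """Run one sequential Bayesian optimization restricted to a subregion."""
    result = gp_minimize(objective, dimensions, n_calls=n_calls, random_state=seed)
    return result.x, result.fun


if __name__ == "__main__":
    # Full search space: learning rate and a (relaxed-to-continuous) layer count.
    full_space = [(1e-4, 1.0), (1.0, 16.0)]

    # Cartesian product of per-dimension halves -> 2^d overlapping subregions.
    subspaces = list(product(*(split_dimension(lo, hi) for lo, hi in full_space)))

    # Each subregion is optimized independently; on an HPC system these would be
    # separate ranks or jobs rather than local worker processes.
    with Pool(processes=len(subspaces)) as pool:
        results = pool.map(optimize_subspace, subspaces)

    for region, (best_x, best_val) in zip(subspaces, results):
        print(f"region={region} best_params={best_x} best_value={best_val:.4f}")

Because the subregions overlap, the independent runs collectively cover the full space and expose families of good settings rather than a single optimum, which is the behavior the abstract describes.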

Authors:
 Young, M. Todd [1]; Hinkle, Jacob [1]; Ramanathan, Arvind [1]; Kannan, Ramakrishnan [1]
  1. Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Publication Date:
September 2018
Research Org.:
Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)
Sponsoring Org.:
USDOE
OSTI Identifier:
1556107
Grant/Contract Number:  
AC05-00OR22725
Resource Type:
Accepted Manuscript
Journal Name:
Proceedings (Symposium on Computer Architecture and High Performance Computing)
Additional Journal Information:
Journal Name: Proceedings (Symposium on Computer Architecture and High Performance Computing); Conference: 2018 30th International Symposium on Computer Architecture and High Performance Computing (SBAC-PAD), Lyon (France), 24-27 Sept. 2018; Journal ID: ISSN 1550-6533
Publisher:
IEEE
Country of Publication:
United States
Language:
English
Subject:
97 MATHEMATICS AND COMPUTING

Citation Formats

Young, M. Todd, Hinkle, Jacob, Ramanathan, Arvind, and Kannan, Ramakrishnan. HyperSpace: Distributed Bayesian Hyperparameter Optimization. United States: N. p., 2018. Web. doi:10.1109/CAHPC.2018.8645954.
Young, M. Todd, Hinkle, Jacob, Ramanathan, Arvind, & Kannan, Ramakrishnan. HyperSpace: Distributed Bayesian Hyperparameter Optimization. United States. doi:10.1109/CAHPC.2018.8645954.
Young, M. Todd, Hinkle, Jacob, Ramanathan, Arvind, and Kannan, Ramakrishnan. 2018. "HyperSpace: Distributed Bayesian Hyperparameter Optimization". United States. doi:10.1109/CAHPC.2018.8645954. https://www.osti.gov/servlets/purl/1556107.
@article{osti_1556107,
title = {HyperSpace: Distributed Bayesian Hyperparameter Optimization},
author = {Young, M. Todd and Hinkle, Jacob and Ramanathan, Arvind and Kannan, Ramakrishnan},
abstractNote = {As machine learning models continue to increase in complexity, so does the potential number of free model parameters commonly known as hyperparameters. While there has been considerable progress toward finding optimal configurations of these hyperparameters, many optimization procedures are treated as black boxes. We believe optimization methods should not only return a set of optimized hyperparameters, but also give insight into the effects of model hyperparameter settings. To this end, we present HyperSpace, a parallel implementation of Bayesian sequential model-based optimization. HyperSpace leverages high performance computing (HPC) resources to better understand unknown, potentially non-convex hyperparameter search spaces. We show that it is possible to learn the dependencies between model hyperparameters through the optimization process. By partitioning large search spaces and running many optimization procedures in parallel, we also show that it is possible to discover families of good hyperparameter settings over a variety of models including unsupervised clustering, regression, and classification tasks.},
doi = {10.1109/CAHPC.2018.8645954},
journal = {Proceedings (Symposium on Computer Architecture and High Performance Computing)},
number = {},
volume = {},
place = {United States},
year = {2018},
month = {9}
}

Journal Article:
Free Publicly Available Full Text
Publisher's Version of Record
