OSTI.GOV, U.S. Department of Energy
Office of Scientific and Technical Information

Title: Bayesian Multi-objective Hyperparameter Optimization for Accurate, Fast, and Efficient Neural Network Accelerator Design

Journal Article · Frontiers in Neuroscience (Online)

In resource-constrained environments, such as low-power edge devices and smart sensors, deploying a fast, compact, and accurate intelligent system with minimal energy is indispensable. Embedding intelligence can be achieved by running neural networks on neuromorphic hardware. Designing such networks requires determining several inherent hyperparameters. A key challenge is to find the optimal set of hyperparameters, which may belong to the input/output encoding modules, the neural network itself, the application, or the underlying hardware. In this work, we present a hierarchical pseudo agent-based multi-objective Bayesian hyperparameter optimization framework (spanning both software and hardware) that not only maximizes the performance of the network but also minimizes the energy and area requirements of the corresponding neuromorphic hardware. We validate the performance of our approach (in terms of accuracy and computation speed) on several control and classification applications on digital and mixed-signal (memristor-based) neural accelerators. We show that the optimal set of hyperparameters can drastically improve the performance of one application (i.e., 52–71% for Pole-Balance) while having minimal effect on another (i.e., 50–53% for RoboNav). In addition, we demonstrate the resiliency of the input/output encoding, neural network training, and underlying accelerator modules of a neuromorphic system to changes in the hyperparameters.
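To make the idea concrete, the sketch below shows one common way to realize multi-objective Bayesian hyperparameter optimization: a Gaussian process surrogate with an expected-improvement acquisition over a weighted scalarization of accuracy, energy, and area. This is a minimal illustration, not the authors' hierarchical pseudo agent-based framework; the evaluate() stub, the two hypothetical hyperparameters, and the scalarization weights are assumptions for demonstration only.

```python
# Minimal sketch of multi-objective Bayesian hyperparameter optimization.
# NOT the paper's framework: the evaluate() stub, the hyperparameter ranges,
# and the weighted scalarization of {accuracy, energy, area} are assumptions.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import Matern

rng = np.random.default_rng(0)

def evaluate(x):
    """Placeholder for a real accelerator-in-the-loop evaluation.
    Returns (accuracy, energy, area); here a synthetic stand-in."""
    lr, n = x  # hypothetical hyperparameters, scaled to [0, 1]
    accuracy = 1.0 - (lr - 0.3) ** 2 - 0.5 * (n - 0.7) ** 2
    energy = 0.2 + 0.8 * n   # more neurons -> more energy
    area = 0.1 + 0.9 * n     # more neurons -> more area
    return accuracy, energy, area

def scalarize(acc, energy, area, w=(0.6, 0.2, 0.2)):
    """Weighted scalarization: maximize accuracy, minimize energy and area."""
    return w[0] * acc - w[1] * energy - w[2] * area

def expected_improvement(mu, sigma, best):
    """Standard EI acquisition for a maximization problem."""
    sigma = np.maximum(sigma, 1e-9)
    z = (mu - best) / sigma
    return (mu - best) * norm.cdf(z) + sigma * norm.pdf(z)

# Initial random design, then iterate: fit surrogate, maximize acquisition.
X = rng.random((5, 2))
y = np.array([scalarize(*evaluate(x)) for x in X])
gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)

for _ in range(20):
    gp.fit(X, y)
    cand = rng.random((2000, 2))                 # dense candidate sampling
    mu, sigma = gp.predict(cand, return_std=True)
    x_next = cand[np.argmax(expected_improvement(mu, sigma, y.max()))]
    X = np.vstack([X, x_next])
    y = np.append(y, scalarize(*evaluate(x_next)))

print(f"best hyperparameters: {X[np.argmax(y)]}, objective: {y.max():.4f}")
```

A scalarization like this collapses the Pareto front to a single trade-off point per weight vector; genuinely multi-objective acquisitions (e.g., expected hypervolume improvement) instead approximate the full front, at higher computational cost.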

Research Organization:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR)
Grant/Contract Number:
AC05-00OR22725
OSTI ID:
1640236
Alternate ID(s):
OSTI ID: 1649130
Journal Information:
Frontiers in Neuroscience (Online), Vol. 14; ISSN 1662-453X
Publisher:
Frontiers Research Foundation
Country of Publication:
Switzerland
Language:
English
Citation Metrics:
Cited by: 18 works (citation information provided by Web of Science)
