
Bayesian-based Hyperparameter Optimization for Spiking Neuromorphic Systems

Conference

Designing a neuromorphic computing system involves the selection of several hyperparameters that affect not only the accuracy of the framework but also the energy efficiency and speed of inference and training. These hyperparameters may be inherent to the training of the spiking neural network (SNN), the input/output encoding of real-world data to spikes, or the underlying neuromorphic hardware. In this work, we present a Bayesian-based hyperparameter optimization approach for spiking neuromorphic systems, and we show how this optimization framework can lead to significant improvements in designing accurate neuromorphic computing systems. In particular, we show that this hyperparameter optimization approach can discover the same optimal hyperparameter set for input encoding as a grid search, but with far fewer evaluations and in far less time. We also show the impact of hardware-specific hyperparameters on the performance of the system, and we demonstrate that by optimizing these hyperparameters we can achieve significantly better application performance.
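The abstract contrasts Bayesian optimization with grid search for input-encoding hyperparameters. The sketch below is a minimal, hypothetical illustration of that idea using a Gaussian-process surrogate via scikit-optimize's gp_minimize; the hyperparameter names, ranges, and the stand-in objective are assumptions made for illustration and are not taken from the paper.

```python
# Minimal sketch of Bayesian hyperparameter optimization for SNN input encoding.
# The search space and objective below are illustrative assumptions only.
from skopt import gp_minimize
from skopt.space import Integer, Real

# Hypothetical encoding hyperparameters: number of spike bins per input feature
# and the time window (in ms) over which each feature is encoded.
search_space = [
    Integer(2, 32, name="spike_bins"),
    Real(10.0, 200.0, name="encoding_window_ms"),
]

def objective(params):
    """Stand-in for training/evaluating the SNN at one hyperparameter setting.

    Returns a loss to minimize; here we simply penalize distance from an
    arbitrary 'good' configuration so the example runs end to end.
    """
    spike_bins, window_ms = params
    return (spike_bins - 16) ** 2 / 256.0 + (window_ms - 100.0) ** 2 / 1e4

# Gaussian-process surrogate plus an acquisition function chooses each next
# evaluation point, so far fewer objective calls are needed than a full grid.
result = gp_minimize(objective, search_space, n_calls=25, random_state=0)
print("best hyperparameters:", result.x, "best loss:", result.fun)
```

For comparison, an exhaustive grid over these two ranges at comparable resolution would require hundreds of SNN training runs, whereas the surrogate-guided search above budgets only 25 evaluations, which is the kind of saving the abstract attributes to the Bayesian approach.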

Research Organization:
Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
Sponsoring Organization:
USDOE; USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR) (SC-21)
DOE Contract Number:
AC05-00OR22725
OSTI ID:
1606790
Country of Publication:
United States
Language:
English


Similar Records

Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
Conference · June 2021 · OSTI ID: 1814329

A Software Framework for Comparing Training Approaches for Spiking Neuromorphic Systems
Conference · July 2021 · OSTI ID: 1823363

Related Subjects