Multi-Objective Hyperparameter Optimization for Spiking Neural Network Neuroevolution
- ORNL
- George Mason University, Virginia
Neuroevolution has had significant success over recent years, but there has been relatively little work applying neuroevolution approaches to spiking neural networks (SNNs). SNNs are a type of neural network that includes a temporal processing component, are not easily trained using other methods, and can be deployed on energy-efficient neuromorphic hardware. In this work, we investigate two evolutionary approaches for training SNNs. We explore the impact of the hyperparameters of the evolutionary approaches, including tournament size, population size, and representation type, on the performance of the algorithms. We present a multi-objective Bayesian-based hyperparameter optimization approach that tunes the hyperparameters to produce the most accurate and smallest SNNs. We show that the hyperparameters can significantly affect the performance of these algorithms. We also perform a sensitivity analysis and demonstrate that every hyperparameter value has the potential to perform well, provided the other hyperparameter values are set appropriately.
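The abstract describes tuning evolutionary hyperparameters (e.g., tournament size, population size) under two competing objectives: maximize SNN accuracy and minimize SNN size. The core machinery of any such multi-objective search is Pareto dominance over (accuracy, size) pairs. The sketch below is not the paper's Bayesian optimizer; it is a minimal illustration of Pareto filtering over hyperparameter candidates, using a hypothetical `evaluate` surrogate in place of actual SNN training.

```python
import random

def dominates(a, b):
    """True if candidate a Pareto-dominates b.

    Each candidate is an (accuracy, size) pair: accuracy is
    maximized, network size is minimized.
    """
    no_worse = a[0] >= b[0] and a[1] <= b[1]
    strictly_better = a[0] > b[0] or a[1] < b[1]
    return no_worse and strictly_better

def pareto_front(points):
    """Return the non-dominated subset of (accuracy, size) points."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q != p)]

def evaluate(tournament_size, population_size):
    """Hypothetical surrogate objective; a real run would train an
    SNN with these hyperparameters and report (accuracy, size)."""
    rng = random.Random((tournament_size, population_size).__hash__())
    accuracy = min(1.0, 0.5 + 0.01 * population_size * rng.random())
    size = population_size * (1 + rng.random())  # proxy for network size
    return (accuracy, size)

if __name__ == "__main__":
    # Sample a few hyperparameter settings and keep the Pareto set.
    candidates = [(t, p) for t in (2, 4, 8) for p in (10, 50, 100)]
    scored = [evaluate(t, p) for t, p in candidates]
    print(pareto_front(scored))
```

In the paper's setting, the candidate-proposal step would come from a multi-objective Bayesian optimizer rather than a fixed grid, but the dominance filter that selects the accurate-and-small trade-off set is the same idea.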
- Research Organization:
- Oak Ridge National Laboratory (ORNL), Oak Ridge, TN (United States)
- Sponsoring Organization:
- USDOE Office of Science (SC)
- DOE Contract Number:
- AC05-00OR22725
- OSTI ID:
- 1814329
- Resource Relation:
- Conference: IEEE Congress on Evolutionary Computation (CEC) - Virtual, Poland - 6/28/2021-7/1/2021
- Country of Publication:
- United States
- Language:
- English
Similar Records
Bayesian-based Hyperparameter Optimization for Spiking Neuromorphic Systems
Multi-Objective Optimization for Size and Resilience of Spiking Neural Networks