U.S. Department of Energy
Office of Scientific and Technical Information

Skip-Connected Self-Recurrent Spiking Neural Networks with Joint Intrinsic Parameter and Synaptic Weight Training

Journal Article · Neural Computation
DOI:https://doi.org/10.1162/neco_a_01393· OSTI ID:1765527
Affiliations:
  1. Univ. of California, Santa Barbara, CA (United States)
  2. Univ. of California, Santa Barbara, CA (United States)
As an important class of spiking neural networks (SNNs), recurrent spiking neural networks (RSNNs) possess great computational power and have been widely used for processing sequential data such as audio and text. However, most RSNNs suffer from two problems. First, due to a lack of architectural guidance, random recurrent connectivity is often adopted, which does not guarantee good performance. Second, training RSNNs is in general challenging, which bottlenecks achievable model accuracy. To address these problems, we propose a new type of RSNN called skip-connected self-recurrent SNNs (ScSr-SNNs). Recurrence in ScSr-SNNs is introduced in a stereotyped manner by adding self-recurrent connections to spiking neurons. Networks with self-recurrent connections can realize recurrent behaviors similar to those of more complex RSNNs, while the error gradients are more straightforward to compute because the network remains mostly feedforward. The network dynamics are further enriched by skip connections between nonadjacent layers. Moreover, we propose a new backpropagation (BP) method, called backpropagated intrinsic plasticity (BIP), that further boosts the performance of ScSr-SNNs by training intrinsic model parameters. Unlike standard intrinsic plasticity rules, which adjust a neuron's intrinsic parameters according to neuronal activity, BIP optimizes intrinsic parameters based on the backpropagated error gradient of a well-defined global loss function, in addition to synaptic weight training. On challenging speech, neuromorphic speech, and neuromorphic image datasets, ScSr-SNNs improve performance by up to 2.85% over other types of RSNNs trained by state-of-the-art BP methods.
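To make the architectural idea concrete, the following is a minimal NumPy sketch of one layer of self-recurrent leaky integrate-and-fire (LIF) neurons, where each neuron feeds its own previous spike back to itself through a scalar weight. This is an illustrative sketch under common LIF conventions, not the authors' implementation; the function name, parameter names, and the choice of per-neuron membrane decay `tau` as the trainable intrinsic parameter are assumptions for illustration.

```python
import numpy as np

def lif_self_recurrent(inputs, W_in, w_self, tau, v_th=1.0):
    """Simulate one layer of self-recurrent LIF neurons over T time steps.

    inputs: (T, n_in) input spike trains
    W_in:   (n_out, n_in) feedforward synaptic weights
    w_self: (n_out,) self-recurrent weight per neuron (diagonal recurrence
            only, so the layer stays mostly feedforward)
    tau:    (n_out,) per-neuron membrane time constant -- an intrinsic
            parameter of the kind BIP would train alongside the weights
    """
    T = inputs.shape[0]
    n_out = W_in.shape[0]
    v = np.zeros(n_out)             # membrane potentials
    s = np.zeros(n_out)             # spikes from the previous time step
    out = np.zeros((T, n_out))
    decay = np.exp(-1.0 / tau)      # leak factor derived from tau
    for t in range(T):
        # feedforward drive plus each neuron's own previous spike fed back
        v = decay * v + W_in @ inputs[t] + w_self * s
        s = (v >= v_th).astype(float)   # fire when threshold is crossed
        v = v * (1.0 - s)               # reset the neurons that fired
        out[t] = s
    return out
```

Because the recurrence is purely diagonal (each neuron connects only to itself), unrolling this loop in time yields gradients that are much simpler than those of a fully recurrent layer, which is the property the abstract highlights.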
Research Organization:
Univ. of California, Santa Barbara, CA (United States)
Sponsoring Organization:
USDOE Office of Science (SC), Advanced Scientific Computing Research (ASCR)
Grant/Contract Number:
SC0021319
OSTI ID:
1765527
Journal Information:
Neural Computation, Vol. 33, Issue 7; ISSN 0899-7667
Publisher:
MIT Press
Country of Publication:
United States
Language:
English

Similar Records

Rethinking skip connections in Spiking Neural Networks with Time-To-First-Spike coding
Journal Article · 2024 · Frontiers in Neuroscience (Online) · OSTI ID:2407016

Design of a Robust Memristive Spiking Neuromorphic System with Unsupervised Learning in Hardware
Journal Article · 2021 · ACM Journal on Emerging Technologies in Computing Systems · OSTI ID:1814330