OSTI.GOV, U.S. Department of Energy, Office of Scientific and Technical Information

Title: Dictionary Learning with Accumulator Neurons

Conference

The Locally Competitive Algorithm (LCA) uses local competition between non-spiking leaky-integrator neurons to infer sparse representations, allowing for potentially real-time execution on massively parallel neuromorphic architectures such as Intel's Loihi processor. Here, we focus on the problem of inferring sparse representations from streaming video using dictionaries of spatiotemporal features optimized in an unsupervised manner for sparse reconstruction. Non-spiking LCA has previously been used to achieve unsupervised learning of spatiotemporal dictionaries composed of convolutional kernels from raw, unlabeled video. We demonstrate how unsupervised dictionary learning with spiking LCA (S-LCA) can be efficiently implemented using accumulator neurons, which combine a conventional leaky integrate-and-fire (LIF) spike generator with an additional state variable that minimizes the difference between the integrated input and the spiking output. We demonstrate dictionary learning across a wide range of dynamical regimes, from graded to intermittent spiking, for inferring sparse representations of both static images drawn from the CIFAR database and video frames captured by a DVS camera. On a classification task that requires identifying the suit of cards in a deck being rapidly flipped through, as viewed by a DVS camera, we find essentially no degradation in performance as the LCA model used to infer sparse spatiotemporal representations migrates from graded to spiking. We conclude that accumulator neurons are likely to provide a powerful enabling component of future neuromorphic hardware for implementing online unsupervised learning of spatiotemporal dictionaries optimized for sparse reconstruction of streaming video from event-based DVS cameras.
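To make the two ingredients of the abstract concrete, the following is a minimal, hedged sketch: a standard (non-spiking) LCA inference loop driven by soft-threshold competition, and an accumulator-style spike generator in which a state variable tracks the running difference between integrated input and emitted spike mass. All function names, parameter values, and the exact accumulator update rule are illustrative assumptions, not the paper's implementation; the paper's accumulator neuron may differ in its leak and reset details.

```python
import numpy as np

def soft_threshold(u, lam):
    """Activation a = T_lambda(u): zero below threshold, shrunk above it."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca_infer(x, Phi, lam=0.1, dt=0.01, tau=0.1, n_steps=500):
    """Standard LCA dynamics (Rozell et al. 2008 form): infer a sparse
    code a such that x is approximately Phi @ a, using leaky-integrator
    potentials u with lateral inhibition between overlapping features."""
    b = Phi.T @ x                             # feed-forward drive
    G = Phi.T @ Phi - np.eye(Phi.shape[1])    # inhibition, self-term removed
    u = np.zeros(Phi.shape[1])                # membrane potentials
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += (dt / tau) * (b - u - G @ a)     # leaky integration + competition
    return soft_threshold(u, lam)

class AccumulatorNeuron:
    """Hypothetical accumulator spike generator (sigma-delta-like sketch):
    the state variable `acc` integrates the input drive and is debited by
    each emitted spike, so the long-run spike rate tracks the graded input."""
    def __init__(self, threshold=1.0, leak=0.0):
        self.threshold = threshold
        self.leak = leak
        self.acc = 0.0   # integrated input minus emitted spike mass

    def step(self, drive, dt=1.0):
        self.acc += drive * dt - self.leak * self.acc * dt
        if self.acc >= self.threshold:
            self.acc -= self.threshold   # subtract the emitted spike's mass
            return 1
        return 0
```

With a leak of zero, a constant drive of 0.25 yields one spike every four steps, so the spike count over a window recovers the graded drive; in a full S-LCA, such units would replace the graded activations inside the inference loop.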

Research Organization:
Pacific Northwest National Laboratory (PNNL), Richland, WA (United States)
Sponsoring Organization:
USDOE
DOE Contract Number:
AC05-76RL01830
OSTI ID:
1893845
Report Number(s):
PNNL-SA-175350
Resource Relation:
Conference: Proceedings of the International Conference on Neuromorphic Systems (ICONS 2022), July 27-29, 2022, Knoxville, TN
Country of Publication:
United States
Language:
English

