U.S. Department of Energy
Office of Scientific and Technical Information

Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware

Journal Article · Neuromorphic Computing and Engineering

Abstract

In-memory computing with emerging non-volatile memory devices (eNVMs) has shown promising results in accelerating matrix-vector multiplications. However, activation function calculations are still implemented with general-purpose processors or large, complex neuron peripheral circuits. Here, we present the integration of Ag-based conductive bridge random access memory (Ag-CBRAM) crossbar arrays with Mott rectified linear unit (ReLU) activation neurons for scalable, energy- and area-efficient hardware (HW) implementation of deep neural networks. We develop Ag-CBRAM devices that achieve a high ON/OFF ratio and multi-level programmability. Compact and energy-efficient Mott ReLU neuron devices implementing the ReLU activation function are directly connected to the columns of the Ag-CBRAM crossbars to compute the output from the weighted-sum current. We implement convolution filters and activations for VGG-16 using our integrated HW and demonstrate the successful generation of feature maps for CIFAR-10 images in HW. Our approach paves a new way toward building a highly compact and energy-efficient eNVMs-based in-memory computing system.
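The computation the abstract describes can be sketched numerically: a crossbar stores weights as device conductances, input voltages on the rows produce column currents equal to a matrix-vector product (Ohm's and Kirchhoff's laws), and a Mott neuron at each column rectifies the summed current. The following is a minimal illustrative model, not the paper's implementation; the function names, the ideal threshold-linear neuron, and the signed-conductance simplification are assumptions for clarity.

```python
import numpy as np

def crossbar_mvm(G, V):
    """Column currents of a crossbar: I[j] = sum_i G[i, j] * V[i]."""
    return G.T @ V

def mott_relu(I, threshold=0.0):
    """Idealized Mott ReLU neuron: zero below threshold, linear above."""
    return np.maximum(I - threshold, 0.0)

rng = np.random.default_rng(0)

# 4 input rows x 3 output columns of programmed conductances
# (signed values used here for simplicity; real devices have positive
# conductances, so negative weights are usually realized with
# differential column pairs).
G = rng.uniform(-1.0, 1.0, size=(4, 3))
V = rng.uniform(0.0, 1.0, size=4)   # input voltages on the rows

currents = crossbar_mvm(G, V)       # analog weighted sums, one per column
activations = mott_relu(currents)   # ReLU applied at each column output
print(activations)
```

The point of co-locating the neuron with the column is that the weighted sum never leaves the analog domain before the nonlinearity, avoiding the ADC and digital ReLU stage that a general-purpose processor would need.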

Sponsoring Organization:
USDOE
Grant/Contract Number:
SC0019273
OSTI ID:
1997290
Alternate ID(s):
OSTI ID: 1994436
Journal Information:
Neuromorphic Computing and Engineering, Vol. 3, Issue 3; ISSN 2634-4386
Publisher:
IOP Publishing
Country of Publication:
United Kingdom
Language:
English

