Analysis and mitigation of parasitic resistance effects for analog in-memory neural network acceleration
- Author Affiliations:
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Univ. of Texas, Austin, TX (United States)
To support the increasing demands for efficient deep neural network processing, accelerators based on analog in-memory computation of matrix multiplication have recently gained significant attention for reducing the energy consumption of neural network inference. However, analog processing within memory arrays must contend with parasitic voltage drops across the metal interconnects, which distort the results of the computation and limit the array size. This work analyzes how parasitic resistance affects the end-to-end inference accuracy of state-of-the-art convolutional neural networks, and comprehensively studies how design decisions at the device, circuit, architecture, and algorithm levels affect the system's sensitivity to parasitic resistance effects. A set of guidelines is provided for designing analog accelerator hardware that is intrinsically robust to parasitic resistance, without any explicit compensation or re-training of the network parameters.
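To make the parasitic-resistance effect concrete, the sketch below (not the paper's simulator; all parameter values and the `crossbar_mvm_with_parasitics` function are illustrative assumptions) models an M x N resistive crossbar in which inputs drive the row wires, columns are sensed at virtual ground, and every wire segment carries an assumed uniform resistance `R_wire`. The node network is solved exactly via Kirchhoff's current law, so the distortion of the matrix-vector product can be compared against the ideal result.

```python
# Minimal sketch of IR-drop distortion in an analog crossbar MVM.
# Assumptions (not from the paper): left-end row drivers, bottom-end
# column virtual grounds, uniform per-segment wire resistance.
import numpy as np

def crossbar_mvm_with_parasitics(G, V_in, R_wire=1.0):
    """Solve an M x N resistive crossbar with parasitic wire resistance.

    G      : (M, N) device conductances in siemens
    V_in   : (M,)   input voltages driven at the left end of each row wire
    R_wire : parasitic resistance of each wire segment, in ohms

    Returns (I_parasitic, I_ideal): column output currents with and
    without interconnect voltage drops.
    """
    M, N = G.shape
    g_w = 1.0 / R_wire
    n = M * N                       # nodes per wire layer (rows, columns)
    A = np.zeros((2 * n, 2 * n))    # nodal conductance matrix
    b = np.zeros(2 * n)             # source current injections

    r = lambda i, j: i * N + j      # row-wire node index
    c = lambda i, j: n + i * N + j  # column-wire node index

    for i in range(M):
        for j in range(N):
            ir, ic = r(i, j), c(i, j)
            # Memory device couples the row wire to the column wire.
            A[ir, ir] += G[i, j]; A[ic, ic] += G[i, j]
            A[ir, ic] -= G[i, j]; A[ic, ir] -= G[i, j]
            # Row-wire segments; the left end connects to the driver V_in[i].
            if j > 0:
                A[ir, ir] += g_w; A[ir, r(i, j - 1)] -= g_w
                A[r(i, j - 1), r(i, j - 1)] += g_w; A[r(i, j - 1), ir] -= g_w
            else:
                A[ir, ir] += g_w; b[ir] += g_w * V_in[i]
            # Column-wire segments; the bottom end sits at virtual ground.
            if i > 0:
                A[ic, ic] += g_w; A[ic, c(i - 1, j)] -= g_w
                A[c(i - 1, j), c(i - 1, j)] += g_w; A[c(i - 1, j), ic] -= g_w
            if i == M - 1:
                A[ic, ic] += g_w  # last segment into the 0 V sense node

    V_node = np.linalg.solve(A, b)
    I_parasitic = g_w * V_node[[c(M - 1, j) for j in range(N)]]
    I_ideal = V_in @ G              # zero-wire-resistance result
    return I_parasitic, I_ideal

# Example with assumed device and drive parameters: the output error
# grows with array size, wire resistance, and device conductance.
rng = np.random.default_rng(0)
G = rng.uniform(1e-6, 50e-6, size=(32, 32))   # 1-50 uS devices (assumed)
V = rng.uniform(0.0, 0.5, size=32)            # sub-0.5 V read inputs (assumed)
I_p, I_i = crossbar_mvm_with_parasitics(G, V, R_wire=2.5)
print("worst-case relative column error:",
      np.max(np.abs(I_p - I_i) / np.abs(I_i)))
```

A direct dense solve is used here for clarity; at realistic array sizes a sparse or iterative solver would be the practical choice, but the circuit model is the same.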
- Research Organization:
- Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
- Sponsoring Organization:
- USDOE National Nuclear Security Administration (NNSA)
- Grant/Contract Number:
- NA0003525
- OSTI ID:
- 1828786
- Report Number(s):
- SAND-2021-12073J; 700487
- Journal Information:
- Semiconductor Science and Technology, Vol. 36, Issue 11; ISSN 0268-1242
- Publisher:
- IOP Publishing
- Country of Publication:
- United States
- Language:
- English