CrossSim Inference Manual v2.0
- Sandia National Laboratories (SNL), Albuquerque, NM, and Livermore, CA (United States)
Neural networks are largely built on matrix computations. During forward inference, the most heavily used compute kernel is the matrix-vector multiplication (MVM): $$W \vec{x},$$ where $W$ is a layer's weight matrix and $\vec{x}$ is the input activation vector. Inference is a natural first frontier for deploying next-generation hardware for neural network applications, since it is more readily targeted at edge devices, such as mobile devices or embedded processors with size, weight, and power constraints. Inference is also easier to implement in analog systems than training, which places more stringent requirements on device properties.
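As a minimal sketch of this kernel, the snippet below computes one fully connected layer's forward pass with NumPy: an MVM followed by a nonlinearity. The weight matrix `W`, input `x`, and the ReLU activation are illustrative placeholders, not CrossSim data structures or API calls.

```python
import numpy as np

# Illustrative fully connected layer: forward inference reduces to
# a matrix-vector multiplication (MVM) plus a pointwise nonlinearity.
rng = np.random.default_rng(0)
W = rng.standard_normal((4, 8))   # weight matrix: 4 outputs, 8 inputs
x = rng.standard_normal(8)        # input activation vector

y = W @ x                         # the MVM: the dominant compute kernel
activation = np.maximum(y, 0.0)   # e.g., a ReLU applied to the MVM result

print(activation.shape)           # one output per row of W
```

In an analog accelerator, the `W @ x` step is the operation mapped onto a crossbar array; the surrounding arithmetic stays digital.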
- Research Organization: Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sandia National Lab. (SNL-CA), Livermore, CA (United States)
- Sponsoring Organization: USDOE National Nuclear Security Administration (NNSA)
- DOE Contract Number: NA0003525
- OSTI ID: 1869509
- Report Number(s): SAND2022-7074R; 706724
- Country of Publication: United States
- Language: English
Similar Records
- Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware. Journal Article, 2023, Neuromorphic Computing and Engineering. OSTI ID: 1994436
- Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware. Journal Article, 2023, Neuromorphic Computing and Engineering. OSTI ID: 1997290
- Performance Characteristics of the BlueField-2 SmartNIC. Technical Report, 2021. OSTI ID: 1783736