OSTI.GOV · U.S. Department of Energy
Office of Scientific and Technical Information

Title: CrossSim Inference Manual v2.0

Technical Report · DOI: https://doi.org/10.2172/1869509 · OSTI ID: 1869509

Neural networks are built largely on matrix computations. During forward inference, the most heavily used compute kernel is the matrix-vector multiplication (MVM), $W \vec{x}$, where $W$ is a layer's weight matrix and $\vec{x}$ is its input activation vector. Inference is a first frontier for deploying next-generation hardware for neural network applications, since it is more readily deployed in edge devices, such as mobile devices or embedded processors with size, weight, and power constraints. Inference is also easier to implement in analog systems than training, which places more stringent requirements on device behavior.
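For illustration only (this is not CrossSim's API), the sketch below shows the MVM kernel described above for a single fully connected layer in plain NumPy; the layer sizes and the ReLU activation are assumptions chosen for the example.

```python
import numpy as np

# Minimal sketch of the forward-inference compute kernel: the
# matrix-vector multiplication y = W @ x for one fully connected layer.
# Shapes and the ReLU nonlinearity are illustrative assumptions.
rng = np.random.default_rng(0)

W = rng.standard_normal((128, 784))   # layer weight matrix (outputs x inputs)
x = rng.standard_normal(784)          # input activation vector

y = W @ x                             # matrix-vector multiplication (MVM)
activations = np.maximum(y, 0.0)      # nonlinearity applied after the MVM

print(activations.shape)              # (128,)
```

An analog accelerator (as modeled by CrossSim) would perform the `W @ x` step in the crossbar array, while the surrounding digital logic handles the activation function and data movement.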

Research Organization:
Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Sandia National Lab. (SNL-CA), Livermore, CA (United States)
Sponsoring Organization:
USDOE National Nuclear Security Administration (NNSA)
DOE Contract Number:
NA0003525
OSTI ID:
1869509
Report Number(s):
SAND2022-7074R; 706724
Country of Publication:
United States
Language:
English

Similar Records

Integration of Ag-CBRAM crossbars and Mott ReLU neurons for efficient implementation of deep neural networks in hardware
Journal Article · August 29, 2023 · Neuromorphic Computing and Engineering

Performance Characteristics of the BlueField-2 SmartNIC
Technical Report · May 19, 2021

FPGAs as a Service to Accelerate Machine Learning Inference [PowerPoint]
Technical Report · March 20, 2019

Related Subjects