Introduction: Neuromorphic Materials
Journal Article · Chemical Reviews
- Sandia National Lab. (SNL-CA), Livermore, CA (United States)
- Massachusetts Inst. of Technology (MIT), Cambridge, MA (United States)
The explosive growth in data collection and the need to process it efficiently, together with the desire to automate increasingly complex tasks in transportation, medical care, manufacturing, security, and many other fields, have motivated growing interest in neuromorphic computing. Unlike digital computing, which relies on binary, transistor-based ON/OFF logic gates and separate logic and memory functionalities, neuromorphic computing is inspired by animal brains, which use interconnected synapses and neurons to process, store, and transmit information at the same location while consuming only ~20 W of power or less. Motivated by the brain’s efficiency, adaptability, self-learning, and resiliency, neuromorphic computing can be broadly defined as an approach to processing and storing information using hardware and algorithms inspired by models of biological neural systems. Present research in neuromorphic computing encompasses approaches that vary significantly in their degree of neuro-inspiration, from systems that only incorporate features such as asynchronous, event-driven operation or use crossbar arrays of non-volatile memory (NVM) elements to accelerate deep neural networks (DNNs), to designs that embrace the extreme parallelism, sparsity, reconfigurability, adaptability, complexity, and stochasticity observed in nervous systems. The term ‘neuromorphic’ computing is often credited to Carver Mead, who in the 1980s investigated Si-based analog electronics to replicate functions of the animal retina. Earlier important advances in this field include the work of Frank Rosenblatt, who proposed the concept of the perceptron, and Bernard Widrow, who used this concept to build one of the first analog neural networks, the Adaline, among many other researchers (see ref. 6 for a historical perspective on neuromorphic computing).
With the recent increase in the use of artificial intelligence (AI) and large language models, and rising concerns over the associated energy costs, interest in neuromorphic hardware has expanded rapidly. According to some estimates, driven largely by the drastic growth in the training and use of AI models on current computing architectures, the energy cost of computing is projected to reach the entire worldwide energy supply by 2045. While this is not a realistic outcome, it implies that, unless more efficient computing technologies are developed soon, demand for energy and market constraints will limit the continued expansion of societal access to AI and cloud services from data centers. Data centers used to train and run these models already consume hundreds of terawatt-hours of electricity, exceeding 4% of US electricity demand.
- Research Organization:
- Northwestern Univ., Evanston, IL (United States); Sandia National Laboratories (SNL-CA), Livermore, CA (United States)
- Sponsoring Organization:
- USDOE Laboratory Directed Research and Development (LDRD) Program; USDOE National Nuclear Security Administration (NNSA); USDOE Office of Science (SC), Basic Energy Sciences (BES)
- Grant/Contract Number:
- NA0003525; SC0023450
- OSTI ID:
- 2585587
- Alternate ID(s):
- OSTI ID: 3015273
- Report Number(s):
- SAND--2025-07843J; 1777847
- Journal Information:
- Chemical Reviews, Vol. 125, Issue 10; ISSN 1520-6890; ISSN 0009-2665
- Publisher:
- American Chemical Society (ACS)
- Country of Publication:
- United States
- Language:
- English
Similar Records
Probabilistic Neural Computing with Stochastic Devices
Journal Article · November 2022 · Advanced Materials · OSTI ID: 1898714