Optimal neural computations require analog processors
This paper discusses some of the limitations of hardware implementations of neural networks. The authors begin by presenting neural structures and their biological inspirations, noting the simplifications that lead to artificial neural networks. The focus then shifts to hardware-imposed constraints. Recent results are presented for three different alternatives for parallel implementations of neural networks: digital circuits, threshold gate circuits, and analog circuits. The area and the delay are related to the neurons' fan-in and to the precision of their synaptic weights. The main conclusion is that hardware-efficient solutions require analog computation, and two alternatives are suggested: (i) cope with the limitations imposed by silicon by speeding up the computation of the elementary silicon neurons; (ii) investigate solutions that would allow the use of the third dimension (e.g. using optical interconnections).
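The abstract relates a neuron's silicon area and delay to its fan-in and to the precision of its synaptic weights. A minimal sketch of that dependence (not from the paper; the function `quantized_neuron` and its parameters are hypothetical) models a threshold-gate neuron whose weights are stored with a limited number of bits, so that fewer bits mean less area per synapse at the cost of quantization error:

```python
import numpy as np

def quantized_neuron(inputs, weights, bits, threshold=0.0):
    """Threshold-gate neuron with weights quantized to `bits` bits.

    inputs  -- binary input vector; its length is the neuron's fan-in
    weights -- real-valued synaptic weights in [-1, 1]
    bits    -- weight precision; fewer bits mean smaller silicon area
               per synapse but a coarser approximation of the weights
    """
    levels = 2 ** (bits - 1)  # quantization levels per sign
    q = np.round(np.asarray(weights) * levels) / levels
    return int(np.dot(inputs, q) >= threshold)  # hard threshold activation

# A fan-in-4 neuron with 4-bit weights fires on this input pattern.
x = np.array([1, 0, 1, 1])
w = [0.30, -0.70, 0.55, 0.20]
print(quantized_neuron(x, w, bits=4))
```

Sweeping `bits` downward in such a model shows how aggressively weights can be quantized before the neuron's decisions change, which is the kind of area/precision trade-off the paper analyzes for digital, threshold-gate, and analog realizations.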
- Research Organization:
- Los Alamos National Lab., Div. of Space and Atmospheric Sciences, NM (United States)
- Sponsoring Organization:
- USDOE Assistant Secretary for Human Resources and Administration, Washington, DC (United States)
- DOE Contract Number:
- W-7405-ENG-36
- OSTI ID:
- 334348
- Report Number(s):
- LA-UR-98-3325; CONF-9809118-; ON: DE99002277; TRN: AHC29914%%152
- Resource Relation:
- Conference: International conference on parallel computing in electrical engineering, Bialystok (Poland), 2-5 Sep 1998; Other Information: PBD: [1998]
- Country of Publication:
- United States
- Language:
- English
Similar Records
Modular Spiking Neural Circuits for Mapping Long Short-Term Memory on a Neurosynaptic Processor
Biomimetic, Soft-Material Synapse for Neuromorphic Computing: from Device to Network