OSTI.GOV · U.S. Department of Energy
Office of Scientific and Technical Information

Title: A high-storage capacity content-addressable memory and its learning algorithm

Journal Article · IEEE Transactions on Circuits and Systems (USA)
DOI: https://doi.org/10.1109/31.31325 · OSTI ID: 5172980
Verleysen, M.; Sirletti, B.; Vandemeulebroecke, A.; Jespers, P. G. A. [1]
  1. Universite Catholique de Louvain, Laboratoire de microelectronique, 1348 Louvain-la-Neuve (BE)

Hopfield neural networks show retrieval and speed capabilities that make them good candidates for content-addressable memories (CAMs) in problems such as pattern recognition and optimization. This paper presents a new implementation of a fully interconnected VLSI neural network with only two binary memory points per synapse (the connection weights are restricted to three values: +1, 0, and -1). The small area of a single synaptic cell (about 10⁴ µm²) allows the implementation of neural networks with more than 500 neurons. Because of the poor storage capability of Hebb's learning rule, especially in VLSI neural networks where the range of the synapse weights is limited by the number of memory points contained in each connection, a new algorithm is proposed for programming a Hopfield neural network as a high-storage-capacity CAM. The results of the VLSI circuit programmed with this new algorithm are promising for pattern recognition applications.
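For context on the baseline the abstract criticizes, the sketch below shows, in NumPy, a Hopfield network used as a CAM with Hebbian outer-product weights clipped to the three values -1, 0, +1 (mirroring the two-binary-memory-points-per-synapse constraint). This is only an illustrative sketch of the standard Hebbian baseline, not the paper's circuit or its proposed high-capacity learning algorithm; all names and sizes are assumptions chosen for the example.

```python
import numpy as np

def hebb_ternary_weights(patterns):
    """Hebbian outer-product learning with weights clipped to {-1, 0, +1}.

    patterns: array of shape (P, N) with entries in {-1, +1}.
    Clipping models a synapse stored in a small number of binary
    memory points, as in the VLSI constraint described in the abstract.
    """
    P, N = patterns.shape
    W = np.zeros((N, N))
    for p in patterns:
        W += np.outer(p, p)        # accumulate Hebbian correlations
    np.fill_diagonal(W, 0)         # no self-connections
    return np.sign(W)              # restrict each weight to -1, 0 or +1

def recall(W, probe, steps=20):
    """Synchronous retrieval from a noisy probe (entries in {-1, +1})."""
    s = probe.copy()
    for _ in range(steps):
        s_new = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(s_new, s):   # reached a fixed point
            break
        s = s_new
    return s

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    N, P = 64, 4                       # illustrative sizes, not the 500-neuron chip
    patterns = rng.choice([-1, 1], size=(P, N))
    W = hebb_ternary_weights(patterns)
    noisy = patterns[0].copy()
    flip = rng.choice(N, size=5, replace=False)
    noisy[flip] *= -1                  # corrupt 5 bits of the stored pattern
    out = recall(W, noisy)
    print("bits recovered:", int(np.sum(out == patterns[0])), "of", N)
```

With this plain Hebbian rule, the number of patterns that can be stored and reliably retrieved grows only slowly with N, and clipping the weights to three levels degrades capacity further, which is the motivation the abstract gives for proposing a different programming algorithm.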

OSTI ID:
5172980
Journal Information:
IEEE Transactions on Circuits and Systems (USA), Vol. 36, No. 5; ISSN 0098-4094
Country of Publication:
United States
Language:
English