Summary: Optimizing one-shot learning with binary synapses

Sandro Romani, Daniel J Amit, and Yali Amit

December 12, 2007
Abstract

A network of excitatory synapses trained with a conservative version of Hebbian learning is used as a model for recognizing the familiarity of thousands of once-seen stimuli and distinguishing them from stimuli never seen before. Such networks were initially proposed for modeling memory retrieval (selective delay activity). We show that the same framework allows for the incorporation of both familiarity recognition and memory retrieval, and estimate the network's capacity. In the case of binary neurons, we extend the analysis of [4] to obtain capacity limits based on computations of the signal-to-noise ratio of the field difference between selective and non-selective neurons for learned signals. We show that with fast learning - potentiation probability 1 - the most recently learned patterns can be retrieved in working memory (selective delay activity). A much higher number of once-seen learned patterns elicit a realistic...
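
To make the mechanism described in the abstract concrete, here is a minimal sketch (not the authors' code) of one-shot, conservative Hebbian learning on binary synapses with stochastic potentiation, together with the familiarity signal measured as the signal-to-noise ratio of the field difference between selective and non-selective neurons. All parameter names and values (N, f, q_plus, P) are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500        # number of binary neurons (illustrative)
f = 0.05       # coding level: fraction of neurons active in a stimulus
q_plus = 1.0   # potentiation probability ("fast learning" in the abstract)
P = 200        # number of once-seen stimuli

J = np.zeros((N, N), dtype=np.int8)                  # binary excitatory synapses
patterns = (rng.random((P, N)) < f).astype(np.int8)  # once-seen stimuli

# One-shot, conservative Hebbian learning: a synapse is potentiated (set to 1)
# with probability q_plus when pre- and post-synaptic neurons are both active,
# and is never depressed back to 0 by later stimuli.
for xi in patterns:
    co_active = np.outer(xi, xi).astype(bool)
    potentiate = co_active & (rng.random((N, N)) < q_plus)
    J[potentiate] = 1
np.fill_diagonal(J, 0)  # no self-connections

def field(J, xi):
    """Synaptic input ("field") onto each neuron when pattern xi is presented."""
    return J.astype(np.int32) @ xi.astype(np.int32)

# Familiarity signal for a learned pattern: mean field onto selective neurons
# (those active in the pattern) minus mean field onto non-selective neurons,
# normalised by the spread of the non-selective fields.
learned = patterns[0]
h = field(J, learned)
sel = learned.astype(bool)
snr = (h[sel].mean() - h[~sel].mean()) / h[~sel].std()

# A never-seen pattern of the same coding level gives a much smaller gap.
novel = (rng.random(N) < f).astype(np.int8)
h_new = field(J, novel)
new = novel.astype(bool)
snr_new = (h_new[new].mean() - h_new[~new].mean()) / h_new[~new].std()

print(f"SNR for a learned pattern: {snr:.2f}")
print(f"SNR for a novel pattern:   {snr_new:.2f}")
```

In this toy setting the learned pattern yields a clearly positive signal-to-noise ratio while the novel pattern's ratio stays near zero; as more patterns are stored the background potentiation grows and the ratio shrinks, which is the kind of trade-off behind the capacity estimates discussed in the abstract.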

  

Source: Amit, Yali - Departments of Computer Science & Statistics, University of Chicago

 

Collections: Computer Technologies and Information Sciences