An Analysis of Contrastive Divergence Learning in Gaussian Boltzmann Machines
 

Christopher K. I. Williams and Felix V. Agakov
Division of Informatics, University of Edinburgh, Edinburgh EH1 2QL, UK
c.k.i.williams@ed.ac.uk, felixa@dai.ed.ac.uk
http://anc.ed.ac.uk
May 17, 2002
Abstract
The Boltzmann machine (BM) learning rule for random field models with latent variables can be problematic to use in practice. These problems have (at least partially) been attributed to the negative phase in BM learning, where a Gibbs sampling chain should be run to equilibrium. Hinton (1999, 2000) has introduced an alternative called contrastive divergence (CD) learning, where the chain is run for only 1 step. In this paper we analyse the mean and variance of the parameter update obtained after i steps of Gibbs sampling for a simple Gaussian BM. For this model our analysis shows that CD learning produces (as expected) a biased estimate of the true parameter update. We also show that the variance does usually increase with i and quantify this behaviour.
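
To make the quantity under analysis concrete, the following is a minimal sketch of a CD-i parameter update for a toy Gaussian Boltzmann machine with one visible unit, one hidden unit, a single coupling weight w, and unit conditional variances. The parameterisation and all names (cd_update, gibbs sweep structure, etc.) are illustrative assumptions for this sketch, not the exact model or notation of the paper; the paper studies the mean and variance of this kind of update as i varies.

    import numpy as np

    # Illustrative sketch only: a toy Gaussian BM with one visible unit v and one
    # hidden unit h, coupled by a single weight w. The conditionals are assumed
    # Gaussian with mean w * (other unit) and unit variance; this is a hedged
    # simplification, not the precise model analysed in the paper.
    rng = np.random.default_rng(0)

    def cd_update(v_data, w, i, rng):
        """CD-i estimate of the log-likelihood gradient w.r.t. w:
        positive-phase correlation <v h> on the data minus the correlation
        after i full Gibbs sweeps started from the data."""
        h_data = w * v_data + rng.standard_normal(v_data.shape)  # h ~ p(h | v_data)
        v = v_data
        for _ in range(i):                                        # negative phase: i Gibbs sweeps
            h = w * v + rng.standard_normal(v.shape)              # h ~ p(h | v)
            v = w * h + rng.standard_normal(h.shape)              # v ~ p(v | h)
        h = w * v + rng.standard_normal(v.shape)                  # pair h with the final v
        return np.mean(v_data * h_data) - np.mean(v * h)

    # Example: compare CD-1 and CD-10 updates on synthetic data.
    v_data = rng.standard_normal(1000)
    for i in (1, 10):
        print(f"CD-{i} update: {cd_update(v_data, w=0.3, i=i, rng=rng):+.4f}")
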
Recently Hinton (1999, 2000) has introduced the contrastive divergence (CD) learning rule. This was introduced in the context of Products of Experts architectures, although

  

Source: Agakov, Felix - Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh

 

Collections: Computer Technologies and Information Sciences