An Auxiliary Variational Method
Felix V. Agakov, David Barber
 

School of Informatics
University of Edinburgh, EH1 2QL, UK
felixa@inf.ed.ac.uk, dbarber@anc.ed.ac.uk
http://anc.ed.ac.uk
Abstract
Variational methods have proved popular and effective for inference and learning in intractable
graphical models. An attractive feature of the approaches based on the Kullback-Leibler divergence
is a rigorous lower bound on the normalization constant in undirected models. In this work
we explore the idea of using auxiliary variables to improve on the lower bound of standard mean field
methods. Our approach forms a more powerful class of approximations than any structured mean
field technique. Furthermore, the existing lower bounds of variational mixture models can be
seen as computationally expensive special cases of our method. A byproduct of our work is an efficient
way to calculate a set of mixture coefficients for any set of tractable distributions that provably
improves on a flat combination.
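To make the bound in question concrete, the following is a brief sketch of the standard KL-based lower bound and its auxiliary-variable extension. The notation here (potential \Phi, variational distribution q, auxiliary conditional r) is illustrative and not taken verbatim from the paper.

```latex
% Undirected model with unnormalized potential \Phi(x):
%   p(x) = \Phi(x)/Z, \qquad Z = \sum_x \Phi(x).
%
% Standard variational (mean field) lower bound, from
% \mathrm{KL}(q \,\|\, p) \ge 0:
\log Z \;\ge\; \sum_x q(x) \log \frac{\Phi(x)}{q(x)}
       \;=\; \log Z - \mathrm{KL}\!\left(q(x)\,\|\,p(x)\right).
%
% Auxiliary-variable extension: introduce a variable a, a joint
% variational distribution q(x,a), and a conditional r(a|x).
% Augmenting the model as \Phi(x)\,r(a|x) leaves Z unchanged,
% since \sum_a r(a|x) = 1, and gives the looser-looking but
% jointly optimizable bound
\log Z \;\ge\; \sum_{x,a} q(x,a) \log \frac{\Phi(x)\,r(a|x)}{q(x,a)}.
```

The bound is tight when q(x,a) matches p(x) r(a|x); optimizing jointly over q(x,a) and r(a|x) within a tractable family can therefore improve on a mean field bound that uses a factorized q(x) alone.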
1 Introduction
Probabilistic graphical models provide a convenient framework for representing joint
probability distributions via local constraints, and facilitate computation of many quantities of interest
required for both inference and learning. Such a probabilistic treatment of uncertainty offers a consistent

  

Source: Agakov, Felix - Institute for Adaptive and Neural Computation, School of Informatics, University of Edinburgh

 

Collections: Computer Technologies and Information Sciences