OSTI.GOV: U.S. Department of Energy
Office of Scientific and Technical Information

Title: Unsupervised learning of binary vectors: A Gaussian scenario

Abstract

We study a model of unsupervised learning where the real-valued data vectors are isotropically distributed, except for a single symmetry-breaking binary direction B ∈ {-1,+1}^N, onto which the projections have a Gaussian distribution. We show that a candidate vector J undergoing Gibbs learning in this discrete space approaches the perfect match J=B exponentially. In addition to the second-order "retarded learning" phase transition for unbiased distributions, we show that first-order transitions can also occur. Extending the known result that the center of mass of the Gibbs ensemble has Bayes-optimal performance, we show that taking the sign of the components of this vector (clipping) leads to the vector with optimal performance in the binary space. These upper bounds are shown generally not to be saturated with the technique of transforming the components of a special continuous vector, except in asymptotic limits and in a special linear case. Simulations are presented which are in excellent agreement with the theoretical results. © 2000 The American Physical Society.
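
As a rough illustration of the data model described in the abstract, the following Python sketch (not part of the paper) draws isotropic Gaussian data vectors, shifts them so that their projection onto a hidden binary direction B has a nonzero Gaussian mean, and then clips a simple continuous estimate of B to a binary vector. The dimension N, sample size P, and bias are arbitrary illustrative choices, and the empirical-mean estimator is only a stand-in for the Gibbs center of mass, which is not simulated here.

import numpy as np

rng = np.random.default_rng(0)

N = 200       # dimension of the data vectors (illustrative choice)
P = 2000      # number of data vectors (illustrative choice)
bias = 0.8    # mean of the Gaussian projection along B (illustrative choice)

# Hidden symmetry-breaking binary direction B in {-1,+1}^N.
B = rng.choice([-1.0, 1.0], size=N)

# Isotropic Gaussian data, shifted so the projection lambda = B.xi/sqrt(N)
# has mean `bias` instead of mean 0.
xi = rng.standard_normal((P, N)) + bias * B / np.sqrt(N)

# Simple continuous estimate of the direction (empirical mean of the data);
# a stand-in here for the Gibbs center of mass studied in the paper.
J_cont = xi.mean(axis=0)

# Clipping: take the sign of each component to obtain a binary candidate vector.
J_clip = np.sign(J_cont)
J_clip[J_clip == 0] = 1.0  # break (measure-zero) ties

# Overlaps with the hidden direction B.
overlap_cont = J_cont @ B / (np.linalg.norm(J_cont) * np.sqrt(N))
overlap_clip = J_clip @ B / N
print(f"overlap of continuous estimate with B: {overlap_cont:.3f}")
print(f"overlap of clipped estimate with B:    {overlap_clip:.3f}")

The overlap J_clip·B/N printed at the end is the natural performance measure in the binary space; how such overlaps behave as a function of the amount of data is the subject of the paper.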

Authors:
 Copelli, Mauro [1]; Van den Broeck, Christian [2]
  1. Department of Chemistry and Biochemistry 0340, University of California San Diego, La Jolla, California 92093-0340 (United States)
  2. Limburgs Universitair Centrum, B-3590 Diepenbeek (Belgium)
Publication Date:
Jun 2000
OSTI Identifier:
20216773
Resource Type:
Journal Article
Journal Name:
Physical Review. E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics
Additional Journal Information:
Journal Volume: 61; Journal Issue: 6; Other Information: PBD: Jun 2000; Journal ID: ISSN 1063-651X
Country of Publication:
United States
Language:
English
Subject:
71 CLASSICAL AND QUANTUM MECHANICS, GENERAL PHYSICS; LEARNING; GAUSS FUNCTION; ISING MODEL; PHASE TRANSFORMATIONS; STATISTICAL MECHANICS; THEORETICAL DATA

Citation Formats

Copelli, Mauro, and Van den Broeck, Christian. Unsupervised learning of binary vectors: A Gaussian scenario. United States: N. p., 2000. Web. doi:10.1103/PhysRevE.61.6971.
Copelli, Mauro, & Van den Broeck, Christian. Unsupervised learning of binary vectors: A Gaussian scenario. United States. doi:10.1103/PhysRevE.61.6971.
Copelli, Mauro, and Van den Broeck, Christian. Jun 2000. "Unsupervised learning of binary vectors: A Gaussian scenario". United States. doi:10.1103/PhysRevE.61.6971.
@article{osti_20216773,
title = {Unsupervised learning of binary vectors: A Gaussian scenario},
author = {Copelli, Mauro and Van den Broeck, Christian},
abstractNote = {We study a model of unsupervised learning where the real-valued data vectors are isotropically distributed, except for a single symmetry-breaking binary direction $B \in \{-1,+1\}^N$, onto which the projections have a Gaussian distribution. We show that a candidate vector $J$ undergoing Gibbs learning in this discrete space approaches the perfect match $J=B$ exponentially. In addition to the second-order ``retarded learning'' phase transition for unbiased distributions, we show that first-order transitions can also occur. Extending the known result that the center of mass of the Gibbs ensemble has Bayes-optimal performance, we show that taking the sign of the components of this vector (clipping) leads to the vector with optimal performance in the binary space. These upper bounds are shown generally not to be saturated with the technique of transforming the components of a special continuous vector, except in asymptotic limits and in a special linear case. Simulations are presented which are in excellent agreement with the theoretical results. (c) 2000 The American Physical Society.},
doi = {10.1103/PhysRevE.61.6971},
journal = {Physical Review. E, Statistical Physics, Plasmas, Fluids, and Related Interdisciplinary Topics},
issn = {1063-651X},
number = {6},
volume = {61},
place = {United States},
year = {2000},
month = {6}
}