Summary: Learning Semantic Representations with Hidden Markov Topics Models
Mark Andrews (m.andrews@ucl.ac.uk)
Gabriella Vigliocco (g.vigliocco@ucl.ac.uk)
Cognitive, Perceptual and Brain Sciences
University College London,
26 Bedford Way
London, WC1H 0AP
United Kingdom
Abstract
In this paper, we describe a model that learns semantic representations from the distributional statistics of language. This model, however, goes beyond the common bag-of-words paradigm, and infers semantic representations by taking into account the inherent sequential nature of linguistic data. The model we describe, which we refer to as a Hidden Markov Topics model, is a natural extension of the current state of the art in Bayesian bag-of-words models, i.e. the Topics model of Griffiths, Steyvers, and Tenenbaum (2007), preserving
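
To illustrate how a sequential extension of this kind differs from a bag-of-words Topics model, the sketch below samples a document from one plausible generative process: each topic keeps its own word distribution, as in the original Topics model, but the topic assignment of each word follows a first-order Markov chain rather than being drawn independently. The parameter names (`beta`, `pi`, `A`) and the randomly generated parameters are illustrative assumptions, not the exact specification from the paper.

```python
import numpy as np

def sample_hmm_topics_doc(n_words, beta, pi, A, rng=None):
    """Sample one document from a Hidden-Markov-Topics-style generative process.

    beta : (K, V) array, per-topic word distributions (as in the Topics model)
    pi   : (K,) initial distribution over topics
    A    : (K, K) row-stochastic transition matrix; A[i, j] = P(z_t = j | z_{t-1} = i)
    """
    if rng is None:
        rng = np.random.default_rng()
    K, V = beta.shape
    words, topics = [], []
    z = rng.choice(K, p=pi)                       # topic of the first word
    for _ in range(n_words):
        words.append(rng.choice(V, p=beta[z]))    # emit a word from topic z
        topics.append(z)
        z = rng.choice(K, p=A[z])                 # next topic depends on the current one,
                                                  # unlike the bag-of-words Topics model
    return np.array(words), np.array(topics)

# Tiny illustrative run with random parameters.
rng = np.random.default_rng(0)
K, V = 3, 10
beta = rng.dirichlet(np.ones(V), size=K)          # topic-word distributions
pi = rng.dirichlet(np.ones(K))                    # initial topic probabilities
A = rng.dirichlet(np.ones(K), size=K)             # topic transition matrix
words, topics = sample_hmm_topics_doc(20, beta, pi, A, rng)
print(words, topics)
```

Setting every row of `A` equal to a single shared topic distribution would recover the exchangeable, bag-of-words behaviour of the original Topics model; the Markov coupling is what lets the sequential structure of the text inform the inferred representations.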

Source: Andrews, Mark W. - Department of Cognitive, Perceptual and Brain Sciences, University College London
Vigliocco, Gabriella - Department of Psychology, University College London

Collections: Biology and Medicine; Computer Technologies and Information Sciences