Summary: A measure of the information content of EIT data

Andy Adler¹, Richard Youmaran², William R.B. Lionheart³

¹ Systems and Computer Engineering, Carleton University, Ottawa, Canada
² School of Information Technology and Engineering, University of Ottawa, Canada
³ School of Mathematics, University of Manchester, U.K.

E-mail: adler@sce.carleton.ca
Abstract. We ask: how many bits of information (in the Shannon sense) do we
get from a set of EIT measurements? Here, the term information in measurements
(IM) is defined as the decrease in uncertainty about the contents of a medium due
to a set of measurements. This decrease in uncertainty is quantified by the change
from the inter-class model, q, defined by the prior information, to the intra-class
model, p, given by the measured data (corrupted by noise). IM is measured by the
expected relative entropy (Kullback-Leibler divergence) between distributions q and
p, and corresponds to the channel capacity in an analogous communications system.
Based on a Gaussian model of the measurement noise, n, and a prior model of the
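
To make the quantity being described concrete, the minimal sketch below computes the relative entropy D(p||q) in bits between two multivariate Gaussian distributions, standing in for the intra-class model p and the inter-class (prior) model q. This only illustrates the Kullback-Leibler divergence itself; the dimension, means, and covariances are placeholder values and do not reproduce the EIT measurement and prior models of the paper, which reports the expected relative entropy under its Gaussian noise and prior assumptions.

    import numpy as np

    def kl_gaussian_bits(mu_p, cov_p, mu_q, cov_q):
        """Relative entropy D(p || q), in bits, between multivariate Gaussians
        p = N(mu_p, cov_p) and q = N(mu_q, cov_q)."""
        k = mu_p.size
        cov_q_inv = np.linalg.inv(cov_q)
        diff = mu_q - mu_p
        # Standard closed form for the KL divergence between two Gaussians (in nats).
        nats = 0.5 * (np.trace(cov_q_inv @ cov_p)
                      + diff @ cov_q_inv @ diff
                      - k
                      + np.log(np.linalg.det(cov_q) / np.linalg.det(cov_p)))
        return nats / np.log(2)  # convert nats to bits

    # Toy illustration (placeholder numbers, not from the paper): a broad prior q
    # is narrowed to a tighter distribution p after a hypothetical set of measurements.
    k = 8                                              # illustrative number of image parameters
    mu_q, cov_q = np.zeros(k), np.eye(k)               # inter-class (prior) model
    mu_p, cov_p = 0.1 * np.ones(k), 0.05 * np.eye(k)   # intra-class (data) model
    print(f"information in measurements: {kl_gaussian_bits(mu_p, cov_p, mu_q, cov_q):.1f} bits")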

  

Source: Adler, Andy - Department of Systems and Computer Engineering, Carleton University

 

Collections: Computer Technologies and Information Sciences