Generalized Source Coding Theorems and Hypothesis Testing: Part I -- Information Measures
 

Po-Ning Chen
Dept. of Communications Engineering, National Chiao Tung University,
1001, Ta-Hsueh Road, Hsin Chu, Taiwan 30050, R.O.C.

Fady Alajaji
Dept. of Mathematics and Statistics, Queen's University,
Kingston, Ontario K7L 3N6, Canada
Key Words: entropy, mutual information, divergence, ε-capacity
Abstract
Expressions for the ε-entropy rate, ε-mutual information rate and ε-divergence rate are introduced.
These quantities, which consist of the quantiles of the asymptotic information spectra, generalize
the inf/sup-entropy/information/divergence rates of Han and Verdú. The algebraic properties
of these information measures are rigorously analyzed, and examples illustrating their use in the
computation of the ε-capacity are presented. In Part II of this work, these measures are employed
to prove general source coding theorems for block codes, as well as a general formula for the
type-II error exponent of Neyman-Pearson hypothesis testing subject to upper bounds on the
type-I error probability.
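
For orientation, the following is a minimal sketch of the quantile-style definition behind these rates, written in the information-spectrum notation of Han and Verdú; the symbol \bar{H}_\varepsilon and the particular inequality and limit conventions are assumptions here and may differ from the paper's own definitions.

% Normalized entropy density (entropy spectrum) of a source X = {X^n}:
\[
  h_n(X^n) \triangleq \frac{1}{n}\,\log\frac{1}{P_{X^n}(X^n)} .
\]
% Assumed form of an \varepsilon-entropy rate as a quantile of the
% asymptotic entropy spectrum:
\[
  \bar{H}_{\varepsilon}(X) \triangleq
  \inf\left\{ \theta \in \mathbb{R} :
    \limsup_{n\to\infty}\Pr\!\left[ h_n(X^n) > \theta \right]
    \le \varepsilon \right\},
  \qquad \varepsilon \in [0,1].
\]
% At \varepsilon = 0 this reduces to the sup-entropy rate \bar{H}(X) of
% Han and Verdu; the \varepsilon-mutual information and \varepsilon-divergence
% rates would arise analogously from the spectra of the information
% density and the log-likelihood ratio, respectively.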

I. Introduction and Motivation
Entropy, divergence and mutual information are without a doubt the most important information measures ...

  

Source: Alajaji, Fady - Department of Mathematics and Statistics, Queen's University (Kingston)

 

Collections: Engineering