Summary: IEEE TRANSACTIONS ON INFORMATION THEORY, VOL. 47, NO. 4, MAY 2001, p. 1553
Rényi's Divergence and Entropy Rates for Finite Alphabet Markov Sources
Ziad Rached, Student Member, IEEE, Fady Alajaji, Senior Member, IEEE, and L. Lorne Campbell, Life Fellow, IEEE

Abstract--In this work, we examine the existence and the computation of the Rényi divergence rate, lim_{n→∞} (1/n) D_α(p^(n) ‖ q^(n)), between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions p and q, respectively. This yields a generalization of a result of Nemetz, who assumed that the initial probabilities under p and q are strictly positive. The main tools used to obtain the Rényi divergence rate are the theory of nonnegative matrices and Perron–Frobenius theory. We also provide numerical examples and investigate the limits of the Rényi divergence rate as α → 1 and as α → 0. Similarly, we provide a formula for the Rényi entropy rate, lim_{n→∞} (1/n) H_α(p^(n)), of Markov sources and examine its limits as α → 1 and as α → 0. Finally, we briefly provide an application to source coding.

Index Terms--Kullback–Leibler divergence rate, nonnegative matrices,
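The entropy-rate formula the abstract refers to can be illustrated numerically. The sketch below assumes a first-order irreducible Markov chain and uses the spectral-radius form H_α = log λ / (1 − α) for α ≠ 1, where λ is the Perron–Frobenius (largest-magnitude) eigenvalue of the matrix obtained by raising each transition probability to the power α; the function name and the example chain are illustrative, not taken from the paper:

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Rényi entropy rate (in nats) of a finite-alphabet Markov source.

    Assumes P is an irreducible stochastic (transition) matrix and
    alpha is a positive order with alpha != 1.  Computes
    H_alpha = log(lambda) / (1 - alpha), where lambda is the largest
    eigenvalue (in magnitude) of the entrywise alpha-th power of P.
    """
    R = np.power(P, alpha)                 # entrywise alpha-th power of P
    lam = max(abs(np.linalg.eigvals(R)))   # Perron-Frobenius eigenvalue
    return float(np.log(lam) / (1.0 - alpha))

# Sanity check: a binary source with uniform transitions has entropy
# rate log 2 for every order alpha, since lambda = 2 * 0.5**alpha.
P = np.array([[0.5, 0.5],
              [0.5, 0.5]])
print(renyi_entropy_rate(P, alpha=2.0))   # ~0.6931, i.e. log 2
```

As α → 1 this quantity tends to the ordinary Shannon entropy rate of the chain, consistent with the limit discussed in the abstract.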

  

Source: Alajaji, Fady - Department of Mathematics and Statistics, Queen's University (Kingston)

Collections: Engineering