Mutual Information, Relative Entropy, and Estimation in the Poisson Channel

Summary: Mutual Information, Relative Entropy, and Estimation in the Poisson Channel

Rami Atar¹ and Tsachy Weissman²,¹

¹ Technion – Israel Institute of Technology, Haifa 32000, Israel
² Stanford University, Stanford, CA 94305
December 31, 2010
Let X be a non-negative random variable and let the conditional distribution of a random variable Y, given X, be Poisson(γX), for a parameter γ ≥ 0. We identify a natural loss function such that:

- The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean loss in estimating X based on Y, regardless of the distribution of X.

- When X ~ P is estimated based on Y by a mismatched estimator that would have minimized the expected loss had X ~ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy between P and Q.
For a continuous-time setting where X^T = {X_t, 0 ≤ t ≤ T} is a non-negative stochastic process and the conditional law of Y^T, given X^T, …


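The two identities above can be checked numerically for a discrete toy prior. The sketch below assumes the loss function identified in the published version of this paper, ℓ(x, x̂) = x log(x/x̂) − x + x̂ (a Bregman divergence of x log x, for which the conditional mean E[X|Y] is the optimal estimator); the two-point support of X and the mismatched prior Q are illustrative choices, not taken from the paper:

```python
import math

def pois_pmf(y, lam):
    """Poisson pmf P(Y = y) with mean lam (lam = 0 gives a point mass at 0)."""
    if lam == 0.0:
        return 1.0 if y == 0 else 0.0
    return math.exp(-lam + y * math.log(lam) - math.lgamma(y + 1))

def loss(x, xhat):
    """The 'natural' Poisson loss: l(x, xhat) = x log(x/xhat) - x + xhat."""
    return x * math.log(x / xhat) - x + xhat

XS = [1.0, 3.0]   # toy support of X (illustrative, not from the paper)
Y_MAX = 250       # truncation of the Poisson support; ample for the lambdas used

def posterior_mean(y, gamma, prior):
    """E[X | Y = y] when the estimator believes X ~ prior; None if P(Y=y) = 0."""
    pyx = [pois_pmf(y, gamma * x) for x in XS]
    py = sum(p * q for p, q in zip(prior, pyx))
    if py == 0.0:
        return None
    return sum(p * q * x for p, q, x in zip(prior, pyx, XS)) / py

def mean_loss(gamma, true_prior, est_prior):
    """E[l(X, xhat(Y))] when X ~ true_prior but xhat assumes X ~ est_prior."""
    total = 0.0
    for y in range(Y_MAX):
        xhat = posterior_mean(y, gamma, est_prior)
        if xhat is None:
            continue
        for p, x in zip(true_prior, XS):
            total += p * pois_pmf(y, gamma * x) * loss(x, xhat)
    return total

def mutual_info(gamma, prior):
    """I(X;Y) in nats for Y | X ~ Poisson(gamma * X)."""
    I = 0.0
    for y in range(Y_MAX):
        pyx = [pois_pmf(y, gamma * x) for x in XS]
        py = sum(p * q for p, q in zip(prior, pyx))
        for p, q in zip(prior, pyx):
            if q > 0.0 and py > 0.0:
                I += p * q * math.log(q / py)
    return I

P = [0.5, 0.5]   # true prior on XS
Q = [0.7, 0.3]   # mismatched prior

# Identity 1: dI/dgamma equals the minimum mean loss E[l(X, E[X|Y])].
gamma, h = 2.0, 1e-4
deriv = (mutual_info(gamma + h, P) - mutual_info(gamma - h, P)) / (2 * h)
mmle = mean_loss(gamma, P, P)
print(deriv, mmle)   # the two numbers should agree

# Identity 2: the excess mean loss of the Q-based estimator, integrated over
# gamma, equals D(P||Q).  Trapezoidal rule on [0, 30]; the excess loss decays
# rapidly in gamma, so the truncation error is negligible here.
dl = 0.05
grid = [i * dl for i in range(int(30 / dl) + 1)]
vals = [mean_loss(g, P, Q) - mean_loss(g, P, P) for g in grid]
integral = sum(0.5 * (a + b) * dl for a, b in zip(vals, vals[1:]))
kl = sum(p * math.log(p / q) for p, q in zip(P, Q))
print(integral, kl)  # should agree up to discretization error
```

Both identities are in nats (natural logarithms throughout); the derivative is approximated by a central difference and the γ-integral by a trapezoidal rule, so agreement is up to small numerical error.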
Source: Atar, Rami - Department of Electrical Engineering, Technion, Israel Institute of Technology


Collections: Engineering