 
Summary: Mutual Information, Relative Entropy, and Estimation in the Poisson Channel

Rami Atar¹ and Tsachy Weissman²,¹

¹ Technion Israel Institute of Technology, Haifa 32000, Israel
² Stanford University, Stanford, CA 94305

December 31, 2010
Abstract
Let X be a nonnegative random variable and let the conditional distribution of a random variable Y, given
X, be Poisson(γ · X), for a parameter γ ≥ 0. We identify a natural loss function such that:
· The derivative of the mutual information between X and Y with respect to γ is equal to the minimum mean
loss in estimating X based on Y, regardless of the distribution of X.
· When X ∼ P is estimated based on Y by a mismatched estimator that would have minimized the expected
loss had X ∼ Q, the integral over all values of γ of the excess mean loss is equal to the relative entropy
between P and Q.
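Both bullet identities can be checked numerically for a simple discrete prior. The sketch below is under one assumption not shown in this excerpt: that the "natural loss function" is ℓ(x, x̂) = x log(x/x̂) − x + x̂ (the Bregman divergence of x log x, under which the conditional mean is the optimal estimator); the priors P and Q and the support points are illustrative choices.

```python
import math

# Numerical check of the two abstract identities for a toy two-point prior.
# Assumption (not displayed in this excerpt): the "natural loss function" is
# l(x, xhat) = x*log(x/xhat) - x + xhat, the Bregman divergence of x*log(x).

SUPPORT = [1.0, 3.0]   # support points of X (illustrative choice)
P = [0.4, 0.6]         # true prior P
Q = [0.7, 0.3]         # mismatched prior Q on the same support

def pois_pmf(y, lam):
    """Poisson pmf evaluated in log-space for numerical stability."""
    if lam <= 0.0:
        return 1.0 if y == 0 else 0.0
    return math.exp(-lam + y * math.log(lam) - math.lgamma(y + 1))

def loss(x, xhat):
    """Candidate loss l(x, xhat) = x*log(x/xhat) - x + xhat."""
    return x * math.log(x / xhat) - x + xhat

def y_cutoff(gamma):
    """Output-alphabet truncation point; Poisson mass beyond it is negligible."""
    lam = gamma * max(SUPPORT)
    return int(lam + 12.0 * math.sqrt(lam + 1.0)) + 25

def mutual_info(gamma):
    """I(X;Y) in nats when X ~ P and Y | X ~ Poisson(gamma * X)."""
    mi = 0.0
    for y in range(y_cutoff(gamma)):
        pmf = [pois_pmf(y, gamma * x) for x in SUPPORT]
        py = sum(p * f for p, f in zip(P, pmf))
        for p, f in zip(P, pmf):
            if f > 0.0 and py > 0.0:
                mi += p * f * math.log(f / py)
    return mi

def mean_losses(gamma):
    """Mean loss under X ~ P for the P-matched and Q-matched
    conditional-mean estimators; returns (matched, mismatched)."""
    lp = lq = 0.0
    for y in range(y_cutoff(gamma)):
        pmf = [pois_pmf(y, gamma * x) for x in SUPPORT]
        py_p = sum(p * f for p, f in zip(P, pmf))
        py_q = sum(q * f for q, f in zip(Q, pmf))
        if py_p <= 0.0:
            continue
        xh_p = sum(x * p * f for x, p, f in zip(SUPPORT, P, pmf)) / py_p
        xh_q = sum(x * q * f for x, q, f in zip(SUPPORT, Q, pmf)) / py_q
        for x, p, f in zip(SUPPORT, P, pmf):
            lp += p * f * loss(x, xh_p)
            lq += p * f * loss(x, xh_q)
    return lp, lq

# Bullet 1: dI/dgamma equals the minimum mean loss (central difference, gamma = 1).
gamma, h = 1.0, 1e-5
dI = (mutual_info(gamma + h) - mutual_info(gamma - h)) / (2.0 * h)
mmle = mean_losses(gamma)[0]
print(f"dI/dgamma = {dI:.6f}   min mean loss = {mmle:.6f}")

# Bullet 2: integrating the excess loss of the Q-matched estimator over gamma
# recovers D(P||Q); trapezoidal quadrature on [0, 50] (tail beyond is negligible).
dg = 0.1
excess = []
for k in range(501):
    lp, lq = mean_losses(dg * k)
    excess.append(lq - lp)
integral = dg * (sum(excess) - 0.5 * (excess[0] + excess[-1]))
dpq = sum(p * math.log(p / q) for p, q in zip(P, Q))
print(f"integral of excess loss = {integral:.4f}   D(P||Q) = {dpq:.4f}")
```

The conditional mean appears as the estimator here because the assumed loss is a Bregman divergence, for which the conditional expectation minimizes the expected loss; the quadrature and output truncation introduce only small numerical error.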
For a continuous time setting where X^T = {X_t, 0 ≤ t ≤ T} is a nonnegative stochastic process and the conditional
law of Y^T
