 
Summary: Information Equals Amortized Communication
Mark Braverman
Anup Rao
November 8, 2010
Abstract
We show how to efficiently simulate the sending of a message M to a receiver who has partial information
about the message, so that the expected number of bits communicated in the simulation is close to the amount
of additional information that the message reveals to the receiver. This is a generalization and strengthening of
the Slepian-Wolf theorem, which shows how to carry out such a simulation with low amortized communication
in the case that M is a deterministic function of the sender's input X. A caveat is that our simulation is interactive.
As a consequence, we obtain new relationships between the randomized amortized communication complexity of a function and its information complexity. We prove that for any given distribution on inputs, the
internal information cost (namely the information revealed to the parties) involved in computing any relation
or function using a two-party interactive protocol is exactly equal to the amortized communication complexity of computing independent copies of the same relation or function. Here by amortized communication
complexity we mean the average per copy communication in the best protocol for computing multiple copies,
with a bound on the error in each copy. This significantly simplifies the relationships between the various
measures of complexity for average case communication protocols, and proves that if a function's information
cost is smaller than its communication complexity, then multiple copies of the function can be computed more
efficiently in parallel than sequentially.
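In symbols, the main equality described above can be stated as follows (the notation here is assumed, following common information-complexity conventions, and is not fixed by the abstract itself):

```latex
\[
  \mathrm{IC}_{\mu}(f, \epsilon)
  \;=\;
  \lim_{n \to \infty} \frac{D_{\mu^{n}}\!\left(f^{n}, \epsilon\right)}{n},
\]
```

where $\mathrm{IC}_{\mu}(f,\epsilon)$ denotes the internal information cost of computing $f$ with error $\epsilon$ when the inputs are drawn from $\mu$, $f^{n}$ denotes $n$ independent copies of $f$, and $D_{\mu^{n}}(f^{n},\epsilon)$ is the least communication of a protocol that computes all $n$ copies with error at most $\epsilon$ in each copy. The claim about parallel computation follows: whenever $\mathrm{IC}_{\mu}(f,\epsilon)$ is strictly smaller than the single-copy communication complexity, the per-copy cost of $f^{n}$ drops below the cost of solving the copies one at a time.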
