# Seminar: October 23, 2008, 2:30pm

## The Communication Complexity of Correlation

Note non-standard day and time!

In this talk, we examine the communication required for generating correlated random variables remotely. Consider a pair of jointly distributed random variables (X,Y). Suppose Alice is given a sample x distributed according to X, and she has to send a message to Bob, who is then required to generate a value with distribution exactly Y|_{X=x} (i.e., the conditional distribution of Y given X=x). We consider the minimum expected number of bits (which we call C(X:Y)) Alice needs to send Bob to achieve this. We show that if Alice and Bob are allowed to share random bits (independent of Alice's input x), then Alice need send only approximately I[X:Y] bits, the mutual information. More formally,

I[X:Y] <= C(X:Y) <= I[X:Y] + O(log I[X:Y]) + O(1)

where I[X:Y] = H[X] + H[Y] - H[X,Y] is the mutual information between the random variables X and Y.
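The quantity I[X:Y] can be computed directly from the joint distribution via the identity above. A minimal sketch (the joint distribution below is a hypothetical example, not one from the talk):

```python
import math

def entropy(probs):
    """Shannon entropy in bits of a distribution given as probability values."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_information(joint):
    """I[X:Y] = H[X] + H[Y] - H[X,Y], from a dict mapping (x, y) -> prob."""
    px, py = {}, {}
    for (x, y), p in joint.items():
        px[x] = px.get(x, 0.0) + p  # marginal of X
        py[y] = py.get(y, 0.0) + p  # marginal of Y
    return entropy(px.values()) + entropy(py.values()) - entropy(joint.values())

# Example: X is a fair bit, Y equals X with probability 0.9.
joint = {(0, 0): 0.45, (0, 1): 0.05, (1, 0): 0.05, (1, 1): 0.45}
print(mutual_information(joint))
```

For independent X and Y the result is 0, and for a perfectly correlated fair bit it is 1, matching the usual interpretation of mutual information as the number of shared bits.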

As an application of this characterization of mutual information, we derive a direct sum theorem in communication complexity that substantially improves the previous such result shown by Jain et al. (2003).

Our proofs are based on a rejection sampling procedure that relates the relative entropy between two distributions to the communication complexity of generating one distribution from the other.
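To give intuition for this idea, here is a sketch of vanilla rejection sampling, which converts samples from a proposal distribution p into a sample from a target distribution q. This is a textbook illustration, not the paper's exact procedure; in the protocol setting, the shared randomness plays the role of the proposal draws, and roughly speaking Alice communicates the index of the accepted draw, whose expected description length is governed by the relative entropy D(q || p):

```python
import random

def rejection_sample(p, q, rng):
    """Draw from proposal p until a draw is accepted as a sample of q.

    p and q are dicts mapping outcomes to probabilities. Returns the
    accepted outcome and the (1-based) index of the accepted draw.
    """
    outcomes = list(p)
    weights = [p[x] for x in outcomes]
    # Envelope constant M = max q(x)/p(x); acceptance prob is q(x)/(M p(x)).
    m = max(q[x] / p[x] for x in outcomes if q[x] > 0)
    i = 0
    while True:
        i += 1
        x = rng.choices(outcomes, weights=weights)[0]
        if rng.random() < q[x] / (m * p[x]):
            return x, i
```

For example, with p uniform on {0,1} and q = (0.9, 0.1), the accepted outcomes are distributed according to q; the closer q is to p, the fewer draws are needed before acceptance.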