Seminar: May 24

Mark Braverman, University of Toronto

Information and Interactive Communication

Notions of entropy and information, pioneered by Shannon, have been very powerful tools in coding theory. Coding theory aims to solve the problem of one-way communication: sending a message from Alice to Bob using as little communication as possible, sometimes over a noisy channel.
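(For reference, a standard formulation of these notions, not spelled out in the abstract itself: for a random variable X, the Shannon entropy is

    H(X) = -\sum_x \Pr[X = x] \log \Pr[X = x],

and the mutual information between X and Y is

    I(X; Y) = H(X) - H(X \mid Y),

the amount by which knowing Y reduces the uncertainty about X.)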

We will discuss several extensions of information-theoretic notions to the two-way communication setting. We use them to prove a direct sum theorem for randomized communication complexity, showing that implementing k copies of a functionality requires substantially more communication than just one copy.
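(In notation that is ours rather than the abstract's: writing f^k for the task of computing k independent instances of f and R for randomized communication complexity, the direct sum question asks whether

    R(f^k) = \Omega(k \cdot R(f)),

i.e., whether solving k copies can ever be much cheaper than solving each copy separately.)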

More generally, we will show that the information cost I(f) can be defined as a natural, fundamental property of a functionality f. We will describe several new, tight connections between I(f), direct sum theorems, interactive compression schemes, and amortized communication complexity.
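(One common way to make I(f) precise, again in notation not used in the abstract: for inputs (X, Y) drawn from a distribution \mu and a protocol transcript \Pi,

    IC_\mu(f) = \inf_{\pi} \big[ I(X; \Pi \mid Y) + I(Y; \Pi \mid X) \big],

where the infimum runs over protocols \pi that compute f; this measures how much each party must learn about the other's input. The amortized connection then takes the rough form

    \lim_{n \to \infty} R(f^n) / n = IC(f),

so that information cost captures the per-copy communication needed to solve many copies of f.)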