Seminar: April 26

Tom Hayes, University of New Mexico

Faster Liftings for Markov Chains

A "lifting" of a Markov chain is a larger chain obtained by replacing each state of the original chain by a set of states, with transition probabilities defined in such a way that the lifted chain projects down exactly to the original one. It is well known that lifting can potentially reduce the mixing time substantially. Essentially all known examples of efficiently implementable liftings have required a high degree of symmetry in the original chain. Addressing an open question of Chen, Lovász, and Pak, we present the first example of a successful lifting for a complex Markov chain that has been used in sampling algorithms. This chain, first introduced by Sinclair and Jerrum, samples a leaf uniformly at random in a large tree, given approximate information about the number of leaves in any subtree, and has applications to the theory of approximate counting and to importance sampling in statistics. Our lifted version of the chain (which, unlike the original one, is non-reversible) gives a significant speedup over the original version whenever the error in the leaf-counting estimates is o(1). Our lifting construction, based on flows, is systematic, and we conjecture that it may be applicable to other Markov chains used in sampling algorithms.
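To make the notion of "projects down exactly" concrete, here is a minimal sketch (not the tree-sampling chain from the talk, and not the paper's construction) of the classic non-reversible lifting of the simple random walk on an n-cycle: each vertex is replaced by two copies tagged with a direction of travel, and the walk mostly keeps its direction, reversing only with small probability. The projection condition checked below — stationary flows between fibers match the original chain's flows — is the standard one; all names here are illustrative.

```python
from fractions import Fraction

n = 6
flip = Fraction(1, n)  # small probability of reversing direction

# Lifted state space: (vertex, direction), direction in {+1, -1}.
lifted = [(v, d) for v in range(n) for d in (+1, -1)]

def P_lifted(s, t):
    """Transition probability of the lifted (non-reversible) chain."""
    (v, d), (w, e) = s, t
    if e == d and w == (v + d) % n:
        return 1 - flip          # keep direction, step forward
    if e == -d and w == (v - d) % n:
        return flip              # reverse direction, step back
    return Fraction(0)

def P_orig(v, w):
    """Simple random walk on the n-cycle: +1 or -1 with probability 1/2."""
    if w in ((v + 1) % n, (v - 1) % n):
        return Fraction(1, 2)
    return Fraction(0)

pi_hat = Fraction(1, 2 * n)  # lifted stationary distribution (uniform)
pi = Fraction(1, n)          # original stationary distribution (uniform)

# Projection (flow) condition: the total stationary flow between the
# fibers over v and w in the lifted chain equals the original flow.
for v in range(n):
    for w in range(n):
        lifted_flow = sum(pi_hat * P_lifted(s, t)
                          for s in lifted if s[0] == v
                          for t in lifted if t[0] == w)
        assert lifted_flow == pi * P_orig(v, w)
print("lifting projects exactly onto the walk on the cycle")
```

The point of the direction tag is that the lifted walk travels coherently around the cycle instead of diffusing, which is the mechanism behind the mixing-time speedups that liftings can achieve.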

The paper appeared in RANDOM 2010. Joint work with Alistair Sinclair of U.C. Berkeley.