Coupling from the past explained
Among Markov chain Monte Carlo (MCMC) algorithms, coupling from the past is a method for sampling from the stationary distribution of a Markov chain. Unlike many MCMC algorithms, coupling from the past gives, in principle, a perfect sample from the stationary distribution. It was invented by James Propp and David Wilson in 1996.
The basic idea
Consider a finite-state irreducible aperiodic Markov chain $M$ with state space $S$ and (unique) stationary distribution $\pi$ ($\pi$ is a probability vector). Suppose that we come up with a probability distribution $\mu$ on the set of maps $f : S \to S$ with the property that for every fixed $s \in S$, its image $f(s)$ is distributed according to the transition probability of $M$ from state $s$. An example of such a probability distribution is the one where $f(s)$ is independent from $f(s')$ whenever $s \ne s'$, but it is often worthwhile to consider other distributions. Now let $f_j$ for $j = -1, -2, \dots$ be independent samples from $\mu$.
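As a concrete illustration of the simplest choice of $\mu$, the sketch below (not part of the original article) draws one random map $f$ for a small hypothetical three-state chain by sampling each image $f(s)$ independently from the transition probabilities out of $s$. The matrix `P` and the helper `sample_map` are illustrative names introduced here, not anything from the source.

```python
import random

# Hypothetical transition matrix of a small irreducible, aperiodic chain on
# S = {0, 1, 2}; row s lists the transition probabilities out of state s.
P = [
    [0.5,  0.5,  0.0],
    [0.25, 0.5,  0.25],
    [0.0,  0.5,  0.5],
]

def sample_map(P, rng=random):
    """Draw one map f : S -> S from the simplest valid mu: each image f(s)
    is sampled independently according to row s of P."""
    return [rng.choices(range(len(P)), weights=row)[0] for row in P]
```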
Suppose that $x$ is chosen randomly according to $\pi$ and is independent from the sequence $(f_j)$. (We do not worry for now where this $x$ is coming from.) Then $f_{-1}(x)$ is also distributed according to $\pi$, because $\pi$ is $M$-stationary and, by our assumption on the law of $f_{-1}$, applying $f_{-1}$ to $x$ amounts to one transition of $M$ from $x$. Define

$$F_j := f_{-1} \circ f_{-2} \circ \cdots \circ f_{-j}.$$

Then it follows by induction that $F_j(x)$ is also distributed according to $\pi$ for every $j \in \mathbb{N}$. However, it may happen that for some $n \in \mathbb{N}$ the image of the map $F_n$ is a single element of $S$. In other words, $F_n(x) = F_n(y)$ for each $y \in S$. Therefore, we do not need to have access to $x$ in order to compute $F_n(x)$. The algorithm then involves finding some $n$ such that the image of $F_n$ is a singleton, and outputting the element of that singleton. The design of a good distribution $\mu$ for which the task of finding such an $n$ and computing $F_n$ is not too costly is not always obvious, but has been accomplished successfully in several important instances.[1]
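A minimal sketch of this procedure is given below. It continues the illustrative setup above (reusing the hypothetical `P` and `sample_map`) and is not the authors' implementation: the maps $f_{-1}, f_{-2}, \dots$ are sampled once and kept, the composition $F_n$ is applied to every state, and $n$ is pushed further into the past (here by doubling, a choice the article only describes for the monotone case below) until the image of $F_n$ collapses to a single state.

```python
def cftp(P, rng=random):
    """Return one state distributed exactly according to pi (a sketch).

    Assumes sample_map and a transition matrix P as in the sketch above.
    maps[j] plays the role of f_{-(j+1)}; maps are sampled once and reused
    as n grows, which is required for correctness of coupling from the past."""
    states = range(len(P))
    maps = []
    n = 1
    while True:
        while len(maps) < n:
            maps.append(sample_map(P, rng))
        # Apply F_n = f_{-1} o f_{-2} o ... o f_{-n} to every starting state:
        # start each trajectory at time -n and run it forward to time 0.
        image = set()
        for s in states:
            x = s
            for j in reversed(range(n)):   # f_{-n} first, ..., f_{-1} last
                x = maps[j][x]
            image.add(x)
        if len(image) == 1:
            return image.pop()             # the image of F_n is a singleton
        n *= 2                             # look further back into the past
```

For the three-state example above, repeated calls to `cftp(P)` produce independent exact samples from $\pi$.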
The monotone case
There is a special class of Markov chains in which there are particularly good choices for $\mu$ and a tool for determining if $|F_n(S)| = 1$. (Here $|\cdot|$ denotes cardinality.) Suppose that $S$ is a partially ordered set with order $\le$, which has a unique minimal element $\hat 0$ and a unique maximal element $\hat 1$; that is, every $s \in S$ satisfies $\hat 0 \le s \le \hat 1$. Also, suppose that $\mu$ may be chosen to be supported on the set of monotone maps $f : S \to S$. Then it is easy to see that $|F_n(S)| = 1$ if and only if $F_n(\hat 0) = F_n(\hat 1)$, since $F_n$ is monotone. Thus, checking this becomes rather easy. The algorithm can proceed by choosing $n = n_0$ for some constant $n_0$, sampling the maps $f_{-1}, \dots, f_{-n_0}$, and outputting $F_n(\hat 0)$ if $F_n(\hat 0) = F_n(\hat 1)$. If $F_n(\hat 0) \ne F_n(\hat 1)$, the algorithm proceeds by doubling $n$ and repeating as necessary until an output is obtained. (However, the algorithm does not resample the maps $f_{-j}$ which were already sampled; it uses the previously sampled maps when needed.)
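The following sketch (again illustrative, not the authors' code) implements this doubling scheme for the monotone case. It assumes a user-supplied monotone update rule `phi(state, u)`: applying `phi` with the same random input `u` to two ordered states yields ordered outputs. Only the random inputs driving the maps are stored and reused, and coalescence is detected by running just the two chains started at the minimal and maximal elements.

```python
import random

def monotone_cftp(phi, s_min, s_max, rng=random):
    """Monotone coupling from the past (sketch).

    phi(s, u) must be monotone in s for every fixed u, with u ~ Uniform(0, 1)
    driving one transition of the chain.  us[j] is the randomness used by the
    map at time -(j+1); it is sampled once and reused when n is doubled."""
    us = []
    n = 1
    while True:
        while len(us) < n:
            us.append(rng.random())
        lo, hi = s_min, s_max
        for j in reversed(range(n)):       # run from time -n up to time 0
            lo = phi(lo, us[j])
            hi = phi(hi, us[j])
        if lo == hi:                       # F_n(min) = F_n(max), hence |F_n(S)| = 1
            return lo
        n *= 2                             # double n; previously sampled us are kept

# Hypothetical example: a nearest-neighbour walk on {0, ..., 5} with sticky
# boundaries, whose natural update rule is monotone.
step = lambda s, u: min(s + 1, 5) if u < 0.5 else max(s - 1, 0)
sample = monotone_cftp(step, 0, 5)
```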
References
- Propp, James Gary; Wilson, David Bruce (1996). "Exact sampling with coupled Markov chains and applications to statistical mechanics". Proceedings of the Seventh International Conference on Random Structures and Algorithms (Atlanta, GA, 1995), pp. 223–252. MR 1611693.
Notes and References
- Web site: "Perfectly Random Sampling with Markov Chains".