In probability theory, de Finetti's theorem states that exchangeable observations are conditionally independent relative to some latent variable. An epistemic probability distribution could then be assigned to this variable. It is named in honor of Bruno de Finetti.
For the special case of an exchangeable sequence of Bernoulli random variables it states that such a sequence is a "mixture" of sequences of independent and identically distributed (i.i.d.) Bernoulli random variables.
A sequence of random variables is called exchangeable if the joint distribution of the sequence is unchanged by any permutation of the indices. In general, while the variables of the exchangeable sequence are not themselves independent, only exchangeable, there is an underlying family of i.i.d. random variables. That is, there are underlying, generally unobservable, quantities that are i.i.d. – exchangeable sequences are mixtures of i.i.d. sequences.
A Bayesian statistician often seeks the conditional probability distribution of a random quantity given the data. The concept of exchangeability was introduced by de Finetti. De Finetti's theorem explains a mathematical relationship between independence and exchangeability.[1]
An infinite sequence
X1,X2,X3,...
of random variables is said to be exchangeable if, for any natural number n, any finite sequence of distinct indices i1, ..., in, and any permutation π of {i1, ..., in}, the vectors
(Xi1, ..., Xin) and (Xπ(i1), ..., Xπ(in))
both have the same joint probability distribution.
If an identically distributed sequence is independent, then the sequence is exchangeable; however, the converse is false—there exist exchangeable random variables that are not statistically independent, for example the Pólya urn model.
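The Pólya urn mentioned above can be checked exactly with a short computation. The following sketch (with illustrative urn parameters of one red and one blue ball and reinforcement of one ball, which are assumptions, not taken from the text) computes exact sequence probabilities, confirming that permuted draw sequences are equally likely while the draws are not independent.

```python
from fractions import Fraction
from itertools import product

def polya_prob(seq, red=1, blue=1, add=1):
    """Exact probability of a 0/1 draw sequence from a Polya urn.

    Start with `red` red and `blue` blue balls; after each draw, return the
    ball along with `add` extra balls of the same color. A 1 means "red drawn".
    (Illustrative parameters, not from the text.)
    """
    p = Fraction(1)
    r, b = red, blue
    for x in seq:
        total = r + b
        if x == 1:
            p *= Fraction(r, total)
            r += add
        else:
            p *= Fraction(b, total)
            b += add
    return p

# Exchangeability: every permutation of a sequence has the same probability.
assert polya_prob((1, 1, 0)) == polya_prob((1, 0, 1)) == polya_prob((0, 1, 1))

# Not independent: P(X1=1, X2=1) != P(X1=1) * P(X2=1).
p11 = sum(polya_prob((1, 1, x)) for x in (0, 1))
p1 = sum(polya_prob((1, x, y)) for x, y in product((0, 1), repeat=2))
assert p11 != p1 * p1
print(p11, p1 * p1)  # 1/3 1/4
```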
A random variable X has a Bernoulli distribution if Pr(X = 1) = p and Pr(X = 0) = 1 - p for some p ∈ (0, 1).
De Finetti's theorem states that the probability distribution of any infinite exchangeable sequence of Bernoulli random variables is a "mixture" of the probability distributions of independent and identically distributed sequences of Bernoulli random variables. "Mixture", in this sense, means a weighted average, but this need not mean a finite or countably infinite (i.e., discrete) weighted average: it can be an integral over a measure rather than a sum.
More precisely, suppose X1, X2, X3, ... is an infinite exchangeable sequence of Bernoulli-distributed random variables. Then there is some probability measure m on the interval [0, 1] and some random variable Y such that the probability distribution of Y is m, and the conditional distribution of the whole sequence given the value of Y is that of independent Bernoulli random variables with Pr(Xi = 1 | Y) = Y for every i.
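The mixture representation implies, for example, that Pr(X1 = ... = Xk = 1) equals the k-th moment E[Y^k] of the mixing measure. A minimal numerical check, taking the mixing measure to be Beta(2, 3) purely as an illustrative assumption, compares the exact moment with a Monte Carlo estimate from the mixed sequence:

```python
import random
from fractions import Fraction

# Mixing measure m = Beta(a, b): an arbitrary illustrative choice.
a, b = 2, 3

def beta_moment(k):
    """E[Y^k] for Y ~ Beta(a, b): the product of (a+i)/(a+b+i) for i = 0..k-1."""
    m = Fraction(1)
    for i in range(k):
        m *= Fraction(a + i, a + b + i)
    return m

# De Finetti: Pr(X1 = ... = Xk = 1) = E[Y^k] for the mixed sequence.
random.seed(0)
k, trials = 3, 200_000
hits = 0
for _ in range(trials):
    y = random.betavariate(a, b)                    # draw the latent parameter Y ~ m
    if all(random.random() < y for _ in range(k)):  # then k i.i.d. Bernoulli(Y) flips
        hits += 1

exact = beta_moment(k)        # (2/5)*(3/6)*(4/7) = 4/35
estimate = hits / trials
assert abs(estimate - float(exact)) < 0.01
```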
Here is a concrete example. We construct a sequence
X1,X2,X3,...
of random variables, by "mixing" two i.i.d. sequences as follows.
We assume p = 2/3 with probability 1/2 and p = 9/10 with probability 1/2. Given the event p = 2/3, the conditional distribution of the sequence is that the Xi are independent and identically distributed and X1 = 1 with probability 2/3 and X1 = 0 with probability 1 - 2/3. Given the event p = 9/10, the conditional distribution of the sequence is that the Xi are independent and identically distributed and X1 = 1 with probability 9/10 and X1 = 0 with probability 1 - 9/10.
This can be interpreted as follows: Make two biased coins, one showing "heads" with 2/3 probability and one showing "heads" with 9/10 probability. Flip a fair coin once to decide which biased coin to use for all flips that are recorded. Here "heads" at flip i means Xi=1.
The independence asserted here is conditional independence, i.e. the Bernoulli random variables in the sequence are conditionally independent given the event that p = 2/3, and are conditionally independent given the event that p = 9/10. But they are not unconditionally independent; they are positively correlated.
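The positive correlation can be computed exactly for this two-coin example. Conditional independence gives E[X1 X2] = E[Y^2], while E[X1] = E[Y], so Cov(X1, X2) = Var(Y); the sketch below evaluates this with exact fractions.

```python
from fractions import Fraction

# The two-point mixing distribution from the text: p = 2/3 or p = 9/10,
# each chosen with probability 1/2.
ps = [Fraction(2, 3), Fraction(9, 10)]
w = Fraction(1, 2)

e_x = w * ps[0] + w * ps[1]          # E[X1] = E[Y]
e_xx = w * ps[0]**2 + w * ps[1]**2   # E[X1*X2] = E[Y^2], by conditional independence
cov = e_xx - e_x**2                  # Cov(X1, X2) = Var(Y)

assert cov > 0                       # the flips are positively correlated
print(e_x, e_xx, cov)                # 47/60 1129/1800 49/3600
```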
In view of the strong law of large numbers, we can say that
lim_{n→∞} (X1 + ⋯ + Xn)/n is equal to 2/3 with probability 1/2, and to 9/10 with probability 1/2.
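This limiting behavior is easy to see in simulation: each independently generated run of the mixed sequence has an empirical average that settles near 2/3 or near 9/10, never in between. A minimal sketch:

```python
import random

random.seed(1)

def limiting_average(n=100_000):
    """Sample the mixed sequence once and return the empirical mean of n flips."""
    p = random.choice([2/3, 9/10])   # a fair coin picks which biased coin to use
    heads = sum(random.random() < p for _ in range(n))
    return heads / n

means = [limiting_average() for _ in range(20)]
# Each run's average is close to one of the two mixture points.
assert all(min(abs(m - 2/3), abs(m - 9/10)) < 0.01 for m in means)
```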
Rather than concentrating probability 1/2 at each of two points between 0 and 1, the "mixing distribution" can be any probability distribution supported on the interval from 0 to 1; which one it is depends on the joint distribution of the infinite sequence of Bernoulli random variables.
The definition of exchangeability, and the statement of the theorem, also make sense for finite-length sequences
X1,...,Xn,
but the theorem is not generally true in that case. It is true if the sequence can be extended to an exchangeable sequence that is infinitely long. The simplest example of an exchangeable sequence of Bernoulli random variables that cannot be so extended is the one in which X1 = 1 - X2 and X1 is either 0 or 1, each with probability 1/2. This sequence is exchangeable, but cannot be extended to an exchangeable sequence of length 3, let alone an infinitely long one.
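The impossibility of extending this pair can be seen from the moment inequality that de Finetti's theorem would force: any mixture representation with Pr(X1 = 1) = E[Y] = 1/2 gives Pr(X1 = 1, X2 = 1) = E[Y^2] ≥ E[Y]^2 = 1/4, whereas the pair above has Pr(X1 = 1, X2 = 1) = 0. A small sketch, sweeping two-point mixing measures with mean 1/2 (an illustrative family, not exhaustive), shows the bound numerically:

```python
import random

random.seed(2)

# The pair (X1, 1 - X1) with X1 uniform on {0, 1} is exchangeable, since
# Pr(0, 1) = Pr(1, 0) = 1/2, but it has Pr(1, 1) = 0. If it extended to an
# infinite exchangeable sequence, de Finetti's theorem would force
# Pr(X1 = 1, X2 = 1) = E[Y^2] >= E[Y]^2 = 1/4 > 0, a contradiction.
for _ in range(1000):
    # A random two-point mixing measure on [0, 1] with weights 1/2 and mean 1/2.
    p = random.uniform(0, 0.5)
    q = 1 - p
    second_moment = 0.5 * p**2 + 0.5 * q**2
    assert second_moment >= 0.25     # E[Y^2] >= (E[Y])^2, here with E[Y] = 1/2
```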
Versions of de Finetti's theorem for finite exchangeable sequences[2][3] and for Markov exchangeable sequences[4] have been proved by Diaconis and Freedman and by Kerns and Székely. Two notions of partial exchangeability of arrays, known as separate and joint exchangeability, lead to extensions of de Finetti's theorem for arrays by Aldous and Hoover.[5]
The computable de Finetti theorem shows that if an exchangeable sequence of real random variables is given by a computer program, then a program which samples from the mixing measure can be automatically recovered.[6]
In the setting of free probability, there is a noncommutative extension of de Finetti's theorem which characterizes noncommutative sequences invariant under quantum permutations.[7]
Extensions of de Finetti's theorem to quantum states have been found to be useful in quantum information,[8][9][10] in topics like quantum key distribution[11] and entanglement detection.[12] A multivariate extension of de Finetti's theorem can be used to derive Bose–Einstein statistics from the statistics of classical (i.e. independent) particles.[13]