Chain rule (probability)

In probability theory, the chain rule[1] (also called the general product rule[2][3]) describes how to calculate the probability of the intersection of (not necessarily independent) events, or equivalently the joint distribution of random variables, using conditional probabilities. The rule allows one to express a joint probability in terms of conditional probabilities only.[4] It is notably used in the context of discrete stochastic processes and in applications such as the study of Bayesian networks, which describe a probability distribution in terms of conditional probabilities.

Chain rule for events

Two events

For two events A and B, the chain rule states that

P(A \cap B) = P(B \mid A) P(A),

where P(B \mid A) denotes the conditional probability of B given A.

Example

Urn A contains 1 black ball and 2 white balls, and urn B contains 1 black ball and 3 white balls. Suppose we pick an urn at random and then select a ball from that urn. Let event A be choosing the first urn, i.e. P(A) = P(\overline{A}) = 1/2, where \overline{A} is the complementary event of A. Let event B be choosing a white ball. The probability of choosing a white ball, given that we have chosen the first urn, is P(B \mid A) = 2/3. The intersection A \cap B then describes choosing the first urn and a white ball from it. The probability can be calculated by the chain rule as follows:

P(A \cap B) = P(B \mid A) P(A) = \frac{2}{3} \cdot \frac{1}{2} = \frac{1}{3}.
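
As a quick numerical check, the same result can be reproduced by simulation. The following sketch (standard-library Python only; the function name draw is purely illustrative) samples an urn uniformly, then a ball from the chosen urn, and compares the empirical frequency of "urn A and white" with the exact value 1/3:

    import random

    def draw():
        # Pick an urn uniformly at random: urn A holds 1 black + 2 white
        # balls, urn B holds 1 black + 3 white balls (as in the example).
        urn = random.choice(["A", "B"])
        balls = (["black", "white", "white"] if urn == "A"
                 else ["black", "white", "white", "white"])
        return urn, random.choice(balls)

    trials = 100_000
    hits = sum(draw() == ("A", "white") for _ in range(trials))

    # Chain rule: P(A ∩ B) = P(B | A) P(A) = (2/3)(1/2) = 1/3
    print(hits / trials)  # ≈ 0.333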

Finitely many events

For events A_1, \ldots, A_n whose intersection has nonzero probability, the chain rule states

\begin{align}
P\left(A_1 \cap A_2 \cap \ldots \cap A_n\right) &= P\left(A_n \mid A_1 \cap \ldots \cap A_{n-1}\right) P\left(A_1 \cap \ldots \cap A_{n-1}\right) \\
&= P\left(A_n \mid A_1 \cap \ldots \cap A_{n-1}\right) P\left(A_{n-1} \mid A_1 \cap \ldots \cap A_{n-2}\right) P\left(A_1 \cap \ldots \cap A_{n-2}\right) \\
&= P\left(A_n \mid A_1 \cap \ldots \cap A_{n-1}\right) P\left(A_{n-1} \mid A_1 \cap \ldots \cap A_{n-2}\right) \cdots P(A_3 \mid A_1 \cap A_2) P(A_2 \mid A_1) P(A_1) \\
&= P(A_1) P(A_2 \mid A_1) P(A_3 \mid A_1 \cap A_2) \cdots P(A_n \mid A_1 \cap \ldots \cap A_{n-1}) \\
&= \prod_{k=1}^{n} P(A_k \mid A_1 \cap \ldots \cap A_{k-1}) \\
&= \prod_{k=1}^{n} P\left(A_k \,\middle|\, \bigcap_{j=1}^{k-1} A_j\right).
\end{align}

Example 1

For n = 4, i.e. four events, the chain rule reads

\begin{align}
P(A_1 \cap A_2 \cap A_3 \cap A_4) &= P(A_4 \mid A_3 \cap A_2 \cap A_1) P(A_3 \cap A_2 \cap A_1) \\
&= P(A_4 \mid A_3 \cap A_2 \cap A_1) P(A_3 \mid A_2 \cap A_1) P(A_2 \cap A_1) \\
&= P(A_4 \mid A_3 \cap A_2 \cap A_1) P(A_3 \mid A_2 \cap A_1) P(A_2 \mid A_1) P(A_1).
\end{align}

Example 2

We randomly draw 4 cards without replacement from a deck of 52 cards. What is the probability that we have picked 4 aces?

First, we set A_n := \{\text{we draw an ace in the } n\text{th draw}\}. We then get the following probabilities:

P(A_1) = \frac{4}{52}, \qquad P(A_2 \mid A_1) = \frac{3}{51}, \qquad P(A_3 \mid A_1 \cap A_2) = \frac{2}{50}, \qquad P(A_4 \mid A_1 \cap A_2 \cap A_3) = \frac{1}{49}.

Applying the chain rule,

P(A_1 \cap A_2 \cap A_3 \cap A_4) = \frac{4}{52} \cdot \frac{3}{51} \cdot \frac{2}{50} \cdot \frac{1}{49} = \frac{1}{270725}.
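
The same arithmetic can be carried out exactly with Python's fractions module; a minimal check:

    from fractions import Fraction

    # Multiply the conditional probabilities from the chain rule:
    # P(A1) · P(A2 | A1) · P(A3 | A1 ∩ A2) · P(A4 | A1 ∩ A2 ∩ A3)
    p = Fraction(4, 52) * Fraction(3, 51) * Fraction(2, 50) * Fraction(1, 49)
    print(p)         # 1/270725
    print(float(p))  # ≈ 3.69e-06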

Statement of the theorem and proof

Let (\Omega, \mathcal{A}, P) be a probability space. Recall that the conditional probability of an A \in \mathcal{A} given B \in \mathcal{A} is defined as

P(A \mid B) := \begin{cases} \dfrac{P(A \cap B)}{P(B)}, & P(B) > 0, \\ 0, & P(B) = 0. \end{cases}

Then we have the following theorem.
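
In the notation above, the theorem is the chain rule for finitely many events, now stated without any assumption on the probabilities of the intersections; a short proof sketch by induction follows.

Chain rule. Let (\Omega, \mathcal{A}, P) be a probability space and A_1, \ldots, A_n \in \mathcal{A}. Then

P\left(\bigcap_{k=1}^{n} A_k\right) = \prod_{k=1}^{n} P\left(A_k \,\middle|\, \bigcap_{j=1}^{k-1} A_j\right),

where the empty intersection \bigcap_{j=1}^{0} A_j is read as \Omega, so that the first factor is P(A_1).

Proof sketch (induction on n): For n = 1 the statement is trivial. For the induction step, set B := A_1 \cap \ldots \cap A_{n-1}. If P(B) > 0, the definition above gives P(A_n \mid B) P(B) = P(A_n \cap B), and applying the induction hypothesis to P(B) yields the product formula. If P(B) = 0, both sides vanish: the left-hand side because A_1 \cap \ldots \cap A_n \subseteq B, and the right-hand side because its last factor P(A_n \mid B) is zero by the convention above.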

Chain rule for discrete random variables

Two random variables

For two discrete random variables X, Y, we use the events A := \{X = x\} and B := \{Y = y\} in the definition above and find the joint distribution as

P(X = x, Y = y) = P(X = x \mid Y = y) P(Y = y),

or

P_{(X,Y)}(x, y) = P_{X \mid Y}(x \mid y) P_Y(y),

where P_X(x) := P(X = x) is the probability distribution of X and P_{X \mid Y}(x \mid y) the conditional probability distribution of X given Y.
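
As a concrete illustration, the identity can be verified on a small joint table. In the following sketch the joint probabilities are made-up values that sum to 1, used only to exercise the formula:

    # Hypothetical joint pmf of (X, Y) on {0, 1} × {0, 1}
    joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

    def p_y(y):
        # Marginal P(Y = y)
        return sum(p for (_, y2), p in joint.items() if y2 == y)

    def p_x_given_y(x, y):
        # Conditional P(X = x | Y = y); 0 by convention if P(Y = y) = 0
        return joint[(x, y)] / p_y(y) if p_y(y) > 0 else 0.0

    # Chain rule: P(X = x, Y = y) = P(X = x | Y = y) P(Y = y)
    for (x, y), p in joint.items():
        assert abs(p - p_x_given_y(x, y) * p_y(y)) < 1e-12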

Finitely many random variables

Let X_1, \ldots, X_n be random variables and x_1, \ldots, x_n \in \mathbb{R}. By the definition of the conditional probability,

P\left(X_n = x_n, \ldots, X_1 = x_1\right) = P\left(X_n = x_n \mid X_{n-1} = x_{n-1}, \ldots, X_1 = x_1\right) P\left(X_{n-1} = x_{n-1}, \ldots, X_1 = x_1\right),

and using the chain rule, where we set A_k := \{X_k = x_k\}, we can find the joint distribution as

\begin{align}
P\left(X_1 = x_1, \ldots, X_n = x_n\right) &= P\left(X_1 = x_1 \mid X_2 = x_2, \ldots, X_n = x_n\right) P\left(X_2 = x_2, \ldots, X_n = x_n\right) \\
&= P(X_1 = x_1) P(X_2 = x_2 \mid X_1 = x_1) P(X_3 = x_3 \mid X_1 = x_1, X_2 = x_2) \cdots \\
&\qquad P(X_n = x_n \mid X_1 = x_1, \ldots, X_{n-1} = x_{n-1}).
\end{align}

Example

For n = 3, i.e. three random variables, the chain rule reads

\begin{align}
P_{(X_1, X_2, X_3)}(x_1, x_2, x_3) &= P(X_1 = x_1, X_2 = x_2, X_3 = x_3) \\
&= P(X_3 = x_3 \mid X_2 = x_2, X_1 = x_1) P(X_2 = x_2, X_1 = x_1) \\
&= P(X_3 = x_3 \mid X_2 = x_2, X_1 = x_1) P(X_2 = x_2 \mid X_1 = x_1) P(X_1 = x_1) \\
&= P_{X_3 \mid X_2, X_1}(x_3 \mid x_2, x_1) \, P_{X_2 \mid X_1}(x_2 \mid x_1) \, P_{X_1}(x_1).
\end{align}
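
The factorization also gives a recipe for building a joint distribution from conditional tables. The sketch below uses hypothetical conditional distributions for three binary variables, chosen only to illustrate that multiplying the factors P(X1 = x1), P(X2 = x2 | X1 = x1), and P(X3 = x3 | X2 = x2, X1 = x1) yields a valid joint distribution:

    from itertools import product

    P1 = {0: 0.6, 1: 0.4}                  # P(X1 = x1)
    P2 = {(0, 0): 0.7, (0, 1): 0.3,        # P(X2 = x2 | X1 = 0), keys (x1, x2)
          (1, 0): 0.2, (1, 1): 0.8}        # P(X2 = x2 | X1 = 1)
    P3 = {(0, 0, 0): 0.9, (0, 0, 1): 0.1,  # P(X3 = x3 | X1, X2), keys (x1, x2, x3)
          (0, 1, 0): 0.5, (0, 1, 1): 0.5,
          (1, 0, 0): 0.4, (1, 0, 1): 0.6,
          (1, 1, 0): 0.2, (1, 1, 1): 0.8}

    joint = {(x1, x2, x3): P1[x1] * P2[(x1, x2)] * P3[(x1, x2, x3)]
             for x1, x2, x3 in product((0, 1), repeat=3)}

    # Each factor is a proper distribution, so the joint sums to 1.
    assert abs(sum(joint.values()) - 1.0) < 1e-12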

Notes and References

  1. Schilling, René L. (2021). Measure, Integral, Probability & Processes – Probab(ilistical)ly the Theoretical Minimum. Technische Universität Dresden, Germany. ISBN 979-8-5991-0488-9. pp. 136ff.
  2. Schum, David A. (1994). The Evidential Foundations of Probabilistic Reasoning. Northwestern University Press. ISBN 978-0-8101-1821-8. p. 49.
  3. Klugh, Henry E. (2013). Statistics: The Essentials for Research (3rd ed.). Psychology Press. ISBN 978-1-134-92862-0. p. 149.
  4. Virtue, Pat. 10-606: Mathematical Foundations for Machine Learning (course web site).