Blackwell-Girshick equation explained

The Blackwell-Girshick equation is an equation in probability theory that allows for the calculation of the variance of random sums of random variables.[1] It is the analogue, for the variance, of Wald's equation for the expectation of composite distributions.

It is named after David Blackwell and Meyer Abraham Girshick.

Statement

Let N be a random variable with values in \mathbb{Z}_{\ge 0}, let X_1, X_2, X_3, \ldots be independent and identically distributed random variables, which are also independent of N, and assume that the second moment exists for all X_i and N. Then the random variable defined by

Y := \sum_{i=1}^{N} X_i

has the variance

\operatorname{Var}(Y) = \operatorname{Var}(N)\operatorname{E}(X_1)^2 + \operatorname{E}(N)\operatorname{Var}(X_1).
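
The identity is easy to check by simulation. The following is a minimal Monte Carlo sketch in Python, assuming NumPy; the particular choices N Poisson with mean 4 and X_i exponential with mean 2 are illustrative assumptions, not part of the statement.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed illustrative distributions: N ~ Poisson(4), X_i ~ Exponential with mean 2.
    lam, x_mean = 4.0, 2.0
    trials = 200_000

    n = rng.poisson(lam, size=trials)                                  # one draw of N per trial
    y = np.array([rng.exponential(x_mean, size=k).sum() for k in n])   # Y = X_1 + ... + X_N

    # Blackwell-Girshick prediction: Var(Y) = Var(N) E(X_1)^2 + E(N) Var(X_1).
    var_n = e_n = lam                   # Poisson: mean and variance both equal lam
    e_x, var_x = x_mean, x_mean ** 2    # exponential: variance is the square of the mean
    predicted = var_n * e_x ** 2 + e_n * var_x

    print("empirical Var(Y):", y.var())
    print("predicted Var(Y):", predicted)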

The Blackwell-Girshick equation can be derived using conditional variance and the variance decomposition. If the X_i are natural number-valued random variables, the derivation can be done in an elementary way using the chain rule and the probability-generating function.
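
As a short sketch of the conditional-variance approach: since \operatorname{E}(Y \mid N) = N\operatorname{E}(X_1) and \operatorname{Var}(Y \mid N) = N\operatorname{Var}(X_1), the law of total variance gives

\begin{align}
\operatorname{Var}(Y) &= \operatorname{E}(\operatorname{Var}(Y \mid N)) + \operatorname{Var}(\operatorname{E}(Y \mid N)) \\
&= \operatorname{E}(N\operatorname{Var}(X_1)) + \operatorname{Var}(N\operatorname{E}(X_1)) \\
&= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{Var}(N)\operatorname{E}(X_1)^2.
\end{align}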

Proof

For each n \ge 0, let \chi_n be the random variable which is 1 if N equals n and 0 otherwise, and let Y_n := X_1 + \cdots + X_n. Then

\begin{align}
\operatorname{E}(Y^2) &= \sum_{n=0}^{\infty} \operatorname{E}(\chi_n Y_n^2) \\
&= \sum_{n=0}^{\infty} \operatorname{P}(N=n)\operatorname{E}(Y_n^2) \\
&= \sum_{n=0}^{\infty} \operatorname{P}(N=n)\left(\operatorname{Var}(Y_n) + \operatorname{E}(Y_n)^2\right) \\
&= \sum_{n=0}^{\infty} \operatorname{P}(N=n)\left(n\operatorname{Var}(X_1) + n^2\operatorname{E}(X_1)^2\right) \\
&= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(N^2)\operatorname{E}(X_1)^2.
\end{align}

By Wald's equation, under the given hypotheses, \operatorname{E}(Y) = \operatorname{E}(N)\operatorname{E}(X_1). Therefore,

\begin{align}
\operatorname{Var}(Y) &= \operatorname{E}(Y^2) - \operatorname{E}(Y)^2 \\
&= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(N^2)\operatorname{E}(X_1)^2 - \operatorname{E}(N)^2\operatorname{E}(X_1)^2 \\
&= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{Var}(N)\operatorname{E}(X_1)^2,
\end{align}

as desired.

Example

Let N have a Poisson distribution with expectation \lambda, and let X_1, X_2, \ldots follow a Bernoulli distribution with parameter p. In this case, Y is also Poisson distributed, with expectation \lambda p, so its variance must be \lambda p. We can check this with the Blackwell-Girshick equation: N has variance \lambda, while each X_i has mean p and variance p(1-p), so we must have

\operatorname{Var}(Y) = \lambda p^2 + \lambda p(1-p) = \lambda p.
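
This example can also be checked by simulation. Here is a small Python sketch, again assuming NumPy; the values \lambda = 3 and p = 0.4 are chosen only for illustration, and Y is drawn as a Binomial(N, p) variable, which has the same distribution as the sum of N Bernoulli(p) variables.

    import numpy as np

    rng = np.random.default_rng(1)

    # Assumed illustrative parameters (not from the article): lambda = 3, p = 0.4.
    lam, p = 3.0, 0.4
    trials = 200_000

    n = rng.poisson(lam, size=trials)   # N ~ Poisson(lambda)
    y = rng.binomial(n, p)              # sum of N Bernoulli(p) variables, drawn as Binomial(N, p)

    print("empirical Var(Y):", y.var())   # should be close to lambda * p
    print("predicted Var(Y):", lam * p)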

Application and related concepts

The Blackwell-Girshick equation is used in actuarial mathematics to calculate the variance of composite distributions, such as the compound Poisson distribution. Wald's equation provides similar statements about the expectation of composite distributions.
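
For instance, for a compound Poisson distribution, N is Poisson distributed with some parameter \lambda, so that \operatorname{E}(N) = \operatorname{Var}(N) = \lambda, and the equation reduces to

\operatorname{Var}(Y) = \lambda\operatorname{Var}(X_1) + \lambda\operatorname{E}(X_1)^2 = \lambda\operatorname{E}(X_1^2).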

Notes and References

  1. Blackwell, D. A.; Girshick, M. A. (1979). Theory of Games and Statistical Decisions. Courier Corporation.