In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables.[1] [2] [3] Unlike the classical CLT, which requires that the random variables in question have finite variance and be both independent and identically distributed, Lindeberg's CLT only requires that they have finite variance, satisfy Lindeberg's condition, and be independent. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.[4]
Let $(\Omega, \mathcal{F}, P)$ be a probability space, and $X_k : \Omega \to \mathbb{R},\; k \in \mathbb{N}$, be independent random variables defined on that space. Assume the expected values $\mathrm{E}[X_k] = \mu_k$ and variances $\operatorname{Var}[X_k] = \sigma_k^2$ exist and are finite. Also let

$$s_n^2 := \sum_{k=1}^{n} \sigma_k^2.$$
If this sequence of independent random variables $X_k$ satisfies Lindeberg's condition:

$$\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \mathrm{E}\left[ (X_k - \mu_k)^2 \cdot \mathbf{1}_{\{ |X_k - \mu_k| > \varepsilon s_n \}} \right] = 0$$

for all $\varepsilon > 0$, where $\mathbf{1}_{\{\cdots\}}$ is the indicator function, then the central limit theorem holds: the standardized sums

$$Z_n := \frac{\sum_{k=1}^{n} (X_k - \mu_k)}{s_n}$$

converge in distribution to a standard normal random variable as $n \to \infty$.
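As a numerical sanity check (an illustration, not part of the original article), the following sketch estimates the Lindeberg sum and the distribution of $Z_n$ for i.i.d. Uniform$(-1, 1)$ variables; the sample sizes and the choice of distribution are assumptions made for this sketch.

```python
# Monte Carlo check of Lindeberg's condition and the CLT for
# i.i.d. Uniform(-1, 1) variables (mu_k = 0, sigma_k^2 = 1/3).
import numpy as np

rng = np.random.default_rng(0)
n = 2000
# 5000 independent realizations of the vector (X_1, ..., X_n)
X = rng.uniform(-1.0, 1.0, size=(5000, n))
sigma2 = 1.0 / 3.0               # Var of Uniform(-1, 1)
s_n = np.sqrt(n * sigma2)

eps = 0.05
# Lindeberg sum: (1/s_n^2) * sum_k E[(X_k - mu_k)^2 * 1{|X_k - mu_k| > eps*s_n}].
# Here eps*s_n ~ 1.29 exceeds the bound |X_k| <= 1, so the sum is exactly 0.
indicator = np.abs(X) > eps * s_n
lindeberg_sum = (X**2 * indicator).mean(axis=0).sum() / (n * sigma2)
print(lindeberg_sum)             # 0.0

# Z_n should be approximately standard normal.
Z = X.sum(axis=1) / s_n
print(Z.mean(), Z.std())
```

For bounded variables the indicator eventually vanishes for every fixed $\varepsilon$, which is why the Lindeberg sum here is exactly zero once $\varepsilon s_n$ exceeds the bound.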
Lindeberg's condition is sufficient, but not in general necessary (i.e. the converse implication does not hold in general). However, if the sequence of independent random variables in question satisfies

$$\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0, \quad \text{as } n \to \infty,$$

then Lindeberg's condition is both sufficient and necessary, i.e. it holds if and only if the result of the central limit theorem holds.
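To see why this extra condition matters, here is a small numerical illustration (the variance sequence is an assumption made for this sketch, not from the article): with variances $\sigma_k^2 = 2^k$, the largest term always carries roughly half of $s_n^2$, so the maximum ratio stays bounded away from $0$ and the equivalence above does not apply.

```python
# With sigma_k^2 = 2^k, the last variance dominates the sum:
# max_k sigma_k^2 / s_n^2 -> 1/2, not 0.
n = 50
sigma2 = [2.0 ** k for k in range(1, n + 1)]
s_n2 = sum(sigma2)
ratio = max(sigma2) / s_n2
print(ratio)   # stays near 1/2 for every n, bounded away from 0
```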
Feller's theorem can be used as an alternative method to prove that Lindeberg's condition holds.[5] Letting

$$S_n := \sum_{k=1}^{n} X_k$$

and, for simplicity, $\mathrm{E}[X_k] = 0$, the theorem states that if, for every $\varepsilon > 0$,

$$\lim_{n \to \infty} \max_{1 \le k \le n} P(|X_k| > \varepsilon s_n) = 0$$

and $\frac{S_n}{s_n}$ converges weakly to a standard normal distribution as $n \to \infty$, then $X_k$ satisfies the Lindeberg condition.
This theorem can be used to disprove that the central limit theorem holds for $X_k$ by way of proof by contradiction: this procedure involves proving that Lindeberg's condition fails for $X_k$.
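A hypothetical instance of this procedure (the two-point distribution is an illustrative choice, not from the article): take $X_k = \pm 2^k$ with probability $1/2$ each. For these variables the Lindeberg sum can be computed exactly, and for $\varepsilon = 1/2$ it stays near $3/4$ rather than tending to $0$; since the Feller negligibility condition also fails here, $S_n / s_n$ cannot converge to a standard normal.

```python
# Exact Lindeberg sum for X_k = +/- 2^k with probability 1/2 each.
# E[(X_k)^2 * 1{|X_k| > eps*s_n}] equals 4^k when 2^k > eps*s_n, else 0.
import math

def lindeberg_sum(n, eps):
    s_n2 = sum(4.0 ** k for k in range(1, n + 1))
    s_n = math.sqrt(s_n2)
    total = sum(4.0 ** k for k in range(1, n + 1) if 2.0 ** k > eps * s_n)
    return total / s_n2

print(lindeberg_sum(30, 0.5))   # stays near 3/4, does not tend to 0
```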
Because the Lindeberg condition implies

$$\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0$$

as $n \to \infty$, it guarantees that the contribution of any individual random variable $X_k$ ($1 \le k \le n$) to the variance $s_n^2$ is arbitrarily small for sufficiently large values of $n$.
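This implication can be sketched in one step by splitting each second moment at the threshold $\varepsilon s_n$ (a standard argument, spelled out here for convenience):

```latex
% For every k <= n and every eps > 0:
\sigma_k^2
  = \mathrm{E}\!\left[(X_k-\mu_k)^2\,\mathbf{1}_{\{|X_k-\mu_k|\le\varepsilon s_n\}}\right]
  + \mathrm{E}\!\left[(X_k-\mu_k)^2\,\mathbf{1}_{\{|X_k-\mu_k|>\varepsilon s_n\}}\right]
  \le \varepsilon^2 s_n^2
  + \sum_{j=1}^{n}\mathrm{E}\!\left[(X_j-\mu_j)^2\,\mathbf{1}_{\{|X_j-\mu_j|>\varepsilon s_n\}}\right].
% Dividing by s_n^2 and taking the maximum over k, the second term is the
% Lindeberg sum, which tends to 0; hence
% \limsup_n \max_k \sigma_k^2 / s_n^2 \le \varepsilon^2 for every eps > 0.
```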
Consider the following informative example which satisfies the Lindeberg condition. Let $\xi_i$ be a sequence of zero-mean, variance-$1$ i.i.d. random variables and $a_i$ a sequence of non-random real numbers with

$$\max_{i \le n} \frac{a_i^2}{\sum_{j \le n} a_j^2} \to 0.$$

Now, define the normalized elements of the linear combination:

$$X_{n,i} := \frac{a_i \xi_i}{\sqrt{\sum_{j \le n} a_j^2}},$$

which satisfies the Lindeberg condition: for every $\varepsilon > 0$,

$$\sum_{i \le n} \mathrm{E}\left[ |X_{n,i}|^2 \, \mathbf{1}_{\{|X_{n,i}| > \varepsilon\}} \right] \le \mathrm{E}\left[ |\xi_1|^2 \, \mathbf{1}_{\{|\xi_1| > \varepsilon \min_{i \le n} \sqrt{\sum_{j \le n} a_j^2}/|a_i|\}} \right],$$

but $\xi_1^2$ is integrable, hence by the dominated convergence theorem and the condition on the $a_i$ above, the right-hand side goes to $0$ for every $\varepsilon$.
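A numerical sketch of this example (the weights $a_i = \sqrt{i}$ and the Rademacher $\xi_i$ are assumptions made for illustration): the maximum weight ratio is small, and the normalized linear combination is close to standard normal.

```python
# Weighted-sum example: a_i = sqrt(i), xi_i = +/-1 with probability 1/2.
import numpy as np

rng = np.random.default_rng(1)
n = 1000
a = np.sqrt(np.arange(1, n + 1))
norm2 = np.sum(a**2)
# max_i a_i^2 / sum_j a_j^2 = 2/(n+1), which tends to 0 as n grows
print(np.max(a**2) / norm2)

xi = rng.choice([-1.0, 1.0], size=(5000, n))    # zero mean, variance 1
T = (a * xi).sum(axis=1) / np.sqrt(norm2)       # sum_i X_{n,i}
print(T.mean(), T.std())                        # approximately N(0, 1)
```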