Lindeberg's condition explained

In probability theory, Lindeberg's condition is a sufficient condition (and under certain conditions also a necessary condition) for the central limit theorem (CLT) to hold for a sequence of independent random variables.[1] [2] [3] Unlike the classical CLT, which requires that the random variables in question have finite variance and be both independent and identically distributed, Lindeberg's CLT only requires that they have finite variance, satisfy Lindeberg's condition, and be independent. It is named after the Finnish mathematician Jarl Waldemar Lindeberg.[4]

Statement

Let (\Omega, \mathcal{F}, P) be a probability space, and X_k : \Omega \to \mathbb{R}, \; k \in \mathbb{N}, be independent random variables defined on that space. Assume the expected values

\operatorname{E}[X_k] = \mu_k

and variances

\operatorname{Var}[X_k] = \sigma_k^2

exist and are finite. Also let

s_n^2 := \sum_{k=1}^{n} \sigma_k^2 .

If this sequence of independent random variables X_k satisfies Lindeberg's condition:

\lim_{n \to \infty} \frac{1}{s_n^2} \sum_{k=1}^{n} \operatorname{E}\left[ (X_k - \mu_k)^2 \, \mathbf{1}_{\{ |X_k - \mu_k| > \varepsilon s_n \}} \right] = 0

for all \varepsilon > 0, where \mathbf{1}_{\{\cdot\}} is the indicator function, then the central limit theorem holds, i.e. the random variables

Z_n := \frac{\sum_{k=1}^{n} (X_k - \mu_k)}{s_n}

converge in distribution to a standard normal random variable as n \to \infty.
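As a numerical illustration (not from the source), the condition can be checked for a hypothetical sequence of independent but non-identically distributed variables, here X_k uniform on (-k^{1/4}, k^{1/4}):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sequence: X_k ~ Uniform(-k**0.25, k**0.25), independent but
# not identically distributed; mu_k = 0 and sigma_k^2 = k**0.5 / 3.
n = 2000
k = np.arange(1, n + 1)
half_width = k ** 0.25
sigma2 = half_width ** 2 / 3.0        # Var of Uniform(-a, a) is a^2 / 3
s_n = np.sqrt(sigma2.sum())

# Lindeberg sum for eps = 0.1: each |X_k - mu_k| is bounded by n**0.25,
# which is far below eps * s_n here, so every indicator is 0 and the
# Lindeberg sum vanishes exactly for this n.
eps = 0.1
assert half_width.max() < eps * s_n

# Monte Carlo check that Z_n = sum_k X_k / s_n is close to N(0, 1).
trials = 2000
X = rng.uniform(-half_width, half_width, size=(trials, n))
Z = X.sum(axis=1) / s_n
print(Z.mean(), Z.std())              # near 0 and near 1
```

Since the variables are uniformly bounded while s_n grows without bound, the truncation in the Lindeberg sum eventually never triggers, which is the typical way the condition is verified for bounded sequences.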

Lindeberg's condition is sufficient, but not in general necessary (i.e. the converse implication does not hold in general). However, if the sequence of independent random variables in question satisfies

\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0 \quad \text{as } n \to \infty,

then Lindeberg's condition is both sufficient and necessary, i.e. it holds if and only if the conclusion of the central limit theorem holds.

Remarks

Feller's theorem

Feller's theorem can be used as an alternative method to prove that Lindeberg's condition holds.[5] Letting

S_n := \sum_{k=1}^{n} X_k

and, for simplicity, \operatorname{E}[X_k] = 0, the theorem states that if, for all \varepsilon > 0,

\lim_{n \to \infty} \max_{1 \le k \le n} P(|X_k| > \varepsilon s_n) = 0

and S_n / s_n converges weakly to a standard normal distribution as n \to \infty, then X_k satisfies Lindeberg's condition.

This theorem can be used to disprove that the central limit theorem holds for X_k by proof by contradiction: one shows that Lindeberg's condition fails for X_k, so that, by the theorem, S_n / s_n cannot converge weakly to a standard normal distribution.

Interpretation

Because the Lindeberg condition implies

\max_{k=1,\ldots,n} \frac{\sigma_k^2}{s_n^2} \to 0 \quad \text{as } n \to \infty,

it guarantees that the contribution of any individual random variable X_k (1 \le k \le n) to the variance s_n^2 is arbitrarily small, for sufficiently large values of n.
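This negligibility can be seen numerically; a small sketch (with an assumed variance sequence \sigma_k^2 = \sqrt{k}, not from the source) shows the largest single contribution shrinking:

```python
import numpy as np

# Assumed variance sequence sigma_k^2 = sqrt(k); then s_n^2 ~ (2/3) n^(3/2).
# The variances are increasing, so max_{k<=n} sigma_k^2 = sigma_n^2 and the
# largest single share sigma_n^2 / s_n^2 behaves like 3 / (2n).
sigma2 = np.sqrt(np.arange(1, 10001))
s2 = np.cumsum(sigma2)                 # s_n^2 for n = 1, 2, ...
shares = [sigma2[n - 1] / s2[n - 1] for n in (10, 100, 1000, 10000)]
print(shares)                          # decreasing toward 0
```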

Example

Consider the following informative example which satisfies the Lindeberg condition. Let \xi_i be a sequence of zero-mean, variance-1 iid random variables and a_i a non-random sequence satisfying:

\max_i^n \frac{|a_i|}{\|a\|_2} \rightarrow 0

Now, define the normalized elements of the linear combination:

X_i = \frac{a_i \xi_i}{\|a\|_2}

which satisfies the Lindeberg condition:

\sum_i^n \mathbb E \left [\left | X_i\right |^2 1(|X_i| > \varepsilon)\right ] \leq \sum_i^n \mathbb E \left [\left | X_i\right |^2 1 \left(|\xi_i| > \varepsilon \frac{\|a\|_2}{\max_i^n |a_i|} \right)\right ] = \mathbb E \left [\left | \xi_1\right |^2 1 \left(|\xi_1| > \varepsilon \frac{\|a\|_2}{\max_i^n |a_i|} \right)\right ]

but \mathbb E[\xi_1^2] = 1 is finite, so by the dominated convergence theorem and the condition on the a_i we have that this goes to 0 for every \varepsilon > 0.
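A quick Monte Carlo sketch (an illustration, not from the source) of the bound above, assuming standard normal \xi_i and equal weights a_i = 1, so that \|a\|_2 / \max_i |a_i| = \sqrt{n}:

```python
import numpy as np

rng = np.random.default_rng(1)

# Equal weights a_i = 1 give ||a||_2 / max_i |a_i| = sqrt(n), so the bound
# is E[xi^2 * 1(|xi| > eps * sqrt(n))]; dominated convergence sends it to 0.
xi = rng.standard_normal(200_000)      # Monte Carlo sample of one xi
eps = 0.5
bounds = [np.mean(xi**2 * (np.abs(xi) > eps * np.sqrt(n)))
          for n in (1, 10, 100, 1000)]
print(bounds)                          # decreasing toward 0
```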

Notes and References

  1. Billingsley, P. (1986). Probability and Measure (2nd ed.). Wiley. p. 369. ISBN 0-471-80478-9.
  2. Ash, R. B. (2000). Probability and Measure Theory (2nd ed.). p. 307. ISBN 0-12-065202-1.
  3. Resnick, S. I. (1999). A Probability Path. p. 314.
  4. Lindeberg, J. W. (1922). "Eine neue Herleitung des Exponentialgesetzes in der Wahrscheinlichkeitsrechnung". 15 (1): 211–225. doi:10.1007/BF01494395.
  5. Athreya, K. B.; Lahiri, S. N. (2006). Measure Theory and Probability Theory. Springer. p. 348. ISBN 0-387-32903-X.