Cramér's theorem (large deviations)

Cramér's theorem is a fundamental result in the theory of large deviations, a subdiscipline of probability theory. It determines the rate function of a sequence of iid random variables. A weak version of this result was first shown by Harald Cramér in 1938.

Statement

The logarithmic moment generating function (which is the cumulant-generating function) of a random variable is defined as:

\Lambda(t) = \log \operatorname{E}[\exp(tX_1)].
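As a concrete illustration (the Gaussian case is an added example, not part of the statement): for a standard normal variable $X_1 \sim N(0,1)$ one has $\operatorname{E}[\exp(tX_1)] = e^{t^2/2}$, hence

\Lambda(t) = \frac{t^2}{2},

which is finite for every $t \in \mathbb{R}$, so such a variable satisfies the hypothesis of the theorem below.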

Let $X_1, X_2, \dots$ be a sequence of iid real random variables with finite logarithmic moment generating function, i.e. $\Lambda(t) < \infty$ for all $t \in \mathbb{R}$.

Then the Legendre transform of $\Lambda$,

\Lambda^*(x) := \sup_{t \in \mathbb{R}} \left( tx - \Lambda(t) \right),

satisfies

\lim_{n \to \infty} \frac{1}{n} \log\left( P\left( \sum_{i=1}^{n} X_i \geq nx \right) \right) = -\Lambda^*(x)

for all $x > \operatorname{E}[X_1]$.
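Continuing the Gaussian illustration: with $\Lambda(t) = t^2/2$, the supremum of $tx - t^2/2$ over $t$ is attained at $t = x$, so

\Lambda^*(x) = x^2 - \frac{x^2}{2} = \frac{x^2}{2},

and the theorem gives $P\left(\sum_{i=1}^{n} X_i \geq nx\right) = e^{-nx^2/2 + o(n)}$ for every $x > 0 = \operatorname{E}[X_1]$.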

In the terminology of the theory of large deviations, the result can be reformulated as follows:

If $X_1, X_2, \dots$ is a sequence of iid random variables, then the distributions

\left( \mathcal{L}\left( \tfrac{1}{n} \sum_{i=1}^{n} X_i \right) \right)_{n \in \mathbb{N}}

satisfy a large deviation principle with rate function $\Lambda^*$.
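To make the decay rate concrete, here is a minimal numerical sketch (an added illustration, assuming NumPy and SciPy are available; the Bernoulli(1/2) example and the helper name rate_function are choices made here, not part of the article). For fair coin flips, $\Lambda(t) = \log\frac{1 + e^t}{2}$, and the Legendre transform works out to $\Lambda^*(x) = \log 2 + x \log x + (1-x) \log(1-x)$; the script compares $-\tfrac{1}{n} \log P\left(\sum_{i=1}^{n} X_i \geq nx\right)$, computed from the exact binomial tail, against $\Lambda^*(x)$ as $n$ grows.

import numpy as np
from scipy.stats import binom

def rate_function(x):
    # Lambda*(x) for X_i ~ Bernoulli(1/2): Lambda(t) = log((1 + e^t)/2),
    # and the supremum of t*x - Lambda(t) is attained at t = log(x/(1-x)).
    return np.log(2) + x * np.log(x) + (1 - x) * np.log(1 - x)

x = 0.7  # threshold above the mean E[X_1] = 1/2
for n in (10, 100, 1000, 10000):
    k = int(np.ceil(n * x))          # smallest integer >= n*x
    tail = binom.sf(k - 1, n, 0.5)   # exact P(sum of X_i >= n*x)
    print(f"n={n:5d}  -log(P)/n = {-np.log(tail) / n:.4f}  "
          f"Lambda*(x) = {rate_function(x):.4f}")

The observed exponent approaches $\Lambda^*(0.7) \approx 0.0823$ only slowly, reflecting the subexponential correction hidden in the $o(n)$ term of the exponent.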
