In mathematics, uniform integrability is an important concept in real analysis, functional analysis and measure theory, and plays a vital role in the theory of martingales.
Uniform integrability is an extension of the notion of a family of functions being dominated in $L^1$, which is central to the dominated convergence theorem.
Definition A: Let $(X,\mathfrak{M},\mu)$ be a positive measure space. A set $\Phi\subset L^1(\mu)$ is called uniformly integrable if $\sup_{f\in\Phi}\|f\|_{L^1(\mu)}<\infty$, and to each $\varepsilon>0$ there corresponds a $\delta>0$ such that
$$\int_E|f|\,d\mu<\varepsilon$$
whenever $f\in\Phi$ and $\mu(E)<\delta$.
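For illustration (this example is not part of the cited definition), a single function $f\in L^1(\mu)$, viewed as the one-element family $\Phi=\{f\}$, is uniformly integrable in the sense of Definition A; the standard truncation argument is sketched here. For $K>0$,
$$\int_E|f|\,d\mu\;\leq\;\int_X\bigl(|f|-K\bigr)^+\,d\mu\;+\;K\,\mu(E),$$
and the first term can be made smaller than $\varepsilon/2$ by taking $K$ large (dominated convergence, since $(|f|-K)^+\leq|f|$ and $(|f|-K)^+\to0$ pointwise), after which $\delta=\varepsilon/(2K)$ works. The same argument shows that every finite subset of $L^1(\mu)$ is uniformly integrable.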
Definition A is rather restrictive for infinite measure spaces. A more general definition[3] of uniform integrability that works well in general measure spaces was introduced by G. A. Hunt.
Definition H: Let $(X,\mathfrak{M},\mu)$ be a positive measure space. A set $\Phi\subset L^1(\mu)$ is called uniformly integrable if and only if
$$\inf_{g\in L^1_+(\mu)}\,\sup_{f\in\Phi}\int_{\{|f|>g\}}|f|\,d\mu=0,$$
where $L^1_+(\mu)=\{g\in L^1(\mu):g\geq0\}$.
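For illustration (an observation that follows directly from the formula above, not an additional result from the cited source), Definition H subsumes the dominated case mentioned earlier: if $g\in L^1_+(\mu)$ and $|f|\leq g$ for every $f\in\Phi$, then $\{|f|>g\}=\varnothing$ for each $f$, so
$$\sup_{f\in\Phi}\int_{\{|f|>g\}}|f|\,d\mu=0,$$
and the infimum in Definition H is attained at this $g$. Hence any family dominated by a single integrable function is uniformly integrable.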
Since Hunt's definition is equivalent to Definition A when the underlying measure space is finite (see Theorem 2 below), Definition H is widely adopted in mathematics.
The following result[4] provides another equivalent notion to Hunt's. This equivalence is sometimes given as the definition of uniform integrability.
Theorem 1: If $(X,\mathfrak{M},\mu)$ is a (positive) finite measure space, then a set $\Phi\subset L^1(\mu)$ is uniformly integrable if and only if
$$\inf_{g\in L^1_+(\mu)}\,\sup_{f\in\Phi}\int\bigl(|f|-g\bigr)^+\,d\mu=0.$$
If in addition $\mu(X)<\infty$, then uniform integrability is equivalent to either of the following conditions (a sketch of their interchangeability follows the list):
1. $\inf_{a>0}\,\sup_{f\in\Phi}\int\bigl(|f|-a\bigr)^+\,d\mu=0$.
2. $\inf_{a>0}\,\sup_{f\in\Phi}\int_{\{|f|>a\}}|f|\,d\mu=0$.
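A sketch of why conditions 1 and 2 are interchangeable (an elementary pointwise comparison, included here for illustration): for every $a>0$,
$$\bigl(|f|-a\bigr)^+\;\leq\;|f|\,I_{\{|f|>a\}}\;\leq\;2\bigl(|f|-\tfrac{a}{2}\bigr)^+,$$
so, after integrating and taking the supremum over $f\in\Phi$, the infimum over $a>0$ of the quantity in condition 1 vanishes exactly when the infimum of the quantity in condition 2 does.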
When the underlying space $(X,\mathfrak{M},\mu)$ is $\sigma$-finite, Hunt's definition is equivalent to the following:
Theorem 2: Let $(X,\mathfrak{M},\mu)$ be a $\sigma$-finite measure space, and let $h\in L^1(\mu)$ be such that $h>0$ almost everywhere. A set $\Phi\subset L^1(\mu)$ is uniformly integrable if and only if $\sup_{f\in\Phi}\|f\|_{L^1(\mu)}<\infty$, and for any $\varepsilon>0$ there exists $\delta>0$ such that
$$\sup_{f\in\Phi}\int_A|f|\,d\mu<\varepsilon$$
whenever $\int_A h\,d\mu<\delta$.
A consequence of Theorems 1 and 2 is the equivalence of Definitions A and H for finite measures. Indeed, the statement in Definition A is obtained by taking $h\equiv1$ in Theorem 2, since then $\int_A h\,d\mu=\mu(A)$.
In the theory of probability, Definition A or the statement of Theorem 1 is often presented as the definition of uniform integrability, using expectation notation for random variables,[5] [6] [7] that is:
1. A class $\mathcal{C}$ of random variables is called uniformly integrable if there exists a finite $M$ such that $\operatorname{E}(|X|)\leq M$ for every $X\in\mathcal{C}$, and for every $\varepsilon>0$ there exists $\delta>0$ such that $\operatorname{E}(|X|\,I_A)\leq\varepsilon$ for every $X\in\mathcal{C}$ and every measurable $A$ with $P(A)\leq\delta$,
or alternatively
2. A class $\mathcal{C}$ of random variables is called uniformly integrable if for every $\varepsilon>0$ there exists $K\in[0,\infty)$ such that $\operatorname{E}\bigl(|X|\,I_{|X|\geq K}\bigr)\leq\varepsilon$ for all $X\in\mathcal{C}$, where $I_{|X|\geq K}$ is the indicator function
$$I_{|X|\geq K}=\begin{cases}1 & \text{if } |X|\geq K,\\ 0 & \text{if } |X|<K.\end{cases}$$
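To see how the second formulation relates to the first, one can split the expectation at the level $K$; this standard estimate is sketched here for illustration and is not part of the cited definitions. For any event $A$ and any $K>0$,
$$\operatorname{E}(|X|\,I_A)\;\leq\;\operatorname{E}\bigl(|X|\,I_{|X|\geq K}\bigr)+K\,P(A),$$
so once $K$ is chosen so that the first term is at most $\varepsilon$ uniformly over $\mathcal{C}$, taking $\delta=\varepsilon/K$ gives $\operatorname{E}(|X|\,I_A)\leq2\varepsilon$ whenever $P(A)\leq\delta$.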
One consequence of uniform integrability of a class $\mathcal{C}$ of random variables is that the family of laws $\{P\circ|X|^{-1}(\cdot):X\in\mathcal{C}\}$ is tight; that is, for each $\delta>0$ there exists $a>0$ such that $P(|X|>a)\leq\delta$ for all $X\in\mathcal{C}$.
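This tightness already follows from the uniform $L^1$ bound together with Markov's inequality; a one-line sketch, included here for illustration:
$$P(|X|>a)\;\leq\;\frac{\operatorname{E}(|X|)}{a}\;\leq\;\frac{1}{a}\,\sup_{Y\in\mathcal{C}}\operatorname{E}(|Y|),$$
which is at most $\delta$ once $a$ is taken large enough.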
This, however, does not mean that the family of measures $\mathcal{V}_{\mathcal{C}}:=\bigl\{\mu_X:A\mapsto\int_A|X|\,dP,\ X\in\mathcal{C}\bigr\}$ is tight. (In any case, tightness would require a topology on $\Omega$ in order to be defined.)
There is another notion of uniformity, slightly different from uniform integrability, which also has many applications in probability and measure theory and which does not require random variables to have a finite integral.
Definition: Suppose $(\Omega,\mathcal{F},P)$ is a probability space. A class $\mathcal{C}$ of random variables is uniformly absolutely continuous with respect to $P$ if for any $\varepsilon>0$ there is $\delta>0$ such that $E[|X|\,I_A]<\varepsilon$ for every $X\in\mathcal{C}$ whenever $P(A)<\delta$.
It is equivalent to uniform integrability if the measure is finite and has no atoms.
The term "uniform absolute continuity" is not standard, but is used by some authors.[8] [9]
The following results apply to the probabilistic definition.
- Definition 2 can be rewritten in limit form: a class $\mathcal{C}$ is uniformly integrable if and only if $\lim_{K\to\infty}\sup_{X\in\mathcal{C}}\operatorname{E}\bigl(|X|\,I_{|X|\geq K}\bigr)=0$.
- A non-UI sequence: let $\Omega=[0,1]\subset\mathbb{R}$ with Lebesgue measure, and set $X_n=n\,I_{(0,1/n)}$, i.e. $X_n(\omega)=n$ for $\omega\in(0,1/n)$ and $X_n(\omega)=0$ otherwise. Then $X_n\in L^1$ and $\operatorname{E}(|X_n|)=1$ for every $n$, so the first clause of Definition 1 holds. But the second clause fails: for any $\delta>0$ there is an interval $(0,1/n)$ with measure less than $\delta$, while $\operatorname{E}\bigl[|X_m|\,I_{(0,1/n)}\bigr]=1$ for all $m\geq n$. Hence the sequence is not uniformly integrable.
- By Definition 2, a uniformly integrable class is always bounded in $L^1$: splitting $\operatorname{E}(|X|)=\operatorname{E}\bigl(|X|\,I_{|X|\geq K}\bigr)+\operatorname{E}\bigl(|X|\,I_{|X|<K}\bigr)$ and bounding each term gives $\operatorname{E}(|X|)\leq\varepsilon+K$ uniformly over the class.
- If a sequence of random variables $X_n$ is dominated by an integrable, non-negative $Y$, that is, $|X_n(\omega)|\leq Y(\omega)$ for all $\omega$ and $n$ with $\operatorname{E}(Y)<\infty$, then the class $\mathcal{C}=\{X_n\}$ is uniformly integrable.
- A class of random variables bounded in $L^p$ for some $p>1$ is uniformly integrable (a short derivation is sketched after this list).
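A sketch of the last claim, using Definition 2 (a standard estimate, included here for illustration): if $\sup_{Y\in\mathcal{C}}\operatorname{E}(|Y|^p)<\infty$ for some $p>1$, then for every $K>0$ and $X\in\mathcal{C}$,
$$\operatorname{E}\bigl(|X|\,I_{|X|\geq K}\bigr)\;\leq\;\operatorname{E}\!\Bigl(\frac{|X|^p}{K^{p-1}}\,I_{|X|\geq K}\Bigr)\;\leq\;\frac{1}{K^{p-1}}\,\sup_{Y\in\mathcal{C}}\operatorname{E}\bigl(|Y|^p\bigr),$$
which tends to $0$ as $K\to\infty$ because $p>1$.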
In the following we use the probabilistic framework, but, regardless of the finiteness of the measure, the results carry over to the general measure-theoretic setting by adding the boundedness condition on the chosen subset of $L^1(\mu)$.
- Dunford–Pettis theorem: A class of random variables $X_n\subset L^1(\mu)$ is uniformly integrable if and only if it is relatively compact for the weak topology $\sigma(L^1,L^\infty)$.
- de la Vallée-Poussin theorem: The family $\{X_\alpha\}_{\alpha\in\mathrm{A}}\subset L^1(\mu)$ is uniformly integrable if and only if there exists a non-negative, increasing, convex function $G(t)$ such that $\lim_{t\to\infty}G(t)/t=\infty$ and $\sup_\alpha\operatorname{E}\bigl(G(|X_\alpha|)\bigr)<\infty$ (an illustration follows this list).
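As an illustration of the de la Vallée-Poussin criterion (applied here to the non-UI sequence constructed earlier, not a result stated in the cited sources), no admissible $G$ can have uniformly bounded expectations along that sequence: with $X_n=n\,I_{(0,1/n)}$,
$$\operatorname{E}\bigl(G(|X_n|)\bigr)=\frac{1}{n}\,G(n)+\Bigl(1-\frac{1}{n}\Bigr)G(0)\;\geq\;\frac{G(n)}{n}\;\longrightarrow\;\infty,$$
since $G\geq0$ and $G(t)/t\to\infty$; this is consistent with that sequence not being uniformly integrable. Conversely, taking $G(t)=t^p$ with $p>1$ recovers the $L^p$-boundedness criterion mentioned above.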
See main article: Convergence of random variables. A sequence $\{X_n\}$ converges to $X$ in the $L^1$ norm if and only if it converges in measure to $X$ and it is uniformly integrable.
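The non-UI sequence above also illustrates why uniform integrability cannot be dropped from this statement: $X_n=n\,I_{(0,1/n)}$ converges to $0$ in measure, since $P(|X_n|>\varepsilon)\leq1/n\to0$ for every $\varepsilon>0$, yet
$$\operatorname{E}\bigl(|X_n-0|\bigr)=\operatorname{E}(|X_n|)=1\quad\text{for every }n,$$
so $X_n$ does not converge to $0$ in $L^1$.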
Shiryaev, Albert Nikolayevich (1995). Probability (2nd ed.). New York: Springer-Verlag. pp. 187–188. ISBN 978-0-387-94549-1.
Rudin, Walter (1987). Real and Complex Analysis (3rd ed.). Singapore: McGraw–Hill Book Co. p. 133. ISBN 0-07-054234-1.
Benedetto, J. J. (1976). Real Variable and Integration. Stuttgart: B. G. Teubner. p. 89. ISBN 3-519-02209-5.
Burrill, C. W. (1972). Measure, Integration, and Probability. McGraw-Hill. p. 180. ISBN 0-07-009223-0.