In mathematics, the Khintchine inequality, named after Aleksandr Khinchin and spelled in multiple ways in the Latin alphabet, is a theorem from probability, and is also frequently used in analysis. Heuristically, it says that if we pick
N complex numbers x_1, \ldots, x_N \in \mathbb{C}, and add them together each multiplied by an independent random sign \pm 1, then the expected value of the modulus of the sum will not be too far off from \sqrt{|x_1|^2 + \cdots + |x_N|^2}.
Let \{\varepsilon_n\}_{n=1}^N be i.i.d. random variables with \operatorname{P}(\varepsilon_n = \pm 1) = \tfrac{1}{2} for every n = 1, \ldots, N, i.e., a sequence with Rademacher distribution. Let 0 < p < \infty and let x_1, \ldots, x_N \in \mathbb{C}. Then

A_p \left( \sum_{n=1}^N |x_n|^2 \right)^{1/2} \leq \left( \operatorname{E} \left| \sum_{n=1}^N \varepsilon_n x_n \right|^p \right)^{1/p} \leq B_p \left( \sum_{n=1}^N |x_n|^2 \right)^{1/2}

for some constants A_p, B_p > 0 depending only on p. It is a simple matter to see that A_p = 1 when p \ge 2, and B_p = 1 when 0 < p \le 2.
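As an illustrative sketch (not part of the statement above), the following Python snippet estimates the p-th moment of a Rademacher sum by Monte Carlo and compares it with the \ell^2 norm of the coefficients. The helper name rademacher_moment, the particular coefficients, the sample size, and the values of p are arbitrary choices made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def rademacher_moment(x, p, trials=200_000):
    # Monte Carlo estimate of (E|sum_n eps_n x_n|^p)^(1/p) for Rademacher signs eps_n.
    x = np.asarray(x, dtype=complex)
    signs = rng.choice([-1.0, 1.0], size=(trials, x.size))
    sums = np.abs(signs @ x)                     # |sum_n eps_n x_n| for each trial
    return np.mean(sums ** p) ** (1.0 / p)

x = np.array([3.0, 1.0 + 2.0j, -0.5, 0.25j])     # arbitrary coefficients in C
l2 = np.sqrt(np.sum(np.abs(x) ** 2))             # (sum_n |x_n|^2)^(1/2)

for p in (1, 2, 4):
    m = rademacher_moment(x, p)
    print(f"p={p}: moment ~ {m:.4f}, ratio to l2 norm ~ {m / l2:.4f}")
```

For p = 2 the ratio should be essentially 1 (up to Monte Carlo error), consistent with A_2 = B_2 = 1; for other p the ratio stays between A_p and B_p.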
Haagerup found that

\begin{align}
A_p &= \begin{cases}
2^{1/2 - 1/p} & 0 < p \le p_0, \\
2^{1/2} \left( \Gamma\big( (p+1)/2 \big) / \sqrt{\pi} \right)^{1/p} & p_0 < p < 2, \\
1 & 2 \le p < \infty,
\end{cases}
\\
&\text{and} \\
B_p &= \begin{cases}
1 & 0 < p \le 2, \\
2^{1/2} \left( \Gamma\big( (p+1)/2 \big) / \sqrt{\pi} \right)^{1/p} & 2 < p < \infty,
\end{cases}
\end{align}

where p_0 \approx 1.847 and \Gamma is the Gamma function. One may note in particular that B_p matches the moments of the normal distribution.
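As a small numerical sketch of these closed forms (the function names haagerup_A and haagerup_B are chosen here for illustration, and the crossover point is hard-coded from the approximate value p_0 ≈ 1.847 quoted above, so the two branches only match up to that approximation):

```python
import math

P0 = 1.847  # approximate crossover point p_0 quoted above

def haagerup_A(p):
    # Sharp lower constant A_p from Haagerup's formula.
    if p <= P0:
        return 2 ** (0.5 - 1.0 / p)
    if p < 2:
        return math.sqrt(2) * (math.gamma((p + 1) / 2) / math.sqrt(math.pi)) ** (1.0 / p)
    return 1.0

def haagerup_B(p):
    # Sharp upper constant B_p from Haagerup's formula.
    if p <= 2:
        return 1.0
    return math.sqrt(2) * (math.gamma((p + 1) / 2) / math.sqrt(math.pi)) ** (1.0 / p)

for p in (1.0, 1.5, 3.0, 4.0):
    print(f"p={p}: A_p ~ {haagerup_A(p):.4f}, B_p ~ {haagerup_B(p):.4f}")
```

For instance, these formulas give A_1 = 1/\sqrt{2} and B_4 = 3^{1/4}, the latter equal to (\operatorname{E}|Z|^4)^{1/4} for a standard normal Z.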
The uses of this inequality are not limited to applications in probability theory. One example of its use in analysis is the following: if we let
T be a bounded linear operator between two L^p spaces L^p(X,\mu) and L^p(Y,\nu) with 1 < p < \infty and \|T\| < \infty, then one can use the Khintchine inequality to show that

\left\| \left( \sum_{n=1}^N |Tf_n|^2 \right)^{1/2} \right\|_{L^p(Y,\nu)} \leq C_p \left\| \left( \sum_{n=1}^N |f_n|^2 \right)^{1/2} \right\|_{L^p(X,\mu)}

for some constant C_p > 0 depending only on p and \|T\|.
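A sketch of the standard argument behind this estimate (not spelled out above): apply the lower Khintchine bound pointwise on Y, exchange expectation and integral, use the boundedness of T on each fixed sign pattern, and finish with the upper Khintchine bound pointwise on X:

\begin{align}
\left\| \Big( \sum_{n=1}^N |Tf_n|^2 \Big)^{1/2} \right\|_{L^p(Y,\nu)}^p
&\le A_p^{-p} \int_Y \operatorname{E} \Big| \sum_{n=1}^N \varepsilon_n \, (Tf_n)(y) \Big|^p \, d\nu(y)
= A_p^{-p} \, \operatorname{E} \left\| T\Big( \sum_{n=1}^N \varepsilon_n f_n \Big) \right\|_{L^p(Y,\nu)}^p \\
&\le A_p^{-p} \|T\|^p \, \operatorname{E} \left\| \sum_{n=1}^N \varepsilon_n f_n \right\|_{L^p(X,\mu)}^p
\le \left( \frac{B_p}{A_p} \right)^p \|T\|^p \left\| \Big( \sum_{n=1}^N |f_n|^2 \Big)^{1/2} \right\|_{L^p(X,\mu)}^p,
\end{align}

so one may take C_p = (B_p / A_p) \|T\|.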
For the case of Rademacher random variables, Pawel Hitczenko showed[1] that the sharpest version is:
A \left( \sqrt{p} \left( \sum_{n=b+1}^N x_n^2 \right)^{1/2} + \sum_{n=1}^b x_n \right) \leq \left( \operatorname{E} \left| \sum_{n=1}^N \varepsilon_n x_n \right|^p \right)^{1/p} \leq B \left( \sqrt{p} \left( \sum_{n=b+1}^N x_n^2 \right)^{1/2} + \sum_{n=1}^b x_n \right)

where b = \lfloor p \rfloor, and A and B are universal constants independent of p.
Here we assume that the x_i are non-negative and arranged in non-increasing order.
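As a hedged numerical sketch (the coefficients, sample size, values of p, and helper names below are arbitrary choices made for illustration), one can compare the Monte Carlo moment with the quantity \sqrt{p} \left( \sum_{n>b} x_n^2 \right)^{1/2} + \sum_{n \le b} x_n appearing in Hitczenko's bounds; the ratio should stay between two universal constants as p varies.

```python
import math
import numpy as np

rng = np.random.default_rng(1)

def rademacher_moment(x, p, trials=200_000):
    # Monte Carlo estimate of (E|sum_n eps_n x_n|^p)^(1/p).
    x = np.asarray(x, dtype=float)
    signs = rng.choice([-1.0, 1.0], size=(trials, x.size))
    return np.mean(np.abs(signs @ x) ** p) ** (1.0 / p)

def hitczenko_quantity(x, p):
    # sqrt(p) * (sum_{n > b} x_n^2)^(1/2) + sum_{n <= b} x_n, with b = floor(p),
    # for non-negative coefficients sorted in non-increasing order.
    x = np.sort(np.asarray(x, dtype=float))[::-1]
    b = min(int(math.floor(p)), x.size)
    return math.sqrt(p) * math.sqrt(np.sum(x[b:] ** 2)) + float(np.sum(x[:b]))

x = [1.0, 0.8, 0.5, 0.3, 0.2, 0.1]               # arbitrary non-negative, non-increasing
for p in (1.0, 2.5, 4.0, 6.0):
    ratio = rademacher_moment(x, p) / hitczenko_quantity(x, p)
    print(f"p={p}: moment / Hitczenko quantity ~ {ratio:.3f}")
```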