The log sum inequality is used for proving theorems in information theory.
Let $a_1,\ldots,a_n$ and $b_1,\ldots,b_n$ be nonnegative numbers. Denote the sum of all $a_i$ by $a$ and the sum of all $b_i$ by $b$. The log sum inequality states that

$$\sum_{i=1}^n a_i\log\frac{a_i}{b_i} \geq a\log\frac{a}{b},$$
with equality if and only if $\frac{a_i}{b_i}$ is the same for all $i$; in other words, $a_i = c\,b_i$ for all $i$ and some constant $c$.
(Take $a_i\log\frac{a_i}{b_i}$ to be $0$ if $a_i=0$, and $\infty$ if $a_i>0$ and $b_i=0$.)
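As a quick numerical illustration, the following sketch checks both the inequality and its equality condition on sample values (the helper names and data are illustrative, not from any library):

```python
import math

def log_sum_lhs(a, b):
    """Left-hand side: sum of a_i * log(a_i / b_i), using the
    conventions above for a_i = 0 and for b_i = 0."""
    total = 0.0
    for ai, bi in zip(a, b):
        if ai == 0:
            continue              # convention: the term is 0
        if bi == 0:
            return math.inf       # convention: a_i > 0, b_i = 0
        total += ai * math.log(ai / bi)
    return total

def log_sum_rhs(a, b):
    """Right-hand side: a * log(a / b) with a = sum(a_i), b = sum(b_i)."""
    sa, sb = sum(a), sum(b)
    return sa * math.log(sa / sb)

a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]
assert log_sum_lhs(a, b) >= log_sum_rhs(a, b)   # strict here: ratios differ

c = 0.5
a_prop = [c * bi for bi in b]                   # a_i = c * b_i for all i
assert math.isclose(log_sum_lhs(a_prop, b), log_sum_rhs(a_prop, b))
```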
Notice that after setting $f(x) = x\log x$ we have

$$\begin{align}
\sum_{i=1}^n a_i\log\frac{a_i}{b_i} &= \sum_{i=1}^n b_i f\left(\frac{a_i}{b_i}\right) = b\sum_{i=1}^n \frac{b_i}{b} f\left(\frac{a_i}{b_i}\right)\\
&\geq b f\left(\sum_{i=1}^n \frac{b_i}{b}\frac{a_i}{b_i}\right) = b f\left(\frac{1}{b}\sum_{i=1}^n a_i\right) = b f\left(\frac{a}{b}\right)\\
&= a\log\frac{a}{b},
\end{align}$$

where the inequality follows from Jensen's inequality, since $\frac{b_i}{b}\geq 0$, $\sum_{i=1}^n \frac{b_i}{b}=1$, and $f$ is convex.
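The chain above can be traced numerically; the following sketch (with arbitrary sample values) evaluates the weighted sum before the Jensen step and the bound after it:

```python
import math

def f(x):
    return x * math.log(x)   # the convex function used in the proof

a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]
sa, sb = sum(a), sum(b)

# Middle expression: b * sum_i (b_i / b) * f(a_i / b_i)
weighted_sum = sb * sum((bi / sb) * f(ai / bi) for ai, bi in zip(a, b))
# After Jensen: b * f( sum_i (b_i / b) * (a_i / b_i) ) = b * f(a / b)
jensen_bound = sb * f(sum((bi / sb) * (ai / bi) for ai, bi in zip(a, b)))

assert weighted_sum >= jensen_bound
assert math.isclose(jensen_bound, sa * math.log(sa / sb))   # = a log(a/b)
```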
The inequality remains valid for $n=\infty$, provided that $a<\infty$ and $b<\infty$. Moreover, the proof above holds for any function $g$ such that $f(x)=xg(x)$ is convex, not only for $g=\log$.
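For instance, taking $g(x)=x$ makes $f(x)=x^2$ convex, and the same argument yields $\sum_{i=1}^n \frac{a_i^2}{b_i} \geq \frac{a^2}{b}$, the Engel form of the Cauchy-Schwarz inequality; a quick check of this instance on sample values:

```python
a = [1.0, 2.0, 3.0]
b = [2.0, 1.0, 4.0]
sa, sb = sum(a), sum(b)

# g(x) = x, so f(x) = x * g(x) = x**2 is convex and the proof yields
# sum_i a_i * (a_i / b_i) >= a * (a / b)
assert sum(ai * ai / bi for ai, bi in zip(a, b)) >= sa * sa / sb
```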
Another generalization is due to Dannan, Neff and Thiel, who showed that if $a_1,a_2,\ldots,a_n$ and $b_1,b_2,\ldots,b_n$ are positive real numbers with $a_1+a_2+\cdots+a_n=a$ and $b_1+b_2+\cdots+b_n=b$, then for every $k\geq 0$,

$$\sum_{i=1}^n a_i\log\left(\frac{a_i}{b_i}+k\right) \geq a\log\left(\frac{a}{b}+k\right).$$
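As a sanity check, the following sketch tests this generalization on random positive instances (the instance sizes, value ranges, and range of $k$ are arbitrary choices; a small tolerance absorbs floating-point error):

```python
import math
import random

def dnt_inequality_holds(a, b, k, tol=1e-12):
    """Check sum_i a_i log(a_i/b_i + k) >= a log(a/b + k)."""
    sa, sb = sum(a), sum(b)
    lhs = sum(ai * math.log(ai / bi + k) for ai, bi in zip(a, b))
    return lhs >= sa * math.log(sa / sb + k) - tol

random.seed(0)
for _ in range(1000):
    n = random.randint(1, 8)
    a = [random.uniform(0.1, 5.0) for _ in range(n)]
    b = [random.uniform(0.1, 5.0) for _ in range(n)]
    k = random.uniform(0.0, 10.0)
    assert dnt_inequality_holds(a, b, k)
```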
The log sum inequality can be used to prove inequalities in information theory. Gibbs' inequality states that the Kullback-Leibler divergence is non-negative, and equal to zero precisely if its arguments are equal. One proof uses the log sum inequality.
The inequality can also be used to prove the convexity of the Kullback-Leibler divergence.
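To sketch the Gibbs-inequality argument: taking $a_i=p_i$ and $b_i=q_i$ for probability distributions $p$ and $q$, both sums equal 1, so the right-hand side becomes $1\cdot\log(1/1)=0$ and hence $D(p\|q)\geq 0$. The distributions below are arbitrary examples:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence with the convention 0 * log 0 = 0."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p = [0.2, 0.5, 0.3]
q = [0.4, 0.4, 0.2]
# Log sum inequality with a_i = p_i, b_i = q_i: since sum(p) = sum(q) = 1,
# the right-hand side is 1 * log(1/1) = 0, hence D(p||q) >= 0.
assert kl_divergence(p, q) >= 0
assert math.isclose(kl_divergence(p, p), 0.0)   # equality iff p = q
```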