In mathematics, the Kantorovich inequality is a particular case of the Cauchy–Schwarz inequality, which is itself a generalization of the triangle inequality.
The triangle inequality states that the sum of the lengths of any two sides of a triangle is greater than or equal to the length of the third side. In simplest terms, the Kantorovich inequality translates the basic idea of the triangle inequality into the terms and notational conventions of linear programming. (See vector space, inner product, and normed vector space for other examples of how the basic ideas inherent in the triangle inequality, such as line segment and distance, can be generalized into a broader context.)
More formally, the Kantorovich inequality can be expressed this way:
Let
p_i \geq 0, \quad 0 < a \leq x_i \leq b \quad \text{for } i = 1, \ldots, n.
Let
A_n = \{1, 2, \ldots, n\}.
Then
\begin{align}
&{} \left( \sum_{i=1}^n p_i x_i \right) \left( \sum_{i=1}^n \frac{p_i}{x_i} \right) \\
&\leq \frac{(a+b)^2}{4ab} \left( \sum_{i=1}^n p_i \right)^2
 - \frac{(a-b)^2}{4ab} \cdot \min\left\{ \left( \sum_{i \in X} p_i - \sum_{j \in Y} p_j \right)^2 : {X \cup Y = A_n},\ {X \cap Y = \varnothing} \right\}.
\end{align}
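As a quick numerical sanity check of the statement above, the following sketch evaluates both sides for small, randomly chosen weights and values. The brute-force enumeration of the partitions X, Y and the particular sample data are illustrative choices, not part of the inequality itself.

```python
import numpy as np
from itertools import product

rng = np.random.default_rng(0)

n = 6
a, b = 0.5, 4.0                       # bounds with 0 < a <= x_i <= b
p = rng.uniform(0.1, 1.0, size=n)     # nonnegative weights p_i
x = rng.uniform(a, b, size=n)         # values x_i in [a, b]

lhs = np.sum(p * x) * np.sum(p / x)

# Refinement term: minimise (sum_{i in X} p_i - sum_{j in Y} p_j)^2 over all
# partitions X, Y of A_n.  For small n, enumerate every assignment of each
# index to X (+1) or Y (-1).
best = min(
    sum(s * pi for s, pi in zip(signs, p)) ** 2
    for signs in product((1, -1), repeat=n)
)

rhs = (a + b) ** 2 / (4 * a * b) * np.sum(p) ** 2 \
      - (a - b) ** 2 / (4 * a * b) * best

print(lhs <= rhs)   # expected: True
```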
The Kantorovich inequality is used in convergence analysis; it bounds the convergence rate of Cauchy's steepest descent.
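For a quadratic objective f(x) = ½xᵀAx − cᵀx with symmetric positive definite A whose eigenvalues lie in [a, b], the classical bound derived from the Kantorovich inequality states that exact line-search steepest descent satisfies f(x_{k+1}) − f* ≤ ((b − a)/(b + a))² (f(x_k) − f*). The sketch below illustrates this bound; the test matrix, starting point, and iteration count are arbitrary choices made for the example.

```python
import numpy as np

# Steepest descent with exact line search on f(x) = 0.5*x.T@A@x - c.T@x.
# The Kantorovich inequality yields the classical per-step bound
#   f(x_{k+1}) - f* <= ((b - a)/(b + a))**2 * (f(x_k) - f*),
# where a, b are the extreme eigenvalues of A.
rng = np.random.default_rng(1)

Q, _ = np.linalg.qr(rng.normal(size=(5, 5)))
eigs = np.array([1.0, 2.0, 3.0, 5.0, 10.0])   # a = 1, b = 10
A = Q @ np.diag(eigs) @ Q.T
c = rng.normal(size=5)

x_star = np.linalg.solve(A, c)                # exact minimiser

def f(z):
    return 0.5 * z @ A @ z - c @ z

f_star = f(x_star)
a, b = eigs.min(), eigs.max()
bound = ((b - a) / (b + a)) ** 2

x = np.zeros(5)
for _ in range(8):
    gap = f(x) - f_star
    if gap < 1e-10:                           # stop once rounding dominates
        break
    g = A @ x - c                             # gradient
    x = x - (g @ g) / (g @ A @ g) * g         # exact line-search step
    print(f"observed ratio {(f(x) - f_star) / gap:.4f} <= bound {bound:.4f}")
```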
Equivalents of the Kantorovich inequality have arisen in a number of different fields. For instance, the Cauchy–Schwarz–Bunyakovsky inequality and the Wielandt inequality are equivalent to the Kantorovich inequality and all of these are, in turn, special cases of the Hölder inequality.
The Kantorovich inequality is named after Soviet economist, mathematician, and Nobel Prize winner Leonid Kantorovich, a pioneer in the field of linear programming.
There is also a matrix version of the Kantorovich inequality, due to Marshall and Olkin (1990). Extensions of it and their applications to statistics are available; see, e.g., Liu and Neudecker (1999) and Liu et al. (2022).
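One commonly cited form of the matrix inequality, used here only for illustration and not quoted from the references above, states that for a symmetric positive definite A with extreme eigenvalues m and M, and any unit vector u, (uᵀAu)(uᵀA⁻¹u) ≤ (m + M)²/(4mM). A brief numerical check of that assumed form:

```python
import numpy as np

# Illustrative check of the unit-vector form of the matrix Kantorovich
# inequality (an assumed statement, not quoted from Marshall and Olkin):
#   (u.T @ A @ u) * (u.T @ inv(A) @ u) <= (m + M)**2 / (4*m*M)
# for symmetric positive definite A with extreme eigenvalues m, M.
rng = np.random.default_rng(2)

Q, _ = np.linalg.qr(rng.normal(size=(4, 4)))
eigs = np.array([0.5, 1.0, 2.0, 8.0])
A = Q @ np.diag(eigs) @ Q.T
A_inv = np.linalg.inv(A)

m, M = eigs.min(), eigs.max()
rhs = (m + M) ** 2 / (4 * m * M)

for _ in range(5):
    u = rng.normal(size=4)
    u /= np.linalg.norm(u)            # normalise to a unit vector
    lhs = (u @ A @ u) * (u @ A_inv @ u)
    print(f"{lhs:.4f} <= {rhs:.4f}")  # should hold for every sample
```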