Polynomial SOS
In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms $g_1(x),\ldots,g_k(x)$ of degree m such that
$$h(x)=\sum_{i=1}^{k}g_i(x)^2.$$
Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2, 2m = 2, or n = 3 and 2m = 4 a form is SOS if and only if it is positive.[1] The same is also valid for the analogous problem on positive symmetric forms.[2] [3]
Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4] [5] Moreover, every real nonnegative form can be approximated as closely as desired (in the $\ell_1$-norm of its coefficient vector) by a sequence of forms that are SOS.[6]

Square matricial representation (SMR)
To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as
$$h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}}$$
where $x^{\{m\}}$ is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying
$$h(x)=x^{\{m\}'}Hx^{\{m\}}$$
and $L(\alpha)$ is a linear parameterization of the linear space
$$\mathcal{L}=\left\{L=L':\ x^{\{m\}'}Lx^{\{m\}}=0\right\}.$$
The dimension of the vector $x^{\{m\}}$ is given by
$$\sigma(n,m)=\binom{n+m-1}{m}$$
whereas the dimension of the vector $\alpha$ is given by
$$\omega(n,2m)=\frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).$$
Then, h(x) is SOS if and only if there exists a vector $\alpha$ such that
$$H+L(\alpha)\ge 0,$$
meaning that the matrix $H+L(\alpha)$ is positive semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression $h(x)=x^{\{m\}'}\left(H+L(\alpha)\right)x^{\{m\}}$ was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.[8]

Examples
- Consider the form of degree 4 in two variables $h(x)=x_1^4-x_1^2x_2^2+x_2^4$. We have
$$x^{\{2\}}=\begin{pmatrix}x_1^2\\x_1x_2\\x_2^2\end{pmatrix},\quad H+L(\alpha)=\begin{pmatrix}1&0&-\alpha\\0&-1+2\alpha&0\\-\alpha&0&1\end{pmatrix}.$$
Since there exists $\alpha$ such that $H+L(\alpha)\ge 0$, namely $\alpha=1$, it follows that h(x) is SOS.
- Consider the form of degree 4 in three variables $h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4$. We have
$$x^{\{2\}}=\left(x_1^2,\,x_1x_2,\,x_1x_3,\,x_2^2,\,x_2x_3,\,x_3^2\right)'.$$
Since $H+L(\alpha)\ge 0$ for $\alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57)$, it follows that h(x) is SOS.
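The dimension formulas and the first example can be checked numerically. The sketch below is illustrative and not part of the original presentation: the quartic, its Gram matrix parameterization, and the resulting decomposition $h(x)=(x_1^2-x_2^2)^2+(x_1x_2)^2$ are assumptions consistent with the Gram matrix having $\alpha=1$.

```python
import random
from math import comb

def sigma(n, m):
    # sigma(n, m) = C(n + m - 1, m): number of monomials of degree m
    # in n variables, i.e. the length of the base vector x^{m}.
    return comb(n + m - 1, m)

def omega(n, m):
    # omega(n, 2m) = (1/2) sigma(n,m) (1 + sigma(n,m)) - sigma(n,2m):
    # number of free parameters alpha in the SMR.
    return sigma(n, m) * (1 + sigma(n, m)) // 2 - sigma(n, 2 * m)

# Two variables, degree 4: base of length 3, one free parameter alpha.
assert (sigma(2, 2), omega(2, 2)) == (3, 1)
# Three variables, degree 4: base of length 6, six free parameters.
assert (sigma(3, 2), omega(3, 2)) == (6, 6)

# Illustrative quartic h(x) = x1^4 - x1^2 x2^2 + x2^4 with Gram matrix
# H + L(alpha) in the base (x1^2, x1 x2, x2^2); alpha = 1 makes it PSD,
# giving the decomposition h = (x1^2 - x2^2)^2 + (x1 x2)^2.
def h(x1, x2):
    return x1**4 - x1**2 * x2**2 + x2**4

def gram_value(x1, x2, alpha):
    base = (x1**2, x1 * x2, x2**2)
    M = [[1.0, 0.0, -alpha],
         [0.0, -1.0 + 2 * alpha, 0.0],
         [-alpha, 0.0, 1.0]]
    return sum(base[i] * M[i][j] * base[j]
               for i in range(3) for j in range(3))

random.seed(0)
for _ in range(100):
    x1, x2 = random.uniform(-2, 2), random.uniform(-2, 2)
    # The Gram representation reproduces h for every alpha-consistent choice,
    # and the explicit SOS decomposition matches h as well.
    assert abs(gram_value(x1, x2, 1.0) - h(x1, x2)) < 1e-9
    assert abs((x1**2 - x2**2) ** 2 + (x1 * x2) ** 2 - h(x1, x2)) < 1e-9
```

Note that finding the certifying $\alpha$ in general requires an LMI/semidefinite-programming solver; the sketch only verifies a given certificate.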
Generalizations
Matrix SOS
A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms $G_1(x),\ldots,G_k(x)$ of degree m such that
$$F(x)=\sum_{i=1}^{k}G_i(x)'G_i(x).$$
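As a sketch of this definition (the matrix form $G_1$ below is an assumption chosen for illustration, not from the literature), a matrix form built as $G_1(x)'G_1(x)$ is SOS by construction and hence evaluates to a positive semidefinite matrix at every real x:

```python
import numpy as np

# Assumed matrix form of degree m = 1 in x = (x1, x2):
# G1(x) = [[x1, x2], [0, x1]].
def G1(x1, x2):
    return np.array([[x1, x2],
                     [0.0, x1]])

# F(x) = G1(x)' G1(x) is SOS by construction; its entries are forms
# of degree 2 and F(x) is positive semidefinite for every real x.
def F(x1, x2):
    return G1(x1, x2).T @ G1(x1, x2)

rng = np.random.default_rng(1)
for _ in range(10):
    x1, x2 = rng.standard_normal(2)
    assert np.linalg.eigvalsh(F(x1, x2)).min() >= -1e-12
```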
Matrix SMR
To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any F(x) can be written according to the SMR as
$$F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)$$
where $\otimes$ is the Kronecker product of matrices, H is any symmetric matrix satisfying
$$F(x)=\left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)$$
and $L(\alpha)$ is a linear parameterization of the linear space
$$\mathcal{L}=\left\{L=L':\ \left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.$$
The dimension of the vector $\alpha$ is given by
$$\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).$$
Then, F(x) is SOS if and only if there exists a vector $\alpha$ such that the following LMI holds:
$$H+L(\alpha)\ge 0.$$
The expression $F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)$ was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.
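The Kronecker-product structure of the matrix SMR can be sketched numerically with the simplest possible (assumed) choices: n = 2, m = 1, r = 2, and H equal to the identity, which represents the trivially SOS matrix form $F(x)=(x_1^2+x_2^2)I_2$:

```python
import numpy as np

def smr_value(x, H, r):
    # Build x^{m} (Kronecker) I_r as a (r*len(x)) x r matrix, then
    # evaluate (x^{m} (x) I_r)' H (x^{m} (x) I_r).
    base = np.kron(x.reshape(-1, 1), np.eye(r))
    return base.T @ H @ base

# With H = I_4 the SMR represents F(x) = (x1^2 + x2^2) * I_2.
x = np.array([1.5, -0.5])
F = smr_value(x, np.eye(4), 2)
expected = (x[0]**2 + x[1]**2) * np.eye(2)
assert np.allclose(F, expected)
```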
Noncommutative polynomial SOS
Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X1, ..., Xn) and equipped with the involution T, such that T fixes R and X1, ..., Xn and reverses words formed by X1, ..., Xn. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form $f=f^{\mathrm T}$. When evaluating a symmetric noncommutative polynomial f at any tuple of real symmetric matrices of any dimension r × r results in a positive semidefinite matrix, f is said to be matrix-positive.

A noncommutative polynomial f is SOS if there exist noncommutative polynomials $h_1,\ldots,h_k$ such that
$$f(X)=\sum_{i=1}^{k}h_i(X)^{\mathrm T}h_i(X).$$

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there exist algorithms available to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]
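Matrix-positivity can be illustrated numerically. The noncommutative polynomial below is an assumption chosen for the sketch: $f(X_1,X_2)=X_1^2+X_1X_2+X_2X_1+X_2^2$, which is SOS since $f=(X_1+X_2)^{\mathrm T}(X_1+X_2)$, so evaluating it at symmetric matrices always yields a positive semidefinite matrix:

```python
import numpy as np

# f(X1, X2) = X1^2 + X1 X2 + X2 X1 + X2^2 = (X1 + X2)^T (X1 + X2)
# when X1, X2 are symmetric, hence matrix-positive.
def f(X1, X2):
    return X1 @ X1 + X1 @ X2 + X2 @ X1 + X2 @ X2

rng = np.random.default_rng(0)
for _ in range(10):
    A = rng.standard_normal((3, 3))
    B = rng.standard_normal((3, 3))
    X1, X2 = (A + A.T) / 2, (B + B.T) / 2      # symmetric arguments
    # The evaluation is symmetric (since (X1 X2)^T = X2 X1) and PSD.
    assert np.linalg.eigvalsh(f(X1, X2)).min() >= -1e-9
```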
Notes and References
- Hilbert, David (September 1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen 32 (3): 342–350. doi:10.1007/bf01443605.
- Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's Papers in Pure and Applied Mathematics 46: 385–405.
- Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (May 2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and Its Applications 496: 114–120. doi:10.1016/j.laa.2016.01.024. arXiv:1505.08145.
- Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik 89 (5): 390–398. doi:10.1007/s00013-007-2251-y. arXiv:math/0612358.
- Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials". Journal of Pure and Applied Algebra 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
- Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review 49 (4): 651–669. doi:10.1137/070693709. arXiv:math/0412398.
- Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Proceedings of the 5th European Control Conference, Karlsruhe, Germany: 1446–1451. IEEE.
- Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". Proceedings of Symposia in Pure Mathematics: 103–125.
- Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Proceedings of the 42nd IEEE Conference on Decision and Control, Maui, Hawaii: 4670–4675. doi:10.1109/CDC.2003.1272307.
- Helton, J. William (September 2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics 156 (2): 675–694. doi:10.2307/3597203.
- Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications 55 (1): 137–153. doi:10.1007/s10589-012-9513-8.