Polynomial SOS

In mathematics, a form (i.e. a homogeneous polynomial) h(x) of degree 2m in the real n-dimensional vector x is a sum of squares of forms (SOS) if and only if there exist forms

g_1(x),\ldots,g_k(x)

of degree m such that

h(x) = \sum_{i=1}^k g_i(x)^2 .

Every form that is SOS is also a positive polynomial, and although the converse is not always true, Hilbert proved that for n = 2, 2m = 2, or n = 3 and 2m = 4 a form is SOS if and only if it is positive.[1] The same is also valid for the analogous problem on positive symmetric forms.[2] [3]

Although not every form can be represented as SOS, explicit sufficient conditions for a form to be SOS have been found.[4] [5] Moreover, every real nonnegative form can be approximated as closely as desired (in the l_1-norm of its coefficient vector) by a sequence of forms

\{f_\epsilon\}

that are SOS.[6]

Square matricial representation (SMR)

To establish whether a form h(x) is SOS amounts to solving a convex optimization problem. Indeed, any h(x) can be written as

h(x) = x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}

where x^{\{m\}} is a vector containing a base for the forms of degree m in x (such as all monomials of degree m in x), the prime ′ denotes the transpose, H is any symmetric matrix satisfying

h(x) = x^{\{m\}\prime}Hx^{\{m\}}

and L(\alpha) is a linear parameterization of the linear space

\mathcal{L} = \left\{L=L' :\ x^{\{m\}\prime}Lx^{\{m\}}=0\right\}.

The dimension of the vector x^{\{m\}} is given by

\sigma(n,m) = \binom{n+m-1}{m},

whereas the dimension of the vector \alpha is given by

\omega(n,2m) = \frac{1}{2}\sigma(n,m)\left(1+\sigma(n,m)\right)-\sigma(n,2m).
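
As a quick sanity check of these counting formulas, here is a minimal Python sketch (the helper names sigma and omega are illustrative, not from the references):

```python
from math import comb

def sigma(n: int, m: int) -> int:
    # Number of monomials of degree m in n variables: C(n+m-1, m).
    return comb(n + m - 1, m)

def omega(n: int, two_m: int) -> int:
    # Dimension of the parameter vector alpha in the scalar SMR.
    m = two_m // 2
    s = sigma(n, m)
    return s * (1 + s) // 2 - sigma(n, two_m)

# For n = 2 variables and degree 2m = 4 (as in the first example below):
print(sigma(2, 2))  # 3 monomials: x1^2, x1*x2, x2^2
print(omega(2, 4))  # 1 free parameter alpha_1
```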

Then, h(x) is SOS if and only if there exists a vector \alpha such that

H + L(\alpha) \ge 0,

meaning that the matrix H+L(\alpha) is positive-semidefinite. This is a linear matrix inequality (LMI) feasibility test, which is a convex optimization problem. The expression

h(x)=x^{\{m\}\prime}\left(H+L(\alpha)\right)x^{\{m\}}

was introduced in [7] with the name square matricial representation (SMR) in order to establish whether a form is SOS via an LMI. This representation is also known as the Gram matrix.[8]
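
As an illustration, the LMI feasibility test can be posed directly with an off-the-shelf semidefinite programming tool. The following is a minimal sketch, assuming the CVXPY package (with an SDP-capable default solver such as SCS) is available; the data H and the single basis matrix of the space \mathcal{L} are taken from the first example in the next section:

```python
import cvxpy as cp
import numpy as np

# Gram-matrix data for h(x) = x1^4 - x1^2*x2^2 + x2^4
# in the monomial basis (x1^2, x1*x2, x2^2).
H = np.array([[1.0,  0.0, 0.0],
              [0.0, -1.0, 0.0],
              [0.0,  0.0, 1.0]])
# Basis of the linear space L (a single matrix, since omega(2, 4) = 1):
# it satisfies x{m}' L1 x{m} = -2*x1^2*x2^2 + 2*x1^2*x2^2 = 0.
L1 = np.array([[ 0.0, 0.0, -1.0],
               [ 0.0, 2.0,  0.0],
               [-1.0, 0.0,  0.0]])

alpha = cp.Variable()
problem = cp.Problem(cp.Minimize(0), [H + alpha * L1 >> 0])  # LMI feasibility
problem.solve()
print(problem.status, alpha.value)  # a feasible alpha certifies that h is SOS
```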

Examples

Consider the form of degree 4 in two variables

h(x)=x_1^4-x_1^2x_2^2+x_2^4.

We have

m = 2,\quad x^{\{m\}} = \begin{pmatrix} x_1^2\\ x_1x_2\\ x_2^2 \end{pmatrix},\quad H+L(\alpha) = \begin{pmatrix} 1&0&-\alpha_1\\ 0&-1+2\alpha_1&0\\ -\alpha_1&0&1 \end{pmatrix}.

Since there exists \alpha such that H+L(\alpha)\ge0, namely \alpha=1, it follows that h(x) is SOS.
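
This can be confirmed numerically; the short sketch below (plain NumPy, matrix values taken from the example above) checks that H+L(\alpha) is positive-semidefinite at \alpha=1 and reads an explicit decomposition off its eigendecomposition:

```python
import numpy as np

# H + L(alpha) at alpha = 1 for h(x) = x1^4 - x1^2*x2^2 + x2^4.
M = np.array([[ 1.0, 0.0, -1.0],
              [ 0.0, 1.0,  0.0],
              [-1.0, 0.0,  1.0]])

eigvals, eigvecs = np.linalg.eigh(M)
print(eigvals)  # [0. 1. 2.]: all nonnegative, so M is positive-semidefinite

# Each eigenpair (lambda_i, v_i) with lambda_i > 0 contributes the square
# (sqrt(lambda_i) * v_i . (x1^2, x1*x2, x2^2))^2 to an SOS decomposition.
```

Here this recovers h(x) = (x_1x_2)^2 + (x_1^2-x_2^2)^2, scaled by the eigenvalues.
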
Consider the form of degree 4 in three variables

h(x)=2x_1^4-2.5x_1^3x_2+x_1^2x_2x_3-2x_1x_3^3+5x_2^4+x_3^4.

We have

m=2,\quad x^{\{m\}}=\begin{pmatrix} x_1^2\\ x_1x_2\\ x_1x_3\\ x_2^2\\ x_2x_3\\ x_3^2 \end{pmatrix},\quad H+L(\alpha) = \begin{pmatrix} 2&-1.25&0&-\alpha_1&-\alpha_2&-\alpha_3\\ -1.25&2\alpha_1&0.5+\alpha_2&0&-\alpha_4&-\alpha_5\\ 0&0.5+\alpha_2&2\alpha_3&\alpha_4&\alpha_5&-1\\ -\alpha_1&0&\alpha_4&5&0&-\alpha_6\\ -\alpha_2&-\alpha_4&\alpha_5&0&2\alpha_6&0\\ -\alpha_3&-\alpha_5&-1&-\alpha_6&0&1 \end{pmatrix}.

Since H+L(\alpha)\ge0 for \alpha=(1.18,-0.43,0.73,1.13,-0.37,0.57), it follows that h(x) is SOS.
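
The same numerical check applies here; this sketch (entries copied from the matrix above) evaluates the eigenvalues of H+L(\alpha) at the stated \alpha:

```python
import numpy as np

a1, a2, a3, a4, a5, a6 = 1.18, -0.43, 0.73, 1.13, -0.37, 0.57

# H + L(alpha) in the monomial basis (x1^2, x1*x2, x1*x3, x2^2, x2*x3, x3^2).
M = np.array([
    [ 2.00,    -1.25,     0.00,  -a1,    -a2,   -a3],
    [-1.25,   2 * a1, 0.5 + a2, 0.00,    -a4,   -a5],
    [ 0.00, 0.5 + a2,   2 * a3,   a4,     a5, -1.00],
    [  -a1,     0.00,       a4, 5.00,   0.00,   -a6],
    [  -a2,      -a4,       a5, 0.00, 2 * a6,  0.00],
    [  -a3,      -a5,    -1.00,  -a6,   0.00,  1.00],
])

print(np.linalg.eigvalsh(M))  # all eigenvalues nonnegative up to rounding
```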

Generalizations

Matrix SOS

A matrix form F(x) (i.e., a matrix whose entries are forms) of dimension r and degree 2m in the real n-dimensional vector x is SOS if and only if there exist matrix forms

G_1(x),\ldots,G_k(x)

of degree m such that

F(x)=\sum_{i=1}^k G_i(x)'G_i(x) .
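
To make the definition concrete, the following sketch (an illustrative construction, not taken from the references) builds a 2 × 2 matrix form F(x)=G(x)'G(x) from a single matrix form G of degree 1 and checks that F evaluates to a positive-semidefinite matrix at random points:

```python
import numpy as np

def G(x):
    # A 2x2 matrix form of degree m = 1 in x = (x1, x2); every entry is a linear form.
    x1, x2 = x
    return np.array([[x1,      x2],
                     [x2, x1 - x2]])

def F(x):
    # F(x) = G(x)' G(x) is a matrix SOS by construction (here k = 1).
    g = G(x)
    return g.T @ g

rng = np.random.default_rng(0)
for _ in range(3):
    x = rng.standard_normal(2)
    print(np.linalg.eigvalsh(F(x)))  # nonnegative for every choice of x
```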

Matrix SMR

To establish whether a matrix form F(x) is SOS amounts to solving a convex optimization problem. Indeed, similarly to the scalar case any F(x) can be written according to the SMR as

F(x) = \left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)

where \otimes is the Kronecker product of matrices, H is any symmetric matrix satisfying

F(x) = \left(x^{\{m\}}\otimes I_r\right)'H\left(x^{\{m\}}\otimes I_r\right)

and L(\alpha) is a linear parameterization of the linear space

\mathcal{L}=\left\{L=L' :\ \left(x^{\{m\}}\otimes I_r\right)'L\left(x^{\{m\}}\otimes I_r\right)=0\right\}.

The dimension of the vector \alpha is given by

\omega(n,2m,r)=\frac{1}{2}r\left(\sigma(n,m)\left(r\sigma(n,m)+1\right)-(r+1)\sigma(n,2m)\right).
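
Extending the earlier counting sketch (reusing the illustrative sigma helper defined there):

```python
def omega_matrix(n: int, two_m: int, r: int) -> int:
    # Dimension of alpha in the matrix SMR; sigma is the helper from the scalar sketch.
    m = two_m // 2
    s = sigma(n, m)
    return r * (s * (r * s + 1) - (r + 1) * sigma(n, two_m)) // 2

# For example, n = 2 variables, degree 2m = 4, matrix dimension r = 2:
print(omega_matrix(2, 4, 2))  # 6 free parameters
```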

Then, F(x) is SOS if and only if there exists a vector \alpha such that the following LMI holds:

H+L(\alpha) \ge 0.

The expression

F(x)=\left(x^{\{m\}}\otimes I_r\right)'\left(H+L(\alpha)\right)\left(x^{\{m\}}\otimes I_r\right)

was introduced in [9] in order to establish whether a matrix form is SOS via an LMI.

Noncommutative polynomial SOS

Consider the free algebra R⟨X⟩ generated by the n noncommuting letters X = (X_1, ..., X_n) and equipped with the involution T, such that T fixes R and X_1, ..., X_n and reverses words formed by X_1, ..., X_n. By analogy with the commutative case, the noncommutative symmetric polynomials f are the noncommutative polynomials of the form f = f^T. If evaluating a symmetric noncommutative polynomial f at any real matrices of any dimension r × r always yields a positive semi-definite matrix, f is said to be matrix-positive.

A noncommutative polynomial f is SOS if there exist noncommutative polynomials

h_1,\ldots,h_k

such that

f(X) = \sum_{i=1}^{k} h_i(X)^T h_i(X).

Surprisingly, in the noncommutative scenario a noncommutative polynomial is SOS if and only if it is matrix-positive.[10] Moreover, there are algorithms available to decompose matrix-positive polynomials into sums of squares of noncommutative polynomials.[11]
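
Matrix-positivity can be probed numerically. A minimal sketch (illustrative only) evaluates the SOS noncommutative polynomial f(X) = X^T X + (XX)^T (XX) at random real matrices of several dimensions, with the involution T realized as the matrix transpose:

```python
import numpy as np

def f(A):
    # Evaluation of f(X) = X^T X + (X X)^T (X X); each summand is a "square",
    # so the result is positive-semidefinite for every real square matrix A.
    return A.T @ A + (A @ A).T @ (A @ A)

rng = np.random.default_rng(1)
for r in (2, 3, 5):
    A = rng.standard_normal((r, r))
    print(np.linalg.eigvalsh(f(A)).min() >= -1e-9)  # True: f(A) is PSD
```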


Notes and References

  1. Hilbert, David (1888). "Ueber die Darstellung definiter Formen als Summe von Formenquadraten". Mathematische Annalen 32 (3): 342–350. doi:10.1007/bf01443605.
  2. Choi, M. D.; Lam, T. Y. (1977). "An old question of Hilbert". Queen's Papers in Pure and Applied Mathematics 46: 385–405.
  3. Goel, Charu; Kuhlmann, Salma; Reznick, Bruce (2016). "On the Choi–Lam analogue of Hilbert's 1888 theorem for symmetric forms". Linear Algebra and Its Applications 496: 114–120. doi:10.1016/j.laa.2016.01.024. arXiv:1505.08145.
  4. Lasserre, Jean B. (2007). "Sufficient conditions for a real polynomial to be a sum of squares". Archiv der Mathematik 89 (5): 390–398. doi:10.1007/s00013-007-2251-y. arXiv:math/0612358.
  5. Powers, Victoria; Wörmann, Thorsten (1998). "An algorithm for sums of squares of real polynomials". Journal of Pure and Applied Algebra 127 (1): 99–104. doi:10.1016/S0022-4049(97)83827-3.
  6. Lasserre, Jean B. (2007). "A Sum of Squares Approximation of Nonnegative Polynomials". SIAM Review 49 (4): 651–669. doi:10.1137/070693709. arXiv:math/0412398.
  7. Chesi, G.; Tesi, A.; Vicino, A.; Genesio, R. (1999). "On convexification of some minimum distance problems". Proceedings of the 5th European Control Conference. Karlsruhe, Germany: IEEE. 1446–1451.
  8. Choi, M.; Lam, T.; Reznick, B. (1995). "Sums of squares of real polynomials". Proceedings of Symposia in Pure Mathematics. 103–125.
  9. Chesi, G.; Garulli, A.; Tesi, A.; Vicino, A. (2003). "Robust stability for polytopic systems via polynomially parameter-dependent Lyapunov functions". Proceedings of the 42nd IEEE Conference on Decision and Control. Maui, Hawaii: IEEE. 4670–4675. doi:10.1109/CDC.2003.1272307.
  10. Helton, J. William (2002). ""Positive" Noncommutative Polynomials Are Sums of Squares". The Annals of Mathematics 156 (2): 675–694. doi:10.2307/3597203.
  11. Burgdorf, Sabine; Cafuta, Kristijan; Klep, Igor; Povh, Janez (2012). "Algorithmic aspects of sums of Hermitian squares of noncommutative polynomials". Computational Optimization and Applications 55 (1): 137–153. doi:10.1007/s10589-012-9513-8.