Singular value decomposition explained

In linear algebra, the singular value decomposition (SVD) is a factorization of a real or complex matrix into a rotation, followed by a rescaling, followed by another rotation. It generalizes the eigendecomposition of a square normal matrix with an orthonormal eigenbasis to any matrix. It is related to the polar decomposition.

Specifically, the singular value decomposition of an $m \times n$ complex matrix $\mathbf{M}$ is a factorization of the form

$$\mathbf{M} = \mathbf{U} \boldsymbol{\Sigma} \mathbf{V}^*,$$

where $\mathbf{U}$ is an $m \times m$ complex unitary matrix, $\boldsymbol{\Sigma}$ is an $m \times n$ rectangular diagonal matrix with non-negative real numbers on the diagonal, $\mathbf{V}$ is an $n \times n$ complex unitary matrix, and $\mathbf{V}^*$ is the conjugate transpose of $\mathbf{V}$. Such a decomposition always exists for any complex matrix. If $\mathbf{M}$ is real, then $\mathbf{U}$ and $\mathbf{V}$ can be guaranteed to be real orthogonal matrices; in such contexts, the SVD is often denoted $\mathbf{U} \boldsymbol{\Sigma} \mathbf{V}^{\mathsf{T}}$.
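As a concrete illustration, here is a minimal NumPy sketch of this factorization; the matrix below is an arbitrary example chosen for this note, not one taken from the text.

```python
import numpy as np

# An arbitrary real 4x3 example matrix (hypothetical, for illustration only).
M = np.array([[1.0, 0.0, 2.0],
              [0.0, 3.0, 0.0],
              [4.0, 0.0, 0.0],
              [0.0, 0.0, 5.0]])

# Full SVD: U is 4x4 unitary (orthogonal here), s holds the singular values, Vh is V*.
U, s, Vh = np.linalg.svd(M, full_matrices=True)

# Rebuild the 4x3 rectangular diagonal matrix Sigma from the singular values.
Sigma = np.zeros(M.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U @ Sigma @ Vh reconstructs M up to floating-point error.
assert np.allclose(M, U @ Sigma @ Vh)
```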

The diagonal entries $\sigma_i = \Sigma_{ii}$ of $\boldsymbol{\Sigma}$ are uniquely determined by $\mathbf{M}$ and are known as the singular values of $\mathbf{M}$. The number of non-zero singular values is equal to the rank of $\mathbf{M}$. The columns of $\mathbf{U}$ and the columns of $\mathbf{V}$ are called left-singular vectors and right-singular vectors of $\mathbf{M}$, respectively. They form two sets of orthonormal bases $\mathbf{u}_1, \ldots, \mathbf{u}_m$ and $\mathbf{v}_1, \ldots, \mathbf{v}_n$, and if they are sorted so that the singular values $\sigma_i$ with value zero are all in the highest-numbered columns (or rows), the singular value decomposition can be written as

$$\mathbf{M} = \sum_{i=1}^{r} \sigma_i \mathbf{u}_i \mathbf{v}_i^*,$$

where $r \leq \min\{m, n\}$ is the rank of $\mathbf{M}$.
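This sum of rank-one terms can be checked numerically; the following is a minimal sketch with another arbitrary example matrix.

```python
import numpy as np

M = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 0.0]])        # arbitrary 2x3 example (hypothetical)

U, s, Vh = np.linalg.svd(M)
r = np.sum(s > 1e-12)                  # numerical rank = number of non-zero singular values

# Sum of r rank-one terms sigma_i * u_i * v_i^* reproduces M.
M_sum = sum(s[i] * np.outer(U[:, i], Vh[i, :]) for i in range(r))
assert np.allclose(M, M_sum)
```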

The SVD is not unique; however, it is always possible to choose the decomposition such that the singular values $\Sigma_{ii}$ are in descending order. In this case, $\boldsymbol{\Sigma}$ (but not $\mathbf{U}$ and $\mathbf{V}$) is uniquely determined by $\mathbf{M}$.

The term sometimes refers to the compact SVD, a similar decomposition $\mathbf{M} = \mathbf{U} \boldsymbol{\Sigma} \mathbf{V}^*$ in which $\boldsymbol{\Sigma}$ is square diagonal of size $r \times r$, where $r \leq \min\{m, n\}$ is the rank of $\mathbf{M}$, and has only the non-zero singular values. In this variant, $\mathbf{U}$ is an $m \times r$ semi-unitary matrix and $\mathbf{V}$ is an $n \times r$ semi-unitary matrix, such that $\mathbf{U}^* \mathbf{U} = \mathbf{V}^* \mathbf{V} = \mathbf{I}_r$.
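In NumPy, the related "thin" factorization is obtained with full_matrices=False; trimming it to the non-zero singular values gives the compact SVD. A minimal sketch with an arbitrary example matrix:

```python
import numpy as np

M = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])             # arbitrary 3x2 example (hypothetical)

# "Thin" SVD: U is 3x2, Vh is 2x2, s has min(m, n) entries.
U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Keep only the r non-zero singular values to obtain the compact SVD.
r = np.sum(s > 1e-12)
Ur, Sr, Vhr = U[:, :r], np.diag(s[:r]), Vh[:r, :]

assert np.allclose(M, Ur @ Sr @ Vhr)             # M = U Sigma V*
assert np.allclose(Ur.T @ Ur, np.eye(r))         # U*U = I_r (semi-unitary)
assert np.allclose(Vhr @ Vhr.T, np.eye(r))       # V*V = I_r (semi-unitary)
```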

Mathematical applications of the SVD include computing the pseudoinverse, matrix approximation, and determining the rank, range, and null space of a matrix. The SVD is also extremely useful in all areas of science, engineering, and statistics, such as signal processing, least squares fitting of data, and process control.
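For instance, the Moore-Penrose pseudoinverse can be built directly from the SVD by inverting the non-zero singular values. This is a minimal sketch; NumPy's np.linalg.pinv uses the same idea and is only used here as a cross-check.

```python
import numpy as np

M = np.array([[1.0, 0.0],
              [0.0, 0.0],
              [0.0, 2.0]])             # arbitrary rank-2 example (hypothetical)

U, s, Vh = np.linalg.svd(M, full_matrices=False)

# Invert only the singular values above a tolerance; zeros stay zero.
s_inv = np.array([1.0 / x if x > 1e-12 else 0.0 for x in s])

# Pseudoinverse: M+ = V Sigma+ U*
M_pinv = Vh.T @ np.diag(s_inv) @ U.T
assert np.allclose(M_pinv, np.linalg.pinv(M))
```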

Intuitive interpretations

Rotation, coordinate scaling, and reflection

In the special case when $\mathbf{M}$ is an $m \times m$ real square matrix, the matrices $\mathbf{U}$ and $\mathbf{V}^*$ can be chosen to be real $m \times m$ matrices too. In that case, "unitary" is the same as "orthogonal". Then, interpreting both unitary matrices as well as the diagonal matrix, summarized here as $\mathbf{A}$, as a linear transformation $\mathbf{x} \mapsto \mathbf{A}\mathbf{x}$ of the space $\mathbf{R}^m$, the matrices $\mathbf{U}$ and $\mathbf{V}^*$ represent rotations or reflections of the space, while $\boldsymbol{\Sigma}$ represents the scaling of each coordinate $x_i$ by the factor $\sigma_i$. Thus the SVD breaks down any linear transformation of $\mathbf{R}^m$ into a composition of three geometrical transformations: a rotation or reflection ($\mathbf{V}^*$), followed by a coordinate-by-coordinate scaling ($\boldsymbol{\Sigma}$), followed by another rotation or reflection ($\mathbf{U}$).
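A small numerical check of this picture, sketched with an arbitrary 2x2 matrix: the factors returned by NumPy are orthogonal, and applying them in sequence reproduces the original transformation.

```python
import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 2.0]])             # arbitrary real 2x2 example (hypothetical)

U, s, Vh = np.linalg.svd(A)

# U and Vh are orthogonal: rotations or reflections of the plane.
assert np.allclose(U.T @ U, np.eye(2))
assert np.allclose(Vh.T @ Vh, np.eye(2))

x = np.array([1.0, -2.0])
# Rotate/reflect with V*, scale each coordinate by sigma_i, rotate/reflect with U.
y = U @ (s * (Vh @ x))
assert np.allclose(y, A @ x)
```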

In particular, if $\mathbf{M}$ has a positive determinant, then $\mathbf{U}$ and $\mathbf{V}^*$ can be chosen to be both rotations with reflections, or both rotations without reflections. If the determinant is negative, exactly one of them will have a reflection. If the determinant is zero, each can be independently chosen to be of either type.
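A quick sketch of the sign bookkeeping, again with an arbitrary invertible matrix: since the determinant of $\boldsymbol{\Sigma}$ is non-negative, the product of the determinants of the two orthogonal factors carries the sign of the determinant of $\mathbf{M}$.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [0.0, 3.0]])             # arbitrary example with det(M) = 6 > 0 (hypothetical)

U, s, Vh = np.linalg.svd(M)

# det(M) = det(U) * det(Sigma) * det(V*), with det(Sigma) >= 0,
# so sign(det U * det V*) matches sign(det M) for an invertible M.
assert np.sign(np.linalg.det(U) * np.linalg.det(Vh)) == np.sign(np.linalg.det(M))
```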

If the matrix $\mathbf{M}$ is real but not square, namely $m \times n$ with $m \neq n$, it can be interpreted as a linear transformation from $\mathbf{R}^n$ to $\mathbf{R}^m$. Then $\mathbf{U}$ and $\mathbf{V}^*$ can be chosen to be rotations/reflections of $\mathbf{R}^m$ and $\mathbf{R}^n$, respectively; and $\boldsymbol{\Sigma}$, besides scaling the first $\min\{m, n\}$ coordinates, also either extends the vector with zeros (if $m > n$) or removes trailing coordinates (if $m < n$), so as to turn $\mathbf{R}^n$ into $\mathbf{R}^m$.
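The non-square case can be traced step by step; this is a sketch with an arbitrary 3x2 matrix, which maps $\mathbf{R}^2$ into $\mathbf{R}^3$, so $\boldsymbol{\Sigma}$ pads the rotated vector with a zero before the final rotation.

```python
import numpy as np

M = np.array([[1.0, 0.0],
              [2.0, 1.0],
              [0.0, 3.0]])             # 3x2: maps R^2 into R^3 (hypothetical example)

U, s, Vh = np.linalg.svd(M, full_matrices=True)   # U is 3x3, Vh is 2x2

x = np.array([0.5, -1.0])
v = Vh @ x                         # rotate/reflect in R^2
w = np.append(s * v, 0.0)          # scale the 2 coordinates, extend with a zero to R^3
assert np.allclose(U @ w, M @ x)   # final rotation/reflection in R^3 gives M x
```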

Singular values as semiaxes of an ellipse or ellipsoid

Geometrically, the singular values can be interpreted as the magnitudes of the semiaxes of an ellipse in 2D. This concept can be generalized to $n$-dimensional Euclidean space, with the singular values of any $n \times n$ square matrix being viewed as the magnitudes of the semiaxes of an $n$-dimensional ellipsoid. Similarly, the singular values of any $m \times n$ matrix can be viewed as the magnitudes of the semiaxes of an $n$-dimensional ellipsoid in $m$-dimensional space, for example as an ellipse in a (tilted) 2D plane in a 3D space. Singular values encode the magnitudes of the semiaxes, while singular vectors encode direction. See below for further details.
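The ellipse picture can be seen numerically; in this minimal sketch with an arbitrary 2x2 matrix, the image of the unit circle under $\mathbf{M}$ is an ellipse whose semiaxes have lengths $\sigma_1$ and $\sigma_2$ and point along the left-singular vectors.

```python
import numpy as np

M = np.array([[3.0, 1.0],
              [0.0, 2.0]])             # arbitrary 2x2 example (hypothetical)

U, s, Vh = np.linalg.svd(M)

# Image of the unit circle under M.
theta = np.linspace(0.0, 2.0 * np.pi, 1000)
circle = np.vstack([np.cos(theta), np.sin(theta)])
ellipse = M @ circle

# The longest and shortest image vectors have lengths sigma_1 and sigma_2.
lengths = np.linalg.norm(ellipse, axis=0)
assert np.isclose(lengths.max(), s[0], atol=1e-3)
assert np.isclose(lengths.min(), s[1], atol=1e-3)

# The longest image vector points along the first left-singular vector u_1.
u1_est = ellipse[:, lengths.argmax()] / lengths.max()
assert np.isclose(abs(u1_est @ U[:, 0]), 1.0, atol=1e-3)
```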

The columns of U and V are orthonormal bases

Since $\mathbf{U}$ and $\mathbf{V}^*$ are unitary, the columns of each of them form a set of orthonormal vectors, which can be regarded as basis vectors. The matrix $\mathbf{M}$ maps the basis vector $\mathbf{v}_i$ to the stretched unit vector $\sigma_i \mathbf{u}_i$. By the definition of a unitary matrix, the same is true for their conjugate transposes $\mathbf{U}^*$ and $\mathbf{V}$, except the geometric interpretation of the singular values as stretches is lost. In short, the columns of $\mathbf{U}$, $\mathbf{U}^*$, $\mathbf{V}$, and $\mathbf{V}^*$ are orthonormal bases. When $\mathbf{M}$ is a positive-semidefinite Hermitian matrix, $\mathbf{U}$ and $\mathbf{V}$ are both equal to the unitary matrix used to diagonalize $\mathbf{M}$. However, when $\mathbf{M}$ is not positive-semidefinite and Hermitian but still diagonalizable, its eigendecomposition and singular value decomposition are distinct.
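The stretching relation $\mathbf{M}\mathbf{v}_i = \sigma_i \mathbf{u}_i$ can be verified column by column; a minimal sketch with an arbitrary matrix:

```python
import numpy as np

M = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0],
              [4.0, 0.0, 1.0]])        # arbitrary 3x3 example (hypothetical)

U, s, Vh = np.linalg.svd(M)

# Each right-singular vector v_i is mapped to sigma_i times the
# corresponding left-singular vector u_i.
for i in range(len(s)):
    assert np.allclose(M @ Vh[i, :], s[i] * U[:, i])
```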

Relation to the four fundamental subspaces

The SVD provides orthonormal bases for the four fundamental subspaces of $\mathbf{M}$: the first $r$ columns of $\mathbf{U}$ are a basis of the column space of $\mathbf{M}$; the last $m - r$ columns of $\mathbf{U}$ are a basis of the null space of $\mathbf{M}^*$; the first $r$ columns of $\mathbf{V}$ are a basis of the column space of $\mathbf{M}^*$ (the row space of $\mathbf{M}$ in the real case); and the last $n - r$ columns of $\mathbf{V}$ are a basis of the null space of $\mathbf{M}$.

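These bases can be read off directly from a computed SVD; the following minimal sketch uses an arbitrary rank-deficient matrix.

```python
import numpy as np

# Rank-2 example: the third row is the sum of the first two (hypothetical).
M = np.array([[1.0, 0.0, 2.0, 0.0],
              [0.0, 1.0, 0.0, 3.0],
              [1.0, 1.0, 2.0, 3.0]])

U, s, Vh = np.linalg.svd(M)
r = np.sum(s > 1e-12)                  # numerical rank (here 2)

col_space  = U[:, :r]                  # basis of the column space of M
left_null  = U[:, r:]                  # basis of the null space of M^T
row_space  = Vh[:r, :].T               # basis of the row space of M
null_space = Vh[r:, :].T               # basis of the null space of M

assert np.allclose(M @ null_space, 0.0)        # M sends its null space to 0
assert np.allclose(M.T @ left_null, 0.0)       # M^T sends the left null space to 0
```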
Geometric meaning

Because $\mathbf{U}$ and $\mathbf{V}^*$ are unitary, we know that the columns $\mathbf{U}_1, \ldots, \mathbf{U}_m$ of $\mathbf{U}$ yield an orthonormal basis of $K^m$ and the columns $\mathbf{V}_1, \ldots, \mathbf{V}_n$ of $\mathbf{V}$ yield an orthonormal basis of $K^n$, where $K$ denotes the field of real or complex numbers (with respect to the standard scalar products on these spaces).

The linear transformation

$$T : K^n \to K^m, \qquad \mathbf{x} \mapsto \mathbf{M}\mathbf{x}$$