In mathematics, the spectrum of a matrix is the set of its eigenvalues. More generally, if T : V → V is a linear operator on any finite-dimensional vector space, its spectrum is the set of scalars λ such that T − λI is not invertible.
In many applications, such as PageRank, one is interested in the dominant eigenvalue, i.e. the eigenvalue that is largest in absolute value. In other applications the smallest eigenvalue is important, but in general the whole spectrum provides valuable information about a matrix.
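As an illustrative sketch (not part of the original article), the dominant eigenvalue can be approximated by power iteration, the method underlying PageRank. The Python/NumPy code below is a minimal version: the function name dominant_eigenvalue and the test matrix are choices made for this example, and the method assumes M has a unique eigenvalue of largest absolute value.

```python
import numpy as np

def dominant_eigenvalue(M, iters=1000, tol=1e-12):
    # Power iteration: repeatedly apply M and renormalize; the iterate aligns
    # with the dominant eigenvector when one eigenvalue is strictly largest
    # in absolute value.
    x = np.random.default_rng(0).standard_normal(M.shape[0])
    x /= np.linalg.norm(x)
    lam = 0.0
    for _ in range(iters):
        y = M @ x
        new_lam = x @ y                # Rayleigh quotient, since ||x|| = 1
        x = y / np.linalg.norm(y)
        if abs(new_lam - lam) < tol:   # eigenvalue estimate has stabilized
            return new_lam
        lam = new_lam
    return lam

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
print(dominant_eigenvalue(A))              # ≈ 3.6180, i.e. (5 + √5)/2
print(max(np.linalg.eigvals(A), key=abs))  # cross-check against the full spectrum
```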
Let V be a finite-dimensional vector space over some field K and suppose T : V → V is a linear map. The spectrum of T, denoted σ_T, is the multiset of roots of the characteristic polynomial of T. Thus the elements of the spectrum are precisely the eigenvalues of T, and the multiplicity of an eigenvalue λ in the spectrum equals the dimension of the generalized eigenspace of T for λ (also called the algebraic multiplicity of λ).
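To make the multiset reading concrete, the following NumPy sketch (an illustration added here, not from the original text) computes the characteristic polynomial of a specific matrix and its roots. The eigenvalue λ = 2 appears twice in the spectrum, i.e. with algebraic multiplicity 2, even though its eigenspace is only one-dimensional.

```python
import numpy as np

# The matrix [[2, 1], [0, 2]] has characteristic polynomial (λ − 2)²,
# so its spectrum is the multiset {2, 2}.
M = np.array([[2.0, 1.0],
              [0.0, 2.0]])
coeffs = np.poly(M)          # coefficients of det(λI − M): [1, -4, 4]
print(coeffs)
print(np.roots(coeffs))      # [2., 2.] — the spectrum as a multiset
print(np.linalg.eigvals(M))  # same multiset, computed directly
```

Values are exact here only up to floating-point rounding; np.roots may return values very slightly off 2.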
Now, fix a basis B of V over K and suppose M ∈ Mat_K(V) is a matrix. Define the linear map T : V → V pointwise by Tx = Mx, where on the right-hand side x is interpreted as a column vector and M acts on x by matrix multiplication. We now say that x ∈ V is an eigenvector of M if x is an eigenvector of T. Similarly, λ ∈ K is an eigenvalue of M if it is an eigenvalue of T, with the same multiplicity, and the spectrum of M, written σ_M, is the multiset of all such eigenvalues.
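The correspondence between M and the map T can be checked numerically; the short sketch below (an added illustration, with the matrix chosen arbitrarily) verifies that an eigenvector of M, in the sense above, satisfies Tx = λx for the map T defined by Tx = Mx.

```python
import numpy as np

M = np.array([[2.0, 1.0],
              [1.0, 2.0]])
lam, vecs = np.linalg.eig(M)
x = vecs[:, 0]                         # an eigenvector of M, as a column of vecs
T = lambda v: M @ v                    # the linear map T defined pointwise by Tx = Mx
print(np.allclose(T(x), lam[0] * x))   # True: Tx = λx, so x is an eigenvector of T
```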
The eigendecomposition (or spectral decomposition) of a diagonalizable matrix is its factorization into a canonical form in which the matrix is represented in terms of its eigenvalues and eigenvectors: A = V D V⁻¹, where the columns of V are eigenvectors of A and D is the diagonal matrix of the corresponding eigenvalues.
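The NumPy sketch below (an added illustration; the test matrix is arbitrary, chosen to have distinct eigenvalues and hence be diagonalizable) reconstructs a matrix from its eigendecomposition.

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])                       # eigenvalues 5 and 2, so diagonalizable
lam, V = np.linalg.eig(A)                        # columns of V are eigenvectors of A
D = np.diag(lam)                                 # eigenvalues on the diagonal
print(np.allclose(A, V @ D @ np.linalg.inv(V)))  # True: A = V D V⁻¹
```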
The spectral radius of a square matrix is the largest absolute value of its eigenvalues. In spectral theory, the spectral radius of a bounded linear operator is the supremum of the absolute values of the elements in the spectrum of that operator.
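For a square matrix this is a one-line computation; the sketch below (an added illustration, with a hypothetical helper name spectral_radius) uses a matrix with purely imaginary eigenvalues to emphasize that the absolute value of complex eigenvalues is what matters.

```python
import numpy as np

def spectral_radius(A):
    # Largest absolute value among the eigenvalues of A.
    return max(abs(np.linalg.eigvals(A)))

A = np.array([[0.0, -2.0],
              [1.0,  0.0]])
print(spectral_radius(A))  # eigenvalues are ±i√2, so the spectral radius is √2 ≈ 1.4142
```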