In linear algebra, a Hessenberg matrix is a special kind of square matrix, one that is "almost" triangular. To be exact, an upper Hessenberg matrix has zero entries below the first subdiagonal, and a lower Hessenberg matrix has zero entries above the first superdiagonal.[1] They are named after Karl Hessenberg.[2]
A Hessenberg decomposition is a matrix decomposition of a matrix $A$ into a unitary matrix $P$ and a Hessenberg matrix $H$ such that $PHP^* = A$, where $P^*$ denotes the conjugate transpose of $P$.
A square $n \times n$ matrix $A$ is said to be in upper Hessenberg form, or to be an upper Hessenberg matrix, if $a_{i,j} = 0$ for all $i,j$ with $i > j+1$.
An upper Hessenberg matrix is called unreduced if all subdiagonal entries are nonzero, i.e. if $a_{i+1,i} \ne 0$ for all $i \in \{1, \ldots, n-1\}$.
A square $n \times n$ matrix $A$ is said to be in lower Hessenberg form, or to be a lower Hessenberg matrix, if $a_{i,j} = 0$ for all $i,j$ with $j > i+1$.
A lower Hessenberg matrix is called unreduced if all superdiagonal entries are nonzero, i.e. if $a_{i,i+1} \ne 0$ for all $i \in \{1, \ldots, n-1\}$.
Consider the following matrices:
$$A = \begin{bmatrix} 1 & 4 & 2 & 3 \\ 3 & 4 & 1 & 7 \\ 0 & 2 & 3 & 4 \\ 0 & 0 & 1 & 3 \end{bmatrix}, \qquad B = \begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 3 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}, \qquad C = \begin{bmatrix} 1 & 2 & 0 & 0 \\ 5 & 2 & 0 & 0 \\ 3 & 4 & 3 & 7 \\ 5 & 6 & 1 & 1 \end{bmatrix}.$$
The matrix $A$ is an upper unreduced Hessenberg matrix, $B$ is a lower unreduced Hessenberg matrix, and $C$ is a lower Hessenberg matrix that is not unreduced, since its superdiagonal entry $c_{2,3}$ is zero.
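These zero-pattern conditions are easy to verify programmatically. The following is a minimal sketch in Python with NumPy; the function names are illustrative, not a library API:

```python
import numpy as np

def is_upper_hessenberg(M, unreduced=False):
    """True if M has zeros below the first subdiagonal (a_{ij} = 0 for
    i > j + 1); with unreduced=True, also require every subdiagonal
    entry a_{i+1,i} to be nonzero."""
    n = M.shape[0]
    if any(M[i, j] != 0 for i in range(n) for j in range(n) if i > j + 1):
        return False
    if unreduced and any(M[i + 1, i] == 0 for i in range(n - 1)):
        return False
    return True

def is_lower_hessenberg(M, unreduced=False):
    # M is lower Hessenberg exactly when its transpose is upper Hessenberg.
    return is_upper_hessenberg(M.T, unreduced)

A = np.array([[1, 4, 2, 3],
              [3, 4, 1, 7],
              [0, 2, 3, 4],
              [0, 0, 1, 3]])
print(is_upper_hessenberg(A, unreduced=True))  # True
```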
Many linear algebra algorithms require significantly less computational effort when applied to triangular matrices, and this improvement often carries over to Hessenberg matrices as well. If the constraints of a linear algebra problem do not allow a general matrix to be conveniently reduced to a triangular one, reduction to Hessenberg form is often the next best thing. In fact, reduction of any matrix to Hessenberg form can be achieved in a finite number of steps (for example, through Householder transformations, which are unitary similarity transforms). Subsequent reduction of the Hessenberg matrix to a triangular matrix is then achieved through iterative procedures, such as shifted QR factorization combined with deflation steps in eigenvalue algorithms. Reducing a general matrix first to Hessenberg form and only then to triangular form, rather than reducing it to triangular form directly, often economizes the arithmetic involved in the QR algorithm for eigenvalue problems.
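To illustrate the second stage: a shifted QR step preserves Hessenberg form, and repeating it drives the trailing subdiagonal entry toward zero, exposing an eigenvalue. A minimal NumPy sketch using the simple Rayleigh-quotient shift and no deflation (a production algorithm would deflate, handle complex shifts, and use structure-exploiting $O(n^2)$ QR steps rather than a dense factorization):

```python
import numpy as np

def shifted_qr_step(H):
    """One shifted QR step: H - sigma*I = QR, then H_next = RQ + sigma*I.
    H_next is orthogonally similar to H and stays Hessenberg."""
    sigma = H[-1, -1]                     # Rayleigh-quotient shift
    Q, R = np.linalg.qr(H - sigma * np.eye(len(H)))
    return R @ Q + sigma * np.eye(len(H))

H = np.array([[2., 1., 0.],
              [1., 3., 1.],
              [0., 1., 4.]])              # already (tridiagonal) Hessenberg
for _ in range(20):
    H = shifted_qr_step(H)
# The trailing subdiagonal entry H[-1, -2] collapses toward zero, so
# H[-1, -1] approaches an eigenvalue; a full algorithm would now
# deflate to the leading 2x2 block and continue.
```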
Any $n \times n$ matrix can be transformed into a Hessenberg matrix by a similarity transformation using Householder transformations, as follows.
Let $A$ be any real or complex $n \times n$ matrix. Let $A'$ be the $(n-1) \times n$ submatrix of $A$ obtained by deleting the first row of $A$, and let $a_1'$ be the first column of $A'$. Construct the $(n-1) \times (n-1)$ Householder matrix
$$V_1 = I_{n-1} - 2\,\frac{ww^*}{\|w\|^2},$$
where $w = a_1' - \|a_1'\|\,e_1$ (in floating-point practice the sign of the $\|a_1'\|\,e_1$ term is chosen opposite to that of the first entry of $a_1'$ to avoid cancellation, and in the complex case that entry's phase is used; either choice maps $a_1'$ to a multiple of $e_1$). This Householder matrix maps $a_1'$ to $\|a_1'\|\,e_1$, and so the block matrix
$$U_1 = \begin{bmatrix} 1 & 0 \\ 0 & V_1 \end{bmatrix}$$
maps the matrix $A$ to the matrix $U_1 A$, which has only zeros below the second entry of its first column.
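This first step can be sketched numerically as follows (NumPy; the sign in $w$ follows the cancellation-avoiding floating-point convention, so $a_1'$ is mapped to $\pm\|a_1'\|\,e_1$ rather than necessarily to $+\|a_1'\|\,e_1$):

```python
import numpy as np

A = np.array([[1., 2., 3., 4.],
              [5., 6., 7., 8.],
              [9., 10., 11., 12.],
              [13., 14., 15., 16.]])
n = A.shape[0]

a1 = A[1:, 0]                                 # first column of A' (A without row 1)
w = a1.copy()
w[0] += np.sign(w[0]) * np.linalg.norm(a1)    # sign chosen to avoid cancellation
V1 = np.eye(n - 1) - 2.0 * np.outer(w, w) / (w @ w)

U1 = np.eye(n)
U1[1:, 1:] = V1                               # block matrix diag(1, V1)
B = U1 @ A @ U1.T                             # A is real, so U1* = U1^T
print(np.round(B[:, 0], 10))                  # zeros below the second entry
```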
Now construct the $(n-2) \times (n-2)$ Householder matrix $V_2$ in the same manner as $V_1$, such that $V_2$ maps the first column $a_1''$ of $A''$ to $\|a_1''\|\,e_1$, where $A''$ is the submatrix of $A'$ obtained by deleting the first row and the first column of $A'$. Then let
$$U_2 = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 1 & 0 \\ 0 & 0 & V_2 \end{bmatrix},$$
which maps $U_1 A$ to the matrix $U_2 U_1 A$, which has only zeros below the subdiagonal in its first two columns. Now construct $V_3$ and then $U_3$ in the same manner from the submatrix $A'''$ of $A''$, and continue in this fashion for a total of $n-2$ steps.
By construction of $U_k$, the first $k$ columns of any $n \times n$ matrix are invariant under multiplication by $U_k^*$ from the right. Hence, any matrix can be transformed to an upper Hessenberg matrix by a similarity transformation of the form
$$U_{n-2}\bigl(\cdots\bigl(U_2\bigl(U_1 A U_1^*\bigr)U_2^*\bigr)\cdots\bigr)U_{n-2}^* = U_{n-2}\cdots U_2 U_1 A \bigl(U_{n-2}\cdots U_2 U_1\bigr)^* = U A U^*.$$
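The whole construction can be sketched as follows. This is an illustrative NumPy implementation of the procedure above, not a production routine (in practice one uses LAPACK's `gehrd`, exposed in SciPy as `scipy.linalg.hessenberg`, which avoids forming the $U_k$ explicitly):

```python
import numpy as np

def hessenberg_reduce(A):
    """Reduce A to upper Hessenberg form H = U A U* by n - 2 Householder
    similarity transforms. Returns (H, U) with U unitary."""
    A = np.array(A, dtype=complex)
    n = A.shape[0]
    U = np.eye(n, dtype=complex)
    H = A.copy()
    for k in range(n - 2):
        x = H[k + 1:, k]                 # column k, below the diagonal
        if np.allclose(x[1:], 0):
            continue                     # already in the required shape
        w = x.copy()
        # Phase (sign, in the real case) chosen to avoid cancellation.
        w[0] += np.exp(1j * np.angle(x[0])) * np.linalg.norm(x)
        V = np.eye(n - k - 1, dtype=complex) \
            - 2.0 * np.outer(w, w.conj()) / (w.conj() @ w)
        Uk = np.eye(n, dtype=complex)
        Uk[k + 1:, k + 1:] = V           # block matrix diag(I_{k+1}, V)
        H = Uk @ H @ Uk.conj().T         # similarity step
        U = Uk @ U                       # accumulate U = U_{n-2}...U_2 U_1
    return H, U
```

Right-multiplying by each $U_k^*$ leaves the first $k+1$ columns untouched, which is why the zeros created in earlier columns survive later steps.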
A Jacobi rotation (also called Givens rotation) is an orthogonal matrix transformation in the form
$$A \to A' = J(p,q,\theta)^T A\, J(p,q,\theta),$$
where $J(p,q,\theta)$ is the Jacobi rotation matrix with $p < q$, whose entries are all zero except for
$$\left\{\begin{aligned} J(p,q,\theta)_{ii} &= 1 \quad \forall i \ne p,q \\ J(p,q,\theta)_{pp} &= \cos(\theta) \\ J(p,q,\theta)_{qq} &= \cos(\theta) \\ J(p,q,\theta)_{pq} &= \sin(\theta) \\ J(p,q,\theta)_{qp} &= -\sin(\theta). \end{aligned}\right.$$
One can zero the matrix entry $A'_{p-1,q}$ by choosing the rotation angle $\theta$ to satisfy the equation
$$A_{p-1,p}\sin\theta + A_{p-1,q}\cos\theta = 0.$$
Now, the sequence of such Jacobi rotations with the following ordering of $(p,q)$ pairs,
$$(p,q) = (2,3), (2,4), \ldots, (2,n), (3,4), \ldots, (3,n), \ldots, (n-1,n),$$
reduces the matrix $A$ to lower Hessenberg form.
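The sweep can be sketched as follows (NumPy; `arctan2` picks an angle satisfying the zeroing equation, and each rotation is applied as an explicit dense matrix for clarity, which is wasteful but easy to follow):

```python
import numpy as np

def givens_lower_hessenberg(A):
    """Reduce A to lower Hessenberg form by Jacobi/Givens rotations in
    the order (p,q) = (2,3),...,(2,n),(3,4),...,(n-1,n); the (p,q)
    rotation zeros entry (p-1, q), which lies above the superdiagonal."""
    A = np.array(A, dtype=float)
    n = A.shape[0]
    for p in range(2, n):                  # 1-based p = 2 .. n-1
        for q in range(p + 1, n + 1):      # 1-based q = p+1 .. n
            i, j = p - 1, q - 1            # 0-based indices of columns p, q
            # Choose theta with A[p-1,p]*sin(theta) + A[p-1,q]*cos(theta) = 0
            # (1-based indexing as in the text).
            theta = np.arctan2(-A[p - 2, j], A[p - 2, i])
            c, s = np.cos(theta), np.sin(theta)
            J = np.eye(n)
            J[i, i] = c; J[j, j] = c
            J[i, j] = s; J[j, i] = -s
            A = J.T @ A @ J                # orthogonal similarity
    return A
```

Because each rotation only mixes rows and columns $p$ and $q$, the zeros created by earlier rotations in the sweep are preserved.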
For $n \in \{1,2\}$, it is vacuously true that every $n \times n$ matrix is both upper Hessenberg and lower Hessenberg.
The product of a Hessenberg matrix with a triangular matrix is again Hessenberg. More precisely, if $A$ is upper Hessenberg and $T$ is upper triangular, then $AT$ and $TA$ are upper Hessenberg.
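A quick numerical check of this closure property (NumPy; the offsets in `triu`/`tril` select the relevant zero patterns — `-1` keeps the first subdiagonal, `-2` tests everything below it):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
H = np.triu(rng.standard_normal((n, n)), -1)   # random upper Hessenberg
T = np.triu(rng.standard_normal((n, n)))       # random upper triangular

# Entries below the first subdiagonal of both products vanish.
print(np.allclose(np.tril(H @ T, -2), 0),
      np.allclose(np.tril(T @ H, -2), 0))      # True True
```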
A matrix that is both upper Hessenberg and lower Hessenberg is a tridiagonal matrix, of which the Jacobi matrix is an important example. This includes the symmetric and Hermitian Hessenberg matrices. A Hermitian matrix can be reduced to a tridiagonal real symmetric matrix.[6]
The Hessenberg operator is an infinite-dimensional Hessenberg matrix. It commonly occurs as the generalization of the Jacobi operator to a system of orthogonal polynomials for a Bergman space, the space of square-integrable holomorphic functions over some domain; in this case, the Hessenberg operator is the right-shift operator $S$, given by $[Sf](z) = z f(z)$. The eigenvalues of each principal submatrix of the Hessenberg operator are given by the characteristic polynomial of that submatrix. These polynomials are called the Bergman polynomials, and provide an orthogonal polynomial basis for the Bergman space.