In mathematics, the matrix exponential is a matrix function on square matrices analogous to the ordinary exponential function. It is used to solve systems of linear differential equations. In the theory of Lie groups, the matrix exponential gives the exponential map between a matrix Lie algebra and the corresponding Lie group.
Let $X$ be an $n \times n$ real or complex matrix. The exponential of $X$, denoted by $e^X$ or $\exp(X)$, is the $n \times n$ matrix given by the power series
$$e^X = \sum_{k=0}^{\infty} \frac{1}{k!} X^k,$$
where $X^0$ is defined to be the identity matrix $I$ with the same dimensions as $X$. Equivalently,
$$e^X = \lim_{k \to \infty} \left( I + \frac{X}{k} \right)^k,$$
where $I$ is the $n \times n$ identity matrix.
When $X$ is an $n \times n$ diagonal matrix, then $e^X$ will be an $n \times n$ diagonal matrix with each diagonal element equal to the ordinary exponential applied to the corresponding diagonal element of $X$.
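As an illustration (a minimal numerical sketch, not part of the standard treatment), the diagonal case can be checked with SciPy's general-purpose routine `scipy.linalg.expm`, assuming NumPy and SciPy are available:

```python
# Sketch: the exponential of a diagonal matrix is the elementwise
# exponential of its diagonal entries.
import numpy as np
from scipy.linalg import expm

D = np.diag([1.0, 2.0, -0.5])               # a diagonal matrix
exp_D = expm(D)                             # general matrix exponential
exp_D_direct = np.diag(np.exp(np.diag(D)))  # exp applied to each diagonal entry

assert np.allclose(exp_D, exp_D_direct)
print(exp_D)
```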
Let $X$ and $Y$ be $n \times n$ complex matrices and let $a$ and $b$ be arbitrary complex numbers. We denote the $n \times n$ identity matrix by $I$ and the zero matrix by 0. The matrix exponential satisfies the following properties.[2]
We begin with the properties that are immediate consequences of the definition as a power series:
$e^0 = I$
$\exp\left(X^T\right) = \left(\exp X\right)^T$, where $X^T$ denotes the transpose of $X$
$\exp\left(X^*\right) = \left(\exp X\right)^*$, where $X^*$ denotes the conjugate transpose of $X$
If $Y$ is invertible, then $e^{YXY^{-1}} = Y e^X Y^{-1}$
The next key result is this one:
If $XY = YX$, then $e^X e^Y = e^{X+Y}$.
The proof of this identity is the same as the standard power-series argument for the corresponding identity for the exponential of real numbers. That is to say, as long as $X$ and $Y$ commute, it makes no difference to the argument whether $X$ and $Y$ are numbers or matrices. It is important to note that this identity typically does not hold if $X$ and $Y$ do not commute.
Consequences of the preceding identity are the following:
$e^{aX} e^{bX} = e^{(a+b)X}$
$e^X e^{-X} = I$
Using the above results, we can easily verify the following claims. If $X$ is symmetric then $e^X$ is also symmetric, and if $X$ is skew-symmetric then $e^X$ is orthogonal. If $X$ is Hermitian then $e^X$ is also Hermitian, and if $X$ is skew-Hermitian then $e^X$ is unitary.
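Two of these claims can be spot-checked numerically; the sketch below (our own, using `scipy.linalg.expm` on a randomly generated matrix) verifies the symmetric and skew-symmetric cases:

```python
# Sketch: exp of a symmetric matrix is symmetric; exp of a
# skew-symmetric matrix is orthogonal.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

S = A + A.T            # symmetric
K = A - A.T            # skew-symmetric

assert np.allclose(expm(S), expm(S).T)   # e^S is symmetric

Q = expm(K)
assert np.allclose(Q.T @ Q, np.eye(4))   # e^K is orthogonal
```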
Finally, a Laplace transform of matrix exponentials amounts to the resolvent,
$$\int_0^\infty e^{-ts} e^{tX}\, dt = (sI - X)^{-1},$$
for all sufficiently large positive values of $s$.
See main article: Matrix differential equation.
One of the reasons for the importance of the matrix exponential is that it can be used to solve systems of linear ordinary differential equations. The solution of
$$\frac{d}{dt} y(t) = A y(t), \qquad y(0) = y_0,$$
where $A$ is a constant matrix and $y$ is a column vector, is given by
$$y(t) = e^{At} y_0.$$
The matrix exponential can also be used to solve the inhomogeneous equation
$$\frac{d}{dt} y(t) = A y(t) + z(t), \qquad y(0) = y_0.$$
See the section on applications below for examples.
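For the homogeneous case, a short sketch (ours, with an arbitrarily chosen matrix $A$) compares the matrix-exponential solution against a generic numerical integrator from SciPy:

```python
# Sketch: solving y' = A y, y(0) = y0 with the matrix exponential and
# cross-checking against scipy.integrate.solve_ivp.
import numpy as np
from scipy.linalg import expm
from scipy.integrate import solve_ivp

A = np.array([[0.0, 1.0],
              [-2.0, -3.0]])
y0 = np.array([1.0, 0.0])
t = 1.5

y_exact = expm(t * A) @ y0
sol = solve_ivp(lambda s, y: A @ y, (0.0, t), y0, rtol=1e-10, atol=1e-12)

print(y_exact, sol.y[:, -1])   # the two results agree to integrator tolerance
```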
There is no closed-form solution for differential equations of the form
$$\frac{d}{dt} y(t) = A(t) y(t), \qquad y(0) = y_0,$$
where $A$ is not constant, but the Magnus series gives the solution as an infinite sum.
By Jacobi's formula, for any complex square matrix $A$ the following trace identity holds:[3]
$$\det\left(e^A\right) = e^{\operatorname{tr}(A)}.$$
In addition to providing a computational tool, this formula demonstrates that a matrix exponential is always an invertible matrix. This follows from the fact that the right-hand side of the above equation is always non-zero, and so $\det\left(e^A\right) \neq 0$, which implies that $e^A$ must be invertible.
In the real-valued case, the formula also exhibits the map
$$\exp \colon M_n(\mathbb{R}) \to \mathrm{GL}(n, \mathbb{R})$$
to not be surjective, in contrast to the complex case discussed below. This follows from the fact that, for real-valued matrices, the right-hand side of the formula is always positive, while there exist invertible matrices with a negative determinant.
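Jacobi's identity is easy to confirm numerically; the following sketch (ours, using a randomly generated real matrix) does so:

```python
# Sketch: checking det(exp(A)) = exp(tr(A)).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(1)
A = rng.standard_normal((5, 5))

lhs = np.linalg.det(expm(A))
rhs = np.exp(np.trace(A))
assert np.isclose(lhs, rhs)
```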
The matrix exponential of a real symmetric matrix is positive definite. Let $S$ be an $n \times n$ real symmetric matrix and $x \in \mathbb{R}^n$ a column vector. Using the elementary properties of the matrix exponential and of symmetric matrices, we have
$$x^T e^S x = x^T e^{S/2} e^{S/2} x = x^T \left(e^{S/2}\right)^T e^{S/2} x = \left\| e^{S/2} x \right\|^2 \geq 0.$$
Since $e^{S/2}$ is invertible, equality holds only for $x = 0$, and so $x^T e^S x > 0$ for all non-zero $x$. Hence $e^S$ is positive definite.
For any real numbers (scalars) $x$ and $y$ we know that the exponential function satisfies $e^{x+y} = e^x e^y$. The same is true for commuting matrices. If matrices $X$ and $Y$ commute (meaning that $XY = YX$), then
$$e^{X+Y} = e^X e^Y.$$
However, for matrices that do not commute the above equality does not necessarily hold.
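A small sketch (ours, with hand-picked 2×2 matrices) shows both behaviours:

```python
# Sketch: e^(X+Y) = e^X e^Y holds for commuting matrices
# but generally fails otherwise.
import numpy as np
from scipy.linalg import expm

X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = 2.0 * X                                   # Y commutes with X
assert np.allclose(expm(X + Y), expm(X) @ expm(Y))

Z = np.array([[0.0, 0.0], [1.0, 0.0]])        # Z does not commute with X
print(np.allclose(expm(X + Z), expm(X) @ expm(Z)))  # False
```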
Even if $X$ and $Y$ do not commute, the exponential $e^{X+Y}$ can be computed by the Lie product formula[4]
$$e^{X+Y} = \lim_{n \to \infty} \left( e^{X/n} e^{Y/n} \right)^n.$$
Using a large finite $n$ to approximate the above is the basis of the Suzuki–Trotter expansion, often used in numerical time evolution.
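The convergence can be observed directly; the sketch below (ours, reusing the non-commuting pair from the previous example) shows the approximation error shrinking as $n$ grows:

```python
# Sketch: Lie product (Suzuki-Trotter) approximation of e^(X+Y).
import numpy as np
from scipy.linalg import expm

X = np.array([[0.0, 1.0], [0.0, 0.0]])
Y = np.array([[0.0, 0.0], [1.0, 0.0]])
exact = expm(X + Y)

for n in (1, 10, 100, 1000):
    step = expm(X / n) @ expm(Y / n)
    approx = np.linalg.matrix_power(step, n)
    print(n, np.linalg.norm(approx - exact))   # error decreases roughly like 1/n
```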
In the other direction, if $X$ and $Y$ are sufficiently small (but not necessarily commuting) matrices, we have
$$e^X e^Y = e^Z,$$
where $Z$ may be computed as a series in commutators of $X$ and $Y$ by means of the Baker–Campbell–Hausdorff formula:[5]
$$Z = X + Y + \frac{1}{2}[X, Y] + \frac{1}{12}[X, [X, Y]] - \frac{1}{12}[Y, [X, Y]] + \cdots,$$
where the remaining terms are all iterated commutators involving $X$ and $Y$. If $X$ and $Y$ commute, then all the commutators are zero and we have simply $Z = X + Y$.
See main article: Golden–Thompson inequality. For Hermitian matrices there is a notable theorem related to the trace of matrix exponentials.
If $A$ and $H$ are Hermitian matrices, then[6]
$$\operatorname{tr} e^{A+H} \leq \operatorname{tr}\left(e^A e^H\right).$$
There is no requirement of commutativity. There are counterexamples to show that the Golden–Thompson inequality cannot be extended to three matrices, and, in any event, $\operatorname{tr}\left(e^A e^B e^C\right)$ is not guaranteed to be real for Hermitian $A$, $B$, $C$. However, Lieb proved[7] [8] that it can be generalized to three matrices if we modify the expression as follows
$$\operatorname{tr} e^{A+B+C} \leq \int_0^\infty \operatorname{tr}\left[ e^A \left( e^{-B} + t \right)^{-1} e^C \left( e^{-B} + t \right)^{-1} \right] dt.$$
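The two-matrix inequality itself is easy to test numerically; the sketch below (ours, on randomly generated Hermitian matrices) checks it:

```python
# Sketch: checking tr e^(A+B) <= tr(e^A e^B) for Hermitian A, B.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(2)
M = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
N = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (M + M.conj().T) / 2     # Hermitian
B = (N + N.conj().T) / 2     # Hermitian

lhs = np.trace(expm(A + B)).real
rhs = np.trace(expm(A) @ expm(B)).real
assert lhs <= rhs + 1e-10
```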
The exponential of a matrix is always an invertible matrix. The inverse matrix of $e^X$ is given by $e^{-X}$. This is analogous to the fact that the exponential of a complex number is always nonzero. The matrix exponential then gives us a map
$$\exp \colon M_n(\mathbb{C}) \to \mathrm{GL}(n, \mathbb{C})$$
from the space of all n×n matrices to the general linear group of degree $n$, i.e. the group of all n×n invertible matrices. In fact, this map is surjective, which means that every invertible matrix can be written as the exponential of some other matrix[9] (for this, it is essential to consider the field C of complex numbers and not R).
For any two matrices $X$ and $Y$,
$$\left\| e^{X+Y} - e^X \right\| \leq \|Y\|\, e^{\|X\|} e^{\|Y\|},$$
where $\| \cdot \|$ denotes an arbitrary matrix norm. It follows that the exponential map is continuous and Lipschitz continuous on compact subsets of $M_n(\mathbb{C})$.
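A quick numerical sanity check of this bound (ours, using the spectral norm and a small random perturbation) looks as follows:

```python
# Sketch: checking ||e^(X+Y) - e^X|| <= ||Y|| e^||X|| e^||Y|| in the spectral norm.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(3)
X = rng.standard_normal((4, 4))
Y = 0.1 * rng.standard_normal((4, 4))

lhs = np.linalg.norm(expm(X + Y) - expm(X), 2)
rhs = np.linalg.norm(Y, 2) * np.exp(np.linalg.norm(X, 2)) * np.exp(np.linalg.norm(Y, 2))
assert lhs <= rhs
```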
The map
$$t \mapsto e^{tX}, \qquad t \in \mathbb{R},$$
defines a smooth curve in the general linear group which passes through the identity element at $t = 0$.
In fact, this gives a one-parameter subgroup of the general linear group since
$$e^{tX} e^{sX} = e^{(t+s)X}.$$
The derivative of this curve (or tangent vector) at a point $t$ is given by
$$\frac{d}{dt} e^{tX} = X e^{tX} = e^{tX} X.$$
The derivative at $t = 0$ is just the matrix $X$, which is to say that $X$ generates this one-parameter subgroup.
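The derivative formula can be verified with a central finite difference; the sketch below (ours, with an arbitrary random matrix) does this:

```python
# Sketch: finite-difference check that d/dt e^(tX) = X e^(tX).
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(4)
X = rng.standard_normal((3, 3))
t, h = 0.7, 1e-6

numeric = (expm((t + h) * X) - expm((t - h) * X)) / (2 * h)
analytic = X @ expm(t * X)
print(np.linalg.norm(numeric - analytic))   # small, O(h^2) discretization error
```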
More generally,[10] for a generic $t$-dependent exponent, $X(t)$,
$$\frac{d}{dt} e^{X(t)} = \int_0^1 e^{\alpha X(t)} \frac{dX(t)}{dt} e^{(1-\alpha) X(t)}\, d\alpha.$$
Taking $e^{X(t)}$ outside the integral sign and expanding the integrand with the help of the Hadamard lemma, one can obtain the following useful expression for the derivative of the matrix exponent,[11]
$$\left( \frac{d}{dt} e^{X(t)} \right) e^{-X(t)} = \frac{d}{dt} X(t) + \frac{1}{2!} \left[ X(t), \frac{d}{dt} X(t) \right] + \frac{1}{3!} \left[ X(t), \left[ X(t), \frac{d}{dt} X(t) \right] \right] + \cdots$$
The coefficients in the expression above are different from what appears in the exponential. For a closed form, see derivative of the exponential map.
Let $X$ be an $n \times n$ Hermitian matrix with distinct eigenvalues. Let $X = E\,\mathrm{diag}(\Lambda)\,E^*$ be its eigendecomposition, where $E$ is a unitary matrix whose columns are the eigenvectors of $X$, $E^*$ is its conjugate transpose, and $\Lambda = \left( \lambda_1, \ldots, \lambda_n \right)$ the corresponding eigenvalues. Then, for any $n \times n$ Hermitian matrix $V$, the directional derivative of $\exp \colon X \to e^X$ at $X$ in the direction $V$ is
$$D\exp(X)[V] \equiv \lim_{\epsilon \to 0} \frac{1}{\epsilon} \left( e^{X + \epsilon V} - e^X \right) = E\,(G \odot \bar{V})\,E^*,$$
where $\bar{V} = E^* V E$, the operator $\odot$ denotes the Hadamard product, and, for all $1 \leq i, j \leq n$, the matrix $G$ is defined as
$$G_{i,j} = \begin{cases} \dfrac{e^{\lambda_i} - e^{\lambda_j}}{\lambda_i - \lambda_j} & \text{if } i \neq j, \\ e^{\lambda_i} & \text{otherwise.} \end{cases}$$
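A direct implementation of this formula can be compared against a finite-difference approximation of the directional derivative. The sketch below is ours (the helper `random_hermitian` and all variable names are illustrative, not from the article) and uses NumPy's `eigh` for the eigendecomposition:

```python
# Sketch: directional derivative of exp at a Hermitian X in direction V,
# via E (G ⊙ V̄) E*, checked against central finite differences.
import numpy as np
from scipy.linalg import expm

rng = np.random.default_rng(5)

def random_hermitian(n):
    M = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
    return (M + M.conj().T) / 2

n = 4
X = random_hermitian(n)
V = random_hermitian(n)

lam, E = np.linalg.eigh(X)            # X = E diag(lam) E*
V_bar = E.conj().T @ V @ E            # V̄ = E* V E

# G: divided differences of exp at the eigenvalues
G = np.empty((n, n))
for i in range(n):
    for j in range(n):
        if np.isclose(lam[i], lam[j]):
            G[i, j] = np.exp(lam[i])
        else:
            G[i, j] = (np.exp(lam[i]) - np.exp(lam[j])) / (lam[i] - lam[j])

D = E @ (G * V_bar) @ E.conj().T      # E (G ⊙ V̄) E*

eps = 1e-6
D_fd = (expm(X + eps * V) - expm(X - eps * V)) / (2 * eps)
print(np.linalg.norm(D - D_fd))       # small residual, limited by finite differencing
```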