In matrix calculus, Jacobi's formula expresses the derivative of the determinant of a matrix A in terms of the adjugate of A and the derivative of A.[1]
If $A(t)$ is a differentiable map from the real numbers to $n \times n$ matrices, then
$$\frac{d}{dt}\det A(t)=\operatorname{tr}\left(\operatorname{adj}(A(t))\,\frac{dA(t)}{dt}\right)=\left(\det A(t)\right)\cdot\operatorname{tr}\left(A(t)^{-1}\cdot\frac{dA(t)}{dt}\right),$$
where the latter equality holds only if $A(t)$ is invertible.
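The identity above can be sanity-checked numerically. The sketch below (NumPy assumed; the linear path $A(t)=B+tC$ is an illustrative choice) compares a centered finite difference of $\det A(t)$ with $\operatorname{tr}(\operatorname{adj}(A)\,dA/dt)$:

```python
import numpy as np

# Numerical check of Jacobi's formula:
# d/dt det A(t) = tr(adj(A(t)) dA/dt) for a differentiable path A(t).
rng = np.random.default_rng(0)
B, C = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))

A = lambda t: B + t * C          # a simple differentiable matrix path; dA/dt = C
t, h = 0.7, 1e-6

# Left side: centered finite difference of det A(t)
lhs = (np.linalg.det(A(t + h)) - np.linalg.det(A(t - h))) / (2 * h)

# Right side: adjugate via adj(A) = det(A) * A^{-1}, valid since A(t) is
# invertible here (a random 3x3 matrix is singular only on a null set).
At = A(t)
adj = np.linalg.det(At) * np.linalg.inv(At)
rhs = np.trace(adj @ C)

assert abs(lhs - rhs) < 1e-5
```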
As a special case,
$$\frac{\partial\det(A)}{\partial A_{ij}}=\operatorname{adj}(A)_{ji}.$$
Equivalently, if $dA$ stands for the differential of $A$, the general formula is
$$d\det(A)=\operatorname{tr}(\operatorname{adj}(A)\,dA).$$
The formula is named after the mathematician Carl Gustav Jacob Jacobi.
We first prove a preliminary lemma:
Lemma. Let A and B be a pair of square matrices of the same dimension n. Then
$$\sum_i\sum_j A_{ij}B_{ij}=\operatorname{tr}(A^{\rm T}B).$$
Proof. The product AB of the pair of matrices has components
$$(AB)_{jk}=\sum_i A_{ji}B_{ik}.$$
Replacing the matrix A by its transpose AT is equivalent to permuting the indices of its components:
$$(A^{\rm T}B)_{jk}=\sum_i A_{ij}B_{ik}.$$
The result follows by taking the trace of both sides:
$$\operatorname{tr}(A^{\rm T}B)=\sum_j(A^{\rm T}B)_{jj}=\sum_j\sum_i A_{ij}B_{ij}=\sum_i\sum_j A_{ij}B_{ij}.\ \square$$
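The lemma is easy to verify numerically; a quick NumPy check (random matrices are an illustrative choice):

```python
import numpy as np

# Check the lemma: sum_ij A_ij B_ij = tr(A^T B).
rng = np.random.default_rng(1)
A, B = rng.standard_normal((4, 4)), rng.standard_normal((4, 4))

lhs = np.sum(A * B)              # elementwise product, summed over all i, j
rhs = np.trace(A.T @ B)

assert abs(lhs - rhs) < 1e-10
```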
Theorem. (Jacobi's formula) For any differentiable map A from the real numbers to n × n matrices,
$$d\det(A)=\operatorname{tr}(\operatorname{adj}(A)\,dA).$$
Proof. Laplace's formula for the determinant of a matrix A can be stated as
$$\det(A)=\sum_j A_{ij}\operatorname{adj}^{\rm T}(A)_{ij}.$$
Notice that the summation is performed over some arbitrary row i of the matrix.
The determinant of A can be considered to be a function of the elements of A:
$$\det(A)=F(A_{11},A_{12},\ldots,A_{21},A_{22},\ldots,A_{nn}),$$
so that, by the chain rule, its differential is
$$d\det(A)=\sum_i\sum_j\frac{\partial F}{\partial A_{ij}}\,dA_{ij}.$$
This summation is performed over all n×n elements of the matrix.
To find $\partial F/\partial A_{ij}$, consider that on the right hand side of Laplace's formula the index $i$ can be chosen at will. (Any choice eventually yields the same result, but some make the calculation much harder.) In particular, $i$ can be chosen to match the first index of $\partial/\partial A_{ij}$:
$$\frac{\partial\det(A)}{\partial A_{ij}}=\frac{\partial\sum_k A_{ik}\operatorname{adj}^{\rm T}(A)_{ik}}{\partial A_{ij}}.$$
Thus, by the product rule,
$$\frac{\partial\det(A)}{\partial A_{ij}}=\sum_k\frac{\partial A_{ik}}{\partial A_{ij}}\operatorname{adj}^{\rm T}(A)_{ik}+\sum_k A_{ik}\frac{\partial\operatorname{adj}^{\rm T}(A)_{ik}}{\partial A_{ij}}.$$
Now, if an element of a matrix Aij and a cofactor adjT(A)ik of element Aik lie on the same row (or column), then the cofactor will not be a function of Aij, because the cofactor of Aik is expressed in terms of elements not in its own row (nor column). Thus,
$$\frac{\partial\operatorname{adj}^{\rm T}(A)_{ik}}{\partial A_{ij}}=0,$$
so
$$\frac{\partial\det(A)}{\partial A_{ij}}=\sum_k\operatorname{adj}^{\rm T}(A)_{ik}\frac{\partial A_{ik}}{\partial A_{ij}}.$$
All the elements of A are independent of each other, i.e.
$$\frac{\partial A_{ik}}{\partial A_{ij}}=\delta_{jk},$$
where δ is the Kronecker delta, so
$$\frac{\partial\det(A)}{\partial A_{ij}}=\sum_k\operatorname{adj}^{\rm T}(A)_{ik}\delta_{jk}=\operatorname{adj}^{\rm T}(A)_{ij}.$$
Therefore,
$$d(\det(A))=\sum_i\sum_j\operatorname{adj}^{\rm T}(A)_{ij}\,dA_{ij},$$
and applying the Lemma yields
$$d(\det(A))=\operatorname{tr}(\operatorname{adj}(A)\,dA).\ \square$$
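The entrywise partial derivatives $\partial\det(A)/\partial A_{ij}=\operatorname{adj}^{\rm T}(A)_{ij}$ derived above can be checked directly with finite differences; a minimal NumPy sketch, assuming an invertible test matrix:

```python
import numpy as np

# Check that the gradient of det(A) with respect to each entry A_ij
# equals the transposed adjugate, entry by entry.
rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))
adj = np.linalg.det(A) * np.linalg.inv(A)   # adjugate of an invertible A

h = 1e-6
grad = np.zeros_like(A)
for i in range(3):
    for j in range(3):
        E = np.zeros_like(A)
        E[i, j] = h
        # centered finite difference in the (i, j) direction
        grad[i, j] = (np.linalg.det(A + E) - np.linalg.det(A - E)) / (2 * h)

assert np.allclose(grad, adj.T, atol=1e-5)
```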
Lemma 1. $\det'(I)=\operatorname{tr}$, where $\det'$ denotes the differential of $\det$.

This equation means that the differential of $\det$, evaluated at the identity matrix, is equal to the trace. The differential $\det'(I)$ is a linear operator that maps an $n\times n$ matrix to a real number.
Proof. Using the definition of a directional derivative together with one of its basic properties for differentiable functions, we have
$$\det'(I)(T)=\nabla_T\det(I)=\lim_{\varepsilon\to 0}\frac{\det(I+\varepsilon T)-\det I}{\varepsilon}.$$
$\det(I+\varepsilon T)$ is a polynomial in $\varepsilon$ of degree $n$, closely related to the characteristic polynomial of $T$. Its constant term (the value at $\varepsilon=0$) is $1$, while its linear term in $\varepsilon$ is $\operatorname{tr} T$. $\square$
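The key step, that the linear term of $\det(I+\varepsilon T)$ in $\varepsilon$ is $\operatorname{tr} T$, can be observed numerically for a small $\varepsilon$ (a quick NumPy sketch):

```python
import numpy as np

# The first-order expansion det(I + eps*T) ≈ 1 + eps*tr(T) for small eps,
# so (det(I + eps*T) - 1)/eps should approximate tr(T).
rng = np.random.default_rng(3)
T = rng.standard_normal((4, 4))
eps = 1e-7

approx = (np.linalg.det(np.eye(4) + eps * T) - 1.0) / eps
assert abs(approx - np.trace(T)) < 1e-4
```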
Lemma 2. For an invertible matrix A, we have:
$$\det'(A)(T)=\det A\,\operatorname{tr}(A^{-1}T).$$
Proof. Consider the following function of $X$:
$$\det X=\det(AA^{-1}X)=\det(A)\,\det(A^{-1}X).$$
We calculate the differential of $\det X$ and evaluate it at $X=A$ using Lemma 1 and the chain rule:
$$\det'(A)(T)=\det A\,\det'(I)(A^{-1}T)=\det A\,\operatorname{tr}(A^{-1}T).\ \square$$
Theorem. (Jacobi's formula)
$$\frac{d}{dt}\det A=\operatorname{tr}\left(\operatorname{adj} A\,\frac{dA}{dt}\right).$$
Proof. If $A$ is invertible, then by Lemma 2, with $T=dA/dt$,
$$\frac{d}{dt}\det A=\det A\,\operatorname{tr}\left(A^{-1}\frac{dA}{dt}\right)=\operatorname{tr}\left(\operatorname{adj} A\,\frac{dA}{dt}\right),$$
using the equation relating the adjugate of $A$ to $A^{-1}$.
Both sides of the Jacobi formula are polynomials in the matrix coefficients of $A$ and $A'$. It is therefore sufficient to verify the polynomial identity on the dense subset where the eigenvalues of $A$ are distinct and nonzero.
If $A$ factors differentiably as $A=BC$, then
$$\operatorname{tr}(A^{-1}A')=\operatorname{tr}((BC)^{-1}(BC)')=\operatorname{tr}(B^{-1}B')+\operatorname{tr}(C^{-1}C').$$
In particular, if $L$ is invertible, then $I=L^{-1}L$ and
$$0=\operatorname{tr}(I^{-1}I')=\operatorname{tr}(L(L^{-1})')+\operatorname{tr}(L^{-1}L').$$
Since $A$ has distinct eigenvalues, there exists a differentiable complex invertible matrix $L$ such that $A=L^{-1}DL$ with $D$ diagonal. Then
$$\operatorname{tr}(A^{-1}A')=\operatorname{tr}(L(L^{-1})')+\operatorname{tr}(D^{-1}D')+\operatorname{tr}(L^{-1}L')=\operatorname{tr}(D^{-1}D').$$
If $\lambda_i$, $i=1,\ldots,n$, are the eigenvalues of $A$, i.e. the diagonal entries of $D$, then
$$\frac{\det(A)'}{\det(A)}=\sum_{i=1}^n\lambda_i'/\lambda_i=\operatorname{tr}(D^{-1}D')=\operatorname{tr}(A^{-1}A'),$$
which is the Jacobi formula for matrices $A$ with distinct nonzero eigenvalues.
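For the diagonal factor, the identity $\sum_i \lambda_i'/\lambda_i = \operatorname{tr}(D^{-1}D')$ is immediate and easy to check; a minimal NumPy sketch with illustrative eigenvalue values:

```python
import numpy as np

# For a diagonal path D(t) with entries lambda_i(t), the logarithmic
# derivative of the determinant is sum(lambda_i'/lambda_i) = tr(D^{-1} D').
lam  = np.array([1.5, -2.0, 0.75])    # lambda_i(t) at some fixed t (illustrative)
dlam = np.array([0.3,  1.1, -0.4])    # lambda_i'(t) (illustrative)

D, dD = np.diag(lam), np.diag(dlam)
lhs = np.sum(dlam / lam)                       # sum of lambda_i'/lambda_i
rhs = np.trace(np.linalg.inv(D) @ dD)          # tr(D^{-1} D')

assert abs(lhs - rhs) < 1e-12
```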
The following is a useful relation connecting the trace to the determinant of the associated matrix exponential:
$$\det e^{tB}=e^{t\,\operatorname{tr}(B)}.$$
This statement is clear for diagonal matrices, and a proof of the general claim follows.
For any invertible matrix $A(t)$, we showed in the previous section that
$$\frac{d}{dt}\det A(t)=\det A(t)\,\operatorname{tr}\left(A(t)^{-1}\frac{d}{dt}A(t)\right).$$
Considering $A(t)=\exp(tB)$ in this equation yields
$$\frac{d}{dt}\det e^{tB}=\operatorname{tr}(B)\,\det e^{tB}.$$
The desired result follows as the solution to this ordinary differential equation, with initial condition $\det e^{0\cdot B}=\det I=1$.
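The corollary $\det e^{tB}=e^{t\,\operatorname{tr}(B)}$ can be checked at a sample point. The sketch below uses a simple truncated-Taylor matrix exponential to stay self-contained (an illustrative implementation, adequate for the small norms used here):

```python
import numpy as np

def expm(M, terms=30):
    # Truncated Taylor series for the matrix exponential; accurate
    # enough for small ||M||, which is all this check needs.
    out, term = np.eye(len(M)), np.eye(len(M))
    for k in range(1, terms):
        term = term @ M / k
        out = out + term
    return out

rng = np.random.default_rng(4)
B = rng.standard_normal((3, 3))
t = 0.5

# Corollary: det(e^{tB}) = e^{t tr(B)}
assert np.isclose(np.linalg.det(expm(t * B)), np.exp(t * np.trace(B)))
```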
Several forms of the formula underlie the Faddeev–LeVerrier algorithm for computing the characteristic polynomial, and explicit applications of the Cayley–Hamilton theorem. For example, starting from the following equation, which was proved above:
$$\frac{d}{dt}\det A(t)=\det A(t)\,\operatorname{tr}\left(A(t)^{-1}\frac{d}{dt}A(t)\right),$$
taking $A(t)=tI-B$ gives
$$\frac{d}{dt}\det(tI-B)=\det(tI-B)\,\operatorname{tr}\left[(tI-B)^{-1}\right]=\operatorname{tr}\left[\operatorname{adj}(tI-B)\right].$$
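This derivative-of-the-characteristic-polynomial identity can also be checked numerically; a minimal NumPy sketch, with $t$ chosen well away from the eigenvalues of $B$ so that $tI-B$ is invertible:

```python
import numpy as np

# Check d/dt det(tI - B) = tr(adj(tI - B)) via a centered finite difference.
rng = np.random.default_rng(5)
B = rng.standard_normal((3, 3))
t, h = 10.0, 1e-6    # t far outside the spectrum of a random 3x3 B

M = t * np.eye(3) - B
adj = np.linalg.det(M) * np.linalg.inv(M)   # adjugate of the invertible tI - B

lhs = (np.linalg.det((t + h) * np.eye(3) - B)
       - np.linalg.det((t - h) * np.eye(3) - B)) / (2 * h)

assert abs(lhs - np.trace(adj)) < 1e-4
```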