In [[matrix calculus]], '''Jacobi's formula''' expresses the [[derivative]] of the [[determinant]] of a matrix ''A'' in terms of the [[adjugate]] of ''A'' and the derivative of ''A''.<ref>{{harvtxt|Magnus|Neudecker|1999}}, Part Three, Section 8.3</ref> If ''A'' is a differentiable map from the real numbers to ''n'' × ''n'' matrices, then

:<math> \frac{d}{dt} \det A(t) = \mathrm{tr} \left(\mathrm{adj}(A(t)) \, \frac{dA(t)}{dt}\right).</math>

Equivalently, if ''dA'' stands for the [[differential (infinitesimal)|differential]] of ''A'', the formula is

:<math> d \det (A) = \mathrm{tr} (\mathrm{adj}(A) \, dA).</math>

It is named after the mathematician [[Carl Gustav Jacob Jacobi|C.G.J. Jacobi]].
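For example, the formula can be checked directly in the 2 × 2 case: for

:<math>A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}, \qquad \mathrm{adj}(A) = \begin{pmatrix} d & -b \\ -c & a \end{pmatrix},</math>

one has

:<math>\mathrm{tr} (\mathrm{adj}(A) \, dA) = d\,da - b\,dc - c\,db + a\,dd = d(ad - bc) = d \det(A).</math>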
==Derivation==

We first prove a preliminary lemma:

'''Lemma.''' Let ''A'' and ''B'' be a pair of square matrices of the same dimension ''n''. Then

:<math>\sum_i \sum_j A_{ij} B_{ij} = \mathrm{tr} (A^{\rm T} B).</math>

''Proof.'' The product ''AB'' of the pair of matrices has components

:<math>(AB)_{jk} = \sum_i A_{ji} B_{ik}.</math>

Replacing the matrix ''A'' by its [[transpose]] ''A''<sup>T</sup> is equivalent to permuting the indices of its components:

:<math>(A^{\rm T} B)_{jk} = \sum_i A_{ij} B_{ik}.</math>

The result follows by taking the trace of both sides:

:<math>\mathrm{tr} (A^{\rm T} B) = \sum_j (A^{\rm T} B)_{jj} = \sum_j \sum_i A_{ij} B_{ij} = \sum_i \sum_j A_{ij} B_{ij}.\ \square</math>
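In other words, the lemma states that the [[Frobenius inner product]] of two square matrices ''A'' and ''B'' equals the trace of ''A''<sup>T</sup>''B''.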
'''Theorem.''' (Jacobi's formula) For any differentiable map ''A'' from the real numbers to ''n'' × ''n'' matrices,

:<math>d \det (A) = \mathrm{tr} (\mathrm{adj}(A) \, dA).</math>

''Proof.'' [[Laplace expansion|Laplace's formula]] for the determinant of a matrix ''A'' can be stated as

:<math>\det(A) = \sum_j A_{ij} \mathrm{adj}^{\rm T} (A)_{ij}.</math>

Notice that the summation is performed over some arbitrary row ''i'' of the matrix.

The determinant of ''A'' can be considered to be a function of the elements of ''A'':

:<math>\det(A) = F\,(A_{11}, A_{12}, \ldots , A_{21}, A_{22}, \ldots , A_{nn}),</math>

so that, by the [[chain rule]], its differential is

:<math>d \det(A) = \sum_i \sum_j {\partial F \over \partial A_{ij}} \,dA_{ij}.</math>

This summation is performed over all ''n''×''n'' elements of the matrix.
To find ∂''F''/∂''A''<sub>''ij''</sub>, consider that on the right-hand side of Laplace's formula the index ''i'' can be chosen at will: any choice eventually yields the same result, but some choices make the calculation much harder than others. In particular, ''i'' can be chosen to match the first index of ∂ / ∂''A''<sub>''ij''</sub>:

:<math>{\partial \det(A) \over \partial A_{ij}} = {\partial \sum_k A_{ik} \mathrm{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}} = \sum_k {\partial (A_{ik} \mathrm{adj}^{\rm T}(A)_{ik}) \over \partial A_{ij}}.</math>

Thus, by the product rule,

:<math>{\partial \det(A) \over \partial A_{ij}} = \sum_k {\partial A_{ik} \over \partial A_{ij}} \mathrm{adj}^{\rm T}(A)_{ik} + \sum_k A_{ik} {\partial \, \mathrm{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}}.</math>

Now, the [[minor (linear algebra)|cofactor]] adj<sup>T</sup>(''A'')<sub>''ik''</sub> of the element ''A''<sub>''ik''</sub> is expressed in terms of elements of ''A'' lying outside row ''i'' and column ''k''. Since ''A''<sub>''ij''</sub> lies in row ''i'', the cofactor is not a function of ''A''<sub>''ij''</sub> for any ''k''. Thus,

:<math>{\partial \, \mathrm{adj}^{\rm T}(A)_{ik} \over \partial A_{ij}} = 0,</math>

so

:<math>{\partial \det(A) \over \partial A_{ij}} = \sum_k \mathrm{adj}^{\rm T}(A)_{ik} {\partial A_{ik} \over \partial A_{ij}}.</math>
All the elements of ''A'' are independent of each other, i.e.

:<math>{\partial A_{ik} \over \partial A_{ij}} = \delta_{jk},</math>

where ''δ'' is the [[Kronecker delta]], so

:<math>{\partial \det(A) \over \partial A_{ij}} = \sum_k \mathrm{adj}^{\rm T}(A)_{ik} \delta_{jk} = \mathrm{adj}^{\rm T}(A)_{ij}.</math>

Therefore,

:<math>d(\det(A)) = \sum_i \sum_j \mathrm{adj}^{\rm T}(A)_{ij} \,d A_{ij},</math>

and applying the Lemma yields

:<math>d(\det(A)) = \mathrm{tr}(\mathrm{adj}(A) \,dA).\ \square</math>
==Corollary==

For any [[invertible matrix]] ''A'', the inverse ''A''<sup>−1</sup> is related to the adjugate by ''A''<sup>−1</sup> = (det ''A'')<sup>−1</sup> adj ''A''. It follows that if ''A''(''t'') is invertible for all ''t'', then

:<math>\frac{d}{dt} \det A(t) = (\det A(t)) \, \mathrm{tr} \left(A(t)^{-1} \, \frac{d}{dt} A(t)\right).</math>
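As a check, take ''A''(''t'') = ''tI'' with ''t'' ≠ 0: both sides equal ''nt''<sup>''n''−1</sup>, since

:<math>\frac{d}{dt} \det(tI) = \frac{d}{dt} t^n = n t^{n-1} \qquad \text{and} \qquad (\det tI) \, \mathrm{tr}\left((tI)^{-1} \, \tfrac{d}{dt}(tI)\right) = t^n \, \mathrm{tr}\left(t^{-1} I\right) = n t^{n-1}.</math>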
Furthermore, taking ''A''(''t'') = exp(''tB''), so that ''dA''/''dt'' = ''B'' exp(''tB'') and hence ''A''(''t'')<sup>−1</sup> ''dA''/''dt'' = ''B'', this reduces to

:<math>\frac{d}{dt} \det e^{tB} = \mathrm{tr}(B) \det e^{tB}, </math>

solved by

{{Equation box 1
|indent =:
|equation = <math> \det e^{tB} = e^{\mathrm{tr} \left(tB\right)}, </math>
|cellpadding= 6
|border
|border colour = #0073CF
|bgcolor=#F9FFF7}}

a useful relation connecting the [[Trace (linear algebra)|trace]] to the determinant of the associated [[matrix exponential]].
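For example, if ''B'' is upper triangular with diagonal entries ''λ''<sub>1</sub>, …, ''λ''<sub>''n''</sub>, then ''e''<sup>''tB''</sup> is upper triangular with diagonal entries ''e''<sup>''tλ''<sub>''i''</sub></sup>, and

:<math>\det e^{tB} = e^{t\lambda_1} \cdots e^{t\lambda_n} = e^{t(\lambda_1 + \cdots + \lambda_n)} = e^{\mathrm{tr}(tB)}.</math>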
==Alternative derivation==

A quicker proof of Jacobi's formula is as follows. By the chain rule,

:<math>\frac{d}{dt}\det\left(A\left(t\right)\right)=\left(\nabla\det\left(A\left(t\right)\right)\right):\left(\frac{d}{dt}A\left(t\right)\right)=\mathrm{tr}\left(\mathrm{adj}\left(A\left(t\right)\right)\frac{d}{dt}A\left(t\right)\right),</math>

where (:) denotes the tensor double-contraction; the second equality uses the gradient formula ∂ det(''A'')/∂''A''<sub>''ij''</sub> = adj<sup>T</sup>(''A'')<sub>''ij''</sub> established in the derivation above, together with the Lemma.
==Notes==

{{Reflist}}

==References==

* {{citation|first1=Jan R.|last1=Magnus|first2=Heinz|last2=Neudecker|title=Matrix Differential Calculus with Applications in Statistics and Econometrics|publisher=Wiley|year=1999|isbn=0-471-98633-X}}
* {{citation|last=Bellman|first=Richard|year=1987|title=Introduction to Matrix Analysis|publisher=SIAM|isbn=0898713994}}

{{DEFAULTSORT:Jacobi's Formula}}

[[Category:Determinants]]
[[Category:Matrix theory]]
[[Category:Articles containing proofs]]