[[File:Jordan blocks.svg|right|thumb|250px|An example of a matrix in Jordan normal form. The grey blocks are called Jordan blocks.]]
In [[linear algebra]], a '''Jordan normal form''' (often called '''Jordan canonical form''')<ref>
Shilov defines the term ''Jordan canonical form'' and in a footnote says that ''Jordan normal form'' is synonymous.
These terms are sometimes shortened to ''Jordan form''. (Shilov)
The term ''Classical canonical form'' is also sometimes used in the sense of this article. (James & James, 1976)
</ref>
of a [[linear operator]] on a [[finite-dimensional]] [[vector space]] is an [[upper triangular matrix]] of a particular form called a [[Jordan matrix]], representing the operator on some [[Basis (linear algebra)|basis]]. The form is characterized by the condition that any non-diagonal entries that are non-zero must be equal to&nbsp;1, be immediately above the main diagonal (on the [[superdiagonal]]), and have identical diagonal entries to the left and below them. If the vector space is over a [[field (mathematics)|field]] ''K'', then a basis on which the matrix has the required form exists [[if and only if]] all [[eigenvalue]]s of the operator lie in ''K'', or equivalently if the [[characteristic polynomial]] of the operator splits into linear factors over ''K''. This condition is always satisfied if ''K'' is the field of [[complex number]]s. The diagonal entries of the normal form are the eigenvalues of the operator, with the number of times each one occurs being given by its [[algebraic multiplicity]].
 
If the operator is originally given by a [[square matrix]] ''M'', then its Jordan normal form is also called the Jordan normal form of ''M''. Any square matrix has a Jordan normal form if the field of coefficients is extended to one containing all the eigenvalues of the matrix. In spite of its name, the normal form for a given ''M'' is not entirely unique, as it is a [[block diagonal matrix]] formed of [[Jordan block]]s, the order of which is not fixed; it is conventional to group blocks for the same eigenvalue together, but no ordering is imposed among the eigenvalues, nor among the blocks for a given eigenvalue, although the latter could for instance be ordered by weakly decreasing size. The [[Jordan–Chevalley decomposition]] is particularly simple on a basis on which the operator takes its Jordan normal form. The diagonal form for [[diagonalizable]] matrices, for instance [[normal matrix|normal matrices]], is a special case of the Jordan normal form.
 
The Jordan normal form is named after [[Camille Jordan]].
 
== Motivation ==
An ''n'' &times; ''n'' matrix ''A'' is [[diagonalizable matrix|diagonalizable]] if and only if the sum of the dimensions of the eigenspaces is ''n''. Or, equivalently, if and only if ''A'' has ''n'' [[linearly independent]] [[eigenvectors]]. Not all matrices are diagonalizable. Consider the following matrix:
 
<math>A=
\left[\!\!\!\begin{array}{*{20}{r}}
  5 &  4 &  2 &  1 \\[2pt]
  0 &  1 & -1 & -1 \\[2pt]
-1 & -1 &  3 &  0 \\[2pt]
  1 &  1 & -1 &  2
\end{array}\!\!\right].</math>
 
Including multiplicity, the [[eigenvalues]] of ''A'' are λ = 1, 2, 4, 4. The [[Hamel dimension|dimension]] of the [[Kernel (linear algebra)|kernel]] of (''A''&nbsp;&minus;&nbsp;4'''[[identity matrix|I<sub>n</sub>]]''') is 1 (and not 2), so ''A'' is not diagonalizable. However, there is an invertible matrix ''P'' such that ''A'' = ''PJP''<sup>&minus;1</sup>, where
 
:<math>J = \begin{bmatrix}
1 & 0 & 0 & 0 \\[2pt]
0 & 2 & 0 & 0 \\[2pt]
0 & 0 & 4 & 1 \\[2pt]
0 & 0 & 0 & 4 \end{bmatrix}.</math>
 
The matrix ''J'' is almost diagonal. This is the Jordan normal form of ''A''. The section [[#Example|''Example'']] below fills in the details of the computation.
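
The decomposition can also be reproduced with a computer algebra system. The following is a minimal sketch, assuming the SymPy library is available; its <code>jordan_form()</code> method returns a pair (''P'', ''J'') with ''A'' = ''PJP''<sup>&minus;1</sup>, up to the ordering of the blocks.

<syntaxhighlight lang="python">
# Minimal sketch (assumes SymPy): reproduce A = P J P^(-1) for the matrix above.
from sympy import Matrix

A = Matrix([
    [ 5,  4,  2,  1],
    [ 0,  1, -1, -1],
    [-1, -1,  3,  0],
    [ 1,  1, -1,  2],
])

P, J = A.jordan_form()        # J is the Jordan normal form, up to block order
print(J)
print(P * J * P.inv() - A)    # the zero matrix
</syntaxhighlight>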
 
== Complex matrices ==
 
In general, a square complex matrix ''A'' is [[similar (linear algebra)|similar]] to a [[block diagonal matrix]]
 
:<math>J = \begin{bmatrix}
J_1 & \;    & \; \\
\;  & \ddots & \; \\
\;  & \;    & J_p\end{bmatrix}</math>
 
where each block ''J''<sub>i</sub> is a square matrix of the form
 
:<math>J_i =
\begin{bmatrix}
\lambda_i & 1            & \;    & \;  \\
\;        & \lambda_i    & \ddots & \;  \\
\;        & \;          & \ddots & 1  \\
\;        & \;          & \;    & \lambda_i     
\end{bmatrix}.</math>
So there exists an invertible matrix ''P'' such that ''J'' = ''P''<sup>&minus;1</sup>''AP'' has non-zero entries only on the diagonal and the superdiagonal. ''J'' is called the '''Jordan normal form''' of ''A''. Each ''J''<sub>''i''</sub> is called a [[Jordan block]] of ''A''. In a given Jordan block, every entry on the superdiagonal is&nbsp;1.
 
Assuming this result, we can deduce the following properties:
 
* Counting multiplicity, the eigenvalues of ''J'', and therefore those of ''A'', are the diagonal entries.
* Given an eigenvalue λ<sub>''i''</sub>, its '''[[geometric multiplicity]]''' is the dimension of Ker(''A'' &minus; λ<sub>''i'' </sub>'''[[identity matrix|I]]'''), and it is the number of Jordan blocks corresponding to λ<sub>''i''</sub>.<ref name="HJp321">{{harvtxt|Horn|Johnson|1985|loc=§3.2.1}}</ref>
* The sum of the sizes of all Jordan blocks corresponding to an eigenvalue λ<sub>''i''</sub> is its '''algebraic multiplicity'''.<ref name="HJp321" />
* ''A'' is diagonalizable if and only if, for every eigenvalue λ of ''A'', its geometric and algebraic multiplicities coincide.
* The Jordan block corresponding to λ is of the form λ '''I''' + ''N'', where ''N'' is a [[nilpotent matrix]] defined as ''N''<sub>''ij''</sub> = δ<sub>''i'',''j''&minus;1</sub> (where δ is the [[Kronecker delta]]). The nilpotency of ''N'' can be exploited when calculating ''f''(''A'') where ''f'' is a complex analytic function. For example, in principle the Jordan form could give a closed-form expression for the exponential exp(''A'').
* The number of Jordan blocks corresponding to λ of size at least ''j'' is dim&nbsp;Ker(''A'' &minus; λ''I'')<sup>''j''</sup> &minus; dim&nbsp;Ker(''A'' &minus; λ''I'')<sup>''j''&minus;1</sup>. Thus, the number of Jordan blocks of size exactly ''j'' is
:<math>2 \dim \ker (A - \lambda I)^j - \dim \ker (A - \lambda I)^{j+1} - \dim \ker (A - \lambda I)^{j-1}.</math>
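
These kernel-dimension formulas can be checked computationally. The following is a minimal sketch, assuming the SymPy library, which counts the Jordan blocks of each size for the matrix ''A'' of the ''Motivation'' section.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): count Jordan blocks of each size for an eigenvalue,
# using d_j = dim ker (A - lambda*I)^j and the formula 2*d_j - d_{j+1} - d_{j-1}.
from sympy import Matrix, eye

A = Matrix([
    [ 5,  4,  2,  1],
    [ 0,  1, -1, -1],
    [-1, -1,  3,  0],
    [ 1,  1, -1,  2],
])

def block_counts(A, lam):
    """Return {size: number of Jordan blocks of that size} for eigenvalue lam."""
    n = A.rows
    # d[j] = dim ker (A - lam*I)^j for j = 0, ..., n+1 (the sequence stabilises)
    d = [len(((A - lam * eye(n)) ** j).nullspace()) for j in range(n + 2)]
    counts = {j: 2 * d[j] - d[j + 1] - d[j - 1] for j in range(1, n + 1)}
    return {size: c for size, c in counts.items() if c > 0}

print(block_counts(A, 4))   # {2: 1}: one Jordan block of size 2
print(block_counts(A, 1))   # {1: 1}: one Jordan block of size 1
</syntaxhighlight>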
 
=== Generalized eigenvectors ===
{{main|Generalized eigenvectors}}
Consider the matrix ''A'' from the example in the ''Motivation'' section above. The Jordan normal form is obtained by some similarity transformation ''P''<sup>&minus;1</sup>''AP'' = ''J'', i.e.
 
:<math>\; AP = PJ.</math>
 
Let ''P'' have column vectors ''p''<sub>''i''</sub>, ''i'' = 1, ..., 4, then
 
: <math>A \begin{bmatrix} p_1 & p_2 & p_3 & p_4 \end{bmatrix} = \begin{bmatrix} p_1 & p_2 & p_3 & p_4 \end{bmatrix}
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 2 & 0 & 0 \\
0 & 0 & 4 & 1 \\
0 & 0 & 0 & 4 \end{bmatrix} = \begin{bmatrix} p_1 & 2p_2 & 4p_3 & p_3+4p_4 \end{bmatrix}.</math>
 
We see that
 
:<math>\; (A - 1 I) p_1 = 0 </math>
 
:<math>\; (A - 2 I) p_2 = 0 </math>
 
:<math>\; (A - 4 I) p_3 = 0 </math>
 
:<math>\; (A - 4 I) p_4 = p_3. </math>
 
For ''i'' = 1,2,3 we have <math>p_i \in \operatorname{Ker}(A-\lambda_{i} I)</math>, i.e. ''p''<sub>i</sub> is an eigenvector of ''A'' corresponding to the eigenvalue λ<sub>i</sub>. For ''i''=4, multiplying both sides by <math>(A-4I)</math> gives
:<math>\; (A-4I)^2 p_4 = (A-4I) p_3. </math>
But <math>(A-4I)p_3 = 0</math>, so
:<math>\; (A-4I)^2 p_4 = 0. </math>
Thus, <math>p_4 \in \operatorname{Ker}(A-4 I)^2.</math>
 
Vectors such as <math>p_4</math> are called [[generalized eigenvector]]s of ''A''.
 
Thus, given an eigenvalue λ, its corresponding Jordan block gives rise to a '''Jordan chain'''. The '''generator''', or '''lead vector''', say ''p<sub>r</sub>'', of the chain is a generalized eigenvector such that (''A'' &minus; λ '''I''')<sup>''r''</sup>''p''<sub>''r''</sub> = 0, where ''r'' is the size of the Jordan block. The vector ''p''<sub>1</sub> =  (''A'' &minus; λ '''I''')<sup>''r''&minus;1</sup>''p''<sub>''r''</sub> is an eigenvector corresponding to λ. In general, ''p''<sub>''i''</sub> is a preimage of ''p''<sub>''i''&minus;1</sub> under ''A'' &minus; λ '''I'''. So the lead vector generates the chain via multiplication by (''A'' &minus; λ '''I''').
 
Therefore, the statement that every square matrix ''A'' can be put in Jordan normal form is equivalent to the claim that there exists a basis consisting only of eigenvectors and generalized eigenvectors of ''A''.
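
For the matrix ''A'' above, these chain relations can be verified directly. The following minimal sketch (assuming the SymPy library) uses the chain vectors for the eigenvalue 4 that are computed explicitly in the ''[[#Example|Example]]'' section below.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): verify the Jordan chain for the eigenvalue 4.
from sympy import Matrix, eye, zeros

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
N = A - 4 * eye(4)

p3 = Matrix([1, 0, -1, 1])   # eigenvector for the eigenvalue 4
p4 = Matrix([1, 0, 0, 0])    # generalized eigenvector (lead vector of the chain)

print(N * p4 == p3)            # True:  (A - 4I) p4 = p3
print(N * p3 == zeros(4, 1))   # True:  (A - 4I) p3 = 0, so (A - 4I)^2 p4 = 0
</syntaxhighlight>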
 
=== A proof ===
 
We give a proof by induction. The 1 &times; 1 case is trivial. Let ''A'' be an ''n'' &times; ''n'' matrix. Take any eigenvalue λ of ''A''. The range of ''A'' &minus; λ '''I''', denoted by Ran(''A'' &minus; λ '''I'''), is an [[invariant subspace]] of ''A''. Also, since λ is an eigenvalue of ''A'', the dimension of Ran(''A'' &minus; λ '''I'''), ''r'', is strictly less than ''n''. Let ''A' '' denote the restriction of ''A'' to Ran(''A'' &minus; λ '''I'''). By the inductive hypothesis, there exists a basis {''p''<sub>1</sub>, ..., ''p''<sub>''r''</sub>} such that ''A' '', expressed in terms of this basis, is in Jordan normal form.
 
Next consider the subspace Ker(''A'' &minus; λ '''I'''). If
 
:<math>\mathrm{Ran}(A - \lambda I) \cap \mathrm{Ker}(A - \lambda I) = \{0\},</math>
 
the desired result follows immediately from the [[rank–nullity theorem]]. This would be the case, for example, if ''A'' were Hermitian.
 
Otherwise, if
 
:<math>Q = \mathrm{Ran}(A - \lambda I) \cap \mathrm{Ker}(A - \lambda I) \neq \{0\},</math>
 
let the dimension of ''Q'' be ''s'' ≤ ''r''. Each vector in ''Q'' is an eigenvector of ''A' '' corresponding to the eigenvalue ''λ''. So the Jordan form of ''A' '' must contain ''s'' Jordan chains corresponding to ''s'' linearly independent eigenvectors. So the basis {''p''<sub>1</sub>, ..., ''p''<sub>''r''</sub>} must contain ''s'' vectors, say {''p''<sub>''r''&minus;''s''+1</sub>, ..., ''p''<sub>''r''</sub>}, that are lead vectors of these Jordan chains in the Jordan normal form of ''A' ''. We can "extend the chains" by taking the preimages of these lead vectors. (This is the key step of the argument; in general, generalized eigenvectors need not lie in Ran(''A'' &minus; λ '''I''').) Let ''q''<sub>''i''</sub> be such that
 
:<math>\; (A - \lambda I) q_i = p_i \mbox{ for } i = r-s+1, \ldots, r.</math>
 
Clearly no non-trivial linear combination of the ''q''<sub>''i''</sub> can lie in Ker(''A'' &minus; λ '''I'''). Furthermore, no non-trivial linear combination of the ''q''<sub>''i''</sub> can be in Ran(''A'' &minus; λ '''I'''), for that would contradict the assumption that each ''p''<sub>''i''</sub> is a lead vector in a Jordan chain. The set {''q''<sub>''i''</sub>}, being preimages of the linearly independent set {''p''<sub>''i''</sub>} under ''A'' &minus; λ '''I''', is also linearly independent.
 
Finally, we can pick any linearly independent set {''z''<sub>1</sub>, ..., ''z''<sub>''t''</sub>} in Ker(''A'' &minus; λ '''I''') whose images span the quotient space
 
:<math>\; \mathrm{Ker}(A - \lambda I) / Q.</math>
 
By construction, the union of the three sets {''p''<sub>1</sub>, ..., ''p''<sub>''r''</sub>}, {''q''<sub>''r''&minus;''s''+1</sub>, ..., ''q''<sub>''r''</sub>}, and {''z''<sub>1</sub>, ..., ''z''<sub>''t''</sub>} is linearly independent. Each vector in the union is either an eigenvector or a generalized eigenvector of ''A''. Finally, by the rank–nullity theorem, the cardinality of the union is ''n''. In other words, we have found a basis that consists of eigenvectors and generalized eigenvectors of ''A'', and this shows ''A'' can be put in Jordan normal form.
 
=== Uniqueness ===
 
It can be shown that the Jordan normal form of a given matrix ''A'' is unique up to the order of the Jordan blocks.
 
Knowing the algebraic and geometric multiplicities of the eigenvalues is not sufficient to determine the Jordan normal form of ''A''. Assuming the algebraic multiplicity ''m''(λ) of an eigenvalue λ is known, the structure of the Jordan form can be ascertained by analyzing the ranks of the powers (''A'' &minus; λ'''I''')<sup>''k''</sup> for ''k'' = 1, ..., ''m''(λ). To see this, suppose an ''n'' &times; ''n'' matrix ''A'' has only one eigenvalue λ. So ''m''(λ) = ''n''. The smallest integer ''k''<sub>1</sub> such that
 
:<math>(A - \lambda I)^{k_1} = 0</math>
 
is the size of the largest Jordan block in the Jordan form of ''A''. (This number ''k''<sub>1</sub> is also called the '''index''' of λ. See discussion in a following section.) The rank of
 
:<math>(A - \lambda I)^{k_1 - 1}</math>
 
is the number of Jordan blocks of size ''k''<sub>1</sub>. Similarly, the rank of
 
:<math>(A - \lambda I)^{k_1 - 2}</math>
 
is twice the number of Jordan blocks of size ''k''<sub>1</sub> plus the number of Jordan blocks of size ''k''<sub>1</sub> &minus; 1. Continuing with successively lower powers in this way determines the number of Jordan blocks of each size, and hence the Jordan structure of ''A''. The general case is similar.
 
This can be used to show the uniqueness of the Jordan form. Let ''J''<sub>1</sub> and ''J''<sub>2</sub> be two Jordan normal forms of ''A''. Then ''J''<sub>1</sub> and ''J''<sub>2</sub> are similar and have the same spectrum, including algebraic multiplicities of the eigenvalues. The procedure outlined in the previous paragraph can be used to determine the structure of these matrices. Since the rank of a matrix is preserved by similarity transformation, there is a bijection between the Jordan blocks of ''J''<sub>1</sub> and ''J''<sub>2</sub>. This proves the uniqueness part of the statement.
 
== Real matrices ==
If ''A'' is a real matrix, its Jordan form can still be non-real. However, there exists a real invertible matrix ''P'' such that ''P''<sup>&minus;1</sup>''AP'' = ''J'' is a real [[block diagonal matrix]] with each block being a real Jordan block. A real Jordan block is either identical to a complex Jordan block (if the corresponding eigenvalue <math>\lambda_i</math> is real), or is a block matrix itself, consisting of 2&times;2 blocks as follows (for a non-real eigenvalue <math>\lambda_i = a_i+ib_i</math>). The diagonal blocks are identical, of the form
 
:<math>C_i =
\begin{bmatrix}
a_i  & b_i \\
-b_i & a_i \\
\end{bmatrix}</math>
 
and describe multiplication by <math>\lambda_i</math> in the complex plane. The superdiagonal blocks are 2&times;2 identity matrices. The full real Jordan block is given by
 
:<math>J_i =
\begin{bmatrix}
C_i    & I       & \;    & \;    \\
\;    & C_i    & \ddots & \;    \\   
\;    & \;      & \ddots & I    \\
\;    & \;      & \;    & C_i  \\
\end{bmatrix}.</math>
 
This real Jordan form is a consequence of the complex Jordan form. For a real matrix the nonreal eigenvectors and generalized eigenvectors can always be chosen to form [[complex conjugate]] pairs. Taking the real and imaginary part (linear combination of the vector and its conjugate), the matrix has this form in the new basis.
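
For illustration (a worked example added here, independent of the matrices above), consider the real matrix

:<math>M = \begin{bmatrix} 3 & -2 \\ 4 & -1 \end{bmatrix},</math>

which has the non-real eigenvalues <math>1 \pm 2i</math>. An eigenvector for <math>1 + 2i</math> is <math>v = (1, 1-i)^T</math>, and taking the columns of ''P'' to be the real and imaginary parts of ''v'' gives

:<math>P = \begin{bmatrix} 1 & 0 \\ 1 & -1 \end{bmatrix}, \qquad
P^{-1}MP = \begin{bmatrix} 1 & 2 \\ -2 & 1 \end{bmatrix},</math>

which is a single real Jordan block <math>C_i</math> with <math>a_i = 1</math> and <math>b_i = 2</math>.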
 
== Consequences ==
 
One can see that the Jordan normal form is essentially a classification result for square matrices, and as such several important results from linear algebra can be viewed as its consequences.
 
=== Spectral mapping theorem ===
 
Using the Jordan normal form, direct calculation gives a spectral mapping theorem for the [[functional calculus|polynomial functional calculus]]: Let ''A'' be an ''n'' &times; ''n'' matrix with eigenvalues λ<sub>1</sub>, ..., λ<sub>''n''</sub>, then for any polynomial ''p'', ''p''(''A'') has eigenvalues ''p''(λ<sub>1</sub>), ..., ''p''(λ<sub>''n''</sub>).
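
For instance, the following minimal sketch (assuming the SymPy library) checks the theorem for the polynomial ''p''(''x'') = ''x''<sup>2</sup> + 1 and the matrix of the ''Motivation'' section.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): spectral mapping theorem for p(x) = x^2 + 1.
from sympy import Matrix, eye

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
pA = A**2 + eye(4)              # p(A)

print(A.eigenvals())    # {1: 1, 2: 1, 4: 2}
print(pA.eigenvals())   # {2: 1, 5: 1, 17: 2}, i.e. {p(1), p(2), p(4)}
</syntaxhighlight>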
 
=== Cayley–Hamilton theorem ===
 
The [[Cayley–Hamilton theorem]] asserts that every matrix ''A'' satisfies its characteristic equation: if {{math|''p''}} is the [[characteristic polynomial]] of {{math|''A''}}, then {{math|''p''(''A'') {{=}} 0}}. This can be shown via direct calculation in the Jordan form, since any Jordan block for {{math|''&lambda;''}} is annihilated by {{math|(''X'' − &lambda;)<sup>''m''</sup>}} where {{math|''m''}} is the multiplicity of the root {{math|''&lambda;''}} of {{math|''p''}}, the sum of the sizes of the Jordan blocks for {{math|''&lambda;''}}, and therefore no less than the size of the block in question. The Jordan form can be assumed to exist over a field extending the base field of the matrix, for instance over the [[splitting field]] of {{math|''p''}}; this field extension does not change the matrix {{math|''p''(''A'')}} in any way.
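
The theorem can be checked directly on the matrix of the ''Motivation'' section. The following is a minimal sketch, assuming the SymPy library, which evaluates the characteristic polynomial at ''A'' by Horner's scheme.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): verify p(A) = 0 for the characteristic polynomial p.
from sympy import Matrix, eye, zeros, symbols

x = symbols('x')
A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])

p = A.charpoly(x)                 # x**4 - 11*x**3 + 42*x**2 - 64*x + 32
pA = zeros(4, 4)
for c in p.all_coeffs():          # Horner evaluation of p at the matrix A
    pA = pA * A + c * eye(4)

print(pA == zeros(4, 4))          # True
</syntaxhighlight>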
 
=== Minimal polynomial ===
 
The [[Minimal polynomial (linear algebra)|minimal polynomial]] ''P'' of a square matrix ''A'' is the unique [[monic polynomial]] of least degree, ''m'', such that ''P''(''A'') = 0. Alternatively, the set of polynomials that annihilate a given ''A'' forms an ideal ''I'' in ''C''[''x''], the [[principal ideal domain]] of polynomials with complex coefficients. The monic element that generates ''I'' is precisely ''P''.
 
Let λ<sub>1</sub>, ..., λ<sub>''q''</sub> be the distinct eigenvalues of ''A'', and ''s''<sub>''i''</sub> be the size of the largest Jordan block corresponding to λ<sub>''i''</sub>. It is clear from the Jordan normal form that the minimal polynomial of ''A'' has degree {{math|''&Sigma;''}}''s''<sub>''i''</sub>.
 
While the Jordan normal form determines the minimal polynomial, the converse is not true. This leads to the notion of '''elementary divisors'''. The elementary divisors of a square matrix ''A'' are the characteristic polynomials of its Jordan blocks. The factors of the minimal polynomial ''m'' are the elementary divisors of the largest degree corresponding to distinct eigenvalues.
 
The degree of an elementary divisor is the size of the corresponding Jordan block, therefore the dimension of the corresponding invariant subspace. If all elementary divisors are linear, ''A'' is diagonalizable.
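
For the matrix of the ''Motivation'' section, the largest Jordan blocks for the eigenvalues 1, 2 and 4 have sizes 1, 1 and 2, so the minimal polynomial is (''x'' &minus; 1)(''x'' &minus; 2)(''x'' &minus; 4)<sup>2</sup>. A minimal sketch (assuming the SymPy library) confirming this:

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): the minimal polynomial read off the Jordan form.
from sympy import Matrix, eye, zeros

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
I = eye(4)

m  = (A - I) * (A - 2*I) * (A - 4*I)**2   # candidate minimal polynomial at A
m1 = (A - I) * (A - 2*I) * (A - 4*I)      # one degree lower

print(m == zeros(4, 4))    # True: (x-1)(x-2)(x-4)^2 annihilates A
print(m1 == zeros(4, 4))   # False: the exponent 2 on (x-4) is needed
</syntaxhighlight>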
 
=== Invariant subspace decompositions ===
 
The Jordan form of an ''n'' &times; ''n'' matrix ''A'' is block diagonal, and therefore gives a decomposition of the ''n''-dimensional space '''C'''<sup>''n''</sup> into [[invariant subspace]]s of ''A''. Every Jordan block ''J''<sub>''i''</sub> corresponds to an invariant subspace ''X''<sub>''i''</sub>. Symbolically, we put
 
:<math>\mathbb{C}^n = \bigoplus_{i = 1}^k X_i</math>
 
where each ''X''<sub>''i''</sub> is the span of the corresponding Jordan chain, and ''k'' is the number of Jordan chains.
 
One can also obtain a slightly different decomposition via the Jordan form. Given an eigenvalue λ<sub>''i''</sub>, the size of its largest corresponding Jordan block ''s''<sub>i</sub> is called the '''index''' of  λ<sub>''i''</sub> and denoted by ν(λ<sub>''i''</sub>). (Therefore the degree of the minimal polynomial is the sum of all indices.) Define a subspace ''Y''<sub>''i''</sub> by
 
:<math>\; Y_i = \operatorname{Ker} (\lambda_i I - A)^{\nu(\lambda_i)}.</math>
 
This gives the decomposition
 
:<math>\mathbb{C}^n = \bigoplus_{i = 1}^l Y_i</math>
 
where ''l'' is the number of distinct eigenvalues of ''A''. Intuitively, we glob together the Jordan block invariant subspaces corresponding to the same eigenvalue. In the extreme case where ''A'' is a multiple of the identity matrix we have ''k'' = ''n'' and ''l'' = 1.
 
The projection onto ''Y<sub>i</sub>'' and along all the other ''Y<sub>j</sub>'' (''j'' ≠ ''i'') is called '''the spectral projection of ''A'' at λ<sub>''i''</sub>''' and is usually denoted by '''''P''(λ<sub>''i''</sub> ; ''A'')'''. Spectral projections are mutually orthogonal in the sense that ''P''(λ<sub>''i''</sub> ; ''A'') ''P''(λ<sub>''j''</sub> ; ''A'') = 0 if ''i'' ≠ ''j''. Also they commute with ''A'' and their sum is the identity matrix. Replacing every λ<sub>''i''</sub> in the Jordan matrix ''J'' by one and setting all other entries to zero gives ''P''(λ<sub>''i''</sub> ; ''J''); moreover, if ''U J U''<sup>&minus;1</sup> is the similarity transformation such that ''A'' = ''U J U''<sup>&minus;1</sup>, then ''P''(λ<sub>''i''</sub> ; ''A'') = ''U P''(λ<sub>''i''</sub> ; ''J'') ''U''<sup>&minus;1</sup>. Spectral projections are not confined to finite dimensions. See below for their application to compact operators, and [[holomorphic functional calculus]] for a more general discussion.
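
A minimal sketch (assuming the SymPy library) of this construction, computing the spectral projections of the matrix of the ''Motivation'' section from its Jordan decomposition:

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): spectral projections built from the Jordan form.
from sympy import Matrix, eye, zeros

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
U, J = A.jordan_form()                # A = U * J * U**(-1)

def spectral_projection(lam):
    PJ = zeros(4, 4)
    for i in range(4):
        if J[i, i] == lam:            # 1 at the diagonal positions of lam's blocks
            PJ[i, i] = 1
    return U * PJ * U.inv()

P1, P2, P4 = (spectral_projection(lam) for lam in (1, 2, 4))
print(P4 * P4 == P4)                  # idempotent
print(P4 * P1 == zeros(4, 4))         # mutually "orthogonal"
print(A * P4 == P4 * A)               # commutes with A
print(P1 + P2 + P4 == eye(4))         # the projections sum to the identity
</syntaxhighlight>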
 
Comparing the two decompositions, notice that, in general, ''l'' ≤ ''k''. When ''A'' is normal, the subspaces ''X''<sub>''i''</sub>'s in the first decomposition are one-dimensional and mutually orthogonal. This is the [[spectral theorem]] for normal operators. The second decomposition generalizes more easily for general compact operators on Banach spaces.
 
It might be of interest here to note some properties of the index, ν(''λ''). More generally, for a complex number λ, its index can be defined as the least non-negative integer ν(λ) such that
 
:<math>\mathrm{Ker}(\lambda - A)^{\nu(\lambda)} = \operatorname{Ker} (\lambda - A)^m, \; \forall m \geq \nu(\lambda) .</math>
 
So ν(λ) &gt; 0 if and only if λ is an eigenvalue of ''A''. In the finite dimensional case, ν(λ) ≤ the algebraic multiplicity of λ.
 
== Generalizations ==
=== Matrices with entries in a field ===
 
Jordan reduction can be extended to any square matrix ''M'' whose entries lie in a [[field (mathematics)|field]] ''K''.  The result states that any ''M'' can be written as a sum ''D'' + ''N'' where ''D'' is [[semisimple operator|semisimple]], ''N'' is [[nilpotent matrix|nilpotent]], and ''DN'' = ''ND''. This is called the [[Jordan–Chevalley decomposition]]. Whenever ''K'' contains the eigenvalues of ''M'', in particular when ''K'' is [[algebraically closed]], the normal form can be expressed explicitly as the [[direct sum]] of Jordan blocks.
Similar to the case when ''K'' is the complex numbers, knowing the dimensions of the kernels of (''M'' &minus; λ''I'')<sup>''k''</sup> for 1 ≤ ''k'' ≤ ''m'', where ''m'' is the algebraic multiplicity of the eigenvalue λ, allows one to determine the Jordan form of ''M''. We may view the underlying vector space ''V'' as a ''K''[''x'']-[[module (mathematics)|module]] by regarding the action of ''x'' on ''V'' as application of ''M'' and extending by ''K''-linearity. Then the polynomials (''x''&nbsp;&minus;&nbsp;λ)<sup>''k''</sup> are the elementary divisors of ''M'', and the Jordan normal form is concerned with representing ''M'' in terms of blocks associated to the elementary divisors.
 
The proof of the Jordan normal form is usually carried out as an application to the [[ring (mathematics)|ring]] ''K''[''x''] of the [[structure theorem for finitely generated modules over a principal ideal domain]], of which it is a corollary.
 
=== Compact operators ===
 
In a different direction, for [[compact operator]]s on a [[Banach space]], a result analogous to the Jordan normal form holds. One restricts to compact operators because every point ''x'' in the spectrum of a compact operator ''T'' is an eigenvalue, with the only possible exception being when ''x'' is the limit point of the spectrum. This is not true for bounded operators in general. To give some idea of this generalization, we first reformulate the Jordan decomposition in the language of functional analysis.
 
==== Holomorphic functional calculus ====
{{ Details|holomorphic functional calculus}}
Let ''X'' be a Banach space, ''L''(''X'') be the bounded operators on ''X'', and σ(''T'') denote the [[spectrum (functional analysis)|spectrum]] of ''T'' ∈ ''L''(''X''). The [[holomorphic functional calculus]] is defined as follows:
 
Fix a bounded operator ''T''. Consider the family Hol(''T'') of complex functions that are [[holomorphic]] on some open set ''G'' containing σ(''T''). Let Γ = {γ<sub>''i''</sub>} be a finite collection of [[Jordan curve]]s such that σ(''T'') lies in the ''inside'' of Γ, and define ''f''(''T'') by
 
: <math>f(T) = \frac{1}{2 \pi i} \int_{\Gamma} f(z)(z - T)^{-1} dz.</math>
 
The open set ''G'' could vary with ''f'' and need not be connected. The integral is defined as the limit of Riemann sums, as in the scalar case. Although the integral makes sense for continuous ''f'', we restrict to holomorphic functions to apply the machinery from classical function theory (e.g. the Cauchy integral formula). The assumption that σ(''T'') lies in the inside of Γ ensures ''f''(''T'') is well defined; it does not depend on the choice of Γ. The functional calculus is the mapping Φ from Hol(''T'') to ''L''(''X'') given by
 
: <math>\; \Phi(f) = f(T).</math>
 
We will require the following properties of this functional calculus:
# Φ extends the polynomial functional calculus.
# The ''spectral mapping theorem'' holds: σ(''f''(''T'')) = ''f''(σ(''T'')).
# Φ is an algebra homomorphism.
 
==== The finite dimensional case ====
 
In the finite dimensional case, σ(''T'') = {λ<sub>''i''</sub>} is a finite discrete set in the complex plane. Let ''e''<sub>''i''</sub> be the function that is 1 in some open neighborhood of λ<sub>''i''</sub> and 0 elsewhere. By property 3 of the functional calculus, the operator
 
:<math>\; e_i(T)</math>
 
is a projection. Moreover, let ν<sub>''i''</sub> be the index of λ<sub>''i''</sub> and
 
:<math>f(z)= (z - \lambda_i)^{\nu_i}.</math> 
 
The spectral mapping theorem tells us
 
:<math> f(T) e_i (T) = (T - \lambda_i)^{\nu_i} e_i (T)</math>
 
has spectrum {0}. By property 1, ''f''(''T'') can be directly computed in the Jordan form, and by inspection, we see that the operator ''f''(''T'')''e<sub>i</sub>''(''T'') is the zero matrix.
 
By property 3, ''f''(''T'') ''e''<sub>''i''</sub>(''T'') = ''e''<sub>''i''</sub>(''T'') ''f''(''T''). So ''e''<sub>''i''</sub>(''T'') is precisely the projection onto
the subspace
 
:<math>\mathrm{Ran} \; e_i (T) = \mathrm{Ker}(T - \lambda_i)^{\nu_i}.</math>
 
The relation
 
:<math>\; \sum_i e_i = 1</math>
 
implies
 
:<math>\mathbb{C}^n = \bigoplus_i \; \mathrm{Ran}\; e_i (T) = \bigoplus_i \; \mathrm{Ker}(T - \lambda_i)^{\nu_i}</math>
 
where the index ''i'' runs through the distinct eigenvalues of ''T''. This is exactly the invariant subspace decomposition
 
:<math>\mathbb{C}^n = \bigoplus_i Y_i</math>
 
given in a previous section. Each ''e<sub>i</sub>''(''T'') is the projection onto the subspace spanned by the Jordan chains corresponding to λ<sub>''i''</sub> and along the subspaces spanned by the Jordan chains corresponding to λ<sub>''j''</sub> for ''j'' ≠ ''i''. In other words ''e<sub>i</sub>''(''T'') = ''P''(λ<sub>''i''</sub>;''T''). This explicit identification of the operators ''e<sub>i</sub>''(''T'') in turn gives an explicit form of holomorphic functional calculus for matrices:
 
:For all ''f'' ∈ Hol(''T''),
 
:<math>f(T) = \sum_{\lambda_i \in \sigma(T)} \sum_{k = 0}^{\nu_i -1} \frac{f^{(k)}(\lambda_i)}{k!} (T - \lambda_i)^k e_i (T).</math>
 
Notice that the expression of ''f''(''T'') is a finite sum because, on each neighborhood of λ<sub>''i''</sub>, we have chosen the Taylor series expansion of ''f'' centered at λ<sub>''i''</sub>.
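
For the matrix of the ''Motivation'' section and ''f'' = exp, the formula can be evaluated directly, using the indices ν(1) = ν(2) = 1 and ν(4) = 2 read off the Jordan form. The following minimal sketch (assuming the SymPy library) compares it with SymPy's built-in matrix exponential.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): the explicit functional calculus formula for f = exp.
from sympy import Matrix, eye, zeros, exp, factorial, simplify

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
U, J = A.jordan_form()

def spectral_projection(lam):
    PJ = zeros(4, 4)
    for i in range(4):
        if J[i, i] == lam:
            PJ[i, i] = 1
    return U * PJ * U.inv()

indices = {1: 1, 2: 1, 4: 2}          # nu(lambda_i) = size of the largest block
fT = zeros(4, 4)
for lam, nu in indices.items():
    e = spectral_projection(lam)
    for k in range(nu):
        # derivative f^(k)(lam) = exp(lam) for every k when f = exp
        fT += exp(lam) / factorial(k) * (A - lam * eye(4))**k * e

print((fT - A.exp()).applyfunc(simplify))   # the zero matrix
</syntaxhighlight>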
 
==== Poles of an operator ====
 
Let ''T'' be a bounded operator and let λ be an isolated point of σ(''T''). (As stated above, when ''T'' is compact, every point in its spectrum is an isolated point, except possibly the limit point 0.)
 
The point λ is called a '''pole''' of operator ''T'' with order ν if the [[Resolvent formalism|resolvent]] function ''R''<sub>''T''</sub> defined by
 
:<math>\; R_T(\lambda) = (\lambda - T)^{-1}</math>
 
has a [[pole (complex analysis)|pole]] of order ν at λ.
 
We will show that, in the finite dimensional case, the order of an eigenvalue coincides with its index. The result also holds for compact operators.
 
Consider the annular region ''A'' centered at the eigenvalue λ with sufficiently small radius ε such that the intersection of the open disc ''B''<sub>ε</sub>(λ) and σ(''T'') is {λ}. The resolvent function ''R''<sub>''T''</sub> is holomorphic on ''A''.
Extending a result from classical function theory, ''R''<sub>''T''</sub> has a [[Laurent series]] representation on ''A'':
 
:<math>R_T(z) = \sum _{- \infty} ^{\infty} a_m (\lambda - z)^m</math>
 
where
 
:<math>a_{-m} = - \frac{1}{2 \pi i} \int_C (\lambda - z) ^{m-1} (z - T)^{-1} d z</math> and ''C'' is a small circle centered at λ.
 
By the previous discussion on the functional calculus,
 
:<math>\; a_{-m} = -(\lambda - T)^{m-1} e_{\lambda} (T)</math> where <math>\; e_{\lambda}</math> is 1 on <math>\; B_{\epsilon}(\lambda)</math> and 0 elsewhere.
 
But we have shown that the largest positive integer ''m'' such that

:<math>a_{-m} \neq 0 \quad \text{and} \quad a_{-l} = 0 \;\; \forall \; l > m</math>
 
is precisely the index of λ, ν(λ). In other words, the function ''R''<sub>''T''</sub> has a pole of order ν(λ) at λ.
 
== Example ==
 
This example shows how to calculate the Jordan normal form of a given matrix. As the next section explains, it is important to do the computation exactly instead of rounding the results.
 
Consider the matrix
:<math>A =
\begin{bmatrix}
5 &  4 &  2 &  1 \\
0 &  1 & -1 & -1 \\
-1 & -1 &  3 &  0 \\
1 &  1 & -1 &  2
\end{bmatrix}</math>
which is mentioned at the beginning of the article.
 
The [[characteristic polynomial]] of ''A'' is
:<math> \chi(\lambda) = \det(\lambda I - A) = \lambda^4 - 11 \lambda^3 + 42 \lambda^2 - 64 \lambda + 32 = (\lambda-1)(\lambda-2)(\lambda-4)^2. \, </math>
This shows that the eigenvalues are 1, 2, 4 and 4, according to algebraic multiplicity. The eigenspace corresponding to the eigenvalue 1 can be found by solving the equation ''Av'' = ''λ v''. It is spanned by the column vector ''v'' = (&minus;1, 1, 0, 0)<sup>T</sup>. Similarly, the eigenspace corresponding to the eigenvalue 2 is spanned by ''w'' = (1, &minus;1, 0, 1)<sup>T</sup>. Finally, the eigenspace corresponding to the eigenvalue 4 is also one-dimensional (even though this is a double eigenvalue) and is spanned by ''x'' = (1, 0, &minus;1, 1)<sup>T</sup>. So, the geometric multiplicity (i.e. dimension of the eigenspace of the given eigenvalue) of each of the three eigenvalues is one. Therefore, the two eigenvalues equal to 4 correspond to a single Jordan block, and the Jordan normal form of the matrix ''A'' is the [[Matrix addition#Direct sum|direct sum]]
:<math> J = J_1(1) \oplus J_1(2) \oplus J_2(4) =
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 2 & 0 & 0 \\ 0 & 0 & 4 & 1 \\ 0 & 0 & 0 & 4 \end{bmatrix}. </math>
There are three chains. Two have length one: {''v''} and {''w''}, corresponding to the eigenvalues 1 and 2, respectively. There is one chain of length two corresponding to the eigenvalue 4. To find this chain, calculate
: <math>\ker{(A-4I)}^2 = \operatorname{span} \, \left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 1 \\ 0 \\ -1 \\ 1 \end{bmatrix} \right\}.</math>
Pick a vector in the above span that is not in the kernel of ''A''&nbsp;&minus;&nbsp;4''I'', e.g., ''y'' = (1,0,0,0)<sup>T</sup>. Now, (''A''&nbsp;&minus;&nbsp;4''I'')''y'' = ''x'' and (''A''&nbsp;&minus;&nbsp;4''I'')''x'' = 0, so {''y'', ''x''} is a chain of length two corresponding to the eigenvalue 4.
 
The transition matrix ''P'' such that ''P''<sup>&minus;1</sup>''AP'' = ''J'' is formed by putting these vectors next to each other as follows
:<math> P = \Big[ \,v\, \Big| \,w\, \Big| \,x\, \Big| \,y\, \Big] =
\begin{bmatrix}
-1 &  1 &  1 &  1 \\
1 & -1 &  0 &  0 \\
0 &  0 & -1 &  0 \\
0 &  1 &  1 &  0
\end{bmatrix}. </math>
A computation shows that the equation ''P''<sup>&minus;1</sup>''AP'' = ''J'' indeed holds.
 
:<math>P^{-1}AP=J=\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 2 & 0 & 0 \\
0 & 0 & 4 & 1 \\
0 & 0 & 0 & 4 \end{bmatrix}.</math>
 
If we had interchanged the order in which the chain vectors appeared, that is, changing the order of ''v'', ''w'' and {''x'', ''y''} together, the Jordan blocks would be interchanged. However, the resulting Jordan forms are equivalent.
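
The computation can also be confirmed mechanically; a minimal sketch assuming the SymPy library:

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): check P^(-1) A P = J for the transition matrix above.
from sympy import Matrix

A = Matrix([[5, 4, 2, 1], [0, 1, -1, -1], [-1, -1, 3, 0], [1, 1, -1, 2]])
P = Matrix([[-1,  1,  1,  1],
            [ 1, -1,  0,  0],
            [ 0,  0, -1,  0],
            [ 0,  1,  1,  0]])

print(P.inv() * A * P)   # prints the Jordan matrix J shown above
</syntaxhighlight>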
 
== Numerical analysis ==
 
If the matrix ''A'' has multiple eigenvalues, or is close to a matrix with multiple eigenvalues, then its Jordan normal form is very sensitive to perturbations. Consider for instance the matrix
:<math> A = \begin{bmatrix} 1 & 1 \\ \varepsilon & 1 \end{bmatrix}. </math>
If ε = 0, then the Jordan normal form is simply
:<math> \begin{bmatrix} 1 & 1 \\ 0 & 1 \end{bmatrix}. </math>
However, for ε ≠ 0, the Jordan normal form is
:<math> \begin{bmatrix} 1+\sqrt\varepsilon & 0 \\ 0 & 1-\sqrt\varepsilon \end{bmatrix}. </math>
This [[condition number|ill conditioning]] makes it very hard to develop a robust numerical algorithm for the Jordan normal form, as the result depends critically on whether two eigenvalues are deemed to be equal. For this reason, the Jordan normal form is usually avoided in [[numerical analysis]]; the stable [[Schur decomposition]] is often a better alternative.<ref>See Golub & Van Loan (1996), §7.6.5; or Golub & Wilkinson (1976) for details.</ref>
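
The effect can be seen numerically; the following minimal sketch (assuming the NumPy library) computes the eigenvalues of the perturbed matrix for a few values of ε.

<syntaxhighlight lang="python">
# Sketch (assumes NumPy): a perturbation of size eps splits the double
# eigenvalue 1 into two eigenvalues roughly sqrt(eps) away on either side.
import numpy as np

for eps in (0.0, 1e-16, 1e-8, 1e-4):
    A = np.array([[1.0, 1.0],
                  [eps, 1.0]])
    print(eps, np.linalg.eigvals(A))
</syntaxhighlight>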
 
== Powers ==
If ''n'' is a [[natural number]], the ''n''<sup>th</sup> power of a matrix in Jordan normal form will be a direct sum of upper triangular matrices, as a result of [[Block matrix|block multiplication]]. More specifically, after exponentiation each Jordan block will be an upper triangular block.
 
For example,
:<math>
\begin{bmatrix}
2 & 1 & 0 & 0 & 0 \\
0 & 2 & 1 & 0 & 0 \\
0 & 0 & 2 & 0 & 0 \\
0 & 0 & 0 & 5 & 1 \\
0 & 0 & 0 & 0 & 5
\end{bmatrix}^4
=\begin{bmatrix}
16 & 32 & 24 & 0  & 0 \\
0  & 16 & 32 & 0  & 0 \\
0  & 0  & 16 & 0  & 0 \\
0  & 0  & 0  & 625 & 500 \\
0  & 0  & 0  & 0  & 625
\end{bmatrix}.</math>
 
Further, each triangular block will consist of λ<sup>''n''</sup> on the main diagonal, <math>\tbinom{n}{1}</math> times λ<sup>''n''&minus;1</sup> on the superdiagonal, and so on. This expression remains valid for negative integer powers ''n'' as well (provided λ ≠ 0), if the binomial coefficient is interpreted as the generalized binomial coefficient <math>\tbinom{n}{k} = \frac{n(n-1)\cdots(n-k+1)}{k!}</math>, which is defined for negative integers ''n''.
 
For example,
 
:<math>
\begin{bmatrix}
\lambda_1 & 1 & 0 & 0 & 0 \\
0 & \lambda_1 & 1 & 0 & 0 \\
0 & 0 & \lambda_1 & 0 & 0 \\
0 & 0 & 0 & \lambda_2 & 1 \\
0 & 0 & 0 & 0 & \lambda_2
\end{bmatrix}^n
=\begin{bmatrix}
\lambda_1^n & \tbinom{n}{1}\lambda_1^{n-1} & \tbinom{n}{2}\lambda_1^{n-2} & 0  & 0 \\
0  & \lambda_1^n & \tbinom{n}{1}\lambda_1^{n-1} & 0  & 0 \\
0  & 0  & \lambda_1^n & 0  & 0 \\
0  & 0  & 0  & \lambda_2^n & \tbinom{n}{1}\lambda_2^{n-1} \\
0  & 0  & 0  & 0  & \lambda_2^n
\end{bmatrix}.</math>
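
The negative-power case can be checked symbolically; the following minimal sketch (assuming the SymPy library) compares the inverse of a 3 &times; 3 Jordan block with the generalized binomial coefficients.

<syntaxhighlight lang="python">
# Sketch (assumes SymPy): the k-th superdiagonal entry of J^(-1) equals
# C(-1, k) * lambda^(-1-k), with C the generalized binomial coefficient.
from sympy import Matrix, factorial, simplify, symbols

def gen_binomial(n, k):
    """Generalized binomial coefficient n(n-1)...(n-k+1)/k!, valid for negative n."""
    num = 1
    for i in range(k):
        num *= n - i
    return num / factorial(k)

lam = symbols('lambda', nonzero=True)
J = Matrix([[lam, 1, 0],
            [0, lam, 1],
            [0, 0, lam]])

expected = Matrix(3, 3, lambda i, j:
                  gen_binomial(-1, j - i) * lam**(-1 - (j - i)) if j >= i else 0)

print((J.inv() - expected).applyfunc(simplify))   # the zero matrix
</syntaxhighlight>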
 
== See also ==
* [[Canonical form]]
* [[Frobenius normal form]]
* [[Jordan matrix]]
* [[Jordan–Chevalley decomposition]]
* [[Matrix decomposition]]
* [[Weyr canonical form]]
 
== Notes ==
<references/>
 
==References==
<div class="references-small">
* N. Dunford and J.T. Schwartz, ''Linear Operators, Part I: General Theory'', Interscience, 1958.
* Daniel T. Finkbeiner II, ''Introduction to Matrices and Linear Transformations, Third Edition'', Freeman, 1978.
* [[Gene H. Golub]] and [[Charles F. Van Loan]], ''Matrix Computations'' (3rd ed.), Johns Hopkins University Press, Baltimore, 1996.
* Gene H. Golub and J. H. Wilkinson, Ill-conditioned eigensystems and the computation of the Jordan normal form, ''SIAM Review'', vol. 18, nr. 4, pp. 578–619, 1976.
* {{Citation | last1=Horn | first1=Roger A. | last2=Johnson | first2=Charles R. | title=Matrix Analysis | publisher=[[Cambridge University Press]] | isbn=978-0-521-38632-6 | year=1985}}.
* Glenn James and Robert C. James, ''Mathematics Dictionary, Fourth Edition'', Van Nostrand Reinhold, 1976.
* Saunders MacLane and Garrett Birkhoff, ''Algebra'', MacMillan, 1967.
* Anthony N. Michel and Charles J. Herget, ''Applied Algebra and Functional Analysis'', Dover, 1993.
* Georgi E. Shilov, ''Linear Algebra'', Dover, 1977.
* [[Igor Shafarevich|I. R. Shafarevich]] & A. O. Remizov (2012) ''Linear Algebra and Geometry'', [[Springer Science+Business Media|Springer]] ISBN 978-3-642-30993-9.
* [http://mathworld.wolfram.com/JordanCanonicalForm.html ''Jordan Canonical Form'' article at mathworld.wolfram.com]
</div>
[[Category:Linear algebra]]
[[Category:Matrix theory]]
[[Category:Matrix normal forms]]
[[Category:Matrix decompositions]]
{{Link FA|ca}}
