2.2 Spectral Decompositions
The computation of eigenvalues and eigenvectors is an important issue in
the analysis of matrices.
The spectral decomposition or Jordan decomposition links the structure of
a matrix to the eigenvalues and the eigenvectors.
THEOREM 2.1 (Jordan Decomposition)
Each symmetric matrix $\data{A}(p\times p)$ can be written as
\begin{displaymath}
\data{A} = \Gamma\,\Lambda\,\Gamma^{\top}
= \sum^p_{j=1} \lambda_j \gamma_{\col{j}} \gamma_{\col{j}}^{\top}
\end{displaymath}
(2.18)
where $\Lambda = \mathop{\hbox{diag}}(\lambda_1,\ldots,\lambda_p)$ is the diagonal matrix of the eigenvalues of $\data{A}$
and where $\Gamma = (\gamma_{\col{1}},\gamma_{\col{2}},\ldots,\gamma_{\col{p}})$
is an orthogonal matrix consisting of the eigenvectors $\gamma_{\col{j}}$ of $\data{A}$.
EXAMPLE 2.4
Suppose that $\data{A} = \left({1\atop 2}{2\atop 3}\right)$.
The eigenvalues are found by solving $\vert\data{A}-\lambda\data{I}\vert=0$.
This is equivalent to
\begin{displaymath}
(1-\lambda)(3-\lambda)-4 = \lambda^2 - 4\lambda - 1 = 0.
\end{displaymath}
Hence, the eigenvalues are $\lambda_1=2+\sqrt{5}$ and $\lambda_2=2-\sqrt{5}$.
The eigenvectors are $\gamma_{\col{1}} = (0.5257,\ 0.8506)^{\top}$ and
$\gamma_{\col{2}} = (0.8506,\ -0.5257)^{\top}$.
They are orthogonal since $\gamma_{\col{1}}^{\top}\gamma_{\col{2}} = 0$.
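As a quick numerical check of Example 2.4 (a sketch using NumPy, which is not part of the text), we can recover the eigenvalues and eigenvectors and verify the decomposition (2.18):

```python
import numpy as np

# Matrix from Example 2.4.
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])

# For symmetric matrices, eigh returns the eigenvalues in ascending order
# and an orthogonal matrix whose columns are the eigenvectors.
lam, Gamma = np.linalg.eigh(A)
print(lam)  # approx [2 - sqrt(5), 2 + sqrt(5)] = [-0.2361, 4.2361]

# Reassemble A = Gamma Lambda Gamma^T (Jordan/spectral decomposition, 2.18).
A_rebuilt = Gamma @ np.diag(lam) @ Gamma.T
print(np.allclose(A_rebuilt, A))              # True

# The eigenvectors are orthonormal: Gamma^T Gamma = I.
print(np.allclose(Gamma.T @ Gamma, np.eye(2)))  # True
```

The signs of the computed eigenvectors may differ from those printed in the example; an eigenvector is only determined up to sign.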
Using the spectral decomposition, we can define powers of a matrix $\data{A}(p\times p)$.
Suppose $\data{A}$ is a symmetric matrix. Then by Theorem 2.1,
$\data{A} = \Gamma\Lambda\Gamma^{\top}$, and we define for some $\alpha\in\mathbb{R}$
\begin{displaymath}
\data{A}^\alpha = \Gamma\Lambda^\alpha \Gamma^{\top},
\end{displaymath}
(2.19)
where $\Lambda^\alpha = \mathop{\hbox{diag}}(\lambda_1^\alpha,\ldots,\lambda_p^\alpha)$.
In particular, we can easily calculate the inverse of the matrix $\data{A}$.
Suppose that the eigenvalues of $\data{A}$ are positive. Then with $\alpha=-1$,
we obtain the inverse of $\data{A}$ from
\begin{displaymath}
\data{A}^{-1} = \Gamma\Lambda^{-1}\Gamma^{\top}.
\end{displaymath}
(2.20)
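Equations (2.19) and (2.20) can be illustrated numerically; the following sketch (not part of the text) reuses the matrix of Example 2.4, whose eigenvalues are non-zero so that $\alpha=-1$ is admissible:

```python
import numpy as np

# Symmetric matrix from Example 2.4.
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])
lam, Gamma = np.linalg.eigh(A)

def matrix_power(alpha):
    """A^alpha = Gamma Lambda^alpha Gamma^T, cf. (2.19).

    Integer alpha works for any non-zero eigenvalues; fractional alpha
    would additionally require positive eigenvalues.
    """
    return Gamma @ np.diag(lam ** alpha) @ Gamma.T

# alpha = -1 gives the inverse (2.20).
A_inv = matrix_power(-1.0)
print(np.allclose(A_inv, np.linalg.inv(A)))    # True

# alpha = 2 agrees with the ordinary matrix product A A.
print(np.allclose(matrix_power(2.0), A @ A))   # True
```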
Another interesting decomposition, which will be used later, is given in
the following theorem.
THEOREM 2.2 (Singular Value Decomposition)
Each matrix $\data{A}(n\times p)$ with rank $r$ can be decomposed as
\begin{displaymath}
\data{A} = \Gamma\,\Lambda\,\Delta^{\top},
\end{displaymath}
where $\Gamma(n\times r)$ and $\Delta(p\times r)$. Both $\Gamma$ and $\Delta$ are
column orthonormal, i.e., $\Gamma^{\top}\Gamma = \Delta^{\top}\Delta = \data{I}_r$,
and $\Lambda=\mathop{\hbox{diag}}\left(\lambda_1^{1/2},\ldots,\lambda_r^{1/2}\right)$,
$\lambda_j>0$.
The values $\lambda_1,\ldots,\lambda_r$ are the non-zero eigenvalues of the
matrices $\data{A}\data{A}^{\top}$ and $\data{A}^{\top}\data{A}$.
$\Gamma$ and $\Delta$ consist of the corresponding $r$ eigenvectors of
these matrices.
This is obviously a generalization of Theorem 2.1 (Jordan
decomposition). With Theorem 2.2, we can find a $G$-inverse $\data{A}^{-}$
of $\data{A}$. Indeed, define $\data{A}^{-} = \Delta\,\Lambda^{-1}\,\Gamma^{\top}$.
Then $\data{A}\,\data{A}^{-}\,\data{A} = \Gamma\,\Lambda\,\Delta^{\top} = \data{A}$.
Note that the $G$-inverse is not unique.
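The construction $\data{A}^{-} = \Delta\Lambda^{-1}\Gamma^{\top}$ can be sketched with NumPy's thin SVD (an illustration with a hypothetical $2\times 3$ matrix, not taken from the text); note that `numpy.linalg.svd` returns singular values, i.e. the $\lambda_j^{1/2}$ of Theorem 2.2:

```python
import numpy as np

# A 2x3 matrix of full rank r = 2.
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]])

# Thin SVD: U is (n x r), s holds the r singular values, Vt is (r x p).
# In the theorem's notation: Gamma = U, Lambda = diag(s), Delta = Vt.T.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# G-inverse: A^- = Delta Lambda^{-1} Gamma^T.
A_minus = Vt.T @ np.diag(1.0 / s) @ U.T

# Defining property of a generalized inverse: A A^- A = A.
print(np.allclose(A @ A_minus @ A, A))   # True

# Column orthonormality of Gamma and Delta.
print(np.allclose(U.T @ U, np.eye(2)))   # True
print(np.allclose(Vt @ Vt.T, np.eye(2))) # True
```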
EXAMPLE 2.5
In Example 2.2, we showed that the generalized inverse of
$\data{A}=\left(\begin{array}{ll} 1&0\\ 0&0 \end{array}\right)$
is
$\data{A}^{-}=\left(\begin{array}{ll} 1&0\\ 0&0 \end{array}\right)$.
The following also holds:
\begin{displaymath}
\left(\begin{array}{ll} 1&0\\ 0&0 \end{array}\right)
\left(\begin{array}{ll} 1&0\\ 0&8 \end{array}\right)
\left(\begin{array}{ll} 1&0\\ 0&0 \end{array}\right)
= \left(\begin{array}{ll} 1&0\\ 0&0 \end{array}\right),
\end{displaymath}
which means that the matrix
$\left(\begin{array}{ll} 1&0\\ 0&8 \end{array}\right)$
is also a generalized inverse of $\data{A}$.
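The non-uniqueness in Example 2.5 is easy to verify numerically (a sketch, not part of the text): both candidate matrices satisfy the defining property $\data{A}\,G\,\data{A} = \data{A}$.

```python
import numpy as np

# Matrix A from Example 2.5 and two candidate generalized inverses.
A  = np.array([[1.0, 0.0],
               [0.0, 0.0]])
G1 = np.array([[1.0, 0.0],
               [0.0, 0.0]])
G2 = np.array([[1.0, 0.0],
               [0.0, 8.0]])

# A G A = A is all a generalized inverse must satisfy, so G is not unique.
print(np.allclose(A @ G1 @ A, A))   # True
print(np.allclose(A @ G2 @ A, A))   # True
```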
Summary
$\ast$ The Jordan decomposition gives a representation of a
symmetric matrix in terms of its eigenvalues and eigenvectors.
$\ast$ The eigenvectors belonging to the largest eigenvalues
indicate the ``main direction'' of the data.
$\ast$ The Jordan decomposition allows one to easily compute the power
of a symmetric matrix $\data{A}$: $\data{A}^\alpha = \Gamma\Lambda^\alpha\Gamma^{\top}$.
$\ast$ The singular value decomposition (SVD) is a generalization
of the Jordan decomposition to non-square matrices.