2.2 Spectral Decompositions

The computation of eigenvalues and eigenvectors is an important tool in the analysis of matrices. The spectral decomposition, or Jordan decomposition, links the structure of a matrix to its eigenvalues and eigenvectors.

THEOREM 2.1 (Jordan Decomposition)   Each symmetric matrix ${\data{A}}(p\times p)$ can be written as
\begin{displaymath}
{{\data{A}}} = \Gamma\ \Lambda \ \Gamma^{\top}
= \sum^p_{j=1} \lambda_j \gamma_{\col{j}} \gamma_{\col{j}}^{\top}
\end{displaymath} (2.18)

where

\begin{displaymath}\Lambda = \mathop{\hbox{diag}}(\lambda_1, \ldots, \lambda_p) \end{displaymath}

and where

\begin{displaymath}\Gamma = (\gamma_{\col{1}}, \gamma_{\col{2}}, \ldots,
\gamma_{\col{p}}) \end{displaymath}

is an orthogonal matrix consisting of the eigenvectors $\gamma_{\col{j}}$ of ${\data{A}}$.

EXAMPLE 2.4   Suppose that ${\data{A}} = \left({1\atop 2}{2\atop 3} \right)$. The eigenvalues are found by solving $\vert{\data{A}}-\lambda {\data{I}}\vert=0$. This is equivalent to

\begin{displaymath}\left \vert\begin{array}{cc} 1-\lambda &2\\ 2&3-\lambda \end{array}\right
\vert=(1-\lambda )(3-\lambda )-4=0.\end{displaymath}

Hence, the eigenvalues are $\lambda _1=2+\sqrt 5$ and $\lambda_2=2-\sqrt 5$. The eigenvectors are $\gamma_{\col{1}}= (0.5257, 0.8506)^{\top}$ and $\gamma_{\col{2}}= (0.8506, -0.5257)^{\top}$. They are orthogonal since $\gamma_{\col{1}}^{\top} \gamma_{\col{2}}=0$.
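The factorization in this example is easy to verify numerically. The following sketch is a supplementary illustration (not part of the original development) using NumPy's routine for symmetric eigenproblems; note that numpy.linalg.eigh returns the eigenvalues in ascending order and may choose the opposite sign for an eigenvector.

\begin{verbatim}
import numpy as np

# Matrix from Example 2.4
A = np.array([[1.0, 2.0],
              [2.0, 3.0]])

# eigh is tailored to symmetric matrices and returns orthonormal eigenvectors
lam, Gamma = np.linalg.eigh(A)       # eigenvalues in ascending order
print(lam)                           # approx [2 - sqrt(5), 2 + sqrt(5)]

# Reassemble A = Gamma Lambda Gamma^T, i.e. equation (2.18)
Lambda = np.diag(lam)
print(np.allclose(Gamma @ Lambda @ Gamma.T, A))   # True

# Columns of Gamma are orthonormal: Gamma^T Gamma = I
print(np.allclose(Gamma.T @ Gamma, np.eye(2)))    # True
\end{verbatim}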

Using the spectral decomposition, we can define powers of a symmetric matrix $ \data{A}( p \times p) $. By Theorem 2.1

\begin{displaymath}\data{A} = \Gamma\Lambda \Gamma^{\top},\end{displaymath}

and we define for some $\alpha \in \mathbb{R}$
\begin{displaymath}
\data{A}^\alpha =\Gamma\Lambda ^\alpha \Gamma^{\top},
\end{displaymath} (2.19)

where $\Lambda ^\alpha =\mathop{\hbox{diag}}(\lambda ^\alpha _1,\ldots ,\lambda ^\alpha_p)$. In particular, we can easily calculate the inverse of the matrix $\data{A}$. Suppose that the eigenvalues of $\data{A}$ are positive. Then with $\alpha =-1$, we obtain the inverse of $\data{A}$ from
\begin{displaymath}
\data{A}^{-1}= \Gamma \Lambda^{-1} \Gamma^{\top}.
\end{displaymath} (2.20)
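As a supplementary illustration, the sketch below applies (2.19) and (2.20) to a positive definite matrix chosen purely for this purpose (the matrix of Example 2.4 has a negative eigenvalue, so non-integer powers would not be real there).

\begin{verbatim}
import numpy as np

# A positive definite matrix chosen only for illustration
B = np.array([[2.0, 1.0],
              [1.0, 3.0]])

lam, Gamma = np.linalg.eigh(B)

def matrix_power(alpha):
    # B^alpha = Gamma diag(lam^alpha) Gamma^T, valid here since all lam > 0
    return Gamma @ np.diag(lam ** alpha) @ Gamma.T

# alpha = -1 reproduces the ordinary inverse, equation (2.20)
print(np.allclose(matrix_power(-1.0), np.linalg.inv(B)))   # True

# alpha = 1/2 gives a symmetric square root: B^{1/2} B^{1/2} = B
B_half = matrix_power(0.5)
print(np.allclose(B_half @ B_half, B))                     # True
\end{verbatim}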

Another interesting decomposition, which will be used later, is given in the following theorem.

THEOREM 2.2 (Singular Value Decomposition)   Each matrix ${\data{A}}(n\times p)$ with rank $r$ can be decomposed as

\begin{displaymath}{\data{A}} = \Gamma\ \Lambda\ \Delta^{\top},\end{displaymath}

where $\Gamma(n\times r)$ and $\Delta(p\times r)$ are column orthonormal, i.e., $\Gamma^{\top}\Gamma = \Delta^{\top}\Delta={\data{I}}_r$, and $\Lambda=\mathop{\hbox{diag}}\left( \lambda_1^{1/2}, \ldots,
\lambda_r^{1/2} \right) $ with $\lambda_j>0$. The values $\lambda_1,\ldots ,\lambda_r$ are the non-zero eigenvalues of the matrices $\data{A}\data{A}^{\top}$ and $\data{A}^{\top}\data{A}$, and the columns of $\Gamma$ and $\Delta$ are the corresponding $r$ eigenvectors of these matrices.

This is obviously a generalization of Theorem 2.1 (Jordan decomposition). With Theorem 2.2, we can find a $G$-inverse ${\data{A}}^-$ of ${\data{A}}$. Indeed, define ${\data{A}}^- =\Delta\ \Lambda^{-1}\ \Gamma^{\top}$. Then, using the column orthonormality of $\Gamma$ and $\Delta$, ${\data{A}}\ {\data{A}}^-\ {\data{A}} = \Gamma\ \Lambda\ \Delta^{\top}\ \Delta\ \Lambda^{-1}\ \Gamma^{\top}\ \Gamma\ \Lambda\ \Delta^{\top} = \Gamma\ \Lambda\ \Delta^{\top}={\data{A}}$. Note that the $G$-inverse is not unique.
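As a supplementary illustration, the following sketch computes the decomposition of Theorem 2.2 for an arbitrary $4\times 2$ matrix chosen for this purpose; note that the singular values returned by numpy.linalg.svd play the role of $\lambda_j^{1/2}$ in the notation above.

\begin{verbatim}
import numpy as np

# An arbitrary 4 x 2 matrix of rank 2, chosen only for illustration
A = np.array([[1.0, 2.0],
              [0.0, 1.0],
              [3.0, 1.0],
              [1.0, 1.0]])

# Thin SVD: U is n x r, Vt is r x p, s holds the singular values lambda_j^{1/2}
U, s, Vt = np.linalg.svd(A, full_matrices=False)
Gamma, Lambda, Delta = U, np.diag(s), Vt.T

print(np.allclose(Gamma @ Lambda @ Delta.T, A))     # A = Gamma Lambda Delta^T
print(np.allclose(Gamma.T @ Gamma, np.eye(2)))      # column orthonormality
print(np.allclose(Delta.T @ Delta, np.eye(2)))

# G-inverse A^- = Delta Lambda^{-1} Gamma^T satisfies A A^- A = A
A_minus = Delta @ np.linalg.inv(Lambda) @ Gamma.T
print(np.allclose(A @ A_minus @ A, A))              # True
\end{verbatim}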

EXAMPLE 2.5   In Example 2.2, we showed that the generalized inverse of ${\data{A}}=\left(
\begin{array}{ll}
1&0\\
0&0
\end{array}\right)$ is ${\data{A}}^{-}=\left(
\begin{array}{ll}
1&0\\
0&0
\end{array}\right)$. The following also holds

\begin{displaymath}
\left(
\begin{array}{ll}
1&0\\
0&0
\end{array}\right)
\left(
\begin{array}{ll}
1&0\\
0&8
\end{array}\right)
\left(
\begin{array}{ll}
1&0\\
0&0
\end{array}\right)
=
\left(
\begin{array}{ll}
1&0\\
0&0
\end{array}\right)
\end{displaymath}

which means that the matrix $\left(
\begin{array}{ll}
1&0\\
0&8
\end{array}\right)$ is also a generalized inverse of ${\data{A}}
$.
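The non-uniqueness can also be checked numerically; the short sketch below (a supplementary illustration) confirms that both matrices satisfy ${\data{A}}\,{\data{A}}^-\,{\data{A}}={\data{A}}$.

\begin{verbatim}
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])

# Both candidate G-inverses from Example 2.5 satisfy A G A = A
for G in (np.array([[1.0, 0.0], [0.0, 0.0]]),
          np.array([[1.0, 0.0], [0.0, 8.0]])):
    print(np.allclose(A @ G @ A, A))   # True for both
\end{verbatim}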

Summary
$\ast$
The Jordan decomposition gives a representation of a symmetric matrix in terms of eigenvalues and eigenvectors.
$\ast$
The eigenvectors belonging to the largest eigenvalues indicate the ``main direction'' of the data.
$\ast$
The Jordan decomposition allows one to easily compute the power of a symmetric matrix $\data{A}$: $\data{A}^\alpha=\Gamma\Lambda^\alpha\Gamma^{\top}$.
$\ast$
The singular value decomposition (SVD) is a generalization of the Jordan decomposition to non-square matrices.