The aim of this section is to present a duality relationship
between the two approaches shown in Sections 8.2 and 8.3.
Consider the eigenvector equations in $\mathbb{R}^n$:
\begin{displaymath}
(\data{X}\data{X}^{\top})v_k=\mu_k v_k
\end{displaymath}
(8.6)
for $k\le r$, where $r=\mathop{\rm rank}(\data{X})$. Multiplying by $\data{X}^{\top}$, we have
\begin{displaymath}
(\data{X}^{\top}\data{X})(\data{X}^{\top}v_k)=\mu_k(\data{X}^{\top}v_k),
\end{displaymath}
so that each eigenvector $v_k$ of $\data{X}\data{X}^{\top}$ corresponds to an eigenvector $\data{X}^{\top}v_k$ of $\data{X}^{\top}\data{X}$ associated with the same eigenvalue $\mu_k$.
This means that every non-zero eigenvalue of $\data{X}\data{X}^{\top}$ is an eigenvalue of $\data{X}^{\top}\data{X}$. The corresponding eigenvectors are related by $u_k=c_k\,\data{X}^{\top}v_k$, where $c_k$ is some constant.
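The claim above can be checked numerically. The sketch below, assuming a hypothetical small Gaussian data matrix `X` of size $5\times 3$, verifies that $\data{X}^{\top}v_k$ is an eigenvector of $\data{X}^{\top}\data{X}$ with the same eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((5, 3))  # hypothetical 5 x 3 data matrix

# Eigen-decomposition of the n x n matrix X X^T
mu, V = np.linalg.eigh(X @ X.T)
k = np.argmax(mu)                # pick the largest eigenvalue mu_k
v_k = V[:, k]

# X^T v_k is an eigenvector of X^T X with the same eigenvalue mu_k
lhs = (X.T @ X) @ (X.T @ v_k)
rhs = mu[k] * (X.T @ v_k)
assert np.allclose(lhs, rhs)
```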
Now consider the eigenvector equations in $\mathbb{R}^p$:
\begin{displaymath}
(\data{X}^{\top}\data{X})u_k=\lambda_ku_k
\end{displaymath}
(8.9)
for $k\le r$. Multiplying by $\data{X}$, we have
\begin{displaymath}
(\data{X}\data{X}^{\top})(\data{X}u_k)=\lambda _k(\data{X}u_k),
\end{displaymath}
(8.10)
i.e., each eigenvector $u_k$ of $\data{X}^{\top}\data{X}$ corresponds to an eigenvector $\data{X}u_k$ of $\data{X}\data{X}^{\top}$ associated with the same eigenvalue $\lambda_k$.
Therefore, every non-zero eigenvalue of $\data{X}^{\top}\data{X}$ is an eigenvalue of $\data{X}\data{X}^{\top}$. The corresponding eigenvectors are related by $v_k=d_k\,\data{X}u_k$, where $d_k$ is some constant.
Now, since the eigenvectors are normalized, $u_k^{\top}u_k=v_k^{\top}v_k=1$, we have $c_k=d_k=\frac{1}{\sqrt{\lambda_k}}$. This leads to the following result:
THEOREM 8.4 (Duality Relations)
Let $r$ be the rank of $\data{X}$. For $k\le r$, the eigenvalues $\lambda_{k}$ of $\data{X}^{\top}\data{X}$ and $\data{X}\data{X}^{\top}$ are the same and the eigenvectors ($u_{k}$ and $v_{k}$, respectively) are related by
\begin{displaymath}
u_{k} = \frac{1}{\sqrt{\lambda_{k}}}\,\data{X}^{\top}v_{k}
\end{displaymath}
(8.11)
\begin{displaymath}
v_{k} = \frac{1}{\sqrt{\lambda_{k}}}\,\data{X}u_{k}.
\end{displaymath}
(8.12)
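Theorem 8.4 can be illustrated numerically. The sketch below, assuming a hypothetical $6\times 3$ Gaussian data matrix `X`, checks that the non-zero eigenvalues of $\data{X}^{\top}\data{X}$ and $\data{X}\data{X}^{\top}$ coincide, and that $v_k=\data{X}u_k/\sqrt{\lambda_k}$ from (8.12) yields unit-norm eigenvectors of $\data{X}\data{X}^{\top}$:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 6, 3
X = rng.standard_normal((n, p))  # hypothetical data matrix
r = np.linalg.matrix_rank(X)

# The r largest eigenvalues of X^T X (p x p) and X X^T (n x n) agree
lam_p = np.sort(np.linalg.eigvalsh(X.T @ X))[::-1][:r]
lam_n = np.sort(np.linalg.eigvalsh(X @ X.T))[::-1][:r]
assert np.allclose(lam_p, lam_n)

# Build v_k = X u_k / sqrt(lambda_k) as in (8.12)
lam, U = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
lam, U = lam[order][:r], U[:, order][:, :r]
V = X @ U / np.sqrt(lam)

# Each column of V is a unit-norm eigenvector of X X^T
assert np.allclose((X @ X.T) @ V, V * lam)
assert np.allclose(np.linalg.norm(V, axis=0), 1.0)
```

Working with the smaller of the two matrices and transferring the eigenvectors via (8.11)/(8.12) is exactly how the duality is exploited in practice.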
Note that the projection of the $p$ variables on the factorial axis $v_k$ is given by
\begin{displaymath}
w_{k} = \data{X}^{\top}v_{k} = \frac{1}{\sqrt{\lambda_{k}}}\,\data{X}^{\top}\data{X}u_{k} =
\sqrt{\lambda_{k}}\ u_{k}.
\end{displaymath}
(8.13)
Therefore, the eigenvectors $v_k$ do not have to be explicitly recomputed to get $w_k$.
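The shortcut in (8.13) can be verified directly. The sketch below, again assuming a hypothetical $6\times 3$ Gaussian data matrix `X`, computes the variable coordinates $w_k=\sqrt{\lambda_k}\,u_k$ from the eigenpairs of $\data{X}^{\top}\data{X}$ alone and compares them with the detour through $v_k$:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((6, 3))  # hypothetical data matrix

lam, U = np.linalg.eigh(X.T @ X)     # eigenpairs of the p x p matrix
order = np.argsort(lam)[::-1]
lam, U = lam[order], U[:, order]

# Coordinates of the variables: w_k = sqrt(lambda_k) u_k   (eq. 8.13)
W = U * np.sqrt(lam)

# Same result via the detour v_k = X u_k / sqrt(lambda_k), then w_k = X^T v_k
V = X @ U / np.sqrt(lam)
assert np.allclose(W, X.T @ V)
```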
Note that $u_k$ and $v_k$ provide the SVD of $\data{X}$ (see Theorem 2.2). Letting $\data{U}=[u_1\;u_2\;\ldots\;u_r]$, $\data{V}=[v_1\;v_2\;\ldots\;v_r]$ and $\Lambda=\mathop{\rm diag}(\lambda_1,\ldots,\lambda_r)$, we have
\begin{displaymath}
\data{X}=\data{V}\Lambda^{1/2}\data{U}^{\top}
\end{displaymath}
so that
\begin{displaymath}
x_{ij} = \sum_{k=1}^r \lambda_k^{1/2} \, v_{ik} \, u_{jk}.
\end{displaymath}
(8.14)
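The reconstruction formula (8.14) can be tested numerically. The sketch below, assuming a hypothetical $5\times 3$ Gaussian data matrix `X` of full column rank, rebuilds $\data{X}$ from the eigenpairs of $\data{X}^{\top}\data{X}$ via the duality relation (8.12):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 5, 3
X = rng.standard_normal((n, p))  # hypothetical data matrix
r = np.linalg.matrix_rank(X)

lam, U = np.linalg.eigh(X.T @ X)
order = np.argsort(lam)[::-1]
lam, U = lam[order][:r], U[:, order][:, :r]
V = X @ U / np.sqrt(lam)         # duality relation (8.12)

# x_ij = sum_k sqrt(lambda_k) v_ik u_jk   (eq. 8.14)
X_rebuilt = (V * np.sqrt(lam)) @ U.T
assert np.allclose(X, X_rebuilt)
```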
In the following section, this method is applied to analyse consumption behavior across different household types.
Summary
- The matrices $\data{X}^{\top}\data{X}$ and $\data{X}\data{X}^{\top}$ have the same non-zero eigenvalues $\lambda_1,\ldots,\lambda_r$, where $r=\mathop{\rm rank}(\data{X})$.
- The eigenvectors of $\data{X}^{\top}\data{X}$ can be calculated from the eigenvectors of $\data{X}\data{X}^{\top}$ and vice versa: $u_{k} = \frac{1}{\sqrt{\lambda_{k}}}\,\data{X}^{\top}v_{k}$ and $v_{k} = \frac{1}{\sqrt{\lambda_{k}}}\,\data{X}u_{k}$.
- The coordinates representing the variables (columns) of $\data{X}$ in a lower-dimensional subspace can be easily calculated by $w_k=\sqrt{\lambda_k}\,u_k$.