Matrix decompositions are useful in numerical problems, in particular for solving systems of linear equations. All XploRe examples for this section can be found in XLGmatrix08.xpl.
Let $A$ be a square matrix of dimension $n \times n$. A scalar $\lambda$ is an eigenvalue and a nonzero vector $\gamma$ is an eigenvector of $A$ if $A\gamma = \lambda\gamma$. The eigenvalues are the roots of the characteristic polynomial of order $n$ defined as $\det(A - \lambda I_n)$, where $I_n$ denotes the $n$-dimensional identity matrix. The determinant of the matrix $A$ is equal to the product of its eigenvalues: $\det(A) = \prod_{i=1}^{n} \lambda_i$.
The function eigsm calculates the eigenvectors and eigenvalues of a given symmetric matrix. We evaluate the eigenvalues and eigenvectors of nonsymmetric matrices with the function eiggn. The function eigsm takes the matrix as its only argument and returns the eigenvalues and eigenvectors; the returned arguments are unsorted with respect to the eigenvalues.
Consider the following example:
  x = #(1, 2)~#(2, 3)
  y = eigsm(x)
  y

in which we define a matrix x and calculate its eigenvalues and eigenvectors. XploRe stores them in a variable of list type, y: the variable y.values contains the eigenvalues, while the variable y.vectors contains the corresponding eigenvectors:
  Contents of y.values
  [1,]  -0.23607
  [2,]   4.2361
  Contents of y.vectors
  [1,]   0.85065   0.52573
  [2,]  -0.52573   0.85065

We verify that the determinant of the matrix x is equal to the product of the eigenvalues of x:
  det(x) - y.values[1] * y.values[2]

gives

  Contents of _tmp
  [1,]   4.4409e-16

i.e. something numerically close to zero.
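For readers who want to mirror this check outside XploRe, here is a rough NumPy sketch. An assumption of this illustration: `numpy.linalg.eigh` plays the role of `eigsm` for symmetric matrices, although it returns the eigenvalues sorted in ascending order rather than unsorted:

```python
import numpy as np

# The same symmetric 2x2 matrix as in the XploRe example above.
x = np.array([[1.0, 2.0],
              [2.0, 3.0]])

# eigh handles symmetric matrices, like eigsm; the eigenvalues
# come back sorted in ascending order.
values, vectors = np.linalg.eigh(x)

# The eigenvalues are 2 - sqrt(5) and 2 + sqrt(5),
# i.e. approximately -0.23607 and 4.2361, as in the output above.
assert np.allclose(values, [2 - np.sqrt(5), 2 + np.sqrt(5)])

# det(x) equals the product of the eigenvalues (up to rounding error).
assert np.isclose(np.linalg.det(x), values.prod())
```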
If the eigenvalues of the matrix $A$ are all different, this matrix can be decomposed as $A = \Gamma \Lambda \Gamma^{\top}$, where $\Lambda$ is the diagonal matrix whose diagonal elements are the eigenvalues, and $\Gamma$ is the matrix obtained by the horizontal concatenation of the eigenvectors. The transformation matrix $\Gamma$ is orthonormal, i.e. $\Gamma^{\top} \Gamma = I_n$. This decomposition of the matrix $A$ is called the spectral decomposition.
We check that the matrix of concatenated eigenvectors is orthonormal:
  y.vectors'*y.vectors

yields a matrix numerically close to the identity matrix:

  Contents of _tmp
  [1,]   1            -1.0219e-17
  [2,]  -1.0219e-17    1
We verify the spectral decomposition of our example:
  z = y.vectors*diag(y.values)*y.vectors'
  z

which gives the original matrix x:

  Contents of z
  [1,]   1   2
  [2,]   2   3
If the matrix $A$ can be decomposed as $A = \Gamma \Lambda \Gamma^{\top}$, then $A^{\alpha} = \Gamma \Lambda^{\alpha} \Gamma^{\top}$. In particular, $A^{-1} = \Gamma \Lambda^{-1} \Gamma^{\top}$. Therefore, the inverse of x can be calculated as

  xinv = y.vectors*inv(diag(y.values))*y.vectors'
  xinv

which gives

  Contents of xinv
  [1,]  -3   2
  [2,]   2  -1

which is equal to the inverse of x:
  inv(x)

  Contents of cinv
  [1,]  -3   2
  [2,]   2  -1
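The spectral decomposition, the orthonormality of the eigenvector matrix, and the inverse derived from the decomposition can all be verified in the same NumPy sketch (again using `numpy.linalg.eigh` as a stand-in for `eigsm`):

```python
import numpy as np

x = np.array([[1.0, 2.0],
              [2.0, 3.0]])
values, gamma = np.linalg.eigh(x)   # gamma holds the eigenvectors columnwise

# Spectral decomposition: x = Gamma Lambda Gamma'.
assert np.allclose(gamma @ np.diag(values) @ gamma.T, x)

# Gamma is orthonormal: Gamma' Gamma = I.
assert np.allclose(gamma.T @ gamma, np.eye(2))

# Inverse via the decomposition: x^{-1} = Gamma Lambda^{-1} Gamma',
# which matches the xinv matrix shown above.
xinv = gamma @ np.diag(1.0 / values) @ gamma.T
assert np.allclose(xinv, [[-3.0, 2.0], [2.0, -1.0]])
assert np.allclose(xinv, np.linalg.inv(x))
```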
Let $A$ be an $n \times p$ matrix, with $n \ge p$, and rank $r$, with $r \le p$. The singular value decomposition of the matrix $A$ decomposes this matrix as $A = U \Lambda V^{\top}$, where $U$ is the $n \times r$ orthonormal matrix of eigenvectors of $A A^{\top}$, $V$ is the $p \times r$ orthonormal matrix of eigenvectors of $A^{\top} A$ associated with the $r$ nonzero eigenvalues, and $\Lambda$ is an $r \times r$ diagonal matrix.
The function svd computes the singular value decomposition of an $n \times p$ matrix x. This function returns the matrices $U$, $\Lambda$ and $V$ in the form of a list.
  x = #(1, 2, 3)~#(2, 3, 4)
  y = svd(x)
  y

XploRe returns the matrices

  Contents of y.u
  [1,]   0.84795   0.3381
  [2,]   0.17355   0.55065
  [3,]  -0.50086   0.7632
  Contents of y.l
  [1,]   0.37415
  [2,]   6.5468
  Contents of y.v
  [1,]  -0.82193   0.56959
  [2,]   0.56959   0.82193
We test that y.u*diag(y.l)*y.v' equals x with the commands

  xx = y.u*diag(y.l)*y.v'
  xx

This displays the matrix x:

  Contents of xx
  [1,]   1   2
  [2,]   2   3
  [3,]   3   4
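The same decomposition can be reproduced with `numpy.linalg.svd`. Note one difference assumed in this sketch: NumPy sorts the singular values in descending order, whereas the XploRe output above lists 0.37415 first:

```python
import numpy as np

x = np.array([[1.0, 2.0],
              [2.0, 3.0],
              [3.0, 4.0]])

# full_matrices=False yields the economy-size factors, matching the
# shapes returned by XploRe's svd (u is 3x2, 2 singular values, v is 2x2).
u, s, vt = np.linalg.svd(x, full_matrices=False)

# Same singular values as y.l above, but in descending order.
assert np.allclose(s, [6.5468, 0.37415], atol=1e-4)

# Reconstruction: u diag(s) v' recovers x.
assert np.allclose(u @ np.diag(s) @ vt, x)
```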
The LU decomposition of an $n$-dimensional square matrix $A$ is defined as $A = L\,U$, where $L$ is a lower triangular and $U$ an upper triangular matrix; the rows of $A$ may be permuted during the factorization (pivoting). The function ludecomp computes this decomposition:

  x = #(1, 2)~#(2, 3)
  lu = ludecomp(x)
  lu

gives

  Contents of lu.l
  [1,]   1     0
  [2,]   0.5   1
  Contents of lu.u
  [1,]   2     3
  [2,]   0     0.5
  Contents of lu.index
  [1,]   2
  [2,]   1
We re-obtain the original matrix x by using the function index, which takes as its argument the row-permutations applied in the LU decomposition. The instruction

  index(lu.l*lu.u, lu.index)

returns the matrix x:

  Contents of index
  [1,]   1   2
  [2,]   2   3
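An analogous decomposition is available in SciPy (this sketch assumes SciPy is installed; `scipy.linalg.lu` returns a permutation matrix p instead of the index vector that ludecomp uses):

```python
import numpy as np
from scipy.linalg import lu   # assumes SciPy is available

x = np.array([[1.0, 2.0],
              [2.0, 3.0]])

# scipy.linalg.lu returns a permutation matrix p and the triangular
# factors, with x = p @ l @ u; XploRe's lu.index encodes the same
# row permutation as a vector.
p, l, u = lu(x)
assert np.allclose(p @ l @ u, x)

# The factors match the XploRe output: the rows were swapped by pivoting.
assert np.allclose(l, [[1.0, 0.0], [0.5, 1.0]])
assert np.allclose(u, [[2.0, 3.0], [0.0, 0.5]])
```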
Let $A$ be an $n$-dimensional square matrix that is symmetric, i.e. $A = A^{\top}$, and positive definite, i.e. $x^{\top} A x > 0$ for all nonzero $n$-dimensional vectors $x$. The Cholesky decomposition decomposes the matrix $A$ as $A = L L^{\top}$, where $L$ is a lower triangular matrix.
The function chold computes a triangularization of the input matrix. The following steps are necessary to find the lower triangular $L$:

  library("xplore")                    ; unit is in library xplore!
  x = #(2, 2)~#(2, 3)
  tmp = chold(x, rows(x))
  d = diag(xdiag(tmp))                 ; diagonal matrix
  b = tmp-d+diag(matrix(rows(tmp)))    ; lower triangular
  L = b'*sqrt(d)                       ; Cholesky triangular

We can check that the decomposition works by comparing x and L*L'. The instruction
  x - L*L'

displays

  Contents of _tmp
  [1,]  -4.4409e-16  -4.4409e-16
  [2,]  -4.4409e-16  -4.4409e-16

which means that the difference between both matrices is practically zero.
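For comparison, NumPy offers the Cholesky factor directly, so a sketch of the same check needs none of the post-processing steps required for chold:

```python
import numpy as np

# The same symmetric positive definite matrix as in the chold example.
x = np.array([[2.0, 2.0],
              [2.0, 3.0]])

# numpy.linalg.cholesky returns the lower triangular factor L directly.
L = np.linalg.cholesky(x)

assert np.allclose(L, np.tril(L))   # L is lower triangular
assert np.allclose(L @ L.T, x)      # L L' reconstructs x
```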
The Cholesky decomposition is used in computational statistics, for example for inverting Hessian matrices and the cross-product matrices $X^{\top} X$ in regression analysis.