# Eigenvalues and Eigenvectors: Properties

## Setup

This vignette uses an example of a $$3 \times 3$$ matrix to illustrate some properties of eigenvalues and eigenvectors. We could consider this to be the variance-covariance matrix of three variables, but the main point is that the matrix is square and symmetric, which guarantees that the eigenvalues, $$\lambda_i$$, are real numbers. Covariance matrices are also positive semi-definite, meaning that their eigenvalues are non-negative, $$\lambda_i \ge 0$$.

A <- matrix(c(13, -4, 2, -4, 11, -2, 2, -2, 8), 3, 3, byrow=TRUE)
A
##      [,1] [,2] [,3]
## [1,]   13   -4    2
## [2,]   -4   11   -2
## [3,]    2   -2    8

Get the eigenvalues and eigenvectors using eigen(); it returns a named list, with the eigenvalues in the values component and the eigenvectors (as columns) in the vectors component.

ev <- eigen(A)
# extract components
(values <- ev$values)
##  17 8 7
(vectors <- ev$vectors)
##         [,1]    [,2]   [,3]
## [1,]  0.7454  0.6667 0.0000
## [2,] -0.5963  0.6667 0.4472
## [3,]  0.2981 -0.3333 0.8944

The eigenvalues are always returned in decreasing order, and each column of vectors corresponds to the eigenvalue in the same position in values.

## Properties of eigenvalues and eigenvectors

The following steps illustrate the main properties of eigenvalues and eigenvectors. We use the notation $$A = V \Lambda V'$$ to express the decomposition of the matrix $$A$$, where $$V$$ is the matrix whose columns are the eigenvectors and $$\Lambda = diag(\lambda_1, \lambda_2, \dots, \lambda_p)$$ is the diagonal matrix composed of the ordered eigenvalues, $$\lambda_1 \ge \lambda_2 \ge \dots \ge \lambda_p$$.
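As a quick numerical check of this decomposition (a sketch using the A and ev objects computed above), we can rebuild A from its eigenvectors and eigenvalues:

```r
V <- ev$vectors                      # columns are the eigenvectors
Lambda <- diag(ev$values)            # diagonal matrix of eigenvalues
all.equal(A, V %*% Lambda %*% t(V))  # TRUE: A = V Lambda V'
```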

1. Orthogonality: Eigenvectors of a symmetric matrix are orthogonal, and eigen() returns them scaled to unit length, so $$V' V = I$$. zapsmall() is handy for cleaning up tiny values.
crossprod(vectors)
##           [,1]      [,2]      [,3]
## [1,] 1.000e+00 3.053e-16 5.551e-17
## [2,] 3.053e-16 1.000e+00 0.000e+00
## [3,] 5.551e-17 0.000e+00 1.000e+00
zapsmall(crossprod(vectors))
##      [,1] [,2] [,3]
## [1,]    1    0    0
## [2,]    0    1    0
## [3,]    0    0    1
1. trace(A) = sum of eigenvalues, $$\sum \lambda_i$$.
library(matlib)   # use the matlib package
tr(A)
##  32
sum(values)
##  32
1. sum of squares of the elements of A = sum of squares of eigenvalues, $$\sum \lambda_i^2$$.
sum(A^2)
##  402
sum(values^2)
##  402
1. determinant = product of eigenvalues, $$det(A) = \prod \lambda_i$$. This means that the determinant will be zero if any $$\lambda_i = 0$$.
det(A)
##  952
prod(values)
##  952
1. rank = number of non-zero eigenvalues. The R() function in matlib computes the rank of a matrix.
R(A)
##  3
sum(values != 0)
##  3
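To see how rank and determinant react to a zero eigenvalue, consider a small singular matrix (a made-up example, not part of the vignette's data). Its third column is the sum of the first two, so one eigenvalue is zero:

```r
# symmetric matrix with a linear dependency: column 3 = column 1 + column 2
B <- matrix(c(1, 0, 1,
              0, 1, 1,
              1, 1, 2), 3, 3)
zapsmall(eigen(B)$values)            # 3 1 0 -- one zero eigenvalue
sum(zapsmall(eigen(B)$values) != 0)  # rank is 2
det(B)                               # zero (up to rounding)
```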
1. eigenvalues of $$A^{-1}$$ = 1/eigenvalues of A. The eigenvectors are the same, but their order is reversed: the reciprocals of eigenvalues in decreasing order are in increasing order, and eigen() again sorts them in decreasing order.
AI <- solve(A)
AI
##          [,1]    [,2]     [,3]
## [1,]  0.08824 0.02941 -0.01471
## [2,]  0.02941 0.10504  0.01891
## [3,] -0.01471 0.01891  0.13340
eigen(AI)$values
##  0.14286 0.12500 0.05882
eigen(AI)$vectors
##        [,1]    [,2]    [,3]
## [1,] 0.0000  0.6667  0.7454
## [2,] 0.4472  0.6667 -0.5963
## [3,] 0.8944 -0.3333  0.2981
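The reciprocal relation can be confirmed directly: taking 1/values and sorting in decreasing order reproduces the eigenvalues of $$A^{-1}$$ shown above.

```r
sort(1/values, decreasing = TRUE)   # 1/7, 1/8, 1/17 -- same as eigen(AI)$values
```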
1. There are similar relations for other powers of a matrix: the eigenvalues of mpower(A, p) are values(A)^p, where mpower(A, 2) = A %*% A, and so on; the eigenvectors are unchanged.
eigen(A %*% A)
## eigen() decomposition
## $values
##  289 64 49
##
## $vectors
##         [,1]    [,2]   [,3]
## [1,]  0.7454  0.6667 0.0000
## [2,] -0.5963  0.6667 0.4472
## [3,]  0.2981 -0.3333 0.8944
eigen(A %*% A %*% A)$values
##  4913 512 343
eigen(mpower(A, 4))$values
##  83521  4096  2401
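These power identities can be checked without forming any matrix products: raising the original eigenvalues to each power reproduces the values shown above.

```r
values^2   # 289 64 49
values^3   # 4913 512 343
values^4   # 83521 4096 2401
```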