Eigenvalues
$Ax = \lambda x$, $A = S\Lambda S^{-1}$. When $\lambda$ and $x$ are both unknown, $Ax = \lambda x$ is a nonlinear equation.

- Applications of Matrix Diagonalization: powers $A^k$ and exponential $e^{At}$; Markov processes (an application of $A^k$); Ordinary Differential Equations (the most often mentioned application); Minimum and Maximum Principles; Principal Component/Discriminant Analysis.

Ex: projection matrix $P$: $Px = 1\cdot x = x$, so the column space of $P$ is the eigenspace with $\lambda_1 = 1$; $Px = 0\cdot x = 0$, so the nullspace is an eigenspace too, with $\lambda_2 = 0$. If a projection matrix $P$ has $\dim(\text{column space}) = r$ and $\dim(\text{nullspace}) = n - r$, then $\lambda = 1$ has an $r$-dimensional eigenspace ($\lambda = 1$ repeats $r$ times) and $\lambda = 0$ has an $(n-r)$-dimensional one.

Only for "certain" vectors $x$ can the transformation $A$ be simplified. Let $x = x_1 + 5x_2$; then $Ax = \lambda_1 x_1 + 5\lambda_2 x_2$. The action of $A$ can still be determined by eigenvalues and eigenvectors!

Two views of $Ax = \lambda x$:
(i) the vector $x$ is in the nullspace of $A - \lambda I$;
(ii) the number $\lambda$ is chosen so that $A - \lambda I$ has a nullspace.
The nullspace of $A - \lambda I$ is then called the "eigenspace" of $A$. All $x$'s in the eigenspace are eigenvectors corresponding to the same $\lambda$.

We are interested in nonzero vectors $x$, so $A - \lambda I$ must be singular.

Conventional solution: find $\lambda$ first from $\det(A - \lambda I) = 0$; then the eigenvectors follow from the nullspace of $A - \lambda I$. For a triangular matrix, the eigenvalues are the diagonal entries (e.g. $\lambda = 1$ or $3/4$ or $1/2$ for diagonal entries 1, 3/4, 1/2).

How to transform matrix $A$ into a diagonal or triangular matrix without changing its eigenvalues? Elimination doesn't work any more! Finding eigenvalues is already a headache (unlike solving $Ax = b$).

Some checks on the eigenvalues (proof?): for $A = \begin{pmatrix} a & b \\ c & d \end{pmatrix}$, $\lambda_1 + \lambda_2 = a + d = \operatorname{trace}$ and $\lambda_1 \lambda_2 = \det A = ad - bc$; in general, the sum of the eigenvalues is the trace and their product is the determinant.
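As a minimal sketch (the matrix below is my own example, not from the notes), the trace and determinant checks on the eigenvalues can be verified numerically:

```python
import numpy as np

# Assumed example matrix [[a, b], [c, d]]; eigenvalues are the roots of
# det(A - lambda*I) = 0.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])

eigvals = np.linalg.eigvals(A)

# lambda1 + lambda2 = a + d (trace), lambda1 * lambda2 = ad - bc (determinant)
sum_check = np.isclose(eigvals.sum(), np.trace(A))
prod_check = np.isclose(eigvals.prod(), np.linalg.det(A))
```

Here the characteristic polynomial is $\lambda^2 - 7\lambda + 10$, so the eigenvalues are 5 and 2, and both checks pass.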
Diagonal Form of a Matrix

An $n \times n$ matrix $A$ with $n$ linearly independent eigenvectors from its eigenspaces can be diagonalized: $S^{-1}AS = \Lambda = \operatorname{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n)$, where $S$'s columns are formed by the eigenvectors; $\lambda_1, \lambda_2, \ldots, \lambda_n$ may not be all distinct.

Proof: $AS = A[x_1\ x_2\ \cdots\ x_n] = [\lambda_1 x_1\ \lambda_2 x_2\ \cdots\ \lambda_n x_n] = S\Lambda$.

Not all matrices are diagonalizable. When diagonalizable, only an $S$ with eigenvector columns can diagonalize $A$ into $\Lambda$ with the eigenvalues as its diagonal entries.

If there are $n$ distinct eigenvalues, all eigenvectors are independent, and the matrix can be diagonalized.

$S$ is not unique. Example: $A = I$: the eigenvalue is 1, with the eigenspace filling up the entire $n$-dimensional space.

Independent Eigenvectors

Ex: defective matrix $A = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ with $\lambda_1 = \lambda_2 = 0$. If $A$ were diagonalizable, $S^{-1}AS = \Lambda = 0$ would give $A = S\,0\,S^{-1} = 0$. But $A$ is not zero! Contradictory! $A$ is not diagonalizable!

$A$ is defective not because the eigenvalues are repeated zeros but because not enough independent eigenvectors can be found.

Ex: $A = \begin{pmatrix} 3 & 1 \\ 0 & 3 \end{pmatrix}$ and $A = 3I$ both have the repeated eigenvalue 3, but only $3I$ has two independent eigenvectors. Diagonalizability is concerned with the eigenvectors.

Reason (why distinct eigenvalues give independent eigenvectors): for a 2 by 2 matrix, let the eigenvectors be $x_1$ and $x_2$, corresponding to $\lambda_1$ and $\lambda_2$ with $\lambda_1 \neq \lambda_2$. Let $c_1 x_1 + c_2 x_2 = 0$ (1); then $A(c_1 x_1 + c_2 x_2) = 0$ gives $c_1 \lambda_1 x_1 + c_2 \lambda_2 x_2 = 0$ (2). Computing (2) $- \lambda_2 \times$ (1): $c_1(\lambda_1 - \lambda_2)x_1 = 0$. But $\lambda_1 \neq \lambda_2$ and $x_1 \neq 0$, so $c_1 = 0$; the same works for $c_2$, and for any number of eigenvectors.
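A minimal sketch of diagonalization in code, using the projection matrix from the notes (the variable names are mine):

```python
import numpy as np

# Diagonalize A = S Lambda S^{-1}; the columns of S are eigenvectors.
A = np.array([[0.5, 0.5],
              [0.5, 0.5]])          # projection matrix from the notes

lams, S = np.linalg.eig(A)
Lam = np.diag(lams)

# S^{-1} A S should recover the diagonal matrix of eigenvalues.
recovered = np.linalg.inv(S) @ A @ S
ok = np.allclose(recovered, Lam)
```

The eigenvalues come out as 1 and 0, as expected for a projection.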
Examples of Diagonalization

Ex: projection matrix $A = \begin{pmatrix} 1/2 & 1/2 \\ 1/2 & 1/2 \end{pmatrix}$: $S = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $AS = S\Lambda = S\begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$.

Ex: rotation $K = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}$. $K$ rotates any vector through $90°$. Are there vectors rotated without changing direction? No real ones: $\det(K - \lambda I) = \lambda^2 + 1 = 0$ gives $\lambda = \pm i$.

More on Eigenvalues and Eigenvectors

Example 1: find the eigenvalues of $A = \begin{pmatrix} 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \\ 1 & 1 & 1 & 1 \end{pmatrix}$. Since $A$ has rank 1, the nullspace is 3-dimensional ($\lambda = 0$ three times), and the trace 4 gives the remaining eigenvalue $\lambda = 4$.

Ex: $S^{-1}AS = \Lambda$ gives $(S^{-1}AS)^2 = \Lambda^2$, i.e. $S^{-1}A^2S = \Lambda^2$.
Powers: $A^k = S\Lambda^k S^{-1}$

$A^k = (S\Lambda S^{-1})(S\Lambda S^{-1})\cdots(S\Lambda S^{-1}) = S\Lambda^k S^{-1}$.

If $Ax = \lambda x$ then $x = \lambda A^{-1}x$ and $\frac{1}{\lambda}x = A^{-1}x$: the eigenvalues of $A^{-1}$ are $1/\lambda$. Example: rotation through $90°$.

Product of two matrices

Example: $A = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}$, $B = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$: $AB = \begin{pmatrix} 0 & 1 \\ 0 & 0 \end{pmatrix}$ while $BA = 0$, so the eigenvalues of a product are not simply products of the eigenvalues.

If $A$ and $B$ are diagonalizable: they share the same eigenvector matrix $S$ if and only if $AB = BA$.
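A quick sketch (my own example matrix) of the reciprocal-eigenvalue fact for $A^{-1}$:

```python
import numpy as np

# If Ax = lambda*x then A^{-1}x = (1/lambda)x, so the eigenvalues of
# A^{-1} are the reciprocals of those of A.
A = np.array([[4.0, 1.0],
              [2.0, 3.0]])          # eigenvalues 5 and 2

lams = np.linalg.eigvals(A)
lams_inv = np.linalg.eigvals(np.linalg.inv(A))

reciprocal_ok = np.allclose(np.sort(lams_inv), np.sort(1.0 / lams))
```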
Fibonacci Sequence and Difference Equations

Fibonacci sequence: 0, 1, 1, 2, 3, 5, 8, 13, ... with $F_{k+2} = F_{k+1} + F_k$.

This is a form of difference equation. Numbers in the sequence turn up in fantastic natural patterns (e.g. sunflower seeds).

$F_{k+2} = F_{k+1} + F_k$ becomes $u_{k+1} = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix} u_k$ with $u_k = \begin{pmatrix} F_{k+1} \\ F_k \end{pmatrix}$.

The second equation ($F_{k+1} = F_{k+1}$) is trivial and is a standard trick for second-order difference equations. For higher-order equations, say order $s$, we need $s-1$ trivial equations.

$u_k = A^k u_0$.

Power of Matrix and Difference Equations

If $A = S\Lambda S^{-1}$ then

$u_k = A^k u_0 = (S\Lambda S^{-1})(S\Lambda S^{-1})\cdots(S\Lambda S^{-1})u_0 = S\Lambda^k S^{-1}u_0.$

By setting $S^{-1}u_0 = c$, i.e. $Sc = u_0$ (expressing $u_0$ as a linear combination of eigenvectors), the solution becomes

$u_k = S\Lambda^k c = [x_1 \cdots x_n]\operatorname{diag}(\lambda_1^k, \ldots, \lambda_n^k)\begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} = c_1\lambda_1^k x_1 + \cdots + c_n\lambda_n^k x_n.$

The $c$'s are determined by the initial conditions: $u_0 = [x_1 \cdots x_n]\begin{pmatrix} c_1 \\ \vdots \\ c_n \end{pmatrix} = Sc$ and $c = S^{-1}u_0$.
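The eigenvector-expansion solution above can be sketched in code, using the Fibonacci matrix from the notes (variable names are mine):

```python
import numpy as np

# Solve u_{k+1} = A u_k by expanding u_0 in eigenvectors:
# u_k = c1*lam1^k*x1 + c2*lam2^k*x2, with c = S^{-1} u0.
A = np.array([[1.0, 1.0],
              [1.0, 0.0]])
u0 = np.array([1.0, 0.0])            # u_0 = (F_1, F_0)

lams, S = np.linalg.eig(A)
c = np.linalg.solve(S, u0)           # c = S^{-1} u0, i.e. u0 = S c

k = 12
u_k_modes = S @ (lams**k * c)        # c1*lam1^k*x1 + c2*lam2^k*x2
u_k_iter = np.linalg.matrix_power(A, k) @ u0
ok = np.allclose(u_k_modes, u_k_iter)
```

The second component of $u_{12}$ is $F_{12} = 144$.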
More on Fibonacci and Difference Equations

$A = \begin{pmatrix} 1 & 1 \\ 1 & 0 \end{pmatrix}$, $\det(A - \lambda I) = \lambda^2 - \lambda - 1$, $\lambda_1 = \dfrac{1 + \sqrt{5}}{2}$, $\lambda_2 = \dfrac{1 - \sqrt{5}}{2}$.

Eigenvectors $x_1 = \begin{pmatrix} \lambda_1 \\ 1 \end{pmatrix}$, $x_2 = \begin{pmatrix} \lambda_2 \\ 1 \end{pmatrix}$, and initial conditions $u_0 = \begin{pmatrix} 1 \\ 0 \end{pmatrix}$: $c = S^{-1}u_0 = \dfrac{1}{\lambda_1 - \lambda_2}\begin{pmatrix} 1 \\ -1 \end{pmatrix}$, so $F_k = \dfrac{\lambda_1^k - \lambda_2^k}{\lambda_1 - \lambda_2}$.

The growth is dominated by the largest eigenvalue $\lambda_1$: Fibonacci numbers and compounded interest become larger and larger ($|\lambda_1| > 1$), while a Markov process converges to constants ($\lambda_1 = 1$: neutrally stable).

Given $u_k = S\Lambda^k S^{-1}u_0 = c_1\lambda_1^k x_1 + \cdots + c_n\lambda_n^k x_n$, the difference equation $u_{k+1} = Au_k$ is neutrally stable if some $|\lambda_i| = 1$ and the other $|\lambda_i| < 1$.

Example: the stable matrix $A = \begin{pmatrix} 0 & 4 \\ 0 & 1/2 \end{pmatrix}$ has eigenvalues 0 and 1/2:

$u_0 = \begin{pmatrix} 0 \\ 1 \end{pmatrix},\ u_1 = \begin{pmatrix} 4 \\ 1/2 \end{pmatrix},\ u_2 = \begin{pmatrix} 2 \\ 1/4 \end{pmatrix},\ u_3 = \begin{pmatrix} 1 \\ 1/8 \end{pmatrix},\ u_4 = \begin{pmatrix} 1/2 \\ 1/16 \end{pmatrix},\ \ldots$

The first step is to split $u_0$ into two eigenvectors: $\begin{pmatrix} 0 \\ 1 \end{pmatrix} = \begin{pmatrix} 8 \\ 1 \end{pmatrix} - \begin{pmatrix} 8 \\ 0 \end{pmatrix}$. The vector $\begin{pmatrix} 8 \\ 0 \end{pmatrix}$ is multiplied by its eigenvalue 0 and is thus annihilated, so $u_k = (1/2)^k \begin{pmatrix} 8 \\ 1 \end{pmatrix}$ for $k \ge 1$.

Markov Processes and Difference Equations

Example: each year 1/10 of the people outside California move in, and 2/10 of the people inside California move out. What is the population of California after 5 years, 10 years, or 100 years? With $u = (y, z)$ ($y$: outside, $z$: inside), $u_{k+1} = \begin{pmatrix} 0.9 & 0.2 \\ 0.1 & 0.8 \end{pmatrix} u_k$.

Properties of a Markov matrix $A$ (nonnegative entries, each column adds up to 1):
(a) $\lambda_1 = 1$ is an eigenvalue (each column adds up to 1).
(b) Its eigenvector $x_1$ is nonnegative and is the steady state, since $Ax_1 = x_1$ (i.e. $A$ can no longer change $x_1$).
(c) The other eigenvalues satisfy $|\lambda_i| \le 1$.
(d) If some power of $A$ has all positive entries, then the other $|\lambda_i| < 1$ and $u_k \to c_1 x_1 = u_\infty$ (e.g. California's population approaches $\frac{1}{3}(y_0 + z_0)$).

Reason (a): each column of $A - 1\cdot I$ adds up to $1 - 1 = 0$, so the rows of $A - I$ add up to the zero row, and $A - I$ is singular.

(b): the steady state should maintain positive proportions.

(c)(d): otherwise $u_k = c_1\lambda_1^k x_1 + \cdots + c_n\lambda_n^k x_n$ would blow up. Each $\lambda_i^k$ (other than $\lambda = 1$) goes to zero when $k$ becomes very large, so $u_k \to c_1 x_1 = u_\infty$.
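A sketch of the California Markov chain in code (the state ordering — outside first, inside second — is my reading of the example):

```python
import numpy as np

# Markov matrix: nonnegative entries, each column sums to 1.
# State = (fraction outside California, fraction inside).
A = np.array([[0.9, 0.2],    # stay outside / move out
              [0.1, 0.8]])   # move in     / stay inside

u = np.array([1.0, 0.0])     # start with everyone outside
for _ in range(100):         # iterate u_{k+1} = A u_k
    u = A @ u

# The lambda = 1 eigenvector, normalized: 2/3 outside, 1/3 inside.
steady = np.array([2.0, 1.0]) / 3.0
ok = np.allclose(u, steady)
```

The other eigenvalue is 0.7, so $0.7^{100}$ is negligible and $u_{100}$ has converged to the steady state; the total population is conserved at each step.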
Differential Equations Example: Diffusion Model

$v$ and $w$ are concentrations. At each time $t$ the diffusion rate between two adjacent segments is the concentration difference:

$\dfrac{dv}{dt} = (w - v) + (0 - v),\qquad \dfrac{dw}{dt} = (v - w) + (0 - w).$

Then the system becomes $\dfrac{du}{dt} = Au$, $u = u_0$ at $t = 0$, with $u = \begin{pmatrix} v \\ w \end{pmatrix}$ and $A = \begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix}$.

Solution to Ordinary Differential Equations (ODE)

Solution? Look at $\dfrac{du}{dt} = au$, $u = u_0$ at $t = 0$, first: $u(t) = e^{at}u_0$. For the system,

$u(t) = e^{At}u_0 = Se^{\Lambda t}S^{-1}u_0.$

The real part of $\lambda$ governs stability; the imaginary part governs oscillations.

Difference equation: $u_k = A^k u_0$, depending on powers of $A$.
Differential equation: $u(t) = e^{At}u_0$, depending on the exponential of $A$.

Problem: what is the exponential of $A$, $e^{At}$?
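The eigenvector-expansion solution of the diffusion system can be sketched and cross-checked against a crude numerical integration (the forward-Euler check is my addition, not part of the notes):

```python
import numpy as np

# Solve du/dt = A u via u(t) = c1*e^(lam1*t)*x1 + c2*e^(lam2*t)*x2.
A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])
u0 = np.array([1.0, 0.0])
t_final = 1.0

lams, S = np.linalg.eig(A)
c = np.linalg.solve(S, u0)                 # u0 = S c
u_eigen = S @ (np.exp(lams * t_final) * c)

# Cross-check with forward-Euler integration of du/dt = A u.
u, dt = u0.copy(), 1e-5
for _ in range(int(t_final / dt)):
    u = u + dt * (A @ u)

ok = np.allclose(u_eigen, u, atol=1e-3)
```

With eigenvalues $-1$ and $-3$, the exact solution at $t = 1$ is $u(1) = \frac{1}{2}(e^{-1} + e^{-3},\ e^{-1} - e^{-3})$.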
Exponential of a Matrix

Imitate $e^x = 1 + x + \dfrac{x^2}{2!} + \dfrac{x^3}{3!} + \cdots$:

$e^{At} = I + At + \dfrac{(At)^2}{2!} + \dfrac{(At)^3}{3!} + \cdots$

$e^{At}$ has the following properties:
1. $e^{As}e^{At} = e^{A(s+t)}$;  2. $e^{At}e^{-At} = I$;  3. $\dfrac{d}{dt}e^{At} = Ae^{At}$.

Property 3 is how we solve the differential equations.

With $A = S\Lambda S^{-1}$:

$e^{At} = I + At + \dfrac{(At)^2}{2!} + \cdots = SS^{-1} + S\Lambda S^{-1}t + \dfrac{S\Lambda^2 S^{-1}t^2}{2!} + \dfrac{S\Lambda^3 S^{-1}t^3}{3!} + \cdots = S\left(I + \Lambda t + \dfrac{(\Lambda t)^2}{2!} + \dfrac{(\Lambda t)^3}{3!} + \cdots\right)S^{-1} = Se^{\Lambda t}S^{-1},$

where

$e^{\Lambda t} = I + \Lambda t + \dfrac{(\Lambda t)^2}{2!} + \cdots = \operatorname{diag}\left(1 + \lambda_i t + \dfrac{\lambda_i^2 t^2}{2!} + \dfrac{\lambda_i^3 t^3}{3!} + \cdots\right) = \operatorname{diag}(e^{\lambda_1 t}, \ldots, e^{\lambda_n t}).$

The matrix $e^{At}$ is never singular: its eigenvalues $e^{\lambda t}$ are never zero, and $\det e^{At} = e^{\lambda_1 t}e^{\lambda_2 t}\cdots e^{\lambda_n t} = e^{\operatorname{trace}(At)}$.

More on Exponential: $e^{At} = Se^{\Lambda t}S^{-1}$

Example: the exponential of $A = \begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix}$ (eigenvalues $-1$ and $-3$):

$e^{At} = Se^{\Lambda t}S^{-1} = \dfrac{1}{2}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}\begin{pmatrix} e^{-t} & 0 \\ 0 & e^{-3t} \end{pmatrix}\begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix} = \dfrac{1}{2}\begin{pmatrix} e^{-t} + e^{-3t} & e^{-t} - e^{-3t} \\ e^{-t} - e^{-3t} & e^{-t} + e^{-3t} \end{pmatrix}.$

If $x_1$ is the eigenvector of $A$ corresponding to eigenvalue $\lambda_1$, then $x_1$ is the eigenvector of $e^{At}$ corresponding to eigenvalue $e^{\lambda_1 t}$, i.e. $e^{At}x_1 = e^{\lambda_1 t}x_1$.

Proof:

$e^{At}x_1 = Ix_1 + Atx_1 + \dfrac{(At)^2 x_1}{2!} + \dfrac{(At)^3 x_1}{3!} + \cdots = Ix_1 + \lambda_1 t x_1 + \dfrac{\lambda_1^2 t^2 x_1}{2!} + \dfrac{\lambda_1^3 t^3 x_1}{3!} + \cdots = \left(1 + \lambda_1 t + \dfrac{\lambda_1^2 t^2}{2!} + \dfrac{\lambda_1^3 t^3}{3!} + \cdots\right)x_1 = e^{\lambda_1 t}x_1.$
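A sketch computing $e^{At}$ two ways — by truncating the power series and by $Se^{\Lambda t}S^{-1}$ — for the diffusion matrix (the truncation length is my choice):

```python
import numpy as np

A = np.array([[-2.0, 1.0],
              [1.0, -2.0]])
t = 0.5

# Route 1: truncated series I + At + (At)^2/2! + ...
E_series = np.zeros((2, 2))
term = np.eye(2)
for n in range(1, 30):
    E_series += term
    term = term @ (A * t) / n        # next term = previous * (At) / n

# Route 2: e^{At} = S e^{Lambda t} S^{-1}
lams, S = np.linalg.eig(A)
E_eigen = S @ np.diag(np.exp(lams * t)) @ np.linalg.inv(S)

ok = np.allclose(E_series, E_eigen)
```

The determinant identity also checks out: $\det e^{At} = e^{\operatorname{trace}(At)}$.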
Matrix Exponential and ODE Solution

Back to ODE for Diffusion Model
Stability of Differential Equation

$u(t) = Se^{\Lambda t}S^{-1}u_0 = c_1 e^{\lambda_1 t}x_1 + \cdots + c_n e^{\lambda_n t}x_n.$

Eigenvalues decide how $u(t)$ behaves as $t \to \infty$. Stability is governed by $e^{\lambda_i t}$, i.e. by the real part of $\lambda_i$. If $\lambda = a + ib$,

$e^{\lambda t} = e^{at}e^{ibt} = e^{at}(\cos bt + i\sin bt)\quad\text{and}\quad |e^{\lambda t}| = e^{at}.$

It decays for $a < 0$, becomes constant for $a = 0$, and explodes for $a > 0$.

The $du/dt = Au$ system is:
- stable when all $\operatorname{Re}\lambda_i < 0$;
- neutrally stable when all $\operatorname{Re}\lambda_i \le 0$ and some $\operatorname{Re}\lambda_i = 0$;
- unstable when some $\operatorname{Re}\lambda_i > 0$.

Stability for a 2 by 2 System

$\dfrac{du}{dt} = \begin{pmatrix} a & b \\ c & d \end{pmatrix}u,\qquad \det\begin{pmatrix} a - \lambda & b \\ c & d - \lambda \end{pmatrix} = \lambda^2 - (\operatorname{trace})\lambda + \det = 0,\qquad \lambda_{1,2} = \dfrac{\operatorname{trace} \pm \sqrt{\operatorname{trace}^2 - 4\det}}{2}.$

Stability test:
1. The trace $a + d$ must be negative.
2. The determinant $ad - bc$ must be positive.

When the eigenvalues are real, the two tests guarantee them to be negative. When the eigenvalues are a complex pair $x \pm iy$, the trace is $2x$ and the determinant is $x^2 + y^2 > 0$, so the test reduces to $x < 0$.

If $b = c$ then $\operatorname{trace}^2 - 4\det = (a + d)^2 - 4(ad - b^2) = (a - d)^2 + 4b^2 \ge 0$: a symmetric matrix has real eigenvalues.
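The two-part stability test can be sketched as a small helper (the helper name and example matrices are mine; note the test is for strict stability, so a neutrally stable system fails it):

```python
import numpy as np

# du/dt = A u is (strictly) stable iff trace(A) < 0 and det(A) > 0.
def is_stable_2x2(A):
    return np.trace(A) < 0 and np.linalg.det(A) > 0

diffusion = np.array([[-2.0, 1.0],
                      [1.0, -2.0]])   # eigenvalues -1, -3: stable
rotation = np.array([[0.0, -1.0],
                     [1.0, 0.0]])     # eigenvalues +-i: neutrally stable

stable1 = is_stable_2x2(diffusion)    # trace -4 < 0, det 3 > 0
stable2 = is_stable_2x2(rotation)     # trace 0 fails the strict test
```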
Example of 2 by 2 System

Example 2: the diffusion equation $du/dt = \begin{pmatrix} -2 & 1 \\ 1 & -2 \end{pmatrix}u$ is stable since the eigenvalues are $-1$ and $-3$.

Example 3: diffusion model with the two ends closed off:

$\dfrac{du}{dt} = \begin{pmatrix} -1 & 1 \\ 1 & -1 \end{pmatrix}u\quad\text{or}\quad \dfrac{dv}{dt} = w - v,\quad \dfrac{dw}{dt} = v - w.$

This is a continuous Markov process.
- Markov matrix: each column adds up to $1 = \lambda_{\max}$.
- Continuous Markov matrix: each column adds up to $0 = \lambda_{\max}$.
- $A$ is a Markov matrix if and only if $B = A - I$ is a continuous Markov matrix.

The steady state ($v = w$) is the eigenvector corresponding to $\lambda = \lambda_{\max} = 0$.

Skew-Symmetric Matrices

Example 1: $du/dt = \begin{pmatrix} 0 & -1 \\ 1 & 0 \end{pmatrix}u = Au$, with $u = u_0$ at $t = 0$.

trace $= 0$; $\det = 1$; $\lambda^2 + 1 = 0$, so $\lambda = i$ and $-i$.

Eigenvectors: $(1, -i)$ and $(1, i)$, so $u(t) = c_1 e^{it}\begin{pmatrix} 1 \\ -i \end{pmatrix} + c_2 e^{-it}\begin{pmatrix} 1 \\ i \end{pmatrix}$.

Substituting $e^{it} = \cos t + i\sin t$:

$u(t) = \begin{pmatrix} (c_1 + c_2)\cos t + i(c_1 - c_2)\sin t \\ -i(c_1 - c_2)\cos t + (c_1 + c_2)\sin t \end{pmatrix}.$

At $t = 0$, $u_0 = (a, b)$: $Sc = u_0$ gives $c_1 + c_2 = a$ and $-i(c_1 - c_2) = b$, so

$u(t) = \begin{pmatrix} a\cos t - b\sin t \\ b\cos t + a\sin t \end{pmatrix} = \begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix}\begin{pmatrix} a \\ b \end{pmatrix}:$

$e^{At}$ sends $u_0$ around a circle.

Also, since $A^2 = -I$:

$e^{At} = I + At + \dfrac{(At)^2}{2!} + \cdots = \begin{pmatrix} 1 - \dfrac{t^2}{2} + \cdots & -t + \dfrac{t^3}{6} - \cdots \\ t - \dfrac{t^3}{6} + \cdots & 1 - \dfrac{t^2}{2} + \cdots \end{pmatrix} = \begin{pmatrix} \cos t & -\sin t \\ \sin t & \cos t \end{pmatrix}.$

$e^{At}$ is an orthogonal matrix! $A^T = -A$ and $(e^{At})^T e^{At} = e^{A^T t}e^{At} = e^{-At}e^{At} = I$, so $e^{At}$ is orthogonal.

$\|e^{At}u_0\| = \|u_0\|$: conservative systems, no energy is lost.
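The rotation/orthogonality claims above can be sketched numerically (the truncation length and test vector are my choices):

```python
import numpy as np

# Skew-symmetric A: e^{At} should be the rotation [[cos t, -sin t],
# [sin t, cos t]] -- an orthogonal matrix preserving lengths.
A = np.array([[0.0, -1.0],
              [1.0, 0.0]])
t = 0.7

# Truncated power series for e^{At}.
E = np.zeros((2, 2))
term = np.eye(2)
for n in range(1, 25):
    E += term
    term = term @ (A * t) / n

rotation = np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t), np.cos(t)]])
matches_rotation = np.allclose(E, rotation)
is_orthogonal = np.allclose(E.T @ E, np.eye(2))

u0 = np.array([3.0, 4.0])
length_preserved = np.isclose(np.linalg.norm(E @ u0), np.linalg.norm(u0))
```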