Eigenvector and Eigenvalue

A very good linear algebra textbook is Gilbert Strang's "Introduction to Linear Algebra" (Wellesley-Cambridge Press), which accompanies his MIT lectures.

A typical matrix equation Ax = b (a set of linear equations) can be solved by elimination.
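
As a quick illustration (a minimal sketch assuming NumPy; the matrix and right-hand side are made up), np.linalg.solve performs exactly this kind of elimination via an LU factorization:

    import numpy as np

    # Hypothetical system: 2x + y = 5 and x + 3y = 10
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([5.0, 10.0])

    # np.linalg.solve factors A (LU with pivoting) and back-substitutes,
    # i.e. elimination
    x = np.linalg.solve(A, b)
    print(x)  # [1. 3.]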

However, when it comes to a series of changes over time, the problem is completely different.

Assume a vector u changes over time along the same direction. This is represented by the equation:

    u(t+1) = Au(t)

A is some matrix, and u(t) is the vector at time t. Applying A advances the vector one step, from u(t) to u(t+1).

This equation can no longer be solved by elimination, because it must guarantee that u(t+1) and u(t) point in the same direction, which is not a constraint in Ax = b.
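
A small sketch of this recurrence (NumPy again; the matrix A and starting vector are illustrative). Repeatedly applying A generally changes a vector's direction, and the direction only settles along a special direction of A:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])
    u = np.array([1.0, 0.0])

    # Iterate u(t+1) = A u(t) and print the direction (unit vector) of u
    for t in range(5):
        u = A @ u
        print(t + 1, u / np.linalg.norm(u))
    # The direction settles toward [0.707, 0.707],
    # a direction that this A preserves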

This brings in the concept of an eigenvector.

Given a matrix A, find a vector x such that 

    Ax has the same direction as x

Then this vector x is called an eigenvector of A.

Because Ax has the same direction as x, it equals some scalar lambda times x:

    Ax = lambda x

The lambda here is called the eigenvalue.

A matrix can have multiple eigenvectors, each with a corresponding eigenvalue.
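
In NumPy the standard routine for this is np.linalg.eig, which returns all eigenvalues together with the eigenvectors as the columns of a matrix (same illustrative A as above):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    lams, V = np.linalg.eig(A)   # eigenvectors are the columns of V
    for lam, x in zip(lams, V.T):
        # Verify the defining property Ax = lambda x for each pair
        assert np.allclose(A @ x, lam * x)
    print(lams)  # e.g. [3. 1.]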

Manipulating the equation a bit, we have:

    Ax - lambda x = 0

    Ax - lambda I x = 0

    (A - lambda I) x = 0

Since we want a nonzero eigenvector x, the matrix (A - lambda I) must be singular, which means the determinant of (A - lambda I) equals 0. Solving this characteristic equation gives us all the eigenvalues.
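
For a 2x2 matrix the determinant expands to lambda^2 - trace(A) lambda + det(A), a quadratic whose roots are the eigenvalues. A sketch finding them numerically with np.roots (same illustrative A):

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    # 2x2 case: det(A - lambda I) = lambda^2 - trace(A)*lambda + det(A)
    coeffs = [1.0, -np.trace(A), np.linalg.det(A)]
    print(np.roots(coeffs))  # [3. 1.]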

Then substitute each eigenvalue back into (A - lambda I)x = 0 and solve for the corresponding eigenvector x.
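
One way to do that step numerically (a sketch, not the routine np.linalg.eig itself uses): for each eigenvalue lambda, (A - lambda I) is singular, and a basis vector of its null space, read off from the SVD, is the eigenvector:

    import numpy as np

    A = np.array([[2.0, 1.0],
                  [1.0, 2.0]])

    for lam in (3.0, 1.0):  # eigenvalues found above
        M = A - lam * np.eye(2)
        # The last right singular vector of M (singular value ~ 0)
        # spans the null space of M, i.e. solves (A - lambda I)x = 0
        _, s, Vt = np.linalg.svd(M)
        x = Vt[-1]
        print(lam, x, np.allclose(A @ x, lam * x))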