Linear Algebra: Eigenvalues, Eigenvectors, Cayley-Hamilton Theorem
Here's a quick outline of topics to be covered in this tutorial:
Rank: If all the minors of a matrix A of order r+1 are zero, and at least one minor of order r is non-zero, then A is said to be of rank 'r'.
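This definition can be illustrated by brute force: search for the largest r for which some r×r minor has a non-zero determinant. The sketch below assumes NumPy is available; in practice one would use `np.linalg.matrix_rank` or row reduction instead, but the brute-force version follows the definition literally.

```python
import numpy as np
from itertools import combinations

def rank_via_minors(A, tol=1e-10):
    """Largest r such that some r x r minor of A has non-zero determinant."""
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    for r in range(min(m, n), 0, -1):
        for rows in combinations(range(m), r):
            for cols in combinations(range(n), r):
                if abs(np.linalg.det(A[np.ix_(rows, cols)])) > tol:
                    return r
    return 0

A = np.array([[1, 2, 3],
              [2, 4, 6],    # twice the first row, so rank < 3
              [1, 0, 1]])
print(rank_via_minors(A))         # 2
print(np.linalg.matrix_rank(A))   # agrees
```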
Echelon form of a matrix: A matrix 'A' is said to be in echelon form if,
1. All the non-zero rows of 'A' appear above any zero rows
2. The number of zeros before the first non-zero element in a non-zero row is less than the number of such zeros in the next row
3. The first non-zero element in a non-zero row is '1'
Elementary transformations (or elementary operations) of a matrix, and what they result in: The following three operations applied to the rows (columns) of a matrix are called elementary row (column) transformations. We'll take a look at what happens after making these transformations.
1. Interchanging any two rows (columns)
2. Multiplying all elements of a row (column) of a matrix by a non-zero scalar
3. Adding to the elements of a row (column) the corresponding elements of any other row (column) multiplied by any scalar k
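The three operations above are exactly what Gaussian elimination uses to bring a matrix to echelon form; counting the non-zero rows of the result then gives the rank. A minimal sketch, assuming NumPy (the example matrix is made up for illustration):

```python
import numpy as np

def row_echelon(A, tol=1e-12):
    """Reduce A to echelon form using the three elementary row operations."""
    A = np.asarray(A, dtype=float).copy()
    m, n = A.shape
    pivot_row = 0
    for col in range(n):
        # operation 1: find a usable row and swap it into pivot position
        pivot = next((r for r in range(pivot_row, m) if abs(A[r, col]) > tol), None)
        if pivot is None:
            continue
        A[[pivot_row, pivot]] = A[[pivot, pivot_row]]
        # operation 2: scale the pivot row so its leading entry is 1
        A[pivot_row] /= A[pivot_row, col]
        # operation 3: add multiples of the pivot row to zero out entries below
        for r in range(pivot_row + 1, m):
            A[r] -= A[r, col] * A[pivot_row]
        pivot_row += 1
    return A

A = np.array([[2, 4, 6],
              [1, 2, 3],
              [1, 0, 1]])
E = row_echelon(A)
rank = int(sum(np.any(np.abs(E) > 1e-9, axis=1)))  # non-zero rows = rank
print(E)       # echelon form with leading 1s
print(rank)    # 2
```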
A major focus of this tutorial will be to introduce eigenvalues and eigenvectors:
- How to compute the eigenvalues and eigenvectors of a matrix
- Finding the characteristic equation of a matrix
- The next tutorial and problem set will introduce you to problems related to these.
We'll also introduce important theorems such as:
- The rank of a matrix 'A' is the number of non-zero rows in its echelon form
- The product of the eigenvalues of a square matrix A is |A|, the determinant of A
- Cayley-Hamilton Theorem: every square matrix of order n satisfies its own characteristic equation
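These theorems can be checked numerically on a small matrix. The sketch below, assuming NumPy, verifies for a 2×2 example that the product of the eigenvalues equals the determinant and that the matrix satisfies its own characteristic equation:

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Eigenvalues: roots of the characteristic equation |A - lambda*I| = 0
eigvals, eigvecs = np.linalg.eig(A)
print(sorted(eigvals))                # approximately [1.0, 3.0]

# Product of the eigenvalues equals det(A): 1 * 3 = 3
print(np.prod(eigvals), np.linalg.det(A))

# Characteristic polynomial coefficients: lambda^2 - 4*lambda + 3
coeffs = np.poly(A)
print(coeffs)                         # approximately [1., -4., 3.]

# Cayley-Hamilton: substitute A into its own characteristic equation,
# i.e. A^2 - 4A + 3I should be the zero matrix
I = np.eye(2)
residual = A @ A - 4 * A + 3 * I
print(residual)
```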
... and so on
Complete Tutorial :
Understanding Eigenvalues and Eigenvectors more intuitively:
Take a bunch of data points and find the ellipsoid that best fits them. The first eigenvector lies along the semi-major axis, and its eigenvalue gives the magnitude; similarly, the second eigenvector lies along the semi-minor axis. This extends naturally to n dimensions. The discussion below is adapted from an answer I gave on Quora.
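This ellipsoid intuition can be sketched numerically: generate points stretched along a chosen direction, and the eigenvector of their covariance matrix with the largest eigenvalue recovers that direction (the semi-major axis). A minimal sketch assuming NumPy; the data and the 45° direction are made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Sample 2-D points: wide spread along x, narrow along y, then rotate by 45 degrees
base = rng.standard_normal((500, 2)) * np.array([3.0, 0.5])
theta = np.pi / 4
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
points = base @ R.T

# Eigen-decomposition of the covariance matrix of the data
cov = np.cov(points, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvalues in ascending order

major_axis = eigvecs[:, -1]              # eigenvector of the largest eigenvalue
print(major_axis)                        # roughly +/-(0.707, 0.707), the 45-degree direction
```

This is exactly the idea behind principal component analysis: the eigenvectors of the covariance matrix are the axes of the best-fitting ellipsoid, and the eigenvalues are the variances along them.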
The real-world application of these concepts is quite fascinating, in the domain of social, economic or Internet-based networks and connected agents in a system. "Networks" here doesn't necessarily mean computer or communication networks; it simply refers to the way in which people are linked.
Think of a "network" as an NxN matrix, which has information about how N people are connected to each other.
The adjacency matrix is an NxN matrix; let's say it looks something like the one below. People who aren't connected to each other have A[i][j] = 0, people with weak relationships have A[i][j] = 0.1, people with medium ties have A[i][j] = 0.4, and people with strong links have A[i][j] = 0.6.
      1   2   3   4
1--- 0.0 0.1 0.4 0.6
2--- 0.1 0.0 0.4 0.0
3--- 0.4 0.4 0.0 0.1
4--- 0.6 0.0 0.1 0.0
If we look at the matrix above, 1 and 2 are weakly connected, 1 and 3 have medium ties, and 1 and 4 have strong ties. This is just a quick example to give you an idea; these matrices need not be symmetric in general. People with higher "eigenvector centrality" are people who are better connected. This takes into account not just how many people a person knows, but also whom the person knows: being connected to a few well-connected people may fetch more centrality than being connected to many poorly-connected people. This is a recursive notion, similar to the PageRank measure used by search engines to prioritize websites and pages. See also: Social and Economic Networks (page on MIT).
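For the small adjacency matrix above, eigenvector centrality can be computed directly: it is the eigenvector belonging to the largest eigenvalue (which the Perron-Frobenius theorem guarantees can be taken with non-negative entries for a non-negative matrix). A minimal sketch assuming NumPy:

```python
import numpy as np

# The 4x4 weighted adjacency matrix from the example above
A = np.array([[0.0, 0.1, 0.4, 0.6],
              [0.1, 0.0, 0.4, 0.0],
              [0.4, 0.4, 0.0, 0.1],
              [0.6, 0.0, 0.1, 0.0]])

# Eigenvector centrality: the eigenvector of the largest eigenvalue.
# This example matrix is symmetric, so eigh is applicable.
eigvals, eigvecs = np.linalg.eigh(A)
v = eigvecs[:, np.argmax(eigvals)]
centrality = np.abs(v) / np.abs(v).sum()   # normalize to sum to 1

for person, c in enumerate(centrality, start=1):
    print(person, round(float(c), 3))
```

Person 1, who has both the strongest and the most ties, comes out with the highest centrality, matching the intuition in the paragraph above.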
The above is a case study, based on an experiment in Karnataka, of how micro-finance diffuses. Micro-finance involves small amounts of capital, is used to help people set up small businesses, and can potentially pull many people out of poverty. It was found that when micro-finance is introduced to people with higher eigenvector centrality, it spreads much more quickly. In the rural setting where this experiment was conducted, eigenvector centrality would be higher for someone like the priest of a temple, a community head, or the person who leads the local governance body. Similarly, if a product or service needs to be advertised, introducing it to "better connected" people with higher eigenvector centrality is likely to yield better and faster adoption.
You might like to take a look at some of our other Linear Algebra tutorials :
| Introduction to Matrices - Part I Introduction to Matrices. Theory, definitions. What a matrix is, order of a matrix, equality of matrices, different kinds of matrices: row matrix, column matrix, square matrix, diagonal, identity and triangular matrices. Definitions of Trace, Minor, Cofactors, Adjoint, Inverse, Transpose of a matrix. Addition, subtraction, scalar multiplication, multiplication of matrices. Defining special types of matrices like Symmetric, Skew-Symmetric, Idempotent, Involutory, Nilpotent, Singular, Non-Singular, Unitary matrices. || Introduction to Matrices - Part II Problems and solved examples based on the sub-topics mentioned above. Some of the problems in this part demonstrate finding the rank, inverse or characteristic equations of matrices. Representing real-life problems in matrix form. || Determinants Introduction to determinants. Second and third order determinants, minors and co-factors. Properties of determinants and how they remain altered or unaltered under simple transformations of matrices. Expanding the determinant. Solved problems related to determinants. || Simultaneous linear equations in multiple variables Representing a system of linear equations in multiple variables in matrix form. Using determinants to solve these systems of equations. Meaning of consistent, homogeneous and non-homogeneous systems of equations. Theorems relating to consistency of systems of equations. Application of Cramer's rule. Solved problems demonstrating how to solve linear equations using matrix and determinant related methods. |
|Basic concepts in Linear Algebra and Vector spaces - Theory and definitions. Closure, commutative, associative, distributive laws. Defining vector space, subspaces, linear dependence, dimension and basis. A few introductory problems proving certain sets to be vector spaces. || Introductory problems related to Vector Spaces - Problems demonstrating the concepts introduced in the previous tutorial. Checking or proving something to be a sub-space, demonstrating that something is not a sub-space of something else, verifying linear independence; problems relating to dimension and basis; inverting matrices and echelon matrices. || More concepts related to Vector Spaces - Defining and explaining the norm of a vector, inner product, Gram-Schmidt process, co-ordinate vectors, linear transformation and its kernel. Introductory problems related to these. || Problems related to linear transformation, linear maps and operators - Solved examples and problems related to linear transformation, linear maps and operators and other concepts discussed theoretically in the previous tutorial. |
|Definitions of Rank, Eigenvalues, Eigenvectors, Cayley-Hamilton Theorem |
Eigenvalues, eigenvectors, Cayley-Hamilton Theorem
|More Problems related to Simultaneous Equations; problems related to eigenvalues and eigenvectors - Demonstrating Cramer's rule, using eigenvalue methods to solve vector space problems, verifying the Cayley-Hamilton Theorem, advanced problems related to systems of equations. Solving a system of differential equations. || A few closing problems in Linear Algebra - Solving a recurrence relation, some more systems of equations. ||