# MIT Linear Algebra

## Invertible Matrix

A square matrix that is not invertible is called singular or degenerate. A square matrix is singular if and only if its determinant is 0. Singular matrices are rare in the sense that if you pick a random square matrix with entries drawn from a continuous uniform distribution, it will almost surely not be singular.

## Linear transformation

A matrix A multiplying a matrix B (usually a vector) represents a linear function f(B) = AB:

- f(X + Y) = f(X) + f(Y)
- f(cX) = c f(X), for a constant c

Geometrically, this means the origin stays fixed. For example, reflection, rotation, scaling, ... can each be represented by a linear transformation.

Affine transformation: a linear transformation followed by a translation, x -> Ax + b. An affine transformation can be represented as a single linear transformation in one higher dimension (homogeneous coordinates): augment x to (x, 1) and act with the block matrix [[A, b], [0, 1]], since [[A, b], [0, 1]] (x, 1) = (Ax + b, 1). A small sketch of this follows after this section.

## Eigenvector (of a matrix)

When the matrix acts on these vectors, it changes their magnitude but does not change their direction (except possibly reversing it).

Eigenvalue: the matrix acts on an eigenvector by multiplying its magnitude by a factor (it changes the magnitude and, if the factor is negative, reverses the direction). This factor is the eigenvalue associated with that eigenvector.

Eigenspace: the set of all eigenvectors that have the same eigenvalue, together with the zero vector.

If the matrix A is a linear transformation, a non-zero vector x is an eigenvector of A if there is a scalar λ such that

Ax = λx

The scalar λ is said to be an eigenvalue of A corresponding to the eigenvector x.

Benefits of knowing the eigenvectors (and eigenvalues) of a matrix: the effects of the action of the matrix on the system can be predicted.

If this abstract direction is unchanged by a given linear transformation, the prefix "eigen" is used.
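Here is the promised sketch of the affine-as-linear trick, in numpy; the particular matrix A, translation b, and point x are made-up example values, not anything from the lecture.

```python
import numpy as np

# Example 2-D affine map x -> Ax + b (A and b are arbitrary illustrative values).
A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # 90-degree rotation
b = np.array([2.0, 3.0])      # translation
x = np.array([1.0, 1.0])

# Direct affine computation.
affine = A @ x + b

# Same map as one linear transformation in a higher dimension:
# M = [[A, b], [0, 1]] acting on the homogeneous vector (x, 1).
M = np.block([[A, b.reshape(2, 1)],
              [np.zeros((1, 2)), np.ones((1, 1))]])
homogeneous = M @ np.append(x, 1.0)

print(affine)            # [1. 4.]
print(homogeneous[:2])   # [1. 4.]  -- first two components equal Ax + b
```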
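A quick numpy check of the definition Ax = λx; the matrix below is just an arbitrary symmetric example, not one from the course.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # arbitrary example matrix

# Columns of `vectors` are the eigenvectors x, paired with the eigenvalues λ.
values, vectors = np.linalg.eig(A)

for lam, x in zip(values, vectors.T):
    # The matrix acts on an eigenvector only by scaling it: Ax = λx (up to rounding).
    assert np.allclose(A @ x, lam * x)

print(values)   # eigenvalues of this particular A: 3 and 1 (order may vary)
```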
Only certain special vectors x are eigenvectors, and only certain special scalars λ are eigenvalues. If λ = 1, the vector remains unchanged (unaffected by the transformation); for the identity transformation, every non-zero vector is such an eigenvector. For a reflection, the eigenvalues are λ = 1 (vectors lying in the mirror are unchanged) and λ = -1 (vectors perpendicular to the mirror are reversed).
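A small numpy illustration of those two examples; the reflection chosen here (across the x-axis) is an assumed example matrix.

```python
import numpy as np

I = np.eye(2)                    # identity: every non-zero vector is an eigenvector, λ = 1
R = np.array([[1.0,  0.0],
              [0.0, -1.0]])      # reflection across the x-axis

print(np.linalg.eigvals(I))      # [1. 1.]
print(np.linalg.eigvals(R))      # [ 1. -1.]  -- in-mirror direction kept, perpendicular one reversed
```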