SVD

Singular Value Decomposition

The singular value decomposition of an m × n real or complex matrix M is a factorization of the form M = UΣV*, where
U is an m × m real or complex unitary matrix (multiplying it by its conjugate transpose yields the identity matrix),
Σ is an m × n rectangular diagonal matrix with non-negative real numbers on the diagonal,
V* (the conjugate transpose of V, or simply the transpose of V if V is real) is an n × n real or complex unitary matrix.

The diagonal entries Σi,i of Σ are known as the singular values of M. The m columns of U and the n columns of V are called the left-singular vectors and right-singular vectors of M, respectively.
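As a quick numerical illustration, here is a minimal NumPy sketch (the 4 × 3 matrix M is an arbitrary example; np.linalg.svd returns the conjugate transpose V* directly as Vh):

    import numpy as np

    # An arbitrary 4 x 3 example matrix.
    M = np.array([[1.0, 0.0, 2.0],
                  [0.0, 3.0, 0.0],
                  [4.0, 0.0, 0.0],
                  [0.0, 0.0, 5.0]])

    # full_matrices=True gives U as m x m and Vh (= V*) as n x n;
    # s holds the singular values in descending order.
    U, s, Vh = np.linalg.svd(M, full_matrices=True)

    # Embed the singular values into the m x n rectangular diagonal Sigma.
    Sigma = np.zeros(M.shape)
    Sigma[:len(s), :len(s)] = np.diag(s)

    # Reconstruction: M = U Sigma V*.
    assert np.allclose(M, U @ Sigma @ Vh)

    # U and V are unitary: multiplying by the conjugate transpose gives I.
    assert np.allclose(U @ U.conj().T, np.eye(M.shape[0]))
    assert np.allclose(Vh.conj().T @ Vh, np.eye(M.shape[1]))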

The singular value decomposition and the eigendecomposition are closely related. Namely:

  • The left-singular vectors of M are eigenvectors of MM*.
  • The right-singular vectors of M are eigenvectors of M*M.
  • The non-zero singular values of M (found on the diagonal entries of Σ) are the square roots of the non-zero eigenvalues of both M*M and MM* (verified numerically in the sketch below).
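These three facts can be checked directly; a minimal sketch, again with an arbitrary example matrix:

    import numpy as np

    # An arbitrary 2 x 3 example matrix.
    M = np.array([[2.0, 0.0, 1.0],
                  [0.0, 1.0, 0.0]])

    U, s, Vh = np.linalg.svd(M)

    # Eigenvalues of M M* and M* M, sorted descending to line up with s.
    eig_MMh = np.sort(np.linalg.eigvalsh(M @ M.conj().T))[::-1]
    eig_MhM = np.sort(np.linalg.eigvalsh(M.conj().T @ M))[::-1]

    # Non-zero singular values are square roots of the non-zero
    # eigenvalues of both products (M* M has one extra zero eigenvalue).
    assert np.allclose(s**2, eig_MMh)
    assert np.allclose(s**2, eig_MhM[:len(s)])

    # Left-singular vectors are eigenvectors of M M*; right-singular
    # vectors (rows of Vh) are eigenvectors of M* M.
    for i in range(len(s)):
        assert np.allclose(M @ M.conj().T @ U[:, i], s[i]**2 * U[:, i])
        assert np.allclose(M.conj().T @ M @ Vh[i], s[i]**2 * Vh[i])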
When M is an m × m real square matrix with positive determinant, U, V, and Σ are real m × m matrices as well, Σ can be regarded as a scaling matrix, and U, V can be viewed as rotation matrices. Thus the expression UΣV* can be intuitively interpreted as a rotation, a scaling, and another rotation.


The image shows:
Upper left: the unit disc with the two canonical unit vectors.
Upper right: the unit disc and vectors transformed by M, with the singular values σ₁ and σ₂ indicated.
Lower left: the action of V* on the unit disc. This is just a rotation.
Lower right: the action of ΣV* on the unit disc. Σ scales vertically and horizontally.
In this special case the singular values are φ and 1/φ, where φ is the golden ratio. V* is a (counter-clockwise) rotation by an angle α, where α satisfies tan(α) = −φ. U is a rotation by β with tan(β) = φ − 1.
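The matrix pictured is presumably the standard 2 × 2 shear used for this illustration; assuming M = [[1, 1], [0, 1]] (an assumption, since the text does not state M explicitly), the singular values do come out as φ and 1/φ:

    import numpy as np

    # The shear matrix commonly used for this figure; an assumption here,
    # not stated in the text.
    M = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

    U, s, Vh = np.linalg.svd(M)

    phi = (1 + np.sqrt(5)) / 2  # golden ratio
    assert np.allclose(s, [phi, 1 / phi])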


The columns of U and V form orthonormal bases, consisting of the left- and right-singular vectors of M, respectively.

Applications

Pseudoinverse

M+ = VΣ+U*   where Σ+ is the pseudoinverse of Σ, which is formed by replacing every non-zero diagonal entry by its reciprocal and transposing the resulting matrix.
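A minimal sketch of this construction, checked against NumPy's built-in np.linalg.pinv (the 3 × 2 matrix is an arbitrary example):

    import numpy as np

    M = np.array([[1.0, 2.0],
                  [3.0, 4.0],
                  [5.0, 6.0]])

    U, s, Vh = np.linalg.svd(M, full_matrices=True)

    # Sigma+: reciprocals of the non-zero singular values, placed in the
    # transposed (n x m) diagonal slot; tiny values are treated as zero.
    tol = max(M.shape) * np.finfo(float).eps * s.max()
    s_inv = np.where(s > tol, 1.0 / s, 0.0)
    Sigma_plus = np.zeros((M.shape[1], M.shape[0]))
    Sigma_plus[:len(s), :len(s)] = np.diag(s_inv)

    # M+ = V Sigma+ U*.
    M_plus = Vh.conj().T @ Sigma_plus @ U.conj().T
    assert np.allclose(M_plus, np.linalg.pinv(M))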

Solving homogeneous linear equations

A non-zero vector x satisfying Ax = 0 can be characterized as a right-singular vector of A corresponding to a singular value that is zero.
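For example, with a deliberately rank-deficient A (its third row is the sum of the first two), the right-singular vector for the zero singular value solves Ax = 0:

    import numpy as np

    # Rank-deficient example: the third row is the sum of the first two.
    A = np.array([[1.0, 2.0, 3.0],
                  [4.0, 5.0, 6.0],
                  [5.0, 7.0, 9.0]])

    U, s, Vh = np.linalg.svd(A)

    # The smallest singular value is (numerically) zero ...
    assert s[-1] < 1e-10
    # ... and the corresponding right-singular vector solves Ax = 0.
    x = Vh[-1]          # rows of Vh are the right-singular vectors of A
    assert np.allclose(A @ x, np.zeros(3))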

Total least squares minimization

Total least squares asks for the vector x that minimizes the 2-norm of the vector Ax under the constraint ‖x‖ = 1. The solution turns out to be the right-singular vector of A corresponding to the smallest singular value.
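A small numerical check (A is a random stand-in): the minimizer is the last row of Vh, and its residual norm equals the smallest singular value:

    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((10, 3))   # arbitrary stand-in for A

    U, s, Vh = np.linalg.svd(A)
    x = Vh[-1]   # right-singular vector for the smallest singular value

    # Its residual norm equals the smallest singular value ...
    assert np.isclose(np.linalg.norm(A @ x), s[-1])
    # ... and no other unit vector (here: random probes) does better.
    for _ in range(100):
        y = rng.standard_normal(3)
        y /= np.linalg.norm(y)
        assert np.linalg.norm(A @ y) >= np.linalg.norm(A @ x) - 1e-12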

Low-rank matrix approximation

Some practical applications need to solve the problem of approximating a matrix M with another matrix M̃, said to be truncated, which has a specific rank r. By the Eckart–Young theorem, the best such approximation (in both the Frobenius and the spectral norm) is obtained by keeping the r largest singular values and setting the remaining ones to zero.
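A minimal sketch of this truncation (the example matrix is random):

    import numpy as np

    def truncated_svd(M, r):
        """Best rank-r approximation of M (Eckart-Young)."""
        U, s, Vh = np.linalg.svd(M, full_matrices=False)
        return U[:, :r] @ np.diag(s[:r]) @ Vh[:r]

    rng = np.random.default_rng(1)
    M = rng.standard_normal((6, 5))

    M2 = truncated_svd(M, 2)
    assert np.linalg.matrix_rank(M2) == 2

    # The Frobenius error equals the norm of the discarded singular values.
    s = np.linalg.svd(M, compute_uv=False)
    assert np.isclose(np.linalg.norm(M - M2, 'fro'), np.linalg.norm(s[2:]))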

Separable models

By separable, we mean that a matrix A can be written as an outer product of two vectors, A = u v*, or, in coordinates, A(i, j) = u(i) v(j). Specifically, the matrix M can be decomposed as a weighted, ordered sum of separable matrices: M = σ₁ u₁ v₁* + σ₂ u₂ v₂* + ⋯, where uᵢ and vᵢ are the i-th columns of U and V.
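A short check that the SVD expresses an arbitrary matrix as such a sum of separable (rank-one) terms:

    import numpy as np

    rng = np.random.default_rng(2)
    M = rng.standard_normal((4, 3))

    U, s, Vh = np.linalg.svd(M, full_matrices=False)

    # Weighted, ordered sum of rank-one separable terms sigma_i * u_i v_i*.
    M_sum = sum(s[i] * np.outer(U[:, i], Vh[i]) for i in range(len(s)))
    assert np.allclose(M, M_sum)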

Nearest orthogonal matrix

Given the SVD A = UΣV*, the orthogonal matrix closest to A in the Frobenius norm is R = UV*.
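A sketch of this, using random orthogonal matrices as a sanity check that none of them comes closer:

    import numpy as np

    rng = np.random.default_rng(3)
    A = rng.standard_normal((3, 3))

    U, s, Vh = np.linalg.svd(A)
    R = U @ Vh   # nearest orthogonal matrix to A

    assert np.allclose(R @ R.T, np.eye(3))

    # No random orthogonal probe comes closer to A than R does.
    for _ in range(100):
        Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
        assert np.linalg.norm(A - Q, 'fro') >= np.linalg.norm(A - R, 'fro') - 1e-12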

Relation to eigenvalue decomposition

The singular value decomposition is very general in the sense that it can be applied to any m × n matrix whereas eigenvalue decomposition can only be applied to certain classes of square matrices.



