Instead of doing the calculations by hand, I will use Python libraries and later give you some examples of using SVD in data science applications. In this article, bold-face lower-case letters (like a) refer to vectors, bold-face capital letters (like A) refer to matrices, and italic lower-case letters (like a) refer to scalars. To understand SVD, we first need to understand the eigenvalue decomposition of a matrix. We can think of a matrix A as a transformation that acts on a vector x by multiplication to produce a new vector Ax.
Here the rotation matrix is calculated for θ=30°, and the stretching matrix uses k=3; y is the transformed vector of x. To plot the vectors, the quiver() function in Matplotlib is used. Figure 1 shows the output of the code.
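The original listing is not reproduced in this excerpt, so here is a minimal sketch consistent with the description above (the particular vector x and the axis limits are assumed for illustration):

```python
import numpy as np
import matplotlib.pyplot as plt

theta = np.pi / 6          # 30 degrees in radians
k = 3                      # stretching factor

# Rotation matrix for angle theta
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

# Stretching matrix that scales the x-component by k
S = np.array([[k, 0],
              [0, 1]])

A = S @ R                  # combined transformation
x = np.array([1, 1])       # original vector (assumed example)
y = A @ x                  # transformed vector

# Draw both vectors from the origin with quiver()
plt.quiver(0, 0, x[0], x[1], angles='xy', scale_units='xy', scale=1, color='b')
plt.quiver(0, 0, y[0], y[1], angles='xy', scale_units='xy', scale=1, color='r')
plt.xlim(-1, 4)
plt.ylim(-1, 4)
plt.gca().set_aspect('equal')
plt.show()
```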
Matrices are represented by 2-d arrays in NumPy. We can use the np.matmul(a,b) function to multiply matrix a by b; however, it is easier to use the @ operator to do that. Vectors can be represented either by a 1-d array or by a 2-d array with a shape of (1,n), which is a row vector, or (n,1), which is a column vector.
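A small illustration of these conventions (the matrix and vector values are assumed example data):

```python
import numpy as np

a = np.array([[1, 2],
              [3, 4]])          # a matrix as a 2-d array
b = np.array([[0, 1],
              [1, 0]])

c1 = np.matmul(a, b)            # explicit function call
c2 = a @ b                      # the @ operator gives the same result
print(np.array_equal(c1, c2))   # True

v = np.array([5, 6])            # vector as a 1-d array, shape (2,)
row = v.reshape(1, 2)           # row vector, shape (1, 2)
col = v.reshape(2, 1)           # column vector, shape (2, 1)
print(a @ v)                    # result has shape (2,)
print(a @ col)                  # result has shape (2, 1)
```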
These are the output images of different sizes produced by our compression code.
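The compression routine itself is not shown in this excerpt; a minimal sketch of rank-k reconstruction with np.linalg.svd (the grayscale array img and the rank k below are assumed placeholders, not values from the article) could look like this:

```python
import numpy as np

def compress(img, k):
    """Return the rank-k approximation of a 2-d grayscale image array."""
    U, s, Vt = np.linalg.svd(img, full_matrices=False)
    # Keep only the k largest singular values and their singular vectors
    return U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Example usage with a random "image" as a stand-in
img = np.random.rand(256, 256)
approx = compress(img, 20)
print(approx.shape)   # (256, 256)
```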