Welcome! This is the website for the Applied Linear Algebra course, semester 02-2021. Here you will find:
An update of the course content covered on a session-by-session basis.
Bibliography and reading materials.
Test dates and solutions.
Numerical Linear Algebra. Lloyd N. Trefethen and David Bau, III. Volume 50 in Other Titles in Applied Mathematics, SIAM, 1997.
Pattern Recognition and Machine Learning. Christopher M. Bishop. Information Science and Statistics, Springer 2006.
Linear Algebra. Friedberg, Insel and Spence. Pearson Education, 2018.
Discrete Probability Models and Methods. Pierre Brémaud. Springer, 2017.
February 5th, 2022. The second test will take place from 6:30 to 13:00 in classroom 43-307. THANKS TO ALL THE PARTICIPANTS FOR A GREAT JOB!!!
February 1st, 2022. Examples computing expectations: Poisson Distribution, Hypergeometric distribution. From Reference 5: Example 2.1.22, Theorem 2.1.24, Theorem 2.1.25, Theorem 2.1.26, Theorem 2.1.27 (incomplete).
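The Poisson expectation computed in this session can be checked numerically: E[X] = λ follows from summing k·P(X = k) over the support. A minimal pure-Python sketch (the function name `poisson_pmf` and the choice λ = 3.5 are my own, for illustration):

```python
import math

def poisson_pmf(k, lam):
    """P(X = k) for a Poisson(lam) random variable."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# E[X] = sum_k k * P(X = k); truncating the series, since the tail is negligible.
lam = 3.5
expectation = sum(k * poisson_pmf(k, lam) for k in range(100))
print(expectation)  # close to lam = 3.5
```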
January 27th, 2022. Theorem 2.1.2, Independent Random Variables. From Reference 5: Definition 2.1.3, Definition 2.1.4, Theorem 2.1.15, Definition 2.1.16, Example 2.1.17 (discussion on the uniform distribution argument on the spaces), Definition 2.1.18, Definition 2.1.19. Expectation, Definition 2.1.20, examples of the elementary Coin Tossing and Dice Throwing random variables, Example 2.1.21, extension to the general binomial distribution case.
January 25th, 2022. Conditional Independence, from Reference 5: Definition 1.3.14 and Example 1.3.15. Random Variables, from Reference 5: Definition 2.1.1, Example 2.1.3, Example 2.1.4, Theorem 2.1.5, Theorem 2.1.6, Definition 2.1.7, Example 2.1.8, Example 2.1.10, the Binomial Distribution, the Hypergeometric Distribution.
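The binomial and hypergeometric distributions from this session can be written down directly with `math.comb`. A small sketch (the function names and parameter values are my own, for illustration, not from the lecture):

```python
from math import comb

def binomial_pmf(k, n, p):
    """P(X = k) for X ~ Binomial(n, p)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def hypergeometric_pmf(k, N, K, n):
    """P(X = k): k successes in n draws without replacement
    from N items, of which K are successes."""
    return comb(K, k) * comb(N - K, n - k) / comb(N, n)

# Sanity checks: each pmf sums to 1 over its support,
# and the binomial mean is n * p.
total_b = sum(binomial_pmf(k, 10, 0.3) for k in range(11))
mean_b = sum(k * binomial_pmf(k, 10, 0.3) for k in range(11))
total_h = sum(hypergeometric_pmf(k, 20, 7, 5) for k in range(6))
print(total_b, mean_b, total_h)  # approximately 1.0, 3.0 (= n*p), 1.0
```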
January 20th, 2022. Conditional Probability, concept review with the dice example. From Reference 5: Definition 1.3.7, Theorem 1.3.8 (Bayes), Theorem 1.3.9, Example 1.3.10, Example 1.3.11 (note: Example 1.3.12 is INCORRECT), Theorem 1.3.13, Definition 1.3.14.
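Bayes' rule (Theorem 1.3.8) can be verified on a dice example with exact rational arithmetic. A sketch assuming two fair dice; the specific events A and B are my own choices:

```python
from fractions import Fraction
from itertools import product

# Sample space: all ordered pairs of two fair dice, each with probability 1/36.
omega = list(product(range(1, 7), repeat=2))

def prob(event):
    """Exact probability of an event under the uniform distribution on omega."""
    return Fraction(sum(1 for w in omega if event(w)), len(omega))

A = lambda w: w[0] + w[1] == 8      # event: the sum is 8
B = lambda w: w[0] == 3             # event: the first die shows 3

# Direct conditional probability vs. Bayes' rule P(A|B) = P(B|A) P(A) / P(B)
p_A_given_B = prob(lambda w: A(w) and B(w)) / prob(B)
p_B_given_A = prob(lambda w: A(w) and B(w)) / prob(A)
p_bayes = p_B_given_A * prob(A) / prob(B)
print(p_A_given_B, p_bayes)  # both 1/6
```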
January 18th, 2022. Suspended due to lack of quorum.
January 13th, 2022. From Reference 5, Section 1.3.1, Independent Events. Examples of independent events based on throwing dice. Definition 1.3.1, Definition 1.3.2, Theorem 1.3.3, Theorem 1.3.5, Theorem 1.3.6.
January 11th, 2022. From Reference 5, Random Variables and their Distributions. Examples, cumulative distribution function. Counting elements in a set using indicator functions. Motivations behind the definitions and counting methods.
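Counting elements of a set with indicator functions, as discussed in this session, reduces to summing the indicator over the sample space: |A| = Σ_ω 1_A(ω). A small illustrative sketch (the sets A and B are my own examples):

```python
# Counting with indicator functions: |A| = sum over the sample space of 1_A(w).
omega = range(1, 101)                       # finite sample space {1, ..., 100}

ind_A = lambda w: 1 if w % 2 == 0 else 0    # indicator of "w is even"
ind_B = lambda w: 1 if w % 3 == 0 else 0    # indicator of "w is divisible by 3"

count_A = sum(ind_A(w) for w in omega)      # |A| = 50
# Inclusion-exclusion via indicators: 1_{A ∪ B} = 1_A + 1_B - 1_A * 1_B
count_union = sum(ind_A(w) + ind_B(w) - ind_A(w) * ind_B(w) for w in omega)
print(count_A, count_union)  # 50 67
```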
December 16th, 2021. From Reference 5: Theorem 1.2.5, Theorem 1.2.6, Theorem 1.2.7, Definition 1.2.8, Theorem 1.2.12, Corollary 1.2.13, Theorem 1.2.16. Comments on analogies between measure properties and cardinalities of sets. Analogies between conditional probability and structuring discrete sets for counting.
December 14th, 2021. Measure Theory, basic definition and basic properties. Sigma Algebra of Sets. Limits of sets: inferior and superior. Interpretation of infinitely often and all but finite sets.
December 9th, 2021.
The Neumann series for invertibility of matrices. Concluding the presentation of linear systems conditioning.
Overview of Numerical Linear Algebra topics: Direct Methods, methods targeted at matrices with a given structure (e.g. Cholesky factorization), Iterative Methods. Eigenvalue problems. Overview of Numerical Analysis: Error Analysis, Interpolation, Numerical Integration, Finding Roots or Zeros, Numerical Solution of ODEs and PDEs, Finite Differences, Variational Methods: Finite Element Methods, Finite Volume Methods, Boundary Methods, Virtual Element Methods,...
Probability. Notion of Probability, Events and Sample Space.
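The Neumann series from this session states that if ||A|| < 1 in some induced norm, then I − A is invertible and (I − A)^{-1} = Σ_{k≥0} A^k. A minimal NumPy sketch; the random 4×4 test matrix, the scaling, and the truncation at 100 terms are my own choices:

```python
import numpy as np

# Neumann series: if ||A|| < 1, then I - A is invertible and
# (I - A)^{-1} = I + A + A^2 + ...
rng = np.random.default_rng(0)
A = 0.1 * rng.standard_normal((4, 4))   # scaled so that ||A||_2 < 1

partial = np.zeros((4, 4))
term = np.eye(4)
for _ in range(100):                    # partial sum of the first 100 terms
    partial += term
    term = term @ A

exact = np.linalg.inv(np.eye(4) - A)
print(np.linalg.norm(A, 2) < 1, np.linalg.norm(partial - exact))
```

The truncation error decays like ||A||^k, so 100 terms are far more than needed here.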
December 7th, 2021.
Discussion on the statement of Theorem 11.1 from Reference 1. The normal equations, comments on existence and uniqueness, comments on the pseudo-inverse. Solving the normal equations using: Cholesky factorization, QR factorization and SVD. Closing the Least Squares chapter.
Conditioning of a problem. Opening the discussion in general, example of eigenvalue problems conditioning. Conditioning number of a matrix. Estimating the relative conditioning of the solution in terms of the relative perturbation of the data.
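The three ways of solving the normal equations discussed in the December 7th session (Cholesky factorization, QR factorization and SVD) can be compared on a small random least-squares problem. A NumPy sketch; the problem sizes and random data are my own choices:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((8, 3))   # full-rank tall matrix
b = rng.standard_normal(8)

# 1) Normal equations via Cholesky: A^t A x = A^t b, with A^t A = L L^t
L = np.linalg.cholesky(A.T @ A)
y = np.linalg.solve(L, A.T @ b)
x_chol = np.linalg.solve(L.T, y)

# 2) QR: A = QR, then solve the triangular system R x = Q^t b
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# 3) SVD via the pseudo-inverse: x = A^+ b
x_svd = np.linalg.pinv(A) @ b

print(np.allclose(x_chol, x_qr), np.allclose(x_qr, x_svd))  # True True
```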
December 2nd, 2021. Closing the discussion on FLOPS and asymptotic notation. Discussion on growth conditions. Least Squares Problems. Review of the Projection operator's characterization as a distance minimizer. Setting up the normal equations. Discussion on existence and uniqueness of the solution. Example 11.1 from Reference 1 on polynomial interpolation. The Legendre polynomials and the proof that the interpolation polynomial is unique. Discussion on fitting a polynomial of lower degree to given points. Discussion on the existence and uniqueness of solutions of the normal equations.
November 30th, 2021. Online Class Here, Blackboard here. QR factorization: motivation and process. Construction based on nested subspaces V_1, V_2, ..., V_n. Identification of the orthogonal projectors P_j onto the nested subspaces V_j. The Gram-Schmidt process and algorithm; Algorithm 7.1 from Reference 1. Theorem 7.1 from Reference 1. Comments on uniqueness of the A = QR factorization, comments on the Gram-Schmidt algorithm's numerical instability. Comments on the generalization to infinite-dimensional spaces, the Legendre Polynomials and comments on orthogonal polynomials. Operation counting, definition of Floating Point Operations = FLOPS. Examples of operation counting. Comments on the relevance of operation counting for algorithm analysis.
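Classical Gram-Schmidt in the spirit of Algorithm 7.1 from Reference 1 can be sketched as follows. This is an illustrative implementation, and, as noted in the lecture, the classical variant is numerically unstable for ill-conditioned inputs:

```python
import numpy as np

def classical_gram_schmidt(A):
    """Reduced QR factorization A = QR by classical Gram-Schmidt.
    Illustrative only: numerically unstable for ill-conditioned A."""
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            R[i, j] = Q[:, i] @ A[:, j]   # projection coefficient onto q_i
            v -= R[i, j] * Q[:, i]        # subtract the component along q_i
        R[j, j] = np.linalg.norm(v)
        Q[:, j] = v / R[j, j]             # normalize to get the next q_j
    return Q, R

A = np.random.default_rng(2).standard_normal((5, 3))
Q, R = classical_gram_schmidt(A)
print(np.allclose(Q @ R, A), np.allclose(Q.T @ Q, np.eye(3)))  # True True
```

For well-conditioned matrices like this random example the factorization is accurate; the modified variant is preferred in practice.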
November 27th, 2021. First test (online).
November 25th, 2021. Review class prior to the test. Discussion of Problem 2 in Problem Set 2. Mending Problem 4 from Problem Set 2. Several comments.
November 23rd, 2021. Online Class Here, Blackboard here. Correction of the previous lecture's argument on the deduction of the reduced SVD for a projection matrix P = Q Q^t. Proof of the statement: a matrix A is full-rank if and only if the matrix B = A^t A is invertible (the argument uses the SVD). Computing the orthogonal projection matrix P onto the subspace col(A). End of Lecture 5 on the SVD. Lecture 6: initial idea and motivation of the A = QR factorization, reduced and extended.
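The orthogonal projector onto col(A) from this session can be computed in two equivalent ways: via the formula P = A (A^t A)^{-1} A^t (valid when A is full-rank, so A^t A is invertible) or as P = Q Q^t from a reduced QR factorization. A NumPy sketch with a random test matrix of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((6, 2))           # full-rank, so A^t A is invertible

# Orthogonal projector onto col(A), two equivalent forms:
P1 = A @ np.linalg.inv(A.T @ A) @ A.T     # via the normal-equations formula
Q, _ = np.linalg.qr(A)                    # reduced QR: col(Q) = col(A)
P2 = Q @ Q.T

# Same projector, idempotent, and symmetric (i.e. orthogonal):
print(np.allclose(P1, P2), np.allclose(P1 @ P1, P1), np.allclose(P1, P1.T))
```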
November 18th, 2021. Projectors: definition, complementary projector. Properties of the image and the null space of a projector. Subspaces in direct sum, comments and examples and its relation with projectors. Orthogonal projectors, equivalence between orthogonal and selfadjoint projectors. The matrix for the orthogonal projection on a one dimensional subspace and extrapolation to orthogonal projections onto subspaces with multiple dimensions. Problem set 2, available here.
November 16th, 2021. Online Class Here, Blackboard here. Quick review of theorems 5.7, 5.8 and statement of theorem 5.9 without proof. Resolution of Homework together with some highlights on efficient/strategic proofs vs. brute force proofs. Presentation of some python commands for the fundamental spaces as well as SVD. Python code available here.
November 11th, 2021. Theorem 5.3 (this time an efficient proof), Theorem 5.4, Theorem 5.5, Theorem 5.6, Theorem 5.7 and Theorem 5.8 (with intermediate lemma) from Reference 1.
November 9th, 2021. Details on the GEOMETRIC and ALGEBRAIC MEANING of the matrix structure X^t A Y. Discussion on the Hyperellipse concept and its variations in terms of dimensions. Details on the Reduced SVD and the Full SVD. Comparison between SVD and Diagonalization: bases involved, orthogonality, existence, dimension requirements. Theorem 5.1, Theorem 5.2 and Theorem 5.3 (incomplete) from Reference 1.
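The Reduced and Full SVD compared in this session differ only in the shape of the orthogonal factors; both reconstruct A exactly. A NumPy sketch (the 5×3 random matrix is my own choice):

```python
import numpy as np

A = np.random.default_rng(4).standard_normal((5, 3))

# Full SVD: U is 5x5, S holds the 3 singular values, Vt is 3x3.
U, S, Vt = np.linalg.svd(A, full_matrices=True)
# Reduced SVD: U is trimmed to 5x3, matching the number of singular values.
Uh, Sh, Vth = np.linalg.svd(A, full_matrices=False)

A_full = U[:, :3] @ np.diag(S) @ Vt     # only the first 3 columns of U matter
A_reduced = Uh @ np.diag(Sh) @ Vth
print(np.allclose(A, A_full), np.allclose(A, A_reduced))  # True True
```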
November 4th, 2021. Bounding the norm of a matrix product (operator composition), both with vector-induced norms (p-norms) and with the Frobenius norm. Theorem 3.1 from Reference 1, comments on the flexibility of Q in Theorem 3.1, its interpretation and wide extent. Review of eigenvalues, eigenvectors and eigenspaces. Motivations behind the Singular Value Decomposition (SVD), Theorem 4.1 from Reference 1. Comments on the conceptual difference between eigenvectors and singular vectors, as well as between eigenvalues and singular values. Comments on the spectral theorem and its resemblance to the SVD theorem.
November 2nd, 2021. Online Class here, Blackboard here. Matrix norms induced by vector norms (operator norms), proof that the definition actually yields a norm. Computation of the operator norm for several examples: the 2-norm of an outer-product matrix A = u v^t, the 1-norm of a matrix as the maximum of the 1-norms of its columns, the infinity-norm of a matrix as the maximum of the 1-norms of its rows. Frobenius (or Hilbert-Schmidt) norm: definition, several equivalences. Proof of the trace form for the Frobenius norm.
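The column-sum and row-sum characterizations of the 1-norm and infinity-norm, and the trace form of the Frobenius norm, can all be checked against `numpy.linalg.norm`. A sketch with a small example matrix of my own choosing:

```python
import numpy as np

A = np.array([[1.0, -2.0],
              [3.0,  4.0]])

one_norm = max(np.abs(A).sum(axis=0))   # max absolute column sum -> 6
inf_norm = max(np.abs(A).sum(axis=1))   # max absolute row sum    -> 7
frob = np.sqrt(np.trace(A.T @ A))       # trace form: sqrt(tr(A^t A))

# All three agree with NumPy's built-in norms:
print(np.isclose(one_norm, np.linalg.norm(A, 1)),
      np.isclose(inf_norm, np.linalg.norm(A, np.inf)),
      np.isclose(frob, np.linalg.norm(A, 'fro')))
```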
October 28th, 2021. Homework 1 available here. Vector norms: 2, 1, p and infinity. Remarks on the extensions of the norms to abstract vector spaces: spaces of sequences and spaces of functions. Comments on the motivation and interpretation behind the introduction of a norm. The Cauchy-Schwarz-Bunyakovsky inequality, with a highlight on the structure of the proof. Norms of matrices: operator norms induced by matrix-vector multiplication in p-norms. Comments on the finiteness of the operator norm in the finite-dimensional case. Example of an unbounded (discontinuous) linear operator on an infinite-dimensional space. Comments on the dual space and the isometric isomorphism between dual and primal spaces for R^n.
October 26th, 2021. Online Class here, Blackboard here. Orthogonal matrices and coordinates in the inverse matrix. Full-rank matrices, definitions and comments. Theorem 1.2 of Reference 1 on the equivalence between injectivity and full rank. Equivalence between invertible matrices, full-rank matrices and matrices whose columns form a basis of R^n. Vector norms, definitions and examples. Motivations underlying the introduction of norms and comments on their defining properties. Balls in R^n, examples and the limit of p-norms. Comments on Measure Theory.
October 21st, 2021. Orthogonal bases and coordinates, examples. Orthogonal matrices and their inverses. Orthogonal matrices as isometric transformations. The Gram-Schmidt process. Comments on the existence of bases for general vector spaces and the well-definedness of dimension in the finite case. Complex case: comments on the inner product and the Hermitian transpose as the equivalent of the transpose matrix. Comments on the transpose matrix as the dual or adjoint operator.
October 19th, 2021. Orthogonal Projection onto subspaces: definition and examples. The Projection Theorem: comments on existence and uniqueness. Equivalence between the minimum-distance and orthogonality conditions (characterization). Properties of the Projection operator: linearity, idempotence, null space (kernel), norm contraction. Projection onto the orthogonal subspace. Orthogonal sets and linear independence. Orthogonal and orthonormal bases. Characterization of the Projection onto a subspace in terms of an orthonormal basis.
October 14th, 2021. The four fundamental spaces. Dot product and orthogonality. Orthogonal subspaces and examples. The duality relationship y^t (Ax) = (A^t y)^t x. The relationships between the fundamental spaces. Orthogonal projection of one vector onto another.
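Bases for the fundamental spaces can be read off the SVD, which also exhibits the orthogonality relation between null(A) and row(A) discussed in this session. An illustrative NumPy sketch (the rank-1 test matrix is my own choice):

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0]])        # rank-1 matrix

U, S, Vt = np.linalg.svd(A)
r = int(np.sum(S > 1e-12))             # numerical rank (here r = 1)
row_basis = Vt[:r].T                   # first r right singular vectors: row(A)
null_basis = Vt[r:].T                  # remaining right singular vectors: null(A)

# null(A) is orthogonal to row(A), and A annihilates its null space:
print(np.allclose(row_basis.T @ null_basis, 0), np.allclose(A @ null_basis, 0))
```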
October 12th, 2021. Suspended due to University's Self-Evaluation general meetings.
October 7th, 2021. Linear Algebra Review I: Vector Spaces, Vector Subspaces. Linear combinations, linear independence, spanned subspaces. Vector bases, dimension. Lecture 1 of Reference 1: Matrix-Vector Multiplication, the column space of a matrix, Matrix-Matrix Multiplication.
October 5th, 2021. Introductory class, main motivations behind the Applied Linear Algebra Course: linear systems solution and data linear regression. Evaluation method, bibliography and policies.