Matrix Iterative Methods I

Course Journal

Last update: 10/02/20

Lecture 1 - 02/10/19

Introduction: Presentation, goals, and methods of the course. Refer to:

  • [LSB] Introduction, pp. 1-11;

  • F. Klein, Lectures on Mathematics, Vol. 339, American Mathematical Society, 2000, pp. 49-50;

  • On the Mathematics Curriculum of the High School, The American Mathematical Monthly, Vol. 69, No. 3 (Mar. 1962), pp. 189-193.

Projection processes: Search and constraint subspaces, the projected subsystem, well-defined processes. Refer to:

  • [LSB] Chapter 2, pp. 12-16 (in particular, Theorem 2.1.2).
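
A minimal numerical sketch of one projection step (all names and data below are illustrative, not from [LSB]): the correction to x0 is sought in the search space span(V), and the new residual is forced to be orthogonal to the constraint space span(W) via the projected subsystem.

    import numpy as np

    rng = np.random.default_rng(0)
    n, k = 8, 3
    A = rng.standard_normal((n, n)) + n * np.eye(n)  # a safely nonsingular example
    b = rng.standard_normal(n)
    x0 = np.zeros(n)
    V = rng.standard_normal((n, k))  # basis of the search space
    W = rng.standard_normal((n, k))  # basis of the constraint space

    r0 = b - A @ x0
    # Projected subsystem; the process is well defined iff W^T A V is nonsingular.
    y = np.linalg.solve(W.T @ A @ V, W.T @ r0)
    x1 = x0 + V @ y
    print(np.abs(W.T @ (b - A @ x1)).max())  # ~ machine zero: r1 orthogonal to span(W)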

Lecture 2 - 07/10/19

Projection processes: Orthogonal and oblique projection processes, conditions for well-defined processes, finite termination property, nested search spaces, Krylov subspaces and their properties, Krylov subspace methods. Refer to:

  • [LSB] Chapter 2, pp. 16-22 (in particular, Theorem 2.1.6, Lemma 2.1.7, Lemma 2.2.2, Theorem 2.2.3).
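
A small numerical illustration of the grade of b with respect to A (example mine): dim K_k(A, b) grows by one per step until it stagnates at the grade, which is what yields the finite termination property.

    import numpy as np

    A = np.diag([1.0, 2.0, 3.0, 3.0, 3.0])  # only three distinct eigenvalues
    b = np.ones(5)
    K = np.empty((5, 6))
    v = b.copy()
    for j in range(6):  # columns b, Ab, ..., A^5 b
        K[:, j] = v
        v = A @ v
    for k in range(1, 7):
        print(k, np.linalg.matrix_rank(K[:, :k]))  # ranks 1, 2, 3, 3, 3, 3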

Lecture 3 - 09/10/19

Mathematical characterization of the conjugate gradient method (CG), the minimal residual method (MINRES), and the generalized minimal residual method (GMRES). Optimality and orthogonality. The Arnoldi and Hermitian Lanczos algorithms, elements of the non-Hermitian Lanczos algorithm. Refer to:

  • [LSB] Chapter 2, pp. 22-33 (in particular, Theorem 2.3.1, Theorem 2.3.2).
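
A bare-bones Arnoldi sketch (modified Gram-Schmidt variant; the names are mine, not [LSB] notation). For Hermitian A the Hessenberg matrix H is tridiagonal and the loop collapses to the Hermitian Lanczos three-term recurrence.

    import numpy as np

    def arnoldi(A, v, k):
        """Orthonormal basis V of K_{k+1}(A, v) and H with A V_k = V_{k+1} H_k."""
        n = v.size
        V = np.zeros((n, k + 1))
        H = np.zeros((k + 1, k))
        V[:, 0] = v / np.linalg.norm(v)
        for j in range(k):
            w = A @ V[:, j]
            for i in range(j + 1):  # modified Gram-Schmidt orthogonalization
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            if H[j + 1, j] == 0.0:  # happy breakdown: invariant subspace found
                return V[:, :j + 1], H[:j + 2, :j + 1]
            V[:, j + 1] = w / H[j + 1, j]
        return V, H

    A = np.diag(np.arange(1.0, 11.0))
    V, H = arnoldi(A, np.ones(10), 5)
    print(np.abs(A @ V[:, :5] - V @ H).max())  # Arnoldi relation holds to roundoff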

Lecture 4 - 14/10/19

Derivation of CG using the Hermitian Lanczos algorithm. Derivation of CG from the minimization problem of a quadratic form (first part).
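
A minimal CG sketch for Hermitian positive definite A (the standard Hestenes-Stiefel coupled two-term recurrences; variable names mine). The derivation from the quadratic form identifies alpha as the exact line search minimizer of (1/2) x^T A x - b^T x along the direction p.

    import numpy as np

    def cg(A, b, x0, tol=1e-10, maxit=None):
        x = x0.copy()
        r = b - A @ x
        p = r.copy()
        rr = r @ r
        for _ in range(maxit or b.size):
            Ap = A @ p
            alpha = rr / (p @ Ap)      # exact line search in the A-norm
            x += alpha * p
            r -= alpha * Ap
            rr_new = r @ r
            if np.sqrt(rr_new) < tol:
                break
            p = r + (rr_new / rr) * p  # next A-conjugate direction
            rr = rr_new
        return x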

Lecture 5 - 16/10/19

Derivation of CG from the minimization problem of a quadratic form (second part). Derivation of GMRES using the Arnoldi algorithm. Remarks on short recurrences, local and global orthogonality. The Vorobyev problem of moments and the matching moment property. Connection between CG (Lanczos) and Gauss quadrature. Refer to:

  • [LSB] Chapter 2, pp. 50-53, pp. 57-58;

  • [LSB] Chapter 3, pp. 146-148, pp. 136-139 (in particular, Theorem 3.7.1).
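
A compact GMRES sketch via the Arnoldi relation A V_k = V_{k+1} H_k (names mine; a practical implementation updates a QR factorization of H with Givens rotations instead of calling a dense least squares solver, and it handles the happy breakdown, which is ignored here).

    import numpy as np

    def gmres(A, b, k):
        """k GMRES steps with x0 = 0: minimize ||beta e1 - H y|| over y."""
        n = b.size
        beta = np.linalg.norm(b)
        V = np.zeros((n, k + 1))
        H = np.zeros((k + 1, k))
        V[:, 0] = b / beta
        for j in range(k):
            w = A @ V[:, j]
            for i in range(j + 1):  # modified Gram-Schmidt
                H[i, j] = V[:, i] @ w
                w -= H[i, j] * V[:, i]
            H[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / H[j + 1, j]
        e1 = np.zeros(k + 1)
        e1[0] = beta
        y, *_ = np.linalg.lstsq(H, e1, rcond=None)
        return V[:, :k] @ y

    A = np.diag(np.arange(1.0, 101.0)) + np.eye(100, k=1)  # nonsymmetric example
    b = np.ones(100)
    x = gmres(A, b, 20)
    print(np.linalg.norm(b - A @ x))  # residual norm after 20 steps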

Lecture 6 - 21/10/19

Remarks on Gauss quadrature, matrix functions, Stieltjes integral representation of an operator. Interpolatory quadrature, orthogonal polynomials and Gauss quadrature (first part). Refer to:

  • [LSB] Chapter 3, pp. 76-84 (in particular, Lemmas 3.2.1, 3.2.4, and 3.2.5, Theorem 3.2.7).
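
An illustrative check of the quadrature viewpoint (example mine): k Lanczos steps give the k-point Gauss rule for the Riemann-Stieltjes integral determined by A and b, here applied to f(lambda) = 1/lambda, so that ||b||^2 (T_k^{-1})_{11} approximates b^T A^{-1} b.

    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    lam = rng.uniform(1.0, 10.0, n)
    A = np.diag(lam)                  # HPD example with known spectrum
    b = rng.standard_normal(n)
    exact = b @ (b / lam)             # b^T A^{-1} b

    for k in (2, 4, 6, 8):
        # Lanczos with full reorthogonalization; T is the k-by-k Jacobi matrix
        V = np.zeros((n, k))
        T = np.zeros((k, k))
        V[:, 0] = b / np.linalg.norm(b)
        for j in range(k):
            w = A @ V[:, j]
            h = V[:, :j + 1].T @ w
            w -= V[:, :j + 1] @ h
            T[j, j] = h[j]
            if j + 1 < k:
                T[j, j + 1] = T[j + 1, j] = np.linalg.norm(w)
                V[:, j + 1] = w / T[j + 1, j]
        e1 = np.zeros(k)
        e1[0] = 1.0
        gauss = (b @ b) * np.linalg.solve(T, e1)[0]  # k-point Gauss rule
        print(k, abs(exact - gauss) / exact)         # quadrature error decreases with k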

Lecture 7 - 23/10/19

Interpolatory quadrature, orthogonal polynomials and Gauss quadrature (second part): the interlacing and interlocking properties, three-term recurrences, the eigenvalues of the Jacobi matrix as the roots of the orthogonal polynomials. Refer to:

  • [LSB] Chapter 3, pp. 89-96 (in particular, Theorem 3.3.1, Corollary 3.3.3, and Theorem 3.3.4, the last without proof).
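
A quick numerical check (example mine, not from [LSB]) that the eigenvalues of the k-by-k Jacobi matrix are the roots of the k-th orthogonal polynomial, for the Legendre recurrence: the eigenvalues coincide with the Gauss-Legendre nodes.

    import numpy as np

    k = 6
    j = np.arange(1, k)
    off = j / np.sqrt(4.0 * j**2 - 1.0)     # sqrt(beta_j) for Legendre polynomials
    J = np.diag(off, -1) + np.diag(off, 1)  # alpha_j = 0 by symmetry of the weight
    nodes = np.linalg.eigvalsh(J)
    print(np.allclose(nodes, np.polynomial.legendre.leggauss(k)[0]))  # True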

Lecture 8 - 30/10/19

Algebraic solution of the Stieltjes problem of moments. (Minimal) partial realization and its connection with the Stieltjes problem of moments.
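
A numerical check of the matching moment property behind the Stieltjes problem of moments and minimal partial realization (example mine): after k Lanczos steps, b^T A^j b = ||b||^2 (T_k^j)_{11} for j = 0, ..., 2k-1.

    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 50, 4
    lam = rng.uniform(1.0, 5.0, n)
    A = np.diag(lam)
    b = rng.standard_normal(n)

    # Lanczos with full reorthogonalization (same sketch as for Lecture 6)
    V = np.zeros((n, k))
    T = np.zeros((k, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        h = V[:, :j + 1].T @ w
        w -= V[:, :j + 1] @ h
        T[j, j] = h[j]
        if j + 1 < k:
            T[j, j + 1] = T[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / T[j + 1, j]

    nb2 = b @ b
    Tj = np.eye(k)
    v = b.copy()
    for j in range(2 * k):  # the first 2k moments match
        print(j, np.isclose(b @ v, nb2 * Tj[0, 0]))  # True for j = 0, ..., 2k-1
        v = A @ v
        Tj = T @ Tj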

Lecture 9 - 04/11/19

Summary of the connections between orthogonal polynomials and the Lanczos algorithm. Every Jacobi matrix is generated by a Lanczos process. Eigenvalues and eigenvectors of a Jacobi matrix and the interlacing property. Refer to:

  • [LSB] Chapter 3, pp. 108-110 and pp. 115-117 (in particular, Propositions 3.4.5 and 3.4.7).
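
A numerical illustration of the interlacing property (example mine): the eigenvalues of the leading (k-1)-by-(k-1) principal submatrix of a k-by-k Jacobi matrix strictly interlace those of the full matrix.

    import numpy as np

    rng = np.random.default_rng(3)
    k = 6
    a = rng.standard_normal(k)        # diagonal entries
    c = rng.uniform(0.5, 1.5, k - 1)  # positive off-diagonal entries
    J = np.diag(a) + np.diag(c, 1) + np.diag(c, -1)
    big = np.linalg.eigvalsh(J)
    small = np.linalg.eigvalsh(J[:-1, :-1])
    print(np.all(big[:-1] < small) and np.all(small < big[1:]))  # True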

Lecture 10 - 06/11/19

Persistence Theorem, stabilized nodes. Isometry between the CG and Gauss quadrature errors: the two have the same convergence behavior. Refer to:

  • [LSB] Chapter 3, pp. 121-124 and pp. 129-130 (in particular, Theorem 3.4.9, Definition 3.4.10, Theorem 3.4.12).

  • [LSB] Chapter 3, pp. 140-142 (in particular, Theorem 3.5.2).
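
A numerical check of the isometry (example mine): with x0 = 0, the squared A-norm of the CG error equals b^T A^{-1} b minus its k-point Gauss quadrature approximation ||b||^2 (T_k^{-1})_{11}.

    import numpy as np

    rng = np.random.default_rng(7)
    n, k = 120, 10
    lam = rng.uniform(1.0, 10.0, n)
    A = np.diag(lam)
    b = rng.standard_normal(n)
    x_true = b / lam                              # A^{-1} b for diagonal A

    # k CG steps from x0 = 0
    x = np.zeros(n); r = b.copy(); p = r.copy(); rr = r @ r
    for _ in range(k):
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        p = r + (rr_new / rr) * p
        rr = rr_new
    err2 = (x_true - x) @ (A @ (x_true - x))      # ||x - x_k||_A^2

    # k Lanczos steps from b give T_k, i.e. the k-point Gauss rule
    V = np.zeros((n, k)); T = np.zeros((k, k))
    V[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = A @ V[:, j]
        h = V[:, :j + 1].T @ w
        w -= V[:, :j + 1] @ h
        T[j, j] = h[j]
        if j + 1 < k:
            T[j, j + 1] = T[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / T[j + 1, j]
    e1 = np.zeros(k); e1[0] = 1.0
    gauss = (b @ b) * np.linalg.solve(T, e1)[0]
    print(err2, b @ (b / lam) - gauss)            # the two agree to roundoff here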

Lecture 11 - 11/11/19

Poisson example; the Galerkin finite element method; total, algebraic, and discretization errors; issues in evaluating the error. Localization of the algebraic error (first part).
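
A 1D Poisson toy (example mine): linear finite elements for -u'' = f on (0, 1) with zero boundary values. Solving the resulting tridiagonal system exactly leaves only the discretization error; stopping an iterative solver early adds an algebraic error on top of it.

    import numpy as np

    n = 99
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h
    b = h * np.pi**2 * np.sin(np.pi * x)  # f = pi^2 sin(pi x), so u = sin(pi x)
    u = np.sin(np.pi * x)                 # exact solution at the nodes

    u_h = np.linalg.solve(A, b)           # "exact" algebraic solution
    print(np.abs(u - u_h).max())          # discretization error only

    u_k = np.zeros(n); r = b.copy(); p = r.copy(); rr = r @ r
    for _ in range(20):                   # CG stopped early
        Ap = A @ p
        alpha = rr / (p @ Ap)
        u_k += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        p = r + (rr_new / rr) * p
        rr = rr_new
    print(np.abs(u - u_k).max())          # total error, dominated by the algebraic part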

Lecture 12 - 13/11/19

Summary. In particular: the Galerkin finite element method and CG, the formulation of CG via the Lanczos algorithm, model reduction, Gauss quadrature, the interlacing (and interlocking) property, the Persistence Theorem, the Stieltjes problem of moments, and the realization problem.

Lecture 13 - 20/11/19

Backward errors; localization of algebraic errors (second part). Review of the main differences between direct and iterative methods. Computational cost and remarks on complexity theory. Refer to:

  • [LSB] Chapter 5, pp. 231-250, Chapter 2, pp. 45-47 (in particular, Theorem 2.5.2).

  • Meme of the day

Lecture 14 - 25/11/19

Linear stationary iterative methods and the concept of convergence, Chebyshev polynomials, the Richardson iteration and the Chebyshev semi-iterative method. Refer to:

  • [LSB] Chapter 5, pp. 250-254 (in particular, Theorem 5.5.1).
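
A sketch of the stationary Richardson iteration for SPD A (example mine) with the optimal parameter omega = 2/(lambda_min + lambda_max). Its error contracts by the factor (kappa - 1)/(kappa + 1) per step, which the Chebyshev semi-iterative method improves to roughly (sqrt(kappa) - 1)/(sqrt(kappa) + 1).

    import numpy as np

    rng = np.random.default_rng(4)
    n = 100
    lam = np.linspace(1.0, 10.0, n)       # kappa = 10
    A = np.diag(lam)
    b = rng.standard_normal(n)
    omega = 2.0 / (lam[0] + lam[-1])      # optimal stationary parameter

    x = np.zeros(n)
    for _ in range(50):
        x += omega * (b - A @ x)          # x_{k+1} = x_k + omega * r_k
    print(np.linalg.norm(b - A @ x) / np.linalg.norm(b))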

Lecture 15 - 27/11/19

CG in exact arithmetic. Proof of Theorem 5.6.1 with and without the global orthogonality property. Relation between global orthogonality and optimality. The concept of loss of global orthogonality. Refer to:

  • [LSB] Chapter 5, pp. 258-261 (in particular, Theorem 5.6.1).
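
A finite precision experiment in the spirit of the lecture (example mine): the plain Lanczos three-term recurrence enforces only local orthogonality, and as Ritz values converge, global orthogonality is lost; on this example the deviation of V_k^T V_k from the identity grows far above machine precision.

    import numpy as np

    rng = np.random.default_rng(5)
    n, k = 100, 40
    A = np.diag(np.geomspace(1.0, 1e6, n))  # well separated large eigenvalues
    v = rng.standard_normal(n)

    V = np.zeros((n, k))
    V[:, 0] = v / np.linalg.norm(v)
    beta, v_prev = 0.0, np.zeros(n)
    for j in range(k - 1):                  # three-term recurrence, no reorthogonalization
        w = A @ V[:, j] - beta * v_prev
        alpha = V[:, j] @ w
        w -= alpha * V[:, j]
        beta = np.linalg.norm(w)
        v_prev = V[:, j]
        V[:, j + 1] = w / beta
    G = V.T @ V
    print(np.abs(G - np.eye(k)).max())      # loss of global orthogonality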

Lecture 16 - 02/12/19

Krylov subspace methods as nonlinear methods: the convergence behavior of the Chebyshev semi-iterative method does not describe the convergence behavior of CG. Expressions for the CG error; eigenvalue-based convergence of CG (first part). Refer to:

  • [LSB] Chapter 5, pp. 258-270 (in particular, Corollary 5.6.2, Theorem 5.6.3 with the correction shown in the lecture and a sketch of the proof, Theorem 5.6.5 without proof, Theorem 5.6.6, Corollary 5.6.7).
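
A numerical companion to the eigenvalue-based bound (example mine): CG satisfies ||x - x_k||_A <= 2 ((sqrt(kappa) - 1)/(sqrt(kappa) + 1))^k ||x - x_0||_A, but the actual error depends on the whole spectrum and can be well below this bound.

    import numpy as np

    n = 200
    lam = np.linspace(1.0, 100.0, n)  # kappa = 100
    A = np.diag(lam)
    x_true = np.ones(n)
    b = A @ x_true
    kappa = lam[-1] / lam[0]
    q = (np.sqrt(kappa) - 1.0) / (np.sqrt(kappa) + 1.0)

    x = np.zeros(n); r = b.copy(); p = r.copy(); rr = r @ r
    e0 = np.sqrt((x_true - x) @ (A @ (x_true - x)))
    for k in range(1, 31):
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap
        rr_new = r @ r
        p = r + (rr_new / rr) * p
        rr = rr_new
        if k % 10 == 0:
            ek = np.sqrt((x_true - x) @ (A @ (x_true - x)))
            print(k, ek / e0, 2.0 * q**k)  # actual relative A-norm error vs. bound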

Lecture 17 - 04/12/19

Superlinear convergence and eigenvalue deflation. Remarks on Theorem 5.6.3. Error bound and eigenvalue deflation, the convergence of Ritz values and convergence behavior of CG, examples and illustrations, the effect of a cluster of eigenvalues on CG. Refer to:

  • [LSB] Chapter 5, pp. 275-283 (in particular, Figures 5.7-5.9 and Theorem 5.6.10; Theorem 5.6.9 is skipped).

Lecture 18 - 09/12/19

The effect of a cluster of eigenvalues on CG and the connection with the sensitivity of Gauss quadrature; myths about the CG convergence behavior. GMRES in exact arithmetic: residual bound; convergence bounds (particular cases: normal matrices, SPD matrices, symmetric indefinite matrices); the role of the condition number of the eigenvector matrix; worst-case and ideal GMRES (with a mention of the role of the pseudospectrum and the field of values).

Lecture 19 - 11/12/19

In GMRES, any nonincreasing convergence curve is possible with any prescribed spectrum. Remark on the difference between the non-Hermitian Lanczos and Arnoldi algorithms in terms of subspace bases and the moment matching property. Refer to:

  • [LSB] Chapter 5, pp. 297-303 (in particular, Lemma 5.7.6 and Theorem 5.7.7).

Lecture 20 - 16/12/19

Deflation and the effect of a cluster of eigenvalues on it; Paige's result on Ritz values; Greenbaum's results on CG in finite precision.

Lecture 21 - 18/12/19

Maximal attainable accuracy of CG in finite precision. GMRES in finite precision arithmetic: the choice of the basis (simpler GMRES as an example), the dependence on the orthogonalization process; MGS GMRES is backward stable.
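
A finite precision experiment (example mine, with a Strakos-type spectrum): compare the recursively updated CG residual with the true residual b - A x_k. In finite precision the two eventually part ways, and the true residual stagnates at the level of the maximal attainable accuracy while the updated one keeps decreasing.

    import numpy as np

    n = 48
    i = np.arange(1, n + 1)
    lam = 0.1 + (i - 1) / (n - 1) * (1e4 - 0.1) * 0.8**(n - i)  # Strakos-type spectrum
    A = np.diag(lam)
    b = np.ones(n) / np.sqrt(n)

    x = np.zeros(n); r = b.copy(); p = r.copy(); rr = r @ r
    for it in range(1, 201):
        Ap = A @ p
        alpha = rr / (p @ Ap)
        x += alpha * p
        r -= alpha * Ap                   # recursively updated residual
        rr_new = r @ r
        p = r + (rr_new / rr) * p
        rr = rr_new
        if it % 40 == 0:
            print(it, np.sqrt(rr), np.linalg.norm(b - A @ x))  # updated vs. true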

Lecture 22 - 06/01/20

Other applications of Krylov subspace methods: centrality approximation in complex networks, minimal partial realization, and system identification.
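
A toy illustration of the centrality application (example mine): the subgraph centrality [exp(A)]_{ii} of node i in a graph with adjacency matrix A, estimated by a few Lanczos steps started from e_i (Gauss quadrature for f(x) = exp(x)) and compared with a dense computation; breakdown is not handled in this sketch.

    import numpy as np
    import scipy.linalg as sla

    rng = np.random.default_rng(6)
    n, k, i = 60, 6, 0
    B = rng.random((n, n)) < 0.1
    A = np.triu(B, 1).astype(float)
    A = A + A.T                        # adjacency matrix of a random undirected graph

    v = np.zeros(n); v[i] = 1.0        # start Lanczos from e_i
    V = np.zeros((n, k)); T = np.zeros((k, k))
    V[:, 0] = v
    for j in range(k):
        w = A @ V[:, j]
        h = V[:, :j + 1].T @ w
        w -= V[:, :j + 1] @ h
        T[j, j] = h[j]
        if j + 1 < k:
            T[j, j + 1] = T[j + 1, j] = np.linalg.norm(w)
            V[:, j + 1] = w / T[j + 1, j]

    approx = sla.expm(T)[0, 0]         # ||e_i||^2 e_1^T exp(T_k) e_1
    exact = sla.expm(A)[i, i]
    print(approx, exact)               # the two should agree to several digits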

Lecture 23 - 08/01/20

Questions, discussion, preparation for the exam.

Exam:

Exam dates:

  • 13-14/01/2020

  • 20/01/2020

  • 27/01/2020

The exam is oral only.

Materials and literature:

Literature:

  • [LSB] J. Liesen and Z. Strakos, Krylov Subspace Methods: Principles and Analysis, Oxford University Press, 2012, 408p. (main source);

  • [HB] W. Hackbusch, Iterative Solution of Large Sparse Systems of Equations, Springer-Verlag, 1994, 429p.;

  • [SB] Y. Saad, Iterative Methods for Sparse Linear Systems, SIAM Publications, 2003, 528p.;

  • [VB] Y. V. Vorobyev, Method of Moments in Applied Mathematics, Gordon and Breach Sci. Publ., 1965, 165p.

Material discussed in class (in order of appearance):