Course Title: Convex Analysis for Optimization
Instructors: Juan Vera - Tilburg University (j.c.veralizcano@tilburguniversity.edu) and Olga Kuryatnikova - Erasmus University Rotterdam (kuryatnikova@ese.eur.nl).
Convex Analysis for Optimization is a fundamental course that introduces the principles of convex sets, functions, and optimization problems. Convexity is key because it makes it possible to guarantee optimality. Moreover, many problem classes without optimality guarantees use convex optimization problems as subroutines. The course will cover various methods and algorithms for solving optimization problems.
Week 1: Introduction to Convexity
Intro to convex sets, functions, problems, and applications. Convex, affine, and conic combinations and hulls. Half-spaces and convex cones. Convexity-preserving operations on sets. Standard definitions are recalled after this week's links.
Textbook chapters: 1.1.1, 1.2, 1.3 up to Proposition 1.3.2.
Slides without notes: link
Notes from the class: link
Non-graded exercises for practice: link
Graded exercises: link
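As a quick reference for the Week 1 topics (standard definitions, stated here for convenience and not quoted from the textbook): a set $C \subseteq \mathbb{R}^n$ is convex if
$$\lambda x + (1-\lambda) y \in C \quad \text{for all } x, y \in C,\ \lambda \in [0,1].$$
A convex combination of points $x_1, \dots, x_k$ is $\sum_i \lambda_i x_i$ with $\lambda_i \ge 0$ and $\sum_i \lambda_i = 1$; affine combinations drop the nonnegativity requirement, and conic combinations drop the requirement that the coefficients sum to one. The corresponding hull of a set consists of all such combinations of its points.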
Week 2: Convex Sets
Closedness and relative interior. Recession and normal cones. Extreme points and rays. Polar and dual cones (not yet covered; these will be treated in the next lecture). Standard definitions are recalled after this week's links.
Textbook chapters: 1.3 up to Section 1.3.2, 1.4 up to Section 1.4.1, 2.1, 2.2. For more info on extreme points and faces: Chapter 18 of Rockafellar, Convex Analysis.
Slides without notes: link
Notes from the class: link
Non-graded exercises for practice: link
Graded exercises: link
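For orientation on the Week 2 objects (standard definitions; the notation may differ slightly from the textbook): for a convex set $C$, a cone $K$, and a point $x \in C$,
$$R_C = \{ d : y + \alpha d \in C \ \text{for all } y \in C,\ \alpha \ge 0 \}, \qquad N_C(x) = \{ v : \langle v, y - x \rangle \le 0 \ \text{for all } y \in C \}, \qquad K^\circ = \{ y : \langle y, z \rangle \le 0 \ \text{for all } z \in K \}$$
are the recession cone, the normal cone at $x$, and the polar cone, respectively; the dual cone is $K^* = -K^\circ$.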
Week 3: Finishing convex sets and starting convex functions
Hyperplane theorems. Properties and examples of convex functions. Convexity-preserving operations on functions. Standard definitions are recalled after this week's links.
Textbook chapters: 1.5, 2.2, 2.3 up to 2.3.3. For info on popular convex functions and convexity preserving operations on functions: Sections 3.1, 3.2 of S. Boyd and L. Vandenberghe, Convex Optimization.
Slides without notes: link
Notes from the class: link
Non-graded exercises for practice: link
Graded exercises: link
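As a quick reference for the Week 3 topics (standard statements, not quoted from the textbook): a function $f : \mathbb{R}^n \to \mathbb{R}$ is convex if
$$f(\lambda x + (1-\lambda) y) \le \lambda f(x) + (1-\lambda) f(y) \quad \text{for all } x, y \in \mathbb{R}^n,\ \lambda \in [0,1],$$
and a basic separation statement reads: if $C$ and $D$ are nonempty disjoint convex sets, there exist $a \ne 0$ and $b$ such that $\langle a, x \rangle \le b \le \langle a, y \rangle$ for all $x \in C$ and $y \in D$.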
Week 4: Dual description of convex functions
Continuity and closedness. Differentiability. Subgradients. Proximal operators. Conjugate functions. Standard definitions are recalled after this week's links.
Textbook chapters: 1.1.2-1.1.4, 1.3.2, 1.6, 5.4. For more info on proximal operators: lecture notes, slides.
Slides without notes: link
Notes from the class: link
Non-graded exercises for practice: link
Graded exercises: link
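For quick reference on the Week 4 objects (standard definitions; notation may differ from the textbook): the subdifferential, the conjugate, and the proximal operator of $f$ are
$$\partial f(x) = \{ g : f(y) \ge f(x) + \langle g, y - x \rangle \ \text{for all } y \}, \qquad f^*(y) = \sup_x \{ \langle y, x \rangle - f(x) \}, \qquad \mathrm{prox}_f(x) = \operatorname*{arg\,min}_u \Big\{ f(u) + \tfrac{1}{2} \| u - x \|^2 \Big\}.$$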
Week 5: Duality and Optimization
Fenchel duality and Lagrangian duality. Key formulations are recalled after this week's materials.
Lecture notes
Graded exercises 5
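As a quick reference for Week 5 (standard formulations, stated here for convenience): for the problem $\min_x f(x)$ subject to $g(x) \le 0$, the Lagrangian and the dual function are
$$L(x, \lambda) = f(x) + \lambda^\top g(x), \qquad q(\lambda) = \inf_x L(x, \lambda) \quad (\lambda \ge 0),$$
and weak duality $q(\lambda) \le f(x)$ holds for every feasible $x$ and every $\lambda \ge 0$. In Fenchel form, $\inf_x \{ f(x) + g(x) \} \ge \sup_y \{ -f^*(y) - g^*(-y) \}$, with equality under suitable regularity (e.g., relative interior) conditions.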
Weeks 6-7: Introduction to algorithms - Descent methods
Rates of convergence. Error bounds. Numerical issues. Standard convergence bounds are recalled after this week's materials.
Lecture notes
Graded exercises 6 and exercises 7
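For orientation on the rates discussed in Weeks 6-7 (standard results for gradient descent, stated here for convenience): if $f$ is convex with $L$-Lipschitz gradient, the iteration $x_{k+1} = x_k - \tfrac{1}{L} \nabla f(x_k)$ satisfies the sublinear bound
$$f(x_k) - f^\star \le \frac{L \| x_0 - x^\star \|^2}{2k},$$
and if $f$ is in addition $\mu$-strongly convex, the rate becomes linear, e.g. $\| x_k - x^\star \|^2 \le (1 - \mu/L)^k \| x_0 - x^\star \|^2$.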
Weeks 8-9: Fixed-point iteration point of view - Averaged operators
Convergence properties. Douglas-Rachford, ADMM, and related methods. Equivalence of linear convergence and error bounds. Standard definitions are recalled after this week's materials.
Lecture notes
Graded exercises 8-9
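As a quick reference for the Weeks 8-9 viewpoint (standard definitions and one common form of the iteration, stated here for convenience): an operator $T$ is $\alpha$-averaged, $\alpha \in (0,1)$, if $T = (1-\alpha) I + \alpha R$ for some nonexpansive $R$; if an averaged operator has a fixed point, the iteration $x_{k+1} = T x_k$ converges to one (Krasnosel'skii-Mann). For $\min_x f(x) + g(x)$, one common form of the Douglas-Rachford iteration is
$$y_k = \mathrm{prox}_{\gamma f}(x_k), \qquad z_k = \mathrm{prox}_{\gamma g}(2 y_k - x_k), \qquad x_{k+1} = x_k + z_k - y_k.$$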
Textbook: D. Bertsekas, Convex Optimization Theory, Athena Scientific, 2009. Online version
Extra notes written by the lecturers.
Supplementary Books:
S. Boyd and L. Vandenberghe, Convex Optimization, Cambridge University Press, 2004. Online version
R. T. Rockafellar, Convex Analysis, Princeton University Press, 1970 or later editions. Online version
Assignments are done in groups of at most 2 students. For each week of the course, two problems will be given to be solved (one easier and one harder). These are special exercises, different from the "practice" exercises published each week; in total, this makes 18 problems. Submit all solutions in a single PDF file (either LaTeX or a clean scan of handwritten solutions) to j.c.veralizcano@tilburguniversity.edu by December 8th, 2024, 23:59.