CAO 2022

This is the webpage for the LNMB PhD course "Convex Analysis for Optimization", taught in the academic year 2022-23 by Juan Vera (Tilburg University) and Krzysztof Postek (TU Delft).

Goals TL;DR: Convex analysis is essential for proving optimality of solutions to optimization problems. It also sharpens your understanding of geometry and mathematical analysis. Follow this course if either of these sounds like a useful tool to you.

Goals full version: The aim of the course is to show you how one of the most successful concepts in Mathematics - the straight line and its generalization, the hyperplane - can be used to show that a broad class of optimization problems (convex problems) can be solved to full optimality. Convex analysis is not only useful for convex problems. It also plays an important role in studying (relaxations of) mixed-integer problems and general nonlinear optimization problems. This course is likely to be useful to you if anywhere in your research you need to make claims about the optimality of the solutions you find.

Passing the course: Assignments are done in groups of at most 2 students. For each week of the course, there will be two problems to solve (one easier and one harder). These are special exercises, different from the "practice" exercises published each week; in total, this makes 18 problems. Please submit all solutions as a single PDF file (either LaTeX or scans of reasonably neat handwriting) to k.s.postek@tudelft.nl by December 15th, 2022, 23:59.

Please find the final version of the file with the assignment problems here (click).

Literature: The textbook "Convex Optimization Theory" by D. Bertsekas, available online, plus extra notes written by the lecturers.

Week-by-week readings and exercises will be regularly updated below.

When working on exercises, you might find the website of another book by Dimitri Bertsekas useful, as it contains lists of problems and solutions: link

Videos of the first two lectures of the academic year 2020/2021: Lecture 1 and Lecture 2

Week 1 (12 Sep 2022). Making friends with convexity and the goals of the course.

Concepts discussed: Convex sets, combinations, hulls. Subspaces. Affine combinations, hulls. (Convex) cones and their examples. Dimension of a set as the dimension of its affine hull. Recession cone of a convex set. Convexity-preserving operations: Cartesian product, image and inverse image under affine transformations, intersection, convex hull of the union. Relative interior, closure. What to be careful with when doing calculus of convex sets (closedness, relative interior). Radon's theorem. Application of Radon's theorem to the VC dimension of linear classifiers in machine learning.
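
To make the last point concrete, here is a minimal numerical sketch of Radon's theorem (the theorem itself is from the lecture; the NumPy-based construction and all names in the snippet are our own illustration): any d+2 points in R^d can be split into two groups whose convex hulls intersect.

```python
import numpy as np

def radon_partition(points):
    """Split d+2 points in R^d into two groups whose convex hulls intersect."""
    points = np.asarray(points, dtype=float)   # shape (d+2, d)
    n, d = points.shape
    assert n == d + 2, "Radon's theorem needs exactly d+2 points"
    # Affine dependence: nonzero c with  sum_i c_i x_i = 0  and  sum_i c_i = 0.
    A = np.vstack([points.T, np.ones(n)])      # (d+1) x (d+2), has a null vector
    c = np.linalg.svd(A)[2][-1]                # last right-singular vector
    pos = c > 1e-12
    # Normalizing the positive weights gives a point lying in both convex hulls.
    common = (c[pos] / c[pos].sum()) @ points[pos]
    return np.where(pos)[0], np.where(~pos)[0], common

one, two, point = radon_partition([[0, 0], [1, 0], [0, 1], [1, 1]])
print(one, two, point)   # e.g. groups [0 3] and [1 2], common point [0.5 0.5]
```

The trick is that d+2 points in R^d are always affinely dependent, and splitting the dependence coefficients by sign yields the two groups.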

Reading (only stuff related to the above concepts): Sections 1.1, 1.2, 1.3.1, 1.4

Exercise set week 1 (link).

Week 2 (19 Sep 2022). Convex sets - continuation.

Concepts discussed: Behavior of closedness and relative interior of sets under affine transformations. Line segment principle. Prolongation lemma. Extreme points. Cones, conic combinations, rays and extreme rays. Dual description of a convex set - the supporting hyperplane theorem. Other separation theorems (mentioned briefly). Relation of supporting/separating hyperplanes to the cutting plane algorithm.
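
As a small illustration of the separation idea (our own toy example, not part of the course materials): for a closed convex set C and a point x outside it, the vector x - proj_C(x) is the normal of a hyperplane separating x from C. Below, C is a box, whose Euclidean projection is just coordinate-wise clipping.

```python
import numpy as np

def project_onto_box(x, lo, hi):
    """Euclidean projection onto the box [lo, hi]^n, a simple closed convex set."""
    return np.clip(x, lo, hi)

x = np.array([2.0, -1.5])                 # a point outside the unit box
p = project_onto_box(x, 0.0, 1.0)         # its projection, p = (1, 0)
a = x - p                                 # normal of a separating hyperplane
# Every y in the box satisfies a @ y <= a @ p, while a @ x > a @ p:
print(a, a @ p, a @ x)                    # [ 1.  -1.5]  1.0  4.25
```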

Reading (only stuff related to the above concepts): Sections 1.3.1, 1.5

Exercise set week 2 (link).

Week 3 (26 Sep 2022). Convex functions - primal view.

Concepts discussed: Definition of a convex function and where they pop up. Convexity-preserving operations. Global minimum property of convex functions. Epigraphs, strict convexity, effective domain. Properness. Continuity of convex functions. Closure. Differentiability-based criteria for convexity. Dual cones.
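
A hedged numerical spot-check of the differentiability-based criterion (the criterion is from the lecture; the test function f(x) = log sum_i exp(x_i) and the code are our own choice): a twice-differentiable function is convex iff its Hessian is positive semidefinite everywhere.

```python
import numpy as np

def logsumexp_hessian(x):
    """Hessian of f(x) = log(sum_i exp(x_i)): diag(p) - p p^T with p = softmax(x)."""
    p = np.exp(x - x.max())
    p /= p.sum()
    return np.diag(p) - np.outer(p, p)

rng = np.random.default_rng(0)
for _ in range(5):
    H = logsumexp_hessian(rng.normal(size=4))
    print(np.linalg.eigvalsh(H).min() >= -1e-12)   # True: H is PSD at this point
```

A few random points are of course only evidence, not a proof; for this particular f the positive semidefiniteness can also be verified analytically.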

Reading (only stuff related to the above concepts): Sections 1.1, 1.3.2, 1.3.3

Exercise set week 3 (link).

Week 4 (3 Oct 2022). Dual description of convex functions.

Concepts discussed: Subgradients. Moreau-Rockafellar and Dubovitskii-Milyutin theorems. Conjugate functions. Rules for computing the conjugate of a sum/maximum of two convex functions. Double conjugate as the largest closed convex underapproximator of a given function.
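
A small numerical sketch of the conjugate (the definition f*(y) = sup_x { <y, x> - f(x) } is from the course; the grid approximation and the test function f(x) = x^2/2 are our own): since this f is closed and convex, it is in fact self-conjugate.

```python
import numpy as np

xs = np.linspace(-10.0, 10.0, 20001)      # grid over which we take the sup
fx = 0.5 * xs**2                          # f(x) = x^2 / 2

def conjugate(y):
    """Grid approximation of f*(y) = sup_x { y*x - f(x) }."""
    return np.max(y * xs - fx)

for y in (-2.0, 0.0, 1.5):
    print(y, conjugate(y), 0.5 * y**2)    # last two columns agree closely
```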

Reading (only stuff related to the above concepts): Sections 1.6, 5.4

Exercise set week 4 (link).

Weeks 5-6 (10 and 17 Oct 2022). Convex optimization problems.

Concepts discussed: Convex optimization problems and their key challenges. KKT optimality conditions. Lagrange duality based on subdifferential calculus.
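
As a toy illustration of the KKT conditions (our own example, not from the lecture notes): minimize f(x) = x^2 subject to g(x) = 1 - x <= 0. Stationarity f'(x) + lam * g'(x) = 0, together with lam >= 0 and complementary slackness lam * g(x) = 0, gives x* = 1, lam* = 2, which the snippet below verifies.

```python
x_star, lam_star = 1.0, 2.0
grad_f, grad_g = 2 * x_star, -1.0            # f'(x) = 2x, g'(x) = -1
print(grad_f + lam_star * grad_g)            # 0.0  (stationarity)
print(lam_star * (1.0 - x_star))             # 0.0  (complementary slackness)
print(lam_star >= 0 and 1.0 - x_star <= 0)   # True (dual/primal feasibility)
```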

Readings for these two weeks: (click), or alternatively Sections 3.1 and 3.2 in the book.

Two exercises to sharpen your understanding of the key theorem we proved in Lecture 5: (click)

Weeks 7-9 (31 Oct, 7 Nov, 14 Nov 2022). Algorithms for convex optimization.

Concepts discussed: Gradient and subgradient methods, convergence of algorithms for convex optimization problems.
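
A minimal subgradient method sketch (the step-size rule t_k = 1/k and the test problem f(x) = |x - 3| are our own choices for illustration): the method uses any subgradient in place of the gradient, with diminishing, non-summable step sizes.

```python
def subgrad(x):
    """A subgradient of f(x) = |x - 3| (any value in [-1, 1] works at x = 3)."""
    return 1.0 if x > 3 else (-1.0 if x < 3 else 0.0)

x, best = 0.0, float("inf")
for k in range(1, 2001):
    x -= subgrad(x) / k            # diminishing, non-summable steps t_k = 1/k
    best = min(best, abs(x - 3))   # track the best objective value seen
print(best)                        # close to 0: iterates approach x* = 3
```

Note that, unlike gradient descent on smooth functions, the objective need not decrease monotonically here, which is why one tracks the best value found so far.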

Please see the lecture notes: click