Convex and Distributed Optimization
MSIAM — Université Grenoble Alpes — 2019/2020
Lecturers: Jérôme Malick, Yassine Laguel, Sélim Chraibi
Content
- Introduction to convex optimization: concepts in convex analysis (duality, proximal operators), illustrations in supervised learning.
- Convex optimization algorithms (Gradient, Proximal Gradient, Conditional Gradient, ADMM).
- Stochastic gradient and incremental algorithms (SGD, SAGA, SVRG).
- Distributed optimization algorithms, stochastic algorithms, asynchronous methods.
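To give a flavor of the algorithms covered, here is a minimal sketch of the proximal gradient method (ISTA) applied to the lasso problem min_x 0.5‖Ax − b‖² + λ‖x‖₁, where the proximal operator of the ℓ₁ norm is soft-thresholding. The problem instance and parameter values are illustrative, not taken from the course material.

```python
import numpy as np

def soft_threshold(v, t):
    # Proximal operator of t * ||.||_1 (componentwise soft-thresholding).
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def proximal_gradient(A, b, lam, step, n_iters=500):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient (ISTA)."""
    x = np.zeros(A.shape[1])
    for _ in range(n_iters):
        grad = A.T @ (A @ x - b)                      # gradient of the smooth part
        x = soft_threshold(x - step * grad, step * lam)  # prox step on the l1 part
    return x

# Illustrative sparse recovery instance (synthetic data).
rng = np.random.default_rng(0)
A = rng.standard_normal((50, 10))
x_true = np.zeros(10)
x_true[:3] = [1.0, -2.0, 0.5]
b = A @ x_true
step = 1.0 / np.linalg.norm(A, 2) ** 2   # step = 1/L, with L the Lipschitz constant of the gradient
x_hat = proximal_gradient(A, b, lam=0.1, step=step)
```

Replacing the full gradient by a gradient on a random data point gives the stochastic variants (SGD, SAGA, SVRG) studied in Lecture 2.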
Lectures
- Lecture 0: Overview.
- Lecture 1: First-order optimization.
- Lecture 2: Stochastic optimization.
- Lecture 3: Distributed optimization.
- Lecture 4: Federated optimization.
Evaluation
3 ECTS: report on a practical session (1/3) + presentation of a research article (2/3)