SC-653: Optimisation for Large-Scale Machine Learning

Instructor: Avishek Ghosh, Assistant Professor, SysCon and C-MInDS, IIT Bombay

Contact: Room 110, SysCon; Email: avishek_ghosh@iitb.ac.in 

Timing: Tuesday/Friday 2:00 - 3:25 pm

Classroom: LT 002, Lecture Hall Complex-2

TA: 1) Siddhartha Ganguly (email: sganguly@iitb.ac.in)

     2) Souvik Das (email: souvikd@iitb.ac.in)


Scribe Format: Here 

Scribe Schedule: Here 


About the course: This is a course on optimisation algorithms. It is roughly divided into two parts:

Part I (Foundational): Convergence rates for convex objectives; Gradient Descent (GD); constrained optimisation and algorithms for constrained objectives (Projected GD, Frank-Wolfe); non-smooth optimisation and sub-gradients; large-scale sparse optimisation and proximal methods; stochastic optimisation, SGD, and convergence rates of SGD for convex and non-convex objectives.
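
To make the first topic concrete, here is a minimal sketch of gradient descent on a toy smooth convex objective. The least-squares problem, data, step size, and iteration count below are our own illustrative choices, not course material:

    # Minimal gradient descent (GD) sketch on a toy least-squares
    # objective f(x) = 0.5 * ||Ax - b||^2. Purely illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    A = rng.standard_normal((50, 10))
    b = rng.standard_normal(50)

    def grad(x):
        # Gradient of f: A^T (Ax - b)
        return A.T @ (A @ x - b)

    # Constant step size 1/L, where L (the largest eigenvalue of
    # A^T A) is the smoothness constant of f.
    L = np.linalg.eigvalsh(A.T @ A).max()
    x = np.zeros(10)
    for _ in range(500):
        x = x - grad(x) / L

    # Sanity check against the closed-form least-squares solution.
    x_star, *_ = np.linalg.lstsq(A, b, rcond=None)
    print("distance to optimum:", np.linalg.norm(x - x_star))

For an L-smooth convex objective, this step size yields the classical O(1/k) suboptimality rate, one of the first convergence results covered in Part I.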

Part II (Advanced): Accelerated gradient methods; adaptive gradient methods (Adagrad, ADAM); large-scale learning: Federated Learning (FL), Federated Averaging (FedAvg), convergence of FedAvg, and challenges in FL; neural networks, SGD convergence for neural networks, and benign overfitting.
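
Similarly, for the federated learning topics, the following is a schematic sketch of Federated Averaging under strong simplifying assumptions (full client participation, identical local step counts, toy quadratic client objectives); all data and parameter choices here are illustrative:

    # Schematic FedAvg sketch: each client runs a few local gradient
    # steps on its own least-squares objective, then the server
    # averages the returned models. Purely illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    num_clients, dim = 5, 10
    data = [(rng.standard_normal((20, dim)), rng.standard_normal(20))
            for _ in range(num_clients)]  # client-local (A_i, b_i)

    def local_update(x, A, b, steps=10, lr=1e-2):
        # A few local gradient steps on 0.5 * ||Ax - b||^2.
        for _ in range(steps):
            x = x - lr * A.T @ (A @ x - b)
        return x

    x_global = np.zeros(dim)
    for _ in range(200):
        # Broadcast the global model; every client updates locally.
        locals_ = [local_update(x_global.copy(), A, b) for A, b in data]
        # Server step: average the client models.
        x_global = np.mean(locals_, axis=0)

    # With one local step this is exactly GD on the pooled objective;
    # with several local steps and heterogeneous clients, FedAvg's
    # fixed point is only near the pooled minimiser (client drift).
    A_all = np.vstack([A for A, _ in data])
    b_all = np.concatenate([b for _, b in data])
    x_star, *_ = np.linalg.lstsq(A_all, b_all, rcond=None)
    print("distance to pooled optimum:",
          np.linalg.norm(x_global - x_star))

The residual gap printed at the end does not vanish when clients are heterogeneous and run multiple local steps; this client-drift effect is one of the convergence subtleties of FedAvg studied in Part II.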


Grading: Homeworks (25%), one midterm (25%), final exam (30%), scribing (10%), and class participation (10%)


References: We will not follow a single standard textbook. The references below will be used (PDFs of all of them are available online):

For Part I: 

1) Optimization for Data Analysis; S. Wright and B. Recht (rough draft available here)

2) Convex Optimization; S. Boyd and L. Vandenberghe

3) Numerical Optimization; J. Nocedal and S. Wright

For Part II: 

1) Lectures on Convex Optimization; Y. Nesterov

2) Research papers (to be announced before class)


Apart from these resources, we will also draw on similar courses, such as Convex Optimization (EECS 227C, UC Berkeley) by Prof. Martin J. Wainwright, Theoretical Foundations of Data Science II (DSC 40B, UC San Diego) by Prof. Arya Mazumdar, and Convex Optimization (EECS 227C, UC Berkeley) by Prof. Jiantao Jiao.

Lectures:

Scribes will be uploaded twice: once before the midsem and once before the endsem.


General Guidelines for Homeworks: