Farzin Haddadpour

Previous Position and Education

Postdoc (2021-2023)

Ph.D. (2016-2021)

M.Sc. (2010-2013)

B.Sc. (2006-2010)

Research Interest

I am broadly interested in computational machine learning and in the design and analysis of distributed algorithms, with a focus on communication-efficient and fault-tolerant methods.

Ph.D. Thesis: Communication-efficient and Fault-tolerant Algorithms for Distributed Machine Learning

P.S.: I was extremely fortunate to work with Mehrdad Mahdavi and Viveck Cadambe as their Ph.D. student.

Experience

Mentor: Dr. Ping Li

Mentor: Sidharth Jaggi 

Selected Publications 

Black-Box Generalization
Konstantinos E. Nikolakakis, Farzin Haddadpour, Dionysios S. Kalogerias, and Amin Karbasi.

Learning distributionally robust models at scale via composite optimization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi, and Amin Karbasi.

Federated learning with compression: Unified analysis and sharp guarantees
Farzin Haddadpour, Mohammad Mahdi Kamani, Aryan Mokhtari, and Mehrdad Mahdavi.

Local SGD with periodic averaging: Tighter analysis and adaptive synchronization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi, and Viveck R. Cadambe.

Trading redundancy for communication: Speeding up distributed SGD for non-convex optimization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi, and Viveck R. Cadambe.

On the convergence of local descent methods in federated learning
Farzin Haddadpour and Mehrdad Mahdavi.

Services

Reviewer (2016-ongoing)

ICML, NeurIPS, ICLR, AISTATS, ISIT


Program Committee Member (2021-ongoing)

ICML Workshop on Federated Learning

AAAI Workshop on Federated Learning