Farzin Haddadpour
Postdoc (2021-2023)
Yale Institute for Network Science (YINS), Yale University
Advisor: Amin Karbasi
Ph.D. (2016-2021)
School of Electrical Engineering and Computer Science (EECS), Pennsylvania State University (Penn State)
Advisors: Mehrdad Mahdavi and Viveck Cadambe
B.Sc. (2006-2010)
ECE Department, University of Tabriz
I am broadly interested in computational machine learning and the design and analysis of distributed algorithms, with a focus on:
Federated Optimization
Large-scale Machine Learning
Distributionally Robust Optimization (DRO)
Fault-tolerant Distributed Computing
Ph.D. Thesis: Communication-efficient and Fault-tolerant Algorithms for Distributed Machine Learning
P.S.: I was extremely fortunate to work with Mehrdad Mahdavi and Viveck Cadambe as their Ph.D. student.
Research Intern, Baidu AI, Seattle (Summer 2020)
Mentor: Dr. Ping Li
Research Intern, Chinese University of Hong Kong, Hong Kong (2013)
Mentor: Sidharth Jaggi
Black-Box Generalization
Konstantinos E. Nikolakakis, Farzin Haddadpour, Dionysios S. Kalogerias, and Amin Karbasi.
NeurIPS, 2022
Learning distributionally robust models at scale via composite optimization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi and Amin Karbasi
ICLR, 2022
Federated learning with compression: Unified analysis and sharp guarantees
Farzin Haddadpour, Mohammad Mahdi Kamani, Aryan Mokhtari and Mehrdad Mahdavi
AISTATS, 2021
Local SGD with periodic averaging: Tighter analysis and adaptive synchronization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi and Viveck R. Cadambe
NeurIPS, 2019
Trading redundancy for communication: Speeding up distributed SGD for non-convex optimization
Farzin Haddadpour, Mohammad Mahdi Kamani, Mehrdad Mahdavi and Viveck R. Cadambe
ICML, 2019
On the convergence of local descent methods in federated learning
Farzin Haddadpour and Mehrdad Mahdavi
arXiv, 2019
Reviewer (2016-ongoing)
ICML, NeurIPS, ICLR, AISTATS, ISIT
Program Committee Member (2021-ongoing)