News
[16 Jan 2024] Our work "Adaptive Federated Learning with Auto-Tuned Clients" has been accepted to the Twelfth International Conference on Learning Representations (ICLR 2024).
[02 Nov 2023] Our work "PersA-FL: Personalized Asynchronous Federated Learning" has been accepted to the journal Optimization Methods and Software.
[02 May 2023] I am thrilled to announce that I have been awarded the Lodieska Stockbridge Vaughn Fellowship for 2023-2024.
[08 Dec 2022] Two of our papers were presented at CDC 2022.
[02 Dec 2022] I presented two papers at NeurIPS 2022 workshops on Federated Learning and Meta-Learning.
[30 Sep 2022] Our paper "Unbounded Gradients in Federated Learning with Buffered Asynchronous Aggregation" has been accepted at the 58th Annual Allerton Conference on Communication, Control, and Computing (Allerton).
[23 May 2022] I joined Microsoft as a research intern under the supervision of Dr. Rasoul Shafipour, Mr. Maximilian Golub, and Dr. Bita Darvish-Rouhani.
[13 Mar 2022] I gave a talk on our paper "On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks" at the INFORMS Optimization Society Conference.
[28 Feb 2022] Our paper "On Arbitrary Compression for Decentralized Consensus and Stochastic Optimization over Directed Networks" has been accepted at the European Control Conference (ECC).
[31 Jan 2022] Our paper "Scalable Average Consensus with Compressed Communications" has been accepted at the American Control Conference (ACC).
[26 Jan 2022] I gave a talk on our work "Communication-Efficient and Scalable Algorithms for Decentralized Consensus, Stochastic Optimization, Inference" at the 27th Annual MIT LIDS Student Conference.
[09 Nov 2021] I presented our work "Scalable Average Consensus with Compressed Communications" at the Google 2021 Workshop on Federated Learning and Analytics.
[06 Nov 2021] I presented our work "Communication-efficient Distributed Cooperative Learning with Compressed Beliefs" at the SIAM Texas-Louisiana Section Annual Meeting (TXLA21).
[25 Oct 2021] I presented our work "Scalable Average Consensus with Compressed Communications" at the Ken Kennedy Institute AI and Data Science Conference.
[22 Oct 2021] Our paper "Scalable Average Consensus with Compressed Communications" has been accepted at the NeurIPS 2021 workshop on New Frontiers in Federated Learning: Privacy, Fairness, Robustness, Personalization and Data Ownership.
[04 Aug 2021] Our work on "Scalable Compressed Communication in Distributed Inference and Optimization" was presented at Modeling and Optimization: Theory and Applications (MOPTA 2021).
[07 Jun 2021] I joined Yahoo! Research as a summer research intern.
[09 Apr 2021] I presented our work "Communication-efficient Distributed Cooperative Learning with Compressed Beliefs" at the Communication Efficient Distributed Optimization Workshop.
[05 Mar 2021] I presented our work "Communication-efficient Distributed Cooperative Learning with Compressed Beliefs" at the Ken Kennedy Institute Oil & Gas HPC Conference.
[25 Feb 2021] I presented our work "Communication-efficient Distributed Cooperative Learning with Compressed Beliefs" at the 16th Coordinated Science Laboratory (CSL) Student Conference.
[27 Oct 2020] I gave a talk on "Estimating Drugs Sensitivity on Pancreatic Cancer as a Cold Start Problem in Recommender Systems" at the Ken Kennedy Institute Data Science Conference.
[17 Aug 2020] I presented our work "MP-Boost: Minipatch Boosting via Adaptive Feature and Observation Sampling" at the Summer School of Machine Learning at Skoltech (SMILES).