Federated Learning One World Seminar

Future Talks

October 4, 2023 @ 1pm Coordinated Universal Time (UTC) 

Jongho Park (KAUST)

DualFL: A duality-based federated learning algorithm with communication acceleration in the general convex regime

Host: Sebastian Stich

Abstract: In this talk, we propose a novel training algorithm called DualFL (Dualized Federated Learning) for solving the distributed optimization problem in federated learning. Our approach is based on a specific dual formulation of the federated learning problem. DualFL achieves communication acceleration under various smoothness and strong convexity settings. Moreover, it admits inexact local solvers, preserving its optimal communication complexity even when the local problems are solved only approximately. DualFL is the first federated learning algorithm to achieve communication acceleration even when the cost function is nonsmooth or non-strongly convex.
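For context, the distributed optimization problem underlying federated learning is usually stated as the consensus formulation below. This is a generic sketch of that setup and of how a duality-based method enters; the specific dual formulation used by DualFL is given in the paper.

```latex
% Primal federated learning problem over N clients, each with a local cost f_j:
\min_{\theta \in \mathbb{R}^d} \; F(\theta) := \frac{1}{N} \sum_{j=1}^{N} f_j(\theta).
% A duality-based approach works instead with the convex conjugates
%   f_j^*(y) := \sup_{\theta} \left\{ \langle y, \theta \rangle - f_j(\theta) \right\},
% which yield a dual problem whose block structure matches the clients,
% so each dual block update corresponds to local computation on one client.
```

The appeal of such a dual viewpoint is that each client touches only its own conjugate block between communication rounds, which is what makes inexact local solves and communication-efficient updates natural to analyze.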

This is a joint work with Jinchao Xu.