Abstract: Google’s production system for federated learning (FL) now leverages trusted execution environments (TEEs) to address some of the challenges of cross-device federated learning. The system offers full external verifiability of the server-side components of federated learning, improves operability, and enables scaling to much larger models. In this talk, we’ll trace the history and evolution of FL at Google and introduce an updated definition of federated learning based on its privacy principles (transparency and auditability, data minimization, and data anonymization) rather than on the placement of data processing. We’ll describe how the new approach compares to traditional cross-device federated learning, along with new algorithms and use cases unique to the TEE-hosted federated learning setting.
Katharine Daly has built infrastructure for multiple generations of federated learning and federated analytics systems at Google Research. Recently she has focused on designing scalable systems that achieve verifiable differential privacy guarantees via TEEs for GenAI use cases.
Daniel Ramage directs the Google Research teams responsible for the production systems and research roadmap powering federated learning at Google. He is a co-inventor of federated learning and federated analytics, has overseen their deployment in Google systems, and focuses on systems and methods for private and secure AI.