INRIA Associate Team

First-Order Accelerated Methods for machine learning (FOAM)

Researchers

Alexandre d'Aspremont

co-PI (France)

Cristóbal Guzmán

co-PI (Chile)


Information

  • Team Acronym: FOAM

  • Period of activity: originally 2020-2022, postponed to 2022-2024

  • Principal Investigator INRIA: Alexandre d'Aspremont (ENS, CNRS)

  • Principal Investigator (partner institution): Cristóbal Guzmán (PUC-Chile)

  • Other participants: SIERRA team, INRIA-Chile

  • Keywords: Operations Research, Optimization, Machine Learning, Signal Processing

Objectives

  • Our first goal is to conduct a systematic study of optimal convergence rates in statistical learning, stochastic convex optimization, variational inequalities and fixed-point problems via the performance estimation problem (PEP) framework (a minimal sketch of a PEP appears after this list)

  • Our second goal is to extend the results obtained through PEP to non-Euclidean settings. This may require exploring alternative approaches, such as optimal transport and discretizations of continuous-time dynamics

  • Our final goal is to apply our algorithmic advances in multiple application areas, including the analysis of astronomical data, seriation problems and training of generative adversarial models
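
As a concrete illustration of the PEP methodology, here is a minimal sketch (assuming numpy and cvxpy are available; the horizon N, smoothness L and radius R are illustrative choices, and this is not the team's code). It computes the worst-case objective gap of gradient descent with step size 1/L on L-smooth convex functions: iterates and gradients are encoded through their Gram matrix, and the smooth convex interpolation conditions of Taylor, Hendrickx and Glineur turn the worst-case problem into a semidefinite program.

    import numpy as np
    import cvxpy as cp

    # PEP sketch: worst case of f(x_N) - f(x_*) for gradient descent with
    # step 1/L on L-smooth convex functions, over ||x_0 - x_*|| <= R,
    # cast as an SDP over the Gram matrix of (x_0 - x_*, g_0, ..., g_N).
    N, L, R = 5, 1.0, 1.0          # illustrative horizon, smoothness, radius
    d = N + 2                      # Gram basis: x_0 - x_*, g_0, ..., g_N

    def e(i):
        v = np.zeros(d)
        v[i] = 1.0
        return v

    # Coefficient vectors of iterates and gradients in the Gram basis.
    x = [e(0)]
    for k in range(N):
        x.append(x[k] - e(1 + k) / L)          # x_{k+1} = x_k - g_k / L
    g = [e(1 + k) for k in range(N + 1)]

    # Points: iterates x_0..x_N plus the minimizer x_* (g_* = 0, f_* = 0).
    pts = [(x[k], g[k], k) for k in range(N + 1)]
    pts.append((np.zeros(d), np.zeros(d), None))

    G = cp.Variable((d, d), PSD=True)          # Gram matrix of basis vectors
    f = cp.Variable(N + 1)                     # function values f(x_0..x_N)
    fv = lambda i: 0 if i is None else f[i]    # f_* = 0 by convention

    cons = [G[0, 0] <= R ** 2]                 # ||x_0 - x_*||^2 <= R^2
    for xi, gi, i in pts:
        for xj, gj, j in pts:
            if i == j:
                continue
            # Interpolation condition characterizing L-smooth convex functions.
            cons.append(fv(i) >= fv(j) + gj @ G @ (xi - xj)
                        + (gi - gj) @ G @ (gi - gj) / (2 * L))

    prob = cp.Problem(cp.Maximize(f[N]), cons)
    prob.solve()
    print(prob.value, "vs. known tight bound", L * R ** 2 / (4 * N + 2))

For this step size the solver should recover approximately L R^2 / (4N + 2), the known tight worst-case bound for gradient descent.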

Work Program (2020)

Student supervision: M.Sc.

  • Santiago Armstrong (Eng., PUC): Algorithms for Circular Seriation (supervised by C. Sing-Long and C. Guzmán, graduation: January 2021)

  • Patricio Ulloa (Eng., PUC): Algorithmic Stability via Performance Estimation (supervised by C. Guzmán, graduation: September 2021)

  • Juan Pablo Flores (Math, PUC): Non-Euclidean Stochastic Convex Optimization (supervised by C. Guzmán, expected graduation: July 2022)

Student supervision: Ph.D.

  • Juan Pablo Contreras: topic TBD (supervised by R. Cominetti, expected graduation: 2022)

Publications:

  • Differentially Private Stochastic Optimization: New Results in Convex and Nonconvex Settings (with R. Bassily and M. Menart). NeurIPS 2021.

  • Best-Case Lower Bounds in Online Learning (with N. Mehta and A. Mortazavi). NeurIPS 2021.

  • An Optimal Algorithm for Strict Circular Seriation (with S. Armstrong and C. Sing-Long). SIAM Journal on Mathematics of Data Science.

  • Non-Euclidean Differentially Private Stochastic Convex Optimization (with R. Bassily and A. Nandi). COLT 2021.

  • The Complexity of Nonconvex-Strongly-Concave Minimax Optimization (with S. Zhang, J. Yang, N. Kiyavash and N. He). UAI 2021.

  • Stability of Stochastic Gradient Descent on Nonsmooth Convex Losses (with R. Bassily, V. Feldman and K. Talwar). NeurIPS 2020 (spotlight presentation).

  • A Bregman Method for Structure Learning on Sparse Directed Acyclic Graphs (M. Romain and A. d'Aspremont).

  • Convergence of Constrained Anderson Acceleration (M. Barré, A. Taylor and A. d'Aspremont).

  • Averaging Atmospheric Gas Concentration Data using Wasserstein Barycenters (M. Barré, C. Giron, M. Mazzolini and A. d'Aspremont).

  • FANOK: Knockoffs in Linear Time (A. Askari, Q. Rebjock, A. d'Aspremont and L. El Ghaoui).

  • A Trainable Optimal Transport Embedding for Feature Aggregation (G. Mialon, D. Chen, A. d'Aspremont and J. Mairal).

  • Complexity Guarantees for Polyak Steps with Momentum (M. Barré, A. Taylor and A. d'Aspremont). COLT 2020.

Ongoing research:

  • Algorithms for Strict Circular Seriation. This is the M.Sc. thesis of S. Armstrong. We study the problem of inferring a circular ordering from pairwise dissimilarity measures, an important topic in unsupervised learning with applications in genetics and other fields. We are interested in continuing this line of research by incorporating higher-order spectral information into seriation methods. Preliminary experiments show promising results on real data, and our goal is to provide theoretical analysis that supports these results (A. d'Aspremont, S. Armstrong, C. Guzmán and C. Sing-Long). A sketch of the basic spectral approach appears after this list.

  • Algorithmic Stability via Performance Estimation. We are interested in tight generalization bounds for first-order methods via the PEP approach (C. Guzmán and P. Ulloa).

  • The Complexity of Stochastic Fixed Points. We are interested in finding optimal algorithms and lower bounds for approximating fixed points of nonexpansive operators under stochastic oracles. Although the topic is classical, the stochastic setting has been substantially less studied and, to the best of our knowledge, all existing results concern asymptotic convergence rather than (nonasymptotic) complexity. This project will start in 2021 (R. Cominetti, J.P. Contreras and C. Guzmán). A sketch of a stochastic Krasnoselskii-Mann iteration appears after this list.

  • Stable and Differentially Private Stochastic Convex Optimization. This is the M.Sc. thesis of J.P. Flores. We are interested in settling the sample and computational complexity of differentially private stochastic convex optimization in ℓ_p spaces (J.P. Flores and C. Guzmán). A sketch of the basic gradient-perturbation approach appears after this list.
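
For the circular seriation project, here is a minimal sketch of the standard spectral heuristic (not the optimal algorithm of the SIAM Journal on Mathematics of Data Science paper): embed the items using two Laplacian eigenvectors of a similarity matrix and read a circular order from the angles of the embedding. The Gaussian kernel and the toy data are assumptions for the demo.

    import numpy as np

    def circular_spectral_order(S):
        """S: symmetric nonnegative similarity matrix (larger = more similar)."""
        deg = S.sum(axis=1)
        Lap = np.diag(deg) - S                 # unnormalized graph Laplacian
        w, V = np.linalg.eigh(Lap)             # eigenvalues in ascending order
        y = V[:, 1:3]                          # two smallest nontrivial eigenvectors
        theta = np.arctan2(y[:, 1], y[:, 0])   # angle of the 2-d spectral embedding
        return np.argsort(theta)               # circular order, up to rotation/reflection

    # Toy check: points on a circle, similarity decaying with circular distance.
    rng = np.random.default_rng(0)
    n = 30
    t = np.sort(rng.random(n)) * 2 * np.pi
    dist = np.abs(t[:, None] - t[None, :])
    dist = np.minimum(dist, 2 * np.pi - dist)  # circular (geodesic) distance
    S = np.exp(-dist ** 2)
    perm = circular_spectral_order(S)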
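For the stochastic fixed-point project, here is a minimal sketch of a Krasnoselskii-Mann iteration with a stochastic oracle: the nonexpansive map is only observed with zero-mean noise. The averaged rotation map, the step sizes and the noise level are illustrative assumptions, not choices from the project.

    import numpy as np

    rng = np.random.default_rng(0)
    c, s = np.cos(0.5), np.sin(0.5)
    A = 0.9 * np.array([[c, -s], [s, c]])     # ||A|| <= 1, so T is nonexpansive
    b = np.array([1.0, -2.0])
    T = lambda z: A @ z + b                   # fixed point solves x = A x + b

    x = np.zeros(2)
    sigma = 0.1                               # noise level of the stochastic oracle
    for k in range(1, 10001):
        alpha = 1.0 / np.sqrt(k)              # diminishing averaging weights
        Tx_noisy = T(x) + sigma * rng.standard_normal(2)   # noisy evaluation of T
        x = (1 - alpha) * x + alpha * Tx_noisy             # Krasnoselskii-Mann step

    x_star = np.linalg.solve(np.eye(2) - A, b)             # exact fixed point
    print(np.linalg.norm(x - x_star))         # residual of the stochastic iterate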
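For the private optimization project, here is a minimal sketch of noisy SGD in the Euclidean (ℓ_2) case, the standard gradient-perturbation recipe behind differentially private stochastic convex optimization; the ℓ_p settings studied in the thesis would replace the gradient step with a mirror-descent step. The least-squares data, clipping norm and noise multiplier are illustrative, and sigma is not calibrated to a concrete privacy budget here.

    import numpy as np

    rng = np.random.default_rng(0)
    n, d = 1000, 10
    X = rng.standard_normal((n, d))
    w_true = rng.standard_normal(d)
    y = X @ w_true + 0.1 * rng.standard_normal(n)

    C, sigma, eta, T = 1.0, 1.0, 0.1, 2000    # clip norm, noise mult., step, iters
    w = np.zeros(d)
    for t in range(T):
        i = rng.integers(n)
        g = (X[i] @ w - y[i]) * X[i]          # per-sample least-squares gradient
        g = g / max(1.0, np.linalg.norm(g) / C)        # clip to ell_2 norm C
        g = g + sigma * C * rng.standard_normal(d)     # Gaussian perturbation
        w = w - eta / np.sqrt(t + 1) * g      # SGD step with decaying stepsize
    print(np.linalg.norm(w - w_true))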

Activities: Due to travel restrictions and quarantine measures, most of the planned activities for 2020 had to be suspended. However, to maintain the collaboration, we ran an online seminar and reading group (see https://sites.google.com/view/cguzman/talks-and-events/foam-associate-team/reading-group-on-iterative-methods?authuser=0).

