Continuous time methods for machine learning

ICML 2022 Workshop

The aim of the workshop is to facilitate discussions on the importance of continuous time systems in machine learning.

We encourage discussions on how various fields of machine learning (including architectures, generative models, and optimisation) can benefit from further integration with dynamical systems, and on which applications of machine learning (such as climate modelling) can benefit from such improvements.

Machine learning has traditionally been dominated by discrete-time approaches, such as gradient descent algorithms and discrete layers as the building blocks of neural architectures. Recently, we have seen that bridging these discrete systems with their continuous counterparts not only yields new insights but also leads to novel and competitive ML approaches. By leveraging continuous time, we can tap into centuries of research on dynamical systems, numerical integration and differential equations, and continue expanding what is possible in ML.
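A classic instance of this bridge is that gradient descent is the forward-Euler discretisation of the gradient-flow ODE dθ/dt = −∇L(θ). The sketch below illustrates this on a toy quadratic loss; the loss function and step size are illustrative choices, not something prescribed by the workshop.

```python
# Gradient flow: d(theta)/dt = -grad L(theta).
# Gradient descent is its forward-Euler discretisation with step size h
# (the learning rate). Toy loss: L(theta) = 0.5 * theta**2, so grad = theta.

def grad(theta):
    return theta  # gradient of 0.5 * theta**2

h = 0.1      # learning rate = Euler step size (illustrative value)
theta = 1.0  # initial parameter

for _ in range(100):
    theta = theta - h * grad(theta)  # one forward-Euler / gradient-descent step

# For this loss the discrete iterates have the closed form
# theta_k = theta_0 * (1 - h)**k, which tracks the continuous-time
# solution theta(t) = theta_0 * exp(-t) for small h.
print(theta)
```

Viewing the learning rate as an integration step size is what lets tools from numerical analysis (stability, step-size control, higher-order schemes) transfer to optimisation.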

Workshop goals

  • to disseminate knowledge about the use of continuous time methods in ML;

  • to create a discussion forum and build a vibrant community around the topic;

  • to provide a preview of what dynamical system methods might further bring to ML;

  • to find the biggest hurdles in using continuous time systems in ML and steps to alleviate them;

  • to showcase how continuous time methods can enable ML to have a large impact in certain application domains, such as climate prediction and the physical sciences.


Associate Professor, Stanford University.

Research: probabilistic modeling, learning and inference, with applications to computer vision and sustainability.

Junior Professor, University of Freiburg.

Research: mathematics for deep learning, stochastic gradient descent methods, SDEs, numerical analysis, dynamical systems.

PhD student, Bosch Center for Artificial Intelligence.

Research: dynamical systems and hybrid modeling; Neural Ordinary Differential Equations and their applications.

Assistant Professor, Tel Aviv University.

Research: theoretical and algorithmic foundations of deep learning; analyzing expressiveness, optimization and generalization.

Postdoctoral fellow, University of Tübingen.

Research: Continuous-time models for time-series, video prediction and reinforcement learning; connections with optimal control.


Tatjana Chavdarova

Postdoctoral Fellow

UC Berkeley

Ricky Chen

Research Scientist


Priya Donti

PhD student


Adil Salim

Research Scientist

Microsoft Research


Mihaela Rosca

Staff Research Engineer, DeepMind

PhD student, UCL

Chongli Qin

Research Scientist, DeepMind

Julien Mairal



Université Grenoble Alpes

Marc Deisenroth



Reviewing committee

We would like to thank the reviewers who generously lent their time and expertise to review papers:

  • Alexia Jolicoeur-Martineau

  • Clara Lucía Galimberti

  • Daniel Worrall

  • Elliot Creager

  • Marin Biloš

  • Michael N. Arbel