The One World Seminar Series on the Mathematics of Machine Learning is an online platform for research seminars, workshops, and seasonal schools in theoretical machine learning. The series focuses on theoretical advances in machine learning and deep learning. It was started during the COVID-19 pandemic in 2020 to bring together researchers from around the world for presentations and discussions in a virtual environment, following in the footsteps of other community projects under the One World Umbrella that originated around the same time.
We welcome suggestions for speakers on new and exciting developments, and we are committed to providing a platform for junior researchers as well. We recognize the flexibility that online seminars offer and are experimenting with different formats. Feedback on any of our events is welcome.
Zoom talks are held on Wednesdays at 12:00 pm New York time (9:00 am Pacific).
A list of past seminars can be found here, and recordings can be viewed on our YouTube channel. Invitations to future seminars are shared on this site before each talk and distributed via email.
Wed 14 Jan
Andrew Ilersich
Learning Stochastic Multiscale Models of Spatiotemporal Systems
The physical sciences are replete with dynamical systems that require the resolution of a wide range of length and time scales. This presents significant computational challenges since direct numerical simulation requires discretization at the finest relevant scales, leading to a high-dimensional state space. In this seminar, I discuss my recent paper "Learning Stochastic Multiscale Models," which proposes an approach for learning latent stochastic differential equation models directly from observational data. Drawing inspiration from physics-based multiscale modeling approaches, this approach resolves the macroscale state on a coarse mesh while introducing a microscale latent state to explicitly model unresolved dynamics. It learns the parameters of the multiscale model using a simulator-free variational inference method with a Product of Experts likelihood that enforces scale separation. The learned multiscale models achieve superior predictive accuracy compared to under-resolved direct numerical simulation and closure-type models at equivalent resolution, as well as reduced-order modeling approaches.
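To illustrate the kind of model discussed in the abstract, the sketch below simulates a coupled macroscale/microscale stochastic system with Euler–Maruyama time stepping, where a latent microscale state feeds back into the resolved macroscale dynamics. This is a minimal, hypothetical example: the drift and diffusion functions, dimensions, and coupling are illustrative assumptions, not the model or parameters from the paper.

```python
import numpy as np

# Hypothetical multiscale latent-SDE sketch (not from the paper):
# x is the macroscale state resolved on a coarse "mesh" (here just a
# vector), z is a latent microscale state standing in for unresolved
# dynamics. Both evolve as SDEs, integrated with Euler–Maruyama.

rng = np.random.default_rng(0)

def drift_x(x, z):
    # Macroscale drift; the latent state z contributes a closure-like
    # forcing term (illustrative choice).
    return -x + 0.1 * z

def drift_z(x, z):
    # Fast microscale drift relaxing toward the macroscale state
    # (illustrative scale separation: rate 5.0 >> macroscale rate 1.0).
    return -5.0 * (z - x)

def simulate(x0, z0, dt=1e-3, steps=1000, sigma_x=0.05, sigma_z=0.5):
    """Euler-Maruyama integration of the coupled SDE system."""
    x, z = np.asarray(x0, dtype=float).copy(), np.asarray(z0, dtype=float).copy()
    for _ in range(steps):
        dWx = rng.normal(scale=np.sqrt(dt), size=x.shape)
        dWz = rng.normal(scale=np.sqrt(dt), size=z.shape)
        x = x + drift_x(x, z) * dt + sigma_x * dWx
        z = z + drift_z(x, z) * dt + sigma_z * dWz
    return x, z

x_T, z_T = simulate(np.ones(4), np.zeros(4))
print(x_T.shape, z_T.shape)
```

In the approach described in the talk, the drift and diffusion terms would be learned from data via simulator-free variational inference rather than specified by hand as here.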
Sign up here to join our mailing list and receive announcements. If your browser automatically signs you into a Google account, it may be easiest to join with a university account through an incognito window. For other concerns, please reach out to one of the organizers.
Sign up here for our Google calendar with all seminars.
Ricardo Baptista (University of Toronto)
Wuyang Chen (Simon Fraser University)
Bin Dong (Peking University)
Lyudmila Grigoryeva (University of St. Gallen)
Boumediene Hamzi (Caltech)
Yuka Hashimoto (NTT)
Qianxiao Li (National University of Singapore)
Lizao Li (Google)
George Stepaniants (Caltech)
Zhiqin Xu (Shanghai Jiao Tong University)
Simon Shaolei Du (University of Washington)
Franca Hoffmann (Caltech)
Surbhi Goel (Microsoft Research NY)
Issa Karambal (Quantum Leap Africa)
Tiffany Vlaar (University of Glasgow)
Chao Ma (Stanford University)
Song Mei (UC Berkeley)
Philipp Petersen (University of Vienna)
Matthew Thorpe (University of Warwick)
Stephan Wojtowytsch (University of Pittsburgh)