Women in Theory Forum
Welcome to the Stanford Women in Theory Forum (WTF). We are a group of women-identifying CS-theorist-identifying (and theory-adjacent) folks who meet monthly-ish for socializing, snacks, and a research talk. If you'd like to join our mailing list, please email the organizers, or sign up here: https://mailman.stanford.edu/mailman/listinfo/womens-theory-forum
Organizers: Tselil Schramm (tselil-at-stanford-dot-edu) and Mary Wootters (marykw-at-stanford-dot-edu)
If the weather is nice, we will meet outside in the engineering quad (in the treepit with the whiteboards). If the weather is not so nice, we'll meet inside, probably Gates 415. Join the mailing list for announcements about locations.
Wednesday October 19, 3:30pm: Ellen Vitercik
Title: Theoretical Foundations of Machine Learning for Cutting Plane Selection
Abstract: Cutting-plane methods have enabled remarkable successes in integer programming over the last few decades. State-of-the-art solvers integrate a myriad of cutting-plane techniques to speed up the underlying tree search algorithm used to find optimal solutions. In this talk, we provide sample complexity guarantees for learning high-performing cut-selection policies tailored to the instance distribution at hand. This talk is based on joint work with Nina Balcan, Siddharth Prasad, and Tuomas Sandholm, which appeared at NeurIPS'21.
Wednesday November 16, 3:30pm: Sumegha Garg
Tuesday May 23, 1:30pm: June Vuong
Title: Learning to Generate Multimodal Distributions via Early-Stopped Langevin Diffusions
Abstract: There has been a recent explosion of interest in generative modeling using a score matching approach, i.e. attempting to learn the gradient of the log-likelihood of the true distribution. It is by now well-known that vanilla score matching has significant difficulties learning multimodal distributions, and a number of modifications of score matching have been proposed to overcome this difficulty --- for example, by attempting to additionally learn the score function for noised versions of the ground truth (a potentially more difficult learning task). Is there a natural way to sample multimodal distributions using just the vanilla score? Inspired by a long line of related experimental work, we prove that the Langevin diffusion with early stopping, initialized at the empirical distribution, and run on a score function estimated from data can successfully learn natural multimodal distributions (mixtures of log-concave distributions from parametric families) with sample complexity polynomial in the dimension. Joint work with Frederic Koehler.
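The procedure the abstract describes (Langevin diffusion initialized at the empirical distribution, run on an estimated score, and stopped early) can be illustrated with a small sketch. This is not the authors' implementation: it uses the exact score of a toy two-Gaussian mixture as a stand-in for a score learned from data, and every parameter here (step size, number of steps, mixture means) is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

def score(x, means=(-4.0, 4.0), sigma=1.0):
    """Gradient of the log-density of an equal-weight two-Gaussian mixture.
    (Stands in for a score function estimated from data.)"""
    d = np.stack([x - m for m in means])       # deviations from each mode, (2, n)
    w = np.exp(-d**2 / (2 * sigma**2))
    w /= w.sum(axis=0, keepdims=True)          # posterior weight of each mode
    return -(w * d).sum(axis=0) / sigma**2

# Empirical sample from the true mixture (plays the role of the dataset,
# and of the Langevin chain's initialization).
n = 2000
data = np.where(rng.random(n) < 0.5, -4.0, 4.0) + rng.normal(size=n)

# Langevin updates: x <- x + eta * score(x) + sqrt(2*eta) * noise.
# Stopping after a small number of steps keeps each chain near the mode
# it started in, so the multimodal structure is not lost to slow mixing.
x = data.copy()
eta, steps = 0.05, 50
for _ in range(steps):
    x = x + eta * score(x) + np.sqrt(2 * eta) * rng.normal(size=n)

# Roughly half the chains should remain near each mode.
frac_left = np.mean(x < 0)
```

The early stopping is the point: run to stationarity from a single start, vanilla Langevin can take exponentially long to cross between well-separated modes, but chains seeded at the empirical distribution already cover all modes in the right proportions.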
The schedule has been lost to time.