December 2024, NeurIPS workshop on statistical frontiers in LLMs and foundation models. Predictive inference in multi-environment scenarios.
December 2023, NeurIPS. The s-value: evaluating stability with respect to distributional shifts.
February 2022, ETH Zurich AI symposium. Stability and reliability under distributional shifts. (Invited talk)
January 2022, Causality group, University of Copenhagen, Denmark. Stability under Distributional Shifts. (Invited talk)
November 2021, Industrial Affiliates Conference, Stanford University. Stability under Distributional Shifts. (Invited talk)
October 2021, Statistics student seminar, Columbia University. Stability and reliability under distributional shifts. (Invited talk)
August 2021, Joint Statistical Meetings. The s-value: evaluating stability with respect to distributional shifts.
July 2021, Bernoulli-IMS 10th World Congress in Probability and Statistics, Seoul National University, Seoul, Korea. The s-value: evaluating stability with respect to distributional shifts.
July 2021, Spotlight talk, ICML Workshop on Distribution-free Uncertainty Quantification. Robust Validation: Confident Predictions Even When Distributions Shift.
July 2021, Berkeley-Stanford Joint Colloquium, UC Berkeley. Robust Validation: Confident Predictions Even When Distributions Shift.
October 2020, Stanford Data Practices Conference, Stanford University. Robust Validation: Confident Predictions Even When Distributions Shift.
October 2020, Stanford-Berkeley Joint Colloquium, Student Seminar. Robust Validation: Confident Predictions Even When Distributions Shift.
April 2020, Machine Learning group, Stanford University. Knowing what you know: valid confidence sets in multiclass and multilabel prediction.