Saturday, July 23

Workshop on Distribution-Free Uncertainty Quantification

at ICML 2022 in Baltimore, Maryland

Room #308

Hybrid format.


ICML Website for registrants: https://icml.cc/virtual/2022/workshop/13460

Distribution-free methods enable rigorous uncertainty quantification with any (misspecified) model and (unknown) data distribution.

Accuracy alone does not suffice for reliable, consequential decision-making; we also need uncertainty.

Distribution-free UQ gives finite-sample statistical guarantees for any predictive model, no matter how misspecified, and any data distribution, even if unknown.

DF techniques such as conformal prediction represent a new, principled approach to UQ for complex prediction systems, such as deep learning.

This workshop will bridge applied machine learning and distribution-free uncertainty quantification, catalyzing work at this interface.

This workshop is an inclusive, beginner-friendly introduction to distribution-free UQ. We encourage everybody to submit a paper. Spotlight talks and poster presentations will be selected from the submissions.

Speakers and Panelists


Pietro Perona, Caltech

Emmanuel J. Candès, Stanford University

Victor Chernozhukov, MIT

Zhimei Ren, University of Chicago

Michael I. Jordan, UC Berkeley

Kory D. Johnson, Vienna University of Technology

Insup Lee, University of Pennsylvania

Yao Xie, Georgia Institute of Technology

What is distribution-free uncertainty quantification?

Distribution-free methods make minimal assumptions about the data distribution or model, yet still provide uncertainty quantification. Examples of DF methods include conformal prediction, tolerance regions, risk-controlling prediction sets, calibration by binning, and more. We take a broad outlook on DF methods, so any assumption-light uncertainty quantification approaches are welcome.
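To make the idea concrete, here is a minimal sketch of split conformal prediction, the most common DF method mentioned above. The data, model, and variable names are illustrative, not from any workshop material: any fitted predictor can be plugged in, and the resulting interval covers a new exchangeable test point with probability at least 1 - alpha, regardless of the data distribution or model quality.

```python
import numpy as np

# Split conformal prediction: a minimal illustrative sketch.
rng = np.random.default_rng(0)

# Synthetic regression data (illustrative): y = 2x + noise.
x = rng.uniform(0, 1, 500)
y = 2 * x + rng.normal(0, 0.1, 500)

# Split into a training set (to fit the model) and a calibration set.
x_train, y_train = x[:250], y[:250]
x_cal, y_cal = x[250:], y[250:]

# Fit any predictive model -- here, a simple least-squares line.
coef = np.polyfit(x_train, y_train, deg=1)
predict = lambda t: np.polyval(coef, t)

# Conformal scores: absolute residuals on the held-out calibration set.
scores = np.abs(y_cal - predict(x_cal))

# For miscoverage level alpha, take the ceil((n+1)(1-alpha))/n empirical
# quantile of the calibration scores.
alpha = 0.1
n = len(scores)
q = np.quantile(scores, np.ceil((n + 1) * (1 - alpha)) / n, method="higher")

# Prediction interval for a new point: [f(x) - q, f(x) + q].
# Guarantee: P(y_new in interval) >= 1 - alpha for exchangeable data,
# with no assumptions on the model or the data distribution.
x_new = 0.5
interval = (predict(x_new) - q, predict(x_new) + q)
print(interval)
```

The guarantee is marginal and finite-sample: it holds for the n calibration points plus the test point being exchangeable, which is exactly the "assumption-light" regime this workshop targets.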

Organizers