Uncertainty Quantification through Conformal Regressors and Predictive Systems
Abstract. Conformal prediction is a framework that allows for controlling the error rate of any predictive model. This is achieved by turning point predictions into set predictions, which are guaranteed to include the true target with a probability specified by the user. This guarantee is provided without any model or distributional assumptions, except that of exchangeability, which is weaker than the standard IID assumption. In this talk, we will focus on conformal regressors, which transform point predictions of any underlying regression model into prediction intervals, and on a recent generalization of these, called conformal predictive systems, which produce cumulative distribution functions. From the latter, prediction intervals can be obtained, as well as percentiles, calibrated point predictions, and p-values for given target values. A gentle introduction to the two frameworks will be provided, covering standard, normalized, and Mondrian conformal regressors and predictive systems. The techniques will be illustrated using the Python package "crepes".
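To make the ideas above concrete, here is a minimal from-scratch sketch of a standard (split) conformal regressor and a split conformal predictive system, using only NumPy. The synthetic data, the simple least-squares point predictor, and the function names are illustrative assumptions, not part of the talk or the crepes API; the talk itself demonstrates these techniques with the crepes package.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic illustrative data: y = 2x + noise (an assumption for this sketch).
n = 2000
x = rng.uniform(0, 10, n)
y = 2 * x + rng.normal(0, 1, n)

# Split into proper training, calibration, and test sets.
x_tr, y_tr = x[:1000], y[:1000]
x_cal, y_cal = x[1000:1500], y[1000:1500]
x_te, y_te = x[1500:], y[1500:]

# Any underlying regression model works; here, a least-squares line.
slope, intercept = np.polyfit(x_tr, y_tr, 1)
predict = lambda v: slope * v + intercept

# Standard conformal regressor: nonconformity scores are absolute
# residuals on the calibration set; the interval half-width is the
# ceil((n_cal + 1) * (1 - alpha))-th smallest score.
alpha = 0.05  # target error rate -> 95% prediction intervals
scores = np.sort(np.abs(y_cal - predict(x_cal)))
k = int(np.ceil((len(scores) + 1) * (1 - alpha)))
half_width = scores[k - 1]

# Prediction intervals for the test set and their empirical coverage.
y_hat = predict(x_te)
lower, upper = y_hat - half_width, y_hat + half_width
coverage = np.mean((y_te >= lower) & (y_te <= upper))

# Split conformal predictive system: the predictive CDF at a candidate
# target value v is (roughly) the fraction of calibration signed residuals
# not exceeding v - y_hat, which also yields p-values for given targets.
signed = np.sort(y_cal - predict(x_cal))

def predictive_cdf(x_new, v):
    return np.searchsorted(signed, v - predict(x_new), side="right") / (len(signed) + 1)
```

By the exchangeability argument, the intervals cover the true targets with probability at least 1 - alpha, so the empirical coverage on the test set should be close to 95% here.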
Bio. Henrik Boström is professor of computer science - data science systems at KTH Royal Institute of Technology, Sweden. His research focuses on machine learning algorithms and applications, in particular conformal prediction, ensemble learning, and interpretable and explainable machine learning. He served for several years as an action editor of the Machine Learning journal and of the Data Mining and Knowledge Discovery journal, and is a frequent senior program committee member of several conferences in the area, e.g., SIGKDD and IJCAI. He was general chair of the 11th Symposium on Conformal and Probabilistic Prediction with Applications (COPA 2022) and is the author of the Python package crepes (https://github.com/henrikbostrom/crepes).