Presentations

Quantifying Surrogate Trustworthiness:
A Principled Uncertainty Gauge through a Bayesian Approach

ABSTRACT: Surrogate models are typically deemed trustworthy, or not, on the basis of heuristic diagnostics. This binary view has two limitations: i) if the surrogate is not trustworthy, it is discarded; ii) if the surrogate is trustworthy, the contribution of the surrogate's own uncertainty to the inferential uncertainty on the quantity of interest (QoI) is neglected. Here, we quantify surrogate uncertainty as a continuous random variable and can thus also investigate the inferential uncertainty caused by the surrogate uncertainty itself. Curiously, from this assumption a generalized measure of surrogate trustworthiness emerges naturally, gauged to an objective scale. This suggests a new interpretation of the convergence of, and the uncertainties obtained with, surrogates. For generalized linear surrogate models and a Student-t likelihood for the simulation data, we find simple Bayesian estimates for the surrogate uncertainty and its effect on the QoI uncertainty. The resulting terms are identified as input-parametric uncertainty and inferential uncertainty [Ranftl & von der Linden 2021: Bayesian Surrogate Analysis and Uncertainty Propagation. DOI: 10.3390/psf2021003006]. We discuss the special cases of Polynomial Chaos and Gaussian Processes, and demonstrate a numerical example where the surrogate uncertainties are in part negligible and in part non-negligible [Ranftl et al. 2022: A Bayesian approach to Blood Rheological Uncertainties in Aortic Hemodynamics. DOI: 10.1002/cnm.3576].
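For readers who want a concrete picture of the kind of estimate described above, here is a minimal sketch (not the implementation from the paper): a generalized linear surrogate with a conjugate normal-inverse-gamma prior, whose marginal posterior predictive is a Student-t, fitted to a handful of toy "simulation" samples. The basis, the toy simulator, the prior values and all variable names are assumptions made only for this illustration.

```python
# Minimal sketch (not the paper's implementation): a generalized linear surrogate
# y(x) = Phi(x) @ w with a conjugate normal-inverse-gamma prior on (w, sigma^2),
# which gives a Student-t posterior predictive in closed form.
import numpy as np

def basis(x, degree=3):
    """Polynomial feature map Phi(x); stands in for any generalized linear basis."""
    return np.vander(np.atleast_1d(x), degree + 1, increasing=True)

rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=8)             # few expensive "simulation" runs
y_train = np.sin(np.pi * x_train) + 0.05 * rng.standard_normal(8)  # toy simulator output

Phi = basis(x_train)
n, p = Phi.shape

# Weakly informative conjugate prior: w | s2 ~ N(0, s2 * V0),  s2 ~ Inv-Gamma(a0, b0)
V0_inv = np.eye(p) / 10.0
a0, b0 = 1e-3, 1e-3

# Closed-form posterior updates (standard Bayesian linear-regression algebra)
A = V0_inv + Phi.T @ Phi            # posterior precision of the weights (up to s2)
Vn = np.linalg.inv(A)
wn = Vn @ Phi.T @ y_train
an = a0 + 0.5 * n
bn = b0 + 0.5 * (y_train @ y_train - wn @ A @ wn)

# Student-t posterior predictive at new inputs: location, scale and degrees of freedom
x_new = np.linspace(-1.0, 1.0, 5)
Phi_new = basis(x_new)
loc = Phi_new @ wn
scale2 = (bn / an) * (1.0 + np.einsum('ij,jk,ik->i', Phi_new, Vn, Phi_new))
dof = 2.0 * an

print("predictive mean:", np.round(loc, 3))
print("predictive sd  :", np.round(np.sqrt(scale2 * dof / (dof - 2.0)), 3))
```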

Presented at the SIAM UQ 2022 Meeting (hybrid) in Atlanta, GA, USA, 11th-15th April 2022
Some media players might have difficulties with the audio. Switch to a different browser or player if sound does not work.

My slides are available here as PDF and PPT

Bayesian Surrogate Analysis and Uncertainty Propagation

ABSTRACT: The quantification of uncertainties of computer simulations due to input parameter uncertainties is paramount to assess a model’s credibility. For computationally expensive simulations, this is often feasible only via surrogate models that are learned from a small set of simulation samples. The surrogate models are commonly chosen and deemed trustworthy based on heuristic measures, and substituted for the simulation in order to approximately propagate the simulation input uncertainties to the simulation output. In the process, the contribution of the uncertainties of the surrogate itself to the simulation output uncertainties is usually neglected. In this work, we specifically address the case of doubtful surrogate trustworthiness, i.e., non-negligible surrogate uncertainties. We find that Bayesian probability theory yields a natural measure of surrogate trustworthiness, and that surrogate uncertainties can easily be included in simulation output uncertainties. For a Gaussian likelihood for the simulation data, with unknown surrogate variance and given a generalized linear surrogate model, the resulting formulas reduce to simple matrix multiplications. The framework contains Polynomial Chaos Expansions as a special case, and is easily extended to Gaussian Process Regression. Additionally, we show a simple way to implicitly include spatio-temporal correlations. Lastly, we demonstrate a numerical example where surrogate uncertainties are in part negligible and in part non-negligible.
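As a hedged illustration of the propagation step described in the abstract (again with a toy simulator, a polynomial basis and prior values chosen only for this sketch, not taken from the paper), the snippet below builds a Bayesian linear surrogate from plain matrix products, then pushes an uncertain input through it twice: once with the posterior-mean surrogate only, and once while also drawing the surrogate weights from their posterior, so that the surrogate's own contribution to the output spread becomes visible.

```python
# Minimal sketch (toy setup, not the paper's code): propagate an uncertain input
# through a Bayesian linear surrogate and compare the quantity-of-interest (QoI)
# spread with and without the surrogate's own uncertainty.
import numpy as np

rng = np.random.default_rng(1)

def simulator(x):                       # stand-in for an expensive simulation
    return np.exp(-x) * np.sin(2.0 * x)

def basis(x, degree=4):                 # generalized linear basis (here: polynomials)
    return np.vander(np.atleast_1d(x), degree + 1, increasing=True)

# A few "simulation" runs serve as training data for the surrogate
x_train = np.linspace(0.0, 2.0, 7)
y_train = simulator(x_train)

# Gaussian likelihood (assumed known noise sd) and Gaussian prior on the weights:
# the posterior mean and covariance are plain matrix products.
sigma, tau = 0.05, 10.0
Phi = basis(x_train)
cov_w = np.linalg.inv(Phi.T @ Phi / sigma**2 + np.eye(Phi.shape[1]) / tau**2)
mean_w = cov_w @ Phi.T @ y_train / sigma**2

# Uncertain simulation input, e.g. x ~ N(1.0, 0.2^2)
x_samples = rng.normal(1.0, 0.2, size=20_000)
Phi_s = basis(x_samples)

# (a) input-parametric uncertainty only: plug samples into the posterior-mean surrogate
qoi_input_only = Phi_s @ mean_w

# (b) additionally draw the surrogate weights from their posterior
w_draws = rng.multivariate_normal(mean_w, cov_w, size=20_000)
qoi_full = np.einsum('ij,ij->i', Phi_s, w_draws)

print("QoI sd, input uncertainty only      :", float(qoi_input_only.std()))
print("QoI sd, incl. surrogate uncertainty :", float(qoi_full.std()))
```

The gap between the two printed spreads is precisely the surrogate contribution that, as the abstract notes, is usually neglected.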

Presented at the 40th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering
MaxEnt 2020/2021, Virtual, 4th-9th July 2021
Some media players might have difficulties with the audio. Switch to a different browser or player if sound does not work.

Find the slides HERE

Find the paper: Bayesian Surrogate Analysis and Uncertainty Propagation. Phys. Sci. Forum 2021, 3, 6.
S. Ranftl and W. von der Linden
https://doi.org/10.3390/psf2021003006

Cite this video as:
Ranftl, S. (2021, July 6). Bayesian Surrogate Analysis and Uncertainty Propagation. MaxEnt 2021. Graz University of Technology.
DOI: https://doi.org/10.3217/kn2sx-3qs12 . LINK: https://repository.tugraz.at/records/kn2sx-3qs12

On the Uncertainty of the Surrogate - a Bayesian Estimate

Presented at the RAMSES workshop, 14th-17th Dec 2021, SISSA, Trieste, Italy (indico.sissa.it/event/43).
My slides are available HERE

ramses2021-ranftl.pdf

Bayesian Uncertainty Quantification for Numerical Simulations of Aortic Dissection

Presentation of my Doctoral Thesis at Graz University of Technology. Slides are available HERE

BUQNSAD-ranftl.pdf

On the Diagnosis of Aortic Dissection with Impedance Cardiography:
A Bayesian Feasibility Study Framework with Multi-Fidelity Simulation Data

Presented at the 39th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering
MaxEnt 2019, 30th June - 5th July 2019, Max Planck Institute for Plasma Physics, Garching/Munich, Germany

My slides are available HERE

Find the full paper: Bayesian Uncertainty Quantification with Multi-Fidelity Data and Gaussian Processes for Impedance Cardiography of Aortic Dissection. Entropy 2020, 22, 58. Ranftl, S.; Melito, G.M.; Badeli, V.; Reinbacher-Köstinger, A.; Ellermann, K.; von der Linden, W. https://doi.org/10.3390/e22010058

maxent2019-ranftl.pdf

ABSTRACT: Aortic dissection is a cardiovascular disease with a disconcertingly high mortality. When it comes to diagnosis, medical imaging techniques such as Computed Tomography, Magnetic Resonance Tomography or Ultrasound certainly do the job, but also have their shortcomings. Impedance cardiography is a standard method to monitor a patient's heart function and circulatory system by injecting electric currents and measuring voltage drops between electrode pairs attached to the human body. If such measurements could distinguish healthy from dissected aortas, clinical procedures could be improved. Experiments are quite difficult, and thus we investigate the feasibility with finite element simulations beforehand. In these simulations, we find uncertain input parameters, e.g., the electrical conductivity of blood. Inference on the state of the aorta from impedance measurements defines an inverse problem in which forward uncertainty propagation through the simulation with vanilla Monte Carlo demands a prohibitively large computational effort. To overcome this limitation, we combine two simulations: a high-fidelity simulation and a low-fidelity simulation, with correspondingly high and low computational cost. We use the inexpensive low-fidelity simulation to learn about the expensive high-fidelity simulation. It all boils down to a regression problem, and it reduces the total computational cost after all.
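A minimal sketch of the multi-fidelity idea described in the abstract, in which everything (the two toy "simulators", the kernel and its hyperparameters) is an assumption for illustration rather than the study's actual finite element setup: a Gaussian Process is trained on only a few high-fidelity samples, with the cheap low-fidelity output supplied as an additional regression input.

```python
# Minimal sketch of multi-fidelity regression (illustrative assumptions only,
# not the study's finite element setup): learn the expensive high-fidelity
# response from few samples by feeding the cheap low-fidelity response to a
# Gaussian Process as an extra input.
import numpy as np

def high_fidelity(x):                   # stand-in for the expensive simulation
    return (6.0 * x - 2.0) ** 2 * np.sin(12.0 * x - 4.0)

def low_fidelity(x):                    # cheap, biased approximation
    return 0.5 * high_fidelity(x) + 10.0 * (x - 0.5) - 5.0

def rbf(A, B, lengthscale=1.0, variance=100.0):
    """Squared-exponential kernel between two sets of input points."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale**2)

# Only a handful of expensive high-fidelity runs; the low-fidelity model is cheap everywhere
x_hi = np.array([0.0, 0.35, 0.6, 0.85, 1.0])
X_train = np.column_stack([x_hi, low_fidelity(x_hi)])   # GP input = (x, y_low(x))
y_train = high_fidelity(x_hi)

x_test = np.linspace(0.0, 1.0, 6)
X_test = np.column_stack([x_test, low_fidelity(x_test)])

# Standardise the inputs so a single lengthscale is reasonable for both dimensions
mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
Z_train, Z_test = (X_train - mu) / sd, (X_test - mu) / sd

# Standard GP regression equations with a small noise jitter
K = rbf(Z_train, Z_train) + 1e-6 * np.eye(len(x_hi))
K_star = rbf(Z_test, Z_train)
mean = K_star @ np.linalg.solve(K, y_train)
cov = rbf(Z_test, Z_test) - K_star @ np.linalg.solve(K, K_star.T)
sd_pred = np.sqrt(np.maximum(np.diag(cov), 0.0))

for x, m, s in zip(x_test, mean, sd_pred):
    print(f"x={x:.2f}  GP mean={m:+7.2f} +- {s:.2f}   high-fidelity truth={high_fidelity(x):+7.2f}")
```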

Bayesian Analysis of Femtosecond Pump-Probe Photoelectron-Photoion Coincidence Spectra

Presented at the 38th International Workshop on Bayesian Inference and Maximum Entropy Methods in Science and Engineering
MaxEnt 2018, 2nd-6th July 2018, Alan Turing Institute, London, UK

This talk was honoured with the Young Author Best Paper Award sponsored by Entropy!

My slides are available here as PDF and PPT

Find the full paper: Bayesian Analysis of Femtosecond Pump-Probe Photoelectron-Photoion Coincidence Spectra with Fluctuating Laser Intensities. Entropy 2019, 21, 93. Heim, P.; Rumetshofer, M.; Ranftl, S.; Thaler, B.; Ernst, W.E.; Koch, M.; von der Linden, W. https://doi.org/10.3390/e21010093

Sascha-Ranftl-Bayesian-Analysis-of-Femtosecond-Pump-Probe-Photoelectron-Photoion-Coincidence-Spectra-MaxEnt-2018-presentation.pdf

ABSTRACT: Ultrafast processes on a femtosecond timescale in excited molecules can be observed with pump-probe measurements, in which information about the dynamics is obtained from the transient signal associated with the excited state. Background signals caused by pump and/or probe pulses alone often obscure the signals of interest. Simple subtraction of pump-only and/or probe-only measurements from the pump-probe measurement results in a degradation of the signal-to-noise ratio and, in the case of coincidence detection, the danger of over-subtracting the background. Here we present a Bayesian approach that overcomes these problems. For a pump-probe experiment with photoelectron-photoion coincidence detection, we reconstruct the excited-state spectrum of interest from pump-probe and pump-only measurements. We demonstrate that the Bayesian formalism has many advantages over simple signal subtraction, such as compensation of false coincidences, no overestimation of pump-only contributions, a significantly increased signal-to-noise ratio, and applicability to any experimental situation and noise statistics. Most importantly, by accounting for false coincidences, our approach allows experiments to be run at higher ionization rates, resulting in a significant reduction of data acquisition times. Our method is thoroughly scrutinized with mock data and applied to experiments on acetone molecules, enabling novel interpretations of the spectra.
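To make the idea tangible, here is a heavily simplified sketch (the paper's model additionally treats false coincidences and fluctuating laser intensities; the counts, priors and grids below are assumptions for illustration only): per spectral bin, the excited-state rate is inferred from pump-probe and pump-only Poisson counts instead of naively subtracting the two histograms, which avoids negative "signals" and over-subtraction.

```python
# Heavily simplified sketch (the paper's model also treats false coincidences and
# fluctuating laser intensities; all numbers here are illustrative assumptions):
# per spectral bin, infer the excited-state rate s from pump-probe counts
# d ~ Poisson(s + b) and pump-only counts m ~ Poisson(b), rather than computing d - m.
import numpy as np
from scipy.stats import poisson, gamma

rng = np.random.default_rng(2)

# Toy data for a handful of spectral bins
s_true = np.array([0.0, 2.0, 8.0, 3.0, 0.5])      # excited-state signal rates
b_true = np.array([5.0, 5.0, 6.0, 7.0, 5.0])      # background rates
d_obs = rng.poisson(s_true + b_true)              # pump-probe measurement
m_obs = rng.poisson(b_true)                       # pump-only (background) measurement

s_grid = np.linspace(0.0, 30.0, 601)              # flat prior on s >= 0, evaluated on a grid
b_grid = np.linspace(1e-3, 30.0, 601)

for i, (d, m) in enumerate(zip(d_obs, m_obs)):
    # Background posterior from the pump-only data (Jeffreys prior -> Gamma(m + 1/2, 1))
    p_b = gamma.pdf(b_grid, a=m + 0.5, scale=1.0)
    # p(s | d, m) is proportional to the integral over b of Poisson(d | s + b) * p(b | m),
    # i.e. the background is marginalised instead of subtracted
    like = poisson.pmf(d, s_grid[:, None] + b_grid[None, :])
    post = (like * p_b).sum(axis=1)
    post /= post.sum()
    s_mean = float((s_grid * post).sum())
    print(f"bin {i}: naive d - m = {int(d) - int(m):+d},  Bayesian E[s | data] = {s_mean:5.2f}, "
          f"true s = {s_true[i]:.1f}")
```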