Workshop - Uncertainty Quantification, Machine Learning & Bayesian Statistics in Scientific Computing
1 - 5 July 2019, Ruprecht-Karls-Universität Heidelberg
Scientific computing concerns the development of mathematical models and high-performance software to describe, simulate and learn the behaviour of complex phenomena. Applications arise from all areas of applied science (e.g. engineering, physics, biology, chemistry) and typically involve the challenging task of quantifying high-dimensional uncertainty due to both known unknowns and unknown unknowns in the natural system. Combined with the computational burden of approximating and solving complex mathematical models, this means that standard inference algorithms, e.g. for parameter estimation, prediction or optimization, quickly become infeasible within a reasonable computational budget.
The aim of the workshop is to bring together researchers working in Uncertainty Quantification, Machine Learning and Bayesian Statistics, with a particular focus on high- and infinite-dimensional problems from scientific computing, where the sparsity or uncertainty of data requires the integration of inference and learning algorithms with established physical models, such as partial differential equations. Advances in this complex field of research require a concerted effort from many disciplines, which we hope to foster at the workshop.
This workshop is part of the Thematic Semester Uncertainty Quantification, Machine Learning & Bayesian Statistics in Scientific Computing at MAThematics Center Heidelberg (MATCH) in conjunction with the Excellence Cluster STRUCTURES. The financial support from MATCH and from the Heidelberg Graduate School of Mathematical and Computational Methods for the Sciences (HGS MathComp) is gratefully acknowledged.
Main organizer: Professor Dr. Robert Scheichl (Heidelberg University)
Co-organizer: Gianluca Detommaso (University of Bath)
Administration: Herta Fitzer (Heidelberg University)