The link for online participation is sent out before the talk through our mailing list.




Tuesday, 04 June 2024, 13:30–15:00h, Room G309


Speaker: Prof. Dr. Lyudmila Grigoryeva


Title: Learning Dynamic Processes with Reservoir Computing


Host: TT.-Prof. Dr. Tobias Sutter


Abstract: Many dynamic problems in engineering, control theory, signal processing, time series analysis, and forecasting can be described using input/output (IO) systems. Whenever a true functional IO relation cannot be derived from first principles, parsimonious and computationally efficient state-space systems can be used as universal approximants. We have shown that Reservoir Computing (RC) state-space systems with simple and easy-to-implement architectures enjoy universal approximation properties, proved in a variety of setups. The defining feature of RC systems is that some components (usually the state map) are randomly generated, and the observation equation is of a tractable form. From the machine learning perspective, RC systems can be seen as recurrent neural networks with random weights and a simple-to-train readout layer (often a linear map). RC systems serve as efficient, randomized, online computational tools for learning dynamic processes and enjoy generalization properties that can be explicitly derived. We will give a general introduction to recent theoretical developments, discuss connections with research contributions in other fields, and address details of applications of RC systems.
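A common concrete instance of the RC idea described above is the echo state network: a randomly generated, fixed state map driven by the input, followed by a linear readout that is the only trained component. The sketch below illustrates this on a toy one-step-ahead prediction task; the dimensions, scalings, and the ridge-regression readout are illustrative choices, not details from the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

# Randomly generated state map (the "reservoir"): fixed, never trained.
n_in, n_res = 1, 100
W_in = rng.uniform(-0.5, 0.5, (n_res, n_in))
W = rng.uniform(-0.5, 0.5, (n_res, n_res))
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

def run_reservoir(inputs):
    """Drive the random state map with an input sequence and collect states."""
    x = np.zeros(n_res)
    states = []
    for u in inputs:
        x = np.tanh(W @ x + W_in @ np.atleast_1d(u))
        states.append(x.copy())
    return np.array(states)

# Toy task: one-step-ahead prediction of a sine wave.
t = np.linspace(0, 8 * np.pi, 500)
u, y = np.sin(t[:-1]), np.sin(t[1:])

X = run_reservoir(u)
washout = 50  # discard initial transient states
X_tr, y_tr = X[washout:], y[washout:]

# Linear readout trained by ridge regression -- the only trained part.
reg = 1e-8
W_out = np.linalg.solve(X_tr.T @ X_tr + reg * np.eye(n_res), X_tr.T @ y_tr)

pred = X_tr @ W_out
rmse = np.sqrt(np.mean((pred - y_tr) ** 2))
```

Because only the linear readout is fitted, training reduces to a single least-squares solve, which is what makes RC systems cheap, online-friendly learners.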

PDF Version




Tuesday, 25 June 2024, 13:30–15:00h, Room G309


Speaker: Dr. Matthias C. Caro


Title: (Un-)Decidability and (Un-)Computability in Machine Learning


Host: Dr. Lothar Sebastian Krapp


Abstract: While statistical learning theory has its origins in statistics and probability theory, algorithmic aspects of learning procedures, such as computational efficiency, have come into focus with the development of computational learning theory by computer scientists. Nevertheless, partially due to their statistical origins, fundamental results in learning theory are often phrased without taking requirements of (Turing) computability into account. In this talk, I will give an introduction to recent work exploring the intersection of learning theory and computability theory. On the one hand, I will discuss results about the undecidability of (different notions of) learnability. On the other hand, I will outline current developments aiming towards a theory of computable learnability. Thereby, I hope to demonstrate the fruitfulness of revisiting our established understanding of learning theory with an emphasis on learning as an algorithmic process.

PDF Version




Tuesday, 02 July 2024, 13:30–15:00h, Room M628


Speaker: Prof. Dr. Gitta Kutyniok


Title: Reliable AI: Successes, Challenges, and Limitations


Host: Prof. Dr. Stefan Volkwein


Abstract: Artificial intelligence is currently leading to one breakthrough after the other, in industry, public life, and the sciences. However, one current major drawback is the lack of reliability of such methodologies.

In this talk we will take a mathematical viewpoint towards this problem, showing the power of such approaches to reliability. We will first provide an introduction to this vibrant research area and also discuss the impact of the EU AI Act and the G7 Hiroshima Process. We will then survey recent advances, in particular concerning generalization and explainability. This is followed by a discussion of fundamental limitations that affect the reliability of artificial intelligence, and we will present solutions to this serious obstacle in terms of an intriguing connection to next-generation AI computing.

PDF Version




Tuesday, 16 July 2024, 13:30–15:00h, Room L0602


Speaker: Asst. Prof. Dr. Ariel Neufeld


Title: Quantum Monte Carlo algorithm for solving Black-Scholes PDEs for high-dimensional option pricing in finance and its complexity analysis


Host: TT.-Prof. Dr. Tobias Sutter


Abstract: In this talk we present a quantum Monte Carlo algorithm to solve high-dimensional Black-Scholes PDEs with correlation for high-dimensional option pricing. The payoff function of the option is of general form and is only required to be continuous and piecewise affine (CPWA), which covers most of the relevant payoff functions used in finance. We provide a rigorous error analysis and complexity analysis of our algorithm. In particular, we prove that the computational complexity of our algorithm is bounded polynomially in the space dimension d of the PDE and the reciprocal of the prescribed accuracy ε. Moreover, we show that for payoff functions which are bounded, our algorithm indeed has a speed-up compared to classical Monte Carlo methods. Furthermore, we present numerical simulations in one and two dimensions using our developed package within the Qiskit framework tailored to price CPWA options with respect to the Black-Scholes model, and we discuss the potential extension of the numerical simulations to arbitrary space dimensions. This talk is based on joint work with Jianjun Chen and Yongming Li.
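For orientation, the classical Monte Carlo baseline against which the quantum algorithm is compared can be sketched as follows for the simplest CPWA payoff, the vanilla call, in one dimension. This is not the speaker's quantum algorithm, and all parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Black-Scholes model parameters (illustrative values).
s0, r, sigma, T = 100.0, 0.02, 0.2, 1.0
K = 100.0  # strike

def cpwa_payoff(s):
    """A CPWA payoff: the vanilla call max(s - K, 0) is continuous
    and piecewise affine in the terminal price s."""
    return np.maximum(s - K, 0.0)

# Classical Monte Carlo: sample terminal prices under the risk-neutral
# geometric Brownian motion and average the discounted payoff.
n = 200_000
z = rng.standard_normal(n)
s_T = s0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
price = np.exp(-r * T) * cpwa_payoff(s_T).mean()
```

The statistical error of this baseline decays like 1/√n in the number of samples; the speed-up claimed in the talk for bounded payoffs is measured against this rate.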

PDF Version





Tuesday, TBA November 2024, 13:30–15:00h, Room TBA


Speaker: Prof. Dr. Daniel Kuhn


Title: TBA


Host: TBA


Abstract: TBA

(PDF version will follow here)




Tuesday, 03 December 2024, 13:30–15:00h, Room TBA


Speaker: Dr. Francis Bach


Title: TBA


Host: TBA


Abstract: TBA

(PDF version will follow here)