Welcome to the homepage of the lecture course on singular learning theory held by Daniel Windisch at KU Leuven during the summer term 2026!
Singular learning theory was established by Sumio Watanabe, building on work on singular integrals by mathematicians around V. I. Arnol'd.
It generalizes the asymptotic theory of Bayesian statistics from regular to general statistical models by employing resolution of singularities from analytic and algebraic geometry.
This includes model selection via marginal likelihood approximation for general models, a method of great power and relevance in modern data science and machine learning.
As such, singular learning theory is a theoretical framework that can explain puzzling phenomena in machine learning, such as good generalization behavior or grokking, and it helps to improve the learning process through informed model selection.
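To give a flavour of the central result behind this approach (stated informally here; the precise assumptions and notation are developed in the course, following Watanabe): for n i.i.d. observations, the Bayesian free energy F_n, i.e. the negative log marginal likelihood, of a possibly singular model admits the asymptotic expansion
\[
  F_n \;=\; n S_n \;+\; \lambda \log n \;-\; (m-1)\,\log\log n \;+\; O_p(1) \qquad (n \to \infty),
\]
where S_n is the empirical entropy of the true distribution, \lambda is the real log canonical threshold of the model at the truth, and m is its multiplicity. For a regular model one has \lambda = d/2 and m = 1, with d the number of parameters, so the formula recovers the classical Bayesian information criterion; for singular models it refines it.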
Wednesdays, 10:00 - 11:00 (CET/CEST) at
KU Leuven
Department of Mathematics, Room 02.18
Celestijnenlaan 200B
3001 Leuven, Belgium
If you want to attend online, please write me an email (daniel.windisch.math(at)gmail.com), and I will provide you with a link.
11.02.2026: Motivation and overview
What is singular learning theory good for? Why should I care as a person? Why should I care as an algebraic geometer? What is this course?
18.02.2026: Bayesics
What is a statistical model and how do I pick the best one for my data?
25.02.2026: Some geometric concepts
What is a locally ringed space, a scheme, an analytic space, a coherent sheaf of ideals?
04.03.2026: no course due to absence
11.03.2026: Resolution of singularities
How do I transform my ideal into a monomial ideal?
18.03.2026: Real log canonical thresholds
What is the relevant algebraic information used in statistical learning theory? (A rough sketch of the definition appears after the schedule.)
25.03.2026: The free energy formula I
Is there something from singular learning theory I can actually use in real-life applications?
01.04.2026: The free energy formula II (proof)
08.04.2026: Easter break
15.04.2026: Easter break
22.04.2026: The free energy formula III (proof)
29.04.2026: A singular Bayesian information criterion
Is there something from singular learning theory I can actually really use in real-life applications?
06.05.2026: Towards neural networks
What are neural networks, from a Bayesian point of view? Why do they generalize well? Can you tell me about the dream of inner model selection?
13.05.2026: General theorems on real log canonical thresholds
How can I possibly compute the algebraic invariants I need for singular learning theory?
20.05.2026: Two reduction techniques from analytic to algebraic
If I don't know analytic geometry, what do I do?
27.05.2026: Real log canonical thresholds for special models
Is there actually any chance that I can explicitly construct a resolution of singularities?
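As a rough orientation for the sessions on real log canonical thresholds (again only a sketch; the precise setting and hypotheses are part of the lectures): if K(w) \ge 0 denotes the Kullback-Leibler divergence from the model at parameter w to the true distribution and \varphi(w) the prior density, then the zeta function
\[
  \zeta(z) \;=\; \int_W K(w)^{z}\,\varphi(w)\,dw
\]
is holomorphic on \operatorname{Re}(z) > 0 and extends meromorphically to the complex plane with poles at negative rational numbers. The real log canonical threshold \lambda is the positive number such that -\lambda is the largest pole, and the multiplicity m is the order of that pole; these are exactly the quantities entering the free energy formula above, and they can be computed via a resolution of singularities of K.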
You can download the lecture notes here. The notes will grow as the course advances.
Sumio Watanabe, Algebraic Geometry and Statistical Learning Theory. Cambridge University Press, 2009.
"gray book", the standard reference
Sumio Watanabe, Mathematical Theory of Bayesian Statistics. Chapman & Hall/CRC, 2018.
"green book", supplementary to the grey book
Shaowei Lin, Algebraic Methods for Evaluating Integrals in Bayesian Statistics. PhD thesis, Berkeley, 2011.
well-structured and good to read; written under the supervision of Bernd Sturmfels
Metauni seminar 2022-2023 on singular learning theory (with video recordings)
also has many more useful links
The current singular learning theory research seminar
Very nice motivational overview of singular learning theory, by Jesse Hoogland