Mathematical Foundations of Computation Seminar
Welcome to the webpage of the seminar of the Mathematical Foundations of Computation group. This semester the seminar usually takes place on Thursdays at 13:15. To join the seminar mailing list, please contact me.
11/12/2025, Samson Abramsky
6/11/2025, Anupam Das (School of Computer Science, University of Birmingham): A proof complexity theory of (N)L, via (non-)deterministic branching programs
(Non-)deterministic logspace ((N)L) has a natural (non-uniform) model given by (non-)deterministic branching programs ((N)BPs). These are just the dag-like versions of (non-)deterministic decision trees. In [BDK '20], systems of (N)BPs were given a proof-theoretic presentation, using disjunction to model non-determinism and Tseitin extension to model dagness. The same syntax (without extension) was used by Barrett and Guglielmi to give a deep inference system in the setting of subatomic logic in [BG '22]. In this talk I will survey some of the proof-complexity-theoretic aspects of the induced systems. In particular, I will present recent joint work with Delkos [DD '25] that gives a proof-complexity-theoretic analogue of the famous Immerman-Szelepcsényi theorem that NL is closed under complementation. (A small illustration of the underlying decision-tree model follows the references below.)
[BDK '20] Proof Complexity of Systems of (Non-Deterministic) Decision Trees and Branching Programs. Sam Buss, Anupam Das and Sasha Knop. CSL '20.
[BG '22] A Subatomic Proof System for Decision Trees. Chris Barrett and Alessio Guglielmi. ACM ToCL.
[DD '25] Prover-Adversary Games for Systems over (Non-Deterministic) Branching Programs. Anupam Das and Avgerinos Delkos. In review.
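To make the underlying model concrete, here is a minimal, self-contained Lean 4 sketch of plain (deterministic) decision trees and their evaluation; it is purely illustrative and is not part of the proof systems of [BDK '20]. A branching program arises from such a tree by sharing subtrees (the dag structure), and non-determinism corresponds to disjunctive branching, modelled by disjunction in the systems above.

    -- A decision tree over Boolean variables indexed by natural numbers:
    -- either a leaf answer, or a query on a variable with two branches.
    inductive DecTree where
      | leaf  (answer : Bool)
      | query (var : Nat) (ifFalse ifTrue : DecTree)

    -- Evaluate a decision tree against an assignment of the variables.
    def DecTree.eval (ρ : Nat → Bool) : DecTree → Bool
      | .leaf b      => b
      | .query i f t => if ρ i then t.eval ρ else f.eval ρ

    -- Example: the parity (XOR) of variables 0 and 1.
    def xor01 : DecTree :=
      .query 0 (.query 1 (.leaf false) (.leaf true))
               (.query 1 (.leaf true)  (.leaf false))

    #eval xor01.eval (fun i => i == 0)  -- true: variable 0 is true, variable 1 is false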
30/10/2025, Arnold Beckmann (Department of Computer Science, Swansea University): On the Complexity of Confluence and Church-Rosser Proofs, with an application to Bounded Arithmetic
We investigate confluence and the Church-Rosser property - two well-studied properties of rewriting and the $\lambda$-calculus - from the viewpoint of proof complexity. With respect to confluence, our main contribution is that the size, measured in number of symbols, of the smallest rewrite proof is polynomial in the size of the peak. For the Church-Rosser property we obtain exponential lower bounds on the size of the join in the size of the equality proof. The obtained (worst-case) bounds are precise in both cases. We also study the complexity of proving confluence in the context of the $\lambda$-calculus. Here, we establish an exponential (worst-case) lower bound on the size of the join in the size of the peak. As motivation, we will explain an application of a (yet to be achieved) feasible Church-Rosser property to the separation problem in Bounded Arithmetic. This is joint work with Georg Moser, published in the proceedings of MFCS 2024.
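For orientation, the two properties compared here are the standard ones: confluence asks that every peak ($s \to^* t$ and $s \to^* u$) can be closed by a join ($t \to^* v$ and $u \to^* v$ for some $v$), while the Church-Rosser property asks for such a join whenever $t \leftrightarrow^* u$, i.e. for every equality proof. The bounds above measure the smallest witnessing join (or rewrite proof) against the size of the given peak or equality proof, respectively.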
23/10/2025, Erfan Khaniki (Department of Computer Science, University of Oxford): Not all Kripke models of HA are locally PA
Let K be an arbitrary Kripke model of Heyting Arithmetic (HA). For every node k in K, we can view the classical structure at k, M_k, as a model of some classical theory of arithmetic. Let T be a classical theory in the language of arithmetic. We say that K is locally T if and only if, for every k in K, M_k is a model of T. A basic question originating from Troelstra et al. (Notre Dame Journal of Formal Logic, 1986) asks whether every Kripke model of HA is locally PA (Peano arithmetic). In this talk, we construct a rooted Kripke model of HA whose root is not a model of PA, thus answering the question of Troelstra et al. in the negative.
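In symbols: $K$ is locally $T$ if and only if $M_k \models T$ for every node $k$ of $K$; the construction above yields a rooted Kripke model $K$ of HA whose root $k_0$ has $M_{k_0} \not\models \mathrm{PA}$.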
18/7/2025, Katsuhiko Sano (Faculty of Humanities and Human Sciences, Hokkaido University): Craig Interpolation for Bi-intuitionistic Stable Tense Logic
Bi-intuitionistic stable tense logic (BiSKt), introduced by Stell et al. (2016), provides a logical framework for mathematical morphology on graphs consisting of nodes and edges. In this talk, we establish the Craig interpolation theorem for BiSKt from a proof-theoretic perspective -- that is, in terms of a sequent calculus for BiSKt. Although our sequent calculus is not cut-free, applications of the cut rule can be restricted to analytic ones -- namely, those in which the cut formula is a subformula of the conclusion of the cut rule. To establish the Craig interpolation theorem for this calculus, we use a symmetric interpolation method originally proposed by Mints (2001) for the multi-succedent calculus of first-order intuitionistic logic. This method can be seen as a generalization of Maehara's method. Our proof-theoretic approach also simplifies the method developed by Kowalski and Ono (2017) for proving the Craig interpolation theorem in bi-intuitionistic logic. This is joint work with Hiroakira Ono (JAIST).
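As a reminder, the target property takes the usual form (stated here for the propositional vocabulary; the precise formulation for BiSKt's tense and co-intuitionistic connectives is as in the talk): whenever $A \to B$ is provable, there is an interpolant $I$, containing only propositional variables common to $A$ and $B$, such that $A \to I$ and $I \to B$ are both provable.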
10/6/2025, Ondra Ježil (Department of Algebra of the Faculty of Mathematics and Physics, Charles University): Prime Factorization in Models of PV_1
Under a hypothesis on the hardness of factoring, we show that the bounded arithmetic PV_1, corresponding to polynomial-time reasoning, cannot prove that every number has some prime divisor. The result holds even when we extend PV_1 by the sharply bounded choice scheme.
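Schematically, the sentence in question is $\forall x\,\big(x > 1 \to \exists p \le x\,(\mathrm{Prime}(p) \wedge p \mid x)\big)$; the result says that, assuming factoring is suitably hard, PV_1 (even extended by the sharply bounded choice scheme) does not prove a formalisation of this sentence in its language.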
10/4/2025, Joseph Tooby-Smith (Department of Computer Science, University of Bath): PhysLean: Digitalizing physics into Lean 4
For centuries, physicists have been using pen and paper or typesetting systems like LaTeX to disseminate, record and manipulate information. Proof assistants like Lean give us a completely new way to do all three of these things, with added advantages. Over the last couple of years Lean has been fruitfully used by mathematicians in this regard. In particular, there is a project called Mathlib which aims to be a monolithic library of results from mathematics digitalized into Lean 4. In this talk I introduce PhysLean - which aims to be a monolithic library of results from physics digitalized into Lean 4. I will discuss the challenges involved in this project, and the progress made thus far. I will then delve deeper into a specific example from physics to show what the process of digitalization looks like.
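To give a flavour of what such a digitalization looks like, here is a minimal, self-contained Lean 4 sketch (illustrative only: the names are hypothetical and not actual PhysLean declarations, and a real formalization would work over Mathlib's real numbers rather than the natural numbers used here for simplicity).

    -- Momentum of a point particle; natural numbers are used purely for illustration.
    def momentum (mass velocity : Nat) : Nat :=
      mass * velocity

    -- A machine-checked (if trivial) physical fact: a particle at rest has zero momentum.
    theorem momentum_of_rest (mass : Nat) : momentum mass 0 = 0 :=
      rfl

The point of the exercise is the workflow: state a definition once, then record facts about it as theorems that Lean checks mechanically, so that later developments can build on them.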
28/3/2025, Raheleh Jalali (Department of Computer Science, University of Bath): Skolemization and the logic of quantifier shifts
Skolemization is a method to remove strong quantifiers (i.e., positive occurrences of the universal quantifier and negative occurrences of the existential quantifier) from a first-order formula and replace them with fresh function symbols. It is a well-known fact that Skolemization is sound and complete with respect to classical predicate logic, CQC, but this is not the case for intuitionistic predicate logic, IQC. In this talk, we consider the three formulas known as the "quantifier shifts", which are provable in CQC but not in IQC. One may suspect that the failure of the quantifier shifts is why Skolemization fails in IQC, so it is natural to ask what happens if we add the quantifier shifts to IQC. Does the resulting logic have Skolemization? If not, for which class of formulas does Skolemization hold? These questions motivate the present study, which investigates the logic of quantifier shifts and its Skolemization. This is joint work with Matthias Baaz, Rosalie Iemhoff, and Mariami Gamsakhurdia.
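For readers less familiar with the terminology, a schematic illustration (my own, with the usual proviso that $x$ does not occur free in $A$; the exact shift schemes studied in the talk are as stated there): Skolemizing the strong quantifier in $\exists x\,\forall y\, B(x,y)$ yields $\exists x\, B(x, f(x))$ for a fresh function symbol $f$, and in CQC the original formula is provable if and only if its Skolemized form is. Typical quantifier-shift schemes, provable in CQC but not in IQC, include $\forall x\,(A \vee B(x)) \to A \vee \forall x\, B(x)$, $(A \to \exists x\, B(x)) \to \exists x\,(A \to B(x))$, and $(\forall x\, B(x) \to A) \to \exists x\,(B(x) \to A)$.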
27/2/2025, Pedro Azevedo de Amorim (Department of Computer Science, University of Bath): Denotational Foundations for Expected Cost Analysis
Reasoning about the cost of executing programs is one of the fundamental questions in computer science. In the context of programming with probabilities, however, the notion of cost stops being deterministic, since it depends on the probabilistic samples made throughout the execution of the program. Matters are further complicated by the non-trivial interaction between cost, recursion and evaluation strategy. In this talk, I will go over a novel denotational approach to reasoning about expected cost. Compared to previous approaches, our semantics can naturally express the costful and probabilistic aspects of program execution in a modular way. I will demonstrate the versatility and naturality of this semantics by using it to reason about familiar probabilistic programs and show how the semantics captures cost structures as probabilistic recurrence relations.
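As a toy example of the kind of expected-cost reasoning involved (my own, not taken from the talk): a program that repeatedly flips a fair coin, paying one unit per flip, until it sees heads has expected cost $C$ satisfying the probabilistic recurrence $C = 1 + \tfrac{1}{2}\cdot 0 + \tfrac{1}{2}\cdot C$, i.e. $C = 2$; a compositional semantics of expected cost should assign precisely this value to the program.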