Code of Conduct
We are committed to making the summer school a safe and welcoming space for everyone. Please take a moment to review our Code of Conduct, which outlines our expectations for participant behavior.
Omer Bobrowski (Queen Mary University of London)
Random topology – theory and applications
Lecture #1: Introduction
Lecture #2: Morse Theory in Stochastic Topology
Lecture #3: The Thermodynamic Limit
Lecture #4: Universality in TDA
Anna Gusakova (University of Münster)
From random polytopes to random tessellations – an overview
Anthea Monod (Imperial College, London)
Persistent Homology in Statistics, Machine Learning, and AI
Raphaël Lachièze-Rey (INRIA Paris)
Limit theorems for topological and geometric functionals
There will be a poster session on the first afternoon. This session will give the participants the opportunity to tell their peers about their own research topics. It will serve both as an opportunity for scientific exchange and as a way to initiate networking among the participants. Knowing each other's research interests from day one will greatly facilitate interactions throughout the entire week.
Vincent Grande
Point-Level Topological Representation Learning on Point Clouds
Topological Data Analysis (TDA) allows us to extract powerful topological and higher-order information on the global shape of a data set or point cloud. Tools like Persistent Homology or the Euler Transform give a single complex description of the global structure of the point cloud. However, common machine learning applications like classification require point-level information and features to be available. In this paper, we bridge this gap and propose a novel method to extract node-level topological features from complex point clouds using discrete variants of concepts from algebraic topology and differential geometry. We verify the effectiveness of these topological point features (TOPF) on both synthetic and real-world data and study their robustness under noise and heterogeneous sampling.
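As a toy illustration of the gap the poster addresses, the following Python sketch (assuming the gudhi library; a naive heuristic of our own, not the authors' TOPF method) computes a Rips persistence diagram and then credits each point appearing in a feature's birth simplex with that feature's persistence, yielding crude point-level scores:

    import numpy as np
    import gudhi

    rng = np.random.default_rng(0)
    # Noisy circle: a point cloud with one prominent 1-dimensional hole.
    angles = rng.uniform(0, 2 * np.pi, 100)
    points = np.c_[np.cos(angles), np.sin(angles)] + 0.05 * rng.normal(size=(100, 2))

    st = gudhi.RipsComplex(points=points, max_edge_length=2.0).create_simplex_tree(max_dimension=2)
    st.compute_persistence()

    # Naive point-level scores: each vertex of a birth simplex inherits the
    # persistence (death minus birth) of the corresponding feature.
    scores = np.zeros(len(points))
    for birth_simplex, death_simplex in st.persistence_pairs():
        if not death_simplex:  # essential feature with infinite persistence, skip
            continue
        pers = st.filtration(death_simplex) - st.filtration(birth_simplex)
        for v in birth_simplex:
            scores[v] += pers
    print(scores.round(2))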
Björn Wehlin
Large-Scale Topological Data Analysis
Considerable effort has been spent on efficient computation of persistent homology for large point clouds. However, subsampling procedures and similar techniques instead lead to many small point clouds, and combining these with optimization schemes for finding hyperparameters multiplies the effort. Our aim is to leverage modern heterogeneous compute environments (CPU + GPU) to accelerate TDA computations for massive datasets and experiments. In addition to providing the TDA community with new computational tools, we hope to gain insight for developing new mathematics.
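The workload described above is embarrassingly parallel across subsamples. A minimal CPU-only sketch (assuming the ripser Python package; a GPU back end could replace the per-cloud call) distributes many small persistence computations over worker processes:

    import numpy as np
    from concurrent.futures import ProcessPoolExecutor
    from ripser import ripser

    def diagrams(subsample):
        # Persistent homology (dimensions 0 and 1) of one small point cloud.
        return ripser(subsample, maxdim=1)['dgms']

    if __name__ == '__main__':
        rng = np.random.default_rng(1)
        cloud = rng.normal(size=(100_000, 3))
        # Subsampling turns one huge computation into many small independent ones.
        subsamples = [cloud[rng.choice(len(cloud), size=500, replace=False)]
                      for _ in range(64)]
        with ProcessPoolExecutor() as pool:
            results = list(pool.map(diagrams, subsamples))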
Martina Petráková
Large-Deviation Analysis for Canonical Gibbs Measures
In this poster, we present a large deviation theory developed for functionals of binomial Gibbs processes with fixed intensity in increasing windows. Our method relies on the classical large deviation result for the Poisson point process, noting that the binomial point process is obtained from the Poisson point process by conditioning on the number of points. Our main methodological contribution is the development of coupling constructions that allow us to handle delicate and unlikely pathological events. The presented results cover a broad class of interaction functions (possibly unbounded) and of functionals (given as sums of possibly unbounded local score functions). The poster is based on joint work with Christian Hirsch.
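Schematically, in notation of our own (an illustration, not the poster's precise statement): if η is a Poisson point process on the window W_n and β_n a binomial process with exactly n points there, the conditioning relation underlying the argument is

    \mathbb{P}(\beta_n \in A) \;=\; \mathbb{P}\bigl(\eta \in A \,\big|\, \eta(W_n) = n\bigr),

so that large deviation estimates for the Poisson process can be transferred to the binomial process once the conditioning event is controlled.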
Péter Juhász
Functional Stable Limit in Random Connection Hypergraphs
We investigate a dynamic random connection hypergraph model based on a bipartite connection structure, in which nodes and hyperedges are modeled by two independent marked Poisson point processes. Nodes are equipped with birth-death dynamics, while hyperedges are temporally localized; edges are then formed under spatial and temporal constraints influenced by the vertex marks. In this system, we focus on the edge count process as a function of time within a growing spatial observation window. Under suitable assumptions, we prove a functional stable limit theorem: the properly rescaled and centered edge count process converges in distribution to a non-Gaussian, heavy-tailed limit in the Skorokhod space.
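In schematic notation of our own (the precise scaling is part of the poster's result), the functional stable limit theorem takes the form

    \frac{E_n(t) - b_n(t)}{a_n} \;\xrightarrow{d}\; S(t) \quad \text{in the Skorokhod space } D[0,T],

where E_n(t) denotes the edge count at time t in the n-th window, a_n and b_n(t) are scaling and centering sequences, and S is a non-Gaussian, heavy-tailed stable process.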
Karambir Das
Functional CLTs for the Degree Distribution in Random Geometric Graphs in the Sub-Connective Regime
Consider a sequence of random geometric graphs G(P(n), r(n,t)), with vertex set P(n) being a Poisson point process of intensity n in the unit cube. There is an edge between two vertices if they are within distance r(n,t). For a suitable choice of r(n,t) we derive conditions under which functional central limit theorems hold for the number of vertices of degree k in the sub-connective regime.
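A small simulation of the object under study (a sketch of our own; the radius below is one illustrative choice, not the poster's r(n,t)) samples a Poisson number of points in the unit square, builds the geometric graph, and counts vertices of degree k:

    import numpy as np
    from scipy.spatial import cKDTree

    def degree_k_count(n, r, k, rng):
        # Poisson(n) points in the unit square; edges between pairs within distance r.
        pts = rng.uniform(size=(rng.poisson(n), 2))
        degrees = np.zeros(len(pts), dtype=int)
        for i, j in cKDTree(pts).query_pairs(r):
            degrees[i] += 1
            degrees[j] += 1
        return int(np.sum(degrees == k))

    rng = np.random.default_rng(2)
    n = 10_000
    r = 0.3 / np.sqrt(n)  # well below the connectivity threshold ~ sqrt(log n / n)
    print([degree_k_count(n, r, k=1, rng=rng) for _ in range(5)])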
Thomas van der Jagt
Nonparametric Inference for Poisson-Laguerre Tessellations
We consider statistical inference for Poisson-Laguerre tessellations in R^d. The object of interest is a distribution function F which describes the distribution of the arrival times of the generator points. The function F uniquely determines the intensity measure of the underlying Poisson process. Two nonparametric estimators for F are introduced which depend only on the points of the Poisson process which generate non-empty cells and the actual cells corresponding to these points. The proposed estimators are proven to be strongly consistent, as the observation window expands unboundedly to the whole space. We also consider a stereological setting, where one is interested in estimating the distribution function associated with the Poisson process of a higher dimensional Poisson-Laguerre tessellation, given that a corresponding sectional Poisson-Laguerre tessellation is observed.
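For background, under one common convention (a standard definition, stated here in our own notation): the Laguerre cell of a generator x_i with weight r_i is

    C(x_i) \;=\; \bigl\{\, z \in \mathbb{R}^d : \|z - x_i\|^2 - r_i \le \|z - x_j\|^2 - r_j \ \text{for all } j \,\bigr\}.

Cells defined this way may be empty, which is why the estimators above are built only from the generators with non-empty cells.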
Thomas Burnett
Introducing the Forgetful Rips Filtration for Directed Data
This poster introduces some concepts from an upcoming paper, joint with Kate Turner and Vin de Silva, in which we introduce a new filtration of simplicial complexes for applying persistent homology to directed data such as directed graphs and networks. The new filtration is formed by 'forgetting' the ordering of the vertices up to the sign of the permutation. By relating it to a corresponding Ordered-Tuple Complex, we are able to prove stability with respect to the correspondence distortion distance while retaining tractable computation times. The filtration is also isomorphic to the Rips filtration in the case of symmetric functions, making it a natural generalisation of Rips filtrations to the domain of directed data. Given TDA's unique contribution to data science through quantitative methods for describing local and global structure, successfully integrating directed properties is critical for its applicability to contexts with hierarchical structure and asymmetric relationships.
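For context, one standard convention from the directed-TDA literature (an assumption on our part, not the poster's new construction): in an Ordered-Tuple Complex built from a possibly asymmetric function d, an ordered tuple enters the filtration at

    f(v_0, \ldots, v_k) \;=\; \max_{0 \le i \le j \le k} d(v_i, v_j),

which reduces to the usual Vietoris–Rips value \max_{i \ne j} d(v_i, v_j) when d is symmetric.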
Kang-Ju Lee
Simplicial Kirchhoff Index of Random Complexes
The Kirchhoff index is an invariant from electrical network theory, defined as the sum of effective resistances between all pairs of vertices. As a robustness measure for simplicial networks, a simplicial analogue of the Kirchhoff index is defined as the sum of simplicial effective resistances over all vertex subsets of size equal to the dimension plus one. In this work, we investigate the Kirchhoff index of random simplicial complexes, which generalize random graphs. We present a formula for the expectation of this random variable and show how it concentrates around its expectation. We also perform numerical experiments indicating that the expectation and fluctuation results remain valid for realizations of the random simplicial Kirchhoff index. This is joint work with Woong Kook.
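For reference, in the classical graph case (standard facts, not results of the poster): for a connected graph G on n vertices with Laplacian eigenvalues 0 = \lambda_1 < \lambda_2 \le \cdots \le \lambda_n,

    Kf(G) \;=\; \sum_{i < j} R_{ij} \;=\; n \sum_{k=2}^{n} \frac{1}{\lambda_k}.

The simplicial index described above replaces pairs of vertices by vertex subsets of size equal to the dimension plus one.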
Zahra Tabatabaei
When Topology Meets Biology: Topological Insights into Sheep Milk Protein
Protein aggregation is a critical phenomenon in milk processing that directly influences product texture, stability, and nutritional quality. From cubical complexes built on such images, we calculate Betti₁ numbers across multiple thresholds. These features are robust to noise and equivariant under monotonic intensity transformations and spatial scaling, making them ideal for characterizing images of complex protein networks.
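A minimal sketch of such a pipeline (assuming the gudhi library and a sublevel-set filtration on pixel intensities; the random image below is a placeholder for a real microscopy image) computes β₁ across thresholds:

    import numpy as np
    import gudhi

    # Placeholder for a grayscale image of a protein network.
    img = np.random.default_rng(3).random((64, 64))

    # Sublevel-set filtration of the cubical complex built from pixel intensities.
    cc = gudhi.CubicalComplex(top_dimensional_cells=img)
    diagram = cc.persistence()

    def betti1_at(t):
        # beta_1 at threshold t: 1-dimensional features born by t and still alive at t.
        return sum(1 for dim, (birth, death) in diagram
                   if dim == 1 and birth <= t < death)

    print([betti1_at(t) for t in np.linspace(0.0, 1.0, 11)])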
Johanna Marie Gegenfurtner and Albert Kjøller Jacobsen
Learning with Geometry-Aware Input Noise
It has been shown that input noise implicitly regularises the gradient of the model function, leading to smoother functions with improved generalisation. However, previous research only considered ambient noise in the input space, without taking the topological features of the data into account. In our work, we present two different methods of adding informed input noise that account for the lower-dimensional submanifold on which our input data lie. First, we suggest projecting Gaussian noise onto the tangent space of the manifold at each point. Second, we introduce Brownian motion noise, which moves in random directions along the manifold. These methods lead to better results in several examples in which the domain of the model is known. As a next step, we will extend the methods to scenarios in which the domain is unknown.
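A minimal sketch of the first method for a manifold whose tangent spaces are known in closed form (the unit sphere; a toy example of our own, not the authors' code):

    import numpy as np

    def tangent_noise(x, sigma, rng):
        # Gaussian noise projected onto the tangent space of the unit sphere at x;
        # for the sphere, the tangent projection at x (with |x| = 1) is I - x x^T.
        g = sigma * rng.normal(size=x.shape)
        return g - np.dot(g, x) * x

    rng = np.random.default_rng(4)
    x = np.array([0.0, 0.0, 1.0])       # a point on the unit sphere S^2
    x_noisy = x + tangent_noise(x, sigma=0.1, rng=rng)
    x_noisy /= np.linalg.norm(x_noisy)  # retract back onto the sphere

The Brownian-motion variant would iterate many such small tangent steps, retracting onto the manifold after each, to approximate a random walk along the manifold.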