DANGER 4

Talks

More titles and abstracts will appear shortly.

Speaker: Laure Daviaud
Title: Modelling machine learning systems

Abstract: How can we make sure that a machine learning system gives the results it should, is not biased, and is safe to use - as we would like for self-driving cars, for example? We cannot. Not fully. But we can try! This talk will be an incursion into theoretical computer science and formal verification. We will explain ways one can model machine learning systems in order to obtain partial guarantees about them, and how to enforce certain safety properties.
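
One standard example of such a safety property (given here for illustration; it is not necessarily the one treated in the talk) is local robustness of a trained classifier f around an input x:

\[ \forall x' \colon \; \| x - x' \| \le \varepsilon \;\Longrightarrow\; f(x') = f(x), \]

i.e. no perturbation of x within a chosen norm and radius ε may change the predicted label.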

Speaker: Maximilian Doré 
Title: Formalising Topological Data Analysis in Cubical Agda
Abstract: Topological data analysis (TDA) offers a plethora of methods and tools to study the "shape of data". This talk is about work towards implementing a TDA tool inside a theorem prover, leading to software that is formally proved to implement the intricate mathematical theory underlying TDA. The theorem prover of choice for this project is Cubical Agda, which implements ideas from Homotopy Type Theory and offers a logic for reasoning directly about (homotopy types of) topological spaces. We give a taste of what it is like to reason in this theorem prover, presenting a formalisation of discrete Morse theory for graphs, and discuss the challenges of turning this formalisation into a fully fledged tool for TDA.
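
For readers unfamiliar with the cubical setting, the central idea can be sketched as follows (this is background, not the talk's formalisation itself): in cubical type theory a proof that x equals y in a type A is literally a map out of an interval,

\[ p \colon \mathbb{I} \to A, \qquad p(0) = x, \quad p(1) = y, \]

so equalities behave like topological paths and homotopy-theoretic arguments can be carried out directly inside the logic.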

Speaker: Sergei Gukov
Title: Math + AI = AGI
Abstract: In this talk, we explore the transformative potential of off-the-shelf reinforcement learning (RL) algorithms in accelerating solutions to complex, research-level mathematical challenges. We begin by illustrating how these algorithms have achieved a 10X improvement in areas where previous advances of the same magnitude required many decades. A comparative analysis of different network architectures is presented to highlight their performance in this context. We then delve into the application of RL algorithms to exceptionally demanding tasks, such as those posed by the Millennium Prize problems and the smooth Poincaré conjecture in four dimensions. Drawing on our experiences, we discuss the prerequisites for developing new RL algorithms and architectures that are tailored to these high-level challenges.

Speaker: Yuka Hashimoto
Title: Reproducing kernel Hilbert C*-module for data analysis
Abstract: A reproducing kernel Hilbert C*-module (RKHM) is a generalization of a reproducing kernel Hilbert space (RKHS), characterized by a C*-algebra-valued positive definite kernel and the inner product induced by this kernel. The advantages of applying RKHMs to data analysis instead of RKHSs are that we can enlarge representation spaces, construct positive definite kernels using the product structure of the C*-algebra, and use the operator norm for theoretical analyses. We present fundamental properties of RKHMs for data analysis, such as a minimization property of the orthogonal projection and representer theorems. Then, we propose a deep RKHM, constructed as the composition of multiple RKHMs. This framework is well suited, for example, to analyzing image data.
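
To fix notation (a standard definition, sketched here only for context): given a C*-algebra 𝒜, an 𝒜-valued positive definite kernel on a set X is a map k : X × X → 𝒜 satisfying

\[ \sum_{i,j=1}^{n} a_i^{*}\, k(x_i, x_j)\, a_j \;\ge\; 0 \quad \text{in } \mathcal{A} \]

for all finite choices of x_1, …, x_n ∈ X and a_1, …, a_n ∈ 𝒜. The associated RKHM is the completion of the span of the feature vectors φ(x) = k(·, x) under the 𝒜-valued inner product ⟨φ(x), φ(y)⟩ = k(x, y), which replaces the scalar-valued inner product of an RKHS.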

Speaker: Elira Shaska
Title: Machine learning in the moduli space of genus two curves 

Abstract: We use machine learning to study the moduli space of genus two curves. During the talk we will focus on the locus 𝓛ₙ of genus two curves with (n,n)-split Jacobian. More precisely, we design a transformer model which, given a moduli point (i.e. values of the Igusa invariants), determines whether the corresponding genus two curve lies in the locus 𝓛ₙ, for n = 2, 3, 5, 7. During this study we discover some interesting arithmetic properties that seem difficult to guess otherwise. For example, we show that there are no rational points p ∈ 𝓛ₙ with weighted moduli height ≤ 2 in any of 𝓛₂, 𝓛₃, and 𝓛₅.
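
For orientation (standard background, not a result of the talk): the moduli point of a genus two curve can be recorded by its Igusa invariants, viewed as a point

\[ \mathfrak{p} = [J_2 : J_4 : J_6 : J_{10}] \in \mathbb{WP}^{3}_{(2,4,6,10)}, \]

so the transformer receives such a weighted projective point as input and predicts membership in 𝓛ₙ; the weighted moduli height mentioned above is the height of this point computed with respect to the weights (2, 4, 6, 10).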

Speaker: Tony Shaska
Title:  Machine models for weighted spaces
Abstract: We explore the idea of using machine learning to determine rational points on weighted projective varieties. Our work leads to many interesting questions about the geometry of weighted projective spaces and about how machine learning models can be used to study weighted spaces. While there are mathematical reasons to expect such models to be superior to more traditional ones, providing computational evidence to support this claim remains an open question.
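
Recall the standard definition (included only for context): for positive integer weights q_0, …, q_n, the weighted projective space is the quotient

\[ \mathbb{WP}^{n}_{(q_0,\dots,q_n)} \;=\; \bigl(\mathbb{A}^{n+1} \setminus \{0\}\bigr) / \sim, \qquad (x_0,\dots,x_n) \sim (\lambda^{q_0} x_0,\dots,\lambda^{q_n} x_n) \ \text{for } \lambda \ne 0, \]

and a rational point is one admitting representative coordinates in ℚ; these are the points that the proposed machine learning models aim to detect.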

Speaker: Sara Veneziale
Title: Asymptotic formulae for the regularized quantum period
Abstract: In this talk, we explore the asymptotic behaviour of a certain sequence which contains the coefficients of a series called the regularized quantum period. The regularized quantum period is conjectured to be a complete invariant of Fano varieties, and it can be thought of as a numerical fingerprint of each Fano variety. We discuss how these results originate from the study of large datasets using data analysis and machine learning techniques, and show how they can be used to visualise the landscape of these objects. This is joint work with Tom Coates and Alexander Kasprzyk.
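
For context, a sketch of the definition (conventions vary slightly in the literature): the quantum period of a Fano variety X is a power series

\[ G_X(t) = 1 + \sum_{d \ge 2} c_d\, t^d, \]

whose coefficients c_d are certain degree-d genus-zero Gromov-Witten invariants of X, and the regularized quantum period is obtained by rescaling the coefficients,

\[ \widehat{G}_X(t) = 1 + \sum_{d \ge 2} d!\, c_d\, t^d. \]

The sequence whose asymptotics are studied in the talk is the coefficient sequence (d! c_d) of this regularized series.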