I am a Reader (Associate Professor) in Mathematics and Machine Learning in the Department of Mathematics at Imperial College London.
I am a Co-Director of the Erlangen AI Hub for Mathematical and Computational Foundations of Intelligence.
I am a Member of the European Laboratory for Learning and Intelligent Systems – ELLIS Society.
My research uses theory from pure mathematics (algebraic geometry and algebraic topology) to develop computational and statistical methods for data that have complex structures, or that live in complex spaces.
Research Interests:
Applied and Computational Topology including Topological Data Analysis
Applied Algebraic Geometry including Algebraic Statistics
Geometric Statistics and Geometric Deep Learning
Theoretical Machine Learning
Mathematical Biology
Contact: a [dot] monod [at] imperial [dot] ac [dot] uk
Office: Roderic Hill 318
£10 million to study the mathematics of AI:
I use pure mathematics (algebraic topology and algebraic geometry) to advance understanding of deep learning and modern AI technology. My research in this direction is supported by the £10M EPSRC Mathematical and Computational Foundations of Artificial Intelligence Hub [EP/Y028872/1], which I co-direct. You can read the press release about it here.
I am recruiting a UK/home PhD student for October 2026! Raphaël Lachièze-Rey and I have been awarded a joint Imperial–CNRS PhD studentship to work on the mathematical foundations of efficient and sustainable AI. If you are a home student with a strong background in probability, graph theory, and machine learning (theory and practice), please get in touch via email with your CV and a summary of your background and research interests!
Riemannian Neural Optimal Transport has been accepted to ICML 2026! Joint work with Alessandro Micheli, Yueqi Cao, and Samir Bhatt.
New paper on the arXiv: Entropic Riemannian Neural Optimal Transport, where we develop a unified framework that achieves amortized out-of-sample evaluation with intrinsic entropic OT on Riemannian manifolds. Joint work with Alessandro Micheli, Silvia Sapora, and Samir Bhatt.
New paper on the arXiv: Trajectory-Restricted Optimization Conditions and Geometry-Aware Linear Convergence, where we study and explain why first-order methods actually converge faster in practice than linear convergence theory predicts. Joint work with Faris Chaudhry and Keisuke Yano.
Our paper that proposes a Gaussian mixture model for bone marrow morphology based on signed distance persistent homology has been accepted to the Annals of Applied Statistics! Joint work with Qiquan Wang, Anna Song, Antoniana Batsivari, and Dominique Bonnet.
My PhD student Daniele Tramontano defended his PhD thesis, "Causal Inference in Non-Gaussian Models", on 19 March 2026! Huge congratulations, Daniele!
New preprint on the uniqueness of polytope Fréchet means! Joint work with Roan Talbut and Andrew McCormack.
Our paper that studies Fréchet means in tropical geometry and proposes a fully symbolic approach to their exact computation has been accepted to the Journal of Symbolic Computation! Joint work with Kamillo Ferry, Bo Lin, Carlos Améndola, and Rudy Yoshida.
New preprint where we use representation theory to identify, and then resolve, a new bottleneck in GNNs' ability to distinguish non-isomorphic graphs that arises at the readout stage! Joint work with Mouad Talhi and Arne Wolf.
I was elected as a Member of the ELLIS Society in recognition of my contributions to the mathematical foundations of AI!
I am an editor for the newly launched Association for Mathematical Research (AMR) journal Applied & Computational Topology & Geometry (ACTG)!