News
2024
[10 Jul 24] New preprint out on Convergence rates for Poisson learning to a Poisson equation with measure data. In this paper (which consists of three essentially independent parts) we prove convergence rates for solutions of a Poisson equation on a random geometric graph to the solution of a Poisson equation with measure data in the continuum. The challenging aspects are the singularities of the solutions and the lack of a variational interpretation.
[31 May 24] Our paper Polarized consensus-based dynamics for optimization and sampling with Tim Roith and Philipp Wacker was just published in Mathematical Programming.
[26 Apr 24] Our paper Gamma-convergence of a nonlocal perimeter arising in adversarial machine learning with Kerrek Stinson was just published in Calculus of Variations and Partial Differential Equations.
[23 Apr 24] New preprint out on adversarial machine learning: A mean curvature flow arising in adversarial training. Together with Tim Laux and Kerrek Stinson we propose an adversarial training method that discretizes a mean curvature flow of the decision boundary.
[01 Feb 24] I'm happy to welcome Eloi Martinet as new postdoc in my group in Würzburg.
[16 Jan 24] Today I gave my inaugural lecture at the University of Würzburg, where I spoke about The Mathematics of Adversarial Machine Learning.
[08 Jan 24] Happy that our paper Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian was accepted for publication in the Annals of Applied Probability.
2023
[30 Nov 23] Two papers accepted! Both deal with the infinity Laplacian. In this paper, which was accepted at the Journal of Scientific Computing, we construct a convergent numerical method for the infinity eigenvalue problem. In this one, accepted and published online at Communications in PDEs, I derived convergence rates of p-Laplacian approximations of the infinity Laplace equation in terms of p.
[01 Nov 23] Together with Des Higham and Laura Thesing we are organizing a special issue on "Adversarial Robustness of Artificial Intelligence" at EJAM. Click here for the call for papers.
[01 Oct 23] I am excited to have started my new position as professor for the mathematics of machine learning at the University of Würzburg.
[31 May 23] A new preprint is available on arxiv: It begins with a boundary: A geometric view on probabilistically robust learning
[02 May 23] Excited and honored to be a new associate editor for Advances in Continuous and Discrete Models: Theory and Applications in the core area of data science.
[15 Mar 23] Our paper "Complete Deterministic Dynamics and Spectral Decomposition of the Linear Ensemble Kalman Inversion" has appeared in the SIAM/ASA Journal on Uncertainty Quantification. This is joint work with Philipp Wacker.
[12 Mar 23] Excited about this week's workshop on Purpose-driven particle models in Leiden/NL.
[01 Mar 23] I started a new position as junior research group leader at TU Berlin!
[17 Feb 23] I have released a new preprint where I prove a curious result: convergence rates for solutions of the p-Laplace equation to the solution of the infinity-Laplace equation. The proof only uses comparison principles and (SPOILER!) the rate scales like (1/p)^(1/4).
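In symbols, the headline result reads schematically as follows (the uniform norm and the constant C are my shorthand; see the preprint for the precise assumptions):

```latex
% Schematic statement: u_p solves the p-Laplace equation, u_infty the
% limiting infinity-Laplace equation; C does not depend on p.
\[
  \| u_p - u_\infty \|_{L^\infty} \leq C \left( \tfrac{1}{p} \right)^{1/4}.
\]
```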
[27 Jan 23] My article "The inhomogeneous p-Laplacian equation with Neumann boundary conditions in the limit p→∞" has appeared open access at Advances in Continuous and Discrete Models.
[13 Jan 23] Our paper "The Geometry of Adversarial Training in Binary Classification" has appeared at Information and Inference: A Journal of the IMA.
[10 Jan 23] Great to be back at the Simons Institute in Berkeley for the GMOS Reunion Workshop.
2022
[29 Nov 22] Together with Kerrek Stinson we wrote a preprint where we prove Gamma-convergence of the nonlocal perimeter functionals that describe the regularizing effect in adversarial training.
[11 Nov 22] A new preprint is out on arxiv. Together with Philipp Wacker and Tim Roith we developed polarized particle dynamics to improve over consensus-based optimization and sampling methods for objective functions with multiple global minima or modes. Code is available on GitHub.
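For the curious, here is a minimal NumPy sketch of polarized CBO-type dynamics. It is not the reference implementation (see the GitHub link for that), and the Gaussian localization kernel as well as the parameters beta, kappa, dt, and sigma are illustrative choices.

```python
# Toy sketch of polarized consensus-based optimization (not the
# authors' reference code). Each particle gets its own consensus
# point, localized by a Gaussian kernel of width kappa.
import numpy as np

def polarized_cbo_step(X, f, beta=30.0, kappa=0.5, dt=0.01, sigma=0.5):
    """One Euler-Maruyama step; X has shape (n_particles, dim)."""
    fx = f(X)                                            # objective values, shape (n,)
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)  # pairwise squared distances
    # Polarization: weight the other particles by a localization kernel
    # times the usual Gibbs weight exp(-beta * f).
    w = np.exp(-d2 / (2 * kappa**2)) * np.exp(-beta * (fx - fx.min()))[None, :]
    m = (w @ X) / w.sum(axis=1, keepdims=True)           # particle-wise consensus points
    drift = -(X - m)
    noise = sigma * np.linalg.norm(X - m, axis=1, keepdims=True) \
            * np.random.randn(*X.shape)
    return X + dt * drift + np.sqrt(dt) * noise

# Toy usage: two wells, the ensemble should split between both minima.
f = lambda X: np.minimum(((X - 1) ** 2).sum(-1), ((X + 1) ** 2).sum(-1))
X = np.random.randn(100, 2)
for _ in range(1000):
    X = polarized_cbo_step(X, f)
```

The point of the polarization is that each particle computes its own kernel-localized consensus point, which lets the ensemble split between several minima or modes instead of collapsing to a single one.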
[26 Oct 22] Excited that I will speak about recent work on convergence rates for infinity Laplacian equations on sparse graphs in the One World Seminar Series on the Mathematics of Machine Learning.
[17 Oct 22] Our preprint "Ratio convergence rates for Euclidean first-passage percolation: Applications to the graph infinity Laplacian" is available on arxiv. In this joint work with Jeff Calder and Tim Roith we prove convergence rates of the graph distance function and solutions of the graph infinity Laplacian equation for percolation length scales.
[15 Oct 22] Our article "Eigenvalue Problems in L∞: Optimality Conditions, Duality, and Relations with Optimal Transport" has appeared at Communications of the American Mathematical Society.
[22 Sep 22] Our article "Uniform convergence rates for Lipschitz learning on graphs" has appeared in the IMA Journal of Numerical Analysis. With Jeff Calder and Tim Roith we prove convergence rates for the infinity Laplacian equation on weighted graphs for length scales down to the connectivity threshold.
[29 Aug 22] Looking forward to the program on Geometric Aspects of Nonlinear Partial Differential Equations at the beautiful Institut Mittag-Leffler in Stockholm.
[08 Aug 22] Visiting Guy Gilboa at the Technion in Haifa.
[04 Aug 22] Our paper "A Bregman Learning Framework for Sparse Neural Networks" is now published at JMLR.
[18 Jul 22] Visiting Jeff Calder at the University of Minnesota, Twin Cities.
[20 Jun 22] Super happy about the Argelander Starter-Kit Grant which I received from the University of Bonn to work on the project "Adversarial Robustness in Semi-Supervised Learning".
[17 Jun 22] Happy about a wonderful workshop on "Synergies between Data Science and PDE Analysis" which I organized together with Franca Hoffmann. It was great to see so many friends and colleagues in person.
[15 May 22] Our paper "Improving Robustness against Real-World and Worst-Case Distribution Shifts through Decision Region Quantification" was accepted at ICML 2022 and is available on arxiv. We propose and analyze the DRQ algorithm, which works at test time and improves the robustness of any pretrained model.
[08 May 22] Visiting José Mazón for a one-week research stay in Valencia.
[22 Mar 22] Happy about a successful minisymposium on "Recent Advances on Stable Neural Networks" at SIAM Imaging Science 2022, which I organized together with Tim Roith and Daniel Tenbrinck. My talk can be found here.
[11 Mar 22] Had a great experience teaching a minicourse together with Bubacarr Bah at the "Foundational Methods in Data Science" Training School at AIMS Rwanda.
[21 Jan 22] Tim Roith's and my paper "Continuum Limit of Lipschitz Learning on Graphs" has appeared open access in Foundations of Computational Mathematics.
[20 Jan 22] Our chapter on "Gradient flows and nonlinear power methods for the computation of nonlinear eigenfunctions" has appeared in the Handbook of Numerical Analysis.
2021
[15 Dec 21] My new preprint "The inhomogeneous p-Laplacian equation with Neumann boundary conditions in the limit p→∞" is on arxiv. I prove that solutions to the p-Poisson equation converge to an optimal transport potential as p→∞ and I characterize the limit as solution to an infinity Laplacian equation.
[29 Nov 21] Our new paper "The Geometry of Adversarial Training in Binary Classification" is on arxiv. Together with Nicolás García Trillos and Ryan Murray we show that adversarial training is nothing but total variation regularized risk minimization.
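Schematically, writing R_ε for the adversarial risk at budget ε and Per_ε for the nonlocal perimeter from the paper, the equivalence reads as follows (my paraphrase, modulo technical details such as measurable selections):

```latex
% Schematic form of the equivalence; the precise definition of the
% nonlocal perimeter Per_eps is in the paper.
\[
  \min_{A}\, R_\varepsilon(A)
  \;=\;
  \min_{A}\, \big( R(A) + \varepsilon \, \mathrm{Per}_\varepsilon(A) \big).
\]
```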
[25 Nov 21] Glad to announce that our preprint "Uniform Convergence Rates for Lipschitz Learning on Graphs" is available on arxiv. In this joint work with Jeff Calder and Tim Roith we prove convergence rates of solutions to the graph infinity Laplacian equation as the graph approximates a continuum domain.
[30 Aug 21] Today I have started a two-month visiting postdoc position at UC Berkeley, participating in the thematic program "Geometric Methods in Optimization and Sampling".
[4 Aug 21] Our paper "Nonlinear Power Method for Computing Eigenvectors of Proximal Operators and Neural Networks" has appeared in the SIAM Journal on Imaging Sciences.
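The iteration itself is easy to state: apply the operator and renormalize. Below is a minimal sketch, using the prox of the l1 norm (soft thresholding) as an illustrative stand-in for the proximal operators and networks treated in the paper; all parameters are toy choices.

```python
# Nonlinear power method u <- T(u)/||T(u)||, sketched with soft
# thresholding (the prox of the one-homogeneous l1 norm) as T.
import numpy as np

def soft_threshold(u, lam):
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def nonlinear_power_method(u0, T, n_iter=200):
    u = u0 / np.linalg.norm(u0)
    for _ in range(n_iter):
        Tu = T(u)
        u = Tu / np.linalg.norm(Tu)   # renormalize after each application
    return u

# Fixed points satisfy T(u) = mu * u, i.e. u is a nonlinear eigenvector.
u = nonlinear_power_method(np.random.randn(50),
                           lambda u: soft_threshold(u, 0.05))
```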
[27 Jul 21] Our preprint "Eigenvalue Problems in L∞: Optimality Conditions, Duality, and Relations with Optimal Transport" is online. In this joint work with Yury Korolev we develop a novel subdifferential calculus to tackle eigenvalue and other variational problems in L∞.
[4 Jun 21] Our new preprint "Neural Architecture Search via Bregman Iterations" is available on arxiv. We use an inverse scale space training scheme to unveil neural network architectures like residual autoencoders.
[1 Jun 21] Excited that today I have started a postdoc position at the Hausdorff Center for Mathematics in the group of Franca Hoffmann.
[19 May 21] Together with Martin Burger, I have written a chapter on "Gradient Flows, Nonlinear Power Methods, and Computation of Nonlinear Eigenfunctions" which will appear in the Handbook of Numerical Analysis.
[18 May 21] Our paper "Nonlinear Spectral Decompositions by gradient flows of one-homogeneous functionals" has appeared in Analysis & PDE. This is joint work with Martin Burger, Antonin Chambolle, and Matteo Novaga.
[11 May 21] Our preprint "A Bregman Learning Framework for Sparse Neural Networks" has appeared on arxiv. This is joint work with Tim Roith, Daniel Tenbrinck, and Martin Burger. We train sparse neural networks in an inverse scale space manner utilizing linearized Bregman iterations.
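As a rough illustration of the update rule only (the paper targets neural networks and its code is in PyTorch; this hypothetical pure-NumPy toy runs the same iteration on a sparse least-squares problem):

```python
# Linearized Bregman iteration: gradient steps on a dual variable v,
# soft thresholding to obtain the sparse primal variable theta.
import numpy as np

def shrink(v, lam):
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def linearized_bregman(grad, dim, tau, lam=0.1, n_iter=500):
    v = np.zeros(dim)              # dual (subgradient) variable
    theta = shrink(v, lam)         # parameters start fully sparse
    for _ in range(n_iter):
        v = v - tau * grad(theta)  # gradient step on the dual variable
        theta = shrink(v, lam)     # primal update: soft thresholding
    return theta

# Toy usage: sparse least squares with gradient of 0.5*||A theta - b||^2.
rng = np.random.default_rng(0)
A, x_true = rng.standard_normal((30, 100)), np.zeros(100)
x_true[:5] = 1.0
b = A @ x_true
tau = 1.0 / np.linalg.norm(A, 2) ** 2   # step size below 1/||A||^2
theta = linearized_bregman(lambda t: A.T @ (A @ t - b), dim=100, tau=tau)
```

Starting from theta = 0, parameters only become nonzero once their dual variable exceeds the threshold, which is the inverse scale space character of the scheme.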
[30 Apr 21] Our chapter "CLIP: Cheap Lipschitz Training of Neural Networks" has appeared. We will present CLIP on Tue, 18 May at SSVM 2021.
[27 Apr 21] Our new preprint "Complete Deterministic Dynamics and Spectral Decomposition of the Ensemble Kalman Inversion" is out. This is joint work with Philipp Wacker. Using spectral techniques we analyze the dynamics of the Ensemble Kalman Inversion and its covariance matrix and study its time-asymptotic behavior.
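For orientation, the object of study is, up to notation, the continuous-time EKI particle system for a linear forward operator A with noise covariance Γ (my transcription, not a quote from the paper):

```latex
% Continuous-time ensemble Kalman inversion with linear forward
% operator A and empirical ensemble covariance C(u).
\[
  \dot{u}_j = -C(u)\, A^\top \Gamma^{-1} \big( A u_j - y \big),
  \qquad
  C(u) = \frac{1}{J} \sum_{k=1}^{J} (u_k - \bar{u}) \otimes (u_k - \bar{u}).
\]
```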
[24 Mar 21] Our new preprint "CLIP: Cheap Lipschitz Training of Neural Networks" is out. We regularize the Lipschitz constant of a neural network using a variable Lipschitz training set which evolves towards points with large Lipschitz constant. This work is accepted at SSVM 2021.
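A rough PyTorch sketch of that idea, with the loss, step sizes, and projections simplified relative to the paper and all hyperparameters illustrative:

```python
# Simplified CLIP-style training step: ascend the Lipschitz difference
# quotient on a pair set (x, y), then descend on loss + regularizer.
import torch

def lipschitz_quotient(model, x, y, eps=1e-9):
    """Difference quotient ||f(x)-f(y)|| / ||x-y|| on a batch of pairs."""
    num = (model(x) - model(y)).flatten(1).norm(dim=1)
    den = (x - y).flatten(1).norm(dim=1) + eps
    return num / den

def clip_step(model, opt, batch, targets, x, y, lam=0.1, ascent_lr=1e-2):
    # Evolve the Lipschitz training set towards pairs with a large
    # difference quotient (gradient ascent on the quotient).
    x = x.clone().requires_grad_(True)
    y = y.clone().requires_grad_(True)
    q = lipschitz_quotient(model, x, y).sum()
    gx, gy = torch.autograd.grad(q, (x, y))
    x = (x + ascent_lr * gx).detach()
    y = (y + ascent_lr * gy).detach()
    # Standard training loss plus Lipschitz regularization.
    opt.zero_grad()
    loss = torch.nn.functional.cross_entropy(model(batch), targets)
    loss = loss + lam * lipschitz_quotient(model, x, y).max()
    loss.backward()
    opt.step()
    return x, y
```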
[24 Feb 21] Check out our new preprint "Identifying Untrustworthy Predictions in Neural Networks by Geometric Gradient Analysis". We use the geometry of the loss landscape of a neural network to identify whether classifications lie in local minima or unstable regions.
[22 Feb 21] Check out the poster on our preprint "Continuum Limit of Lipschitz Learning on Graphs", presented at the Winter School on Analysis and Applied Mathematics (University of Münster).
2020
[7 Dec 20] Our preprint "Continuum Limit of Lipschitz Learning on Graphs" in collaboration with Tim Roith is available on arxiv. We prove Gamma-convergence and compactness of Lipschitz constant functionals on weighted graphs. Our results apply to nonlinear ground states, which can be characterized as geodesic distance functions.
[27 Nov 20] I have been awarded the Biennial French-German Mathematics in Imaging PhD Prize and will present my work at MIA'21!
[01 Sep 20] I have finished my PhD! My thesis is titled "Nonlinear Spectral Analysis with Variational Methods".