Kick-off meeting
Signatures for Images

May 10-11 @ CAS, Oslo

At this 2-day meeting, we will present the research program "Signatures for Images" (SFI@CAS) together with our team members Christa Cuchiero, Ilya Chevyrev, Joscha Diehl, and Samy Tindel.

We will describe the project and outline its scope and goals. In addition, we will hear talks from four researchers from Norwegian universities working on topics closely related to the main aim of the project: developing new mathematical structures tailored for image analysis in the context of signature methods and the theory of rough paths.

Wednesday, May 10

 

18:00 Dinner for invited participants.  


Thursday, May 11



Titles and Abstracts

Vegard Antun 

Title: Implicit regularization in AI meets generalized hardness of approximation 

Abstract:  Why is deep learning so successful in many applications of modern AI? This question has puzzled the AI community for more than a decade, and many attribute the success of deep learning to the implicit regularization imposed by the different neural network (NN) architectures and the gradient descent algorithm. In this talk, we will investigate the implicit regularization of so-called linear NNs in the simplified setting of linear regression. Furthermore, we will show how this theory meets fundamental computational boundaries imposed by the phenomenon of generalized hardness of approximation. That is, the phenomenon where certain optimal NNs can be proven to exist, but any algorithm will fail to compute these NNs to an accuracy below a certain approximation threshold. Thus, paradoxically, there will exist deep learning methods that are provably optimal, but that can only be computed to a certain accuracy. [slides]
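As a hedged illustration (my own sketch, not taken from the talk), the simplest instance of implicit regularization can already be seen in plain gradient descent on underdetermined linear regression: started from zero, with no explicit penalty term, it converges to the minimum-norm interpolant.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d = 5, 20                      # fewer samples than features: underdetermined
X = rng.standard_normal((n, d))
y = rng.standard_normal(n)

# Plain gradient descent on the least-squares loss, started at zero.
w = np.zeros(d)
lr = 0.01
for _ in range(50_000):
    w -= lr * X.T @ (X @ w - y)

# Minimum-Euclidean-norm interpolant, computed via the pseudoinverse.
w_min = np.linalg.pinv(X) @ y

print(np.allclose(X @ w, y, atol=1e-6))   # True: gradient descent interpolates
print(np.allclose(w, w_min, atol=1e-6))   # True: it found the min-norm solution
```

The iterates stay in the row space of `X`, which is why the limit is the minimum-norm solution; the talk's setting of linear NNs layers a factorization on top of this picture.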


Fred Espen Benth

Title: Recent advances on forward curve modeling and applications 

Abstract: In this talk, we survey some recent advances on forward curve modeling. We present infinite-dimensional stochastic volatility models, including leverage, and discuss the question of option pricing in this context. Neural networks in Hilbert spaces provide an attractive numerical method to price options on forward curves, where stylised structures of the curves are used as additional information in the training. Finally, we present new limit theorems on the realised variation of forward curves, which can be used for estimation.  The talk is based on joint works with Nils Detering (Santa Barbara), Luca Galimberti (Trondheim), Dennis Schroers (Bonn), Carlo Sgarra (Milano), and Almut Veraart (London). [slides]


Elena Celledoni

Title: An introduction to shape analysis and deep learning for optimal reparametrizations of shapes

Abstract: Shape analysis is a mathematical approach to problems of pattern and object recognition and has developed considerably in the last decade. The use of shapes is natural in applications where it is of interest to compare curves or surfaces independently of their parametrisation. In a smooth setting where the parametrised curves or surfaces belong to an infinite-dimensional Riemannian manifold, one defines the corresponding shapes to be equivalence classes of curves differing only by their parametrisation. Under appropriate assumptions, the Riemannian metric can be used to obtain a meaningful measure of distance on the space of shapes.
One computationally efficient approach to shape analysis is based on the Square Root Velocity Transform, and we have proposed a generalisation of this approach to shapes on Lie groups and homogeneous manifolds. The Lie group approach can be effective when dealing with skeletal animation data coming, for example, from human motion.
A demanding task when approximating shape distances is finding the optimal reparametrisation. The problem can be phrased as an optimisation problem on the infinite-dimensional group of orientation-preserving diffeomorphisms of the domain on which the curves or surfaces are defined. In the case of curves, one robust approach to computing optimal reparametrisations is based on dynamic programming, but this method seems difficult to generalise to surfaces.
We consider here a method where the approximations are obtained by composing in succession a number of elementary diffeomorphisms and optimising simultaneously over a large number of parameters. This approach works for both curves and surfaces and is reminiscent of deep learning with ResNets and the optimal control interpretation of deep learning.
If time permits I will discuss connections to structure preserving numerical discretization of differential equations, structure preserving deep learning and equivariant neural networks. [slides]
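As a rough illustration (my sketch, not the speaker's code), the Square Root Velocity Transform of a sampled curve and the resulting L2 distance take only a few lines; the actual shape distance would additionally minimise this quantity over reparametrisations, which is the hard optimisation step discussed above.

```python
import numpy as np

def srvt(curve, t):
    """Square Root Velocity Transform q = c' / sqrt(|c'|) of a sampled curve.

    curve: (n, d) array of points c(t_i); t: (n,) parameter values.
    Derivatives are approximated by finite differences."""
    deriv = np.gradient(curve, t, axis=0)
    speed = np.linalg.norm(deriv, axis=1, keepdims=True)
    return deriv / np.sqrt(np.maximum(speed, 1e-12))

def srv_distance(c1, c2, t):
    """L2 distance between the SRVTs (Riemann-sum approximation)."""
    diff = np.sum((srvt(c1, t) - srvt(c2, t)) ** 2, axis=1)
    return float(np.sqrt(np.sum(diff) * (t[1] - t[0])))

t = np.linspace(0.0, 1.0, 400)
circle = np.stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)], axis=1)
s = t ** 2   # the same circle traversed at a different speed
circle_repar = np.stack([np.cos(2 * np.pi * s), np.sin(2 * np.pi * s)], axis=1)

print(srv_distance(circle, circle, t))        # 0.0: identical curves
print(srv_distance(circle, circle_repar, t))  # > 0 until one optimises over
                                              # reparametrisations
```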


Ilya Chevyrev

Title: Moments of stochastic processes

Abstract: Path signatures are a descriptive and compact feature of time-ordered data. A remarkable property of these features is that, in a probabilistic context, they act as a generalisation of moments to random paths. In particular, in analogy to the moment problem, it is known that many stochastic processes, including Brownian motion, are characterised in law, up to reparametrisation, by their expected signature. I will survey the results on what we know about random signatures and highlight some open problems. I will also present applications to hypothesis testing on pathspace, where a normalised version of the expected signature plays a key role. [slides]
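For readers unfamiliar with the object, here is a minimal sketch (my own, not from the talk) of the first two signature levels of a piecewise-linear path, built from Chen's identity; dedicated libraries such as iisignature compute arbitrary truncation levels.

```python
import numpy as np

def signature_level2(path):
    """Level-1 and level-2 signature of a piecewise-linear path.

    path: (n, d) array of points.  Uses Chen's identity: on a linear
    segment with increment v the signature is (1, v, v (x) v / 2), and
    concatenating segments multiplies signatures in the tensor algebra."""
    d = path.shape[1]
    s1 = np.zeros(d)          # level 1: total increment
    s2 = np.zeros((d, d))     # level 2: iterated integrals
    for v in np.diff(path, axis=0):
        s2 += np.outer(s1, v) + np.outer(v, v) / 2.0
        s1 += v
    return s1, s2

path = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])  # an L-shaped path
s1, s2 = signature_level2(path)
print(s1)                     # [1. 1.]: total increment
print(s2[0, 1] - s2[1, 0])    # 1.0: twice the signed (Levy) area
```

The antisymmetric part of level 2 is exactly the Lévy area the abstract's probabilistic picture generalises: expected signatures collect the moments of these iterated integrals.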


Christa Cuchiero 

Title: From Lévy's stochastic area formula to universality of affine and polynomial processes

Abstract: A plethora of stochastic models used in diverse areas, such as mathematical finance, population genetics, and physics, stems from the class of affine and polynomial processes. The history of these processes is closely connected, on the one hand, with the important concept of tractability, that is, a substantial reduction of computational effort due to special structural features, and, on the other hand, with a unifying framework for a large number of probabilistic models. One early instance in the literature where this unifying affine and polynomial point of view applies is Lévy's stochastic area formula. Starting from this example, we present a guided tour through the main properties and results as well as classical and recent applications, culminating in the surprising insight that infinite-dimensional affine and polynomial processes are actually close to generic stochastic processes. [slides]


Joscha Diehl

Title: Multiparameter (iterated) sums 

Abstract: Iterated sums (or integrals) have proven very beneficial in time series analysis. I demonstrate how ideas from this one-parameter setting can be used to study multiparameter data, for example, images. I will also sketch the Hopf algebraic background and the relation to iterated integrals. This is joint work with Leonard Schmitz (University of Greifswald). [slides]
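As a hypothetical minimal example (my sketch of one such quantity, not code from the talk), a depth-2 multiparameter iterated sum over an image can be built from double increments, with prefix sums replacing the naive quadruple loop:

```python
import numpy as np

def double_increments(img):
    """Double increment d[i,j] = x[i+1,j+1] - x[i+1,j] - x[i,j+1] + x[i,j]."""
    return np.diff(np.diff(img, axis=0), axis=1)

def iterated_sum_depth2(img):
    """Sum over all i < k and j < l of d[i,j] * d[k,l], a depth-2
    multiparameter iterated sum, computed in O(n^2) via prefix sums."""
    d = double_increments(img)
    prefix = np.cumsum(np.cumsum(d, axis=0), axis=1)  # sums of d[:i+1, :j+1]
    below = np.zeros_like(d)
    below[1:, 1:] = prefix[:-1, :-1]   # at (k,l): sum of d over i<k, j<l
    return float(np.sum(below * d))

rng = np.random.default_rng(1)
img = rng.standard_normal((6, 8))
print(iterated_sum_depth2(img))
```

The ordering constraint "i < k and j < l" is the two-parameter analogue of time ordering in one-parameter iterated sums; the Hopf-algebraic framework of the talk organises all such expressions systematically.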


Hans Munthe-Kaas

Title: Geometry, Algebra, Computation: An Eternal Golden Braid

Abstract: Computational mathematics has traditionally been largely separated from geometry. Computational algorithms are approximations of the ideal mathematical world, and the goal has been to minimise the errors rather than to preserve geometric structures. In the last decades, the importance of geometric structure-preserving algorithms has dramatically changed the landscape of numerical computation. Algorithms based on global, coordinate-independent structures such as Lie group actions and algebraic chain complexes are now common tools in the toolbox of computational mathematicians. Structure preservation yields numerical integrators with better stability properties and slower global error growth.
Modern computational mathematics therefore needs abstract structures from geometry and algebra. Surprisingly, recent developments show important interactions in the opposite direction as well. The need to analyse geometric integration algorithms has led to the development of new, fundamental mathematical structures in the borderland between geometry and algebra. In this talk, we will give a survey of these developments for a general mathematical audience. [slides]
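A standard textbook illustration of the point (my choice of example, not from the talk): on the harmonic oscillator, explicit Euler lets the energy drift exponentially, while the symplectic Euler method, which preserves the symplectic structure, keeps the energy error bounded for all time.

```python
# Harmonic oscillator: H(q, p) = (q**2 + p**2) / 2, so q' = p, p' = -q.

def explicit_euler(q, p, h):
    return q + h * p, p - h * q

def symplectic_euler(q, p, h):
    p_new = p - h * q            # update momentum first...
    return q + h * p_new, p_new  # ...then position with the new momentum

def H(q, p):
    return 0.5 * (q * q + p * p)

h, steps = 0.1, 1000
qe, pe = 1.0, 0.0
qs, ps = 1.0, 0.0
for _ in range(steps):
    qe, pe = explicit_euler(qe, pe, h)
    qs, ps = symplectic_euler(qs, ps, h)

print(H(qe, pe))  # explicit Euler: energy has blown up (factor (1+h^2) per step)
print(H(qs, ps))  # symplectic Euler: energy stays near the initial value 0.5
```

The symplectic method conserves a modified energy exactly, which is what keeps the true energy oscillating in a narrow band instead of drifting.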


Samy Tindel

Title: Two applications of signature methods

Abstract: In this talk, I will briefly describe two instances in which signature methods have proved very useful in my research. (1) I will first focus on a numerical study indicating that low-dimensional features based on 2-d signatures discriminate efficiently between different textures in image processing. The signatures used in this case are extracted from change-of-variable formulae in the plane, similar to the usual rough paths setting, and they prove useful in data analysis as well. (2) I will then discuss a separate study concerning a reinforcement learning problem in a rough environment. I will show how to derive a Hamilton-Jacobi type equation in this context, and then how useful functionals can be approximated by linear maps on signatures. This talk will focus on a rather applied side of current investigations on signatures. I will try to give a broad overview of the objects I'm manipulating, without entering into technical details. [slides]