Below is the list of speakers who kindly agreed to give talks during the workshop:
Speaker: Hà Quang Minh, RIKEN Center for Advanced Intelligence Project (RIKEN-AIP), Tokyo, Japan.
Title: Regularized Wasserstein distances between Gaussian measures and Gaussian processes
Abstract: Optimal transport (OT) has been attracting much research attention in various fields, in particular machine learning and statistics. It is well-known that the exact OT distances are generally computationally demanding and suffer from the curse of dimensionality. One approach to alleviate these problems is via regularization. In this talk, we present recent results on the entropic regularization of OT in the setting of Gaussian measures and their generalization to the infinite-dimensional setting of Gaussian processes. In these settings, the regularized Wasserstein distances admit closed-form expressions, which satisfy many favorable theoretical properties, especially in comparison with the exact distance. The mathematical formulation will be illustrated with numerical experiments on Gaussian processes.
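For orientation, recall that the exact (unregularized) 2-Wasserstein distance between two Gaussian measures on $\mathbb{R}^d$ already admits a well-known closed form; the entropic-regularized distances discussed in the talk admit closed-form expressions in a similar spirit, extended to Gaussian processes:
$$
W_2^2\big(\mathcal{N}(m_1,\Sigma_1),\,\mathcal{N}(m_2,\Sigma_2)\big)
= \|m_1 - m_2\|^2
+ \operatorname{tr}\!\Big(\Sigma_1 + \Sigma_2 - 2\big(\Sigma_1^{1/2}\Sigma_2\,\Sigma_1^{1/2}\big)^{1/2}\Big).
$$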
Bio: Hà Quang Minh is currently a Unit Leader at the RIKEN Center for Advanced Intelligence Project (RIKEN-AIP) in Tokyo, Japan. He received the Ph.D. degree in mathematics from Brown University, Providence, RI, USA, in May 2006, under the supervision of Steve Smale. Prior to joining RIKEN, he was a researcher in the Department of Pattern Analysis and Computer Vision (PAVIS) with the Istituto Italiano di Tecnologia (IIT), Genova, Italy. Prior to that, he held research positions at the University of Chicago, the University of Vienna, Austria, and Humboldt University of Berlin, Germany. His current research interests include applied and computational functional analysis, applied and computational differential geometry, machine learning, computer vision, and image and signal processing. He received the Microsoft Best Paper Award at the Conference on Uncertainty in Artificial Intelligence (UAI) in 2013 and the IBM Pat Goldberg Memorial Best Paper Award in Computer Science, Electrical Engineering, and Mathematics in 2013.
Speaker: Xavier Pennec, Université Côte d'Azur and Inria (France)
Title: Effect of curvature on the Empirical Fréchet mean estimation in manifolds
Abstract: Statistical inference in manifolds most often relies on the Fréchet mean in the Riemannian case, or on exponential barycenters in affine connection spaces. The uncertainty of the empirical mean estimation with a fixed number of samples is a key question. Under sufficient concentration conditions, a central limit theorem was established in Riemannian manifolds by Bhattacharya & Patrangenaru in 2005. In this talk, we present an asymptotic expansion, valid in both the Riemannian and affine cases, which better explains the role of the curvature in modulating the speed of convergence of the empirical mean. We also establish a non-asymptotic expansion in the high-concentration regime which exhibits a statistical bias of the empirical mean in the direction of the average gradient of the curvature. These curvature effects become important for large curvature and can drastically modify the estimation of the mean. They could partly explain the phenomenon of sticky means recently brought to light in stratified spaces, notably in the case of negative curvature.
Reference: Xavier Pennec. Curvature effects on the empirical mean in Riemannian and affine manifolds: a non-asymptotic high-concentration expansion in the small-sample regime. arXiv preprint arXiv:1906.07418, June 2019.
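For reference (a minimal sketch of the objects in the abstract), the population Fréchet mean and its empirical counterpart on a metric space $(M,d)$ are defined as minimizers of the expected, respectively averaged, squared distance:
$$
\bar{x} \in \operatorname*{arg\,min}_{x \in M}\ \mathbb{E}\big[d^2(x, X)\big],
\qquad
\hat{x}_n \in \operatorname*{arg\,min}_{x \in M}\ \frac{1}{n}\sum_{i=1}^{n} d^2(x, x_i).
$$
The talk quantifies how the curvature of $M$ modulates the convergence of $\hat{x}_n$ towards $\bar{x}$ and the bias that appears at finite $n$.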
Bio: After a PhD in Computer Science from the French École Polytechnique in 1996 and a post-doctoral stay at MIT, Xavier Pennec joined Inria in 1998, where he became Senior Research Scientist (Directeur de Recherche) in 2007. He is also affiliated with the Université Côte d'Azur (Nice and Sophia-Antipolis, France) and holds a Chair in Artificial Intelligence from the 3IA Côte d'Azur. His research focuses on the theory of geometric statistics, a subject for which he obtained the prestigious ERC grant G-Statistics in 2018. The goal is to study data living in non-linear spaces such as Riemannian manifolds, Lie groups, or stratified quotient spaces. The application of this theory to medical imaging problems is at the heart of computational anatomy, an intrinsically multidisciplinary domain at the frontier of mathematics, computer science and medicine that aims at describing statistically the normal and pathological shape of organs.
Speaker: Shantanu Joshi, University of California Los Angeles (USA)
Title: Geometric Data Alignment in Biomedical Imaging
Abstract: This talk presents approaches for the geometric alignment of signals, functional measures, and diffusion measures from neuroimaging data. We present methods for temporal alignment of both the amplitude and the phase of functional magnetic resonance imaging (fMRI) time courses and spectral densities. Experimental results show significant increases in pairwise node-to-node correlations and coherences following alignment. Additionally, we show results for task-based fMRI signals, where we see improved power of detection of clusters and activations for single-subject data. We also present a geometric approach for minimizing the variability in the shape of along-tract diffusion profiles by performing diffeomorphic alignment across the tracts as well as across populations. Finally, we present an approach for accelerating the alignment process using deep learning.
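As a minimal sketch of the kind of alignment involved (the specific geometric metric used in the talk may differ), temporal alignment of two signals $f_1, f_2$ on $[0,1]$ can be posed as a search over time warpings:
$$
\gamma^{\star} \in \operatorname*{arg\,min}_{\gamma \in \Gamma}\ \big\| f_1 - f_2 \circ \gamma \big\|,
\qquad
\Gamma = \{\gamma : [0,1] \to [0,1]\ \text{increasing diffeomorphism},\ \gamma(0)=0,\ \gamma(1)=1\},
$$
where the warped signal $f_2 \circ \gamma^{\star}$ carries the aligned amplitude information and $\gamma^{\star}$ carries the phase.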
Bio: Shantanu Joshi is an Associate Professor of Neurology, Bioengineering, and Computational & Systems Biology, and a faculty member of the Ahmanson-Lovelace Brain Mapping Center at UCLA. He received his PhD degree in Electrical Engineering from Florida State University. His research interests lie in the modeling and development of novel biomedical signal and image processing approaches for brain mapping of structure and function. His work in shape morphometrics led to the identification of a new genus of lambeosaurine dinosaurs as listed by the International Commission on Zoological Nomenclature (ICZN). He received the NIH Career Development Award in 2015, was a Ziskind-Somerfeld Research Award finalist of the Society of Biological Psychiatry, and received the UCLA Faculty Career Award in 2019. He is on the editorial board of the International Journal of Computer Assisted Radiology and Surgery and an Associate Editor of Frontiers in Neuroscience (Brain Imaging Methods).
Speaker: Nicolas Boumal, Mathematics Department, EPFL (Switzerland)
Title: When the search space is not smooth: apocalypses and smooth lifts
Abstract: The set of matrices of a certain size and rank is a smooth manifold. Unfortunately, it is not closed: this is uncomfortable for optimization. The closure of that manifold, namely, the set of matrices with bounded rank is an algebraic variety, but it is not smooth. That is also uncomfortable for optimization. Case in point, the gradient norm of the cost function along a sequence on such a set can go to zero even if the limit point of the sequence is not stationary: we call this eventuality an apocalypse. I will present general characterizations of this phenomenon and its dual of sorts (serendipities) together with classes of sets where this may or may not happen. Then, I will discuss liftings of such optimization problems to smooth manifolds, and answer general questions about properties of lifts such as: if we can compute global/local minima or (approximate) first-/second-order stationary points in the lifted space, what does that afford us for the original problem?
Joint work with Eitan Levin (Caltech) and Joe Kileel (UT Austin).
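One standard example of such a smooth lift (a sketch; not necessarily the parametrization studied in the talk) factors matrices of rank at most $r$ through a linear space:
$$
\{X \in \mathbb{R}^{m\times n} : \operatorname{rank}(X) \le r\} \;=\; \varphi\big(\mathbb{R}^{m\times r}\times\mathbb{R}^{n\times r}\big),
\qquad
\varphi(L,R) = L R^{\top},
$$
so that minimizing $f$ over the non-smooth bounded-rank set is traded for minimizing the smooth composition $f \circ \varphi$ over the lifted space; the talk examines what computing minimizers or (approximate) stationary points of the lifted problem guarantees for the original one.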
Bio: Nicolas Boumal is an assistant professor in the mathematics department at EPFL. He studies non-convex optimization, numerical analysis, and statistical estimation, exploiting mathematical structures such as smooth geometry, convex geometry, and low rank. He is the author of Manopt, a popular toolbox for Riemannian optimization, and of a book whose draft is available on his webpage.