
Online Geometric Analysis Workshop on Eigenmaps, Data, and Geometry

May 17 - May 18, 2023

In recent years, classical techniques from Geometric Analysis have been applied in Data Science. One such technique is the notion of an eigenmap, an embedding of a Riemannian manifold by rescaled eigenfunctions closely related to the heat kernel embeddings first defined by Bérard-Besson-Gallot. These maps have been applied with great success by Belkin-Niyogi and Coifman-Lafon to achieve dimension reduction of data lying on manifolds in very high dimensional Euclidean space (see the survey article linked below). These important applications have led to increased interest in the properties of truncated eigenmaps, estimates on the number of eigenfunctions needed to achieve those properties, convergence of truncated eigenmaps, and the spectral properties of converging sequences of Riemannian manifolds under various curvature bounds. In this workshop we will explore the deep geometric analysis related to these questions.
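For readers coming from the data side, here is a minimal sketch of a Laplacian eigenmap computed from a finite sample, in the spirit of Belkin-Niyogi; the Gaussian kernel, the bandwidth eps, the embedding dimension m, and the toy data set are illustrative assumptions of this sketch, not prescriptions from the references above.

```python
# A minimal sketch of a Laplacian eigenmap on sampled data (Belkin-Niyogi
# style). The Gaussian kernel, bandwidth eps, and embedding dimension m are
# illustrative choices for this sketch.
import numpy as np
from scipy.spatial.distance import cdist
from scipy.linalg import eigh

def laplacian_eigenmap(X, m=2, eps=0.5):
    """Embed the points X (n x D) into R^m using the bottom nontrivial
    eigenvectors of a graph Laplacian built from a Gaussian kernel."""
    W = np.exp(-cdist(X, X, "sqeuclidean") / eps)  # pairwise affinities
    D = np.diag(W.sum(axis=1))                     # degree matrix
    L = D - W                                      # unnormalized graph Laplacian
    # Generalized eigenproblem L v = lambda D v; eigenvalue 0 carries the
    # constant eigenvector, so the embedding uses eigenvectors 1..m.
    vals, vecs = eigh(L, D)
    return vecs[:, 1:m + 1]

# Toy example: a noisy circle embedded in R^100 is recovered in 2 coordinates.
t = np.linspace(0, 2 * np.pi, 200, endpoint=False)
X = np.zeros((200, 100))
X[:, 0], X[:, 1] = np.cos(t), np.sin(t)
X += 0.01 * np.random.default_rng(0).normal(size=X.shape)
print(laplacian_eigenmap(X).shape)  # (200, 2)
```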

Speakers:

Organizers:

A survey article introducing the topics:  

Schedule of Talks: 


May 17


May 18


Webpage: 

https://sites.google.com/site/professorsormani/home/conferences/eigenmaps-data-and-geometry-2023

Zoom Info: 

Titles and Abstracts: 


May 17


Spectral distances on RCD spaces


Abstract: Bérard-Besson-Gallot defined a distance d_t between closed Riemannian manifolds via their spectral information for a fixed time t>0. It is worth emphasizing that, in general, d_t-convergence does not follow even from smooth convergence of Riemannian metrics. The goal of this talk is to give:

- a generalization of d_t to nonsmooth spaces, the so-called RCD (metric measure) spaces;

- characterizations of d_t-convergence for RCD spaces in terms of measured Gromov-Hausdorff convergence, which are new even in the smooth framework.

This talk is based on the preprint arXiv:2303.11136.
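For orientation, the Bérard-Besson-Gallot construction can be written schematically as follows (the normalization constants of the original paper and the precise definition of the distance are omitted; see the preprint above).

```latex
% Heat-kernel embedding of a closed Riemannian manifold (M, g) into \ell^2,
% written schematically: \lambda_j are the Laplace eigenvalues and
% \varphi_j an L^2-orthonormal basis of eigenfunctions.
\Phi_t \colon M \longrightarrow \ell^2, \qquad
\Phi_t(x) = \bigl( e^{-\lambda_j t/2} \, \varphi_j(x) \bigr)_{j \ge 1},
\qquad t > 0.
% The distance d_t(M, N) then compares the images \Phi_t(M) and \Phi_t(N)
% inside \ell^2 by a Hausdorff-type distance, minimized over the choices
% of orthonormal eigenbases.
```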



Geometric Analysis behind data analysis 


Abstract: This talk will consist of the following: 

(1) What the Manifold Learning problem in data analysis is, and how spectral embeddings, as well as spectral distances, can be used to address it.

(2) In particular, we will narrow our focus to diffusion maps (a generalization of eigenmaps, sketched after this outline) and vector diffusion maps.

(3) Lastly, I will discuss some embedding results for vector diffusion maps.
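To make item (2) concrete, here is a minimal sketch of a scalar diffusion map in the spirit of Coifman-Lafon; the bandwidth eps, the diffusion time t, and the density normalization alpha = 1 are illustrative defaults. The vector diffusion maps of the talk replace this scalar kernel with one that also transports tangent vectors, which is not captured here.

```python
# A minimal sketch of a scalar diffusion map (Coifman-Lafon style); eps, t,
# and alpha below are illustrative defaults, not the speaker's choices.
import numpy as np
from scipy.spatial.distance import cdist

def diffusion_map(X, m=2, t=1, eps=0.5, alpha=1.0):
    K = np.exp(-cdist(X, X, "sqeuclidean") / eps)  # Gaussian affinities
    q = K.sum(axis=1)
    K = K / np.outer(q, q) ** alpha                # alpha = 1: undo density bias
    d = K.sum(axis=1)
    # P = K / d is the Markov (diffusion) matrix; conjugating by sqrt(d)
    # gives a symmetric matrix S with the same spectrum as P.
    S = K / np.sqrt(np.outer(d, d))
    vals, vecs = np.linalg.eigh(S)
    vals, vecs = vals[::-1], vecs[:, ::-1]         # sort eigenvalues descending
    psi = vecs / np.sqrt(d)[:, None]               # eigenvectors of P itself
    # Drop the trivial eigenvalue 1; weight each coordinate by lambda^t.
    return (vals[1:m + 1] ** t) * psi[:, 1:m + 1]
```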

May 18



(Semi)continuity of spectral quantities under convergence of the underlying spaces


Abstract: In this talk I will link together several results describing the semicontinuity and continuity of spectral quantities under convergence of the underlying spaces. The spectral quantities involved are often associated to nonlinear operators acting on functions on the underlying spaces. I will start the review with the semicontinuity of certain min-max values under volume flat convergence, and I will indicate the extension result that lies at its core. Improvements of this extension result go into proving that capacity is semicontinuous under volume flat convergence. However, the min-max values considered before turn out not to be proper generalizations of eigenvalues. The search for proper generalizations then brings us to Krasnoselskii eigenvalues, for which I review their continuity for operators defined on CD(K,\infty) spaces. In the end, I hope to indicate how to show semicontinuity of Krasnoselskii eigenvalues under volume flat convergence. The talk contains results from joint works with Luigi Ambrosio, Shouhei Honda, Jeff Jauregui and Raquel Perales.
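For orientation, the classical model that these min-max values generalize is the Courant-Fischer characterization of the Laplace eigenvalues of a closed manifold; the settings of the talk replace the Rayleigh quotient by functionals attached to nonlinear operators, and the manifold by limit spaces.

```latex
% Courant-Fischer min-max for the Laplace eigenvalues of a closed manifold
% (M, g): the minimum runs over k-dimensional subspaces V of W^{1,2}(M),
% and \lambda_1 = 0 is attained by the constant functions.
\lambda_k = \min_{\substack{V \subset W^{1,2}(M) \\ \dim V = k}}
\; \max_{u \in V \setminus \{0\}}
\frac{\int_M |\nabla u|^2 \, d\mathrm{vol}}{\int_M u^2 \, d\mathrm{vol}}.
```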



Geometric Clustering: Theory and Applications to Imaging


Abstract: We propose approaches to unsupervised clustering based on data-dependent distances and dictionary learning. By considering metrics derived from data-driven graphs, robustness to noise and to high ambient dimensionality is achieved. The proposed algorithms enjoy theoretical performance guarantees on flexible data models and, in some cases, guarantees of quasilinear scaling in the number of data points. Connections to geometric analysis, stochastic processes, and deep learning are emphasized. Applications to hyperspectral image processing will be shown, demonstrating state-of-the-art empirical performance.
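As a generic illustration only, not the speakers' proposed algorithms: the simplest instance of clustering with a data-dependent metric builds a k-nearest-neighbor graph, passes to spectral coordinates of its normalized Laplacian, and runs k-means there. The parameter choices and the use of scikit-learn are assumptions of this sketch.

```python
# Generic sketch of clustering with a data-driven graph metric (spectral
# clustering on a kNN graph); parameters k and m are illustrative.
import numpy as np
from scipy.sparse.csgraph import laplacian
from sklearn.cluster import KMeans
from sklearn.neighbors import kneighbors_graph

def graph_spectral_cluster(X, n_clusters=2, k=10, m=4):
    W = kneighbors_graph(X, n_neighbors=k, mode="connectivity")
    W = 0.5 * (W + W.T).toarray()          # symmetrize the kNN adjacency
    L = laplacian(W, normed=True)          # normalized graph Laplacian
    _, vecs = np.linalg.eigh(L)
    emb = vecs[:, 1:m + 1]                 # bottom nontrivial eigenvectors
    # Distances in emb reflect the graph geometry of the data rather than
    # raw ambient coordinates, the source of the robustness described above.
    return KMeans(n_clusters=n_clusters, n_init=10).fit_predict(emb)
```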



Hawking mass monotonicity for initial data sets


Abstract: An interesting feature of General Relativity is the presence of singularities, which occur even in the simplest examples such as the Schwarzschild spacetime. However, in this case the singularity is cloaked behind the event horizon of the black hole, which has been conjectured to be generically the case. To analyze this so-called Cosmic Censorship Conjecture, Penrose proposed in 1973 a test which involves Hawking's area theorem, the final state conjecture, and a geometric inequality on initial data sets (M,g,k). For k=0 this Penrose inequality has been proven by Huisken-Ilmanen and by Bray using different methods, but in general the question is wide open. Huisken-Ilmanen's proof relies on the Hawking mass monotonicity formula under inverse mean curvature flow (IMCF), and the purpose of this talk is to generalize the Hawking mass monotonicity formula to initial data sets. For this purpose, we start by recalling spacetime harmonic functions and their applications, which were introduced in joint work with Demetre Kazaras and Marcus Khuri in the context of the spacetime positive mass theorem. We also discuss recent work on p-harmonic functions joint with Pengzi Miao and Luen-Fai Tam.
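For reference, here is the classical (time-symmetric, k = 0) Hawking mass whose monotonicity under IMCF drives the Huisken-Ilmanen argument; the talk concerns its generalization to initial data sets (M, g, k).

```latex
% Hawking mass of a closed surface \Sigma in (M, g), with |\Sigma| its area
% and H its mean curvature; Huisken-Ilmanen proved it is nondecreasing for
% connected surfaces along (weak) inverse mean curvature flow when the
% scalar curvature of g is nonnegative.
m_H(\Sigma) = \sqrt{\frac{|\Sigma|}{16\pi}}
\left( 1 - \frac{1}{16\pi} \int_{\Sigma} H^2 \, d\sigma \right).
```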