Winter 2022

Feb 11, 2022: Di Fang (UC Berkeley)


Title: Quantum algorithms for Hamiltonian simulation with unbounded operators

Abstract: Recent years have witnessed tremendous progress in developing and analyzing quantum algorithms for Hamiltonian simulation of bounded operators. However, many scientific and engineering problems require the efficient treatment of unbounded operators, which frequently arise from the discretization of differential operators. Such applications include molecular dynamics, electronic structure theory, quantum differential equation solvers, and quantum machine learning. We will introduce recent progress in quantum algorithms for efficient unbounded Hamiltonian simulation, including Trotter-type splitting and the quantum highly oscillatory protocol (qHOP) in the interaction picture.
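As background (a standard result, not taken from the abstract itself), the simplest Trotter-type splitting approximates the evolution under a Hamiltonian H = A + B by alternating the flows of A and B:

```latex
% First-order Lie--Trotter splitting for H = A + B:
\[
  e^{-itH} \;\approx\; \bigl(e^{-itA/n}\, e^{-itB/n}\bigr)^{n},
\]
% with error of order t^2 \|[A,B]\| / n when A and B are bounded.
% For unbounded operators (e.g. discretized differential operators)
% the commutator bound degrades, which is the difficulty addressed
% by the refined analyses discussed in the talk.
```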

Host person: Sui Tang

Feb 18, 2022: James Murphy (Tufts)

Title: Geometric Structures in High-Dimensional Data: Graphs, Manifolds, and Barycenters (recording)

Abstract: The curse of dimensionality renders statistical and machine learning in high dimensions intractable without additional assumptions on the underlying data.  We consider geometric models for data that allow for mathematical performance guarantees and efficient algorithms that deflect the curse.  The first part of the talk develops a family of data-driven metrics that balance between density and geometry in the underlying data.  We consider discrete graph operators based on these metrics and prove performance guarantees for clustering with them in the spectral graph paradigm.  Fast algorithms based on Euclidean nearest-neighbor graphs are proposed and connections with partial differential equations on Riemannian manifolds are developed.  In the second part of the talk, we move away from Euclidean spaces and focus on representation learning of probability distributions in Wasserstein space.  We introduce a general barycentric coding model in which data are represented as Wasserstein-2 (W2) barycenters of a set of fixed reference measures.  Leveraging the Riemannian structure of W2-space, we develop a tractable optimization program to learn the barycentric coordinates when given access to the densities of the underlying measures.  We provide a consistent statistical procedure for learning these coordinates when the measures are accessed only by i.i.d. samples.  Our consistency results and algorithms exploit entropic regularization of optimal transport maps, thereby allowing our barycentric modeling approach to scale efficiently.  Throughout the talk, applications to image and natural language processing demonstrate the efficacy of our geometric methods.
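For context (the standard definition, not part of the abstract), a Wasserstein-2 barycenter of fixed reference measures with weights lambda is:

```latex
% W2 barycenter of reference measures \mu_1, \dots, \mu_m:
\[
  \mu(\lambda) \;=\; \operatorname*{arg\,min}_{\nu}\;
  \sum_{j=1}^{m} \lambda_j\, W_2^2(\nu, \mu_j),
  \qquad \lambda_j \ge 0,\ \ \sum_{j=1}^{m} \lambda_j = 1.
\]
% Barycentric coding represents a data measure by the weight vector
% \lambda whose barycenter \mu(\lambda) best matches it.
```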

Host person: Sui Tang



March 4, 2022: Yanqiu Guo (FIU)

Title: Inertial manifolds for the hyper-viscous Navier-Stokes equations 

Abstract: One of the central problems for dissipative systems generated by PDEs is whether the underlying dynamics is effectively finite dimensional and can be described by a system of ODEs. Indeed, for a large class of dissipative evolution equations, the long-time behavior of solutions resembles that of finite-dimensional systems. To capture this phenomenon, Foias, Sell, and Temam introduced the concept of an inertial manifold: a finite-dimensional Lipschitz invariant manifold that attracts exponentially all trajectories of the dynamical system induced by the underlying evolution equation. The existence of an inertial manifold for an infinite-dimensional evolution equation represents the best analytical form of reduction of an infinite system to a finite-dimensional one. However, whether the Navier-Stokes equations possess an inertial manifold remains unknown. In this talk, I will focus on the existence of an inertial manifold for the hyper-viscous Navier-Stokes equations on a three-dimensional torus when the exponent of the hyper-dissipation is 3/2. In this case, the spectral gap condition is not fulfilled, and we employ the spatial averaging method introduced by Mallet-Paret and Sell. Interestingly, the gaps between certain quadratic forms of integers play an important role in the proof. This is joint work with C. Gal.
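For readers unfamiliar with the term, the standard definition (background, not from the abstract) is: a set M in the phase space H is an inertial manifold for a semiflow S(t) if it is a finite-dimensional Lipschitz manifold, invariant under the flow, and exponentially attracting:

```latex
\[
  S(t)\mathcal{M} \subseteq \mathcal{M}
  \quad \text{and} \quad
  \operatorname{dist}\bigl(S(t)u_0,\, \mathcal{M}\bigr)
  \;\le\; C(u_0)\, e^{-\gamma t}
  \quad \text{for all } u_0 \in H,
\]
% for some rate \gamma > 0. Restricting the dynamics to \mathcal{M}
% yields a finite system of ODEs (the "inertial form").
```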


Host person: Quyuan Lin

March 11, 2022: Akram Aldroubi (Vanderbilt)

Title: Transport Transforms for Data Analysis and Machine Learning

Abstract: Recently, a new family of transforms based on optimal transport theory has been developed. These transport transforms are nonlinear and well suited to many applications in data analysis, classification, and processing. In many settings, ranging from machine learning to signal processing, transport-based algorithms have outperformed state-of-the-art methods. In this talk, we will describe the Cumulative Distribution Transform (CDT) and its recent extension, the Signed Cumulative Distribution Transform (SCDT). The CDT and SCDT are transforms for 1-D signals; by combining them with the Radon transform, we obtain the Radon-CDT, a multivariate transform. I will give an overview of these transforms and some of the mathematical properties that make them suitable for data classification, and I will present several successful applications related to medical diagnostics.
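As background (the standard definition, not quoted from the abstract): for a 1-D probability density s with CDF F_s and a fixed reference density r with CDF F_r, the CDT of s is

```latex
\[
  \hat{s}(x) \;=\; F_s^{-1}\bigl(F_r(x)\bigr),
\]
% i.e. the optimal transport map pushing the reference r onto s.
% A key property: certain classes that are nonlinearly separable in
% signal space become linearly separable in transform space, which is
% what makes these transforms attractive for classification.
```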

Host person: Sui Tang