Mathematics in Imaging, Data and Optimization

Department of Mathematical Sciences

Rensselaer Polytechnic Institute

 Topics:        Seminars focus on mathematics in imaging science, data science and optimization.

 Schedule:   Seminars will run weekly via Webex or in person. Each seminar will be 50 minutes followed by 10 minutes for Q&A. 

 Youtube:      https://www.youtube.com/channel/UC_zDPrwayiXoH_R6gvBTMhA 

 Mailing list: To receive the latest information, please subscribe to the mailing list here.

2:00 - 3:00 pm,  Oct. 25, 2023 (EDT), Mingrui Liu, George Mason University

Title:  New Federated Algorithms for Deep Learning with Unbounded Smooth Landscape

Slides Video

Abstract: The current analysis of federated optimization algorithms for training deep neural networks typically requires the loss landscape to be smooth. However, there is a class of neural networks that is inherently nonsmooth, with a potentially unbounded smoothness parameter. Examples include recurrent neural networks, long short-term memory (LSTM) networks, and transformers. It remains unclear how to design provably efficient algorithms for training these neural networks. In this talk, I will consider the federated deep learning setting in the presence of unbounded smoothness. I will introduce several efficient algorithms for various settings of federated learning, including homogeneous data, heterogeneous data, and partial client participation. The main results are twofold. First, we show that the designed algorithms provably enjoy linear speedup and require significantly fewer communication rounds. Second, we establish a lower bound for a variant of minibatch SGD to show the provable advantage of our proposed algorithms in the unbounded smoothness setting.
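The talk's algorithms are not specified in the abstract; purely as context, a generic federated-averaging loop in which each client takes locally clipped gradient steps (gradient clipping is a standard tool when the smoothness parameter can be unbounded) might look like the sketch below. The quadratic toy losses and all names here are illustrative inventions, not the speaker's method.

```python
import numpy as np

def clip(g, tau):
    """Scale a gradient so its norm is at most tau."""
    n = np.linalg.norm(g)
    return g if n <= tau else g * (tau / n)

def local_steps(w, grad_fn, lr=0.1, tau=1.0, K=10):
    """One client's local work: K clipped SGD steps from the server model."""
    w = w.copy()
    for _ in range(K):
        w -= lr * clip(grad_fn(w), tau)
    return w

# toy heterogeneous clients: quadratic losses 0.5 * ||w - t||^2 with
# different minimizers t, so client gradients disagree
targets = [np.array([1.0, 0.0]), np.array([0.0, 1.0]), np.array([2.0, 2.0])]
grads = [lambda w, t=t: w - t for t in targets]  # gradient of each local loss

w = np.zeros(2)
for _ in range(50):                       # communication rounds
    updates = [local_steps(w, g) for g in grads]
    w = np.mean(updates, axis=0)          # server averages the client models

# w approaches (approximately) the average of the client minimizers
```

In this toy, averaging after several clipped local steps drives the server model toward a consensus point near the mean of the clients' minimizers; the talk's contribution concerns provable rates for such schemes under unbounded smoothness, which this sketch does not capture.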


Bio: Mingrui Liu has been an assistant professor in the Department of Computer Science at George Mason University since August 2021. Before that, he was a postdoctoral fellow at Boston University from 2020 to 2021. He received his Ph.D. in Computer Science from The University of Iowa in August 2020. His research interests include machine learning, mathematical optimization, statistical learning theory, and deep learning. He has served as an area chair for NeurIPS, AISTATS, and IJCAI.


2:00 - 3:00 pm,  Nov. 01, 2023 (EDT), Caroline Moosmueller, UNC

Title:  Manifold learning for point-cloud data with applications in biology

Slides Video

Abstract: In this talk, I will introduce LOT Wassmap, a computationally feasible algorithm to uncover low-dimensional structures in the Wasserstein space. The algorithm is motivated by the observation that many datasets are naturally interpreted as point-clouds or probability measures rather than points in $\mathbb{R}^n$, and that finding low-dimensional descriptions of such datasets requires manifold learning algorithms in the Wasserstein space. Most available algorithms are based on computing the pairwise Wasserstein distance matrix, which can be computationally challenging for large datasets in high dimensions. The proposed algorithm leverages approximation schemes such as Sinkhorn distances and linearized optimal transport to speed up computations, and in particular, avoids computing a pairwise distance matrix. Experiments demonstrate that LOT Wassmap attains correct embeddings and that the quality improves with increased sample size. On the application side, I'll show how different cancer types can be clustered based on gene expression data interpreted as distributions over gene networks.

This is joint work with Alex Cloninger (UCSD), Keaton Hamm (UT Arlington) and Varun Khurana (UCSD).
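The LOT Wassmap pipeline itself is the subject of the talk; as a hedged illustration of one ingredient the abstract mentions, the Sinkhorn approximation of the Wasserstein distance between two histograms can be sketched as follows. The 1-D toy example, regularization strength, and iteration count are all illustrative choices, not taken from the speaker's work.

```python
import numpy as np

def sinkhorn_distance(a, b, C, eps=0.1, n_iters=200):
    """Entropy-regularized OT cost between histograms a and b
    with ground cost matrix C, via Sinkhorn's alternating scaling."""
    K = np.exp(-C / eps)              # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)             # match the second marginal
        u = a / (K @ v)               # match the first marginal
    P = u[:, None] * K * v[None, :]   # approximate transport plan
    return np.sum(P * C)              # transport cost of the plan

# toy example: two Gaussian-like histograms on a 1-D grid
x = np.linspace(0, 1, 50)
C = (x[:, None] - x[None, :]) ** 2    # squared-distance ground cost
a = np.exp(-((x - 0.3) ** 2) / 0.01); a /= a.sum()
b = np.exp(-((x - 0.7) ** 2) / 0.01); b /= b.sum()
d = sinkhorn_distance(a, b, C)        # close to the squared W2 distance
```

Each Sinkhorn iteration costs only matrix-vector products, which is the kind of speed-up over exact Wasserstein solvers that the abstract alludes to; LOT Wassmap additionally uses linearized optimal transport to avoid the pairwise distance matrix entirely.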


Bio: Caroline Moosmueller joined UNC Chapel Hill in 2022 as an Assistant Professor in the Department of Mathematics. She holds a B.Sc. and an M.Sc. in Mathematics from the University of Vienna, and a Ph.D. in Technical Mathematics from Graz University of Technology (Austria).

Her research addresses the development of numerical methods for nonlinear and high-dimensional data analysis, with a focus on algorithms that preserve geometric structure. She works on optimal transport problems, classification tasks in machine learning, linear and nonlinear approximation, and applications in biology and cancer research.


4:00 - 5:00 pm,  Nov. 06, 2023 (EST), Gesualdo Scutari, Purdue University

Title:  Statistical Inference over Networks: Decentralized Optimization Meets High-dimensional  Statistics

Location: Amos Eaton 215

Abstract: There is growing interest in solving large-scale statistical machine learning problems over decentralized networks, where data are distributed across the nodes of the network and no centralized coordination is present (we term these systems "mesh" networks). Inference from massive datasets poses a fundamental challenge at the nexus of the computational and statistical sciences: ensuring the quality of statistical inference when computational resources, like time and communication, are constrained. While statistical-computational tradeoffs have been largely explored in the centralized setting, our understanding over mesh networks is limited: (i) distributed schemes, designed and performing well in the classical low-dimensional regime, can break down in the high-dimensional case; and (ii) existing convergence studies may fail to predict algorithmic behaviors, with some findings directly contradicted by empirical tests. This is mainly because the majority of distributed algorithms have been designed and studied only from the optimization perspective, lacking the statistical dimension. This talk will discuss some vignettes from high-dimensional statistical inference suggesting new analyses (and designs) aiming to bring statistical thinking into distributed optimization.
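The talk surveys tradeoffs rather than one method, but the kind of baseline it examines, first-order optimization over a mesh network with no coordinator, can be sketched as decentralized gradient descent with a gossip (mixing) matrix. Everything below (ring topology, quadratic local losses, step size) is an invented toy for illustration, not from the talk.

```python
import numpy as np

# n agents on a ring; agent i holds a local quadratic loss 0.5*||x - t_i||^2.
# Each round: average with neighbors via a doubly stochastic mixing matrix W,
# then take a local gradient step (decentralized gradient descent, DGD).
n, d = 5, 3
rng = np.random.default_rng(1)
targets = rng.normal(size=(n, d))        # minimizer of agent i's local loss

W = np.zeros((n, n))                     # ring-topology mixing weights
for i in range(n):
    W[i, i] = 0.5
    W[i, (i - 1) % n] = 0.25
    W[i, (i + 1) % n] = 0.25

X = np.zeros((n, d))                     # row i = agent i's current iterate
lr = 0.1
for _ in range(500):                     # communication rounds
    G = X - targets                      # local gradients
    X = W @ X - lr * G                   # gossip step + local gradient step

# the network average of the iterates converges to the global minimizer
# (the mean of the t_i); individual agents agree only up to an O(lr) bias
```

With a constant step size, the agents reach only approximate consensus, a well-known bias of this scheme from the optimization literature; the talk's point is that such optimization-only analyses can miss what matters statistically, especially in high dimensions.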


Bio: Gesualdo Scutari is a Professor with the School of Industrial Engineering and Electrical and Computer Engineering (by courtesy) at Purdue University, West Lafayette, IN, USA, and he is a Purdue Faculty Scholar. His research interests include continuous optimization, equilibrium programming, and their applications to signal processing and statistical learning. Among other honors, he was a recipient of the 2013 NSF CAREER Award, the 2015 IEEE Signal Processing Society Young Author Best Paper Award, and the 2020 IEEE Signal Processing Society Best Paper Award. He serves as an IEEE Signal Processing Society Distinguished Lecturer (2023-2024). He has served on the editorial boards of several IEEE journals and is currently an Associate Editor of the SIAM Journal on Optimization. He is an IEEE Fellow.


Organizers: Rongjie Lai, Rensselaer Polytechnic Institute

                      John Mitchell, Rensselaer Polytechnic Institute

                      Yangyang Xu, Rensselaer Polytechnic Institute