Writings & updates
A cheat sheet for Brascamp-Lieb inequalities
A short summary of publications and preprints
Graph Laplacians on Shared Nearest Neighbor graphs and graph Laplacians on k-Nearest Neighbor graphs have the same limit. Shared Nearest Neighbor (SNN) measures have been touted as less prone to the curse of dimensionality than conventional distance measures and have been widely used in applications. In this work, we show that the large-scale asymptotics of an SNN graph Laplacian reach a consistent continuum limit, which is the same as that of a k-NN graph Laplacian. Moreover, we show that, with high probability, the pointwise convergence rate of the graph Laplacian is linear in the expected neighborhood radius.
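The contrast can be made concrete with a small NumPy sketch. This is not the paper's construction: the function names, the symmetrization, and the 1/k weight normalization are my own choices, and the paper's actual normalization and scaling of the Laplacians may differ. It only illustrates the two graph constructions being compared.

```python
import numpy as np

def knn_sets(X, k):
    # Index sets of the k nearest neighbors of each point (excluding itself).
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    np.fill_diagonal(D, np.inf)
    return np.argsort(D, axis=1)[:, :k]

def knn_laplacian(X, k):
    # Unnormalized Laplacian L = D - W of a symmetrized kNN graph.
    n = len(X)
    nbrs = knn_sets(X, k)
    W = np.zeros((n, n))
    for i in range(n):
        W[i, nbrs[i]] = 1.0
    W = np.maximum(W, W.T)          # symmetrize: edge if either point is a neighbor
    return np.diag(W.sum(axis=1)) - W

def snn_laplacian(X, k):
    # Unnormalized Laplacian of an SNN graph: the weight of edge (i, j)
    # is the fraction of k-nearest neighbors the two points share.
    n = len(X)
    nbrs = [set(row) for row in knn_sets(X, k)]
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            W[i, j] = W[j, i] = len(nbrs[i] & nbrs[j]) / k
    return np.diag(W.sum(axis=1)) - W

rng = np.random.default_rng(0)
X = rng.uniform(size=(200, 2))      # sample points from the unit square
L_knn = knn_laplacian(X, 15)
L_snn = snn_laplacian(X, 15)
```

Both matrices are symmetric with zero row sums, the basic structural properties shared by the two constructions whose continuum limits the paper compares.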
Superiority of GNN over NN in generalizing bandlimited functions. In this pioneering work, we confirmed mathematically the superior performance of GNNs over NNs in a node-focused task. In machine learning, such a task is known as semi-supervised learning; in mathematics, it resembles a sampling problem. We were the first to bring in sampling theory as an analytic angle for studying neural networks.
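The sampling-problem analogy rests on a classical fact, not specific to this paper: by the Whittaker-Shannon identity, a function bandlimited to frequencies below 1/2 is recovered exactly from its samples on the integers. A minimal sketch (with the grid truncated, so the reconstruction is only approximate):

```python
import numpy as np

def sinc_interp(samples, sample_pts, t):
    # Whittaker-Shannon interpolation: sum of shifted sinc kernels
    # weighted by the function's samples on the (truncated) integer grid.
    return sum(s * np.sinc(t - n) for s, n in zip(samples, sample_pts))

f = lambda t: np.sinc(0.4 * t) ** 2   # bandlimited to |freq| <= 0.4 < 1/2
n = np.arange(-100, 101)              # truncated integer sampling grid
t = 0.37                              # an off-grid evaluation point
approx = sinc_interp(f(n), n, t)
```

Since the test function decays quadratically, the truncation error at `t` is tiny and `approx` agrees with `f(t)` to high accuracy.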
Restricted Riemannian geometry for positive semidefinite matrices. We derived a notion of mean for a finite cluster of low-rank square matrices. To do so, we constructed a Riemannian geometry, via a low-rank Cholesky decomposition, on a dense, full-dimensional submanifold of n-by-p matrices, from which we derived closed-form endpoint geodesics, Fréchet means, and parallel transports.
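For orientation (my paraphrase, not the paper's exact statement): a rank-$p$ positive semidefinite $n \times n$ matrix admits a low-rank Cholesky-type factorization, and the mean in question is a Fréchet mean with respect to the distance $d$ induced by the constructed geometry:

```latex
P = LL^{\top}, \qquad L \in \mathbb{R}^{n \times p},
\qquad
\bar{P} \;=\; \operatorname*{arg\,min}_{P}\; \sum_{i=1}^{m} d\!\left(P, P_i\right)^{2}.
```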
Functions of nearly maximal Gowers-Host-Kra norms on Euclidean spaces. It is known that the functions that maximize their GHK norms on Euclidean spaces are generalized Gaussians. We showed that functions which nearly maximize their GHK norms are close to a Gaussian (in Lebesgue norm) up to a global phase. Continuous inequalities related to GHK norm inequalities range from Hölder's inequality and the Hausdorff-Young inequality to rearrangement inequalities and the isoperimetric inequality.
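For reference, the Gowers $U^k$ norm of $f$ on $\mathbb{R}^d$ can be written as follows (the standard definition, in my notation: $\mathcal{C}$ denotes complex conjugation and $|\omega| = \omega_1 + \cdots + \omega_k$):

```latex
\|f\|_{U^k(\mathbb{R}^d)}^{2^k}
= \int_{\mathbb{R}^d} \int_{(\mathbb{R}^d)^k}
\prod_{\omega \in \{0,1\}^k}
\mathcal{C}^{|\omega|} f\!\Big(x + \sum_{i=1}^{k} \omega_i h_i\Big)
\, dh_1 \cdots dh_k \, dx
```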
Technical reviews & math blogging
I have ventured into different mathematical areas. Some worked out; some didn't. Along the way, I've had the chance to read and digest fascinating papers. While some aided me in my work, others accompanied me for only a short stretch. Still, I want to record my opinions on interesting works that I've taken the time to analyze. Mostly I focus on the snippets I've learned from the writers, whether a proof technique or a new concept.
My summary of the paper "A probabilistic Takens theorem". As a fan of Whitney's embedding theorem, I couldn't pass up a chance to learn more about coordinate-delay embedding theorems.
My technical review of the paper "Scaling algorithms for unbalanced optimal transport problems", written in a simplified manner. I was most interested in understanding how the so-called scaling algorithms came about, and in the Thompson metric used for fixed-point convergence, even though these are only a fraction of the vast amount of information in the paper.
My technical review of the paper "Minimax estimation of smooth optimal transport maps". It is one of the first papers in statistical optimal transport that I read. Like the previous paper, this one also shares a tremendous amount of information. It's where I learned about minimax theory and statistical wavelet theory.
My technical review of the paper "Optimal sampling rates for approximating analytic functions from pointwise samples". It is where I learned about the stability problem in sampling theory, as well as the impossibility theorem. The topic is related to one of my own works.
My technical review of the paper "Equivalence of approximation by convolutional neural networks and fully-connected networks". It is one of my first reads on the mathematics of neural networks and undoubtedly one of my favorites, for the sheer rigor alone.
My summary/review of the paper "Stable phase retrieval in infinite dimensions". I learned about the many applications of the phase retrieval problem through this paper.
My semi-technical review of the paper "Condensation in preferential attachment models with location-based choice" by Haslegrave et al. This was one of my first exposures to the concept of an evolving, dynamical network.
My cheat sheet for extremizing Brascamp-Lieb inequalities and a discussion of an entropy lemma
Diagram showing how the Loomis-Whitney inequality can be obtained via:
applying a continuous entropy lemma on the hypergraph H=((x,y,z), (yz,xz,xy))
specifying a Brascamp-Lieb datum (B,p)
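Concretely, the endpoint of the diagram is the three-dimensional Loomis-Whitney inequality, which corresponds to the Brascamp-Lieb datum with projections $B_1(x,y,z) = (y,z)$, $B_2(x,y,z) = (x,z)$, $B_3(x,y,z) = (x,y)$ (matching the hyperedges $yz, xz, xy$ above) and exponents $p_1 = p_2 = p_3 = 1/2$:

```latex
\int_{\mathbb{R}^3} f_1(y,z)^{1/2}\, f_2(x,z)^{1/2}\, f_3(x,y)^{1/2} \, dx\,dy\,dz
\;\le\;
\left( \int_{\mathbb{R}^2} f_1 \right)^{1/2}
\left( \int_{\mathbb{R}^2} f_2 \right)^{1/2}
\left( \int_{\mathbb{R}^2} f_3 \right)^{1/2}
```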
Amendment
Amendment 1. An amended version of "Restricted Riemannian geometry for positive semidefinite matrices," also available on arXiv. The mathematical content is unchanged from the original publication in LAA. The revised version features updated Figures 1 and 2, rearranged for improved clarity and free of hand-drawn elements; since the content of the old and new figures is identical, a corrigendum is not necessary. The paper's main results remain intact: (1) the restricted geometry is diffeomorphic to a half-plane of the same dimension, and (2) the Cholesky map extends to a continuous map from a punctured plane to a Riemann surface diffeomorphic to a sphere punctured at the two poles.
Amendment 2. An updated version of "Functions of nearly maximal Gowers-Host-Kra norms on Euclidean spaces." This version differs from the published version in the following points:
improved presentation (better equation alignment and referencing)
typesetting errors corrected
references that were preprints at the time of publication are now cited with their published journal versions.
The mathematical content is unchanged.
Amendment 3. Some extra calculations that were cut from "Superiority of GNN over NN in generalizing bandlimited functions." In particular, I discuss how the approximation scheme using Taylor expansions adapted to an NN structure might require an extra logarithmic factor in the number of network weights compared with what a GNN would require.