A cheat sheet for Brascamp-Lieb inequalities
I have ventured into different mathematical areas. Along the way, I've had the chance to read and digest fascinating papers. While some aided my work, others accompanied me only for a short stretch. Still, I want to record my opinions on interesting works that I've taken the time to analyze. Mostly, I focus on the snippets I've learned from the authors, whether a proof technique or a new concept.
My summary of the convergence of discrete harmonic extensions, based on the paper "Random walk on sphere packings and Delaunay triangulations in arbitrary dimension". The key idea is a quantitative uniform-convergence result for a suitably adapted discrete Dirichlet problem, obtained by leveraging an energy-control estimate.
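For orientation, here is a minimal statement of the discrete Dirichlet problem in standard graph notation; the conductances c(x,y) and boundary data g are generic placeholders, not necessarily the paper's setup.

```latex
% Discrete Dirichlet problem on a finite weighted graph G = (V, E)
% with conductances c(x,y) > 0 and boundary data g on \partial V:
% find h : V \to \mathbb{R} with h = g on \partial V and, for
% every interior vertex x,
\sum_{y \sim x} c(x,y)\,\bigl(h(y) - h(x)\bigr) = 0.
% Equivalently, h minimizes the Dirichlet energy
\mathcal{E}(h) = \tfrac{1}{2} \sum_{x \sim y} c(x,y)\,\bigl(h(y) - h(x)\bigr)^2,
% and energy-control estimates bound \mathcal{E}(h) along a sequence
% of discretizations to force uniform convergence.
```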
My thoughts on the coupling of Hebbian learning and preferential attachment. This coupling suggests the potential for applying free probability to the study of spiking neural networks.
My summary of the paper "A probabilistic Takens theorem". As a fan of Whitney's embedding theorem, I couldn't pass up a chance to learn more about delay-coordinate embedding theorems.
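For readers new to the topic, here is the delay-coordinate map that such theorems concern; the notation is mine, not the paper's.

```latex
% Delay-coordinate map for a dynamical system T : X \to X observed
% through a scalar function h : X \to \mathbb{R}:
\Phi_{h,k}(x) = \bigl( h(x),\, h(Tx),\, \dots,\, h(T^{k-1}x) \bigr) \in \mathbb{R}^k.
% Takens-type theorems give conditions on the number of delays k
% under which \Phi_{h,k} is injective (an embedding); classically
% k \ge 2d + 1 suffices for generic h when X is a compact d-manifold,
% and the probabilistic versions replace "generic" by "almost every"
% observable in a prevalence sense.
```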
My technical review of the paper "Scaling algorithms for unbalanced optimal transport problems ", written in a simplified manner. I was most interested in understanding how the so-called scaling algorithms came about and the Thompson metric for fixed-point convergence, even though they are only a fraction of the vast amount of information given in the said paper.
My technical review of the paper "Minimax estimation of smooth optimal transport maps ". It is one of the first papers in statistical optimal transport that I've read. Like the previous paper, this one also shares a tremendous amount of information. It's where I learned about min-max theory and statistical wavelet theory.
My technical review of the paper "Optimal sampling rates for approximating analytic functions from pointwise samples ". It is where I learned about the stability problem in sampling theory as well as the Impossibility theorem. The topic is related to one of my works.
My technical review of the paper "Equivalence of approximation by convolutional neural networks and fully-connected networks ". It is one of my first reads on the mathematics of neural networks and undoubtedly one of my favorite reads, just for the sheer rigorousness alone.
My summary/review of the paper "Stable phase retrieval in infinite dimensions". Through this paper, I learned about the many applications of the phase retrieval problem.
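For context, here is the abstract phase retrieval problem in standard frame-theoretic notation (not necessarily the paper's exact setting):

```latex
% Phase retrieval: given a frame (\varphi_i)_{i \in I} for a
% Hilbert space H, recover f \in H from the phaseless measurements
\bigl( \lvert \langle f, \varphi_i \rangle \rvert \bigr)_{i \in I},
% necessarily only up to a global phase, i.e. f is identified with
% c f for any scalar c with |c| = 1.  "Stable" asks the inverse map
% from measurements back to the equivalence class of f to be
% Lipschitz.
```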
My semi-technical review of the paper "Condensation in preferential attachment models with location-based choice" by Haslegrave et al. This was one of my first exposures to the concept of an evolving, dynamical network.
My cheat sheet for extremizing Brascamp-Lieb inequalities, with a discussion of an entropy lemma.
Diagram showing how the Loomis-Whitney inequality can be obtained via (both routes are sketched after this list):
applying a continuous entropy lemma on the hypergraph H = ({x,y,z}, {yz, xz, xy})
specifying a Brascamp-Lieb datum (B,p)
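A sketch of both routes in standard notation, with normalizations glossed over:

```latex
% Route 1: the continuous Shearer/entropy lemma on the hypergraph
% H = ({x,y,z}, {yz, xz, xy}), where each vertex lies in exactly
% two edges, gives for any random vector (X, Y, Z) with a density
h(X, Y, Z) \le \tfrac{1}{2}\bigl[ h(Y, Z) + h(X, Z) + h(X, Y) \bigr],
% which exponentiates to the Loomis-Whitney inequality
\int_{\mathbb{R}^3} f_1(y,z)\, f_2(x,z)\, f_3(x,y)\, dx\, dy\, dz
  \le \| f_1 \|_{L^2} \| f_2 \|_{L^2} \| f_3 \|_{L^2}.
% Route 2: this is the Brascamp-Lieb inequality for the datum
% (B, p), where B_j : \mathbb{R}^3 \to \mathbb{R}^2 is the
% coordinate projection forgetting the j-th variable and the
% exponents are p = (1/2, 1/2, 1/2), with BL constant 1.
```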
Amendment 1. An amended version of "Restricted Riemannian geometry for positive semidefinite matrices", also available on arXiv. The mathematical content is unchanged from the original publication in LAA. However, the revised version features updated Figures 1 and 2, rearranged for improved clarity.
Amendment 2. An updated version of "Functions of nearly maximal Gowers-Host-Kra norms on Euclidean spaces." This version differs from the published version in the following points:
improved presentation (better equation alignment and referencing)
typesetting errors corrected
references that were preprints at the time of publication are now cited with their corresponding journals.
The mathematical content is unchanged.
Amendment 3. (Probably outdated.) Some extra calculations that were cut from "Superiority of GNN over NN in generalizing bandlimited functions". In particular, I discussed how an approximation scheme using Taylor expansions adapted to an NN structure might require an extra logarithmic factor in the number of network weights compared with what a GNN would need.
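A hedged sketch of where such a logarithmic factor typically enters Taylor-based schemes; this is the standard Yarotsky-type mechanism, not necessarily the exact accounting in the cut calculations.

```latex
% Taylor schemes assemble local polynomials from products, and a
% ReLU network needs on the order of \log(1/\delta) weights to
% realize a single product to accuracy \delta.  Approximating a
% target function to accuracy \epsilon with N(\epsilon) localized
% Taylor patches therefore costs roughly
W_{\mathrm{NN}}(\epsilon) = O\bigl( N(\epsilon)\, \log(1/\epsilon) \bigr)
% weights, whereas an architecture with exact (or cheap)
% multiplications can avoid the \log(1/\epsilon) factor.
```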