News
Will be at the Aspen Center for Physics for the workshop on Theoretical Physics and Deep Learning Theory.
New Paper - Training Data Size Induced Double Descent For Denoising Feedforward Neural Networks and the Role of Training Noise accepted at TMLR.
Talk at Flatiron on Surprises in Denoising and Linear Regression on March 31st
Talk at Johns Hopkins on Metric Repair - March 28th
Award - Top Reviewer at NeurIPS 2022
New Paper - Paper from mentoring REU students: Knowledge Graphs for QAnon Twitter Network at IEEE BigData Workshop.
New Paper - Paper from mentoring student: Hyperbolic and Mixed Geometry Neural Networks at NeurIPS NeurReps workshop.
New Paper - Project and Forget was accepted at JMLR
New Paper - Predicting the Future of AI with AI: High-quality link prediction in an exponentially growing knowledge network
Award - Peter Smereka Award for Best Applied Math Thesis
Award - AMS Simons Travel Grant
I will be at the Max Planck Institute (MPI) in Leipzig over the summer of 2022.
I will be at the AMS MRC on Data Science in June 2022.
New Paper - ICLR 2022 Challenges for Computational Geometry and Topology at PMLR 2022
New Paper - CubeRep: Learning Relations Between Different Views of Data at PMLR 2022
New Paper - Dynamic Embedding-based Methods for Link Prediction in Machine Learning Semantic Network at IEEE BigData Conference 2021
New Paper - Paper from mentoring REU students: An Analysis of COVID-19 Knowledge Graphs Construction and Application at IEEE BigData Conference 2021.
New Paper - Dual Regularized Optimal Transport accepted at Optimal Transport and Machine Learning Workshop at NeurIPS 2021. Work with Anna C. Gilbert. [ArXiv Link]
New Paper - What can go wrong with multidimensional scaling? Accepted at NeurIPS 2021! Work with Anna Gilbert, Ben Raichel, and Greg Van Buskirk
About Me
Hi! I am a Hedrick Assistant Adjunct Professor at UCLA under Andrea Bertozzi, Jacob Foster, and Guido Montufar. I obtained my Ph.D. in Applied and Interdisciplinary Mathematics from the University of Michigan. My advisors were Anna C. Gilbert and Raj Rao Nadakuditi. I did my undergrad at Carnegie Mellon University, where I obtained a B.S. in Discrete Math and Computer Science.
I am interested in using mathematics to develop and analyze tools and algorithms for data science and machine learning.
Current Projects
More details about each project can be found in my research statement.
Denoising Autoencoders - Denoising autoencoders work by learning a map from noisy data to denoised data. Hence, we need noisy training data to train the neural network. Currently, this noise is either added in an ad hoc manner or added so that the training data SNR matches the test data SNR. I am currently working on theoretically determining what the training data SNR should be.
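As a concrete illustration of the setup above, here is a minimal sketch of how one might generate (noisy input, clean target) training pairs at a prescribed SNR. The function name, the dB convention for SNR, and the toy signal are my own illustration, not the setup from the paper.

```python
import numpy as np

def add_noise_at_snr(clean, snr_db, rng=None):
    """Corrupt clean data with Gaussian noise at a target SNR.

    Illustrative convention (an assumption, not the paper's):
    SNR = signal power / noise power, expressed in dB, so the
    noise variance is signal_power / 10**(snr_db / 10).
    """
    rng = np.random.default_rng() if rng is None else rng
    signal_power = np.mean(clean ** 2)
    noise_power = signal_power / (10 ** (snr_db / 10))
    noise = rng.normal(0.0, np.sqrt(noise_power), size=clean.shape)
    return clean + noise

# Toy training pair for a denoiser: the network would be trained
# to map `noisy` back to `clean`.
clean = np.sin(np.linspace(0, 2 * np.pi, 256))
noisy = add_noise_at_snr(clean, snr_db=10.0)
```

The open question described above is which value of `snr_db` to use when corrupting the training data, given the SNR expected at test time.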
Using Hyperbolic Geometry for Machine Learning
Generative Modelling
Algebraic Structure of Graphical Models
Past Projects
See publications for more prior projects.
If you have any questions, ideas you want to discuss, or just want to talk about math and computer science, you can find ways to reach me under the contact me tab.