- New Paper- Anna C. Gilbert and Rishi Sonthalia. Project and Forget: Bregman Projections with Cutting Planes.
- NEW SEMINAR: Saibal and I will be hosting a student machine learning seminar. More information about it can be found here.
- I will be a co-coordinator for Calculus 2 for Fall 2018.
- I will be chairing the session "Learning, Inference, and Hypothesis Testing" at the 2018 Allerton Conference, where I will also be presenting my paper.
- New Paper - Anna C. Gilbert and Rishi Sonthalia. Unrolling Swiss Cheese: Metric repair on manifolds with holes [ArXiv Version]
- New Paper - Anna C. Gilbert and Rishi Sonthalia. General Metric Repair [ArXiv Version]
Hi! I am a third-year Applied and Interdisciplinary Mathematics Ph.D. student at the University of Michigan. My advisor is Anna C. Gilbert. I did my undergrad at Carnegie Mellon University, where I obtained a B.S. in Discrete Math and Computer Science. I am interested in using mathematics to develop and analyze tools and algorithms for data science and machine learning.
I am also involved in starting and running a student machine learning seminar in the math department. More information about the upcoming talks can be found here.
More details about each project can be found here.
- Representation Learning: Given data, I am interested in learning good representations. In particular, I wish to do this by first learning an appropriate metric and then learning an embedding. This may be a direct embedding, or one given in the form of charts.
- Metric Repair: Given a weighted graph, we want to "repair" the smallest number of weights so that the resulting graph satisfies the metric (triangle inequality) conditions. The goal of this project is twofold: develop algorithms that use metric repair for ML applications, and analyze metric repair from the viewpoints of complexity theory and graph algorithms.
- Filling in Missing Data in a Sequential Model: Given a piece of text with missing characters, the goal is to fill in those characters.
- Nonlinear Independent Component Analysis: Develop a stable and robust method for nonlinear independent component analysis.
- Theoretical Justification for Dropout: We want to prove that using Dropout while training a neural network reduces overfitting.
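To make the metric repair problem above concrete, here is a minimal Python sketch (my own illustration, not an algorithm from the papers): it counts triangle-inequality violations in a distance matrix, then repairs them with a decrease-only shortest-path relaxation (Floyd-Warshall). This relaxation always produces a valid metric, but unlike the metric repair objective, it does not minimize the number of weights changed.

```python
def triangle_violations(D):
    # Count pairs (i, j) and witnesses k with D[i][j] > D[i][k] + D[k][j],
    # i.e. edges that violate the triangle inequality.
    n = len(D)
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(n):
                if k != i and k != j and D[i][j] > D[i][k] + D[k][j]:
                    count += 1
    return count

def decrease_only_repair(D):
    # Shrink each violating weight down to the shortest-path distance
    # (Floyd-Warshall). The result satisfies the triangle inequality,
    # though it may modify more weights than the optimal repair.
    D = [row[:] for row in D]  # work on a copy
    n = len(D)
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if D[i][k] + D[k][j] < D[i][j]:
                    D[i][j] = D[i][k] + D[k][j]
    return D

# Example: a 3-point "metric" where one edge (0-2) is too heavy.
D = [[0, 1, 5],
     [1, 0, 1],
     [5, 1, 0]]
print(triangle_violations(D))                        # → 1  (5 > 1 + 1)
print(triangle_violations(decrease_only_repair(D)))  # → 0  after repair
```

Decreasing weights is only one side of the problem; the general version also allows increasing weights, which is where the combinatorial difficulty comes in.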
If you have any questions, ideas you want to discuss, or just want to talk about math and computer science, you can find ways to reach me under the Contact Me tab.