The DRP Symposium was held in-person this semester. Projects are sorted by room, in order of appearance. There were 48 projects this semester.
Analysis
Abstract: I will give an explicit formula for the heat equation on the real line using the Fourier transform method.
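For reference, the explicit formula the abstract refers to, assuming the normalized equation $u_t = u_{xx}$ on the real line with initial data $u(x,0) = f(x)$, is convolution with the heat kernel:

```latex
u(x,t) = \frac{1}{\sqrt{4\pi t}} \int_{-\infty}^{\infty} e^{-\frac{(x-y)^2}{4t}}\, f(y)\, dy
```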
Mentor: Jeffrey Cheng
Abstract: In this talk, we will explore properties of the Fourier Transform and see how it can be applied to find the solution to the Schrödinger Equation. We will also explore some properties of solutions to the Schrödinger Equation using analysis.
Mentor: Zachary Lee
Abstract: In this talk, I will demonstrate the method of the windowed Fourier Transform for producing the spectrogram, a graph displaying information about the time and frequency content of an input wave function, with applications in linguistics, spectroscopy, and image processing. To analyze time and frequency together, a window function convolved with the signal over time is used to track the time-evolving frequencies present. One of the central challenges is choosing the optimal window. How is this optimal window chosen to give the most information about both time and frequency simultaneously? How does this relate to the uncertainty principle? What can happen if we veer too far from the optimal window size? Additionally, what sorts of windows are used, and what are the applications of each?
Mentor: Justin Toyota
Abstract: Given a simple, closed, smooth curve of fixed length, we want to find the maximum area of the region it bounds. We will use Fourier series to establish an upper bound on the area.
Mentor: Sai Sivakumar
Abstract: This talk will cover the basics of measure theory, discussing the motivation behind it and presenting some examples of non-measurable sets.
Mentor: Aaron Benda
Abstract: Focusing on the Hilbert-space version of the Riesz Representation Theorem, we'll cover the one-to-one correspondence between continuous linear functionals and inner products against a unique vector. Along the way, we'll tackle examples in L², and see how this theorem underlies Green's identities in PDE boundary problems.
Mentor: Justin Toyota
Dynamics and Probability
Abstract: In this talk, I will briefly introduce the concept of measure in dynamical systems and then show a proof of the recurrence theorem along with applications.
Mentor: Aaron Benda
Abstract: In this talk, we will present the weak law of large numbers (WLLN), some of the ideas necessary to develop the theorem, and the proof of the theorem. We will also discuss some applications of the WLLN with practical examples and simple simulations.
Mentor: Jake Wellington
Abstract: This talk focuses on the applications of real analysis to probability. We will begin by discussing how we can think of measures of sets as probabilities of events. We will then introduce the different notions of function convergence, focusing specifically on pointwise convergence and convergence in measure. Then, we will introduce L^p spaces. Ultimately, we will use these building blocks to prove the Borel-Cantelli Lemma, Markov’s Inequality, and end with the Strong Law of Large Numbers.
Mentor: Zach Richey
Abstract: This presentation introduces the concept of a symmetric random walk, a fundamental example of a stochastic process. We focus on the mathematical structure of the walk, where each step is an independent random variable taking values +1 or −1 with equal probability. Using this model, we derive key properties such as the expectation and variance of the walker's position after n steps. We show that the expected position is zero and that the variance grows linearly with time, leading to a standard deviation of √n. These results imply that while the average position remains centered, the uncertainty increases over time.
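The expectation and variance described in the abstract can be checked numerically; a minimal simulation sketch (the step counts and sample sizes below are illustrative choices, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(0)
n_steps = 1000      # steps per walk (n)
n_walks = 20000     # independent walkers

# Each step is an independent +1 or -1 with equal probability.
steps = rng.choice([-1, 1], size=(n_walks, n_steps))
positions = steps.sum(axis=1)  # walker positions after n steps

mean = positions.mean()  # should be near 0
var = positions.var()    # should be near n_steps
print(mean, var)
```

The sample mean stays near zero while the sample variance tracks the number of steps, matching the √n growth of the standard deviation.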
Mentor: Lewis Liu
Abstract: This talk introduces the Birkhoff Ergodic Theorem, a key result in ergodic theory that connects ideas from dynamical systems and measure theory. I will explain what it means for a system to be measure-preserving and ergodic, and how these ideas help us understand long-term behavior. The goal is to give an intuitive sense of what the theorem says and why it’s important.
Mentor: Aaron Benda
Dynamics and Probability
Abstract: An overview of the definition of regular continued fractions and the patterns that emerge in their structure when representing various types of numbers.
Mentor: Martha Hartt
Abstract: This talk will explore the math behind why some digits appear more often as the leading digit in real data, and how it can be applied to dynamical systems.
Mentor: Aaron Benda
Abstract: This talk will explore the fascinating world of continuous but nowhere differentiable functions through the lens of analysis and its surprising connections to randomness and machine learning. We will begin by revisiting the foundational concepts of continuity and differentiability before introducing the Weierstrass function—a pathological function that is continuous everywhere but differentiable nowhere. From there, we'll pivot to real-world applications, including Brownian motion in stochastic processes and the surprising role of non-smoothness in modern machine learning.
Mentor: Sara Ansari
Abstract: This talk introduces the foundational ideas of measure theory, including sigma-algebras, measures, and the construction of Lebesgue measure on the real line. We’ll explore how the Lebesgue integral generalizes the Riemann integral, enabling the integration of more complex functions and providing a rigorous framework for modern analysis and probability.
Mentor: Luisa Velasco
Abstract: I introduce martingales, the martingale transform, the Doob decomposition theorem, and an application of the decomposition theorem to American options.
Mentor: Luisa Velasco
Algebra and Number Theory
Abstract: I will be talking about the group law given by the addition operation on elliptic curves. I will then talk about recent developments in elliptic curves and elliptic curve cryptography.
Mentor: Abhishek Shivkumar
Abstract: This talk discusses introductory theory regarding partitions and generating functions, with a short showcase of different problems in the field worked on by the great Ramanujan and others.
Mentor: Abhishek Shivkumar
Abstract: Humans have a lot to feel guilty about. One thing to be proud of is the Frobenius theory of representations. Representation theory is the study of how groups—abstract collections of symmetries—can be realized as linear transformations of vector spaces. In this talk, we'll explore what a representation is and how Frobenius's vision lets us understand the structure of groups through the lens of linear algebra.
Mentor: Ryan Wandsnider
Abstract: In this talk, we will give a brief introduction to Lie groups and their structure and to matrix representations of Lie groups, then transition into a discussion of tangent spaces, the exponential map, and Lie algebras.
Mentor: John Teague
Applied Mathematics
Abstract: We know a smile when we see one—but how can we teach a machine to do the same? In this talk, we walk through a simple and intuitive approach to training a machine to recognize smiling faces. Using just 128 labeled images and a few meaningful measurements between facial landmarks, we show how to turn images into data, build a mathematical model, and use it to predict whether someone is smiling.
We’ll explore how to represent data as matrices, formulate a predictive model as a linear combination of features, and solve for the optimal weights using least squares. This journey offers an approachable introduction to the mathematical foundations of classification—covering matrix-vector multiplication, linear models, and basic optimization—while keeping a strong connection to real-world intuition. No prior machine learning experience required.
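The pipeline the abstract describes can be sketched in a few lines; the features and labels below are synthetic stand-ins for the talk's landmark measurements and 128 labeled images, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in for facial-landmark measurements:
# each row is one face, each column one distance-based feature.
n_samples, n_features = 128, 4
X = rng.normal(size=(n_samples, n_features))
true_w = np.array([1.5, -2.0, 0.5, 1.0])  # hypothetical "true" rule
# Labels: +1 = smiling, -1 = not, from a noisy linear rule.
y = np.sign(X @ true_w + 0.1 * rng.normal(size=n_samples))

# Append a constant column so the model learns a bias term.
A = np.hstack([X, np.ones((n_samples, 1))])

# Least squares: choose w to minimize ||A w - y||^2.
w, *_ = np.linalg.lstsq(A, y, rcond=None)

# Predict by the sign of the linear score.
preds = np.sign(A @ w)
accuracy = (preds == y).mean()
print(accuracy)
```

The same three steps (data as a matrix, a linear model, least squares for the weights) carry over directly to real image features.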
Mentor: Jake Wellington
Abstract: This talk mainly covers two multi-stage gradient descent methods, gradient descent with Chebyshev stepsizes and OERKD, including their motivation and a comparison of the two.
Mentor: Lewis Liu
Abstract: An overview of the foundations of mathematical cryptography, focusing on the RSA cryptosystem.
Mentor: Isaac Hellerman
Abstract: At their core, Large Language Models (LLMs) are series of very large matrix multiplications, which transform a textual input into textual output (represented as very high-dimensional embedding vectors). The high dimensionality of these matrices and the relative slowness of computing matrix multiplications result in performance bottlenecks for LLMs, especially when they are operating on low-resource systems. The purpose of this project is to investigate the impact of using matrix decomposition methods to reduce the dimensionality of our model's layers on the accuracy of the model's inference. The model we choose to experiment on is TinyLlama, a small-scale LLM with around 1 billion parameters. The decomposition methods we investigate are a simple SVD and a random projection method inspired by the Johnson-Lindenstrauss lemma. We evaluate the accuracy of the decomposed model by testing on various benchmarks used to evaluate TinyLlama. Through careful analysis of the matrix multiplications being performed during inference, we explore which specific layers and matrices are most conducive to decomposition, and what impact decomposing various parts of the model has on the output. Through this, we learn the effectiveness of matrix decomposition tools for improving the efficiency of LLMs.
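The SVD-based layer compression the abstract describes can be sketched on a generic weight matrix; the matrix below is a random stand-in (not TinyLlama's actual weights), and the dimensions and rank are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# Stand-in for one dense layer's weight matrix: low intrinsic
# rank plus a little noise, so truncation loses little.
d_out, d_in, true_rank = 256, 512, 16
W = rng.normal(size=(d_out, true_rank)) @ rng.normal(size=(true_rank, d_in))
W += 0.01 * rng.normal(size=(d_out, d_in))

# Truncated SVD: keep only the top-k singular directions.
k = 32
U, s, Vt = np.linalg.svd(W, full_matrices=False)
A = U[:, :k] * s[:k]    # d_out x k factor
B = Vt[:k, :]           # k x d_in factor

# Applying A @ (B @ x) costs k * (d_in + d_out) multiplies
# instead of d_in * d_out for W @ x.
x = rng.normal(size=d_in)
err = np.linalg.norm(W @ x - A @ (B @ x)) / np.linalg.norm(W @ x)
print(err)  # small, since W is close to rank true_rank <= k
```

The trade-off the project studies is exactly this one: how much the relative error grows, per layer, as k shrinks below the layer's effective rank.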
Mentor: Luke Galvan
Abstract: Nowadays, an enormous amount of information can make data analysis complicated, as our computers may not be able to store and process so much information at once. For this reason, techniques such as the SVD have been applied to reduce the so-called data matrix. However, some of these algorithms may be computationally expensive. Therefore, in this presentation, I will discuss random projections as a less computationally expensive alternative for reducing our data matrix in some scenarios. I will talk about the Johnson-Lindenstrauss lemma, how to apply it in the K-Nearest Neighbors classifier, and its limitations.
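A minimal sketch of the random-projection idea, using one standard Gaussian construction (the data here is random and the dimensions are illustrative, not from the talk):

```python
import numpy as np

rng = np.random.default_rng(3)

# Data matrix: n points in d dimensions, projected down to k.
n, d, k = 200, 1000, 300
X = rng.normal(size=(n, d))

# Gaussian random projection, scaled so squared distances are
# preserved in expectation (one standard JL construction).
R = rng.normal(size=(d, k)) / np.sqrt(k)
Y = X @ R

# Compare one pairwise distance before and after projection.
i, j = 0, 1
orig = np.linalg.norm(X[i] - X[j])
proj = np.linalg.norm(Y[i] - Y[j])
ratio = proj / orig
print(ratio)  # close to 1 for k large enough
```

Because K-Nearest Neighbors only consults pairwise distances, approximately preserving them is enough for the classifier to behave similarly in the reduced space.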
Mentor: Addie Duncan
Abstract: This talk will introduce Wasserstein Generative Adversarial Networks (WGANs) and how they can address some of the issues associated with GAN training. It will start with a brief overview of GAN theory and an example where some of the traditional methods for the loss function do not suffice. We will then discuss the WGAN formulation using Wasserstein-1 distance and how this can be applied to the problem to ensure training stability and model convergence.
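The Wasserstein-1 distance underlying the WGAN formulation has a simple closed form in one dimension: for two equal-size empirical distributions it is the average distance between order statistics. A minimal sketch of the metric itself (not of WGAN training; the two sample distributions below are illustrative):

```python
import numpy as np

def wasserstein1_1d(a, b):
    """W1 between two equal-size 1-D empirical distributions:
    the mean absolute difference of sorted samples."""
    a, b = np.sort(a), np.sort(b)
    return np.abs(a - b).mean()

rng = np.random.default_rng(4)
n = 100_000
real = rng.normal(loc=0.0, scale=1.0, size=n)
fake = rng.normal(loc=2.0, scale=1.0, size=n)

d = wasserstein1_1d(real, fake)
print(d)  # near 2.0: W1 between N(0,1) and N(2,1) is the mean shift
```

Unlike divergences that saturate when supports barely overlap, this distance varies smoothly with the shift, which is the property WGANs exploit for stable training.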
Mentor: Patrícia Muñoz Ewald
Geometry
Abstract: This talk will provide an introduction or review of planar curves and their curvature, then introduce a few methods for applying properties of curvature to analyze discrete planar curves.
Mentor: Aru Mukherjea
Abstract: This presentation explores Huygens' Tautochrone Clock by examining the historical motivations and geometric properties of the tautochrone curve. The invention of Huygens' Tautochrone Clock was prompted by the longitude problem. Naval exploration during the 17th and 18th centuries was hindered by the inability to determine one's longitude while at sea, which led to many maritime disasters. Huygens, credited with the creation of pendulum clocks, discovered that a certain frictionless arc, called a tautochrone curve, exhibits a unique property: the time for a pendulum to reach the bottom is independent of the starting point. In this presentation we will prove that an inverted cycloid is a tautochrone curve. This presentation will also briefly explore the modern applications of tautochrone curves.
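For context, a standard statement of the result the abstract will prove, assuming a generating circle of radius $a$ and gravitational acceleration $g$: the (inverted) cycloid can be parametrized as

```latex
x(\theta) = a(\theta - \sin\theta), \qquad y(\theta) = a(1 - \cos\theta),
```

and the frictionless descent time to its lowest point is $T = \pi\sqrt{a/g}$, independent of the starting point on the curve.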
Mentor: Iris Jiang
Abstract: Einstein’s General Theory of Relativity introduces the notion of curved spacetimes and the emergence of gravity. To mathematically formalize physics in a non-Euclidean space, we are required to use differentiable manifolds. In Einstein’s theory, we assume spacetime to be a 4D differentiable manifold—a kind of blob. In this talk, I will introduce the ideas leading up to the development of Einstein’s theory of gravity, as well as the basic concepts of differentiable manifolds. I will then present the role these manifolds play in constructing a successful theory of gravity.
Mentor: Remy Bohm
Abstract: An introduction to de Rham cohomology using rotation and gradient operators. This includes examples of computing de Rham cohomology groups, as well as a heuristic geometric interpretation of these groups.
Mentor: Daniel Koizumi
Abstract: An introduction to Morse theory and the Morse inequalities, which help us compute the homology groups of a topological space.
Mentor: Toby Aldape
Abstract: Noether's theorem from the viewpoint of differential geometry and group actions.
Mentor: Jacob Gaiter
Topology
Abstract: This talk will be a simple introduction to differential topology, discussing topological spaces and smooth manifolds.
Mentor: Adrian Flores
Abstract: In this talk, we cover cellular homology, which is a way to distinguish topological spaces via algebraic methods. In particular, we will take the homology of a surface and distinguish a pair of surfaces.
Mentor: Ansel Goh
Abstract: This talk will introduce the Five Lemma, a result in homological algebra that is useful in algebraic topology. We will define kernels and images in the context of exact sequences. Then, we’ll prove this lemma, which gives us a way to prove that a map is an isomorphism in a particular circumstance.
Mentor: Ansel Goh
Abstract: In this talk, we define what a knot is and explore concepts such as embeddings, homeomorphisms, isotopy, ambient isotopy, and homotopy. Then we discuss Seifert surfaces, from which we derive knot invariants such as the minimum genus number. We also plan to provide examples of constructing a Seifert surface associated with a given knot and demonstrate why Seifert surfaces themselves are not knot invariants. Finally, we explore some recent research topics involving knots and Seifert surfaces.
Mentor: Audrick Pyronneau
Abstract: How to hang a picture on a wall with 5 nails and a string so that the removal of any nail causes the picture to fall? We will show how the notion of fundamental groups gives an elegant answer to this demanding engineering question. We will also show how the famous Brouwer fixed point theorem has a neat proof using the fundamental groups.
Mentor: Jayden Wang
Topology
Abstract: This talk will introduce knots, links, the Reidemeister theorem, and the Jones polynomial, making connections to topological quantum field theory in physics.
Mentor: Ian Montague
Abstract: In this talk, I will explore the concept of 2-dimensional knots in 4-dimensional spaces. Specifically, I will provide a method of visualizing knots in 4 dimensions through a motion picture diagram, then I will define slice knots and outline how we can use the concordance class of these knots to create an abelian group.
Mentor: Nathan Louie
Abstract: The Yoneda Lemma is a theorem from category theory. It is an abstract result about functors of the form "morphisms into a fixed object." I'll give a brief introduction to the basics of category theory and then explain the theorem and its proof.
Mentor: Zimao Tian
Abstract: Homotopy type theory is a new approach to foundations of mathematics built around the univalence axiom, which states that identity is equivalent to equivalence. This talk will include an overview of the motivation behind homotopy type theory, introduce some basic definitions and results, and highlight some applications in formalizing mathematics and computer science.
Mentor: Mark Saving
Abstract: This talk will introduce the fundamental group and higher homotopy groups of a topological space, then cover the homotopy excision and Freudenthal suspension theorems, and end by defining stable homotopy theory.
Mentor: Jemma Schroder
Geometric Group Theory
Abstract: This talk explores Tutte’s 1-Factor Theorem—a foundational result in graph theory that characterizes when a graph has a perfect matching. Beginning with essential definitions such as matchings, 1-factors, and odd components, we introduce Tutte’s condition: a graph has a 1-factor if and only if the number of odd components in any subgraph G - S does not exceed the size of S. The proof highlights the structural implications of unmatched vertices and their connection to Hall’s Marriage Theorem. Beyond theory, we demonstrate how the theorem applies to real-world problems. This talk aims to bridge theoretical insights with practical applications through rigorous reasoning and visual illustrations.
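Tutte's condition as stated above can be checked by brute force on small graphs; a sketch, using an illustrative 4-cycle (exponential in the number of vertices, so only suitable for toy examples):

```python
import itertools

# Toy graph: a 4-cycle, which does have a perfect matching.
vertices = [0, 1, 2, 3]
edges = {frozenset(e) for e in [(0, 1), (1, 2), (2, 3), (3, 0)]}

def odd_components(verts, edges, removed):
    """Count connected components of odd size in G - removed."""
    remaining = set(verts) - set(removed)
    seen, count = set(), 0
    for v in remaining:
        if v in seen:
            continue
        # Depth-first search to collect v's component.
        stack, comp = [v], set()
        while stack:
            u = stack.pop()
            if u in comp:
                continue
            comp.add(u)
            stack.extend(w for w in remaining
                         if w not in comp and frozenset((u, w)) in edges)
        seen |= comp
        if len(comp) % 2 == 1:
            count += 1
    return count

# Tutte's condition: for every vertex subset S, the number of
# odd components of G - S does not exceed |S|.
tutte_holds = all(
    odd_components(vertices, edges, S) <= len(S)
    for r in range(len(vertices) + 1)
    for S in itertools.combinations(vertices, r)
)
print(tutte_holds)  # True, consistent with the matching {01, 23}
```

Removing, say, the single vertex 0 leaves the odd path 1-2-3, so the bound 1 ≤ 1 is tight there; graphs without perfect matchings fail the inequality for some S.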
Mentor: Suraj Dash
Abstract: In this talk, we investigate the free group and why it satisfies the axioms required to be a group. We especially investigate word concatenation and word reduction in the example of the free group on two generators.
Mentor: Erin Bevilacqua
Abstract: Through the semester, I've read portions of the book 'Office Hours with a Geometric Group Theorist' and will specifically discuss quasi-isometries. We begin by exploring mappings between two objects (bijections, isomorphisms, and then isometries), then explore what it means to be a quasi-isometry and attempt to motivate the strength of the definition.
Mentor: Erin Bevilacqua
Abstract: A discussion of Cayley graphs and Gromov-hyperbolic groups as a lead-up to the Milnor-Švarc Lemma, curve complexes, and Ivanov's Theorem.
Mentor: Aislinn Smith
Abstract: I'll be presenting the Švarc-Milnor lemma, hyperbolic groups, and quasi-isometric rigidity (and will give non-trivial examples of Q-I rigidity).
Mentor: Aislinn Smith
Abstract: We develop curve complexes and mapping class groups on a punctured sphere, then discuss how this forms a hyperbolic group.
Mentor: Aislinn Smith