The presentation sessions for the Summer 2025 iteration of the DRP were held on August 28 from 9AM to 11AM in SEO 636. There were six presentations:
From Monte Carlo Simulations to Analytic Solutions of Ising Models
Myles Allred, mentored by Jennifer Vaccaro
Abstract: This talk will first use a simple toy model to introduce the basic toolkit of statistical mechanics for analyzing systems with many interacting degrees of freedom. I will then talk about the significantly more complicated frustrated 2D lattice model and mean-field Sherrington-Kirkpatrick (SK) models, using Markov chain Monte Carlo simulations to illustrate their distinct energetic and magnetic behaviors. Finally, I will briefly introduce the Replica Method, an analytic technique that provides an exact solution to the SK model and yields very accurate physical predictions, despite ongoing debate over its mathematical rigor.
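As a rough illustration of the Markov chain Monte Carlo simulations mentioned above, here is a minimal Metropolis sketch, assuming a plain ferromagnetic 2D Ising model with periodic boundaries (a simpler, unfrustrated cousin of the frustrated lattice and SK models in the talk; lattice size, inverse temperature, and sweep count below are arbitrary choices):

```python
import numpy as np

def metropolis_sweep(spins, beta, rng):
    """One Metropolis sweep over a 2D +/-1 Ising lattice with periodic boundaries."""
    L = spins.shape[0]
    for _ in range(L * L):
        i, j = rng.integers(0, L, size=2)
        # Sum of the four nearest neighbours (with wrap-around).
        nb = (spins[(i + 1) % L, j] + spins[(i - 1) % L, j]
              + spins[i, (j + 1) % L] + spins[i, (j - 1) % L])
        dE = 2 * spins[i, j] * nb          # energy change if spin (i, j) is flipped
        if dE <= 0 or rng.random() < np.exp(-beta * dE):
            spins[i, j] *= -1
    return spins

rng = np.random.default_rng(0)
spins = rng.choice([-1, 1], size=(16, 16))
for _ in range(200):
    metropolis_sweep(spins, beta=0.6, rng=rng)
print("magnetization per spin:", spins.mean())
```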
Groups, Cayley Graphs, and the Word Problem
Will Brown, mentored by Shin Kim
Abstract: Groups, both finite and infinite, can be represented by directed labeled graphs called Cayley graphs, and by a presentation given with respect to generators and relations. Any group element can be represented by a string of generators forming a word. Dehn’s Word Problem asks whether, given a word in the generators, one can determine if that word is equivalent to the identity. This talk will discuss the correspondence between Cayley graphs and the solvability of this problem.
A brief on Galois Theory
Ted Nguyen, mentored by Andre de Moura
Abstract: I’ll give a brief on Galois theory and the fundamental theorem via examples.
Building Flexible Biological Models With Bayesian Statistics
Elijah Siajunza, mentored by Nick Christo
Abstract: This presentation will cover some foundational building blocks of statistics. Starting from basic ideas and models in Bayesian statistics, I show how gradually more complex models can be constructed from these elements to address broader problems that arise in different contexts, with a focus on biological applications. In particular, I will attempt to demonstrate how and why a Bayesian view of certain scenarios can present unique advantages compared to a frequentist perspective.
Farey Graph
Hita Bharwad, mentored by Darius Alizadeh
Abstract: We will define the Farey graph, its vertices and edges, and describe SL2(Z) acting on the Farey graph.
Limits and Adjunctions C-Category
Aadi Kapil, mentored by Julian Benali
Abstract: A continuation of the Spring 2025 DRP presentation on categories.
The presentation sessions for the Spring 2025 iteration of the DRP were held on Thursday, May 1 from 2PM to 5PM and on Friday, May 2 from 12PM to 2:30PM in SEO 636. There were nineteen presentations:
Graph Convolutional Networks
Myles Allred, mentored by Jennifer Vaccaro
Abstract: Graph Convolutional Networks (GCNs) are used to classify graphs and their features by aggregating information from adjacent nodes according to learnable parameters. They are a powerful tool that extends the concept of convolution on grid-structured data (e.g., image processing contexts) to graph-structured data. This finds applications in numerous settings, including functional brain connectivity analysis, node classification, molecular property prediction, and link prediction.
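For concreteness, here is a minimal sketch of one common GCN propagation rule (the symmetrically normalized aggregation of Kipf and Welling); the toy graph, features, and weights below are made-up placeholders:

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).
    Each node aggregates degree-normalized features from its neighbours and
    itself, then applies the learnable linear map W."""
    A_hat = A + np.eye(A.shape[0])                 # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)         # ReLU nonlinearity

A = np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]], dtype=float)   # path graph on 3 nodes
H = np.eye(3)                                                  # one-hot node features
W = np.random.default_rng(0).normal(size=(3, 2))               # placeholder weights
print(gcn_layer(A, H, W))
```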
Achievability in Shannon’s Noisy Channel Coding Theorem
Harry Alvarado, mentored by Abhijeet Mulgund
Abstract: We will discuss the noisy channel coding problem and Shannon’s analysis. In particular, we will see that arbitrarily low error is achievable for fixed rates of communication, contrary to widely held beliefs at the time.
Dijkstra's Algorithm
Haroon Azhar, mentored by Lisa Cenek
Abstract: This presentation explains Dijkstra's Algorithm, showing how it finds the shortest path in a graph with a step-by-step process. It includes basic graph theory concepts, real-world applications (e.g., GPS navigation), and explains why the algorithm works correctly with text and visuals.
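A minimal sketch of Dijkstra's Algorithm with a binary heap; the toy road network and edge weights below are made up for illustration:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from `source` in a graph given as
    {node: [(neighbour, weight), ...]} with non-negative weights."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                      # stale heap entry; node already settled
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

roads = {"A": [("B", 4), ("C", 1)], "C": [("B", 2), ("D", 5)], "B": [("D", 1)], "D": []}
print(dijkstra(roads, "A"))   # {'A': 0, 'B': 3, 'C': 1, 'D': 4}
```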
Random Variables and the Law of Large Numbers
Kasandra Bahena, mentored by Michael Gintz
Abstract: We’ll introduce some ideas and terminology in probability, exploring in particular different kinds of convergence of random variables, then use these tools to discuss Chebyshev’s Inequality and the Law of Large Numbers, going into detail with some examples.
Positive Tilings of the Plane
Hita Bharwad, mentored by Amelia Pompilio
Abstract: This presentation will prove that, up to isomorphism, there are five distinct one-sided tilings of the plane ℝ². The proof will briefly discuss the different cases of subgroups obtained from rotations of order 3.
Finding & Solving Recurrences Using the Catalan Numbers
Will Brown, mentored by Nick Christo
Abstract: The Catalan Numbers are a sequence that arises from counting the number of labeled binary trees of a given size, among other things. We will deduce a recurrence relation for the sequence and then derive and prove a closed formula in two different ways. Our first derivation will make use of a generating function and some binomial coefficient identities to prove a closed formula. Then we will derive and prove the same closed formula using a combinatorial argument.
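A short sketch checking the two routes against each other numerically: the Catalan recurrence on one side and the closed formula C_n = binom(2n, n)/(n+1) on the other (the range of n below is arbitrary):

```python
from functools import lru_cache
from math import comb

@lru_cache(maxsize=None)
def catalan_rec(n):
    """Catalan numbers via the recurrence C_0 = 1, C_n = sum_{k=0}^{n-1} C_k * C_{n-1-k}."""
    if n == 0:
        return 1
    return sum(catalan_rec(k) * catalan_rec(n - 1 - k) for k in range(n))

def catalan_closed(n):
    """Closed form C_n = binom(2n, n) / (n + 1)."""
    return comb(2 * n, n) // (n + 1)

assert all(catalan_rec(n) == catalan_closed(n) for n in range(12))
print([catalan_closed(n) for n in range(10)])  # [1, 1, 2, 5, 14, 42, 132, 429, 1430, 4862]
```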
Group Actions on Graphs
Peace Fonanih, mentored by Theo Sandstrom
Abstract: A group is an algebraic object consisting of elements together with an operation; we study the relationship between different types of groups and their geometric applications.
Illustrating the Probabilistic Method with Random Graphs
Sriya Gandhi, mentored by Clay Mizgerd
Abstract: To begin, a general overview of graph theory will lead the talk (here I’ll define a graph, colorings, subgraphs, & random graphs). Then I’ll introduce the idea behind the probabilistic method. Lastly, I’ll introduce a fun question that integrates the probabilistic method with graph theory & show how to solve it (if time permits).
The Lord of the Rings: Entering the World of Rings in Abstract Algebra
Juan Garcia, mentored by Vignesh Jagathese
Abstract: We will give an introduction to rings in abstract algebra. We will discuss the definition of a ring, maps between them, quotient rings, and adjoining elements, and provide examples.
Unitary Diagonalization of Hermitian Operators in Quantum Theory
Noah Hubener, mentored by Shin Kim
Abstract: In quantum theory, Hermitian operators represent physical observables. This talk will explain how unitary operators can diagonalize Hermitian operators, making their eigenvalues the possible outcomes of measurements explicit.
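A small numerical illustration of the statement, assuming a made-up 2x2 Hermitian matrix: numpy's eigh returns a unitary whose conjugate transpose diagonalizes the operator, with the real eigenvalues (the possible measurement outcomes) on the diagonal.

```python
import numpy as np

H = np.array([[2.0, 1.0 - 1.0j],
              [1.0 + 1.0j, 3.0]])          # a Hermitian operator (placeholder example)
eigvals, U = np.linalg.eigh(H)             # columns of U are orthonormal eigenvectors
D = U.conj().T @ H @ U                     # should equal diag(eigvals)
print(np.round(eigvals, 6))                # real eigenvalues
print(np.allclose(D, np.diag(eigvals)))    # True: U diagonalizes H
```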
Introduction to Category Theory
Aadi Kapil, mentored by Julian Benali
Abstract: In this talk, I will introduce the notion of a category and talk about categories as mathematical structures in their own right. I will end by presenting the Yoneda Lemma.
Cross Ratio Considerations
Marcelo Lozano, mentored by Amelia Pompilio
Abstract: The cross ratio is a famed double ratio central in the study of geometry due to its invariance under projective transformations. This presentation will introduce the cross ratio and showcase a geometric exercise in which it is featured. This talk will also give brief comments regarding how Kelly and Strauss (1958) use this exercise to identify certain kinds of Hilbert geometries as non-Euclidean geometries of the hyperbolic plane.
C[0,1] with the 1-norm is not complete
Matt Mrozinski, mentored by Hannah Sheats
Abstract: We will discuss normed spaces, Cauchy and convergent sequences, and completeness. We will show that the space of continuous functions on the interval [0,1] is not complete with respect to the 1-norm.
Cantor-Bernstein Theorem
Ted Nguyen, mentored by Sacha L'Heveder
Abstract: I will prove the Cantor-Bernstein Theorem. If there is time, I will talk about paradoxical decompositions.
Intro to Topology
Gabriel R. D. Ruiz, mentored by Jagerynn Verano
Abstract: I will introduce topological spaces and related key definitions. I will also talk about compactness and make reference to the standard topology on the real line.
Utility of Graph Theory in Neuroscience
Elijah Siajunza, mentored by Nick Christo
Abstract: In this talk we will discuss one way to simplify potentially complicated neural structures by building smaller models with graphs.
Logistic Regression and application to healthcare data
Phung Vuong, mentored by Karoline Dubin
Abstract: In this talk we will discuss the fundamentals of regression analysis. To illustrate the usefulness of logistic regression, we’ll first talk about linear regression (a model many may already be familiar with) and compare it to logistic regression. To demonstrate the uses of logistic regression, we implemented a logistic regression model to predict diabetes using healthcare data from Kaggle. The goal was to build a more efficient classifier to identify individuals at risk of diabetes. The dataset includes blood glucose level, hypertension, heart disease, HbA1c, BMI, age, smoking history, and gender. We performed data cleaning, handled missing values, and normalized and encoded the features to ensure the data was suitable for modeling. Evaluation metrics such as accuracy and precision, along with a confusion matrix, were used to assess model performance. This work demonstrates how logistic regression can be applied to healthcare data for early detection and decision making.
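For readers curious about the kind of pipeline described above, here is a minimal sketch using scikit-learn; the synthetic features and labels below are stand-ins for illustration only, not the Kaggle dataset from the talk:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import accuracy_score, precision_score, confusion_matrix

rng = np.random.default_rng(0)
n = 1000
X = np.column_stack([
    rng.normal(140, 40, n),   # blood glucose level (synthetic stand-in)
    rng.normal(5.8, 1.0, n),  # HbA1c (synthetic stand-in)
    rng.normal(27, 5, n),     # BMI (synthetic stand-in)
])
# Synthetic label loosely tied to glucose and HbA1c, for illustration only.
y = ((X[:, 0] > 160) | (X[:, 1] > 6.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
scaler = StandardScaler().fit(X_train)
model = LogisticRegression().fit(scaler.transform(X_train), y_train)
pred = model.predict(scaler.transform(X_test))

print("accuracy :", accuracy_score(y_test, pred))
print("precision:", precision_score(y_test, pred))
print("confusion matrix:\n", confusion_matrix(y_test, pred))
```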
Permutations and Derangements
Caleb Williams, mentored by Hannah Sheats
Abstract: A permutation is a bijective mapping from a set to itself. A derangement is a permutation which has no fixed points. Using combinatorial reasoning, one can count the number of derangements of any finite set. As it turns out, the number of derangements of a set containing n elements is the nearest integer to n!/e.
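A quick sketch that checks the nearest-integer-to-n!/e claim against the standard derangement recurrence D_n = (n-1)(D_{n-1} + D_{n-2}) (the range of n below is arbitrary):

```python
from math import factorial, e

def derangements(n):
    """Count derangements via the recurrence D_n = (n - 1) * (D_{n-1} + D_{n-2})."""
    if n == 0:
        return 1
    if n == 1:
        return 0
    prev2, prev1 = 1, 0                    # D_0, D_1
    for k in range(2, n + 1):
        prev2, prev1 = prev1, (k - 1) * (prev1 + prev2)
    return prev1

for n in range(1, 11):
    assert derangements(n) == round(factorial(n) / e)   # nearest integer to n!/e
print([derangements(n) for n in range(1, 8)])            # [0, 1, 2, 9, 44, 265, 1854]
```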
PAC Learning, Support Vector Machines, and the Foundations of Machine Learning
Amir Yasin, mentored by Duan Tu
Abstract: We introduce the definition of PAC learning and several complexity notions such as Rademacher complexity and VC dimension. We then introduce the SVM algorithm, with an implementation applied to a real-life breast cancer data set using Python.
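A minimal sketch in the spirit of the abstract, using scikit-learn's bundled breast cancer dataset (which may or may not be the dataset used in the talk) and a linear-kernel SVM:

```python
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Load the bundled breast cancer classification dataset.
X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale features, then fit a linear-kernel support vector classifier.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
clf.fit(X_train, y_train)
print("test accuracy:", clf.score(X_test, y_test))
```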
The Fall 2024 DRP at UIC Presentation Session was held on December 6, 2024. There were ten presentations:
Information Theory: Huffman Codes
Harry Alvarado, mentored by Abhijeet Mulgund
Abstract: We introduce the mathematical ideas of encoding and ultimately prove that Huffman codes are optimal encodings.
An application of the Borel-Cantelli lemma
Albert Arias, mentored by Shin Kim
Abstract: In this presentation we explore how the Borel-Cantelli lemma can be used to show that the largest run of heads in n independent tosses of a fair coin will almost surely grow like log(n).
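A small simulation hints at the result: the longest run of heads in n fair tosses stays close to log_2(n) (base 2 is the natural normalization for a fair coin; the sample sizes and seed below are arbitrary):

```python
import random, math

def longest_head_run(n, rng):
    """Length of the longest run of heads in n fair coin tosses."""
    best = cur = 0
    for _ in range(n):
        cur = cur + 1 if rng.random() < 0.5 else 0   # extend or reset the current run
        best = max(best, cur)
    return best

rng = random.Random(0)
for n in (10**3, 10**4, 10**5):
    print(n, longest_head_run(n, rng), round(math.log2(n), 2))   # run length hovers near log_2(n)
```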
Differential Privacy and its Real Life Applications
Qiming Li, mentored by Duan Tu
Abstract: Differential privacy is an important technique for protecting individual privacy while still allowing useful data analysis. In this presentation, I’ll explain the basic idea of differential privacy using a practical example: figuring out if two students cheated on an exam. This example shows how differential privacy can be used in real-world situations to balance privacy and accuracy.
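As a generic illustration of the technique (not the exam example from the talk), here is a minimal sketch of the Laplace mechanism for a counting query, which has sensitivity 1; the true count and epsilon values below are arbitrary:

```python
import numpy as np

def laplace_count(true_count, epsilon, rng):
    """Release a count with the Laplace mechanism: a counting query has
    sensitivity 1, so adding Laplace(1/epsilon) noise gives epsilon-DP."""
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

rng = np.random.default_rng(0)
true_count = 42                      # e.g. a count over student records (placeholder)
for eps in (0.1, 1.0, 10.0):
    # Smaller epsilon means stronger privacy and noisier released counts.
    print(eps, round(laplace_count(true_count, eps, rng), 2))
```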
Introduction to Differential Geometry
Gabriel R. D. Ruiz, mentored by Jaegeon Shin
Abstract: This presentation will contain a review of curvature and torsion, cover the Frenet frame (the local theory), and then move to the global theory. We will then look at the isoperimetric inequality.
A summary of Gödel's Incompleteness Theorems
Michael Schmidt, mentored by Julian Benali
Abstract: We give a brief history and explanation of Gödel's Incompleteness Theorems. We'll give an example of Gödel numbering and the proof idea for Gödel's theorems.
Wanna solve Hilbert's tenth? Think again
Bill Shepelak, mentored by Katie Kruzan
Abstract: Some problems are not solvable through computation. Computability theory gives us the tools to show why this is the case for certain types of problems. Using results from computability theory, we show, through a technique called reduction, that no algorithm exists that solves Hilbert's 10th problem of deciding the solvability of Diophantine equations in more than one independent variable.
Chaotic Cellular Automata Image Compression
Sam Stuckey, mentored by Karoline Dubin
Abstract: Cellular automata cover the breadth of elementary and very complex systems and have been rashly relegated to the theory of computing and modeling dynamical systems. But they can be so much more than that! Using the variety of CA rules, people have come up with practical uses for CA, including image compression, which will be the focus of this talk.
(Weak) Law of Large Numbers
Sebastian Tous, mentored by Nick Christo
Abstract: The Weak Law of Large Numbers, a cornerstone of probability theory, states that the sample mean of independent, identically distributed random variables converges in probability to the true mean as the sample size grows. This presentation covers the foundational theory required to understand this result, its extension to the Strong Law of Large Numbers, and the key differences between the two. We then examine its application to the St. Petersburg paradox, a coin-flipping game with infinite expected value, to examine how much one should rationally wager to play.
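A tiny simulation of the statement, assuming rolls of a fair six-sided die with true mean 3.5 (the sample sizes are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
for n in (10, 1_000, 100_000):
    sample = rng.integers(1, 7, size=n)   # n i.i.d. fair die rolls
    print(n, sample.mean())               # sample mean drifts toward the true mean 3.5
```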
Structured Sets, Categories, and Certain Applications
Caleb Williams, mentored by Vignesh Jagathese
Abstract: We review rigorous definitions of sets and classes, ordered pairs, relations, mappings of sets, structured sets such as groups, topological spaces, and sigma-algebras, and properties of compositions of maps between them that preserve the structure these sets are equipped with. We introduce the definition of a category and give basic examples of categories.
Geometric Group Theory and the Milnor-Schwarz Lemma
Noah Zabelka, mentored by Darius Alizadeh
Abstract: Assuming no familiarity with geometric group theory (though some knowledge of groups and metric spaces will be very helpful), we will take a crash course from a standard introduction to one of its most fundamental results: the Milnor-Schwarz Lemma. This will include brief descriptions of the geometric properties of a group, up through quasi-isometries and the Milnor-Schwarz Lemma.
The Summer 2024 DRP at UIC Presentation Session was held on September 9, 2024. There were four presentations:
Fantastic Fractals
Janien Hammonds, mentored by Lisa Cenek
Abstract: Fantastic Fractals explores the paradoxical nature of fractals by building on the concept of the Cantor set. I use the Koch curve and Koch island to exemplify this characteristic.
Pólya's Recurrence Theorem & Electric Networks
Marcelo Lozano, mentored by Karoline Dubin
Abstract: "A drunk man will return home, but a drunk bird may get lost forever." Will a point wandering randomly on an n-dimensional grid return to its initial position? George Pólya's Recurrence Theorem tells us that simple random walks are recurrent on one and two dimensional lattices and are transient in higher dimensions. This talk will apply the language of electric networks to develop intuition for this famous result (and outline one of its proofs).
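A crude simulation hinting at the dichotomy (not the electric-network argument from the talk): the fraction of walks that return to the origin within a fixed number of steps is much higher in dimensions 1 and 2 than in dimension 3; the step and trial counts below are arbitrary:

```python
import numpy as np

def returned_within(steps, dim, trials, rng):
    """Fraction of simple random walks in `dim` dimensions that revisit the
    origin within `steps` steps; a rough numerical proxy for recurrence."""
    hits = 0
    for _ in range(trials):
        pos = np.zeros(dim, dtype=int)
        for _ in range(steps):
            axis = rng.integers(dim)
            pos[axis] += rng.choice((-1, 1))
            if not pos.any():              # back at the origin
                hits += 1
                break
    return hits / trials

rng = np.random.default_rng(0)
for dim in (1, 2, 3):
    print(dim, returned_within(steps=2000, dim=dim, trials=300, rng=rng))
```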
Wait, There is More than One Geometry?
Juan Jose Rosendo, mentored by Amelia Pompilio
Abstract: When taking a high school level geometry class, we are exposed to the reality of shapes. It is all we know, until one decides to learn more about a "geometry." Therefore, we will explore the differences between various geometries, including Euclidean, Hilbert, Spherical, Hyperbolic, and Projective geometry. A brief introduction will be given for each as well as a fun fact of why each one differs from the other.
Why Lebesgue? Understanding the Limitations of Riemann Integration
Sebastian Tous, mentored by Nick Christo
Abstract: While Riemann integration provides a straightforward and geometrically intuitive approach to finding the area under a curve, it falls short in cases involving discontinuous functions and is insufficient for nice general convergence theorems to hold. In this presentation, I will discuss the motivation behind Lebesgue integration, its relation to the Riemann integral, and some of its limitations.
The Spring 2024 DRP at UIC Presentation Session was held on April 26, 2024. There were eight presentations:
Using Causal Inference To Study the Spread of COVID-19
Zach Alzubi, mentored by Abhijeet Mulgund
Abstract: We demonstrate a use case of causal inference by learning a Bayesian network that models the spread of COVID-19 across the USA. We describe the methods used as well as the challenges faced. We also interpret the learned model and discuss additional methods that could potentially improve its accuracy.
The Mathematics of Pricing Assets
Raghav Bhutani, mentored by Kevin Zhou
Abstract: This presentation introduces essential financial tools, focusing specifically on derivatives. We will explore the fundamental mechanics of derivatives, including detailed visual representations through payoff diagrams. The core of the discussion will center on the pricing of these assets, particularly through the lens of the Black-Scholes formula. Attendees will gain an understanding of what the Black-Scholes formula represents, its significance in financial markets, and its practical applications in asset pricing.
Representations and Lie Groups
Mustafa Nawaz, mentored by Jennifer Vaccaro
Abstract: In this talk, we will define Lie groups, Lie algebras, and representations. We will discuss their properties and present Schur’s Lemma.
Sheaves and Cohomology: An Introduction
Max Nguyen, mentored by Vignesh Jagathese
Abstract: This presentation will introduce the notion of a sheaf, a useful tool for studying properties of spaces. Nevertheless, sheaves have some shortcomings, and by studying these shortcomings one can derive a powerful invariant in the form of cohomology.
Foundations of Machine Learning: Learning Guarantees and Dimensionality Reduction
Markus Perez, mentored by Duan Tu
Abstract: When designing machine learning systems, fundamental questions from “What can be learned?” to “Can we design accurate and efficient learning algorithms?” arise. This presentation will provide an introduction to the PAC-Learning Framework, which allows us to analyze learnability, and the related concepts of Rademacher Complexity and VC-Dimension. Additionally, the basics of Dimensionality Reduction will be covered.
Models of Random Graphs
Juan Jose Rosendo, mentored by Katie Kruzan
Abstract: There are various models of random graphs that have different purposes. The models that will be introduced are the Erdos-Renyi graph, Generalized Random Graphs, the Configuration Model, and the Preferential Attachment Model. Each model will be defined, accompanied by an example. Last, a brief description of the motivation behind each model will be given.
Introduction to Ideals and Varieties
Emma Todd, mentored by Emily Cairncross
Abstract: The discussion will define and cover some examples of affine varieties. Then we will define general ideals and radical ideals and explore their connection to varieties.
Convergence Theorem for Finite Markov Chains
Sebastian Tous, mentored by Nick Christo
Abstract: When does a finite Markov chain converge? At what rate does convergence occur? In this presentation, I will discuss the prerequisite conditions for convergence and give a full proof of the Convergence Theorem for Finite Markov Chains.
The Fall 2023 DRP at UIC Presentation Session was held on December 1, 2023. There were four presentations:
Computational Learning Theory and Probabilistic Models
Zach Alzubi, mentored by Abhijeet Mulgund
Abstract: We examine the framework of PAC learning and provide examples of problems that are PAC learnable as well as problems that are not. We then investigate this framework applied to learning Bayesian Networks. We address the limitations of this framework and provide a concrete example of learning a Bayesian network given real-world medical data.
Towards counting lines on a smooth cubic
Leila Dahlia, mentored by Sixuan Lou
Abstract: We will introduce modern machinery for counting the number of lines lying on a given surface.
Algebraic Topology and The Fundamental Theorem of Algebra
Mustafa Nawaz, mentored by Michael Gintz
Abstract: We will talk about the notion of homology and how it can be used to give an invariant of the circle. Then, using this, we can prove the Fundamental Theorem of Algebra.
Markov Chains: Introduction and Applications
Zahra Vasi, mentored by Clay Mizgerd
Abstract: Markov chains serve as useful mathematical models for studying movements among elements in a set. These chains have unique properties that allow mathematicians to understand and predict their long-term behavior. This presentation discusses some essential definitions and properties associated with Markov chains and several examples of useful and interesting applications.