Event Photos
Zoom Recording (Including All Presentations)
5:00–5:10pm: Arrival / setup
5:10–5:30pm: Presentation 1
Nick Kyriakides-Crowe, Masaya Fujita, Options Pricing: Mathematics Behind Financial Derivatives (Mentor: Bixing Qiao)
Abstract: We begin with a simple discrete model — the binomial tree — where an asset price moves up or down at each step like a structured coin flip. From this intuitive foundation, we will introduce the key principle of risk-neutral pricing and show how no-arbitrage conditions uniquely determine option prices. We then take the continuous-time limit, developing the necessary tools from stochastic calculus: Brownian motion as a model for random asset price evolution, and Itô's Lemma as the chain rule of this new calculus. These ideas come together in the Black-Scholes equation, one of the most famous results in mathematical finance, which yields a closed-form formula for pricing options.
Delivery: PowerPoint/slides
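The one-step binomial model the abstract begins with can be sketched in a few lines. This is an illustrative sketch (the function name and numbers are chosen for this example, not taken from the talk): the price is the discounted expected payoff under the risk-neutral probability, which no-arbitrage determines uniquely.

```python
# One-step binomial pricing of a European call via risk-neutral valuation.
# Illustrative sketch; parameters below are made up for this example.
import math

def binomial_call(S0, K, u, d, r):
    """Price a one-step European call: discounted risk-neutral expectation."""
    q = (math.exp(r) - d) / (u - d)          # risk-neutral up-probability
    payoff_up = max(S0 * u - K, 0.0)         # call payoff if price moves up
    payoff_down = max(S0 * d - K, 0.0)       # call payoff if price moves down
    return math.exp(-r) * (q * payoff_up + (1 - q) * payoff_down)

# Stock at 100, strike 100, up factor 1.2, down factor 0.8, zero rate:
price = binomial_call(S0=100, K=100, u=1.2, d=0.8, r=0.0)
print(round(price, 2))  # risk-neutral q = 1/2, so the price is about 10.0
```

Iterating this step over many periods gives the binomial tree whose continuous-time limit is Black-Scholes.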
5:30–5:50pm: Presentation 2
Ethan Kellogg, Terry Ye, Samuel Wang, Jiarui Liu, Quantum Mechanics: Best Hits (Mentor: Dustin Young)
Abstract: A brief summary of quantum mechanics, with discussion of the Schrödinger equation, the Heisenberg uncertainty principle, the infinite potential well, spin, basic band theory, and the adiabatic theorem.
Delivery: PowerPoint/slides
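The infinite potential well mentioned in the abstract is one of the few exactly solvable cases; as a reminder (a standard result, not material supplied by the presenters), a particle of mass m confined to a box of width L has the discrete spectrum

```latex
E_n = \frac{n^2 \pi^2 \hbar^2}{2 m L^2}, \qquad
\psi_n(x) = \sqrt{\frac{2}{L}}\,\sin\!\left(\frac{n \pi x}{L}\right), \qquad n = 1, 2, 3, \dots
```

illustrating how boundary conditions on the Schrödinger equation force quantized energy levels.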
5:50–6:10pm: Presentation 3
Qige Wang, Bott Periodicity and Topological K-Theory (Mentor: Baran Çetin)
Abstract: Topological K-theory studies vector bundles. Many results, like Bott periodicity and Adams' theorem on division algebras, have simpler formulations and proofs using K-theory. I will introduce basic elements of topological K-theory, particularly focusing on deducing Bott periodicity, and sketch its application to Adams' theorem.
Delivery: PowerPoint/slides
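For orientation, the central statement of the talk can be recorded briefly (the standard complex form of the theorem, stated here as background rather than from the speaker's notes): for a compact Hausdorff space X, reduced complex K-theory is invariant under double suspension,

```latex
\widetilde{K}\!\left(\Sigma^2 X\right) \;\cong\; \widetilde{K}(X),
```

so complex K-theory is periodic with period 2. This periodicity is what makes the K-theoretic computations behind Adams' theorem tractable.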
6:10–6:40pm: Food break
6:40–7:00pm: Presentation 4
Yuming Zhang, Finitely Generated Modules over a PID and Linear Algebra (Mentor: Boxi Hao)
Abstract: We use the structure theorem for finitely generated modules over a PID to derive the simplest canonical forms of matrices.
Delivery: PowerPoint/slides
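The theorem behind this talk, stated for reference (standard background, not the speaker's notes): every finitely generated module M over a PID R decomposes as

```latex
M \;\cong\; R^{\,r} \,\oplus\, R/(d_1) \,\oplus\, \cdots \,\oplus\, R/(d_k),
\qquad d_1 \mid d_2 \mid \cdots \mid d_k,
```

with the free rank r and invariant factors d_i unique up to units. Viewing a vector space as a module over F[x], with x acting by a linear map T, this decomposition yields canonical forms of the matrix of T.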
7:00–7:20pm: Presentation 5
Mark Chen, Recovering Relative Homology via Cofibre Sequences (Mentor: Li Gu)
Abstract: This presentation studies the relationship between cofibre sequences and relative homology. Starting from a continuous map f: X → Y, we construct the mapping cone C_f and the associated cofibre sequence X → Y → C_f → ΣX → ΣY → ···. After applying reduced homology, this sequence gives a long exact sequence. When f: X ↪ Y is an inclusion, C_f identifies, up to homotopy, with Y/X, giving H̃_n(C_f) ≅ H_n(Y, X). This recovers the usual long exact sequence of relative homology from the cofibre sequence viewpoint.
Delivery: PowerPoint/slides
7:20–7:40pm: Presentation 6
Calvin Koh, Markov Chains: From Random Processes to Real-World Decision Making (Mentor: Alex Fu)
Abstract: We explore how Markov chains provide the mathematical foundation for Google’s PageRank algorithm, transforming a complex web of hyperlinks into a system for ranking the importance of webpages.
Delivery: PowerPoint/slides
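The PageRank idea in the abstract can be sketched directly: ranks are the stationary distribution of a Markov chain on the link graph, found by power iteration. A minimal sketch (the 4-page link graph and damping factor below are made up for illustration, not from the talk):

```python
# PageRank as the stationary distribution of a Markov chain (minimal sketch).
# links[i] lists the pages that page i links to; the graph is illustrative.
links = {0: [1, 2], 1: [2], 2: [0], 3: [2]}
n = 4
d = 0.85                                   # damping factor
rank = [1.0 / n] * n                       # start from the uniform distribution

for _ in range(100):                       # power iteration
    new = [(1.0 - d) / n] * n              # teleportation term
    for page, outs in links.items():
        share = d * rank[page] / len(outs) # rank mass split among out-links
        for target in outs:
            new[target] += share
    rank = new

print([round(r, 3) for r in rank])         # page 2, with the most in-links, ranks highest
```

The ranks sum to 1, and pages with more (and better-ranked) in-links accumulate more stationary probability.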
7:40–7:50pm: Short break
7:50–8:10pm: Presentation 7
Farbod Malayeri, Lola Mason, Cross Validation (Mentor: Frank Gao)
Abstract: An introduction to cross validation and its failure on financial time series.
Delivery: PowerPoint/slides
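The failure mode the abstract mentions comes from shuffled folds letting the model train on the future. A walk-forward (expanding-window) split avoids this by keeping every training index strictly before every test index. A minimal sketch showing indices only (the 10-point series and split counts are illustrative):

```python
# Walk-forward (expanding-window) splits for time series: train always
# precedes test, unlike shuffled k-fold. Indices only; series is illustrative.

def walk_forward_splits(n, n_splits):
    """Yield (train_indices, test_indices) with train strictly before test."""
    fold = n // (n_splits + 1)             # size of each test fold
    for k in range(1, n_splits + 1):
        train = list(range(0, k * fold))           # everything seen so far
        test = list(range(k * fold, (k + 1) * fold))  # the next fold in time
        yield train, test

for train, test in walk_forward_splits(n=10, n_splits=4):
    print("train:", train, "test:", test)
```

Each successive fold trains on a longer prefix of the series, mimicking how a model would actually be deployed on financial data.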
8:10–8:30pm: Presentation 8
Andrew Man Yuan, From Normal-Normal Conjugacy to Hierarchical Bayesian Models (Mentor: Mo Chen)
Abstract: Hierarchical Bayesian models provide a natural framework for borrowing information across related groups while still allowing each group to have its own uncertainty. This presentation uses the Normal-Normal conjugate model as a simple entry point to this idea. By deriving the posterior distribution, we show how Bayesian updating balances prior belief with observed data through a weighted-average structure. This intuition is then extended to the hierarchical setting, where individual parameters are connected through a shared population-level distribution. The presentation highlights the connection between conjugate Bayesian updating and the broader structure of hierarchical Bayesian modeling.
Delivery: No slides
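The weighted-average structure of the posterior described in the abstract is short enough to compute directly. A minimal sketch (the prior and data values are illustrative): with a N(mu0, tau0²) prior on the mean and observations x_i ~ N(theta, sigma²) with sigma² known, the posterior mean is a precision-weighted average of the prior mean and the sample mean.

```python
# Normal-Normal conjugate update: posterior mean as a precision-weighted
# average of prior mean and sample mean. Illustrative numbers, not the talk's.

def normal_normal_posterior(mu0, tau0_sq, sigma_sq, data):
    n = len(data)
    xbar = sum(data) / n
    prior_prec = 1.0 / tau0_sq               # precision = 1 / variance
    data_prec = n / sigma_sq                  # n observations, each precision 1/sigma^2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * mu0 + data_prec * xbar)
    return post_mean, post_var

# Prior N(0, 1), four observations all equal to 1, known variance 1:
mean, var = normal_normal_posterior(mu0=0.0, tau0_sq=1.0, sigma_sq=1.0,
                                    data=[1.0, 1.0, 1.0, 1.0])
print(mean, var)  # posterior mean 0.8, variance 0.2: the data outweigh the prior
```

In the hierarchical setting, the same update recurs with the prior parameters themselves drawn from a shared population-level distribution.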
8:30–8:40pm: Short break
8:40–9:00pm: Presentation 9
Jordy Ajanel, Introduction to Knot Theory (Mentor: Zijian Rong)
Abstract: We introduce basic properties of knots on the plane, sphere, and torus. We conclude by defining the knot group, an invariant of knots, and showing how to calculate it.
Delivery: PowerPoint/slides
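As a standard example of the knot-group computation the abstract mentions (background, not the speaker's notes), the Wirtinger presentation of the trefoil knot's group is

```latex
\pi_1\!\left(S^3 \setminus K_{\mathrm{trefoil}}\right) \;\cong\; \langle\, x, y \mid xyx = yxy \,\rangle .
```

This group is non-abelian, while the knot group of the unknot is the abelian group Z, so the invariant distinguishes the trefoil from the unknot.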
9:00–9:20pm: Presentation 10
Elen Mkrtchyan, Kevin Zhang, Drew Auster, A First Look at Convex Analysis & Optimization (Mentor: Maria Segall)
Abstract: Convex optimization underlies much of modern quantitative finance, machine learning, and operations research. In this talk, we present the basic theory of convex sets and convex functions, discussing the structural properties that make them natural objects of study and the analytic tools needed to guarantee the existence of minimizers. We then turn to optimization, developing first-order conditions for minimization over a convex set and extending them to non-smooth objectives through the language of the subdifferential. We close with a brief discussion of the convex conjugate and duality, hinting at the deeper theory that underlies modern optimization and pointing toward natural directions for further study.
Delivery: PowerPoint/slides
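The first-order condition the abstract refers to can be stated for reference (the standard formulation, not taken from the slides): for a differentiable convex f and a convex set C, a point x* minimizes f over C if and only if

```latex
\nabla f(x^{\ast})^{\top} (y - x^{\ast}) \;\ge\; 0 \quad \text{for all } y \in C,
```

and for a non-smooth convex f minimized over all of R^n, the condition becomes 0 ∈ ∂f(x*), with the subdifferential ∂f replacing the gradient.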
9:20–9:40pm: Presentation 11
Yuhao Wang, Neural Network-Based Breast Cancer Classification Using Medical Feature Analysis (Mentor: Abid Hassan)
Abstract: This project uses machine learning to classify breast cancer tumors as benign or malignant based on medical features. After data preprocessing and model training, performance is evaluated using metrics such as accuracy and recall. The results show that machine learning can provide fast and reliable predictions, supporting early detection and improving diagnostic decisions.
Delivery: PowerPoint/slides
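The evaluation metrics named in the abstract are easy to compute from predicted versus true labels. A stdlib sketch (the label vectors below are made up for illustration, not the project's data); with 1 = malignant and 0 = benign, recall measures how many malignant tumors the model catches, which matters most for early detection.

```python
# Accuracy and recall from true vs. predicted labels (illustrative data).
# 1 = malignant (positive class), 0 = benign.

def accuracy_and_recall(y_true, y_pred, positive=1):
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    correct = sum(1 for t, p in zip(y_true, y_pred) if t == p)
    return correct / len(y_true), tp / (tp + fn)

y_true = [1, 1, 1, 0, 0, 0, 1, 0]   # hypothetical ground-truth labels
y_pred = [1, 1, 0, 0, 0, 1, 1, 0]   # hypothetical model predictions
acc, rec = accuracy_and_recall(y_true, y_pred)
print(acc, rec)  # 0.75 accuracy; recall 0.75 (one malignant case missed)
```

High accuracy alone can hide missed malignant cases when classes are imbalanced, which is why recall is reported alongside it.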
Ivan Z. Feng, Contact: ifeng@usc.edu