Notes
Expositions / Notes on Number Theory
Notes & Slides (Day 1, Day 2, Day 3) for a series of lectures on arithmetic statistics over global function fields (KIAS)
Notes for a presentation on p-Selmer groups and the Grothendieck-Lefschetz trace formula (specialty exam)
Notes for a presentation on local Lubin-Tate theory
Notes from Arizona Winter School 2019: Topology and Arithmetic
Notes from Workshop on Geometry and Arithmetic of Surfaces (University of Wisconsin-Madison)
Notes for Graduate Number Theory Seminar talk on equidistribution of Heegner points
Preparatory talk for Ilya Khayutin's talk: Equidistribution of Special Points on Shimura Varieties.
Notes for Graduate Number Theory Seminar talk on Kloosterman sums and the Kuznetsov trace formula
Preparatory talk for Nick Andersen's talk: Modular Invariants for Real Quadratic Fields.
Notes for a reading seminar on the Local Langlands Correspondence for GLn over p-adic fields
Notes for a reading seminar on Models over DVRs by Tim Dokchitser
Notes from University of Chicago Summer Workshop: The roots of topology: miracles of algebraic geometry, braids, and Hilbert's 13th problem.
Exposition on Galois covers and generalized Fermat equations
Exposition on rationality of zeta functions over finite fields (University of Chicago REU 2016)
Exposition on Chebotarev density theorem (University of Chicago Proseminar: supervised by Frank Calegari and Bao Le Hung)
Exposition on existence of the Frobenius element and its applications (University of Chicago REU 2015)
Exposition giving an introduction to dynamical billiards (University of Chicago REU 2014)
Other Notes / Posters / Presentation Slides
Notes on applying Sylvester's law of inertia to QUBO formulations for systems of linear equations, coauthored with Kyungtaek Jun. (2021)
(This note generalizes the ideas presented in the conference proceeding "On the application of matrix congruence to QUBO formulations for systems of linear equations", coauthored with Hyunju Lee, Byung Chun Kim, Youngho Woo, and Kyungtaek Jun)
Slides for a seminar talk on attention and transformers (2021, NIMS)