JHU AMS Graduate Student Seminar
Welcome to the homepage of the Graduate Student Seminar (PhD)
at the
Johns Hopkins Department of Applied Mathematics and Statistics.
We run weekly seminars to present our research work and/or teach our peers interesting (applied) math and stats!
Spring 2025
Talks every Tuesday, 1:30-2:30pm
Wyman N425
Schedule
January
January 28th
Hamilton Sawczuk
Title: The Hamiltonian Cycle Polynomial
Abstract:
February
February 4th
Drew Henrichsen
Title: How Much Data is Required for Probabilistic Bounds on Classification Certainty?
Abstract: An important question in classification, including in machine learning, is how much data is required to construct a classifier. Another important question is: how much certainty is there in the result of a classification mapping? In general, there are several sources of error or uncertainty in classification results, including error due to limited data, model-form error, optimization error, and bias, i.e., training and testing data being drawn from different distributions. Leveraging recent advances in level-set theory, it has become possible to address the error due to limited data by answering the question: how much data is required to establish probabilistic bounds on the inherent uncertainty of mapping results? Here, inherent uncertainty refers to the unremovable uncertainty of a classification mapping. We then answer the questions above by establishing both exact bounds and asymptotic convergence to the inherent uncertainty for the sources of error related to limited data.
February 11th
Yue Wu
Title: Some Unified Theory for Variance Reduced Prox-Linear Methods
Abstract: We consider the nonconvex, nonsmooth problem of minimizing a composite objective of the form f(g(x)) + h(x) where the inner mapping g is a smooth finite summation or expectation amenable to variance reduction. In such settings, prox-linear methods can enjoy variance-reduced speed-ups despite the existence of nonsmoothness. In this talk, we will present a unified convergence theory applicable to a wide range of common variance-reduced vector and Jacobian constructions. Our theory (i) only requires operator norm bounds on Jacobians (whereas prior works used potentially much larger Frobenius norms), (ii) provides state-of-the-art high probability guarantees, and (iii) allows inexactness in proximal computations.
February 18th
Merrick Ohata
Title:
Abstract:
February 25th
Kaleigh Rudge
Title: Manipulating the Propagation of Plasmons on a Graphene Sheet
Abstract: In this talk, we discuss the propagation of plasmons on graphene with a space-time perturbation in the Drude weight. The graphene is modeled as a one-dimensional conductive sheet in a two-dimensional medium, where the electromagnetic field obeys Maxwell’s equations and the current density on the sheet is governed by Drude’s law. The system of equations and boundary conditions can be rewritten as a single integro-differential equation, which gives the current density on the graphene as a function of position and time. We discuss methods for obtaining a closed-form solution to this integro-differential equation, as well as numerical experiments that present a perturbation analysis for various wave types.
March
March 4th
Andrew Searns
Title: What is... Fair Division (or Juggling Math, TBD)
Abstract:
March 11th
Ian McPherson
Title: Convergence Rates for Riemannian Proximal Bundle Methods
Abstract: Nonsmooth convex optimization is a classically studied regime, with a plethora of optimization algorithms developed to solve such problems. Among these, proximal bundle methods have been used in the Euclidean setting for decades, attempting to mimic the dynamics of the proximal point method. While practitioners have enjoyed very robust convergence behavior with respect to the choice of parameters, it was not until recently that theoretical results gave non-asymptotic guarantees recovering optimal convergence rates. Just within 2024, the first Riemannian proximal bundle methods were proposed, again lacking non-asymptotic guarantees. In this talk, we discuss how we lift the non-asymptotic rates to the Riemannian setting without access to exponential maps or parallel transports, while also recovering results for inexact subgradient oracles in the Euclidean setting as a special case. In addition, to our knowledge these are the first theoretical guarantees for nonsmooth geodesically convex optimization in the Riemannian setting that also guarantee bounded iterates (and thus guaranteed convergence). The work presented is joint work with Mateo Diaz and Benjamin Grimmer.
March 18th
Spring Break!
March 25th
April
April 1st
Speaker
Title:
Abstract:
April 8th
Speaker
Title:
Abstract:
April 15th
Speaker
Title:
Abstract:
April 22nd
Matthew Hudes
Title: A Functional Renormalization Group Approach to Spontaneous Stochasticity
Abstract: This talk explores the application of the Functional Renormalization Group (FRG) to study spontaneous stochasticity. We begin by discussing a Gaussian problem of Brownian motion in the zero-diffusion limit, where the FRG flow is solvable. We then discuss application of FRG to spontaneous stochasticity in a 1D toy model driven by a fractional Brownian motion process. If time permits, we discuss a related Bayesian interpretation of the Wetterich-Morris FRG flow equations.
Organizers: Matthew Hudes & Kaleigh Rudge
Food coordinators: Jiayue (Zoe) Zhang & Michelle Dai