Fall 2022 Talk Schedule
August 29 - Organizational Meeting
September 12 - Invited Speaker
Speaker: Prof. Thibaud Taillefumier
Title: A conversation with Prof. Taillefumier
Abstract: I will discuss how questions in neuroscience can inspire innovative mathematical techniques and how, at the same time, biological insights can be gained from the consideration of idealized neuronal models. Depending on audience interest, the mathematical techniques at stake will draw on the fields of combinatorics, stochastic analysis, and nonlocal functional equations.
September 19
Speaker: Allie Embry
Title: Applications of Topology to Machine Learning
Abstract: I will discuss a popular idea in machine learning and artificial intelligence called the Manifold Hypothesis. This idea allows us to improve ML algorithms using ideas from topology, such as atlases of charts and smooth structures. In this talk, I will give a bit of background on the goals of ML, describe the Manifold Hypothesis, and discuss some interesting ways this idea is applied in practice. (Note: this is very similar to the talk I gave a couple of weeks ago in the Jr Topology Seminar.)
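As a rough illustration of the Manifold Hypothesis (my own sketch, not part of the talk): the classic "swiss roll" dataset sits in R^3 but is intrinsically two-dimensional, and a manifold-learning method can recover coordinates on it. This sketch assumes scikit-learn and Isomap; the talk does not name any particular library or dataset.

```python
# A minimal sketch of the Manifold Hypothesis: 3-D "swiss roll" data
# actually lies on a 2-D surface, and a manifold-learning method can
# recover that intrinsic structure. (Illustrative only; scikit-learn
# and Isomap are my choices, not the talk's.)
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import Isomap

# Sample 1000 points from a 2-D manifold embedded in R^3.
X, t = make_swiss_roll(n_samples=1000, noise=0.05, random_state=0)

# Isomap builds a neighborhood graph (loosely, an atlas of local charts)
# and unrolls the manifold into 2 coordinates.
embedding = Isomap(n_neighbors=10, n_components=2).fit_transform(X)
print(embedding.shape)  # (1000, 2): the data's intrinsic dimension is 2
```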
September 26
Speaker: Jayden Wang
Title: Matroids
Abstract: Gabriel García Márquez has a wonderful autobiography, "Living to Tell the Tale". Sometimes the evolution of mathematics sounds like a tale, and that is what I will do in this talk: tell a tale. Since I love trees, the story will begin and end with trees. Since this is a seminar in applied math, the story will begin and end with applications. The problem of connecting a group of computers at minimal cost can be modeled as an optimization problem on a graph, which admits a greedy algorithm. The existence of a greedy algorithm does not depend on the existence of a graph; it depends only on a structure called a matroid. Matroids simultaneously and vastly generalize many objects in combinatorics. A discrete optimization problem has a greedy algorithm if and only if its feasibility set is a matroid. Moreover, matroids exhibit many intriguing properties, many of which can be stated as the log-concavity of certain integer sequences. Curiously enough, log-concavity is extremely ubiquitous, spanning algebraic geometry, differential geometry, representation theory, and optimization. It turns out that behind every log-concavity property there is a polynomial in charge; this family of polynomials is called stable polynomials. Among their many applications, I will show how to use them to construct a probability measure on the spanning trees of a graph that satisfies certain properties.
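A minimal sketch of the greedy algorithm the abstract alludes to (mine, not the speaker's), under the standard reading that "connecting computers at minimal cost" is the minimum spanning tree problem: Kruskal's algorithm greedily adds the cheapest edge that keeps the edge set acyclic, and its correctness is exactly the graphic-matroid instance of the greedy theorem stated above.

```python
# Kruskal's greedy MST algorithm: the acyclic edge sets of a graph are
# the independent sets of a graphic matroid, which is why greedily taking
# the cheapest feasible edge is exact.

def kruskal_mst(n_nodes, edges):
    """edges: list of (cost, u, v); returns the MST edge list."""
    parent = list(range(n_nodes))

    def find(x):  # union-find with path halving
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    tree = []
    for cost, u, v in sorted(edges):   # greedily scan cheapest edges first...
        ru, rv = find(u), find(v)
        if ru != rv:                   # ...keeping the edge set acyclic
            parent[ru] = rv            # (independence in the graphic matroid)
            tree.append((cost, u, v))
    return tree

# Four computers, five possible cables (cost, endpoint, endpoint):
print(kruskal_mst(4, [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]))
# -> [(1, 0, 1), (2, 1, 3), (3, 1, 2)]
```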
October 3
Speaker: Ziheng Chen
Title: Prison Game and Random Permutations
Abstract: In this talk, we are going to explore some interesting properties of random permutations by examining a few fun games in this setting. The plan is to calculate the characteristics of giant cycles without using intrinsic generating functions.
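Assuming the "prison game" is the classic 100-prisoners puzzle (my reading; the abstract does not spell it out): each prisoner follows the cycle of a uniformly random permutation starting from their own box, and everyone succeeds exactly when the permutation has no cycle longer than 50. A Monte Carlo sketch of that giant-cycle statistic:

```python
# Monte Carlo estimate of the 100-prisoners win probability: all prisoners
# succeed iff the longest cycle of a random permutation has length <= n/2.
import random

def longest_cycle(perm):
    """Length of the longest cycle of a permutation given as a list."""
    seen = [False] * len(perm)
    longest = 0
    for start in range(len(perm)):
        if not seen[start]:
            length, i = 0, start
            while not seen[i]:
                seen[i] = True
                i = perm[i]
                length += 1
            longest = max(longest, length)
    return longest

def win_probability(n=100, trials=20_000):
    wins = 0
    for _ in range(trials):
        perm = list(range(n))
        random.shuffle(perm)
        wins += longest_cycle(perm) <= n // 2
    return wins / trials

# Exact answer is 1 - (H_100 - H_50) ~ 0.3118, where H_k is the k-th
# harmonic number; the simulation should land close to that.
print(win_probability())
```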
October 10
Speaker: Ryan Ellin
Title: A Conceptual Introduction to Differential Privacy and Its Properties
Abstract: Differential privacy is a mechanism for protecting the privacy of people whose data is used for algorithms and analysis. It is a field of growing interest among both researchers and regulators, and was famously used in the 2020 US Census. In this talk, I will give an intuitive introduction to the topic and its basic properties, and discuss some exciting recent work on the stability benefits afforded to private algorithms.
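As a concrete instance of the mechanism idea (my addition; the talk itself is conceptual): the standard Laplace mechanism releases a query answer plus Laplace noise scaled to sensitivity/epsilon, which is epsilon-differentially private. For a counting query the sensitivity is 1:

```python
# The Laplace mechanism on a counting query: one person entering or
# leaving the dataset changes the count by at most 1 (sensitivity 1),
# so noise with scale 1/epsilon gives epsilon-differential privacy.
# (Illustrative sketch; the data and query below are made up.)
import numpy as np

def private_count(data, predicate, epsilon):
    """Release a count under epsilon-DP via the Laplace mechanism."""
    true_count = sum(predicate(x) for x in data)
    noise = np.random.laplace(loc=0.0, scale=1.0 / epsilon)
    return true_count + noise

ages = [23, 35, 41, 29, 62, 18, 55]
# How many people are over 40? True answer: 3; released answer is noisy.
print(private_count(ages, lambda a: a > 40, epsilon=0.5))
```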
October 17 - Talk Rescheduled
October 24
Speaker: Zachary Richey
Title: Equilibria in Compartmental Disease Models
Abstract: We will discuss several methods for studying the existence and stability of equilibria of ODEs, specifically in the context of compartmental disease models.
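A small worked example of the kind of analysis the abstract describes (my sketch, not the speaker's): in the SIR model with demography, an endemic equilibrium exists when the basic reproduction number R0 = beta/(gamma + mu) exceeds 1, and its local stability can be checked from the eigenvalues of the Jacobian of the linearized system.

```python
# Stability of the endemic equilibrium in SIR with births/deaths:
#   dS/dt = mu*N - beta*S*I/N - mu*S
#   dI/dt = beta*S*I/N - (gamma + mu)*I
# (Parameter values below are illustrative, not from the talk.)
import numpy as np

beta, gamma, mu, N = 0.5, 0.1, 0.02, 1.0
R0 = beta / (gamma + mu)            # basic reproduction number

# Endemic equilibrium (exists when R0 > 1), from setting dS = dI = 0:
S_star = N / R0
I_star = mu * N * (R0 - 1) / beta

# Jacobian of (dS/dt, dI/dt) evaluated at (S_star, I_star):
J = np.array([
    [-beta * I_star / N - mu, -beta * S_star / N],
    [ beta * I_star / N,       beta * S_star / N - gamma - mu],
])

print("R0 =", R0)                          # ~4.17 > 1: endemic state exists
print("eigenvalues:", np.linalg.eigvals(J))  # negative real parts => stable
```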
October 31
Speaker: Jennifer Rozenblit
Title: Neuroscience, Probability, and Topology
Abstract: Neuroscience is a relatively young field, and the intersection of mathematics and neuroscience (specifically, of probability and topology with neuroscience) is just in its beginnings. However, more and more models show that the way our brains process and form memories of the world around us parallels notions found in fundamental theorems and properties of topology and probability. In this talk, I will give a broad survey of some of the topology and probability theory that has appeared so far in mathematical neuroscience. This talk will be accessible to those without knowledge of either neuroscience or topology, and will be kept intuitive; all are welcome.
November 7 - Talk Rescheduled
November 14
Speaker: Addie Duncan
Title: The Geometry and Algebra of Computer Vision (or how the words "applied" and "algebraic geometry" can fit in the same sentence)
Abstract: The haters like to make fun of the silly little pure mathematicians and their silly little math problems, but what the haters don't know is that algebraic geometry is the reason we have a COVID-19 vaccine... No, this isn't clickbait; this is the truth. Don't be a hater, and come learn what applied algebraic geometry has to do with one of the most important pharmacological innovations of our time! In this talk, we will focus on a particular computer vision task that involves reconstructing 3D geometry from 2D images. It turns out that algebraic geometry (and group theory) is crucial for constructing the mathematical framework in which such a task can be completed. But lucky for you, algebraic geometry will not be crucial for understanding this talk. I will take you on a journey through both the geometric and algebraic setup for the reconstruction problem and discuss some very cool applications. By the end, I hope we will all leave with a little more respect for both pure and applied mathematicians.
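As a small taste of the reconstruction task (my own sketch; the framework the talk develops is more sophisticated): given a 3D point's images in two known cameras, linear (DLT) triangulation recovers the point by solving a homogeneous least-squares system. The underlying projection equations x ~ P X are polynomial, which is where algebraic geometry enters the fuller story.

```python
# Linear (DLT) triangulation: each image of a point gives two linear
# constraints on its homogeneous coordinates X, and the stacked system
# A X = 0 is solved via SVD. (Toy cameras below are my own choices.)
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3-D point from its projections in two 3x4 cameras."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]                      # null vector of A, up to scale
    return X[:3] / X[3]             # de-homogenize

# Two toy cameras: identity pose, and a unit translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

X_true = np.array([0.3, -0.2, 4.0, 1.0])     # a 3-D point (homogeneous)
x1 = (P1 @ X_true)[:2] / (P1 @ X_true)[2]    # its two 2-D images
x2 = (P2 @ X_true)[:2] / (P2 @ X_true)[2]
print(triangulate(P1, P2, x1, x2))           # ~ [0.3, -0.2, 4.0]
```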
November 21 - Thanksgiving Break
November 28
Speaker: Benjamin Nativi
Title: SVM, the Kernel Trick, and Random Fourier Features
Abstract: While not the most commonly used method for classification today, the Support Vector Machine (SVM) is still an interesting classification technique with a rich mathematical development. Linear SVMs work well, but only in a restrictive context. However, the SVM can be generalized with the kernel trick to kernel machines, a more expressive classification technique that does not scale well with the number of data points in the dataset. Thus, we finish by discussing random feature methods, which also use kernels but scale better with the size of the dataset. One particularly cool random feature method relies on the Fourier transform to generate random features, and this is the method we will discuss. This talk will be a great introduction for anyone interested in learning about SVMs, kernels, or random features. You will not need any familiarity with SVMs or the Fourier transform to follow this talk. While I will cover each part only at a surface level, you should come away with the core ideas and motivations. The goal of my talk is to help you understand the story, not the technical details.
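A minimal sketch of the random Fourier feature idea (mine, not the speaker's): the Gaussian (RBF) kernel is the Fourier transform of a Gaussian density, so sampling frequencies from that density gives random features z(x) whose inner product approximates the kernel, at a cost that scales with the number of features rather than the number of data points.

```python
# Random Fourier features for the RBF kernel k(x, y) = exp(-||x - y||^2 / 2):
# with W ~ N(0, I) and b ~ Uniform[0, 2*pi), the features
# z(x) = sqrt(2/D) * cos(W x + b) satisfy E[z(x).z(y)] = k(x, y).
import numpy as np

rng = np.random.default_rng(0)
d, D = 5, 2000                         # input dim, number of random features

W = rng.standard_normal((D, d))        # frequencies ~ N(0, I): the RBF spectrum
b = rng.uniform(0.0, 2.0 * np.pi, D)   # random phases

def rff(x):
    """Random feature map with z(x) @ z(y) ~ exp(-||x - y||^2 / 2)."""
    return np.sqrt(2.0 / D) * np.cos(W @ x + b)

x, y = rng.standard_normal(d), rng.standard_normal(d)
exact = np.exp(-np.linalg.norm(x - y) ** 2 / 2.0)
approx = rff(x) @ rff(y)
print(exact, approx)                   # the two numbers should be close
```

A linear SVM trained on these features then approximates a kernel SVM while scaling linearly in the dataset size.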
December 5 - No Talk