Morgane Austern

I am an assistant professor of Statistics at Harvard University. I am interested in understanding the behavior of algorithms and statistical estimators in the presence of a large amount of dependence. My work consists of developing new probability tools and using them to establish the properties of learning algorithms in structured and dependent data settings. I graduated with a PhD in statistics from Columbia University in 2019, where I worked in collaboration with Peter Orbanz and Arian Maleki on limit theorems for dependent and structured data. For two great years (2019-2021), I was a postdoctoral researcher at Microsoft Research New England. In the fall of 2022 I was a long-term visitor at the Simons Institute for the Theory of Computing. In 2022 I was named a Kavli Fellow by the National Academy of Sciences. In 2023 I was invited to speak at the National Academies of Sciences, Engineering, and Medicine on the mathematical foundations of machine learning, in a symposium on AI for mathematical reasoning.

My research has notably extended limit theorems for dependent data and matrices, studied graph representation learning, proposed new methods for obtaining concentration inequalities, and established the properties of resampling methods such as cross-validation and the bootstrap. My current work is motivated by high-dimensional statistics, stable matching problems, and random matrix theory.