Speakers

Andrey Gromov (University of Maryland)

Andrey Gromov is an assistant professor of Physics at the University of Maryland, College Park, and the Condensed Matter Theory Center. His physics research has focused on emergent macroscopic properties in classical and quantum strongly correlated systems, including the quantum Hall effect, topological phases, liquid crystals, and hydrodynamics. He is currently interested in developing a quantitative understanding of emergent properties in neural networks, such as grokking, scaling laws, abrupt changes in performance, neural network pruning, and more.

Elvis Dohmatob (Meta AI)

Elvis Dohmatob is a researcher at FAIR (Meta AI) in Paris. His research focuses on theoretical and algorithmic aspects of trustworthy ML, the main themes being adversarial examples and, more recently, efficient recommender systems under fairness constraints. This includes studying phase transitions in models' behaviour and understanding the fundamental tradeoffs between robustness, fairness, and accuracy.

Eric Michaud (MIT)

Eric Michaud is a PhD student at MIT, advised by Max Tegmark. His research aims to improve our scientific understanding of deep neural networks. This broadly includes understanding (1) why deep learning works well, (2) what deep neural networks learn, and (3) how their structure develops over training. His recent work studies grokking, neural scaling laws, and emergence. Before his PhD, Eric studied math at UC Berkeley.

Hugo Cui (EPFL)

Hugo Cui is a PhD student at the École Polytechnique Fédérale de Lausanne (EPFL), advised by Lenka Zdeborová. His research lies at the crossroads between machine learning theory, statistical physics, and high-dimensional statistics. Prior to this, Hugo graduated from ENS Paris in theoretical physics.

Jacob Steinhardt (UC Berkeley)

Jacob Steinhardt is an assistant professor of Statistics at the University of California, Berkeley, whose research focuses on making the conceptual advances necessary for reliable and human-value-aligned machine learning systems. His research areas include the robustness and security of machine learning systems, reward specification and learning human values, and scalable alignment through studying the macroeconomic equilibria of such systems. Recently, he has also studied the science of deep learning. Beyond his technical research, Jacob collaborates with policy researchers on the use and misuse of machine learning, serves as a technical advisor to the Open Philanthropy Project, and writes a blog that includes forecasts for the capabilities of future ML systems.

Kshitij Gupta

KG

Neel Nanda

Neel Nanda is an independent researcher focused on mechanistic interpretability, which involves reverse engineering the algorithms learned by a trained neural network. Previously, he worked as a language model interpretability researcher at Anthropic under the supervision of Chris Olah. Before that, he completed an undergraduate degree in pure mathematics at Cambridge. Neel is passionate about addressing the existential risk posed by powerful AI, which he considers one of the most crucial problems of this century.

Pascal Jr. Tikeng Notsawo

Pascal Jr. Tikeng Notsawo is a first-year Master's student in Computer Science at the University of Montréal and Mila, where he works on understanding generalization and optimization in deep learning from both dynamical-systems and neuroscience perspectives. Pascal also holds a Master of Engineering in computer science from the National Advanced School of Engineering, Yaoundé.

Rylan Schaeffer

Rylan Schaeffer is a CS PhD student at Stanford University in Professor Sanmi Koyejo's lab. Rylan is interning this summer on Meta's Generative AI LLM team, and has previously researched, worked, or interned at MIT, Harvard, DeepMind, University College London, and Uber. For fun, he enjoys organizing and leading bakery crawls.

Ziming Liu (MIT)

Ziming Liu is a Physics PhD student at MIT and IAIFI, advised by Prof. Max Tegmark. His research focuses on the intersection of artificial intelligence and physics, including but not limited to (1) AI for physics: extracting physical insights (e.g., conservation laws and symmetries) from data, and improving prediction accuracy and sampling efficiency for data analysis in physics; and (2) physics for AI: developing effective theories to understand the dynamics and generalization of neural networks, and building physics-inspired machine learning models. Before that, Ziming received his Bachelor's degree in physics from Peking University.