Yasaman Bahri
Research Scientist, Google DeepMind (formerly Brain)
Contact: yasamanbahri@gmail.com
I am a computational and theoretical scientist working at the intersection of machine learning and the physical sciences (condensed matter, materials, molecular systems).
A central pillar of my recent research has been building scientific foundations for deep learning, combining theory and experiment. I work on theories of learning and computation in artificial deep neural networks, often drawing on tools and perspectives from the mathematical sciences (statistical mechanics, dynamical systems). This work has also given rise to new computational methods in machine learning, which form the basis for the open-source Neural Tangents library released by Google. By making systematic progress on our understanding of deep learning, my research in this area has been impactful for several communities that study or use these tools, including computer science, theoretical and computational neuroscience, and statistics, as well as for scientists applying ML in other fields. See Quanta Magazine for past coverage of my research.
More recently, I have also begun working on applications of machine learning to scientific areas connected with condensed matter physics, such as molecular prediction of electronic structure, discovery of quantum materials, and the use of language models for scientific computation.
Collaboration and mentorship: Please reach out if my work interests you.
Bio: I was trained as a theoretical quantum condensed matter physicist and received my Ph.D. in Physics from UC Berkeley in 2017. My graduate work was in quantum many-body theory and strongly correlated physics, and I was fortunate to have Professor Ashvin Vishwanath as my thesis advisor. My doctoral research spanned several areas, including topological phases, many-body localization, and non-Fermi liquids. My dissertation proposed new classes of quantum behavior, new routes toward realizing exotic quantum phases, and new classes of mechanical behavior arising from topological mechanisms. I got my start in theory as an undergraduate through research on tensor networks and entanglement in quantum systems with Professor Joel Moore at UC Berkeley.
Recent News
Our paper "Explaining neural scaling laws" will appear in PNAS.
New preprint evaluating the ability of large language models to perform analytic calculations in quantum many-body physics, on challenging problems drawn from research in the wild. (Under review at Nature Communications.)
A recent preprint, based on a series of lectures I gave at the Les Houches School of Physics, will appear in the Journal of Statistical Mechanics: Theory & Experiment.
I spoke at the Simons Institute (UC Berkeley) workshop on Large Language Models.
I am co-organizing a KITP program, Deep Learning from the Perspective of Physics & Neuroscience, in Winter 2023.
Invited talk at American Physical Society (APS) March Meeting 2023.