"Quantum Many-Body Physics Calculations with Large Language Models."

H. Pan, N. Mudur, W. Taranto, M. Tikhanovskaya, S. Venugopalan, Y. Bahri, M. Brenner, E. Kim

arXiv:2403.03154 (2024). Under review at Nature Communications.

"Les Houches Lectures on Deep Learning at Large and Infinite Width."

Y. Bahri, B. Hanin, A. Brossollet, V. Erba, C. Keup, R. Pacelli, J. Simon

arXiv:2309.01592 (2023). To appear in Journal of Statistical Mechanics: Theory & Experiment.

"Explaining Neural Scaling Laws."

Y. Bahri, E. Dyer, J. Kaplan, J. Lee, U. Sharma

arXiv:2102.06701 (2021). To appear in PNAS.



"Beyond the Imitation Game: Quantifying and Extrapolating the Capabilities of Language Models." 

A. Srivastava, et al. (Collaborative benchmark from 100+ institutions.)

Transactions on Machine Learning Research, 2835-8856 (2023).


"The Evolution of Out-of-Distribution Robustness Throughout Fine-Tuning."

A. Andreassen, Y. Bahri, B. Neyshabur, R. Roelofs

Transactions on Machine Learning Research, 2835-8856 (2022).

"Statistical Mechanics of Deep Learning."

Y. Bahri, J. Kadmon, J. Pennington, S. S. Schoenholz, J. Sohl-Dickstein, S. Ganguli

Annual Review of Condensed Matter Physics (2020)



"The large learning rate phase of deep learning: the catapult mechanism."

A. Lewkowycz, Y. Bahri, E. Dyer, J. Sohl-Dickstein, G. Gur-Ari

arXiv:2003.02218 (2020)



"Infinite-attention: NNGP and NTK for deep attention networks."

J. Hron, Y. Bahri, J. Sohl-Dickstein, R. Novak

ICML 2020 (International Conference on Machine Learning)



"Exact Posterior Distributions of Wide Bayesian Neural Networks.

J. Hron, Y. Bahri, R. Novak, J. Pennington, J. Sohl-Dickstein

ICML 2020 Workshop on Uncertainty in Deep Learning



"Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent."

J. Lee*, L. Xiao*, S. S. Schoenholz, Y. Bahri, J. Sohl-Dickstein, J. Pennington

*Equal contribution

NeurIPS 2019 (Advances in Neural Information Processing Systems)

Also re-published in special edition of Journal of Statistical Mechanics: Theory and Experiment (2020) 124002.



"Bayesian Convolutional Neural Networks with Many Channels are Gaussian Processes."

R. Novak, L. Xiao, J. Lee*, Y. Bahri*, G. Yang, J. Hron, D. Abolafia, J. Pennington, J. Sohl-Dickstein

*Equal contribution

ICLR 2019 (International Conference on Learning Representations)



"Dynamical isometry and a mean-field theory of convolutional networks: how to train 10,000-layer vanilla CNNs."

L. Xiao, Y. Bahri, J. Sohl-Dickstein, S. S. Schoenholz, J. Pennington

ICML 2018 (International Conference on Machine Learning)



"Deep Neural Networks as Gaussian Processes."

J. Lee*, Y. Bahri*, R. Novak, S. S. Schoenholz, J. Pennington, J. Sohl-Dickstein

*Equal contribution

ICLR 2018 (International Conference on Learning Representations)



"Sensitivity and Generalization in Neural Networks: an Empirical Study."

R. Novak, Y. Bahri, D. Abolafia, J. Pennington, J. Sohl-Dickstein

ICLR 2018 (International Conference on Learning Representations)



"Geometry of Neural Network Loss Surfaces via Random Matrix Theory."

J. Pennington, Y. Bahri

ICML 2017 (International Conference on Machine Learning)



"Phonon analog of topological nodal semimetals."

H. C. Po, Y. Bahri, A. Vishwanath

Phys. Rev. B 93, 205158 (2016) (Editor's Suggestion)



"Stable non-Fermi-liquid phase of itinerant spin-orbit coupled ferromagnets."

Y. Bahri, A. C. Potter

Phys. Rev. B 92, 035131 (2015)



"Localization and topology protected quantum coherence at the edge of hot matter."

Y. Bahri, R. Vosk, E. Altman, A. Vishwanath

Nature Communications 6:7341 (2015)



"Detecting Majorana fermions in quasi-one-dimensional topological phases using nonlocal order parameters."

Y. Bahri, A. Vishwanath

Phys. Rev. B 89, 155135 (2014)