Lechao Xiao (萧乐超)

I am a research scientist at Google Brain, NYC, working on deep learning. Before that, I was a Hans Rademacher Instructor of Mathematics at the University of Pennsylvania. I received my PhD from the University of Illinois at Urbana-Champaign and my BA from Zhejiang University, Hangzhou, China. Here is my CV.

Email: XIAO dot HARMONIC at gmail dot com

Research Interests: Math + ML

I am interested in machine learning, the theory of deep learning, optimization, Gaussian processes, and generalization.

I also work on harmonic analysis: multilinear operators, oscillatory integrals, singular Radon-like transforms, time-frequency analysis and resolution of singularities.



Publications in Machine Learning

Neural Tangents: Fast and Easy Infinite Neural Networks in Python

Roman Novak*, Lechao Xiao*, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Sohl-Dickstein, Samuel S. Schoenholz*, ICLR 2020.

NEURAL TANGENTS is a library designed to enable research into infinite-width neural networks. It provides a high-level API for specifying complex and hierarchical neural network architectures. These networks can then be trained and evaluated either at finite-width as usual or in their infinite-width limit. Infinite-width networks can be trained analytically using exact Bayesian inference or using gradient descent via the Neural Tangent Kernel. Additionally, NEURAL TANGENTS provides tools to study gradient descent training dynamics of wide but finite networks in either function space or weight space. The entire library runs out-of-the-box on CPU, GPU, or TPU. All computations can be automatically distributed over multiple accelerators with near-linear scaling in the number of devices.
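For illustration, here is a minimal sketch of the workflow described above, using the library's stax-style API. The layer widths and the random toy data are placeholders chosen for the example, and exact signatures may differ between library versions.

```python
import jax.random as random
import neural_tangents as nt
from neural_tangents import stax

# Specify an architecture with stax-style combinators; the same triple
# (init_fn, apply_fn, kernel_fn) supports both finite- and infinite-width use.
init_fn, apply_fn, kernel_fn = stax.serial(
    stax.Dense(512), stax.Relu(),
    stax.Dense(512), stax.Relu(),
    stax.Dense(1),
)

# Placeholder data purely for illustration.
key1, key2, key3 = random.split(random.PRNGKey(1), 3)
x_train = random.normal(key1, (20, 32))
y_train = random.normal(key2, (20, 1))
x_test = random.normal(key3, (5, 32))

# Closed-form predictions of the infinite-width network: exact Bayesian
# inference via the NNGP kernel, and gradient-descent training via the NTK.
predict_fn = nt.predict.gradient_descent_mse_ensemble(kernel_fn, x_train, y_train)
y_nngp = predict_fn(x_test=x_test, get='nngp')
y_ntk = predict_fn(x_test=x_test, get='ntk')
```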




Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent, Jaehoon Lee*, Lechao Xiao*, Samuel S. Schoenholz, Yasaman Bahri, Jascha Sohl-Dickstein, Jeffrey Pennington. NeurIPS 2019.


Dynamical Isometry and a Mean Field Theory of CNNs: How to Train 10,000-Layer Vanilla Convolutional Neural Networks, Lechao Xiao, Yasaman Bahri, Jascha Sohl-Dickstein, Samuel S. Schoenholz, Jeffrey Pennington. ICML 2018.


Training ultra-deep CNNs with critical initialization, Lechao Xiao, Yasaman Bahri, Samuel S. Schoenholz, Jeffrey Pennington. NIPS 2017 workshop.


Teaching

Penn

UIUC

Last Update: 12/05/2018