# Lechao Xiao （萧乐超）

I am a research scientist at Google Brain, NYC, working on deep learning. Before that, I was a Hans Rademacher Instructor of Mathematics at the University of Pennsylvania. I received my PhD from the University of Illinois at Urbana-Champaign and my BA from Zhejiang University, Hangzhou, China. Here is my CV.

Email: XIAO dot HARMONIC at gmail dot com

# Research Interests: Math + ML

I am interested in machine learning, including the theory of deep learning, optimization, Gaussian processes, and generalization.

I also work on harmonic analysis: multilinear operators, oscillatory integrals, singular Radon-like transforms, time-frequency analysis, and resolution of singularities.

## Publications in Machine Learning

Neural Tangents: Fast and Easy Infinite Neural Networks in Python

Roman Novak*, Lechao Xiao*, Jiri Hron, Jaehoon Lee, Alexander A. Alemi, Jascha Sohl-Dickstein, Samuel S. Schoenholz*, ICLR 2020.

**NEURAL TANGENTS** is a library designed to enable research into infinite-width neural networks. It provides a high-level API for specifying complex and hierarchical neural network architectures. These networks can then be trained and evaluated either at finite-width as usual or in their infinite-width limit. Infinite-width networks can be trained analytically using exact Bayesian inference or using gradient descent via the Neural Tangent Kernel. Additionally, NEURAL TANGENTS provides tools to study gradient descent training dynamics of wide but finite networks in either function space or weight space. The entire library runs out-of-the-box on CPU, GPU, or TPU. All computations can be automatically distributed over multiple accelerators with near-linear scaling in the number of devices.
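As a minimal illustration of the analytic infinite-width computation described above (not the Neural Tangents API itself), the sketch below builds the NNGP kernel of an infinitely wide ReLU network layer by layer via the standard arc-cosine recursion, then performs exact Bayesian inference with GP regression. The depth, weight/bias variances, and function names here are illustrative assumptions.

```python
import numpy as np

def relu_nngp_kernel(x1, x2, depth=3, w_var=2.0, b_var=0.0):
    """NNGP kernel of an infinitely wide ReLU MLP (arc-cosine recursion).

    Depth and variance parameters are illustrative, not library defaults.
    """
    # Layer-0 kernels: scaled inner products of the inputs.
    k12 = w_var * x1 @ x2.T / x1.shape[1] + b_var
    k11 = w_var * np.sum(x1 * x1, axis=1) / x1.shape[1] + b_var
    k22 = w_var * np.sum(x2 * x2, axis=1) / x2.shape[1] + b_var
    for _ in range(depth):
        norms = np.sqrt(np.outer(k11, k22))
        cos = np.clip(k12 / norms, -1.0, 1.0)  # guard against round-off
        theta = np.arccos(cos)
        # E[relu(u) relu(v)] for centered Gaussians with these covariances.
        k12 = w_var * norms * (np.sin(theta) + (np.pi - theta) * cos) / (2 * np.pi) + b_var
        k11 = w_var * k11 / 2 + b_var  # diagonal case: theta = 0
        k22 = w_var * k22 / 2 + b_var
    return k12

def nngp_predict(x_train, y_train, x_test, depth=3, ridge=1e-6):
    """Posterior mean of exact GP regression under the NNGP kernel."""
    k_tt = relu_nngp_kernel(x_train, x_train, depth)
    k_st = relu_nngp_kernel(x_test, x_train, depth)
    return k_st @ np.linalg.solve(k_tt + ridge * np.eye(len(x_train)), y_train)
```

In the library itself, a `kernel_fn` of this kind is derived automatically from the architecture specification rather than hand-written as here.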

Wide Neural Networks of Any Depth Evolve as Linear Models Under Gradient Descent, Jaehoon Lee*, Lechao Xiao*, Samuel S. Schoenholz, Yasaman Bahri, Jascha Sohl-Dickstein, Jeffrey Pennington. NeurIPS 2019.

Dynamical isometry and a mean field theory of CNNs: How to train 10,000-layer vanilla convolutional neural networks, Lechao Xiao, Yasaman Bahri, Jascha Sohl-Dickstein, Samuel S. Schoenholz, Jeffrey Pennington. ICML 2018.

Training ultra-deep CNNs with critical initialization, Lechao Xiao, Yasaman Bahri, Samuel S. Schoenholz, Jeffrey Pennington. NIPS 2017 workshop.

## Publications in Mathematics

Higher decay inequalities for multilinear oscillatory integrals, with M. Gilula and P.T. Gressman, Mathematical Research Letters, 25(3), 819-842, 2018

Endpoint estimates for one-dimensional oscillatory integral operators, Adv. in Math., 316 (2017), 255-291.

Sharp Estimates for Trilinear Oscillatory Integrals and an Algorithm of Two-dimensional Resolution of Singularities, Rev. Mat. Ibero., 33, No. 1 (2017), 67-116.

Maximal Decay Inequalities for Trilinear Oscillatory Integrals of Convolution Type, with P.T. Gressman, J. Funct. Anal., 271, No. 12 (2016), 3695-3726.

Bilinear Hilbert Transforms Associated with Plane Curves, with J. Guo, J. of Geom. Anal., 26 (2016), no. 2, 967-995.

Uniform Estimates for Bilinear Hilbert Transforms and Bilinear Maximal Functions Associated to Polynomials, with X. Li, Amer. J. Math., 138, No. 4 (2016), 907-962.

# Teaching

## Penn

## UIUC

Last Update: 12/05/2018