Aspen Winter 2019
Slides
Samuel Schoenholz - Priors for Deep Infinite Networks
Sho Yaida - Fluctuation-Dissipation Relation for Stochastic Gradient Descent
Levent Sagun - Over-parametrization in neural networks: observations and a definition
Ekin Cubuk - Generalization under noise, adversarial examples, and data augmentation
Dmitri Chklovskii - Neuroscience-based machine learning
Dmitry Krotov - Unsupervised Learning on a Sphere
Jack Hidary & Stefan Leichenauer - Neural network classifiers on NISQ-regime processors
Dar Gilboa - Signal propagation and gradient stability in the LSTM and GRU
Felix Draxler - Essentially No Barriers In Neural Network Energy Landscape
Vladimir Kirilin - Gradient and Hessian properties in logistic regression
Roger Grosse - Scalable Natural Gradient Training of Neural Networks
Zohar Ringel - The role of a layer in supervised learning
Jeffrey Pennington - Statistics of Random Neural Networks
Sam McCandlish - An Empirical Model of Large-Batch Training
Katherine Quinn - Visualizing Probabilities: Intensive Principal Component Analysis
Aristide Baratin - On the Spectral Bias of Neural Networks
Joshua Batson - Noise2Self: Blind Denoising with Self-Supervision
Joan Bruna - Geometric Deep Learning in a NonCommutative World
Alexander Alemi - TherML: A thermodynamics of machine learning
Jonathan Kadmon - High-dimensional manifold classification by multilayered perceptrons
Gavin Hartnett - Replica Symmetry Breaking in Bipartite Spin Glasses and Neural Networks
Yasaman Bahri - Wide, Deep Neural Networks as Gaussian Processes
Behnam Neyshabur - The Role of Over-Parametrization in the Generalization of Neural Networks
Samuel Smith - What can stochastic differential equations tell us about stochastic gradient descent and natural gradient descent?
Paul Ginsparg - Rise of the machines: deep learning from Backgammon to Skynet
Uros Seljak - EL2O and MPM: alternatives to Variational Inference and MAP/MLE
Andrea Montanari - A mean field view of the landscape of two layers neural networks
Maxime Gabella - Introduction to Topological Data Analysis
Alex Cole - Local Deformations and Global Features in Persistent Homology, Machine Learning, and Physics
Jesse Thaler - Collision Course: Particle Physics as a Machine Learning Testbed
David Spergel - ML for sub-grid physics
Jascha Sohl-Dickstein - Parameter Estimation in Intractable Probabilistic Models by Minimum Probability Flow Learning
Adam Scherlis - The Goldilocks zone: Towards better understanding of neural network loss landscapes
Anders Andreassen - JUNIPR: a Framework for Unsupervised Machine Learning in Particle Physics
David Shih - Searching for New Physics with Deep Autoencoders
Lechao Xiao - Why do ConvNets generalize better than fully-connected networks? An explanation via a mean field approach
Peter Eckersley - Impossibility and Uncertainty Theorems in AI value alignment (or why your AI should not have a utility function)
Masoud Mohseni - Approximate optimization and inference with tensor networks
Michael Douglas - Machine Learning by a String (Theorist)
Daniel Park - Optimal SGD Hyperparameters for Fully-connected Networks
Samuel Ocko - Diversity versus Depth in Neural Representations
Miles Stoudenmire - "Wavefunctions" of data: applying tensor network methods from physics to machine learning