Neural Nets Back to The Future @ ICML-16 - Abstracts


Machine Learning and Neural Nets at Bell Labs, Holmdel

Larry Jackel, North-C Technologies

From 1986 until the 1995 break-up of AT&T, Bell Labs in Holmdel, NJ was a hotbed of activity in machine learning and neural net research. Enormous talent gathered in the Holmdel group, including researchers Bernhard Boser, Leon Bottou, Jane Bromley, Corinna Cortes, John Denker, Harris Drucker, Hans Peter Graf, Isabelle Guyon, Don Henderson, Rich Howard, Wayne Hubbard, Yann LeCun, Stuart Mackie, Nada Matic, Urs Muller, Edi Sackinger, Patrice Simard, Marcus Schenkel, Bernhard Scholkopf, Dan Schwartz, Sara Solla, Rulei Ting, and Vladimir Vapnik. Most of these people continue to have a major impact today. While at Holmdel, Boser, Guyon, and Vapnik invented Kernel Support Vector Machines, one of the most widely used learned classification methods. During the same period, LeCun pioneered the trained Convolutional Neural Nets that have revolutionized pattern recognition and image analysis. Nearly all members of the group contributed to creating practical products derived from their research, including high-volume character recognition technology used in check reading.

This talk will describe the events that led to the formation of the Holmdel group and will highlight some of its achievements. We will also show how ConvNets, similar to those invented at Holmdel in 1989, are now being used at the same site to steer self-driving cars.

Bio:

Larry Jackel is President of North-C Technologies, where he does professional consulting. From 2003 to 2007 he was a DARPA Program Manager in the IPTO and TTO offices. He conceived and managed programs in Universal Network-Based Document Storage and in Autonomous Ground Robot Navigation and Locomotion. For most of his scientific career Jackel was a manager and researcher at Bell Labs and then AT&T Labs. He has created and managed research groups in Microscience and Microfabrication, in Machine Learning and Pattern Recognition, and in Carrier-Scale Telecom Services. Jackel holds a PhD in Experimental Physics from Cornell University, with a thesis on superconducting electronics. He is a Fellow of the American Physical Society and of the IEEE.






Automatic Differentiation versus Back-Propagation: The Human Touch

Barak Pearlmutter, Hamilton Institute at Maynooth University 

The very first computer science PhD dissertation introduced forward accumulation mode automatic differentiation. In the 1970s, tools for automated generation of adjoint codes (aka reverse accumulation mode automatic differentiation, aka backpropagation) were developed. Yet even today, our tools for high-performance numeric computation do not support automatic differentiation as a first-class citizen. In this talk, I will discuss the history of automatic differentiation and of backpropagation and its embellishments and variants (backpropagation through time, RTRL, etc.) and try to account for why these two subdisciplines, both concerned to a large extent with the efficient calculation of derivatives of high-dimensional numeric functions expressed as computer programs, have enjoyed such a low level of cross-pollination.
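
The distinction at the heart of the talk is easy to make concrete. Below is a minimal Python sketch, purely illustrative and not any particular AD tool (the names Dual, Var, f, dsin, and vsin are made up for this example): forward accumulation propagates a directional derivative alongside each value in a single forward pass, while reverse accumulation records local derivatives on the way forward and then sweeps backward once from the output, which is precisely the structure of backpropagation.

    import math

    # Forward accumulation: a dual number carries a value together with a
    # directional derivative; every operation propagates both.
    class Dual:
        def __init__(self, val, dot=0.0):
            self.val, self.dot = val, dot
        def __add__(self, o):
            return Dual(self.val + o.val, self.dot + o.dot)
        def __mul__(self, o):
            return Dual(self.val * o.val, self.val * o.dot + self.dot * o.val)

    def dsin(x):
        return Dual(math.sin(x.val), math.cos(x.val) * x.dot)

    def f(x, y):  # example function f(x, y) = x*y + sin(x)
        return x * y + dsin(x)

    # One forward pass per input direction:
    print(f(Dual(2.0, 1.0), Dual(3.0)).dot)  # df/dx = y + cos(x)
    print(f(Dual(2.0), Dual(3.0, 1.0)).dot)  # df/dy = x

    # Reverse accumulation (backpropagation): each node remembers its
    # parents and the local partial derivative with respect to each.
    class Var:
        def __init__(self, val, parents=()):
            self.val, self.parents, self.grad = val, parents, 0.0
        def __add__(self, o):
            return Var(self.val + o.val, ((self, 1.0), (o, 1.0)))
        def __mul__(self, o):
            return Var(self.val * o.val, ((self, o.val), (o, self.val)))

    def vsin(x):
        return Var(math.sin(x.val), ((x, math.cos(x.val)),))

    x, y = Var(2.0), Var(3.0)
    out = x * y + vsin(x)
    out.grad = 1.0
    stack = [out]  # this simple sweep suffices for this expression; a
    while stack:   # general tool visits nodes in reverse topological order
        v = stack.pop()
        for parent, local in v.parents:
            parent.grad += local * v.grad
            stack.append(parent)
    print(x.grad, y.grad)  # all partials from a single backward pass

The contrast this illustrates: forward mode costs one pass per input direction, while reverse mode costs one pass per scalar output, which is why reverse accumulation wins for the many-inputs, one-scalar-loss functions typical of neural network training.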

Bio:

Prof. Barak A. Pearlmutter received his PhD from Carnegie Mellon for work on neural networks as gradient systems and on the efficient calculation of gradients and related quantities (Hessian-vector products, etc.). Since then, in addition to work in machine learning, signal processing, and theoretical neurobiology, one of his main interests has been automatic differentiation: both formalizing automatic differentiation and adding efficient generalized first-class differentiation operators to modern functional programming languages.
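
One of those related quantities admits a recipe worth stating (informally sketched here): for a scalar function f, the Hessian-vector product Hv can be computed without ever forming the Hessian, by differentiating the gradient in the direction v,

    Hv = d/dr ∇f(x + r·v) |_{r=0} = ∇_x ( ∇f(x) · v ),

i.e., forward-mode differentiation applied on top of a reverse-mode gradient, at only a small constant-factor overhead over a single gradient evaluation.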