
 
Neural Nets Back to the Future @ ICML 16
June 23rd 2016 at Crowne Plaza in NYC

   
A workshop linking past, present, and future research on neural networks


Overview

While research in deep learning is extremely active today, it is worth taking a step back to examine its foundations. We propose to take a critical look at previous work on neural networks and to better understand how it differs from today's work. Previous work can point to promising directions to follow, pitfalls to avoid, and ideas and assumptions to revisit. Similarly, today's progress allows a critical examination of what still needs to be investigated and what has already been answered.


This workshop will be the occasion to

- critically review previous work

- discuss forgotten "gems" from the earlier literature

- understand how mechanisms we take for granted today came to be (e.g. gating, transfer functions)

- understand research trajectories: how ideas went from promising to successful (or not), and what inspired influential solutions

- revisit assumptions about what could or could not work (e.g. vanishing gradient issues)


The program mixes invited talks by influential scientists who have contributed to our field for a long time, presentations of forgotten "gems", and panel discussions involving a variety of researchers across neural network eras.


Recordings

The workshop was recorded; we will post links to the talks and slides as soon as they are available.

Location

ICML workshops are all within one or two blocks of each other around Times Square. We are located at

Crowne Plaza (Broadway Room) at 1605 Broadway, New York, NY

Workshop Schedule

08:20 am

Welcome and Introduction


08:30 am

Larry Jackel - Machine Learning and Neural Nets at Bell Labs, Holmdel

North-C, Toyota Research Institute, NVIDIA [video]

09:15 am

Gerry Tesauro - On TD-Learning and links with DeepRL in Atari and AlphaGo

IBM Research [video]

10:00 am

Coffee Break


10:30 am

Yoshua Bengio - Learning Long-Term Dependencies with Gradient Descent is Difficult

University of Montréal [video]


10:50 am

C. Lee Giles - Recurrent Neural Networks: State Machines and Pushdown Automata

Penn State University [video]


11:30 am

Panel discussion with Yoshua Bengio, C. Lee Giles and Tomas Mikolov


12:00 pm

Lunch Break


01:30 pm

Patrice Simard - Backpropagation without Multiplication

Microsoft Research, Redmond [video]

01:50 pm

Yann Dauphin - Tangent Propagation

Facebook AI Research, Menlo Park [video]


02:10 pm

John Platt - Let your Network Grow

Google Research, Seattle [video]


02:30 pm

Panel with Yann LeCun, Patrice Simard and John Platt

[video]


03:00 pm 

Coffee Break



03:30 pm

Barak Pearlmutter - Automatic Differentiation versus Back-Propagation: The Human Touch

Maynooth University, Ireland [video]

04:10 pm

Leon Bottou - Graph Transformer Networks

Facebook AI Research, NYC [video]


Important Dates

June 23rd   Workshop Day

Organizers

Acknowledgements
We are thankful to our speakers and to our program committee (Michael Auli, Ronan Collobert, Yann Dauphin, Marc'Aurelio Ranzato).
Photo Credit: Garry Knight