Deep Learning and Formal Languages: Building Bridges
Schedule
9:00-9:05 Opening Words.
9:05-9:45 Kevin Knight: Do Simpler Automata Learn Better?
9:45-9:51 Poster Spotlights:
- William Merrill: Sequential Neural Networks as Automata.
- Chunyang Xiao, Christoph Teichmann and Konstantine Arkoudas: Grammatical Sequence Prediction for Real-Time Neural Semantic Parsing.
- Mathijs Mul and Willem Zuidema: Siamese recurrent networks can learn first-order logic reasoning and exhibit zero-shot generalization.
9:51-10:30 Ariadna Quattoni: A story about weighted automata (WFAs), RNNs and low-rank Hankel Matrices.
10:30-11:00 Break
11:00-11:40 Rémi Eyraud: Distilling computational models from Recurrent Neural Networks.
11:40-11:45 Poster Spotlights:
- Fabio Massimo Zanzotto, Giordano Cristini and Giorgio Satta: CYK Parsing over Distributed Representations.
- Farhana Ferdousi Liza and Marek Grzes: Relating RNN layers with the spectral WFA ranks in sequence modelling.
11:45-12:25 John Kelleher: Using formal grammars to test the ability of recurrent neural networks to model long-distance dependencies in sequential data.
12:25-12:30 Poster Spotlights:
- Abhijit Mahalunkar and John Kelleher: Using SPk Languages to Explore the Characteristics of Long-Distance Dependencies.
- Mirac Suzgun, Yonatan Belinkov, Stuart Shieber and Sebastian Gehrmann: LSTM Networks Can Perform Dynamic Counting.
12:30-14:00 Lunch
14:00-15:30 Poster Session
15:30-16:00 Break
16:00-16:40 Robert Frank: Beyond testing and acceptance: On the study of formal and natural languages in neural networks.
16:40-17:20 Noah Smith: Rational Recurrences for Empirical Natural Language Processing.
17:20-17:30 Closing Discussion.