Schedule
Friday, July 20, Room 211
09:00–09:10 Welcome and Opening Remarks
Findings of the Second Workshop on Neural Machine Translation and Generation
Alexandra Birch, Andrew Finch, Minh-Thang Luong, Graham Neubig and Yusuke Oda
09:10–10:00 Keynote 1: Real-time High-quality Neural MT Decoding on Mobile Devices
Jacob Devlin
10:00–10:30 Shared Task Overview
10:30–11:00 Coffee Break
11:00–11:30 Marian: Fast Neural Machine Translation in C++
11:30–12:20 Keynote 2: Why the Time Is Ripe for Discourse in Machine Translation
Rico Sennrich
12:20–13:20 Lunch Break
13:20–13:50 Best Paper Session
13:50–14:40 Keynote 3: Beyond Softmax: Sparsity, Constraints, Latent Structure -- All End-to-End Differentiable!
André Martins
14:40–15:30 Keynote 4: Towards Flexible but Controllable Language Generation
Yulia Tsvetkov
15:30–16:00 Coffee Break
16:00–17:30 Poster Session
A Shared Attention Mechanism for Interpretation of Neural Automatic Post-Editing Systems
Inigo Jauregi Unanue, Ehsan Zare Borzeshi and Massimo Piccardi
Iterative Back-Translation for Neural Machine Translation
Vu Cong Duy Hoang, Philipp Koehn, Gholamreza Haffari and Trevor Cohn
Inducing Grammars with and for Neural Machine Translation
Yonatan Bisk and Ke Tran
Regularized Training Objective for Continued Training for Domain Adaptation in Neural Machine Translation
Huda Khayrallah, Brian Thompson, Kevin Duh and Philipp Koehn
Controllable Abstractive Summarization
Angela Fan, David Grangier and Michael Auli
Enhancement of Encoder and Attention Using Target Monolingual Corpora in Neural Machine Translation
Kenji Imamura, Atsushi Fujita and Eiichirō Sumita
Document-Level Adaptation for Neural Machine Translation
Sachith Sri Ram Kothur, Rebecca Knowles and Philipp Koehn
On the Impact of Various Types of Noise on Neural Machine Translation
Huda Khayrallah and Philipp Koehn
Bi-Directional Neural Machine Translation with Synthetic Parallel Data
Xing Niu, Michael Denkowski and Marine Carpuat
Multi-Source Neural Machine Translation with Missing Data
Yuta Nishimura, Katsuhito Sudoh, Graham Neubig and Satoshi Nakamura
Towards one-shot learning for rare-word translation with external experts
Ngoc-Quan Pham, Jan Niehues and Alexander Waibel
NICT Self-Training Approach to Neural Machine Translation at NMT-2018
Kenji Imamura and Eiichirō Sumita
Fast Neural Machine Translation Implementation
Hieu Hoang, Tomasz Dwojak, Rihards Krislauks, Daniel Torregrosa and Kenneth Heafield
OpenNMT System Description for WNMT 2018: 800 words/sec on a single-core CPU
Jean Senellart, Dakun Zhang, Bo Wang, Guillaume Klein, Jean-Pierre Ramatchandirin, Josep Crego and Alexander Rush
Marian: Cost-effective High-Quality Neural Machine Translation in C++
Marcin Junczys-Dowmunt, Kenneth Heafield, Hieu Hoang, Roman Grundkiewicz and Anthony Aue
On Individual Neurons in Neural Machine Translation
D. Anthony Bau, Yonatan Belinkov, Hassan Sajjad, Nadir Durrani, Fahim Dalvi and James Glass
Parameter Sharing Strategies in Neural Machine Translation
Sébastien Jean, Stanislas Lauly and Kyunghyun Cho
Modeling Latent Sentence Structure in Neural Machine Translation
Joost Bastings, Wilker Aziz, Ivan Titov and Khalil Sima'an
Extreme Adaptation for Personalized Neural Machine Translation
Paul Michel and Graham Neubig
Exploiting Semantics in Neural Machine Translation with Graph Convolutional Networks
Diego Marcheggiani, Joost Bastings and Ivan Titov
17:30–17:40 Closing Remarks