Programme

Live Video Stream for attendees

For links to prerecorded videos and live chat sessions (#channel), see the papers listed below

Schedule, 10 July 2020 (times in PDT, Seattle):

  • Chair: Lexi Birch
  • 8:30-9:00 Kenneth Heafield Findings of the Efficiency Task
  • 9:00-9:40 Stephen Mayhew Findings of the STAPLE Task
  • Chair: Yannis Konstas
  • 9:45-10:30 Live invited talk: Claire Gardent "Generating Long Form Text from Long Form Multi Document Input"
  • 10:30-10:45 Live Q&A
  • 10:45-11:30 Break
  • 11:30-12:15 Live invited talk: Shashi Narayan "On Analysing and Generating Faithful Text"
  • 12:15-12:30 Live Q&A
  • 12:30-12:40 Video of best paper talk: Meta-Learning for Few-Shot NMT Adaptation Amr Sharaf, Hany Hassan and Hal Daumé III (video) #wngt20-meta-learning
  • Chair: Lexi Birch
  • 12:40-12:50 Live Best Paper Prize giving and Q&A
  • 12:50-13:30 Break
  • 13:30-15:00 Live paper presentation session: see below for links to the 5-10 minute videos and the live chat channels (authors may optionally create Zoom sessions and link them in the chat)
  • Chair: Graham Neubig
  • 15:00-15:45 Live invited talk: He He "Text Generation by Offline Reinforcement Learning"
  • 15:45-16:00 Live Q&A
  • 16:00-16:45 Live invited talk: Jiatao Gu "Multilingual Denoising Pre-training for Neural Machine Translation"
  • 16:45-17:00 Live Q&A


Please find published papers here:

https://www.aclweb.org/anthology/volumes/2020.ngt-1/


Findings Papers:

  • Findings of the Fourth Workshop on Neural Generation and Translation Kenneth Heafield, Hiroaki Hayashi, Yusuke Oda, Ioannis Konstas, Andrew Finch, Graham Neubig, Xian Li and Alexandra Birch (live) #wngt20-findings
  • Simultaneous Translation and Paraphrasing for Language Education Stephen Mayhew, Klinton Bicknell, Chris Brust, Will Monroe, Bill McDowell and Burr Settles (live) #wngt20-simultaneous


Research Papers:

  • Learning to Generate Multiple Style Transfer Outputs for an Input Sentence Kevin Lin, Ming-Yu Liu, Ming-Ting Sun and Jan Kautz (video) #wngt20-learningtogenerate
  • Balancing Cost and Benefit with Tied-Multi Transformers Raj Dabre, Raphael Rubino and Atsushi Fujita (video) #paper-wngt-tiedtransformers
  • Compressing Neural Machine Translation Models with 4-bit Precision Alham Fikri Aji and Kenneth Heafield (video) #wngt20-compressingneural
  • Meta-Learning for Few-Shot NMT Adaptation Amr Sharaf, Hany Hassan and Hal Daumé III (video) #wngt20-meta-learning
  • Automatically Ranked Russian Paraphrase Corpus for Text Generation Vadim Gudkov, Olga Mitrofanova and Elizaveta Filippskikh (video) #wngt20-automatically
  • Increasing Lexical Diversity in Plug and Play Language Models Soham Parikh, Daphne Ippolito and Satyarth Vaidya (Extended Abstract) (video) #wngt20-increasing-lexical
  • A Deep Reinforced Model for Zero-Shot Cross-Lingual Summarization with Bilingual Semantic Similarity Rewards Zi-Yi Dou, Sachin Kumar and Yulia Tsvetkov (video) #wngt20-deepreinforced
  • A Question Type Driven and Copy Loss Enhanced Framework for Answer-Agnostic Neural Question Generation Xiuyu Wu, Nan Jiang and Yunfang Wu (video) #wngt20-questiontype
  • When and Why is Unsupervised Neural Machine Translation Useless? Yunsu Kim, Miguel Graça and Hermann Ney (Cross Submission) (video) #wngt20-whenwhy
  • A Generative Approach to Titling and Clustering Wikipedia Sections Anjalie Field, Sascha Rothe, Simon Baumgartner, Cong Yu and Abe Ittycheriah (video) #wngt20-generativeapproach
  • The Unreasonable Volatility of Neural Machine Translation Models Marzieh Fadaee and Christof Monz (video) #wngt20-unreasonablevolatility
  • Leveraging Sentence Similarity in Natural Language Generation: Improving Beam Search using Range Voting Sebastian Borgeaud and Guy Emerson (video) #wngt20-leveragingsentence
  • Transformers without Tears: Improving the Normalization of Self-Attention Toan Q. Nguyen and Julian Salazar (Cross Submission) (video) #wngt20-transformerstears
  • Masked Language Model Scoring Julian Salazar, Davis Liang, Toan Q. Nguyen and Katrin Kirchhoff (Cross Submission) (video) #paper-main-240
  • Distill, Adapt, Distill: Training Small, In-Domain Models for Neural Machine Translation Mitchell Gordon and Kevin Duh (video) #wngt20-distilladapt
  • Improving Neural Machine Translation Using Energy-Based Models Subhajit Naskar, Amirmohammad Rooshenas and Andrew McCallum (video) #wngt20-energybased


EFFICIENCY TASK System Papers

  • Edinburgh’s Submissions to the 2020 Machine Translation Efficiency Task Nikolay Bogoychev, Roman Grundkiewicz, Alham Fikri Aji, Maximiliana Behnke, Kenneth Heafield, Sidharth Kashyap, Emmanouil-Ioannis Farsarakis and Mateusz Chudyk (video) #wngt20-edinburgh
  • Efficient and High-Quality Neural Machine Translation with OpenNMT Guillaume Klein, Dakun Zhang, Clément Chouteau, Josep Crego and Jean Senellart (video) #wngt20-opennmt
  • The NiuTrans System for WNGT 2020 Efficiency Task Chi Hu, Bei Li, Yinqiao Li, Ye Lin, Yanyang Li, Chenglong Wang, Tong Xiao and Jingbo Zhu (video) #wngt20-niutranssystem


DGT TASK System Papers

  • Improving Document-Level Neural Machine Translation with Domain Adaptation Sami Ul Haq, Sadaf Abdul Rauf, Arslan Shoukat and Noor-e-Hira #wngt20-improvingdocument


STAPLE TASK System Papers

  • Training and Inference Methods for High-Coverage Neural Machine Translation Michael Yang, Yixin Liu and Rahul Mayuranath (video) #wngt20-highcoverage
  • Meeting the 2020 Duolingo Challenge on a Shoestring Tadashi Nomoto (video) #wngt20-shoestring
  • English-to-Japanese Diverse Translation by Combining Forward and Backward Outputs Masahiro Kaneko, Aizhan Imankulova, Tosho Hirasawa and Mamoru Komachi (video) #wngt20-english-to-japanesediverse
  • POSTECH Submission on Duolingo Shared Task Junsu Park, Hongseok Kwon and Jong-Hyeok Lee (video) #wngt20-postech
  • The ADAPT System Description for the STAPLE 2020 English-to-Portuguese Translation Task Rejwanul Haque, Yasmin Moslem and Andy Way (video) #wngt20-adapt
  • Expand and Filter: CUNI and LMU Systems for the WNGT 2020 Duolingo Shared Task Jindřich Libovický, Zdeněk Kasner, Jindřich Helcl and Ondřej Dušek (video) #wngt20-expandandfilter
  • Exploring Model Consensus to Generate Translation Paraphrases Zhenhao Li, Marina Fomicheva and Lucia Specia (video) #wngt20-modelconsensus
  • Growing Together: Modeling Human Language Learning With n-Best Multi-Checkpoint Machine Translation El Moatez Billah Nagoudi, Muhammad Abdul-Mageed and Hasan Cavusoglu (video) #wngt20-growingtogether
  • Generating Diverse Translations via Weighted Fine-tuning and Hypotheses Filtering for the Duolingo STAPLE Task Sweta Agrawal and Marine Carpuat (video) #wngt20-weightedfine-tuning
  • The JHU Submission to the 2020 Duolingo Shared Task on Simultaneous Translation and Paraphrase for Language Education Huda Khayrallah, Jacob Bremerman, Arya D. McCarthy, Kenton Murray, Winston Wu and Matt Post (video) #wngt20-jhu
  • Simultaneous paraphrasing and translation by fine-tuning Transformer models Rakesh Chada (video) #wngt20-simultaneousparaphrasing