Schedule
September 7, 2017
(Location: Nørrebros Runddel, CPH Conference-ground floor, Copenhagen, Denmark)
- 09:00--09:10 Opening Remarks
- Hinrich Schuetze
- 09:10--09:50 Subword-level Information in NLP using Neural Networks (slides)
- Invited Talk (Tomas Mikolov)
- 09:50--10:30 Chewing the Fat about Mincing Words (slides)
- Invited Talk (Noah Smith)
- 10:30--11:00 Coffee break
- 11:00--11:40 Neural WFSTs: Tutorial Talk
- Invited Talk (Ryan Cotterell)
- 11:40--12:10 Best paper awards (sponsor: Google) & presentations
- 11:40--11:55 Character and Subword-Based Word Representation for Neural Language Modeling Prediction
- Matthieu Labeau and Alexandre Allauzen
- 11:55--12:10 Learning variable length units for SMT between related languages via Byte Pair Encoding
- Anoop Kunchukuttan and Pushpak Bhattacharyya
- 12:10--14:00 Poster session & Lunch break
- Character and Subword-Based Word Representation for Neural Language Modeling Prediction
- Learning variable length units for SMT between related languages via Byte Pair Encoding
- Character Based Pattern Mining for Neology Detection
- Automated Word Stress Detection in Russian
- A Syllable-based Technique for Word Embeddings of Korean Words
- Supersense Tagging with a Combination of Character, Subword, and Word-level Representations
- Weakly supervised learning of allomorphy
- Character-based recurrent neural networks for morphological relational reasoning
- Glyph-aware Embedding of Chinese Characters
- Exploring Cross-Lingual Transfer of Morphological Knowledge In Sequence-to-Sequence Models
- Language Generation with Recurrent Generative Adversarial Networks without Pre-training
- Natural Language Generation through Character-Based RNNs with Finite-State Prior Knowledge
- Patterns versus Characters in Subword-aware Neural Language Modeling
- 14:00--14:40 Fully Character Level Neural Machine Translation (slides)
- Invited Talk (Kyunghyun Cho)
- 14:40--15:50 Poster session & Coffee break
- Unlabeled Data for Morphological Generation With Character-Based Sequence-to-Sequence Models
- Vowel and Consonant Classification through Spectral Decomposition
- Syllable-level Neural Language Model for Agglutinative Language
- Character-based Bidirectional LSTM-CRF with words and characters for Japanese Named Entity Recognition
- Word Representation Models for Morphologically Rich Languages in Neural Machine Translation
- Spell-Checking based on Syllabification and Character-level Graphs for a Peruvian Agglutinative Language
- What do we need to know about an unknown word when parsing German
- A General-Purpose Tagger with Convolutional Neural Networks
- Reconstruction of Word Embeddings from Sub-Word Parameters
- Inflection Generation for Spanish Verbs using Supervised Learning
- Neural Paraphrase Identification of Questions with Noisy Pretraining
- Sub-character Neural Language Modelling in Japanese
- Byte-based Neural Machine Translation
- Improving Opinion-Target Extraction with Character-Level Word Embeddings
- Align and Copy: Hard Attention Models for Morphological Inflection Generation
- 15:50--16:30 Acoustic Word Embeddings (slides)
- Invited Talk (Karen Livescu)
- 16:30--17:30 Panel discussion
- Kyunghyun Cho, Sharon Goldwater, Karen Livescu, Tomas Mikolov, Noah Smith
- 17:30--17:45 Closing remarks
- Hinrich Schuetze