9:10 Registration desk open
9:30 Opening
9:45 Invited Talk I
Lessons from Working on Spectral Learning and Neural Networks by Dr. Karl Stratos (Bloomberg L.P.)
Abstract:
In this talk, I will share a few lessons I have learned about research from working on spectral algorithms and, more recently, neural networks. First, I will describe how it took me many years in my PhD to come to grips with the rich theory underlying spectral methods, such as the deep connection between linear-algebraic and variational characterizations, and how this understanding proved crucial for my thesis work deriving spectral methods for NLP. Second, I will bring up the tension between theory (i.e., what is right) and practice (i.e., what works) by contrasting the spectral framework with the paradigm of neural networks, the dominant approach in NLP today. I will conclude with a brief illustration of "real-world NLP" at Bloomberg and my personal suggestions to PhD students on doing research.
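As a concrete aside on the linear-algebraic/variational connection mentioned in the abstract (an illustration added here, not material from the talk): the top eigenvector of a symmetric matrix can be obtained algebraically via eigendecomposition, or characterized variationally as the maximizer of the Rayleigh quotient x^T A x / x^T x. A minimal NumPy sketch, with all names and dimensions chosen for illustration:

    import numpy as np

    # Illustrative sketch: the algebraic and variational views of the
    # top eigenvector of a symmetric matrix coincide (Courant-Fischer).
    rng = np.random.default_rng(0)
    M = rng.standard_normal((5, 5))
    A = (M + M.T) / 2                      # a symmetric matrix

    # Algebraic view: eigendecomposition (eigenvalues in ascending order).
    eigvals, eigvecs = np.linalg.eigh(A)
    v_top = eigvecs[:, -1]                 # eigenvector of the largest eigenvalue

    # Variational view: v_top maximizes the Rayleigh quotient over all
    # nonzero vectors, and the maximum equals the largest eigenvalue.
    def rayleigh(x):
        return (x @ A @ x) / (x @ x)

    assert np.isclose(rayleigh(v_top), eigvals[-1])
    x = rng.standard_normal(5)             # any other direction scores no higher
    assert rayleigh(x) <= eigvals[-1] + 1e-12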
Profile:
Karl Stratos is a research scientist on the NLP team at Bloomberg. He received a PhD in computer science from Columbia University, where he focused on spectral methods (i.e., algorithms that use eigendecomposition) in the context of NLP. His PhD advisor was Michael Collins; he also worked closely with Daniel Hsu during this time. He is broadly interested in statistical approaches to language processing, and particularly in methods that can leverage unlabeled data, which draws his research toward semi-supervised and unsupervised learning.
http://karlstratos.com
10:45 Invited Talk II
Structured Neural Networks for NLP: From Idea to Code by Professor Graham Neubig (Language Technologies Institute, Carnegie Mellon University)
Abstract:
In this presentation, I'll talk about how to implement neural network models for natural language processing applications, specifically focusing on neural networks that have complicated structures like parse trees. The explanation will include examples of how to use the toolkit DyNet (https://github.com/clab/dynet), which was designed with these sorts of structured models in mind.
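To give a flavor of what such structured models look like in code (an illustrative sketch, not material from the talk; the tuple tree encoding, dimensions, and vocabulary size below are assumptions), a recursive network can compose a binary parse tree bottom-up in DyNet, rebuilding the computation graph to match each sentence's tree:

    import dynet as dy

    HIDDEN, VOCAB = 64, 10000                       # illustrative sizes

    pc = dy.ParameterCollection()
    E = pc.add_lookup_parameters((VOCAB, HIDDEN))   # word embeddings
    pW = pc.add_parameters((HIDDEN, 2 * HIDDEN))    # composition weights
    pb = pc.add_parameters((HIDDEN,))               # composition bias

    def compose(tree, W, b):
        """Build an expression bottom-up over a ("word", i) / ("node", l, r) tree."""
        if tree[0] == "word":
            return dy.lookup(E, tree[1])            # leaf: embedding lookup
        left, right = compose(tree[1], W, b), compose(tree[2], W, b)
        return dy.tanh(W * dy.concatenate([left, right]) + b)

    dy.renew_cg()                                   # fresh graph per example,
    W, b = dy.parameter(pW), dy.parameter(pb)       # so it can follow each tree
    tree = ("node", ("word", 1), ("node", ("word", 2), ("word", 3)))
    print(compose(tree, W, b).value())              # fixed-size tree encoding

The point of DyNet's dynamic-graph design is visible here: compose simply recurses over whatever tree each sentence has, and the graph is rebuilt per example.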
Profile:
Graham Neubig is an assistant professor at the Language Technologies Institute of Carnegie Mellon University. Previously, he was an assistant professor at the Nara Institute of Science and Technology (Japan) after receiving his doctoral degree from Kyoto University (Japan). His research focuses on linguistically motivated machine learning methods for language and speech processing, with a particular focus on machine translation and speech translation.
http://www.phontron.com (Twitter: @gneubig)
11:45 Lunch break
12:50 Invited Talk III
Variety in research, research in variety by Professor Barbara Plank (University of Groningen)
Abstract:
In NLP, we typically deal with data from a variety of sources, including data from different locations, languages, and media. In this talk, I will focus on *variety* from two different angles: from a research perspective, how data variety affects NLP tools and possible ways to address it; and from a personal perspective, how variety in research and different research environments can 'spice up' a young investigator's research life.
Profile:
Barbara Plank is an Assistant Professor in Natural Language Processing at the Center for Language and Cognition Groningen, University of Groningen, The Netherlands.
http://www.let.rug.nl/~bplank/
13:50 Invited Talk IV
Recent Advances in Neural Machine Translation by Professor Yang Liu (Tsinghua University)
Abstract:
Neural machine translation (NMT), which aims to translate natural languages using neural networks, has attracted intensive attention over the past two years. Unlike traditional statistical methods that rely on manual feature engineering, NMT is capable of learning representations directly from data and of capturing long-distance dependencies via gating and attention mechanisms. This talk will introduce recent advances in NMT, including new mechanisms and architectures, training models with respect to evaluation metrics, techniques for handling large vocabularies, and low-resource language translation. The talk closes with a discussion of future directions for NMT.
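For readers unfamiliar with the attention mechanism mentioned above, the core computation is small: at each target step the decoder scores every source hidden state, normalizes the scores with a softmax, and uses the resulting weighted average as context. A minimal NumPy sketch (dot-product scoring chosen here for brevity; names and shapes are illustrative assumptions):

    import numpy as np

    def attention_context(dec_state, enc_states):
        """Dot-product attention over encoder states.

        dec_state:  (d,)   current decoder hidden state
        enc_states: (n, d) one encoder hidden state per source word
        """
        scores = enc_states @ dec_state                   # alignment scores, (n,)
        scores -= scores.max()                            # numerical stability
        weights = np.exp(scores) / np.exp(scores).sum()   # softmax over source
        return weights @ enc_states                       # context vector, (d,)

    # Toy usage: 4 source words, hidden size 3.
    ctx = attention_context(np.random.randn(3), np.random.randn(4, 3))
    print(ctx.shape)                                      # (3,)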
Profile:
Yang Liu is an Associate Professor in the Department of Computer Science and Technology, Tsinghua University. He received his PhD from the Institute of Computing Technology, Chinese Academy of Sciences, in 2007. His research focuses on natural language processing and machine translation. He has published over 40 papers in leading NLP/AI journals and conferences such as Computational Linguistics, ACL, AAAI, EMNLP, and COLING. He won the COLING/ACL 2006 Meritorious Asian NLP Paper Award. He served as ACL 2014 Tutorial Co-Chair, ACL 2015 Local Arrangements Co-Chair, ACL 2016 SRW Faculty Advisor, EMNLP 2016 Area Chair, and SIGHAN Information Officer.
http://nlp.csai.tsinghua.edu.cn/~ly/
14:50 Afternoon break
15:10 Company presentations
Poster flash talks
15:30 Poster session
1. Tree transducer induction for Question-Answering over Large Knowledge Bases by Pascual Martinez-Gomez and Yusuke Miyao
2. Rule Generation for a Filipino Style and Grammar Checker by Nathaniel Oco
3. Recursive Neural Networks for Sentiment Classification with Discourse and Constituent Parses by Shiima Sudayama, Takeshi Abekawa and Akihiko Takano
4. Phrase Compositionality using Image Groundings by Dan Han and Pascual Martinez-Gomez
5. Incremental processing for highly responsive spoken interaction by Timo Baumann
6. Predictive Incremental Syntax Parsing by Arne Köhn
7. Short-long / Long-short Preferences in English/Japanese Processing Revisited by Zhang Longtu
8. Composing Distributed Representations of Relational Patterns by Sho Takase, Naoaki Okazaki and Kentaro Inui
9. Incremental alignment with the Berkeley Aligner for learner data evaluation by Katherine McCurdy
10. Aiming at computable and expressive event representations suited for semantic operations by Darina Benikova and Torsten Zesch
11. Generating Video Description using Sequence-to-sequence Model with Temporal Attention by Natsuda Laokulrat, Sang Phan, Noriki Nishida, Raphael Shu, Yo Ehara, Naoaki Okazaki, Yusuke Miyao and Hideki Nakayama
12. Fine-grained Twitter sentiment analysis by Georgios Balikas
13. Towards a Neural Conversation Model that Meets the Objective of the Dialogue by Shota Sato, Kazuaki Inada, Reina Akama, Sousuke Kobayashi, Naoya Inoue, Naoaki Okazaki and Kentaro Inui
14. Learning to Predict Success of Books by Suraj Maharjan, John Arevalo, Fabio A. González and Thamar Solorio
15. Predicting Vocabulary of Second Language Learners and Its Application to Crowdsourcing Translation by Yo Ehara
16. Normalising Medical Concepts from Personal Health Messages in Social Media by Nut Limsopatham
17:00 Closing
18:00-20:00 Social event (subject to change; limited availability)
Price: JPY 3,500 (including drinks)
Location: Marubiru (Maru Bldg.), Tako no Tetsu: http://www.takonotetsu.co.jp/
The event features takoyaki, a ball-shaped pancake filled with octopus and vegetables and Osaka's signature local dish. We will learn how to cook takoyaki and practice making it at the event!
If you wish to attend the event:
Speakers: we will send registration information by e-mail later.
Audience: please register through "Registration as audience."
Because seats are limited, registration for the event may close early.
No worries if you do not like octopus; you can customize your own pieces (skip the octopus, add your favorite ingredients, change the sauce, etc.). For more information about takoyaki, see: https://en.wikipedia.org/wiki/Takoyaki