Overview

The 2nd Workshop on Representation Learning for NLP (RepL4NLP) will be held on August 3 or August 4, and hosted by the 55th Annual Meeting of the Association for Computational Linguistics (ACL) in Vancouver, Canada. The workshop is being organised by Phil Blunsom, Antoine Bordes, Kyunghyun Cho, Shay Cohen, Chris Dyer, Edward Grefenstette, Karl Moritz Hermann, Laura Rimell, Jason Weston, and Scott Wen-tau Yih, and will be sponsored by DeepMind and Microsoft Research.

The 2nd Workshop on Representation Learning for NLP aims to continue the success of the 1st Workshop on Representation Learning for NLP, which attracted about 50 submissions and over 250 attendees and was the second most attended co-located event at ACL 2016 after WMT. That workshop was introduced as a synthesis of several years of independent *CL workshops focusing on vector space models of meaning, compositionality, and the application of deep neural networks and spectral methods to NLP. The workshop provides a forum for discussing recent advances on these topics, as well as future research directions in linguistically motivated vector-based models in NLP.

Key Dates
  • Deadline for paper submission: April 21, 2017
  • Notification of acceptance: May 19, 2017
  • Camera ready submission due: May 26, 2017
  • Early registration deadline: TBD
  • Workshop: August 3 or 4, 2017

Keynote Speakers

Topics
  • Distributional compositional semantics
  • Analysis of language using eigenvalue, singular value and tensor decompositions
  • Latent-variable and representation learning for language
  • Neural networks and deep learning in NLP
  • Word embeddings and their applications
  • Spectral learning and the method of moments in NLP
  • Language modeling for automatic speech recognition, statistical machine translation, and information retrieval
  • The role of syntax in compositional models
  • Language modeling for logical and natural reasoning
  • Integration of distributional representations with other models
  • Multi-modal learning for distributional representations
  • Knowledge base embedding