Overview

The 3rd Workshop on Representation Learning for NLP (RepL4NLP) will be held on 20 July 2018, and hosted by ACL 2018 in Melbourne, Australia. The workshop is being organised by Isabelle Augenstein, Kris Cao, He He, Felix Hill, Spandana Gella, Jamie Kiros, Hongyuan Mei and Dipendra Misra, and advised by Kyunghyun Cho, Edward Grefenstette, Karl Moritz Hermann and Laura Rimell. The workshop receives generous sponsorship from DeepMind, Microsoft Research, Facebook AI Research, Salesforce and Bloomberg.

The 3rd Workshop on Representation Learning for NLP aims to continue the success of the 1st Workshop on Representation Learning for NLP (about 50 submissions and over 250 attendees; the second most attended co-located event at ACL'16 after WMT) and the 2nd Workshop on Representation Learning for NLP. These workshops were introduced as a synthesis of several years of independent *CL workshops focusing on vector space models of meaning, compositionality, and the application of deep neural networks and spectral methods to NLP. The workshop provides a forum for discussing recent advances on these topics, as well as future research directions in linguistically motivated vector-based models in NLP.

Key Dates
  • Deadline for paper submission: April 2018
  • Notification of acceptance: 7 May 2018
  • Camera ready submission due: 28 May 2018
  • Early registration deadline: TBD
  • Workshop: 20 July 2018

Keynote Speakers
Speakers to be confirmed.

Topics
  • Distributional compositional semantics
  • Analysis of language using eigenvalue, singular value and tensor decompositions
  • Latent-variable and representation learning for language
  • Neural networks and deep learning in NLP
  • Word embeddings and their applications
  • Spectral learning and the method of moments in NLP
  • Language modeling for automatic speech recognition, statistical machine translation, and information retrieval
  • The role of syntax in compositional models
  • Language modeling for logical and natural reasoning
  • Integration of distributional representations with other models
  • Multi-modal learning for distributional representations
  • Knowledge base embedding