Overview

The 4th Workshop on Representation Learning for NLP (RepL4NLP) will be hosted by ACL 2019 and held on 2 August 2019. The workshop is organised by Isabelle Augenstein, Spandana Gella, Sebastian Ruder, Katharina Kann, Burcu Can, Alexis Conneau, Johannes Welbl, Xiang Ren and Marek Rei, and advised by Kyunghyun Cho, Edward Grefenstette, Karl Moritz Hermann, Chris Dyer and Laura Rimell. It is run by the ACL Special Interest Group on Representation Learning (SIGREP) and receives generous sponsorship from Facebook AI Research, Amazon, and Naver.

The 4th Workshop on Representation Learning for NLP aims to continue the success of the 1st Workshop on Representation Learning for NLP (about 50 submissions and over 250 attendees; the second most attended collocated event at ACL'16 after WMT), as well as the 2nd and 3rd Workshops on Representation Learning for NLP. The workshop was introduced as a synthesis of several years of independent *CL workshops focusing on vector space models of meaning, compositionality, and the application of deep neural networks and spectral methods to NLP. It provides a forum for discussing recent advances in these areas, as well as future research directions in linguistically motivated vector-based models in NLP.


Key Dates

  • Deadline for paper submission: 3 May 2019 (extended from 26 April 2019)
  • Notification of acceptance: 24 May 2019
  • Camera-ready submission due: 3 June 2019
  • Early registration deadline: TBA
  • Workshop: 2 August 2019


Keynote Speakers

  • Marco Baroni, Facebook AI Research
  • Yulia Tsvetkov, Carnegie Mellon University
  • Raquel Fernández, University of Amsterdam
  • Mohit Bansal, UNC Chapel Hill


Discussion Panel Members

TBA.


Topics

  • Compositionality with distributed representations and the role of syntax
  • Analysis of language using eigenvalue, singular value and tensor decompositions
  • Latent-variable and representation learning for language
  • Neural networks and deep learning in NLP
  • Training, evaluating, and applying representations
  • Spectral learning and the method of moments in NLP
  • Language models for different applications and understanding representations
  • Multi-modal learning for distributional representations
  • Knowledge base and graph embeddings


Contact: repl4nlp@googlegroups.com