Aims and Scope

Vector embeddings of word meanings have become a mainstream technique in large-scale natural language processing. Vectors are also used to represent meanings in the semantic spaces and feature spaces of cognitive science. Independently of natural language and cognitive science, vectors and vector spaces have long been used as models of physical theories, most notably quantum mechanics. Crucial similarities between the vector representations of quantum mechanics and those of natural language are exhibited via bicompact linear logic and compact closed categorical structures in natural language.
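
For illustration, one standard way this similarity is made concrete (a sketch, assuming the pregroup-based categorical compositional model of meaning) computes the meaning of a transitive sentence with the 'cup' morphisms of a compact closed category, the same maps that model Bell states and effects in categorical quantum mechanics:

\[
\overrightarrow{\text{subj verb obj}} \;=\; (\epsilon_N \otimes 1_S \otimes \epsilon_N)\bigl(\overrightarrow{\text{subj}} \otimes \overline{\text{verb}} \otimes \overrightarrow{\text{obj}}\bigr),
\qquad \overline{\text{verb}} \in N \otimes S \otimes N,
\]

mirroring the pregroup reduction \(n\,(n^r s\, n^l)\,n \to s\).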

Exploiting the common ground provided by vector spaces, the proposed workshop will bring together researchers working at the intersection of NLP, cognitive science, and physics, offering them an appropriate forum for presenting their uniquely motivated work and ideas. The interplay between these three disciplines will foster theoretically motivated approaches to understanding how the meanings of words interact with each other in sentences and discourse via grammatical types, how they are determined by input from the world, and how word and sentence meanings interact logically.

Topics of interest include (but are not restricted to):

  • Reasoning in semantic spaces
  • Compositionality in semantic spaces and conceptual spaces
  • Conceptual spaces in linguistics and natural language processing
  • Applications of quantum logic in natural language processing and cognitive science
  • Modelling functional words such as prepositions and relative pronouns in compositional distributional models of meaning
  • Diagrammatic reasoning for natural language processing and cognitive science
  • Modelling so-called ‘non-compositional’ phenomena such as metaphor

Special Session

We will have a special session on the relevance of formal grammar methods to deep learning and other statistical and vector space approaches to language. Examples of phenomena where these methods come into play include (but of course are not limited to) anaphora resolution, long-range filler-gap dependencies, function-argument relations, locality domains, and syntactic structures in general. This session is organised jointly with the Formal Grammar conference (http://fg.phil.hhu.de/2020/).