Invited Speakers

Ruth Kempson FBA, Professor Emerita, Philosophy Department, King’s College London.

Crossing disciplinary boundaries: Language as Interaction Control Mechanisms

In this talk, I shall urge the Dynamic Syntax (DS) view that a natural language grammar is a set of processes inducing incremental context-relative coordination of action (Kempson et al. 2001, 2016, 2019). I will show how, on the one hand, this approach nests within enactive perspectives on cognition and, on the other, how it fits within the multi-level selection account of evolution, in which both groups and individuals can function as primitives. From both points of view, language can then be seen as an adaptive, niche-constructive capacity enabling distributed inter-individual control of cognitive and material achievements during interactions (rather than one for which interactive effects have to be inferentially explained).

Bio:

With many years working at the interface of syntax, semantics, and pragmatics, Ruth Kempson has recently become best known for her leadership of the Dynamic Syntax framework in which a natural language grammar is defined as a set of procedures inducing incremental context-relative growth of information.

Central to this model are concepts of initial underspecification of information and update reflecting the time-line of on-line processing, from which the dynamics of participant exchange in conversational dialogue follow as an immediate consequence. Current work includes exploring the potential of Vector Space Semantics for modelling nondeterminism of natural language meaning, and using the model to probe language evolution.

Detailed bio and recent publications, and abstract

Dr Jamie Kiros, Google Brain

Language Grounding, Structure and their relevance to Semantic Spaces

In the past year, advances in large-scale language modeling and contextualized word representations have led to significant improvements across several language processing tasks. The current trend is to scale up the amount of text processed and the size of the models, but is this sufficient? In this talk I will argue for the importance of research in language grounding and structural priors, and their relevance to constructing semantic spaces. I will attempt to characterize when and where these approaches can be beneficial, and highlight recent research we have performed in these areas, including picture-based word representations, language-instructed RL agents, web navigation, semi-autoregressive machine translation and learning universal contextualized sentence representations.

Bio:

Jamie Kiros is a Research Scientist at Google Brain Toronto in Geoffrey Hinton's group. She completed her PhD at the University of Toronto under the supervision of Dr. Richard Zemel and Dr. Ruslan Salakhutdinov. Her recent work has focused on natural language grounding, learning representations of sentences and neural machine translation.

Dr Sanjaye Ramgoolam, Queen Mary University of London

Permutation invariant Gaussian Matrix/Tensor models for Compositional Distributional Semantics

Compositional distributional semantics produces large collections of matrices and tensors, associated with words, according to their grammatical type. An important problem is to characterize the statistics of these collections of matrices/tensors. This is a challenging problem involving many random variables, where meaning intermingles with randomness.
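To make the setting concrete, here is a minimal numpy sketch of how compositional distributional semantics assigns matrices to words of higher grammatical type: nouns are vectors, and an adjective is a matrix that composes with a noun by matrix-vector application. The dimensions, vocabulary, and random values below are purely illustrative stand-ins for corpus-derived representations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy distributional vectors for nouns (dimension d is tiny for illustration;
# real corpus-derived spaces use hundreds of dimensions).
d = 4
noun_vectors = {
    "dog": rng.normal(size=d),
    "car": rng.normal(size=d),
}

# In this framework an adjective is represented by a d x d matrix
# (and a transitive verb by a higher-order tensor); composition is
# application of the matrix/tensor to its arguments.
adjective_matrices = {
    "fast": rng.normal(size=(d, d)),
}

def compose(adj, noun):
    """Meaning of 'adj noun' = adjective matrix applied to the noun vector."""
    return adjective_matrices[adj] @ noun_vectors[noun]

fast_car = compose("fast", "car")  # a d-dimensional meaning vector
```

A large corpus yields one such matrix per adjective or verb, and it is the statistics of these collections of matrices that the talk addresses.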

Random matrix theories in physics have exploited symmetries in order to extract universal features of matrix statistics, applicable to many systems such as complex nuclei, atoms and molecules as well as chaotic systems. Their applications have further extended to a variety of systems in natural and data sciences in recent years. Random matrices and higher tensors also find applications in theoretical models of holographic duality in string theory, where matrix/tensor systems admit emergent dual descriptions involving two-dimensional surfaces.

In the paper Linguistic Matrix Theory (Kartsaklis, Ramgoolam, Sadrzadeh, 2017), ideas from random matrix theory were applied to characterize the statistics of the adjective and verb matrices generated in compositional distributional semantics. Permutation invariance was argued to be the appropriate symmetry. Expectation values of permutation invariant polynomial functions of the matrices were studied from both an experimental and a theoretical perspective. The resulting theory was a five-parameter Gaussian model, which takes linear and quadratic expectation values as input and predicts higher-order expectation values that can be compared with the data.
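As a toy illustration of the kind of quantities involved, the sketch below computes a few permutation invariant polynomial functions of a matrix (functions unchanged under simultaneous row-and-column permutations, such as the trace and the sum of all entries) and averages them over a random ensemble. The specific invariants and the plain Gaussian ensemble here are illustrative choices, not the paper's actual five-parameter model.

```python
import numpy as np

rng = np.random.default_rng(1)
d = 5
# A toy "ensemble" of d x d matrices standing in for corpus-derived
# adjective/verb matrices.
matrices = [rng.normal(size=(d, d)) for _ in range(100)]

def linear_invariants(M):
    """Two linear permutation invariants:
    S1(M) = sum_i M_ii (the trace) and S2(M) = sum_{i,j} M_ij."""
    return np.trace(M), M.sum()

def quadratic_invariants(M):
    """Two example quadratic invariants:
    sum_{i,j} M_ij^2 and sum_{i,j} M_ij M_ji."""
    return (M * M).sum(), (M * M.T).sum()

# Ensemble expectation values of the invariants: in the Gaussian model,
# the linear and quadratic values are the inputs, and the model then
# predicts higher-order invariants for comparison with data.
exp_lin = np.mean([linear_invariants(M) for M in matrices], axis=0)
exp_quad = np.mean([quadratic_invariants(M) for M in matrices], axis=0)
```

The point of the symmetry is that averaging such invariants over the ensemble is basis-independent with respect to relabelling of the coordinate axes, which is what makes them natural observables for matrix statistics.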

Bio:

Dr Sanjaye Ramgoolam completed his PhD in theoretical physics at Yale University, and held post-doctoral positions at Princeton and Brown Universities. He was awarded an STFC Advanced Fellowship and took up a faculty position at Queen Mary University of London. Dr Ramgoolam’s research interests have centred on the phenomenon of gauge-string duality, where string theories emerge from quantum field theories. He has developed a framework of discrete families of algebras, and representation theoretic Fourier transforms on these algebras, as a tool for uncovering emergent features in the combinatorics of matrix/tensor quantum fields. He has collaborated on applications of this framework to problems in quantum information theory and computational linguistics.

Detailed abstract and bio