Neuro-Symbolic Methods for Language and Vision

AAAI 2022 Tutorial | Feb 23rd 2022 9 AM - 12 PM PST

Hamid Palangi, Antoine Bosselut, and Pradeep Dasigi

Description

Neuro-symbolic methods for representing knowledge combine the compositionality and interpretability of symbolic methods with the trainability of neural networks. These methods have been successfully applied in various language and vision tasks, both for learning useful intermediate representations that facilitate complex reasoning and for incorporating semi-structured external knowledge into task-specific models. We will provide a comprehensive overview of these methods, drawing similarities between the learning algorithms used for various tasks wherever appropriate. From this tutorial, the audience can expect to get a good understanding of how neuro-symbolic methods work, their applicability to various language and vision tasks, and the open challenges that remain. This tutorial will help NLP or ML practitioners unfamiliar with neuro-symbolic methods learn the foundations of this growing area of research.

Outline

  1. Representation and reasoning in NLP with KGs: We will cover research on knowledge graph (KG) integration with NLP models. NLP models are often augmented with external KGs to provide additional contextualization for the content being processed by the model. We will explore different neural architectures, fusion methods, and training objectives for integrating knowledge from KGs into neural NLP models for commonsense representation and reasoning. We will also discuss different knowledge graphs and how their schemas and knowledge coverage motivate different integration methods.

  2. Semantic Parsing: We will cover research on semantic parsers built for various end applications like Question Answering, Text2SQL, and building virtual assistants. All these tasks involve translating natural language utterances into task-specific executable meaning representations. Depending on the end task, obtaining annotations of the meaning representations may not be feasible, and much work in the area has focused on learning algorithms that can leverage weak and distant supervision signals, which we will also cover in this session.

  3. Language Grounding in Vision: We will cover a diverse set of works under two categories: (a) creating neuro-symbolic representations as input to any model through large-scale pre-training (for example, neural scene and video graphs), and (b) constructing model-specific neuro-symbolic representations by enforcing inductive biases in the model architecture (for example, compositional vision-language models). Each category includes several downstream tasks, from visual object and relation detection to visual reasoning, image/video captioning, and image-text retrieval.
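To make the executable-semantic-parsing setting of item 2 concrete, here is a minimal, self-contained sketch: a natural language question is mapped to an executable meaning representation (SQL, in the Text2SQL case), which is then run against a database to produce the answer. The toy table, question, and hand-written "parse" are illustrative assumptions, not part of the tutorial materials; a real parser would produce the SQL from the question.

```python
import sqlite3

# Toy database standing in for the structured context a Text2SQL
# system would query (contents are made up for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE papers (title TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO papers VALUES (?, ?)",
    [("Paper A", 2016), ("Paper B", 2019), ("Paper C", 2019)],
)

question = "How many papers were published in 2019?"

# A semantic parser would translate `question` into this executable
# meaning representation; here we hard-code the target parse.
sql = "SELECT COUNT(*) FROM papers WHERE year = 2019"

# Executing the meaning representation yields the answer, which is
# also the signal used in weakly supervised training: only the final
# answer (2), not the SQL itself, needs to be annotated.
(answer,) = conn.execute(sql).fetchone()
print(answer)  # -> 2
```

The last comment hints at the weak-supervision setting covered in the session: when gold meaning representations are unavailable, parsers can be trained by comparing the result of executing a candidate parse against the annotated answer.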

Slides

aaai2022_tutorial_neurosymbolic_slides_intro_AB.pdf

Introduction

aaai2022_tutorial_neurosymbolic_slides_part1_AB.pdf

Section 1: KG-guided Neuro Symbolic Methods in NLP

AAAI 22 NSMLV Tutorial - Semantic Parsing.pdf

Section 2: Executable Semantic Parsing

AAAI_2022_Tutorial_Part_3.pdf

Section 3: Neuro-symbolic Representations for Language and Vision

Presenters

Hamid Palangi is a Senior Researcher at Microsoft Research in Redmond, Washington. His current research interests are in the areas of Natural Language Processing and Language Grounding. He is a Senior Member of the IEEE and has a PhD from the University of British Columbia.

Website, Contact Email

Antoine Bosselut is an assistant professor at the École Polytechnique Fédérale de Lausanne (EPFL). Previously, he was a postdoctoral scholar at Stanford University and a Young Investigator at the Allen Institute for AI (AI2). He completed his PhD at the University of Washington.

Website, Contact Email

Pradeep Dasigi is a Research Scientist at the Allen Institute for AI. He works on Natural Language Processing, and is particularly interested in Question Answering and Semantic Parsing. He has a PhD from Carnegie Mellon University and a Masters in Computer Science from Columbia University.

Website, Contact Email