Keynote Speakers
Full Professor @ University of Pisa
A journey through efficient deep learning on graphs
The talk will very briefly introduce the area of Deep Learning for graphs, with a focus on its origins.
We will then discuss advanced topics and current open issues through an overview of recent progress in my research group, with particular emphasis on efficiency and on the interplay between the depth of models and the learning of complex data representations.
Associate Professor @ UiT The Arctic University of Norway
Tutorial: Hierarchical pooling in Graph Neural Networks
Pooling is a technique used to summarize local information within a deep learning architecture. In a GNN, hierarchical pooling produces increasingly smaller graphs, which can be used to gradually distill global graph properties and to extend the range of message-passing operations. This tutorial is a gentle introduction to hierarchical graph pooling, covering: a general framework to express a graph pooling operator, the main families of graph pooling methods, and techniques used to evaluate the effectiveness of a pooling method.
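To make the idea of "increasingly smaller graphs" concrete, here is a minimal sketch of one hierarchical pooling step on a plain adjacency-list graph, outside any GNN library. The greedy pairing of neighbouring nodes stands in for a learned or clustering-based assignment (in the spirit of Graclus-style coarsening); the function names are illustrative, not from any specific framework.

```python
def coarsen(adj):
    """One pooling step: merge each unmatched node with an unmatched
    neighbour, returning (coarse_adj, assignment) where
    assignment[node] = cluster id in the coarsened graph."""
    assignment = {}
    cluster = 0
    for u in adj:
        if u in assignment:
            continue
        assignment[u] = cluster
        # greedily pair u with its first unmatched neighbour
        for v in adj[u]:
            if v not in assignment:
                assignment[v] = cluster
                break
        cluster += 1
    # clusters are adjacent in the coarse graph if any of their members were
    coarse_adj = {c: set() for c in range(cluster)}
    for u, nbrs in adj.items():
        for v in nbrs:
            cu, cv = assignment[u], assignment[v]
            if cu != cv:
                coarse_adj[cu].add(cv)
                coarse_adj[cv].add(cu)
    return coarse_adj, assignment

# A 6-node ring 0-1-2-3-4-5-0, pooled twice into ever-smaller graphs
ring = {i: {(i - 1) % 6, (i + 1) % 6} for i in range(6)}
level1, assign1 = coarsen(ring)   # 3 supernodes
level2, _ = coarsen(level1)      # 2 supernodes
```

Stacking such steps is what lets a hierarchy of pooled graphs shorten the effective distance between far-apart nodes for message passing.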
PostDoc @ Technion - Israel Institute of Technology
Advances in Subgraph GNNs for Expressive and Efficient Learning on Graphs
In this keynote we will discuss recent advances in efficient and expressive Graph Representation Learning, focussing on the family of Subgraph Graph Neural Networks (Subgraph GNNs).
After introducing the concept of expressive power and the limitations of traditional message-passing approaches, we will discuss Subgraph GNNs as flexible, more expressive alternatives for learning on graphs. We will chart their design space and expressiveness properties, whilst highlighting their main downside: a high computational complexity due to modelling graphs as bags of subgraphs.
At the core of the keynote we will present HyMN: a novel Subgraph GNN that balances expressiveness and efficiency by combining subgraph sampling and feature augmentations based on Structural Encodings (SEs). Through a connection to perturbation analysis, we will uncover how walk-based node centrality measures can be employed to effectively subsample bags of subgraphs and reduce their size. We will show how these measures can also be used to derive SEs, and illustrate how these are integrated into HyMN for enhanced discriminative power, strong empirical performance and affordable computational complexity.
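The subsampling idea described above can be sketched in a few lines: rank nodes by a walk-based centrality (here, simply the number of length-3 walks starting at each node) and keep only the subgraphs generated from the top-k nodes, instead of the full bag of n subgraphs. The names and the specific centrality below are illustrative assumptions, not the exact HyMN recipe.

```python
def walk_centrality(adj, n, length=3):
    """Count walks of the given length starting at each node by
    repeatedly summing counts over neighbours (one step per iteration)."""
    counts = [1.0] * n                 # walks of length 0
    for _ in range(length):
        counts = [sum(counts[v] for v in adj[u]) for u in range(n)]
    return counts

def sample_subgraph_bag(adj, n, k):
    """Keep the k node-marked subgraphs rooted at the most central nodes.
    Each 'subgraph' is represented by its marked root, as in
    node-marking Subgraph GNNs."""
    scores = walk_centrality(adj, n)
    roots = sorted(range(n), key=lambda u: -scores[u])[:k]
    return [{"marked_root": r, "adj": adj} for r in roots]

# Star graph: node 0 connected to leaves 1..4; the hub is most central,
# so its subgraph is kept first when we subsample the bag
star = {0: [1, 2, 3, 4], 1: [0], 2: [0], 3: [0], 4: [0]}
bag = sample_subgraph_bag(star, 5, k=2)
```

Reducing the bag from n subgraphs to k of them is where the efficiency gain comes from, at the cost of the expressiveness contributed by the discarded roots.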
We will conclude by sharing thoughts on future research directions for advancing expressive Graph Representation Learning.