Program

* You can book your place for the social dinner at the registration desk during registration.

Keynote Speakers

Franco Scarselli
Theory of graph neural networks: some results and perspectives

By using graphs, we can naturally represent relationships together with patterns, namely domains where the information is composed of pieces of related data. From this point of view, machine learning methods for graphs are just an extension of standard models to more complex data. However, the presence of relationships, and the fact that data complexity can be further increased, suggests many novel application and research opportunities. With this idea in mind, in this talk I will focus on the theory of graph neural networks, recalling existing results and introducing some of those obtained in our lab. Moreover, I will discuss perspectives and some of the open problems.

Alessandro Sperduti
Efficient Graph Neural Networks

We discuss different strategies to reduce the training burden of Graph Neural Networks with no or only marginal loss in performance. For node tasks, we present a hierarchy of models based on simple graph convolution operators of increasing complexity that rely on linear transformations or controlled nonlinearities, and that can be implemented in single-layer graph convolutional networks. We also introduce a novel convolutional operator named Compact Multi-head EGC (CM-EGC) that, besides exploiting a very simple graph convolution definition, also significantly reduces the number of learnable parameters compared to existing convolutions. Another applicable strategy consists in exploiting a reservoir architecture. Multiresolution Reservoir Graph Neural Networks (MRGNNs), inspired by graph spectral filtering, are an example of such an approach: they are based on an explicit k-hop unsupervised graph representation amenable to further nonlinear processing. Along this line, we also report on results obtained by untrained Graph Neural Networks for fast and accurate graph classification. Finally, we present a Backpropagation-Free training algorithm that achieves competitive results on node classification tasks while considerably reducing the training burden.

Public Talk

Giulia Cencetti
What is Network Science?

Everybody talks about networks, networking, connections, and complexity, but are we sure we know what we are talking about? In this seminar we will try to understand together what networks are, and why we cannot talk about networks without talking about complex systems. We will discover the interdisciplinary power of this subject. Then I will give a taste of how to cope with networks: how to find structures, identify common patterns, and detect similarities and differences. We will talk about centrality, six degrees of separation (and why it's a small world), communities, and higher-order networks. We will see how all this helps in explaining blackouts, managing epidemics, spreading climate change knowledge, and also understanding history. To conclude, we will focus on the role of women in network science.

The public talk will take place at "La Bookique", Via Torre d'Augusto, 29, 38122 Trento TN, on Wednesday the 29th at 9:00 pm.

This event is open to everyone.

Posters

Graph Neural Networks for Problem Detection in Scientific Network Infrastructures

Glitter or Gold? Deriving Structured Insights from Sustainability Reports via Large Language Models

Link Prediction with Physics-Inspired Graph Neural Networks 

Meta-Path Learning for Multi-relational Graph Neural Networks

Hypergraph Neural Networks through the Lens of Message Passing: A Common Perspective to Homophily and Architecture Design

A characterization theorem for equivariant networks with point-wise activations

Talks