Plenary talks

Bastian Rieck

Short Bio: Bastian is the principal investigator of the AIDOS Lab at the Institute of AI for Health at Helmholtz Munich, Germany. His main research interest lies in developing multi-scale algorithms for analysing complex data sets, with a focus on biomedical applications and healthcare topics. Bastian is also enticed by finding new ways to explain neural networks using concepts from algebraic and differential topology. He is a big proponent of scientific outreach and enjoys blogging about his research, academia, supervision, and software development. Bastian received his M.Sc. degree in mathematics, as well as his Ph.D. in computer science, from Heidelberg University in Germany.

Title: Topology-Based Graph Learning

Abstract: Topological data analysis is starting to establish itself as a powerful and effective framework in machine learning, supporting the analysis of neural networks, but also driving the development of novel algorithms that incorporate topological characteristics. As a problem class, graph representation learning is of particular interest here, since graphs are inherently amenable to a topological description in terms of their connected components and cycles. This talk will provide an overview of how to address graph learning tasks using machine learning techniques, with a specific focus on how to make such techniques 'topology-aware.' We will discuss how to learn filtrations for graphs and how to incorporate topological information into modern graph neural networks, resulting in provably more expressive algorithms. This talk aims to be accessible to a general scientific audience; knowledge of machine learning and/or TDA is helpful but not required.
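To make the topological description mentioned above concrete, here is a minimal sketch (my own illustration, not material from the talk) that computes a graph's connected components and independent cycles (the Betti numbers beta_0 and beta_1) with networkx, and tracks how components merge along an edge-weight filtration; all function names are illustrative.

```python
# Minimal sketch: Betti numbers of a graph and a simple edge-weight filtration.
import networkx as nx

def betti_numbers(G: nx.Graph) -> tuple[int, int]:
    """Return (beta_0, beta_1): connected components and independent cycles."""
    beta_0 = nx.number_connected_components(G)
    # For a graph, beta_1 = |E| - |V| + number of connected components.
    beta_1 = G.number_of_edges() - G.number_of_nodes() + beta_0
    return beta_0, beta_1

def component_filtration(G: nx.Graph):
    """Track beta_0 while adding edges in order of increasing weight
    (a sublevel-set filtration on edge weights)."""
    H = nx.Graph()
    H.add_nodes_from(G.nodes)
    history = []
    for u, v, w in sorted(G.edges(data="weight"), key=lambda e: e[2]):
        H.add_edge(u, v)
        history.append((w, nx.number_connected_components(H)))
    return history

if __name__ == "__main__":
    G = nx.Graph()
    G.add_weighted_edges_from([(0, 1, 0.1), (1, 2, 0.2), (2, 0, 0.5), (3, 4, 0.3)])
    print(betti_numbers(G))          # (2, 1): two components, one cycle
    print(component_filtration(G))   # beta_0 drops as edges enter the filtration
```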



Slides of Bastian Rieck's talk: SlidesRieck2022Aug04.pdf

Shubhendu Trivedi

Short Bio: Shubhendu is a researcher interested in foundational problems in machine learning, especially those that incorporate geometric structure into neural networks, employ statistical physics-based approaches for neural network analysis, and develop conformal prediction methods for uncertainty quantification. Shubhendu has been a research associate at MIT and an NSF Institute Fellow in Computational and Experimental Mathematics at Brown University, working on problems in algebraic machine learning. He received his PhD and MS degrees working at the Toyota Technological Institute at Chicago and the University of Chicago in 2018, with a thesis on group-equivariant and symmetry-preserving neural networks. He also holds an MS with work on applications of the Szemerédi Regularity Lemma and a Bachelor's degree in Electrical Engineering. Apart from academic research, Shubhendu has also led multiple teams for industrial research on health analytics, equivariant models for relational data, knowledge graph engineering, and zero-shot transfer learning. He has also been associated with a semiconductor startup.

Title: Advances in Equivariant Learning


Abstract: Originally inspired by convolutional neural networks in computer vision, equivariant networks have now emerged as a successful class of models in a wide variety of domains such as protein design, drug discovery, reinforcement learning, and learning physics. The development of such networks, however, requires a careful examination of the underlying symmetries/geometric structure of the problem. In the first part of this talk, I will give an overview of some theoretical results (including ongoing work) in the area that have unified and often guided some of the development of such networks. Then I will present an efficient and very general framework to construct such networks, with an example application to spherical image recognition, and summarize some open questions. Next, I will discuss the construction of such networks in the context of partial symmetries (groupoids) and dynamical systems. Finally, I will very briefly discuss some work on quantifying the expressivity of equivariant networks and its implications for future network design.
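As a concrete illustration of the equivariance property underlying such networks, the following minimal sketch (an assumed example, not code from the talk) checks that a 1-D circular convolution commutes with cyclic shifts of its input, i.e. f(shift(x)) = shift(f(x)); the function and variable names are hypothetical.

```python
# Minimal sketch: shift equivariance of a circular convolution layer.
import numpy as np

def circular_conv(x: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """1-D circular convolution: a layer equivariant to cyclic shifts."""
    n, k = len(x), len(kernel)
    return np.array([sum(kernel[j] * x[(i + j) % n] for j in range(k))
                     for i in range(n)])

rng = np.random.default_rng(0)
x = rng.normal(size=8)
kernel = rng.normal(size=3)

shifted_then_conv = circular_conv(np.roll(x, 2), kernel)
conv_then_shifted = np.roll(circular_conv(x, kernel), 2)
assert np.allclose(shifted_then_conv, conv_then_shifted)  # equivariance holds
```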



Pablo Barceló

Short Bio: Pablo is a Full Professor at Pontificia Universidad Católica de Chile, where he also serves as Director of the Institute for Mathematical and Computational Engineering. He holds a Ph.D. in Computer Science from the University of Toronto, Canada. He is an associate researcher of the Millennium Institute for Foundational Research on Data (IMFD Chile) and the National Center for Artificial Intelligence (CENIA). He is the author of more than 90 technical papers, chaired ICDT 2019, will chair ACM PODS 2022, and is currently a member of the editorial committee of Logical Methods in Computer Science. From 2011 to 2014 he was the editor of the database theory column of SIGMOD Record. His areas of interest are database theory, logic in computer science, and the emerging relationship between these areas and machine learning.


Title: Graph Neural Networks with Local Graph Parameters


Abstract: Various recent proposals increase the distinguishing power of Graph Neural Networks (GNNs) by propagating features between k-tuples of vertices. The distinguishing power of these “higher-order” GNNs is known to be bounded by the k-dimensional Weisfeiler-Leman (WL) test, yet their non-linear memory requirements limit their applicability. Other proposals infuse GNNs with local higher-order graph structural information from the start, thereby inheriting the desirable linear memory requirement of GNNs at the cost of a one-time, possibly non-linear, preprocessing step. We propose local-graph-parameter-enabled GNNs as a framework for studying the latter kind of approach. We precisely characterize their distinguishing power, both in terms of a variant of the WL test and in terms of the graph structural properties that they can take into account. Local graph parameters can be added to any GNN architecture and are cheap to compute. In terms of expressive power, our proposal lies between GNNs and their higher-order counterparts. Further, we propose several techniques to aid in choosing the right local graph parameters. Our results connect GNNs with deep results in finite model theory and finite variable logics.
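The following minimal sketch (my own illustration, not the paper's implementation) conveys the general idea of local graph parameters: precompute a cheap local count per node, here the number of triangles through that node, append it to the node features, and then run an ordinary message-passing layer; all names are illustrative.

```python
# Minimal sketch: augmenting node features with a local graph parameter
# (per-node triangle counts) before a toy message-passing layer.
import networkx as nx
import numpy as np

def augment_with_local_parameters(G: nx.Graph, X: np.ndarray) -> np.ndarray:
    """Append per-node triangle counts (a local graph parameter) to features X."""
    tri = nx.triangles(G)                          # {node: #triangles through node}
    counts = np.array([[tri[v]] for v in G.nodes], dtype=float)
    return np.concatenate([X, counts], axis=1)

def mean_aggregation_layer(G: nx.Graph, X: np.ndarray, W: np.ndarray) -> np.ndarray:
    """One toy message-passing step: average neighbour features, then a linear map."""
    nodes = list(G.nodes)
    index = {v: i for i, v in enumerate(nodes)}
    H = np.zeros_like(X)
    for v in nodes:
        nbrs = list(G.neighbors(v))
        if nbrs:
            H[index[v]] = X[[index[u] for u in nbrs]].mean(axis=0)
    return np.maximum(H @ W, 0.0)                  # ReLU nonlinearity

G = nx.karate_club_graph()
X = np.ones((G.number_of_nodes(), 4))              # dummy input features
X_aug = augment_with_local_parameters(G, X)         # 4 -> 5 features per node
W = np.random.default_rng(0).normal(size=(X_aug.shape[1], 8))
out = mean_aggregation_layer(G, X_aug, W)
print(out.shape)                                    # (34, 8)
```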