In this tutorial, titled 'Relational Deep Learning: Challenges, Foundations, and Next Generation Architectures', we introduce the emerging field of relational deep learning (RDL), in which relational databases are represented as graph-structured data by treating each table row as a node and primary-foreign key relationships as edges. We begin by highlighting the key challenges in formulating relational entity graphs, as well as central aspects of data modeling such as temporality, heterogeneity, and scale. Next, we discuss the foundational methods that enable RDL, including Graph Neural Networks (GNNs) with their temporal and heterogeneous extensions, as well as graph transformers, and review existing benchmark datasets in this domain. Finally, we provide an overview of recent frontier architectures and propose new research directions to further advance the capabilities and impact of relational deep learning.
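As a concrete illustration of the row-as-node, key-as-edge construction described above, here is a minimal Python sketch. The tables (`users`, `orders`) and the helper `build_relational_graph` are hypothetical examples, not tied to any particular RDL library; they only show how primary-foreign key references induce edges in a heterogeneous graph.

```python
# Hypothetical toy tables: a parent table keyed by user_id and a child table
# whose user_id column is a foreign key into it.
users = [  # primary key: user_id
    {"user_id": 1, "name": "alice"},
    {"user_id": 2, "name": "bob"},
]
orders = [  # primary key: order_id; foreign key: user_id -> users
    {"order_id": 10, "user_id": 1, "amount": 25.0},
    {"order_id": 11, "user_id": 1, "amount": 5.5},
    {"order_id": 12, "user_id": 2, "amount": 12.0},
]

def build_relational_graph(tables, fk_links):
    """tables: {name: (rows, pk_column)}; fk_links: [(child_table, fk_column, parent_table)].

    Returns a node dict keyed by (table, primary key) and a list of
    (child_node, parent_node) edges, one per foreign-key reference.
    """
    nodes = {}
    for table, (rows, pk) in tables.items():
        for row in rows:  # every row becomes one typed node
            nodes[(table, row[pk])] = row
    edges = []
    for child, fk_col, parent in fk_links:
        child_rows, child_pk = tables[child]
        for row in child_rows:  # every FK value becomes one edge
            edges.append(((child, row[child_pk]), (parent, row[fk_col])))
    return nodes, edges

nodes, edges = build_relational_graph(
    {"users": (users, "user_id"), "orders": (orders, "order_id")},
    [("orders", "user_id", "users")],
)
# 5 nodes (2 users + 3 orders) and 3 order->user edges
```

In practice, libraries such as PyTorch Geometric represent this as a heterogeneous graph with one node type per table and one edge type per foreign-key link, with row attributes encoded as node features; the sketch above captures only the structural mapping.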
08:00 - 08:45 Introduction to Relational Deep Learning
08:45 - 09:30 Graph ML Models
09:30 - 09:45 Coffee Break
09:45 - 10:15 Next Gen RDL Architectures
10:15 - 11:00 Foundation Models for RDL
Vijay Prakash Dwivedi is a postdoctoral scholar at Stanford University working on relational and graph representation learning. He holds a PhD from Nanyang Technological University (NTU), Singapore. His work has advanced benchmarks for GNNs, graph positional and structural encodings, and graph transformers as universal deep neural networks for graph learning. He has also contributed to integrating parametric knowledge into LLMs for diverse applications, particularly in healthcare. Several of the methods he developed during his PhD are now widely adopted in modern graph learning models. For his research, he received an Outstanding PhD Thesis Award from the NTU College of Computing and Data Science in 2024 and a Rising Star in AI 2025 recognition from KAUST, Saudi Arabia. He has also served as a tutor in multiple summer and winter schools focused on graph neural networks and graph transformers.
Charilaos I. Kanatsoulis is a research associate in the Department of Computer Science at Stanford University. Prior to that, he was a postdoctoral researcher in the Department of Electrical and Systems Engineering at the University of Pennsylvania (Penn). He received his Ph.D. degree in electrical and computer engineering from the University of Minnesota, Twin Cities, in 2020. His research interests lie at the intersection of machine learning and signal processing and include novel Transformer and neural network design, graph representation learning, tensor analysis, and explainable AI. He has made significant contributions to the analysis and design of GNNs and has advanced methods for solving inverse problems involving multi-dimensional data, particularly in hyperspectral and medical imaging. He is a co-instructor of CS246 and CS224W at Stanford and was the main instructor of ESE 5140 at Penn.
Shenyang Huang is a postdoctoral scholar at the University of Oxford working on temporal graph learning. He recently obtained his PhD from McGill University and Mila, focusing on temporal graph learning (supervised by Prof. Reihaneh Rabbany and Prof. Guillaume Rabusseau). Previously, he obtained an Honours degree in Computer Science from McGill University. His research interests include representation learning on temporal graphs, graph representation learning, data mining, and continual learning. He is the community manager of a temporal graph learning Slack workspace with over 330 members engaged in research discussions, and he organizes the first reading group on temporal graphs, TGL-RG. He also served as Organization Chair for the Temporal Graph Learning Workshop at NeurIPS 2022 and 2023.
Prof. Jure Leskovec is a Professor of Computer Science at Stanford University. He is affiliated with the Stanford AI Lab, the Machine Learning Group, and the Center for Research on Foundation Models. In the past, he served as Chief Scientist at Pinterest and was an investigator at the Chan Zuckerberg Biohub. Most recently, he co-founded the machine learning startup Kumo.AI. Leskovec pioneered the field of Graph Neural Networks and co-authored PyG, the most widely used graph neural network library. Research from his group has been used by many countries to fight the COVID-19 pandemic and has been incorporated into products at Facebook, Pinterest, Uber, YouTube, Amazon, and more. His research has received several awards, including the Microsoft Research Faculty Fellowship in 2011, the Okawa Research Award in 2012, an Alfred P. Sloan Fellowship in 2012, the Lagrange Prize in 2015, and the ICDM Research Contributions Award in 2019. His research contributions have spanned social networks, data mining and machine learning, and computational biomedicine, with a focus on drug discovery. His work has won 12 best paper awards and five 10-year test-of-time awards at premier venues in these research areas.