Large Scale Graph Neural Networks

Goal of the Tutorial

This tutorial provides the first comprehensive overview of techniques for large-scale machine learning on graphs, covering both theoretical foundations and practical applications. It surveys past and recent research on improving the scalability of Graph Neural Networks (GNNs) and explores their diverse potential use cases.


Our Team

Rui Xue is a Ph.D. student in the Department of Electrical and Computer Engineering at North Carolina State University. He received his master’s degree in Electrical and Computer Engineering from the University of Southern California. His main research interests include machine learning on graphs, scalability of machine learning, and signal processing. He has published several papers at signal processing and machine learning conferences.


Haoyu Han is currently a second-year Ph.D. candidate in the Department of Computer Science and Engineering at Michigan State University. He earned his Master’s degree in Computer Science from the University of Science and Technology of China. His primary research areas encompass graph data mining and large-scale machine learning. He has won two NeurIPS competitions, including the OGB-LSC. Additionally, he has authored several publications in the field of graph data mining.


Tong Zhao is a Research Scientist in the Computational Social Science group at Snap Research. He earned his Ph.D. in Computer Science and Engineering from the University of Notre Dame in 2022. His research focuses on graph machine learning and its applications to real-world use cases. His work has resulted in 20+ conference and journal publications in top venues such as ICML, ICLR, KDD, AAAI, WWW, and TNNLS. He also has experience organizing workshops and tutorials related to GNNs.


Neil Shah is a Lead Research Scientist and Manager at Snap Research, working on machine learning algorithms and applications on large-scale graph data. His work has resulted in 55+ conference and journal publications in top venues such as ICLR, NeurIPS, KDD, WSDM, WWW, and AAAI, including several best-paper awards. He has served as an organizer, chair, and senior program committee member at a number of these conferences, and has organized workshops and tutorials on graph machine learning topics at KDD, WSDM, SDM, ICDM, CIKM, and WWW. He has previous research experience at Lawrence Livermore National Laboratory, Microsoft Research, and Twitch. He earned his Ph.D. in Computer Science from Carnegie Mellon University in 2017, funded partially by the NSF Graduate Research Fellowship.

Jiliang Tang is an MSU Foundation Professor in the Department of Computer Science and Engineering at Michigan State University. He was an associate professor from 2021 to 2022 and an assistant professor from 2016 to 2021 in the same department. His research interests include data mining, machine learning, and their applications in social media, biology, and education. He was the recipient of the 2022 SDM IBM Early Career Data Mining Research Award, the 2021 ICDM Tao Li Award, the 2020 SIGKDD Rising Star Award, the 2019 NSF CAREER Award, and eight best paper awards (or runner-ups), including at WSDM 2018 and KDD 2016. His dissertation was the 2015 KDD Best Dissertation runner-up and won the Dean’s Dissertation Award. He serves as an organizer of top data science conferences (e.g., KDD, SIGIR, WSDM, and SDM) and as an editor of journals (e.g., TKDD and TKDE). He has published his research in highly ranked journals and top conference proceedings, receiving more than 26,000 citations with an h-index of 77 and extensive media coverage.


Xiaorui Liu is an assistant professor in the Department of Computer Science at North Carolina State University. He received his Ph.D. in Computer Science from Michigan State University in 2022. His research interests include deep learning on graphs, large-scale machine learning, and trustworthy artificial intelligence. He has published innovative work in top-tier conferences such as NeurIPS, ICML, ICLR, KDD, AISTATS, and SIGIR. He has experience organizing and co-presenting multiple tutorials related to GNNs and large-scale machine learning, such as “Graph Representation Learning: Foundations, Methods, Applications, and Systems” at KDD 2021 and “Communication Efficient Distributed Learning” at IJCAI 2021.