Past News (before 2022/12)
2022/12 Invited to give a keynote talk on temporal graph representation learning at the TGL Workshop (google.com) (my talk starts at 7:29:20) and to join the panels of New Frontiers in Graph Learning (GLFrontiers) and the TGL Workshop (google.com) [NeurIPS'22 activities].
2022/12 Our paper on GNNs for pileup mitigation in high energy physics gets accepted by the European Physical Journal C. Big congrats to Shikun Liu, Tianchun Li and my great collaborators in physics, Yongbin Liu, Mia Liu, and Nhan Tran!
2022/11 One paper gets accepted by LoG'22 with an oral presentation and a best paper award (only 2 papers selected)! Big congrats to Yuhong!
"Neighborhood-aware Scalable Temporal Network Representation Learning" (codes). This work proposes a new framework for temporal network representation learning. Instead of tracking a long vector representation for each node, we propose a dictionary-type node representation that allows online construction of structural features in temporal networks. We also propose a hash-type operation that allows fast manipulation of multiple dictionaries in parallel.
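To convey the flavor of the idea, here is a hypothetical minimal sketch (the slot scheme, the `observe_edge` helper, and the timestamp update rule are made up for illustration and are not the paper's actual design):

```python
from collections import defaultdict

# Hypothetical dictionary-type node representation maintained online as
# temporal edges arrive; slot keys come from a hash of the neighbor id,
# so every node's state stays fixed-size and each edge is an O(1) update.
NUM_SLOTS = 4
reps = defaultdict(dict)  # node -> {hash slot: most recent interaction time}

def observe_edge(u, v, t):
    for a, b in ((u, v), (v, u)):
        slot = hash(b) % NUM_SLOTS   # hash-type operation on the neighbor id
        reps[a][slot] = t            # remember the latest time b touched a

for u, v, t in [(0, 1, 1.0), (1, 2, 2.0), (0, 2, 3.0)]:
    observe_edge(u, v, t)

# Structural features for a candidate link can then be read off by comparing
# two dictionaries, e.g. their shared slots act like common-neighbor counts.
shared = set(reps[1]) & set(reps[2])
```

The point of the fixed-size, hash-keyed state is that many such dictionaries can be updated and intersected in parallel without ever materializing full neighborhoods.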
2022/11 Give a talk on our recent work on interpretable geometric deep learning for science at "AI + Math" Colloquia (SJTU) & Cross-Disciplinary AI Colloquia (PKU). You may find the recorded video here (in Chinese).
2022/11 Invited to present at the panel of ICAIF-22 Synthetic Data Workshop.
2022/10 Give a keynote talk on Graph Machine Learning for Science at the FastML workshop (slides).
2022/09 Two papers got accepted by NeurIPS'22 (submitted/accepted = 3/2)! Big congrats to Haoyu, Rongzhe, Haoteng, and other collaborators!
"Unsupervised Learning for Combinatorial Optimization with Principled Objective Relaxation" (codes). This work shows that, by properly relaxing a combinatorial optimization objective for neural network training, one may obtain a performance guarantee on the final integral solution produced by rounding the network's output. We evaluate this idea on several graph optimization problems and circuit design problems.
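As a toy illustration of the relax-then-round recipe on a tiny Max-Cut instance (the objective, optimizer, and starting point here are illustrative; the paper's guarantee relies on specific conditions on the relaxation, and it trains a neural network rather than raw variables):

```python
# Toy Max-Cut instance: a triangle (0,1,2) plus a pendant edge (2,3).
edges = [(0, 1), (1, 2), (0, 2), (2, 3)]
n = 4

def relaxed_cut(x):
    # expected cut size if node u joins side 1 independently with prob x[u]
    return sum(x[u] + x[v] - 2 * x[u] * x[v] for u, v in edges)

# soft assignment in [0,1]^n, started slightly asymmetrically to break ties
x = [0.6, 0.4, 0.45, 0.55]

for _ in range(200):
    # gradient ascent on the relaxed objective, clamped to the box [0, 1]
    grad = [0.0] * n
    for u, v in edges:
        grad[u] += 1 - 2 * x[v]
        grad[v] += 1 - 2 * x[u]
    x = [min(1.0, max(0.0, xi + 0.1 * gi)) for xi, gi in zip(x, grad)]

solution = [round(xi) for xi in x]  # integral solution obtained by rounding
```

Here the relaxed objective evaluated at integral points equals the true cut size, which is the kind of consistency between relaxation and discrete objective that makes rounding guarantees possible.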
"Understanding Non-linearity in Graph Neural Networks from the Bayesian-Inference Perspective" (codes). This work establishes a connection between Bayesian inference over a graphical model and graph neural networks. With this connection, we can measure the value of non-linear operations in GNNs for the node classification task. We obtain a "negative" result: when node attributes are not very informative, non-linear operations during message passing add little value, which matches many previously successful architectures such as APPNP, GPR-GNN, and spectral GNNs such as JacobiConv.
2022/06 You are welcome to submit your papers to the Learning on Graphs (LoG) conference! I serve as an area chair there.
2022/06: One paper got accepted by VLDB'22 (submitted/accepted = 1/1)! Big congrats to Haoteng and other collaborators!
"Algorithm and System Co-design for Efficient Subgraph-based Graph Representation Learning" (codes). This is our first work on designing a fast computation framework for graph representation learning. We abandon the current GNN pipeline in favor of subgraph representation learning, which has several advantages over the node-feature refinement adopted by traditional GNNs, such as stronger expressive power and better robustness. This work can be viewed as a system-level acceleration of our previous algorithmic/theoretical works on distance encoding, labeling tricks, etc.
2022/05: One paper got accepted by KDD'22 (submitted/accepted = 2/1)! Big congrats to Yanchao, Carl, and others!
"4SDrug: Symptom-based Set-to-set Small and Safe Drug Recommendation" (to be released)
2022/05: One paper got accepted by ICML'22 (submitted/accepted =1/1)! Big congrats to Siqi and Miaoyuan!
(Stochasticity makes GNNs more interpretable and generalizable) "Interpretable and Generalizable Graph Learning via Stochastic Attention Mechanism." In this work, we show the issues of post-hoc model interpretation approaches and introduce a novel graph attention mechanism so that, once trained, the model can provide self-interpretation. The key idea is to inject stochasticity into the attention. (codes)
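A generic way to inject such stochasticity is a binary-concrete (Gumbel-sigmoid) gate on each attention weight; the sketch below uses that standard trick for illustration and is not necessarily the paper's exact parameterization:

```python
import math
import random

random.seed(0)

def concrete_gate(p, temperature=0.5):
    # stochastic gate in (0, 1) whose mean tracks p: a binary-concrete
    # (Gumbel-sigmoid) relaxation of a Bernoulli(p) random variable
    u = min(max(random.random(), 1e-6), 1 - 1e-6)
    logit = math.log(p / (1 - p)) + math.log(u / (1 - u))
    return 1.0 / (1.0 + math.exp(-logit / temperature))

# during training, each attention weight is replaced by a sampled gate, so
# the model is pushed to concentrate attention on truly predictive edges
attention = {("u", "v"): 0.9, ("u", "w"): 0.1}
gated = {edge: concrete_gate(p) for edge, p in attention.items()}
```

Because the gates are sampled, attention on uninformative edges is noisy and gets suppressed, which is what yields the self-interpretation: the surviving high-attention subgraph is the explanation.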
2022/03: Received the 2021 Sony Faculty Innovation Award! Many thanks to Sony for the support!
2022/03: Give an invited talk on "Distance Features, Labeling Tricks? Towards More Powerful Graph Neural Networks" in the seminar on Learning on Graph and Geometry. Here are the slides.
2022/03: One paper got accepted by CVPR'22 oral! My contributions are limited but big congrats to collaborators! I felt extremely happy to learn about this interesting application.
"Better Trigger Inversion Optimization in Backdoor Scanning." Big congrats to Prof. Xiangyu Zhang and the team! (codes)
2022/02: Give a keynote talk on "Foundations on Deep Learning on Graphs" at the AAAI DLG workshop. Here are the slides.
2022/02: Prof. Shandian Zhe, Prof. Nate Veldt, and I are co-editing a special session on Machine Learning and Analysis of Multiway, Multi-relational Data and Higher-order Graphs in the journal Frontiers in Big Data. We look forward to your contributions.
2022/02: One paper got accepted by DAC'22! Big congrats!
(Benchmarking GNNs for Hardware Design Evaluation) "High-Level Synthesis Performance Prediction using GNNs: Benchmarking, Modeling, and Advancing," with Nan, Hang, Callie, and Prof. Xie. Recently, there has been a trend in the hardware design community to use GNN models to quickly evaluate the performance of a hardware design (e.g., latency, resource cost). In this work, we benchmark different GNN models on hardware performance prediction tasks.
2022/01: Two papers got accepted by ICLR'22 (submitted/accepted =2/2)! Big congrats!
(Theory for Positional Encoding) "Equivariant and Stable Positional Encoding for More Powerful Graph Neural Networks," with Haorui, Haoteng, and Muhan. We explain the risk of instability (and thus poor generalization) when naively using positional encodings as node features, and we propose provable techniques to address the problem. (codes)
(Optimal Transport for Graph Encoding) "Graph Auto-Encoder via Neighborhood Wasserstein Reconstruction," with Mingyue and Carl. We study using optimal transport to perform graph autoencoding. (codes)
2022/01: Two papers got accepted by WWW'22 (submitted/accepted = 2/2)! Big congrats!
(Higher-order structure prediction) "Neural Predicting Higher-order Patterns in Temporal Networks", with Yunyu and Jianzhu. This is Yunyu's first first-authored paper in my team. Big congrats! (codes)
(Frequent Subgraph Mining) "SATMargin: Practical Maximal Frequent Subgraph Mining via Margin Space Sampling", with Muyi. This is a paper that comes from my course "Learning and Computation on Graphs".
2021/11: I am invited to give a talk in Georgia Tech ECE on "Deep Learning on Graphs: Feature Augmentation for More Powerful Graph Models"
2021/10: I am serving as a TPC for WWW'22
2021/10: One paper gets accepted by NeurIPS'21 AI4Science
"Semi-supervised Graph Neural Network for Particle-level Noise Removal", with Tianchun, Shikun, Yongbin, Nhan, Mia.
In this project, computer scientists and physicists from Purdue and Fermilab collaborate to use graph learning approaches for signal denoising in particle physics experiments.
2021/09: Five papers got accepted by NeurIPS'21 (submitted/accepted =8/5)! Big congrats! Many thanks for the big effort of my great collaborators!
(Higher-order graph computation) "Local Hyper-Flow Diffusion", with Kimon and Shenghao (code)
(NAS foundation) "Generic Neural Architecture Search via Regression," with Yuhong, Callie, Jinjun, and Deming. (code) (Spotlight, acceptance rate < 3%)
(GNN algorithm) "Adversarial Graph Augmentation to Improve Graph Contrastive Learning", with Susheel, Callie, Jennifer (code)
(GNN foundation) "Labeling Trick: A Theory of Using Graph Neural Networks for Multi-Node Representation Learning," with Muhan, Yinglong, Kai, Long (code)
(GNN algorithm) "Nested Graph Neural Networks," with Muhan (code)
2021/09: I am serving as a TPC for SDM'22
2021/09: Receive an HDR institute grant on "Accelerated AI Algorithms for Data-Driven Discovery". Thanks, NSF!
The 19 PIs on board come from scientific domains including high-energy physics, astronomy, and neuroscience, as well as engineering domains in machine learning and hardware design.
I am thrilled to work on related projects pushing graph machine learning techniques for scientific discovery.
2021/09: I am serving as a TPC for WSDM'22, a senior PC for AAAI'22, a PC for ICLR'22.
2021/08: One paper gets accepted by ICDM'21!
We generalize distance encoding by incorporating node types, which can be applied to heterogeneous networks! Joint work with Houye, Chuan, Cheng!
2021/06: Received JP Morgan faculty research award! Many thanks for the support!
2021/05: Invited talk on "Laplacian Operators for Hypergraphs with Applications in Spectral Clustering", in SIAM Applied Linear Algebra (LA21)
2021/05: Two papers get accepted by KDD'21! Many thanks for the great effort from collaborators!
We show how to adaptively combine proximity similarity and structural similarity to improve GNN performance on networks whose assortativity is unknown. Joint work with Susheel, Vinith, Jennifer, and Jianzhu.
We apply GNNs to perform anomaly detection in large-scale industrial datasets from Amazon. Joint work with Andrew, Rex, Nikhil, Karthik, and Jure.
2021/04: Invited talk on "Representation Learning on Temporal Networks", Amazon, DGL USER group.
2021/04: Invited talk on "Rethinking Graph Representation Learning --- Power and Robustness", UT-Austin, Dr. Atlas Wang's group.
2021/04: Invited talk on "Graph Neural Networks: Motivations and Some Recent Progress", MIT, Dr. Gregory W Wornell's group.
2021/03: Invited talk on "Graph Neural Networks: Motivations and Some Recent Progress", Emory University, Dr. Carl Yang's group.
2021/02: One paper on "SW/HW co-design for memory-efficient low-latency personalized PageRank computation" got accepted by DAC'21. Thanks to the collaborators, Dr. Yao Cheng and Dr. Callie Hao.
2021/01: Three papers got accepted by WWW'21 (submitted/accepted =4/3)!
Strongly local algorithms for hypergraph clustering (cardinality-based hyperedge cut cost) with Meng, Nate, Haoyu and Dr. David Gleich
Neural modeling of human interaction in offline games with Yanbang, Chongyang and Dr. Jure Leskovec
Anomaly detection in data streams with multi-dimensional features, with Siddharth, Arjit, Ritesh, and Dr. Bryan Hooi
2021/01: Two papers got accepted by ICLR' 21 (submitted/accepted =2/2)!
Neural modeling of network motifs: we propose a novel neural encoding tool for network motifs, called causal anonymous walks, to inductively represent network dynamics. Causal anonymous walks automatically model and learn the impact of temporal network motifs on the evolution of network structure. For example, they can be used to model triadic closure in evolving social networks.
Power of Generalized PageRank in GNNs: Generalized PageRank addresses every issue known so far in GNNs for node classification, including over-smoothing, overfitting, and inapplicability to heterophilic networks. Our results show that, for node classification, GNNs do not need any non-linearity during the message passing procedure: associating each hop of message passing with a scalar learnable weight is enough to capture the potentially complex structural relations in node classification tasks. This observation also corroborates our previous theory analyzing random-walk-type message passing over networks.
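In code, this weighted-hop view is very simple. The sketch below uses a toy path graph and hand-picked weights; in the actual model the per-hop weights are learned end-to-end:

```python
# Path graph 0-1-2-3 with 1-dim node features; node representation is
# sum_k gamma_k * (A_hat^k h) for scalar per-hop weights gamma_k.
adj = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
h = [1.0, 0.0, 0.0, 0.0]
gammas = [0.5, 0.3, 0.2]

def propagate(h):
    # one hop of row-normalized (mean-over-neighbors) linear message passing
    return [sum(h[v] for v in adj[u]) / len(adj[u]) for u in adj]

out = [0.0] * len(h)
cur = h[:]
for gamma in gammas:            # note: no non-linearity between hops
    out = [o + gamma * c for o, c in zip(out, cur)]
    cur = propagate(cur)
```

Because the hop weights can be negative as well as positive, this family of filters can emphasize or suppress different neighborhood radii, which is what handles both homophilic and heterophilic graphs.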
2020/10: Anomaly detection in dynamic network paper F-FADE got accepted by WSDM'21!
2020/09: Our graph information bottleneck paper got accepted by NeurIPS'20!
2020/09: Our distance encoding paper got accepted by NeurIPS'20 and is online! The paper is theoretical and rigorous. More illustrative examples can be found in the slides!
Distance encoding improves the power of GNNs from a novel angle rather than by mimicking WL tests! The method is extremely helpful in more structure-related tasks: structural-role prediction, link prediction, and triangle prediction. It can be easily applied in any GNN model.
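A minimal, hypothetical version conveys the flavor: append each node's shortest-path distances to the target node set as extra input features (the actual distance encoding is more general, e.g. landing probabilities of random walks):

```python
from collections import deque

# Toy graph; to predict the link (0, 4), encode every node by its
# shortest-path distances to the two target nodes.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3], 3: [1, 2, 4], 4: [3]}

def bfs_dist(src):
    # single-source shortest-path distances via breadth-first search
    dist = {src: 0}
    queue = deque([src])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:
                dist[v] = dist[u] + 1
                queue.append(v)
    return dist

def distance_encoding(target_set):
    dists = [bfs_dist(s) for s in target_set]
    # per-node extra feature: tuple of distances to each target node
    return {u: tuple(d[u] for d in dists) for u in adj}

features = distance_encoding((0, 4))
```

These extra features break the symmetries that make plain message-passing GNNs unable to distinguish nodes that play different structural roles relative to the target node set.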
2020/08: Joined Purdue University! I start organizing my team and teaching!