dyGRASS is a dynamic graph spectral sparsification algorithm that updates spectral sparsifiers in O(1) time per edge insertion or deletion, using GPU-accelerated non-backtracking random walks to effectively identify spectrally critical edges. This yields approximately a 10X speedup over the previous inGRASS algorithm while maintaining tight spectral similarity across fully dynamic graphs. Related publications: ICCAD'25.
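A minimal sketch of the non-backtracking random walk that underpins dyGRASS, on a hypothetical toy graph; the walk refuses to immediately return along the edge it just traversed. The GPU acceleration and the edge-scoring logic that turns walks into sparsifier updates are omitted here, and the graph and walk length are made-up examples.

```python
import random

# Toy undirected graph as an adjacency list (hypothetical example).
adj = {0: [1, 2], 1: [0, 2, 3], 2: [0, 1, 3], 3: [1, 2]}

def non_backtracking_walk(start, length, rng):
    """Random walk that never steps back to the node it just came from."""
    walk = [start]
    prev = None
    for _ in range(length):
        cur = walk[-1]
        choices = [v for v in adj[cur] if v != prev]
        if not choices:  # dead end: the only exit is backtracking, so stop
            break
        prev = cur
        walk.append(rng.choice(choices))
    return walk

walk = non_backtracking_walk(0, 10, random.Random(7))
```

Disallowing immediate backtracking makes each walk explore farther from its start node than a simple random walk of the same length, which is what makes it useful for probing alternative paths around an inserted or deleted edge.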
SHyPar is a multilevel spectral framework for partitioning large-scale hypergraphs by leveraging hyperedge effective resistances and flow-based community detection techniques. A key component of SHyPar is a flow-based local clustering scheme for hypergraph coarsening, which incorporates a max-flow-based algorithm to produce clusters with substantially improved conductance. Additionally, SHyPar utilizes an effective resistance-based rating function for merging nodes that are strongly connected (coupled). Related publications: TCAD'25.
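To make the conductance objective concrete, here is a small illustrative computation of hypergraph cut and conductance on a made-up hypergraph; SHyPar's max-flow-based clustering, which searches for low-conductance clusters, is not shown.

```python
# Toy hypergraph: each hyperedge is a set of vertices (hypothetical example).
hyperedges = [{0, 1, 2}, {2, 3}, {3, 4, 5}, {0, 5}]
n_vertices = 6

def cut_and_volume(cluster):
    # A hyperedge is cut if it has vertices both inside and outside the cluster.
    cut = sum(1 for e in hyperedges if e & cluster and e - cluster)
    # Volume: total vertex-hyperedge incidences inside the cluster.
    vol = sum(len(e & cluster) for e in hyperedges)
    return cut, vol

def conductance(cluster):
    cut, vol = cut_and_volume(cluster)
    _, total = cut_and_volume(set(range(n_vertices)))
    return cut / min(vol, total - vol)

phi_good = conductance({0, 1, 2})  # tight cluster: one hyperedge fully inside
phi_bad = conductance({1, 3})      # scattered cluster: every incident edge is cut
```

Lower conductance means a cluster keeps more of its hyperedges internal relative to its volume, which is the property a flow-based coarsening scheme optimizes for.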
CirSTAG leverages GNNs to analyze the stability (robustness) of modern integrated circuits (ICs). CirSTAG is grounded in a spectral framework that examines the stability of GNNs by leveraging input/output graph-based manifolds. When two adjacent nodes on the input manifold are mapped (through a GNN model) to two remote nodes (data samples) on the output manifold, this indicates a significant distance mapping distortion (DMD) and consequently poor GNN stability. CirSTAG calculates a stability score equivalent to the local Lipschitz constant for each node and edge, considering both graph structure and node feature perturbations. This enables the identification of the most critical (sensitive) circuit elements that could significantly impact circuit performance. Related publications: DAC'25 (Best Paper Award Nomination), arXiv2402.08453.
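A simplified sketch of the distortion idea: for each input-graph edge, compare the output-embedding distance to the input-feature distance; a large ratio flags an unstable mapping. The data, the stand-in "model", and the edge list below are all hypothetical, and this per-edge ratio is only a crude proxy for CirSTAG's spectral stability score.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(6, 4))                 # toy input node features
Y = np.tanh(X @ rng.normal(size=(4, 3)))    # stand-in for GNN output embeddings

# Hypothetical edges of the input graph (adjacent node pairs).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]

def edge_distortion(i, j):
    """Output distance over input distance: large values flag unstable mappings."""
    return np.linalg.norm(Y[i] - Y[j]) / np.linalg.norm(X[i] - X[j])

scores = {e: edge_distortion(*e) for e in edges}
critical_edge = max(scores, key=scores.get)  # most sensitive edge under this toy score
```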
We introduce a novel algorithm for incremental spectral sparsification of large undirected graphs. The proposed inGRASS algorithm is highly scalable and parallel-friendly, with nearly-linear time complexity for the setup phase and the ability to update the spectral sparsifier in O(log N) time for each incremental change to the original graph with N nodes. Related publications: DAC'24.
SGM-PINN is a graph-based importance sampling framework for accelerating the training of Physics-informed Neural Networks (PINNs). By biasing sampling towards more important node (point cloud) clusters, this framework allows much smaller mini-batches and training datasets to be involved in the training iterations, improving the overall training efficiency and solution quality of PINNs. Related publications: DAC'24.
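The sampling bias can be sketched as follows: assign each cluster of collocation points an importance weight and draw mini-batches from the resulting distribution. The clusters and weights below are hypothetical stand-ins for the graph-based importance scores SGM-PINN actually computes.

```python
import numpy as np

rng = np.random.default_rng(0)
points = rng.uniform(size=(1000, 2))           # toy collocation points for a PINN
cluster = (points[:, 0] > 0.5).astype(int)     # two made-up clusters
importance = np.where(cluster == 1, 4.0, 1.0)  # hypothetical per-cluster weights

p = importance / importance.sum()              # sampling distribution over points
batch = rng.choice(len(points), size=64, replace=False, p=p)  # biased mini-batch
```

Points in the high-importance cluster are sampled roughly four times as often, so a small mini-batch still concentrates on the regions that matter most for the loss.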
In collaboration with Cornell University, we propose GARNET to significantly boost the adversarial robustness of GNN models. A key technical component in GARNET is an efficient scheme for learning reduced-rank base graphs by leveraging probabilistic graphical models (PGMs). Related publications: LoG'22 (Oral).
GRASPEL is a spectral method for graph topology learning from (high-dimensional) data samples through iteratively estimating attractive Markov Random Fields (MRFs). The graphs constructed by GRASPEL have effective-resistance distances that encode the Euclidean distances between the original data samples. Related publications: WSDM'22.
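For readers unfamiliar with effective-resistance distances, a minimal illustration on a toy path graph using the textbook Laplacian-pseudoinverse definition (not GRASPEL's scalable estimator): treating unit-weight edges as unit resistors, resistances along a path add in series.

```python
import numpy as np

# Toy 4-node path graph 0-1-2-3 with unit edge weights.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4
L = np.zeros((n, n))
for u, v in edges:
    L[u, u] += 1.0; L[v, v] += 1.0
    L[u, v] -= 1.0; L[v, u] -= 1.0

Lp = np.linalg.pinv(L)  # Moore-Penrose pseudoinverse of the Laplacian

def effective_resistance(u, v):
    e = np.zeros(n); e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)
```

The pseudoinverse approach costs O(n^3) and is only practical for small graphs, which is why nearly-linear-time resistance estimators matter in the projects above.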
HyperEF is a scalable low-resistance-diameter decomposition algorithm for partitioning large hypergraphs into smaller clusters with only a few inter-cluster hyperedges. A key component in HyperEF is a nearly-linear time algorithm for estimating hyperedge effective resistances, which allows incorporating the latest diffusion-based non-linear quadratic operators defined on hypergraphs. Related publications: ICCAD'22.
HyperSF is an efficient algorithm for hypergraph coarsening by exploiting flow-based local clustering. To further improve the algorithm efficiency, we propose a divide-and-conquer scheme by leveraging spectral clustering of the bipartite graphs corresponding to the original hypergraphs. Related publications: ICCAD'21.
In collaboration with Cornell University, we introduce SPADE for estimating the adversarial robustness of any machine-learning (ML) model. SPADE exploits the generalized Courant-Fischer theorem and the bijective distance mapping between the input/output graph-based manifolds for estimating the best Lipschitz constant of a given ML model. Related publications: ICML'21.
In collaboration with Cornell University, we introduce a multi-level spectral framework, GraphZoom, for boosting the performance and scalability of unsupervised graph embedding algorithms by leveraging spectral coarsening. GraphZoom allows any existing embedding method to be applied to the coarsened graph, then progressively refines the embeddings obtained at the coarsest level onto increasingly finer graphs. Related publications: ICLR'20 (Oral).
GRASS is a spectral graph sparsification engine for extracting ultra-sparse subgraphs that can well preserve graph spectral (structural) properties. The sparsified graphs can be immediately leveraged to accelerate spectral graph partitioning algorithms or preconditioned iterative methods for solving large sparse matrices. Related publications: DAC'16, DAC'18, TCAD'20.
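A rough way to see what "preserving spectral properties" means: compare the Laplacian quadratic forms of a graph and a candidate sparsifier over random test vectors. The graph, the reweighted sparsifier, and the sampling-based bound below are hypothetical illustrations, not GRASS's actual construction or guarantee.

```python
import numpy as np

def laplacian(n, edges, weights=None):
    w = weights or [1.0] * len(edges)
    L = np.zeros((n, n))
    for (u, v), wt in zip(edges, w):
        L[u, u] += wt; L[v, v] += wt
        L[u, v] -= wt; L[v, u] -= wt
    return L

n = 4
L_G = laplacian(n, [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)])
# Hypothetical sparsifier: drop the chord (0, 2) and reweight the remaining cycle.
L_H = laplacian(n, [(0, 1), (1, 2), (2, 3), (3, 0)], [1.5] * 4)

rng = np.random.default_rng(0)
ratios = []
for _ in range(100):
    x = rng.normal(size=n)
    x -= x.mean()  # project out the all-ones nullspace of the Laplacians
    ratios.append((x @ L_G @ x) / (x @ L_H @ x))
sigma = max(max(ratios), 1.0 / min(ratios))  # empirical spectral-similarity bound
```

A spectral sparsifier keeps this ratio close to 1 for every test vector, which is exactly the property that lets it stand in for the original graph in partitioning and preconditioning.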