Software Releases

GRASS

GRASS is a spectral graph sparsification engine for extracting ultra-sparse subgraphs that preserve the spectral (structural) properties of the original graph. The sparsified graphs can be immediately leveraged to accelerate spectral graph partitioning algorithms or preconditioned iterative methods for solving large sparse linear systems. Related publications: DAC'16, DAC'18, TCAD'20.
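The sketch below illustrates the general idea of spectral sparsification via effective-resistance edge sampling (in the style of Spielman-Srivastava); it is not GRASS's own algorithm or API, and the keep_fraction parameter and the dense Laplacian pseudo-inverse are illustrative simplifications that only scale to small graphs.

```python
# Illustrative sketch only: classic effective-resistance edge sampling
# (Spielman-Srivastava style), not the GRASS algorithm or its API.
import numpy as np
import networkx as nx

def resistance_sparsify(G, keep_fraction=0.1, seed=0):
    """Sample roughly keep_fraction * |E| edges with probability ~ w_e * R_eff(e)."""
    rng = np.random.default_rng(seed)
    nodes = list(G.nodes())
    index = {u: i for i, u in enumerate(nodes)}
    L = nx.laplacian_matrix(G, nodelist=nodes).toarray().astype(float)
    Lp = np.linalg.pinv(L)                         # dense pseudo-inverse: small graphs only

    edges = list(G.edges(data="weight", default=1.0))
    scores = np.array([w * (Lp[index[u], index[u]] + Lp[index[v], index[v]]
                            - 2.0 * Lp[index[u], index[v]])
                       for u, v, w in edges])      # edge leverage scores w_e * R_eff(e)
    p = scores / scores.sum()

    k = max(1, int(keep_fraction * len(edges)))
    H = nx.Graph()
    H.add_nodes_from(nodes)
    for idx in rng.choice(len(edges), size=k, replace=True, p=p):
        u, v, w = edges[idx]
        new_w = w / (k * p[idx])                   # reweighting keeps E[L_H] = L_G
        H.add_edge(u, v, weight=H[u][v]["weight"] + new_w if H.has_edge(u, v) else new_w)
    return H

G = nx.gnm_random_graph(200, 2000, seed=1)
H = resistance_sparsify(G)
print(G.number_of_edges(), "->", H.number_of_edges())
```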


GraphZoom

In collaboration with Cornell University, we introduce GraphZoom, a multi-level spectral framework for boosting the performance and scalability of unsupervised graph embedding algorithms by leveraging spectral coarsening. GraphZoom allows any existing embedding method to be applied to the coarsened graph, and then progressively refines the embeddings obtained at the coarsest level onto increasingly finer graphs. Related publications: ICLR'20 (Oral).
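As a rough illustration of the coarsen-embed-refine pattern, the sketch below uses a greedy edge matching for coarsening, a spectral embedding as a stand-in for "any existing embedding method", and a simple low-pass graph filter for refinement; none of these stand-ins are GraphZoom's actual implementation.

```python
# Minimal sketch of the coarsen -> embed -> refine pattern; the individual steps
# are simple stand-ins, not GraphZoom's implementation.
import numpy as np
import networkx as nx

def coarsen_once(G):
    """Greedy edge matching: merge matched endpoints into super-nodes."""
    mapping, matched, cid = {}, set(), 0
    for u, v in G.edges():
        if u not in matched and v not in matched:
            mapping[u] = mapping[v] = cid
            matched.update((u, v))
            cid += 1
    for u in G.nodes():
        if u not in matched:
            mapping[u] = cid
            cid += 1
    Gc = nx.Graph()
    Gc.add_nodes_from(range(cid))
    for u, v in G.edges():
        cu, cv = mapping[u], mapping[v]
        if cu != cv:
            w = Gc[cu][cv]["weight"] + 1 if Gc.has_edge(cu, cv) else 1
            Gc.add_edge(cu, cv, weight=w)
    return Gc, mapping

def spectral_embed(G, dim=4):
    """Placeholder embedding: bottom nontrivial normalized-Laplacian eigenvectors."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    _, vecs = np.linalg.eigh(L)
    return vecs[:, 1:dim + 1]

def refine(G, mapping, coarse_emb, steps=4):
    """Project coarse embeddings back to the fine graph and smooth them."""
    X = np.vstack([coarse_emb[mapping[u]] for u in G.nodes()])
    A = nx.to_numpy_array(G)
    D_inv = 1.0 / np.maximum(A.sum(axis=1, keepdims=True), 1e-12)
    for _ in range(steps):
        X = 0.5 * X + 0.5 * D_inv * (A @ X)        # simple low-pass graph filter
    return X

G = nx.karate_club_graph()
Gc, mapping = coarsen_once(G)
emb = refine(G, mapping, spectral_embed(Gc, dim=4))
print(emb.shape)                                   # (34, 4)
```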


SPADE

In collaboration with Cornell University, we introduce SPADE for estimating the adversarial robustness of any machine-learning (ML) model. SPADE exploits the generalized Courant-Fischer theorem and a bijective distance mapping between the input/output graph-based manifolds to estimate the best Lipschitz constant of a given ML model. Related publications: ICML'21.
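The following is a hedged sketch of the underlying idea: build k-NN graphs on a model's inputs and outputs and compare them through a generalized eigenvalue problem, in the spirit of the generalized Courant-Fischer theorem. The graph construction, the regularization, and the spade_like_score helper are illustrative assumptions, not SPADE's actual procedure.

```python
# Hedged sketch: compare k-NN graphs built on a model's inputs and outputs via a
# generalized eigenvalue problem. These choices are illustrative, not SPADE's.
import numpy as np
from scipy.linalg import eigh
from scipy.sparse.csgraph import laplacian
from sklearn.neighbors import kneighbors_graph

def spade_like_score(X_in, X_out, k=10, eps=1e-6):
    """Largest generalized eigenvalue of (L_out, L_in) as a robustness proxy."""
    A_in = kneighbors_graph(X_in, k, mode="connectivity", include_self=False)
    A_out = kneighbors_graph(X_out, k, mode="connectivity", include_self=False)
    A_in, A_out = 0.5 * (A_in + A_in.T), 0.5 * (A_out + A_out.T)    # symmetrize
    L_in = laplacian(A_in).toarray() + eps * np.eye(X_in.shape[0])  # regularize (L_in is singular)
    L_out = laplacian(A_out).toarray()
    # Generalized Courant-Fischer: max_v (v^T L_out v) / (v^T L_in v)
    return eigh(L_out, L_in, eigvals_only=True)[-1]

rng = np.random.default_rng(0)
X = rng.normal(size=(300, 16))
print(spade_like_score(X, 3.0 * X))   # a pure scaling preserves the geometry -> score near 1
```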


HyperSF

HyperSF is an efficient algorithm for hypergraph coarsening that exploits flow-based local clustering. To further improve efficiency, we propose a divide-and-conquer scheme that leverages spectral clustering of the bipartite graphs corresponding to the original hypergraphs. Related publications: ICCAD'21.
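The sketch below only illustrates the "divide" step mentioned above: form the bipartite graph of a hypergraph (nodes on one side, hyperedges on the other) and split it spectrally so each part can be coarsened locally. HyperSF's flow-based local clustering itself is not reproduced here.

```python
# Sketch of the "divide" step only: spectrally bisect the node/hyperedge
# bipartite graph; HyperSF's flow-based local clustering is not reproduced.
import numpy as np
from scipy.sparse import coo_matrix, bmat
from scipy.sparse.csgraph import laplacian

def bipartite_split(hyperedges, num_nodes):
    """2-way split of the nodes using the Fiedler vector of the bipartite graph."""
    rows = [v for e in hyperedges for v in e]
    cols = [e_id for e_id, e in enumerate(hyperedges) for _ in e]
    H = coo_matrix((np.ones(len(rows)), (rows, cols)),
                   shape=(num_nodes, len(hyperedges)))
    A = bmat([[None, H], [H.T, None]])             # bipartite adjacency [[0, H], [H^T, 0]]
    L = laplacian(A.tocsr()).toarray()
    _, vecs = np.linalg.eigh(L)
    fiedler = vecs[:, 1][:num_nodes]               # 2nd-smallest eigenvector, node side only
    return fiedler > np.median(fiedler)            # boolean partition of the nodes

hyperedges = [(0, 1, 2), (2, 3), (3, 4, 5), (5, 6, 7), (0, 7)]
print(bipartite_split(hyperedges, num_nodes=8))
```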


GRASPEL

GRASPEL is a spectral method for learning graph topology from (high-dimensional) data samples by iteratively estimating attractive Markov Random Fields (MRFs). The graphs constructed by GRASPEL have effective-resistance distances that encode the Euclidean distances between the original data samples. Related publications: WSDM'22.
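To make the stated property concrete, the sketch below builds a k-NN graph on synthetic data (standing in for a learned graph) and checks how well its effective-resistance distances track pairwise Euclidean distances; it demonstrates the property GRASPEL targets rather than the GRASPEL algorithm itself.

```python
# Minimal sketch of the effective-resistance property, not of GRASPEL itself.
import numpy as np
import networkx as nx
from sklearn.neighbors import kneighbors_graph

rng = np.random.default_rng(0)
X = rng.normal(size=(120, 8))                      # data samples

# A k-NN graph stands in for a learned graph here.
A = kneighbors_graph(X, n_neighbors=6, mode="connectivity", include_self=False)
G = nx.from_numpy_array((0.5 * (A + A.T)).toarray())

# Effective-resistance distances between node pairs, via the Laplacian pseudo-inverse.
L = nx.laplacian_matrix(G).toarray().astype(float)
Lp = np.linalg.pinv(L)
pairs = rng.integers(0, X.shape[0], size=(500, 2))
pairs = pairs[pairs[:, 0] != pairs[:, 1]]
r_eff = (Lp[pairs[:, 0], pairs[:, 0]] + Lp[pairs[:, 1], pairs[:, 1]]
         - 2.0 * Lp[pairs[:, 0], pairs[:, 1]])
d_euc = np.linalg.norm(X[pairs[:, 0]] - X[pairs[:, 1]], axis=1)

# A graph with the GRASPEL-style property would show a strong monotone relation here.
print(np.corrcoef(r_eff, d_euc)[0, 1])
```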


HyperEF

HyperEF is a scalable low-resistance-diameter decomposition algorithm for partitioning large hypergraphs into smaller clusters with only a few inter-cluster hyperedges. A key component in HyperEF is a nearly-linear-time algorithm for estimating hyperedge effective resistances, which allows incorporating the latest diffusion-based nonlinear quadratic operators defined on hypergraphs. Related publications: ICCAD'22.
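The rough sketch below scores hyperedges by effective resistance using a simple clique expansion and a dense Laplacian pseudo-inverse, so that low-resistance hyperedges can be contracted first; HyperEF's nearly-linear-time estimator and its hypergraph quadratic operators are not reproduced here, and the clique-expansion weights are an illustrative choice.

```python
# Rough sketch: score hyperedges by effective resistance via a clique expansion;
# not HyperEF's nearly-linear-time estimator.
import numpy as np
import networkx as nx
from itertools import combinations

def hyperedge_resistance_scores(hyperedges, num_nodes):
    # Clique expansion: every hyperedge becomes a clique with weight 1/(|e| - 1).
    G = nx.Graph()
    G.add_nodes_from(range(num_nodes))
    for e in hyperedges:
        w = 1.0 / (len(e) - 1)
        for u, v in combinations(e, 2):
            G.add_edge(u, v, weight=G[u][v]["weight"] + w if G.has_edge(u, v) else w)
    Lp = np.linalg.pinv(nx.laplacian_matrix(G).toarray().astype(float))
    scores = []
    for e in hyperedges:
        # Use the largest pairwise effective resistance inside the hyperedge as its score.
        r = max(Lp[u, u] + Lp[v, v] - 2.0 * Lp[u, v] for u, v in combinations(e, 2))
        scores.append(r)
    return scores

hyperedges = [(0, 1, 2), (1, 2, 3), (3, 4), (4, 5, 6), (6, 7), (0, 7)]
scores = hyperedge_resistance_scores(hyperedges, num_nodes=8)
# Contract (cluster together) the hyperedges with the smallest resistance first.
print(sorted(zip(scores, hyperedges)))
```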


GARNET

In collaboration with Cornell University, we propose GARNET to significantly boost the adversarial robustness of graph neural network (GNN) models. A key technical component in GARNET is an efficient scheme for learning reduced-rank base graphs by leveraging probabilistic graphical models (PGMs). Related publications: LoG'22 (Oral).
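As a hedged illustration of the reduced-rank base-graph idea, the sketch below rebuilds an adjacency matrix from its dominant spectral components and keeps only the strongest reconstructed entries as a denoised graph for downstream GNN training; GARNET's probabilistic graphical-model estimation is not reproduced here.

```python
# Hedged sketch of a reduced-rank base graph via truncated spectral
# reconstruction; not GARNET's PGM-based estimation.
import numpy as np
import networkx as nx

def low_rank_base_graph(G, rank=8, keep_edges=None):
    A = nx.to_numpy_array(G)
    vals, vecs = np.linalg.eigh(A)                 # symmetric adjacency
    top = np.argsort(np.abs(vals))[-rank:]         # r dominant spectral components
    A_r = (vecs[:, top] * vals[top]) @ vecs[:, top].T
    np.fill_diagonal(A_r, 0.0)
    keep_edges = keep_edges or G.number_of_edges()
    # Keep the keep_edges largest reconstructed entries as the base graph.
    iu = np.triu_indices_from(A_r, k=1)
    order = np.argsort(A_r[iu])[::-1][:keep_edges]
    base = nx.Graph()
    base.add_nodes_from(G.nodes())
    for idx in order:
        u, v = int(iu[0][idx]), int(iu[1][idx])
        if A_r[u, v] > 0:
            base.add_edge(u, v, weight=float(A_r[u, v]))
    return base

G = nx.karate_club_graph()
base = low_rank_base_graph(G, rank=4)
print(base.number_of_edges())
```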


SGM-PINN

SGM-PINN is a graph-based importance sampling framework for accelerating the training of physics-informed neural networks (PINNs). By biasing sampling toward more important node (point-cloud) clusters, the framework allows much smaller mini-batches and training datasets to be used in each training iteration, improving the overall training efficiency and solution quality of PINNs. Related publications: DAC'24.
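The sketch below illustrates cluster-level importance sampling of collocation points, assuming per-point PDE residuals are already available; the k-means clustering and the mean-residual importance weights are illustrative choices, not SGM-PINN's exact scheme.

```python
# Illustrative sketch of cluster-level importance sampling for PINN collocation
# points; not SGM-PINN's exact clustering or weighting scheme.
import numpy as np
from sklearn.cluster import KMeans

def sample_minibatch(points, residuals, batch_size=256, num_clusters=32, seed=0):
    """Draw a mini-batch biased toward clusters with large mean PDE residual."""
    rng = np.random.default_rng(seed)
    labels = KMeans(n_clusters=num_clusters, n_init=10, random_state=seed).fit_predict(points)
    # Cluster importance = mean |residual| of its member points.
    sums = np.bincount(labels, weights=np.abs(residuals), minlength=num_clusters)
    sizes = np.bincount(labels, minlength=num_clusters)
    importance = sums / np.maximum(sizes, 1)
    cluster_prob = importance / importance.sum()
    # Spread the batch over clusters proportionally to their importance.
    counts = rng.multinomial(batch_size, cluster_prob)
    chosen = []
    for c, m in enumerate(counts):
        members = np.flatnonzero(labels == c)
        if m > 0 and len(members) > 0:
            chosen.append(rng.choice(members, size=min(m, len(members)), replace=False))
    return np.concatenate(chosen)

# Toy usage: 2-D collocation points whose "residuals" are largest near the origin.
pts = np.random.default_rng(1).uniform(-1.0, 1.0, size=(5000, 2))
res = np.exp(-10.0 * (pts ** 2).sum(axis=1))
print(sample_minibatch(pts, res).shape)
```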