Laplacian Linear Equations, Graph Sparsification, Local Clustering, Low-Stretch Trees, etc.
Shang-Hua Teng and I wrote a large paper on the problem of solving systems of linear equations in the Laplacian matrices of graphs. This paper required the development of many graph-theoretic algorithms, most of which have since been greatly improved. This page is an attempt to keep track of the major developments in and applications of these ideas.
Surveys, Classes, and Expository Articles
Network Solutions, by Erica Klarreich. Simons Science News. Simons Foundation.
A fast solver for a class of linear systems, by Ioannis Koutis, Gary Miller, and Richard Peng. Communications of the ACM, Volume 55 Issue 10, October 2012.
Lx = b, a book by Nisheeth Vishnoi.
Combinatorial Preconditioning
The combinatorial approach to preconditioning, also called support theory, was introduced by Vaidya in an unpublished paper. For more information on the history of this field, I recommend Bruce Hendrickson's Support Theory for Preconditioning page.
The following are some of the foundational papers of the field:
Support-Graph Preconditioners, by Bern, Gilbert, Hendrickson, Nguyen and Toledo. SIAM J. Matrix Anal. & Appl. 2006.
Support Theory for Preconditioning, by Boman and Hendrickson. SIAM J. Matrix Anal. & Appl. 2003.
Maximum-weight-basis preconditioners, by Boman, Chen, Hendrickson and Toledo, Numerical Linear Algebra and Applications 2004.
The Spielman-Teng Papers
Shang-Hua Teng and I wrote a large paper on this topic, "Nearly-Linear Time Algorithms for Graph Partitioning, Graph Sparsification and Solving Linear Systems" (prelim version in STOC '04, extended version on arXiv), in which we showed how to solve symmetric, diagonally-dominant linear systems in nearly-linear time. We have since broken that paper into three papers, each of which may be read on its own, and which have been submitted to journals individually.
"A Local Clustering Algorithm for Massive Graphs and its Application to Nearly-Linear Time Graph Partitioning", (SICOMP 2013)
"Spectral Sparsification of Graphs", (SICOMP 2011) and
"Nearly-Linear Time Algorithms for Preconditioning and Solving Symmetric, Diagonally Dominant Linear Systems". (SIMAX 2014)
Expositions of the content of these papers may be found in the tutorial I gave at IPCO05 (here are the slides) and in some of the lectures of my courses Spectral Graph Theory and its Applications and Spectral Graph Theory.
Improved Solvers
The asymptotically fastest known solver appears in the following two papers:
Preconditioning in Expectation (STOC 2014) by Michael B. Cohen, Rasmus Kyng, Jakub W. Pachocki, Richard Peng, and Anup Rao.
Stretching Stretch, by Michael B. Cohen, Gary L. Miller, Jakub W. Pachocki, Richard Peng, and Shen Chen Xu.
The simplest fast solver appears in the paper:
"Approximate Gaussian Elimination for Laplacians: Fast, Sparse, and Simple", by Kyng and Sachdeva (arXiv 2016). This is the simplest nearly linear time Laplacian solver.
This may be viewed as an improvement of the solver in the paper:
"Sparsified Cholesky and Multigrid Solvers for Connection Laplacians", with Kyng, Lee, Peng and Sachdeva (STOC 2016).
The algorithm of the latter paper is asymptotically faster, but a little too complicated to implement in practice.
Another simple approach to solving these systems appears in the papers:
" A Simple, Combinatorial Algorithm for Solving SDD Systems in Nearly-Linear Time", by Kelner, Orecchia, Sidford and Zhu (STOC '13). It runs in time O(m log^2 n).
" Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems", by Lee and Sidford accelerates this to O(m log^(3/2) n).
The simplest algorithms based on augmented spanning tree preconditioners appear in the papers:
"Approaching optimality for solving SDD systems" by Koutis, Miller and Peng.
This paper is remarkable both for its simplicity and for providing an algorithm that runs in time O(m log^2 n).
"Solving SDD systems in time O(m log n (loglog n)^2" by Koutis, Miller and Peng.
Improves upon the previous paper.
These papers find efficient (and incredibly simple) ways to construct the preconditioners whose existence was first proved in:
Subgraph Sparsification and Nearly Optimal Ultrasparsifiers, by Kolla, Makarychev, Saberi, and Teng (STOC 2010).
Efficient parallel algorithms that only depend on sparsification may be found in these papers:
"An Efficient Parallel Solver for SDD Linear Systems", with Peng (STOC 2014).
"Sparsified Cholesky Solvers for SDD linear systems", with Lee and Peng (arXiv 2015) (this was subsumed by the paper on Connection Laplacians)
Solvers for Directed (non-symmetric) Laplacians
A recent breakthrough in the solution of non-symmetric systems in the Laplacian matrices of directed graphs appears in the following two papers (with more to come):
Faster Algorithms for Computing the Stationary Distribution, Simulating Random Walks, and More (FOCS '16), by Cohen, Kelner, Peebles, Peng, Sidford, and Vladu.
Almost-Linear-Time Algorithms for Markov Chains and New Spectral Primitives for Directed Graphs (STOC '17), by Cohen, Kelner, Peebles, Peng, Rao, Sidford, and Vladu.
Code
Laplacians.jl is a package that implements many algorithms discussed in these papers.
Yiannis Koutis has written code that implements a version of combinatorial preconditioning. You can find it here.
Vaidya's original approach is implemented in the TAUCS package, written by Doron Chen, Vladimir Rotkin and Sivan Toledo.
If you need to solve diagonally-dominant linear systems, I also suggest trying the Algebraic Multigrid algorithm.
You can find an implementation in the LAMG package, written by Oren Livne.
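To give a feel for what all of these packages compute, here is a minimal sketch of solving a Laplacian system Lx = b. It uses SciPy's direct sparse solver on a toy path graph, not any of the nearly-linear-time algorithms above; the graph and all variable names are purely illustrative. Since a Laplacian is singular (the all-ones vector is in its null space), we project b to have zero sum and "ground" one vertex.

```python
import numpy as np
from scipy.sparse import csr_matrix, diags
from scipy.sparse.linalg import spsolve

# Laplacian of a unit-weight path on n vertices: L = D - A.
n = 100
rows, cols, vals = [], [], []
for i in range(n - 1):
    rows += [i, i + 1]
    cols += [i + 1, i]
    vals += [1.0, 1.0]
A = csr_matrix((vals, (rows, cols)), shape=(n, n))
L = (diags(np.asarray(A.sum(axis=1)).ravel()) - A).tocsr()

# Make b orthogonal to the null space, then ground the last vertex:
# the principal submatrix L[:-1, :-1] is a nonsingular M-matrix.
b = np.random.default_rng(0).standard_normal(n)
b -= b.mean()
x = np.append(spsolve(L[:-1, :-1].tocsc(), b[:-1]), 0.0)

residual = np.linalg.norm(L @ x - b) / np.linalg.norm(b)
```

Because b sums to zero, the solution of the grounded system satisfies Lx = b exactly; the grounded vertex is assigned potential 0, and any constant may be added to x.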
Local Graph Clustering
The problem of local graph clustering was introduced in the first of the Spielman-Teng papers above.
Given a massive graph and a vertex of interest, the problem is to find a cluster in the graph around the given vertex, and to do so in time proportional to the size of the cluster returned. Algorithms for this problem are useful for two reasons. The first is that they are very efficient, as they can ignore most of the graph. The second is that they provide one of the few ways of identifying small clusters in a specific region of a large graph.
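To make the problem concrete, here is a toy sketch in the spirit of the PageRank-based algorithms below: compute an approximate personalized PageRank vector by pushing mass outward from the seed, then take the prefix of smallest conductance in a sweep over the degree-normalized values. The push rule is the standard one; the constants, helper names, and the two-clique example are mine, not from any of these papers.

```python
from collections import defaultdict

def approx_pagerank(adj, seed, alpha=0.15, eps=1e-4):
    """Push-based approximate personalized PageRank.  Only vertices
    near the seed are ever touched, which is what makes it "local"."""
    p, r = defaultdict(float), defaultdict(float)
    r[seed] = 1.0
    queue = [seed]
    while queue:
        u = queue.pop()
        du = len(adj[u])
        if r[u] < eps * du:
            continue
        ru = r[u]
        p[u] += alpha * ru               # keep an alpha fraction
        r[u] = (1 - alpha) * ru / 2      # retain half of the rest
        for v in adj[u]:                 # spread the other half
            r[v] += (1 - alpha) * ru / (2 * du)
            if r[v] >= eps * len(adj[v]):
                queue.append(v)
        if r[u] >= eps * du:
            queue.append(u)
    return p

def sweep_cut(adj, p):
    """Order vertices by degree-normalized PageRank; return the prefix
    of smallest conductance."""
    order = sorted(p, key=lambda u: p[u] / len(adj[u]), reverse=True)
    vol_all = sum(len(adj[u]) for u in adj)
    S, vol, cut = set(), 0, 0
    best, best_phi = set(), float("inf")
    for u in order:
        S.add(u)
        vol += len(adj[u])
        cut += sum(-1 if v in S else 1 for v in adj[u])
        denom = min(vol, vol_all - vol)
        if denom > 0 and cut / denom < best_phi:
            best_phi, best = cut / denom, set(S)
    return best, best_phi

# Two 5-cliques joined by a single edge; seed inside the first clique.
adj = defaultdict(set)
def add_edge(a, b): adj[a].add(b); adj[b].add(a)
for base in (0, 5):
    for i in range(base, base + 5):
        for j in range(i + 1, base + 5):
            add_edge(i, j)
add_edge(4, 5)
cluster, phi = sweep_cut(adj, approx_pagerank(adj, seed=0))
```

On this example the sweep recovers the seed's clique, whose conductance is 1/21: one crossing edge over a set of volume 21.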
Our paper has been substantially improved in the following papers:
Local graph partitioning using PageRank vectors, by Andersen, Chung and Lang. FOCS '06, to appear in Internet Mathematics (improves on the Spielman-Teng local clustering paper above)
A much cleaner proof of the correctness of this algorithm appears in the next paper.
Detecting sharp drops in PageRank and a simplified local partitioning algorithm, by Andersen and Chung. Theory and Applications of Models of Computation, Proceedings of TAMC 2007, LNCS 4484, Springer, (2007), 1--12.
Finding Sparse Cuts Locally Using Evolving Sets, by Andersen and Peres. Appeared in STOC '09.
Approximating the Expansion Profile and Almost Optimal Local Graph Clustering, by Shayan Oveis Gharan and Luca Trevisan. (FOCS '12).
Multi-agent random walks for local clustering on graphs, by Morteza Alamgir and Ulrike von Luxburg. (ICDM '10).
Random Walks and Evolving Sets: Faster Convergences and Limitations by Siu On Chan, Tsz Chiu Kwok, and Lap Chi Lau.
Algorithms from these papers have been used in a large number of clustering applications. Here are some examples.
Statistical properties of community structure in large social and information networks, by Andersen, Lang, Dasgupta and Mahoney, (WWW '08)
IsoRankN: spectral methods for global alignment of multiple protein networks, by Liao, Lu, Baym, Singh, and Berger (Bioinformatics, 2009)
Finding local communities in protein networks, by Voevodski, Teng and Xia (BMC Bioinformatics, 2009)
Algorithms to Detect Multiprotein Modularity Conserved during Evolution, by Hodgkinson and Karp (BIOINFORMATICS RESEARCH AND APPLICATIONS, 2011)
Low Stretch Spanning Trees
Lower-Stretch Spanning Trees, by Elkin, Emek, Spielman and Teng, SIAM J. Comput. 2008. (needed in the Spielman-Teng solver paper above)
Nearly Tight Low Stretch Spanning Trees, by Abraham, Bartal and Neiman. (FOCS 2008)
Using Petal-Decompositions to Build a Low Stretch Spanning Tree by Abraham and Neiman (STOC 2012).
The best low-stretch trees so far.
Also see the article "Spanning Trees with Low Average Stretch" by Abraham and Neiman.
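To see what "stretch" measures, here is a tiny illustration; the cycle example and helper names are mine. With unit edge lengths, the stretch of a graph edge (u, v) with respect to a spanning tree T is the length of the tree path between u and v, and a low-stretch tree keeps the average of this quantity over all edges small.

```python
from collections import defaultdict, deque

def tree_dist(tree_adj, u, v):
    """BFS distance between u and v in the spanning tree."""
    dist = {u: 0}
    q = deque([u])
    while q:
        x = q.popleft()
        if x == v:
            return dist[x]
        for y in tree_adj[x]:
            if y not in dist:
                dist[y] = dist[x] + 1
                q.append(y)
    raise ValueError("tree is not spanning")

def average_stretch(edges, tree_edges):
    """Average, over all graph edges, of the tree-path length
    between the edge's endpoints (all edge lengths are 1 here)."""
    tree_adj = defaultdict(set)
    for u, v in tree_edges:
        tree_adj[u].add(v)
        tree_adj[v].add(u)
    return sum(tree_dist(tree_adj, u, v) for u, v in edges) / len(edges)

# The n-cycle with the path 0-1-...-(n-1) as its spanning tree:
# every tree edge has stretch 1, while the one non-tree edge
# (0, n-1) is stretched to n-1.
n = 8
cycle = [(i, (i + 1) % n) for i in range(n)]
avg = average_stretch(cycle, cycle[:-1])   # (7 * 1 + 7) / 8 = 1.75
```

The papers above construct spanning trees whose average stretch is polylogarithmic in n for every graph, which is what the tree-based solvers require.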
Sparsification
Improvements in spectral sparsification of Laplacian matrices may be found in the following papers.
Graph Sparsification by Effective Resistances (with Srivastava, appeared in STOC '08).
The algorithm in this paper can be made practical by using the solvers of Koutis, Miller and Peng mentioned above.
Twice-Ramanujan Sparsifiers (with Batson and Srivastava, appeared in STOC '09).
This proves existence and gives a polynomial-time algorithm, but that algorithm is not fast enough to be practical.
Improved spectral sparsification and numerical algorithms for SDD matrices by Koutis, Levin and Peng (STACS 2012).
This paper has faster algorithms for computing sparsifiers with a super-linear number of edges. It does carefully many things that were done crudely in Spielman-Srivastava.
A Matrix Hyperbolic Cosine Algorithm and Applications by Anastasius Zouzias (ICALP 2012).
Linear-Sized Spectral Sparsification in Almost Quadratic Time and Regret Minimization Beyond Matrix Multiplicative Weight Updates, by Zeyuan Allen-Zhu, Zhenyu Liao, and Lorenzo Orecchia. (STOC '15)
At the time, this paper had the fastest algorithms for computing linear-sized sparsifiers.
An SDP-Based Algorithm for Linear-Sized Spectral Sparsification. (STOC '17) by Yin Tat Lee and He Sun. This gives a nearly-linear time algorithm that produces linear sized sparsifiers.
Approaches to spectral sparsification of arbitrary matrices appear in these two papers:
Sparse Sums of Positive Semidefinite Matrices by de Carli Silva, Harvey, and Sato (2011).
This improves on Batson-Spielman-Srivastava by sparsifying sums of low-rank matrices.
Uniform Sampling for Matrix Approximation, by Cohen, Lee, Musco, Musco, Peng and Sidford (ITCS 2015).
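The effective-resistance sampling scheme of Spielman-Srivastava is simple enough to sketch in a few lines. This toy version computes resistances with a dense pseudoinverse on the complete graph K_n (far too slow in general; the real algorithm estimates them with fast solvers), and the sample count and names are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

def laplacian(edges, weights, n):
    L = np.zeros((n, n))
    for (i, j), w in zip(edges, weights):
        L[i, i] += w; L[j, j] += w
        L[i, j] -= w; L[j, i] -= w
    return L

# Toy instance: the complete graph K_n with unit weights.
n = 20
edges = [(i, j) for i in range(n) for j in range(i + 1, n)]
w = np.ones(len(edges))
L = laplacian(edges, w, n)

# Effective resistance of (i, j) is (e_i - e_j)^T L^+ (e_i - e_j).
Lp = np.linalg.pinv(L)
R = np.array([Lp[i, i] + Lp[j, j] - 2 * Lp[i, j] for i, j in edges])

# Sample q edges with probability proportional to w_e * R_e, and give
# each sampled copy weight w_e / (q * p_e), so that E[H] = L.
p = (w * R) / (w * R).sum()    # note: sum of w_e * R_e equals n - 1
q = 3000
w_new = np.zeros(len(edges))
for e in rng.choice(len(edges), size=q, p=p):
    w_new[e] += w[e] / (q * p[e])
H = laplacian(edges, w_new, n)
lam = np.linalg.eigvalsh(H)
```

For K_n all nonzero eigenvalues of L equal n, and the sampled H concentrates around L; with the right number of samples the same scheme yields (1 + epsilon)-spectral sparsifiers with O(n log n / epsilon^2) edges.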
Uses of Laplacian solvers inside Interior-Point Methods
Faster Approximate Lossy Generalized Flow via Interior Point Algorithms (with Daitch, appeared in STOC '08)
This paper shows how to solve linear systems in symmetric M-matrices by reducing them to Laplacian systems. It obtained the fastest algorithms known at the time for minimum-cost flow problems.
Algorithms for Lipschitz Learning on Graphs, with Rasmus Kyng, Anup Rao and Sushant Sachdeva. (COLT '15)
This uses Laplacian solvers inside interior point methods to regularize an interpolation problem on graphs.
Fast, Provable Algorithms for Isotonic Regression in all Lp-norms by Rasmus Kyng, Anup Rao and Sushant Sachdeva.
This uses Laplacian Solvers to construct the fastest algorithms for isotonic regression.
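For flavor, here is the classical pool-adjacent-violators algorithm for the simplest case of isotonic regression: the best nondecreasing l2 fit on a path. This is the textbook method, not the graph algorithms of the paper above.

```python
def pava(y):
    """Pool Adjacent Violators: the closest (in l2) nondecreasing
    sequence to y.  Each block stores (sum, count); a new value is
    merged with earlier blocks while it would violate monotonicity."""
    blocks = []                      # list of (sum, count)
    for v in y:
        s, c = float(v), 1
        while blocks and blocks[-1][0] / blocks[-1][1] > s / c:
            ps, pc = blocks.pop()
            s, c = s + ps, c + pc
        blocks.append((s, c))
    fit = []
    for s, c in blocks:
        fit.extend([s / c] * c)
    return fit
```

For example, `pava([3, 1, 2, 4])` pools the first two values to their mean and returns `[2.0, 2.0, 2.0, 4.0]`. The Kyng-Rao-Sachdeva paper handles the much harder setting of arbitrary partial orders (DAGs) and all Lp norms.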
Some other support-theory based algorithms
Support-Graph Preconditioners for 2-Dimensional Trusses (with Daitch)
Finding effective support-tree preconditioners, by Maggs, Miller, Parekh, Ravi and Woo. SPAA 2005.
A linear work, O(n^{1/6}) parallel time algorithm for solving planar Laplacians, by Koutis and Miller. SPAA 2008.
Combinatorial preconditioners for scalar elliptic finite-elements problems, by Avron, Chen, Shklarski and Toledo.
Solving Elliptic Finite Element Systems in Near-Linear Time with Support Preconditioners, by Boman, Hendrickson and Vavasis.
Mathematical foundations of combinatorial preconditioning
Rigidity in finite-element matrices: Sufficient conditions for the rigidity of structures and substructures, by Shklarski and Toledo.
On factor width and symmetric H-matrices, by Boman, Chen, Parekh and Toledo, Linear Algebra and its Applications, 2005.
Combinatorial characterization of the null spaces of symmetric H-matrices, by Chen and Toledo, Linear Algebra and its Applications, 2004.