Spectral Techniques in Theoretical Computer Science

Instructors: Anup Bhattacharya, Gopinath Mishra

Description: Spectral methods use the eigenvalues and eigenvectors of a matrix to solve computational problems. In the past decade there has been phenomenal growth in the use of spectral techniques to design faster algorithms for combinatorial problems. In fact, the fastest known algorithm for max-flow uses spectral techniques. In this course we will cover spectral techniques for graph partitioning and spectral sparsification, and eventually give a sketch of the spectral algorithm for max-flow.

The course is divided into a few segments. In the first part, we will cover eigenvalues of Laplacian matrices and Cheeger's inequalities. In the second part, we will discuss results on graph sparsification using spectral methods. If time permits, in the third part, we will give a sketch of the spectral algorithm for max-flow.

References: Our primary references for this course will be a collection of monographs and lecture notes. For the first part, we will follow lecture notes by Luca Trevisan. For the second part, our primary reference will be lecture notes by Daniel Spielman. For the third part, our primary reference will be the monograph by Nisheeth Vishnoi. In addition to these, we will cover some recent results using spectral techniques. We will keep adding references for papers of interest to this course. If you are interested in presenting one of these papers, please inform us.

Lecture details:

Lecture 1 (Anup): We discussed the scope of the course, and some results using the eigenvalues of the adjacency matrix of an undirected graph.
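As a small numerical companion (our own sketch, not part of the lecture; the graph is illustrative), the following checks one classical fact about adjacency eigenvalues: the largest eigenvalue lies between the average degree and the maximum degree, with equality to d for a d-regular graph.

```python
import numpy as np

# Illustrative graph: a triangle {0,1,2} with a path 2-3-4 attached.
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
vals = np.linalg.eigvalsh(A)   # eigenvalues in ascending order
d = A.sum(axis=1)              # vertex degrees

# Classical bounds: avg degree <= lambda_max <= max degree.
print(d.mean(), vals[-1], d.max())
```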

Lecture 2 (Anup): We introduced the Laplacian matrix and studied properties of eigenvalues of the graph Laplacian. We followed the lecture notes here.
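A quick sanity check (a sketch we added, not from the notes) of two basic properties from this lecture: the Laplacian L = D - A is positive semidefinite, and the all-ones vector is an eigenvector with eigenvalue 0.

```python
import numpy as np

# Path graph 0-1-2-3.
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D = np.diag(A.sum(axis=1))        # degree matrix
L = D - A                         # graph Laplacian

eigvals = np.linalg.eigvalsh(L)   # ascending; all >= 0 since L is PSD
print(np.round(eigvals, 4))
print(np.allclose(L @ np.ones(4), 0))  # all-ones lies in the kernel
```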

Lecture 3 (Anup): We proved both directions of Cheeger's inequality. We followed these notes for the proof.
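The algorithmic content of the easy direction is the sweep cut. Here is a self-contained sketch (our own code, with illustrative names): sort vertices by the second eigenvector of the normalized Laplacian and take the best prefix cut by conductance. Cheeger's inequality guarantees the returned conductance phi satisfies lambda_2/2 <= phi <= sqrt(2*lambda_2).

```python
import numpy as np

def sweep_cut(A):
    """Cheeger sweep: sort vertices by the second eigenvector of the
    normalized Laplacian, return the best prefix cut by conductance."""
    d = A.sum(axis=1)
    Dih = np.diag(1.0 / np.sqrt(d))
    N = np.eye(len(d)) - Dih @ A @ Dih      # normalized Laplacian
    vals, vecs = np.linalg.eigh(N)
    x = Dih @ vecs[:, 1]                    # spectral embedding of vertices
    order = np.argsort(x)
    best_phi, best_S = np.inf, None
    for k in range(1, len(d)):
        S, T = order[:k], order[k:]
        cut = A[np.ix_(S, T)].sum()         # edges crossing the prefix cut
        phi = cut / min(d[S].sum(), d[T].sum())
        if phi < best_phi:
            best_phi, best_S = phi, set(S.tolist())
    return best_phi, best_S, vals[1]

# Two triangles joined by one edge: the sweep recovers one triangle.
A = np.zeros((6, 6))
for u, v in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[u, v] = A[v, u] = 1.0
phi, S, lam2 = sweep_cut(A)
print(phi, S)  # conductance 1/7, one of the two triangles
```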

Lecture 4 (Anup): We covered the spectral algorithm for max-cut. The result was proved in this paper. We followed the derivations given in Chapter 6 of this note.
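As a much-simplified sketch of the spectral approach to max-cut (the actual algorithm uses randomized threshold rounding and recursion to obtain its approximation guarantee; this sign-rounding version is our own toy illustration): round the eigenvector of the largest eigenvalue of the normalized Laplacian by sign.

```python
import numpy as np

def spectral_cut(A):
    """Round the top eigenvector of the normalized Laplacian by sign.
    (A simplification of the paper's threshold-rounding algorithm.)"""
    d = A.sum(axis=1)
    Dih = np.diag(1.0 / np.sqrt(d))
    N = np.eye(len(d)) - Dih @ A @ Dih
    _, vecs = np.linalg.eigh(N)
    side = (Dih @ vecs[:, -1]) >= 0     # eigenvector of the largest eigenvalue
    cut = sum(A[u, v] for u in range(len(d)) for v in range(u + 1, len(d))
              if side[u] != side[v])
    return side, cut

# On a bipartite graph the top eigenvalue is 2, and the sign pattern
# recovers the bipartition, i.e. the exact max cut.
C4 = np.zeros((4, 4))
for u, v in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    C4[u, v] = C4[v, u] = 1.0
side, cut = spectral_cut(C4)
print(cut)  # all 4 edges of the 4-cycle are cut
```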

Lecture 5 (Anup): We covered higher-order Cheeger's inequality from here (Chapter 7). The result was originally proved here.
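The extreme case behind the higher-order inequality is easy to see numerically (a sketch we added): lambda_k of the normalized Laplacian is 0 exactly when the graph has at least k connected components, and the higher-order Cheeger inequality makes this relationship quantitative for small-but-nonzero lambda_k.

```python
import numpy as np

# Three disjoint triangles: the normalized Laplacian has exactly three
# zero eigenvalues, and the fourth is bounded away from 0.
tri = np.ones((3, 3)) - np.eye(3)
A = np.zeros((9, 9))
for b in range(3):
    A[3*b:3*b+3, 3*b:3*b+3] = tri
d = A.sum(axis=1)
Dih = np.diag(1.0 / np.sqrt(d))
N = np.eye(9) - Dih @ A @ Dih
vals = np.linalg.eigvalsh(N)
print(np.round(vals[:4], 6))  # three (numerical) zeros, then 1.5
```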

This set of lectures marks the end of the first part of the lecture series.

---------------------------------------------------------------------------------------------------------------------------------------------------

Lecture 6 (Anup): We started by discussing basics of random walks in graphs.
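A minimal sketch (ours, with an illustrative graph) of the basic convergence fact: the lazy random walk on a connected graph converges to the stationary distribution pi(u) = deg(u) / 2m.

```python
import numpy as np

# Lazy random walk: from u, stay put with probability 1/2, otherwise
# move to a uniformly random neighbour.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
d = A.sum(axis=1)
P = A / d[:, None]               # row-stochastic transition matrix
W = 0.5 * (np.eye(4) + P)        # laziness removes periodicity issues

p = np.array([1.0, 0.0, 0.0, 0.0])   # start at vertex 0
for _ in range(300):
    p = p @ W
pi = d / d.sum()                 # stationary distribution ~ degrees
print(np.round(p, 6), np.round(pi, 6))  # the two should agree
```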

Lecture 7 (Anup): We continued discussing random walks on graphs and their connection to electrical networks. We followed the lecture notes by Lap Chi Lau.
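The key quantity in this connection is effective resistance, which has a clean Laplacian formula: viewing each edge as a unit resistor, R_eff(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), where L^+ is the pseudoinverse of the Laplacian. A small self-contained sketch (our own code), checked against the series and parallel rules:

```python
import numpy as np

def effective_resistance(A, u, v):
    """R_eff(u, v) = (e_u - e_v)^T L^+ (e_u - e_v), treating every
    edge as a unit resistor."""
    L = np.diag(A.sum(axis=1)) - A
    Lp = np.linalg.pinv(L)            # Moore-Penrose pseudoinverse of L
    e = np.zeros(len(A))
    e[u], e[v] = 1.0, -1.0
    return float(e @ Lp @ e)

# Path 0-1-2-3: three unit resistors in series between the endpoints.
P4 = np.diag([1.0, 1.0, 1.0], 1)
P4 = P4 + P4.T
print(effective_resistance(P4, 0, 3))  # 3.0 (series rule)

# 4-cycle, opposite vertices: two length-2 paths in parallel.
C4 = np.zeros((4, 4))
for a, b in [(0, 1), (1, 2), (2, 3), (3, 0)]:
    C4[a, b] = C4[b, a] = 1.0
print(effective_resistance(C4, 0, 2))  # 1.0 (parallel rule)
```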

Paper details: We will keep adding papers here; if you want to volunteer to read a paper, let us know.

  1. Spectral Partitioning Works: Planar graphs and finite element meshes by Daniel Spielman and Shang-Hua Teng. In Linear Algebra and its Applications, Volume 421, Issues 2-3, March 2007, Pages 284-305.

  2. Spectral Partitioning of Random Graphs by Frank McSherry in FOCS 2001.

  3. Clustering with Spectral Norm and the k-means Algorithm by Amit Kumar and Ravi Kannan at FOCS 2010.

  4. Dimensionality Reduction for k-Means Clustering and Low Rank Approximation by Cohen et al. at STOC 2015.

  5. Turning big data into tiny data: Constant-size coresets for k-means, PCA and projective clustering by Feldman et al. at SODA 2013.

  6. Partitioning into Expanders by Shayan Oveis Gharan and Luca Trevisan in SODA 2014.

  7. Reference for Matrix Tree Theorem: link

  8. Expanders via Local Edge Flips in SODA 2016.