How to cope with NP-hardness

One strategy for addressing NP-hard problems is to leverage any underlying structure of the problem to design simpler and more efficient algorithms. One example of this approach is the use of interval representations: interval graphs, the intersection graphs of intervals on a line, are well known and widely used in many different areas of computer science. On interval graphs the independent set problem can be solved in linear time, a significant improvement over the general case, where the problem cannot be approximated in polynomial time within any constant factor unless P = NP.
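As a concrete illustration, the independent set problem on an interval graph can be solved by the classical greedy rule: scan the intervals by increasing right endpoint and keep every interval that does not overlap the last one kept. The Python sketch below assumes the graph is given directly by its interval representation as (left, right) pairs; the function name and representation are only for illustration.

def max_independent_set_intervals(intervals):
    """Maximum independent set of an interval graph, given as (left, right) pairs.
    Greedily keep every interval whose left endpoint lies strictly to the right
    of the last chosen right endpoint."""
    chosen = []
    last_right = float("-inf")
    # Sorting dominates the cost; if the representation is already sorted by
    # right endpoint, the scan itself is linear.
    for left, right in sorted(intervals, key=lambda iv: iv[1]):
        if left > last_right:
            chosen.append((left, right))
            last_right = right
    return chosen

print(max_independent_set_intervals([(1, 3), (2, 6), (4, 7), (5, 8)]))
# -> [(1, 3), (4, 7)], an independent set of maximum size 2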

It's worth noting that this strategy is not limited to the independent set problem and interval representations. In fact, many other NP-hard problems can be tackled by exploiting structural properties. For example, problems on hierarchical structures such as trees can often be solved efficiently by dynamic programming or divide-and-conquer techniques. Another example is the use of sparsity or low-rank assumptions for problems that can be represented in matrix form.
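As a small illustration of the tree case, a maximum independent set of a tree can be computed bottom-up by dynamic programming: for each vertex we keep the best value of its subtree with the vertex included or excluded. The sketch below assumes the tree is given as an adjacency-list dict; the names are illustrative only.

def tree_mis(adj, root=0):
    """Size of a maximum independent set of a tree given as an adjacency dict."""
    inc, out = {}, {}

    def solve(v, parent):
        inc[v], out[v] = 1, 0
        for w in adj[v]:
            if w != parent:
                solve(w, v)
                inc[v] += out[w]               # v taken: its children are skipped
                out[v] += max(inc[w], out[w])  # v skipped: children choose freely
        return max(inc[v], out[v])

    return solve(root, None)

# A path on 5 vertices: the optimum picks the three alternating vertices.
path = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3]}
print(tree_mis(path))  # -> 3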

Overall, the key to coping with NP-hardness is to look for any structural properties that can be exploited to simplify the problem and design more efficient algorithms, even though they may not solve the problem completely.

Graph Searching:

Graph searching refers to the process of traversing a graph, visiting each vertex one at a time, according to a specific method. Two commonly used methods of graph searching are Breadth-First Search (BFS) and Depth-First Search (DFS). These are considered classical techniques in the field.

Chordal Graphs:

Chordal graphs are sometimes also referred to as triangulated graphs. A graph G is chordal if the largest induced cycle in G is a triangle; equivalently, every cycle of length at least four has a chord. It is not hard to convince yourself that chordality is a hereditary property, i.e. if G is chordal, so is every induced subgraph of G.

Recall that a graph G is perfect if for every induced subgraph H of G, the clique number of H equals the chromatic number of H. The concept of perfect graphs was introduced by Claude Berge in the 1960s, and they have been the subject of extensive research ever since. One of the most interesting properties of perfect graphs is that many NP-hard problems can be solved on them efficiently using the ellipsoid method. Despite this progress, ongoing research is still dedicated to developing combinatorial algorithms that solve these problems without relying on the ellipsoid method. Many graph families belong to the class of perfect graphs: interval graphs, for instance, and permutation graphs, which you may have seen in the longest increasing subsequence problem.

Theorem Chordal graphs are perfect.


Recall that a separator of a graph G is a subset of vertices S ⊆ V whose removal from G disconnects the graph into two or more connected components. An ab-separator is a set of vertices whose removal leaves a and b in different connected components. An ab-separator S is minimal if no proper subset of S also separates a from b.
 

Theorem Every minimal separator of a chordal graph is a clique.

Recall that a vertex v is simplicial if its neighborhood N(v) induces a clique.
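Testing this is immediate from the definition; a small sketch, assuming the graph is stored as a dict mapping each vertex to the set of its neighbors (names are illustrative):

def is_simplicial(G, v):
    """True if N(v) induces a clique in G."""
    nbrs = list(G[v])
    return all(nbrs[j] in G[nbrs[i]]
               for i in range(len(nbrs)) for j in range(i + 1, len(nbrs)))

# Triangle {0, 1, 2} with a pendant vertex 3 attached to 1:
# every vertex except 1 is simplicial.
G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print([v for v in G if is_simplicial(G, v)])  # -> [0, 2, 3]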
 

Theorem If G is chordal, then either G is complete or G has at least two non-adjacent simplicial vertices.

Corollary: G is chordal iff every induced subgraph of G has a simplicial vertex.


Perfect elimination order (PEO)

Given a graph G(V,E) and a total ordering σ = v_1, ..., v_n of its vertices, we can define two sets of neighbors for each vertex v_i: N^-(v_i) and N^+(v_i). N^-(v_i) is the set of neighbors of v_i that appear to the left of v_i in the ordering σ, and N^+(v_i) is the set of neighbors that appear to the right of v_i in σ.

Formally, N^-(v_i) = {v_j : v_jv_i ∈ E and j < i} and N^+(v_i) = {v_j : v_jv_i ∈ E and i < j}

Additionally, for two vertices v_i, v_j where i < j, we can use the notation v_i ≺_σ v_j to indicate that v_i appears to the left of v_j in the ordering σ. If the ordering σ is clear from context, we can simply write v_i ≺ v_j.

A vertex ordering σ = v_1, v_2, ..., v_n is a perfect elimination order (or PEO) if for all i ∈ [n], v_i is simplicial to the left in σ, i.e. N^-(v_i) is a clique.
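Checking whether a given ordering is a PEO is direct from the definition: collect N^-(v_i) for each vertex and verify that it induces a clique. A rough sketch, under the same adjacency-dict representation as above:

def is_peo(G, order):
    """True if `order` is a perfect elimination order of G, i.e. every left
    neighborhood N^-(v) induces a clique."""
    pos = {v: i for i, v in enumerate(order)}
    for v in order:
        left = [w for w in G[v] if pos[w] < pos[v]]   # N^-(v)
        if any(left[j] not in G[left[i]]
               for i in range(len(left)) for j in range(i + 1, len(left))):
            return False
    return True

# The 4-cycle is not chordal, so no ordering of it is a PEO.
C4 = {0: {1, 3}, 1: {0, 2}, 2: {1, 3}, 3: {0, 2}}
print(is_peo(C4, [0, 1, 3, 2]))   # -> False
# A triangle with a pendant vertex is chordal, and [1, 0, 2, 3] is a PEO of it.
G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(is_peo(G, [1, 0, 2, 3]))    # -> True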

Structural graph theory faces a significant challenge known as graph recognition: determining whether a given graph belongs to a specific class of graphs that possess a particular structure. One example of this is chordal graph recognition, which can be achieved by computing a PEO, since PEOs fully characterize chordal graphs: a graph is chordal if and only if it admits a PEO. However, before exploring the complexity of computing PEOs, it is worth considering other potential uses for them.


Algorithm GreedyCOL

Input: A graph and an ordering σ on the vertex set

Output: A proper coloring of the vertices
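The algorithm itself is the usual greedy rule: scan the vertices in the order σ and give each one the smallest color not already used on its left neighborhood. A possible realization in Python (a sketch, not the official pseudocode; the graph is assumed to be an adjacency-list dict and the ordering a list):

def greedy_col(G, order):
    """Greedy coloring along `order`: each vertex receives the smallest color
    not used by its already-colored (left) neighbors."""
    color = {}
    for v in order:
        used = {color[w] for w in G[v] if w in color}
        c = 0
        while c in used:
            c += 1
        color[v] = c
    return color

# On a PEO of a chordal graph this uses exactly omega(G) colors (see the theorem below).
G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(greedy_col(G, [1, 0, 2, 3]))  # -> {1: 0, 0: 1, 2: 2, 3: 1}: three colors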

The GreedyCOL algorithm may not always produce optimal results: in fact, it is easy to construct graphs and orderings on which it performs poorly. However, when provided with a "good" ordering, such as a PEO of a chordal graph, the algorithm produces an optimal coloring, as stated in the following theorem.

Theorem If σ is a PEO, algorithm GreedyCOL gives an optimal coloring.

Corollary: Algorithm GreedyCOL can be tweaked to compute a maximum clique of G if σ is a PEO.
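The tweak is simply to record, while scanning the ordering, the largest set of the form {v_i} ∪ N^-(v_i): each such set is a clique when σ is a PEO, and one of them attains the clique number. A sketch:

def max_clique_from_peo(G, peo):
    """Given a PEO of a chordal graph G, return a maximum clique:
    the largest set consisting of a vertex together with its left neighbors."""
    pos = {v: i for i, v in enumerate(peo)}
    best = []
    for v in peo:
        candidate = [v] + [w for w in G[v] if pos[w] < pos[v]]
        if len(candidate) > len(best):
            best = candidate
    return best

G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(max_clique_from_peo(G, [1, 0, 2, 3]))  # -> a maximum clique, here {0, 1, 2}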

Lexicographic Breadth First Search

Lexicographic breadth first search (LexBFS) is a modified version of the Breadth-First Search (BFS) algorithm which assigns lexicographic labels to the vertices of a graph. The label of a vertex records which of its neighbors have already been visited, and labels are compared as in a dictionary to decide which vertex should be visited next in case of ties. This method can be particularly useful in certain graph-theoretic problems, such as finding perfect elimination orderings, which are needed for chordal graph recognition. Additionally, LexBFS produces a linear ordering of the vertices of the graph, which can be useful in other graph-related problems.

In pseudocode, the algorithm can be expressed as follows:

Algorithm LexBFS

Input: A graph and a start vertex u

Output: An ordering σ of V
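The body of the pseudocode is not spelled out above; the following Python sketch conveys the idea, under the usual adjacency-dict assumption. Every unvisited vertex carries a label consisting of the visit positions of its already-visited neighbors, earliest first, and at each step a vertex with lexicographically largest label is visited. This naive version runs in quadratic time; linear-time implementations exist but are more involved.

def lex_bfs(G, start):
    """LexBFS sketch: positions are stored negated so that plain list
    comparison prefers vertices pulled by earlier-visited neighbors."""
    label = {v: [] for v in G}
    unvisited = set(G)
    order = []
    while unvisited:
        v = start if not order else max(unvisited, key=lambda u: label[u])
        unvisited.remove(v)
        order.append(v)
        i = len(order)                  # visit position of v (1-based)
        for w in G[v]:
            if w in unvisited:
                label[w].append(-i)     # v "pulls" w
    return order

# Triangle {0, 1, 2} with a pendant vertex 3 attached to 1.
G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(lex_bfs(G, 1))  # e.g. [1, 0, 2, 3] (ties may be broken differently)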

Intuitively, LexBFS prioritizes vertices with a "stronger previous pull" during the search: when breaking ties it takes into account not only how many already-visited neighbors a candidate has, but also how early those neighbors appear in the ordering σ (sigma) built so far.

For example, in the case where vertices a and d have already been visited, LexBFS would visit vertex c before visiting vertex b because vertex c is "pulled" by d, i.e. c is adjacent to the already-visited vertex d while b is not. This is in contrast to traditional BFS, which would visit vertices in a breadth-first manner without considering these additional factors.

It's worth noting that LexBFS is still a BFS algorithm and must respect the "breadth first" condition before comparing labels of left neighborhoods. In other words, it first visits all the vertices at the same level before visiting the vertices at the next level, but the order in which vertices at the same level are visited is determined by the "stronger previous pull" condition.

Lexicographic breadth first search (LexBFS) was first introduced by Rose, Tarjan, and Lueker in [6] as a method for recognizing chordal graphs. They proved that when applied to a chordal graph, LexBFS produces a Perfect Elimination Ordering (PEO), as stated in the following theorem.

Theorem Let G(V,E) be a chordal graph; then the ordering produced by LexBFS(G) is a PEO.

The proof of this theorem relies on a characterization of LexBFS orderings provided by Dragan, Nicolai, and Brandstädt, known as the LexBFS 4-Point Condition.

[The LexBFS 4-Point Condition] Let G(V,E) be an arbitrary graph and σ an ordering of V. Then σ is a LexBFS ordering if and only if for every triple a ≺ b ≺ c with ac ∈ E and ab ∉ E, there exists a vertex d such that d ≺ a, db ∈ E, and dc ∉ E.

In other words, whenever a vertex c has a neighbor a that appears before c in the ordering but is not a neighbor of the intermediate vertex b, there must exist a vertex d that is adjacent to b but not to c and that comes before a in the ordering. This is what allows b to be visited before c despite the connection between a and c, and it is exactly what the lexicographic tie-breaking enforces.

It is easy to see that such a vertex d must exist and be a neighbor of b, since σ is a BFS ordering. The fact that dc ∉ E ensures that b is indeed visited before c despite the pull of a on c. This captures the intuition behind the LexBFS algorithm.
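For small examples the condition can be checked by brute force, which is a convenient sanity test when working through LexBFS orderings by hand. A sketch over adjacency-list dicts (a brute-force check, only meant for toy instances):

def satisfies_lexbfs_4pc(G, sigma):
    """Check the 4-point condition for the ordering sigma: for every
    a ≺ b ≺ c with ac an edge and ab a non-edge, some d ≺ a must be
    adjacent to b but not to c."""
    pos = {v: i for i, v in enumerate(sigma)}
    n = len(sigma)
    for i in range(n):
        for j in range(i + 1, n):
            for k in range(j + 1, n):
                a, b, c = sigma[i], sigma[j], sigma[k]
                if c in G[a] and b not in G[a]:
                    if not any(b in G[d] and c not in G[d] for d in sigma[:i]):
                        return False
    return True

G = {0: {1, 2}, 1: {0, 2, 3}, 2: {0, 1}, 3: {1}}
print(satisfies_lexbfs_4pc(G, [1, 0, 2, 3]))  # -> True  (a LexBFS ordering)
print(satisfies_lexbfs_4pc(G, [3, 2, 0, 1]))  # -> False (violates the condition)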