On Percolation and NP-Hardness
Electronic Colloquium on Computational Complexity, Revision 2 of Report No. 132 (2015)

Huck Bennett∗    Daniel Reichman†    Igor Shinkar∗

July 13, 2016

∗ Department of Computer Science, Courant Institute of Mathematical Sciences, New York University. Email: [email protected], [email protected].
† Electrical Engineering and Computer Science, University of California, Berkeley, Berkeley, CA, USA. Email: [email protected].

Abstract

The edge-percolation and vertex-percolation random graph models start with an arbitrary graph $G$ and randomly delete edges or vertices of $G$ with some fixed probability. We study the computational hardness of problems whose inputs are obtained by applying percolation to worst-case instances. Specifically, we show that a number of classical NP-hard graph problems remain essentially as hard on percolated instances as they are in the worst case (assuming $\mathrm{NP} \not\subseteq \mathrm{BPP}$). We also prove hardness results for other NP-hard problems, such as Constraint Satisfaction Problems and Subset-Sum, with suitable definitions of random deletions.

We focus on proving the hardness of the Maximum Independent Set problem and the Graph Coloring problem on percolated instances. To show this we establish the robustness of the corresponding parameters $\alpha(\cdot)$ and $\chi(\cdot)$ to percolation, which may be of independent interest. Given a graph $G$, let $G'$ be the graph obtained by randomly deleting edges of $G$. We show that if $\alpha(G)$ is small, then $\alpha(G')$ remains small with probability at least $0.99$. Similarly, we show that if $\chi(G)$ is large, then $\chi(G')$ remains large with probability at least $0.99$.

1 Introduction

The theory of NP-hardness suggests that we are unlikely to find optimal solutions to NP-hard problems in polynomial time. This theory applies to the worst-case setting, where one considers the worst running time over all inputs of a given size. It is less clear whether these hardness results apply to "real-life" instances. One way to address this question is to examine to what extent known NP-hardness results are stable under random perturbations, as it seems reasonable to assume that a given instance of a problem may be subjected to noise.

Recent work has studied the effect of random perturbations of the input on the runtime of algorithms. In their seminal paper, Spielman and Teng [33] introduced the idea of smoothed analysis to explain the superior performance of algorithms in practice compared with formal worst-case bounds. Roughly speaking, smoothed analysis studies the running time of an algorithm on a perturbed worst-case instance. In particular, they showed that subjecting the weights of an arbitrary linear program to Gaussian noise yields instances on which the simplex algorithm runs in expected polynomial time, despite the fact that there are pathological linear programs for which the simplex algorithm requires exponential time. Since then smoothed analysis has been applied to a number of other problems [13, 34].

In contrast to smoothed analysis, we study when worst-case instances of problems remain hard under random perturbations. Specifically, we study to what extent NP-hardness results are robust when instances are subjected to random deletions. Previous work is mainly concerned with Gaussian perturbations of weighted instances; less work has examined the robustness of hardness results for unweighted instances with respect to discrete noise.

We focus on two forms of percolation on graphs. Given a graph $G = (V, E)$ and a parameter $p \in (0, 1)$, we define $G_{p,e} = (V, E')$ as the probability space of graphs on the same set of vertices, where each edge $e \in E$ is contained in $E'$ independently with probability $p$. We say that $G_{p,e}$ is obtained from $G$ by edge percolation. Similarly, we define $G_{p,v} = (V', E')$ as the probability space of graphs in which every vertex $v \in V$ is contained in $V'$ independently with probability $p$, and $G_{p,v}$ is the subgraph of $G$ induced by the vertex set $V'$. We say that $G_{p,v}$ is obtained from $G$ by vertex percolation.
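For concreteness, both models are easy to sample directly. The following minimal Python sketch is ours, for illustration only; it represents a graph as a pair (V, E), with V a set of vertices and E a set of two-element frozensets.

    import random

    def edge_percolation(V, E, p):
        """Sample G_{p,e}: keep each edge independently with probability p."""
        E_kept = {e for e in E if random.random() < p}
        return V, E_kept

    def vertex_percolation(V, E, p):
        """Sample G_{p,v}: keep each vertex independently with probability p,
        then take the subgraph induced by the surviving vertices."""
        V_kept = {v for v in V if random.random() < p}
        E_kept = {e for e in E if e <= V_kept}  # both endpoints must survive
        return V_kept, E_kept

    # Example: percolate a triangle with p = 1/2.
    V = {0, 1, 2}
    E = {frozenset({0, 1}), frozenset({1, 2}), frozenset({0, 2})}
    print(edge_percolation(V, E, 0.5))
    print(vertex_percolation(V, E, 0.5))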
We also study appropriately defined random deletions applied to instances of other NP-hard problems, such as 3-SAT and Subset-Sum. Throughout, we refer to instances that are subjected to random deletions as percolated instances. Our main question is whether such percolated instances remain hard to solve by polynomial-time algorithms, assuming $\mathrm{NP} \not\subseteq \mathrm{BPP}$.

The study of random discrete structures has resulted in a wide range of mathematical tools which have proven instrumental in proving rigorous results regarding such structures [5, 17, 19, 26]. One reason for studying percolated instances is that they may offer the opportunity to apply these methods to a broader range of distributions of instances of NP-hard problems beyond random graphs and random formulas. Furthermore, percolated instances are studied in a host of different disciplines and are frequently used to explain properties of the WWW [1, 8]; hence it is of interest to understand the computational complexity of instances arising from percolated networks.

1.1 A first example – 3-Coloring

Consider the 3-Coloring problem: given a graph $G = (V, E)$, decide whether $G$ is 3-colorable. Suppose that given a graph $G$ we sample a random subgraph $G'$ of $G$ by deleting each edge of $G$ independently with probability $p = 1/2$, and ask whether the resulting graph is 3-colorable. Is there a polynomial-time algorithm that can decide with high probability whether $G'$ is 3-colorable?

We demonstrate that a polynomial-time algorithm for deciding whether $G'$ is 3-colorable is impossible assuming $\mathrm{NP} \not\subseteq \mathrm{BPP}$. We show this by considering the following polynomial-time reduction from the 3-Coloring problem to itself. We will need the following definition.

Definition 1.1. Given a graph $G$, the $R$-blow-up of $G$ is the graph $G' = (V', E')$ in which every vertex $v$ is replaced by an independent set $\tilde{v}$ of size $R$, which we call the cloud corresponding to $v$. We then connect the clouds $\tilde{u}$ and $\tilde{v}$ by a complete $R \times R$ bipartite graph whenever $(u, v) \in E$.

Given an $n$-vertex graph $H$, the reduction outputs a graph $G$ that is the $R$-blow-up of $H$ for $R = C\sqrt{\log n}$, where $C > 0$ is a large enough constant. It is easy to see that $H$ is 3-colorable if and only if $G$ is 3-colorable.
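To make Definition 1.1 concrete, here is a minimal sketch of the $R$-blow-up in the same (V, E) representation as the percolation sketch above; the function name and vertex labels are ours.

    def blowup(V, E, R):
        """Replace each vertex v by a cloud {(v, 0), ..., (v, R-1)} and each
        edge (u, v) by a complete R x R bipartite graph between the clouds.
        Clouds stay independent sets: edges are only added between clouds
        of adjacent vertices, so H is 3-colorable iff blowup(H) is."""
        V_blown = {(v, i) for v in V for i in range(R)}
        E_blown = set()
        for e in E:
            u, v = tuple(e)
            for i in range(R):
                for j in range(R):
                    E_blown.add(frozenset({(u, i), (v, j)}))
        return V_blown, E_blown

    # Example: the 2-blow-up of a single edge is a 4-cycle.
    V, E = blowup({0, 1}, {frozenset({0, 1})}, 2)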
In fact, the foregoing reduction satisfies a stronger robustness property for random subgraphs $G'$ of $G$ obtained by deleting each edge of $G$ with probability $1/2$. Namely, if $H$ is 3-colorable, then $G$ is 3-colorable, and hence $G'$ is also 3-colorable with probability 1. On the other hand, if $H$ is not 3-colorable, then $G$ is not 3-colorable, and with high probability $G'$ is not 3-colorable either.

Indeed, for any edge $(v_1, v_2)$ in $H$ let $U_1, U_2$ be the two clouds in $G$ corresponding to $v_1$ and $v_2$. Fixing two arbitrary sets $U_1' \subseteq U_1$ and $U_2' \subseteq U_2$, each of size $R/3$, the probability that there is no edge in $G'$ connecting a vertex from $U_1'$ to a vertex in $U_2'$ is $2^{-R^2/9}$. By union bounding over the $|E(H)| \cdot \binom{R}{R/3}^2$ choices of $U_1', U_2'$, we get that with high probability there is at least one edge between every such pair $U_1'$ and $U_2'$ (the calculation is spelled out at the end of this section). When this holds, we can decode any 3-coloring of $G'$ to a 3-coloring of $H$ by coloring each vertex $v$ of $H$ with the color that appears the largest number of times in the coloring of the corresponding cloud in $G'$, breaking ties arbitrarily. Therefore, a polynomial-time algorithm for deciding the 3-colorability of $G'$ implies a polynomial-time algorithm for determining the 3-colorability of $H$ with high probability, and hence, unless $\mathrm{NP} \subseteq \mathrm{coRP}$, there is no polynomial-time algorithm that, given a 3-colorable graph $G$, finds a 3-coloring of a random subgraph of $G$.

Toward a stronger notion of robustness

The example above raises the question of whether the blow-up described above is really necessary. Naïvely, one could hope for stronger hardness of the 3-Coloring problem, namely, that for any graph $H$, if $H$ is not 3-colorable, then with high probability a random subgraph $H'$ of $H$ is not 3-colorable either. However, this is not true in general, as $H$ can be a 4-critical graph, i.e., a graph with chromatic number 4 such that the deletion of any edge of $H$ decreases its chromatic number (consider, for example, an odd wheel: an odd cycle with an additional vertex connected to all cycle vertices). Nonetheless, if random deletions do not decrease the chromatic number of a graph by much, then one could use hardness of approximation results for the chromatic number to deduce hardness results for coloring percolated graphs. This naturally leads to the following question.

Question. Let $G$ be an arbitrary graph, and let $G'$ be a random subgraph of $G$ obtained from $G$ by deleting each edge of $G$ with probability $1/2$. Is it true that if $\chi(G)$ is large, then $\chi(G')$ is also large with probability at least $0.99$?

In this paper we give a positive answer to this question and show that, in some sense, the chromatic number of any graph is robust against random deletions. We also consider the question of robustness for other graph parameters. For independent sets, we demonstrate that if the independence number of $G$ is small, then with high probability the independence number of a random subgraph of $G$ is small as well.
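For completeness, here is one way to spell out the union bound from the example in Section 1.1; the explicit constant bookkeeping is ours. Using $|E(H)| \le n^2$ and $\binom{R}{R/3} \le 2^R$,
\[
\Pr\Bigl[\exists\,(v_1, v_2) \in E(H),\ \exists\,U_1', U_2' \text{ of size } R/3 \text{ with no edge of } G' \text{ between them}\Bigr]
\;\le\; |E(H)| \cdot \binom{R}{R/3}^{2} \cdot 2^{-R^2/9}
\;\le\; n^{2} \cdot 2^{2R} \cdot 2^{-R^{2}/9}.
\]
For $R = C\sqrt{\log n}$ the term $R^2/9 = (C^2/9)\log n$ in the exponent dominates both $2R$ and the $O(\log n)$ contribution of the $n^2$ factor once the constant $C > 0$ is large enough, so the bound tends to $0$. Hence with high probability every such pair $U_1', U_2'$ is joined by at least one edge of $G'$, as claimed.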