Complexity Theory

COMPLEXITY THEORY
Lecture 20: Circuits and Parallel Computation
Markus Krötzsch, Knowledge-Based Systems, TU Dresden, 7th Jan 2020

The Power of Circuits

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 2 of 26

P/poly and NP

We showed P ⊆ P/poly. Does NP ⊆ P/poly also hold? Nobody knows.

Theorem 20.1 (Karp-Lipton Theorem): If NP ⊆ P/poly, then PH = Σ₂ᵖ.

Proof sketch (see Arora/Barak, Theorem 6.19):
  • If NP ⊆ P/poly, then there is a polysize circuit family solving Sat.
  • Using this, one can argue that there is also a polysize circuit family that computes the lexicographically first satisfying assignment (k output bits for k variables).
  • A Π₂-QBF formula ∀X⃗.∃Y⃗.φ is true if, for all values of X⃗, φ(X⃗) is satisfiable.
  • In Σ₂ᵖ, we can: (1) guess the polysize circuit for Sat, (2) check for all values of X⃗ that its output is really a satisfying assignment (to verify the guess).
  • This solves Π₂ᵖ-hard problems in Σ₂ᵖ.
  • But then the Polynomial Hierarchy collapses at Σ₂ᵖ, as claimed.

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 3 of 26
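The second step of the proof sketch rests on the self-reducibility of Sat: with only a decision procedure for satisfiability, one can fix variables one at a time to recover the lexicographically first satisfying assignment. The following toy Python sketch illustrates this; `brute_force_sat` is a hypothetical stand-in for the guessed polynomial-size circuit, and the clause encoding (positive/negative 1-indexed integers) is an assumption for illustration.

```python
from itertools import product

def brute_force_sat(clauses, n, fixed):
    """Stand-in SAT decision procedure (in the proof, this role is played
    by the guessed polysize circuit). `clauses` is a CNF over variables
    1..n, given as lists of signed integers; `fixed` maps 0-based variable
    indices to booleans. Returns True iff some total assignment extending
    `fixed` satisfies all clauses."""
    free = [v for v in range(n) if v not in fixed]
    for bits in product([False, True], repeat=len(free)):
        assign = {**fixed, **dict(zip(free, bits))}
        if all(any(assign[abs(l) - 1] == (l > 0) for l in clause)
               for clause in clauses):
            return True
    return False

def lex_first_assignment(clauses, n):
    """Self-reducibility: n calls to the decision procedure yield the
    lexicographically first satisfying assignment (False < True),
    or None if the formula is unsatisfiable."""
    if not brute_force_sat(clauses, n, {}):
        return None
    fixed = {}
    for v in range(n):
        # Try fixing variable v to False first; keep that only if the
        # formula remains satisfiable, otherwise v must be True.
        fixed[v] = not brute_force_sat(clauses, n, {**fixed, v: False})
    return [fixed[v] for v in range(n)]

# (x1 ∨ x2) ∧ (¬x1 ∨ x3): lex-first satisfying assignment
print(lex_first_assignment([[1, 2], [-1, 3]], 3))  # → [False, True, False]
```

Each variable costs one oracle call, so a polysize SAT circuit yields a polysize circuit producing the assignment itself, which is exactly what the Σ₂ᵖ verification step checks.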
P/poly and ExpTime

We showed P ⊆ P/poly. Does ExpTime ⊆ P/poly also hold? Nobody knows.

Theorem 20.2 (Meyer’s Theorem): If ExpTime ⊆ P/poly, then ExpTime = PH = Σ₂ᵖ.

See [Arora/Barak, Theorem 6.20] for a proof sketch.

Corollary 20.3: If ExpTime ⊆ P/poly, then P ≠ NP.

Proof: If ExpTime ⊆ P/poly, then ExpTime = Σ₂ᵖ (Meyer’s Theorem).
By the Time Hierarchy Theorem, P ≠ ExpTime, so P ≠ Σ₂ᵖ.
So the Polynomial Hierarchy doesn’t collapse completely, and P ≠ NP.

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 4 of 26
How Big a Circuit Could We Need?

We should not be surprised that P/poly is so powerful: exponential circuit families are already enough to accept any language.

Exercise: show that every Boolean function over n variables can be expressed by a circuit of size ≤ n·2ⁿ.

It turns out that these exponential circuits are really needed:

Theorem 20.4 (Shannon 1949 (!)): For every n, there is a function {0,1}ⁿ → {0,1} that cannot be computed by any circuit of size 2ⁿ/(10n).

In fact, one can even show: almost every Boolean function requires circuits of size > 2ⁿ/(10n), and is therefore not in P/poly.

Is any of these functions in NP? Or at least in Exp? Or at least in NExp? Nobody knows.

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 5 of 26
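The idea behind the exercise is the standard DNF construction: for each of the (at most 2ⁿ) inputs mapped to 1, build one conjunction of n literals, then OR them together, giving a circuit of size O(n·2ⁿ). A small Python sketch (illustrative helper names, not from the slides) that builds this DNF and checks it against every Boolean function for n = 2:

```python
from itertools import product

def truth_table_to_dnf(table, n):
    """DNF for an arbitrary f: {0,1}^n -> {0,1}: one minterm (a conjunction
    of n literals) per input with output 1. Overall circuit size O(n * 2^n),
    matching the bound in the exercise."""
    return [bits for bits in product([0, 1], repeat=n) if table[bits]]

def eval_dnf(terms, x):
    # x satisfies the DNF iff it matches one of the minterms exactly
    return any(all(xi == bi for xi, bi in zip(x, bits)) for bits in terms)

# Exhaustive check for n = 2: every one of the 2^(2^n) = 16 Boolean
# functions is realised correctly by its DNF circuit.
n = 2
inputs = list(product([0, 1], repeat=n))
for outputs in product([0, 1], repeat=2 ** n):
    table = dict(zip(inputs, outputs))
    terms = truth_table_to_dnf(table, n)
    assert all(eval_dnf(terms, x) == bool(table[x]) for x in inputs)
print("all", 2 ** 2 ** n, "functions over", n, "variables realised as DNFs")
```

Shannon's lower bound is the counting counterpart: there are 2^(2ⁿ) functions but far fewer circuits of size 2ⁿ/(10n), so most functions need essentially this exponential size.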
Modelling Parallelism With Circuits

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 6 of 26

What is Efficiently Parallelisable?
Experience suggests: some problems can be solved efficiently in parallel, while others cannot. How could this be shown?

Intuitive definition: A problem has an efficient parallel algorithm if it can be solved for inputs of size n
  • in polylogarithmic time, i.e., in time O(logᵏ n) for some k ≥ 0,
  • using a computer with a polynomial number of parallel processors, i.e., O(nᵈ) processors for some d ≥ 0.

Note: Using O(nᵈ) processors efficiently requires a massively parallel algorithm. However, one could always use fewer processors (each taking on more work), possibly leading to a proportional increase in time. The hard bit in parallelisation is to utilise many processors effectively; reducing to fewer processors is easy.

Markus Krötzsch, 7th Jan 2020, Complexity Theory, slide 7 of 26
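A classic example fitting this definition is summing n numbers by pairwise reduction: with n/2 processors working in lockstep, the sum takes ⌈log₂ n⌉ synchronous rounds, i.e., O(log n) time with O(n) processors. A minimal Python sketch (sequentially simulating the rounds; the function name is ours, not from the slides):

```python
def parallel_sum(values):
    """Simulate a parallel reduction: in each synchronous round, one
    'processor' per pair adds two disjoint values, so n values are summed
    in ceil(log2 n) rounds. Returns (sum, number of rounds)."""
    vals = list(values)
    rounds = 0
    while len(vals) > 1:
        # One round: all pairs are combined independently, in parallel.
        combined = [vals[i] + vals[i + 1] for i in range(0, len(vals) - 1, 2)]
        if len(vals) % 2:              # odd element carries over unchanged
            combined.append(vals[-1])
        vals = combined
        rounds += 1
    return vals[0], rounds

total, rounds = parallel_sum(range(1, 9))  # 8 values: 8 -> 4 -> 2 -> 1
print(total, rounds)                       # → 36 3
```

In the intuitive definition, this is k = 1 and d = 1; simulating the same algorithm on a single processor takes O(n) total work, illustrating the note above about trading processors for time.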
