Randomized Complexity Classes; RP

• Let N be a polynomial-time precise NTM that runs in time p(n) and has 2 nondeterministic choices at each step.
• N is a polynomial Monte Carlo Turing machine for a language L if the following conditions hold:
  – If x ∈ L, then at least half of the 2^p(n) computation paths of N on x halt with "yes", where n = |x|.
  – If x ∉ L, then all computation paths halt with "no".
• The class of all languages with polynomial Monte Carlo TMs is denoted RP (randomized polynomial time) (Adleman and Manders, 1977).

Comments on RP

• Nondeterministic steps can be seen as fair coin flips.
• There are no false positive answers.
• The probability of a false negative, 1 − ε, is at most 0.5 (here ε is the fraction of accepting computation paths on a "yes" instance).
• But any constant between 0 and 1 can replace 0.5.
  – By repeating the algorithm k = ⌈ −1 / log₂(1 − ε) ⌉ times, the probability of a false negative becomes (1 − ε)^k ≤ 0.5.
• In fact, ε can be arbitrarily close to 0 as long as it is of the order 1/q(n) for some polynomial q(n).
  – −1 / log₂(1 − ε) = O(1/ε) = O(q(n)).

Where RP Fits

• P ⊆ RP ⊆ NP.
  – A deterministic TM is like a Monte Carlo TM except that all the coin flips are ignored.
  – A Monte Carlo TM is an NTM with extra demands on the number of accepting paths.
• compositeness ∈ RP; primes ∈ coRP; primes ∈ RP (Adleman and Huang, 1987).
  – In fact, primes ∈ P (Agrawal, Kayal, and Saxena, 2002).
• RP ∪ coRP is an alternative "plausible" notion of efficient computation.

ZPP (Zero Probabilistic Polynomial)

• The class ZPP (Gill, 1977) is defined as RP ∩ coRP.
• A language in ZPP has two Monte Carlo algorithms, one with no false positives and the other with no false negatives.
• If we repeatedly run both Monte Carlo algorithms, eventually one definite answer will come (unlike RP).
  – A positive answer from the one without false positives.
  – A negative answer from the one without false negatives.

The ZPP Algorithm (Las Vegas)

1: {Suppose L ∈ ZPP.}
2: {N1 has no false positives, and N2 has no false negatives.}
3: while true do
4:   if N1(x) = "yes" then
5:     return "yes";
6:   end if
7:   if N2(x) = "no" then
8:     return "no";
9:   end if
10: end while

(A runnable rendering of this loop appears below, after the Large Deviations slide.)

ZPP (concluded)

• The expected running time for the correct answer to emerge is polynomial.
  – The probability that a run of the 2 algorithms does not generate a definite answer is 0.5 (why?).
  – Let p(n) be the running time of each run of the while-loop.
  – The expected running time for a definite answer is
        Σ_{i=1}^{∞} 0.5^i · i · p(n) = 2p(n).
• Essentially, ZPP is the class of problems that can be solved without errors in expected polynomial time.

Large Deviations

• Suppose you have a biased coin.
• One side has probability 0.5 + ε of appearing and the other 0.5 − ε, for some 0 < ε < 0.5.
• But you do not know which is which.
• How to decide which side is the more likely side, with high confidence?
• Answer: Flip the coin many times and pick the side that appeared the most times.
• Question: Can you quantify the confidence?
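
Before turning to the Chernoff bound, the Las Vegas loop from the ZPP slides above can be rendered as ordinary code. The following is a minimal sketch, not part of the original slides; las_vegas, n1, and n2 are hypothetical names, and the two deciders are simulated with coin flips under the assumed one-sided error guarantees.

    import random

    def las_vegas(x, n1, n2, rng=random.Random(0)):
        # Loop until one of the one-sided deciders gives a definite answer.
        # n1 is assumed to have no false positives and n2 no false negatives,
        # exactly as in the ZPP definition; the expected number of rounds is 2
        # when each round is definite with probability 1/2.
        while True:
            if n1(x, rng) == "yes":   # n1's "yes" is always correct
                return "yes"
            if n2(x, rng) == "no":    # n2's "no" is always correct
                return "no"

    # Toy stand-ins for the language L = { even numbers }.
    def n1(x, rng):   # no false positives: says "yes" only if x really is in L
        return "yes" if x % 2 == 0 and rng.random() < 0.5 else "no"

    def n2(x, rng):   # no false negatives: says "no" only if x really is not in L
        return "no" if x % 2 == 1 and rng.random() < 0.5 else "yes"

    print(las_vegas(4, n1, n2))   # "yes"
    print(las_vegas(7, n1, n2))   # "no"

The expected two iterations per call match the 2p(n) bound derived on the "ZPP (concluded)" slide.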

The Chernoff Bound

Theorem 69 (Chernoff (1952)) Suppose x₁, x₂, …, xₙ are independent random variables taking the values 1 and 0 with probabilities p and 1 − p, respectively. Let X = Σ_{i=1}^{n} xᵢ. Then for all 0 ≤ θ ≤ 1,
      prob[ X ≥ (1 + θ) pn ] ≤ e^{−θ² pn / 3}.

• The probability that a binomial random variable deviates from its expected value
      E[X] = Σ_{i=1}^{n} E[xᵢ] = pn
  decreases exponentially with the deviation.
• Herman Chernoff (1923–). The Chernoff bound is asymptotically optimal.

The Proof

• Let t be any positive real number.
• Then
      prob[ X ≥ (1 + θ) pn ] = prob[ e^{tX} ≥ e^{t(1+θ) pn} ].
• Markov's inequality, generalized to real-valued random variables, says that
      prob[ e^{tX} ≥ k E[e^{tX}] ] ≤ 1/k.
• With k = e^{t(1+θ) pn} / E[e^{tX}], we have
      prob[ X ≥ (1 + θ) pn ] ≤ e^{−t(1+θ) pn} E[e^{tX}].

The Proof (continued)

• Because X = Σ_{i=1}^{n} xᵢ and the xᵢ's are independent,
      E[e^{tX}] = (E[e^{t x₁}])ⁿ = [ 1 + p(e^t − 1) ]ⁿ.
• Substituting, we obtain
      prob[ X ≥ (1 + θ) pn ] ≤ e^{−t(1+θ) pn} [ 1 + p(e^t − 1) ]ⁿ ≤ e^{−t(1+θ) pn} e^{pn(e^t − 1)},
  as (1 + a)ⁿ ≤ e^{an} for all a > 0.

The Proof (concluded)

• With the choice of t = ln(1 + θ), the above becomes
      prob[ X ≥ (1 + θ) pn ] ≤ e^{pn[ θ − (1+θ) ln(1+θ) ]}.
• The bracketed exponent expands to −θ²/2 + θ³/6 − θ⁴/12 + ··· for 0 ≤ θ ≤ 1, which is less than
      −θ²/2 + θ³/6 ≤ θ²(−1/2 + θ/6) ≤ θ²(−1/2 + 1/6) = −θ²/3.

Power of the Majority Rule

From prob[ X ≤ (1 − θ) pn ] ≤ e^{−θ² pn / 2} (prove it):

Corollary 70 If p = (1/2) + ε for some 0 ≤ ε ≤ 1/2, then
      prob[ Σ_{i=1}^{n} xᵢ ≤ n/2 ] ≤ e^{−ε² n / 2}.

• The textbook's corollary to Lemma 11.9 seems incorrect.
• Our original biased-coin problem (from the Large Deviations slide) hence demands ≈ 1.4 k/ε² independent coin flips to guarantee making an error with probability at most 2^{−k} with the majority rule.

BPP (Bounded Probabilistic Polynomial)

• The class BPP (Gill, 1977) contains all languages L for which there is a precise polynomial-time NTM N such that:
  – If x ∈ L, then at least 3/4 of the computation paths of N on x lead to "yes".
  – If x ∉ L, then at least 3/4 of the computation paths of N on x lead to "no".
• N accepts or rejects by a clear majority.

Magic 3/4?

• The number 3/4 bounds the probability of a right answer away from 1/2.
• Any constant strictly between 1/2 and 1 can be used without affecting the class BPP.
• In fact, 0.5 plus any inverse polynomial,
      0.5 + 1/p(n),
  (a quantity between 1/2 and 1) can be used.

The Majority Vote Algorithm

Suppose L is decided by N by majority (1/2) + ε.

1: for i = 1, 2, …, 2k + 1 do
2:   Run N on input x;
3: end for
4: if "yes" is the majority answer then
5:   "yes";
6: else
7:   "no";
8: end if

Analysis

• The running time remains polynomial, being 2k + 1 times N's running time.
• By Corollary 70, the probability of a false answer is at most e^{−ε² k}.
• By taking k = ⌈ 2/ε² ⌉, the error probability is at most 1/4.
• As with the RP case, ε can be any inverse polynomial, because k remains polynomial in n.
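
Corollary 70 and the Analysis slide above lend themselves to a quick numerical check. The sketch below is illustrative only (the function name and parameter choices are ours, not from the slides): it flips a coin with bias 1/2 + ε about 1.4 k/ε² times, applies the majority rule, and compares the empirical error rate with the bound e^{−ε² n / 2} ≤ 2^{−k}.

    import math
    import random

    def majority_rule_error(eps, k, trials=2000, seed=1):
        # Flip a coin with bias 1/2 + eps exactly n times, where
        # n = ceil(2 k ln 2 / eps^2), i.e., about 1.4 k / eps^2, and take
        # the majority.  By Corollary 70 the error probability is at most
        # exp(-eps^2 n / 2) <= 2^(-k).
        rng = random.Random(seed)
        n = math.ceil(2 * k * math.log(2) / eps ** 2)
        errors = 0
        for _ in range(trials):
            heads = sum(rng.random() < 0.5 + eps for _ in range(n))
            if heads <= n / 2:        # the majority picked the less likely side
                errors += 1
        return n, errors / trials, math.exp(-eps ** 2 * n / 2)

    print(majority_rule_error(eps=0.1, k=3))
    # roughly (416, 0.0, 0.125): the empirical error sits far below the bound 2^-3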

Probability Amplification for BPP

• Let m be the number of random bits used by a BPP algorithm.
  – By definition, m is polynomial in n.
• With k = Θ(log m) in the majority vote algorithm, we can lower the error probability to, say, ≤ (3m)^{−1}.

Aspects of BPP

• BPP is the most comprehensive yet plausible notion of efficient computation.
  – If a problem is in BPP, we take it to mean that the problem can be solved efficiently.
  – In this aspect, BPP has effectively replaced P.
• (RP ∪ coRP) ⊆ (NP ∪ coNP).
• (RP ∪ coRP) ⊆ BPP.
• Whether BPP ⊆ (NP ∪ coNP) is unknown.
• But it is unlikely that NP ⊆ BPP (discussed later).

coBPP

• The definition of BPP is symmetric: acceptance by clear majority and rejection by clear majority.
• An algorithm for L ∈ BPP becomes one for L̄ by reversing the answer.
• So L̄ ∈ BPP, and BPP ⊆ coBPP.
• Similarly, coBPP ⊆ BPP.
• Hence BPP = coBPP.
• This approach does not work for RP.
• It did not work for NP either.

BPP and coBPP

[Figure: the acceptance criteria for BPP and coBPP, with the "yes"/"no" majorities swapped between the two.]

"The Good, the Bad, and the Ugly"

[Figure: the known inclusions among the classes P, ZPP, RP, coRP, NP, coNP, and BPP.]

Circuit Complexity

• Circuit complexity is based on boolean circuits instead of Turing machines.
• A boolean circuit with n inputs computes a boolean function of n variables.
• By identifying true/1 with "yes" and false/0 with "no," a boolean circuit with n inputs accepts certain strings in {0, 1}ⁿ.
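
As a concrete illustration of how a boolean circuit with n inputs accepts strings in {0,1}ⁿ, here is a small evaluator sketch; it is not from the slides, and the gate representation is an arbitrary choice made for the example.

    from itertools import product

    # A minimal boolean-circuit evaluator.  A circuit is a list of gates;
    # gate i may refer to the inputs x0..x(n-1) or to earlier gates g0..g(i-1).
    # The last gate is the output.

    def eval_circuit(gates, bits):
        # Evaluate the circuit on an input string of 0/1 bits.
        values = {f"x{i}": b for i, b in enumerate(bits)}
        for i, (op, args) in enumerate(gates):
            a = [values[name] for name in args]
            if op == "AND":
                values[f"g{i}"] = int(all(a))
            elif op == "OR":
                values[f"g{i}"] = int(any(a))
            elif op == "NOT":
                values[f"g{i}"] = 1 - a[0]
        return values[f"g{len(gates) - 1}"]

    # Circuit for x0 XOR x1, built from AND/OR/NOT gates.
    xor = [
        ("NOT", ["x0"]), ("NOT", ["x1"]),              # g0 = ~x0, g1 = ~x1
        ("AND", ["x0", "g1"]), ("AND", ["g0", "x1"]),  # g2, g3
        ("OR",  ["g2", "g3"]),                         # g4 = output
    ]

    # The language accepted by this 2-input circuit: the strings it maps to 1.
    accepted = [bits for bits in product((0, 1), repeat=2) if eval_circuit(xor, bits)]
    print(accepted)   # [(0, 1), (1, 0)]

Enumerating all 2ⁿ inputs, as done here for the XOR circuit, is exactly the sense in which a circuit "accepts certain strings in {0,1}ⁿ."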
