Theory of Computation Chapter 1

Chapter 8: Memory, Paths, and Games
slides © 2019, David Doty
ECS 220: Theory of Computation
based on "The Nature of Computation" by Moore and Mertens

Space versus time

PSPACE = problems decidable with polynomial memory. You can reuse space, but you can't reuse time. This leads to unintuitive results:
• nondeterminism doesn't help space-bounded computation: PSPACE = NPSPACE
• proving something doesn't exist is as easy as proving it exists: NPSPACE = coNPSPACE

But some intuitions hold:
• with more space, you can compute more: SPACE(o(t)) ⊊ SPACE(t)
• if time is bounded, space is also bounded: TIME(t) ⊆ SPACE(t)
• if space is bounded, time is also bounded: SPACE(t) ⊆ TIME(2^O(t))

Biggest open question: does space help more than time? Is P ≠ PSPACE?

Read-only and write-only memory

• Sublinear-time computation is largely uninteresting for Turing machines.
• It is somewhat interesting for RAM machines, e.g., binary search.
• Sublinear-space computation makes more sense, e.g., for searching the graph G=(V,E), where V = {all web pages}: we cannot load the input into memory.
• This is formalized by giving the Turing machine/RAM machine a read-only input and read-write working memory; only the latter is counted as space usage.
• To talk about writing more output than the allowed space usage, use a write-only output. (Irrelevant for Boolean output, but not for space-bounded reductions.)
• The textbook assumes RAM (random access memory) in this chapter:
  • given input/working memory location i, a read/write to location i takes one step (note this only matters if we also care about the time)
  • note: if the size of input/working memory is k, it takes log(k) bits to write i

8.1: Welcome to the State Space

Space-bounded complexity classes

• SPACE(s(n)) = class of problems solvable with O(s(n)) working memory on inputs of size n
• L = SPACE(log n)
• PSPACE = ⋃_{c ∈ ℕ} SPACE(n^c)

Logarithmic space example: deciding palindromes

    def palindrome(x):
        i = 1
        j = |x|
        while i < j:
            if x[i] != x[j]:
                return False
            i += 1
            j -= 1
        return True

Space complexity: log n for i, log n for j.

Polynomial space example: checking if a configuration is in a periodic orbit of a cellular automaton

    def periodic_orbit(ca, init):
        n = |init|
        x = init
        for j = 1..2^n:
            x = update(ca, x)
            if x = init:
                return True
        return False

    def update(ca, x):
        y = ""
        for i = 0..|x|-1:
            l = x[(i-1) mod |x|]
            m = x[i]
            r = x[(i+1) mod |x|]
            y.append(ca(l, m, r))
        return y

Memory: init, x, y, j (n bits each) and i (log n bits).

Time bounds versus space bounds

Assumption: a program allocates O(1) bits per time step. Then O(t(n)) bits are allocated in total, so the space bound is s(n) = O(t(n)):

    TIME(t) ⊆ SPACE(t)        P ⊆ PSPACE

If a program uses O(s(n)) bits, it has at most 2^O(s(n)) configurations.
If it repeats a configuration, it runs forever; so if it halts on all inputs, t(n) = 2^O(s(n)):

    SPACE(s) ⊆ TIME(2^O(s))        L ⊆ P ⊆ PSPACE ⊆ EXP

Nondeterministic time versus deterministic space

Recall NTIME(t) ⊆ TIME(2^O(t)) (e.g., NP ⊆ EXP):

    def exhaustive_search_A(x):
        n = |x|
        for each w in {0,1}^(≤ witness-length(n)):
            if V_A(x, w) = True:
                return True
        return False

Memory used: witness-length(n) ≤ t(n) and space(V_A) ≤ t(n), so

    NTIME(t) ⊆ SPACE(t)        NP ⊆ PSPACE

Putting all relationships together

    L ⊆ P ⊆ NP ⊆ PSPACE ⊆ EXP ⊆ NEXP ⊆ EXPSPACE

• P ⊊ EXP (Time Hierarchy Theorem)
• NP ⊊ NEXP (Nondeterministic Time Hierarchy Theorem)
• L ⊊ PSPACE ⊊ EXPSPACE (Space Hierarchy Theorem: if s1 = o(s2), then SPACE(s1) ⊊ SPACE(s2))

8.2: Show Me the Way

Nondeterministic space-bounded computation

• The textbook goes through some "prover/verifier" formulations. Key difference with NP: witnesses can be of exponential length, for example, the sequence of moves in a sliding block puzzle (HuaRongDao, Wikipedia) or a chess game.
• I prefer the "nondeterministic program" formulation:
  • NSPACE(s) = problems solvable by a nondeterministic program using space O(s(n)) on inputs of size n
  • [correct answer = yes] ⇒ [some computation path accepts]
  • [correct answer = no] ⇒ [no computation path accepts]
• NL = NSPACE(log n)
• NPSPACE = ⋃_{c ∈ ℕ} NSPACE(n^c)

Prover/Verifier characterization of NSPACE(s)

• verifier algorithm V, input x, witness/proof w; |w| is arbitrary
• V has read-only access to both x and w:
  • RAM access to x
  • sequential access to the bits of w from left to right (like a DFA) (*)
• x is a yes-instance ⇔ (∃w) V(x,w) accepts
• Why (*)? Otherwise we could encode NP-complete problems using only logarithmic space, e.g., HAMPATH would be in NL, so we would have NL = NP.

Reachability

• REACHABILITY:
  • Given: directed graph G=(V,E) and two nodes s,t in V
  • Question: is there a path from s to t in G?
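For a concrete reference point, REACHABILITY can be solved deterministically by breadth-first search; the slides note that DFS/BFS runs in time O(|V|+|E|). Here is a minimal Python sketch; the function name `reachable_bfs`, the adjacency-dict representation, and the toy graph are my own illustration, not from the slides:

```python
from collections import deque

def reachable_bfs(adj, s, t):
    """Deterministic REACHABILITY by breadth-first search.

    adj: dict mapping each node to a list of out-neighbors.
    Note the visited set uses O(|V|) memory: linear, not logarithmic.
    """
    visited = {s}
    frontier = deque([s])
    while frontier:
        u = frontier.popleft()
        if u == t:
            return True
        for v in adj.get(u, []):
            if v not in visited:
                visited.add(v)
                frontier.append(v)
    return False

# A small directed graph: 0 -> 1 -> 2, plus an edge 3 -> 0.
adj = {0: [1], 1: [2], 2: [], 3: [0]}
print(reachable_bfs(adj, 0, 2))  # True: path 0 -> 1 -> 2
print(reachable_bfs(adj, 2, 0))  # False: no edges leave 2
```

The linear memory for `visited` is exactly what the later slides on Savitch's algorithm work to avoid.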
• Claim: REACHABILITY ∈ NL. Why?

    def reachable(G=(V,E), s, t):
        u = s
        num_searched = 1
        while u != t:
            v = guess a neighbor of u
            u = v
            if num_searched = |V|:
                return False
            num_searched += 1
        return True

Memory needed: u, v, num_searched (log |V| bits each).

The long computational reach of REACHABILITY

• If our nondeterministic program has space s, we can search graphs of size up to 2^s (i.e., internet-sized graphs).
• Flip this around: every nondeterministic program is defined completely by its configuration reachability graph:
  • V = set of configurations of the program (state of memory)
  • (u,v) ∈ E iff there is a nondeterministic transition from u to v
  • can assume a single accepting configuration a (the TM erases all tapes before halting)
  • deterministic programs have a line graph; nondeterministic is more general
• So every problem in NSPACE(s) is equivalent to a REACHABILITY problem on a graph of size 2^O(s): given input x with starting configuration c_x, can we reach from c_x to a?
• REACHABILITY on G=(V,E) is solvable using DFS in time O(|V|+|E|) = O(2^O(s) + 2^O(s)), assuming O(1) degree, so

    NSPACE(s) ⊆ TIME(2^O(s))        NL ⊆ P        NPSPACE ⊆ EXP

8.3: L and NL-completeness

NL-completeness

• Previous slide:
  • REACHABILITY ∈ NL
  • every problem in NL is equivalent to a REACHABILITY problem on a polynomial-size graph
• We'll define NL-completeness, and show REACHABILITY is NL-complete.
• L = NL ⇔ REACHABILITY ∈ L.

Logspace reductions

• A reduction f: {0,1}* → {0,1}* from A to B is logspace if it is computable by an O(log n)-space-bounded program. Write A ≤L B.
  • input is read-only
  • output is write-only
  • worktape is read/write; only the worktape counts against space usage
• Most reductions used in NP-completeness proofs are logspace
  • e.g., to reduce CLIQUE to INDEPENDENT-SET, to determine whether to add edge {u,v} to the output, one need only ask whether {u,v} is an edge in the input graph
• B is NL-complete if B ∈ NL and B is NL-hard: for all A ∈ NL, A ≤L B.
• Claim: logspace reductions are transitive: A ≤L B and B ≤L C ⇒ A ≤L C. Why? (Composing them can be very slow: lots of recomputation of already-computed bits just to save space.)

First NL-complete problem

• NL-WITNESS-EXISTENCE:
  • Given: nondeterministic program P, input x, integer k in unary (string 1^k)
  • Question: is there a sequence of guesses P(x) can make so that it accepts while using at most log k bits of memory?
• NL-WITNESS-EXISTENCE is NL-hard: for any A ∈ NL, decided by a c·log(n)-space-bounded program P, to reduce A to NL-WITNESS-EXISTENCE, on input x, output (P, x, k), where k = n^c. (The reduction needs to count how many 1's it has written; this takes log(n^c) = c·log n bits to store.)
• NL-WITNESS-EXISTENCE ∈ NL: the nondeterministic program Q deciding whether (P, x, k) ∈ NL-WITNESS-EXISTENCE runs P(x), checking to ensure its space usage never exceeds log k. Since k is given in unary, Q uses space log k ≤ log n, where n = |(P, x, k)|.

REACHABILITY is NL-complete

• reduction showing NL-WITNESS-EXISTENCE ≤L REACHABILITY
• input (P, x, k); output (G, s, t). What are G, s, and t?
• want: P(x) accepts using ≤ log k space ⇔ there is a path from s to t in G
• G = the (log k)-space-bounded configuration reachability graph of P:
  • V = { configurations of P using ≤ log k space }
  • E = { (u,v) | P goes from u to v in one step }
  • s = starting configuration c_x on input x
  • t = accepting configuration

Simulating nondeterminism deterministically

• This seems to incur exponential time overhead: NTIME(t) ⊆ TIME(2^O(t)), and we don't know how to do better with time.
• We can do much better with space, incurring only a quadratic overhead.
• Savitch's Theorem: for any s(n) ≥ log n, NSPACE(s) ⊆ SPACE(s²).
  • Corollary: NPSPACE = PSPACE
  • Corollary: NL ⊆ SPACE(log² n) (polylogarithmic, but not logarithmic)
• But the Space Hierarchy Theorem says L = SPACE(log n) ≠ SPACE(log² n), so we still don't know whether L = NL.
8.4: Middle-first search and nondeterministic space

REACHABILITY ∈ SPACE(log² n)

• BFS and DFS use linear memory: to avoid visiting the same node twice, they must store all the visited nodes.
• Savitch's algorithm ("middle-first search") visits each node repeatedly. (Figure omitted: a path from s to t split at its midpoints, with segment bounds k/2, k/4, ….) Let s,t ∈ V and k > 0: there is a path from s to t of length ≤ k iff there is some middle node v with paths from s to v and from v to t, each of length ≤ ⌈k/2⌉.
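The recursion behind middle-first search can be sketched directly. The following Python is my own illustration (names like `mfs_path` are not from the slides), and it runs on ordinary Python objects rather than bit-level configurations; a genuinely space-bounded implementation would iterate over configuration indices, but the recursive structure is the same:

```python
def mfs_path(adj, nodes, s, t, k):
    """Is there a path from s to t of length at most k?

    Middle-first search: try every possible middle node v, halving the
    path-length bound. Recursion depth is O(log k), and each stack frame
    stores only a constant number of node names plus the bound k, which
    is how the O(log^2 n) space bound arises in the Turing-machine
    accounting (at the cost of massive recomputation).
    """
    if k == 0:
        return s == t
    if k == 1:
        return s == t or t in adj.get(s, [])
    half = (k + 1) // 2  # ceil(k/2)
    return any(
        mfs_path(adj, nodes, s, v, half) and mfs_path(adj, nodes, v, t, k - half)
        for v in nodes
    )

# A directed path 0 -> 1 -> 2 -> 3; any path has length at most |V|.
adj = {0: [1], 1: [2], 2: [3], 3: []}
nodes = list(adj)
print(mfs_path(adj, nodes, 0, 3, len(nodes)))  # True: path of length 3
print(mfs_path(adj, nodes, 3, 0, len(nodes)))  # False: no edges leave 3
```

Calling it with k = |V| decides REACHABILITY, since any simple path has length at most |V|.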
