Polynomial Space

Klaus Sutner
Carnegie Mellon University
2019/03/26

1 Context-Sensitive Languages
2 Linear Bounded Automata
3 Polynomial Space


1 Context-Sensitive Languages


Towards Polynomial Space

Logarithmic space and linear (deterministic) space clearly make algorithmic sense. But how about nondeterministic linear space?

Generalizing context-free grammars naturally leads to context-sensitive grammars, and the obvious parsing “algorithm” for a context-sensitive language is in NSPACE(n) (and, by Savitch, in SPACE(n^2)).

Somewhat surprisingly, it’s just a small step from there to all of PSPACE.


Context-Sensitive Grammars

Definition (CSG)
A context-sensitive grammar (CSG) is a grammar where all productions are of the form

    αAβ → αγβ    where γ ≠ ε

For technical reasons, some authors also allow S → ε, in which case S may not appear on the righthand side of any production. A language is context-sensitive if it can be generated by a context-sensitive grammar.

Note the constraint that the replacement string γ ≠ ε; as a consequence, α ⇒ β implies |α| ≤ |β|.


Brute-Force Recognition

Lemma
Every context-sensitive language is decidable.

Proof.
Suppose w ∈ Σ* and n = |w|. In any potential derivation (α_i)_{i<N} we have |α_i| ≤ n.

So consider the derivation graph D (recall the pictures from last time for CFLs): the vertices are the strings in Γ^{≤n}, and the edges are the one-step derivations α ⇒ β.

Then w is in L iff w is reachable from vertex S in D. □


Needless to say ...

Lemma
Not all decidable languages are context-sensitive.

Proof.
Here is a cute diagonalization argument for this.

Let (x_i)_i be an effective enumeration of Σ* and (G_i)_i an effective enumeration of all CSGs over Σ (say, both in length-lex order). Set

    L = { x_i | x_i ∉ L(G_i) }

By the lemma, L is decidable. But L cannot be context-sensitive by the usual diagonal mumbo-jumbo. □


Example: Counting

We know that the language

    L = { a^n b^n c^n | n ≥ 1 }

is not context-free. Here is a context-sensitive grammar G for L: let V = {S, B} and set

    S  → aSBc | abc
    cB → Bc
    bB → bb

A typical derivation looks like

    S ⇒* a^{n+1} b c (Bc)^n ⇒* a^{n+1} b B^n c^{n+1} ⇒* a^{n+1} b^{n+1} c^{n+1}

It follows by induction that L ⊆ L(G).


Not so fast ...

Alas, we also need to show that L(G) ⊆ L.

This is a bit harder: we need to show that the productions cannot be abused in some unintended way to generate other strings. Recall that there is no restriction on the order in which productions are applied; they just have to match. E.g., the following is allowed:

    S ⇒* aaSBcBc ⇒ aaSBBcc

Exercise
Figure out the details.
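
The reachability argument from the Brute-Force Recognition lemma is easy to turn into code, and doing so gives a quick sanity check for the grammar G above. The following Python sketch is not part of the slides (the rule encoding and function names are made up for illustration): it performs a BFS in the derivation graph D, pruning every sentential form longer than the target string.

    from collections import deque

    # The grammar G for { a^n b^n c^n | n >= 1 }, written as plain
    # string-rewriting rules (lefthand side, righthand side).
    RULES = [("S", "aSBc"), ("S", "abc"), ("cB", "Bc"), ("bB", "bb")]

    def successors(form, bound):
        """All sentential forms reachable in one derivation step,
        pruned to the length bound |w| (monotonicity makes this safe)."""
        for lhs, rhs in RULES:
            i = form.find(lhs)
            while i != -1:
                succ = form[:i] + rhs + form[i + len(lhs):]
                if len(succ) <= bound:
                    yield succ
                i = form.find(lhs, i + 1)

    def derives(w):
        """BFS in the derivation graph D: vertices are strings of
        length <= |w|, edges are one-step derivations; w is in L(G)
        iff w is reachable from the vertex S."""
        seen, queue = {"S"}, deque(["S"])
        while queue:
            form = queue.popleft()
            if form == w:
                return True
            for succ in successors(form, len(w)):
                if succ not in seen:
                    seen.add(succ)
                    queue.append(succ)
        return False

    assert derives("abc") and derives("aaabbbccc")
    assert not derives("aabcc") and not derives("abcabc")

The seen-set makes termination obvious: there are only finitely many strings of bounded length over the alphabet of G, which is exactly the content of the lemma.
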
Example: More Counting

It is also known that the language

    L = { x ∈ {a,b,c}* | #_a x = #_b x = #_c x }

is not context-free. But, again, it is easily context-sensitive: let V = {S, S_0, A, B, C} and set

    S   → S_0 | ε
    S_0 → S_0 ABC | ABC
    XY  → YX    for all X, Y ∈ {A, B, C}
    A → a
    B → b
    C → c

Note that most productions are actually context-free. The critical part is the commutation productions for {A, B, C}.


Closure Properties

Theorem
Context-sensitive languages are closed under union, concatenation, Kleene star and reversal. They are also closed under ε-free homomorphisms.

Proof.
Straightforward by manipulating the grammar. □

Note that arbitrary homomorphisms do not work in this case: they erase too much information and can force too large a search.


Two Innocent Questions

Are CSL closed under intersection?
Are CSL closed under complement?

The answer is Yes in both cases (so this is quite different from context-free languages).

The proof for intersection can be based on a machine model, and is much easier than the proof for complement (which requires a special and very surprising counting technique; see next lecture).


Normal Form

Theorem (Kuroda)
Every context-sensitive grammar can be written with productions of the form

    A → BC    AB → CD    A → a

The proof is very similar to the argument for Chomsky normal form for CFG (which uses only productions A → BC and A → a).

Note that the recognition algorithm becomes particularly simple when the CSG is given in Kuroda normal form: we first get rid of all terminals and then operate only on pairs of consecutive variables.


Monotonicity

Derivations in CSGs are length-non-decreasing. Correspondingly, define a grammar to be monotonic if all productions are of the form

    π : α → β    where α ∈ Γ*VΓ*, β ∈ Γ*, |α| ≤ |β|

As we have seen already, a monotonic grammar can only generate a decidable language.

In fact, monotonic grammars look rather similar to context-sensitive grammars, except that we are now allowed to manipulate the context (but see the Tiny Example below). In particular, every CSG is monotonic.


Monotonic is Context-Sensitive

Theorem
A language is context-sensitive iff it is generated by a monotonic grammar.

Proof.
Using the LBAs introduced below, it suffices to show that every monotonic language can be accepted by an LBA. Initially, some terminal string a_1 a_2 ... a_n is written on the tape.

Search handle: the head moves to the right a random number of places, say, to a_i.

Check righthand side: the machine verifies that a_i ... a_j = β, where α → β is a production.

Replace by lefthand side: the block a_i ... a_j is replaced by α, possibly leaving some blanks.

Collect: remove the possible blanks by shifting the rest of the tape left.

This loop repeats until the tape is reduced to S, and we accept. If any of the guesses goes wrong, we reject. □


Tiny Example

One can also directly convert monotonic productions to context-sensitive ones. As a simple example, consider the commutativity rule

    AB → BA

Here are equivalent context-sensitive rules:

    AB → AX
    AX → BX
    BX → BA

Here X is a new variable, and in each rule exactly one symbol (highlighted in the slides) is replaced. Note how the left/right context is duly preserved.
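
The marker trick from the Tiny Example mechanizes nicely. Below is a Python sketch of the conversion, again my own illustration rather than anything from the slides, restricted to length-preserving rules (the general monotonic case needs additional padding machinery). For AB → BA it produces four rules instead of the three shown above, but the principle is the same: each generated rule rewrites exactly one symbol inside a fixed context.

    def to_context_sensitive(lhs, rhs, fresh="Z"):
        """Convert one length-preserving monotonic rule lhs -> rhs into
        a list of context-sensitive rules (each rewriting exactly one
        symbol in context), using fresh marker variables Z0, Z1, ...
        that must not occur anywhere else in the grammar."""
        assert len(lhs) == len(rhs), "sketch covers |lhs| = |rhs| only"
        k = len(lhs)
        marks = [f"{fresh}{i}" for i in range(k)]
        rules = []
        # first pass: replace the lefthand side by markers, left to right
        for i in range(k):
            before = marks[:i] + list(lhs[i:])
            after = marks[:i + 1] + list(lhs[i + 1:])
            rules.append(("".join(before), "".join(after)))
        # second pass: replace the markers by the righthand side
        for i in range(k):
            before = list(rhs[:i]) + marks[i:]
            after = list(rhs[:i + 1]) + marks[i + 1:]
            rules.append(("".join(before), "".join(after)))
        return rules

    # the commutativity rule AB -> BA from above:
    for lhs, rhs in to_context_sensitive("AB", "BA"):
        print(lhs, "->", rhs)
    # prints: AB -> Z0B, Z0B -> Z0Z1, Z0Z1 -> BZ1, BZ1 -> BA

The markers must be globally fresh, one set per rule, so that separate rule applications cannot get entangled: a stalled first pass leaves markers that only the matching second pass can remove.
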
And the Difference?

CFL recognition is cubic time, via a dynamic programming algorithm that basically enumerates all possible parse trees. Why would a similar approach not also work for CSLs? As Kuroda normal form shows, on the face of it the productions seem only mildly more complicated.

Recall how one can associate undecidability with CFGs: we are interested in the language C(M) of all accepting computations of a Turing machine M,

    # C_0 # C_1 # ... # C_n #

Unfortunately, this language fails to be context-free: the copy language { ww | w ∈ Σ* } is not context-free, and this one is worse.


Tricks

So consider the complement Σ* − C(M) instead, together with a representation where we use alternating mirror images:

    # C_0 # C_1^op # C_2 # C_3^op # ... # C_n #

Claim
The resulting language of non-computations is context-free.

This works essentially since palindromes are context-free.


And Context-Sensitive?

For context-sensitive languages there is no problem: we can directly generate C(M). This requires a bit of work using a grammar directly, but is fairly easy to see from the machine model (see below): an LBA can easily perform the necessary checks.

As a consequence, even some of the most basic questions about CSG are undecidable. This is really quite surprising.

Theorem
It is undecidable whether a CSG generates the empty language.

(For a grammar generating C(M), emptiness amounts to asking whether M accepts no input whatsoever, and that is undecidable.)


Decidability Overview

                 x ∈ L   L = ∅   L = Σ*   L = K   L ∩ K = ∅
    regular        Y       Y       Y        Y        Y
    DCFL           Y       Y       Y        Y        N
    CFL            Y       Y       N        N        N
    CSL            Y       N       N        N        N
    decidable      Y       N       N        N        N
    semi-dec.      N       N       N        N        N

Standard decision problems for various language classes. Needless to say, for the decidable ones we would like a more precise complexity classification (in which case it may matter how precisely the instance is given).


2 Linear Bounded Automata


A Machine Model?

In order to find a parser for CSL it seems natural to look for an associated machine model:

    semidecidable:   Turing machines
    context-free:    pushdown automata
    regular:         finite state machines

We need a machine model that is stronger than pushdown automata (FSM plus stack), but significantly weaker than full Turing machines. Note that two stacks won't work (an FSM with two stacks already has full Turing machine power); we need a different approach.


Backwards Derivation

Suppose L is context-sensitive via G. The idea is to run a derivation of G backwards, starting at a string x of terminals.

To this end, nondeterministically guess a handle in the current string: a place where there is a substring of the form β, where α → β is a production in G. Erase β and replace it by α. Rinse and repeat.

The original string x is in L iff we can ultimately reach S this way. Of course, this is yet another path-existence problem in a suitable digraph.


Linear Bounded Automata

Definition
A linear bounded automaton (LBA) is a type of one-tape, nondeterministic Turing machine acceptor where the input is written between special end-markers and the computation can never leave the space between these markers (nor overwrite them).

Thus the initial configuration looks like

    # q_0 x_1 x_2 ... x_n #

and the tape head can never leave this part of the tape.

It may seem that there is not enough space to perform any interesting computations on an LBA, but note that we can use a sufficiently large tape alphabet to “compress” the input to a fraction of its original size and make room.


The Myhill-Landweber-Kuroda Theorem

The development happened in stages:

    Myhill 1960 considered deterministic LBAs.
    Landweber 1963 showed that they produce only context-sensitive languages.
    Kuroda 1964 generalized to nondeterministic LBAs and showed that this produces precisely all the context-sensitive languages.

Theorem
A language is accepted by a (nondeterministic) LBA iff it is context-sensitive.

Proof.
One direction is by definition of an LBA.


Digression: Physical Realizability II

It was recognized already in the 1950s that Turing machines are, in many ways, too general to describe anything resembling the type of computation that was physically realizable.
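
Returning to LBAs: to see why LBA acceptance is decidable, note that on an input of length n the machine has at most |Q| * (n+2) * |Γ|^n configurations between the end-markers, so a search of the finite configuration graph settles acceptance. The Python sketch below makes this concrete; the transition-table format and the toy machine are hypothetical, chosen only for illustration.

    from collections import deque

    def lba_accepts(delta, q0, q_accept, x):
        """Decide whether a nondeterministic LBA accepts x by BFS over
        its finite configuration graph. delta maps (state, read symbol)
        to a set of (new state, written symbol, move in {"L", "R"})."""
        tape0 = ("#",) + tuple(x) + ("#",)      # input between end-markers
        start = (q0, 1, tape0)                  # head on the first input cell
        seen, queue = {start}, deque([start])
        while queue:
            q, h, tape = queue.popleft()
            if q == q_accept:
                return True
            for q2, write, move in delta.get((q, tape[h]), ()):
                if tape[h] == "#" and write != "#":
                    continue                    # markers cannot be overwritten
                h2 = h + (1 if move == "R" else -1)
                if not 0 <= h2 < len(tape):
                    continue                    # the head stays between the markers
                conf = (q2, h2, tape[:h] + (write,) + tape[h + 1:])
                if conf not in seen:
                    seen.add(conf)
                    queue.append(conf)
        return False

    # toy LBA: scan right and nondeterministically "commit" on some cell,
    # accepting iff that cell holds an a (so the language is Σ* a Σ*)
    delta = {
        ("scan", "a"): {("scan", "a", "R"), ("check", "a", "R")},
        ("scan", "b"): {("scan", "b", "R")},
    }
    assert lba_accepts(delta, "scan", "check", "ba")
    assert not lba_accepts(delta, "scan", "check", "bbb")

The same exhaustive search, run over sentential forms instead of configurations, is precisely the backwards-derivation recognizer sketched earlier: both are path-existence problems in a finite, if exponentially large, digraph.
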
