2.7 Turing Machines and Grammars

We now turn our attention back to Turing Machines as language acceptors. We have already seen in Sec. 2.4 how Turing Machines define two classes of languages, i.e., recursive languages and recursively enumerable languages, depending on whether string membership in the respective languages can be decided or merely accepted.

Recursive and Recursively Enumerable Languages

Accepting vs Deciding:
• M accepts a language when it halts (resp. hangs) on a string that is (resp. is not) a member of the language.
• M decides a language when it halts on every string, signalling whether or not the string is a member.

Accepting, Deciding and Languages:
• Recursive languages are decidable.
• Recursively enumerable languages are accepted (recognised) by Turing Machines.

(Slide 58)

Chomsky Language Class Hierarchy
• Regular Languages: recognised by Regular Grammars.
• Context Free Languages: recognised by Context Free Grammars.
• ...
• Phrase Structured Languages: recognised by Phrase Structured Grammars.

(Slide 59)

An alternative way to categorise languages is by using (Phrase Structured) Grammars. But how do these language classes compare to those defined by Turing Machines? In an earlier course we already laid some foundations towards answering this question by considering the machines that characterise these language classes. For instance, recall that regular languages are recognised by Finite State Automata (FSAs). We also saw that the inverse (complement) of a regular language is likewise recognised by an FSA. Thus, if we construct two Turing Machines corresponding to the FSAs that recognise a regular language L and its complement respectively, then we can construct a third Turing Machine that dovetails the two in search of whether x ∈ L or x ∉ L (a small code sketch of this idea is given below). This third Turing Machine terminates as soon as either of the two sub-machines terminates, returning true if the Turing Machine accepting x ∈ L terminates and false if the Turing Machine accepting the complement terminates. Clearly, this third Turing Machine will always terminate since, for any x, we have either x ∈ L or else x ∉ L. This leads us to conclude that the regular languages are included among the recursive languages.

Regular Grammars and Recursive Languages

Theorem 41. L_Reg ⊂ L_Rec

Proof. Consider both inclusion relations:
• ⊆: FSA transitions can be encoded as Turing Machine transitions of the form δ(q1, a) = (q2, R).
• ⊉: the palindromes, { w w^R, w a w^R | w ∈ Σ*, a ∈ Σ }, are in L_Rec but not in L_Reg.

(Slide 60)

The observation is completed by noting that a Turing Machine can easily act as an FSA: all of its transitions would need to be of the form δ(q1, a) = (q2, R), whereby we only read from the tape and transition internally from one state to the other, never writing on the tape or moving in the other direction. This brief (and informal) analysis allows us to affirm our intuition that the regular languages are included in the recursive languages, i.e., L_Reg ⊆ L_Rec. In an earlier exercise we also discussed how palindromes are decidable, and from this fact we conclude that they are included in the recursive languages. However, palindromes cannot be recognised by FSAs (recall that we needed more powerful machines, namely Pushdown Automata, for this). This allows us to establish the strict inclusion L_Reg ⊂ L_Rec.

In the next subsections we will attempt to complete the picture of how phrase structured languages relate to the recursive and recursively enumerable languages.
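To make the dovetailing argument above a little more concrete, here is a minimal Python sketch. Each machine is modelled as a generator that yields 'step' while it is still running and 'accept' if it halts in acceptance; the decider interleaves one step of each simulation. The helper names and the toy regular language used here are assumptions of this illustration and are not taken from the notes.

```python
def make_semidecider(predicate):
    """Build a toy semi-decider: a generator that yields 'step' while it is
    'working', yields 'accept' if predicate(x) holds, and otherwise keeps
    yielding 'step' forever (i.e., the machine hangs on non-members)."""
    def run(x):
        yield 'step'                      # pretend to do some work first
        if predicate(x):
            yield 'accept'
        while True:                       # non-members make this machine hang
            yield 'step'
    return run

def dovetail(accept_L, accept_co_L, x):
    """Interleave one step of each semi-decider until one of them accepts.
    Since every string is either in L or in its complement, exactly one of
    the two runs is guaranteed to reach 'accept', so this loop always halts."""
    run_L, run_co_L = accept_L(x), accept_co_L(x)
    while True:
        if next(run_L) == 'accept':
            return True                   # x is in L
        if next(run_co_L) == 'accept':
            return False                  # x is in the complement of L

# toy regular language: strings over {a, b} of even length
accept_L    = make_semidecider(lambda s: len(s) % 2 == 0)
accept_co_L = make_semidecider(lambda s: len(s) % 2 == 1)

print(dovetail(accept_L, accept_co_L, "abab"))   # True
print(dovetail(accept_L, accept_co_L, "aba"))    # False
```

Running the two machines one after the other would not work, since the first might never halt; interleaving single steps is what guarantees that the combined procedure makes progress and eventually terminates.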
2.7.1 Turing Machines and Context Free Languages

Context Free Languages (CFLs) are the languages that are recognised by Context Free Grammars, i.e., grammars whose production rules are of the form N → (N ∪ Σ)*. They are also the languages that are recognised by a type of machine called Pushdown Automata; the type we considered in an earlier course were actually Non-deterministic Pushdown Automata (NPDAs). We shall use this key information to show how languages recognised by Turing Machines relate to CFLs.

The relationship we will show is outlined on Slide 61, i.e., that there is a strict inclusion between CFLs and the recursive languages. In order to show this we have to demonstrate that:

1. Every CFL can be decided by some Turing Machine.
2. There are recursive languages that are not CFLs.

As in the case of Slide 60, in order to prove the second point above we only need to find one witness language, which together with the first point would mean that the set of recursive languages is strictly larger than the set of Context Free Languages.

Context Free Languages

Theorem 42. Context Free Languages are strictly included in the Recursive Languages: L_CFG ⊂ L_Rec.

Proof. Consider both inclusion relations:
• ⊆: a CFG can be converted into Chomsky Normal Form, where derivations of a string w are bounded by at most 2|w| − 1 steps.
• ⊉: the language { a^n b^n c^n | n ≥ 0 } is in L_Rec but not in L_CFG.

(Slide 61)

The language { a^n b^n c^n | n ≥ 0 } satisfies our needs as this witness language. Recall that, in an earlier course, we established that this language cannot be recognised by any Context Free Grammar. Moreover, on Slide 35 we showed how to construct a Turing Machine that decides this language, i.e., showing that it is recursive.

This leaves us with the task of showing that every CFL is decidable by a Turing Machine. It turns out that, with our present machinery, the proof of this would be rather involved.[5] Instead, we prove a weaker result here, namely that CFLs are included in the set of recursively enumerable languages. This follows from Lemma 43 on Slide 62.

[5] This proof involves converting the CFG to its Chomsky Normal Form. We then use the result that, for CFGs in normal form, any derivation of a string w is bounded and requires at most 2n − 1 steps, where n = |w|. Since derivations have an upper bound, we can construct a membership-checking Turing Machine that always terminates (and returns a negative answer after 2n − 1 derivation steps).
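The bound quoted in footnote [5] suggests a very naive decision procedure: enumerate every derivation of length at most 2|w| − 1 and check whether any of them produces w. The following is a minimal Python sketch of that idea for a grammar already in Chomsky Normal Form; the dictionary encoding of the grammar, the length-based pruning and the toy example grammar are assumptions made for this illustration rather than constructions from the notes.

```python
from collections import deque

def cnf_member(grammar, start, w):
    """Naive membership check for a grammar already in Chomsky Normal Form
    (rules A -> BC or A -> a, no epsilon rules).  Any CNF derivation of a
    string of length n takes exactly 2n - 1 steps, so searching all
    derivations up to that depth is guaranteed to terminate."""
    n = len(w)
    if n == 0:
        return False                       # epsilon is handled separately in CNF
    bound = 2 * n - 1
    frontier = deque([((start,), 0)])      # (sentential form, steps used so far)
    while frontier:
        form, steps = frontier.popleft()
        if steps > bound or len(form) > n:
            continue                       # too deep / too long: cannot yield w
        if all(sym not in grammar for sym in form):
            if "".join(form) == w:
                return True                # found a derivation of w
            continue
        # expand the leftmost nonterminal in every possible way
        i = next(i for i, sym in enumerate(form) if sym in grammar)
        for rhs in grammar[form[i]]:
            frontier.append((form[:i] + rhs + form[i + 1:], steps + 1))
    return False                           # no derivation within the bound

# toy CNF grammar for { a^n b^n | n >= 1 }
G = {
    "S": [("A", "T"), ("A", "B")],         # S -> AT | AB
    "T": [("S", "B")],                     # T -> SB
    "A": [("a",)],                         # A -> a
    "B": [("b",)],                         # B -> b
}
print(cnf_member(G, "S", "aabb"))          # True
print(cnf_member(G, "S", "aab"))           # False
```

Because every candidate derivation is abandoned after at most 2|w| − 1 steps, the search space is finite and the procedure always halts, which is the essence of the decidability claim (though far from an efficient parsing algorithm).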
Context Free Languages and Recursively Enumerable Languages

Lemma 43 (CFL and Acceptability). Every CFL can be recognised by some Turing Machine.

Proof. Use a 2-tape non-deterministic Turing Machine whereby:
• the 1st tape simulates the input tape, with the head moving only to the right;
• the 2nd tape simulates the stack (push and pop) using a string.

(Slide 62)

(Proof Outline). If a language is Context Free, then there exists an NPDA that recognises it. Unfortunately, the inherent non-determinism of NPDAs does not allow us to state much about the termination of every run of such a machine. All that NPDA recognition gives us is that there exists at least one run that accepts each string in the recognised CFL, i.e., we only have termination guarantees for strings in the language, and the non-determinism of the machine prohibits us from stating anything about decidability.

Nevertheless, we can use a 2-tape Turing Machine to easily simulate an NPDA, whereby we use the first tape as the input tape (leaving the input string untouched and always moving the head to the right) and use the second tape to simulate the NPDA stack (adding symbols to, and removing them from, the rightmost position of the string on the second tape). To keep the simulation simpler, we can even use a non-deterministic Turing Machine. Such a simulation, together with the results from Sections 2.6.2 and 2.6.3, guarantees that there exists some deterministic Turing Machine that can simulate the NPDA and therefore recognise the language.

2.7.2 Turing Machines and Phrase Structured Languages

Generic (Phrase Structured) Grammars (PSGs), like Turing Machines, can be seen as a mechanical description of how to transform a string into some other string. There are, however, three key differences between the two models of computation:

• Turing Machines, at least the plain vanilla variant, are deterministic. This is not the case for PSGs, where the production rules may allow non-deterministic derivations, i.e., α → β1 and α → β2 for the same substring α. Moreover, PSGs allow expansions to happen at multiple points in the string being generated, whereas Turing Machines can only alter the string at the location pointed to by the head.

• PSGs do not express any explicit notion of state, whereas Turing Machine descriptions are more intensional, i.e., closer to an "implementation". In fact, state plays a central role in determining when a Turing Machine computation terminates.

• String acceptance in Turing Machines starts from the string and works its way back, whereas PSG string acceptance works in reverse, by generating the string.

Phrase Structured Languages and Recursively Enumerable Languages

Both transform strings to strings, but:
• Turing Machines, at least the plain vanilla variant, are deterministic.
• PSGs do not express any explicit notion of state.
• String acceptance in Turing Machines starts from the string and works its way back, whereas PSG string acceptance works in reverse, by generating the string.

(Slide 63)

In what follows we will show that, despite these discrepancies, the two formalisms are equally expressive. By this we mean that every language that can be recognised by a PSG, i.e., any PSL, can be recognised by a Turing Machine, and also that any language recognised by a Turing Machine can be recognised by a PSG. Thm. 44 on Slide 64 formalises this statement whereby, for the sake of notational consistency, we denote the PSLs by L_PSG and the recursively enumerable languages by L_RE.

In order to show L_RE ⊆ L_PSG we need to establish some correspondence between computation on a Turing Machine and string derivations in a Phrase Structure Grammar (a rough code sketch of this idea is given below). We start by formulating a string

Phrase Structured Languages and Recursively Enumerable Languages

Theorem 44. L_PSG = L_RE

Proof. We need to show:
1. L_PSG ⊆ L_RE
2. L_RE ⊆ L_PSG

(Slide 64)
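To give a flavour of the correspondence that the second proof obligation relies on, the following minimal Python sketch represents a Turing Machine configuration as a string with the current state symbol embedded in it, so that a single transition becomes a purely local rewrite of that string, which is exactly the kind of step a phrase structure production α → β can express. The configuration encoding, the toy machine and all names below are assumptions of this illustration, not the construction developed in the notes.

```python
BLANK = "_"

def step(config, delta):
    """Apply one Turing Machine transition to a configuration (u, q, v),
    where u is the tape content to the left of the head, q is the current
    state and v is the tape content from the head onwards.  Written out as
    the single string u q v, the update only touches a bounded window
    around q: exactly the kind of local rewrite a production can express."""
    u, q, v = config
    a = v[0] if v else BLANK               # symbol under the head
    q2, b, move = delta[(q, a)]            # next state, symbol written, L/R
    rest = v[1:]
    if move == "R":
        # ... u q a rest ...  becomes  ... u b q2 rest ...
        return (u + b, q2, rest)
    else:
        # ... u' x q a rest ...  becomes  ... u' q2 x b rest ...
        return (u[:-1], q2, (u[-1] if u else BLANK) + b + rest)

# toy machine: overwrite every 'a' with 'b' moving right, halt on a blank
delta = {("q0", "a"): ("q0", "b", "R"),
         ("q0", BLANK): ("halt", BLANK, "R")}

config = ("", "q0", "aaa")
while config[1] != "halt":
    config = step(config, delta)
print("".join(config))                     # bbb_halt (final configuration as a string)
```

In a full construction one would additionally need productions that generate an initial configuration and productions that tidy everything up once a halting state is reached; the sketch only illustrates the step-by-step rewriting.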
