Automaty a Gramatiky (Automata and Grammars)


Context-Free Grammars and Languages

Context-free grammars are more powerful than finite automata. They are used to describe document formats, e.g., via the DTD (document-type definition) used in XML (extensible markup language). We introduce the context-free grammar notation and parse trees. There exists a "pushdown automaton" model that accepts all and only the context-free languages; it will be described later.

Palindrome example

A palindrome is a string that reads the same forward and backward, like otto or "Madam, I'm Adam". A string w is a palindrome iff w = w^R.

The language Lpal of palindromes is not a regular language. We use the pumping lemma: if Lpal were regular, let n be the associated constant and consider w = 0^n 1 0^n. For a regular language we could break w = xyz such that y consists of one or more 0's from the first group; but then xz would also have to be in Lpal, and it is not. Hence Lpal is not regular.

A context-free grammar for palindromes over {0, 1}:
1. P → λ
2. P → 0
3. P → 1
4. P → 0P0
5. P → 1P1

A context-free grammar, like the one above, consists of one or more variables that represent classes of strings, i.e., languages.

Definition (Grammar)
A grammar G = (V, T, P, S) consists of:
- a finite set of terminal symbols (terminals) T, like {0, 1} in the previous example;
- a finite set of variables V (nonterminals, syntactic categories), like {P} in the previous example;
- a start symbol S, a variable that represents the language being defined, P in the previous example;
- a finite set of rules (productions) P that represent the recursive definition of the language. Each rule has the form αAβ → ω with A ∈ V and α, β, ω ∈ (V ∪ T)*; note that the left side (head) contains at least one variable.
A production consists of the head (the left side), the production symbol →, and the body (the right side).

Definition (Context-free grammar, CFG)
A context-free grammar (CFG) is a grammar G = (V, T, P, S) with only productions of the form A → α, A ∈ V, α ∈ (V ∪ T)*.

Chomsky hierarchy

Grammar types are classified according to the productions allowed:
- Type 0 (recursively enumerable languages L0): general rules α → β, α, β ∈ (V ∪ T)*, where α contains at least one variable.
- Type 1 (context-sensitive languages L1): productions of the form αAβ → αωβ, A ∈ V, α, β ∈ (V ∪ T)*, ω ∈ (V ∪ T)+, with the only exception S → λ, in which case S does not appear on the right side of any production.
- Type 2 (context-free languages L2): productions of the form A → ω, A ∈ V, ω ∈ (V ∪ T)*.
- Type 3 (regular, i.e. right-linear, languages L3): productions of the form A → ωB, A → ω, A, B ∈ V, ω ∈ T*.

The classes of languages are ordered L0 ⊇ L1 ⊇ L2 ⊇ L3; later we show the inclusions are proper, L0 ⊃ L1 ⊃ L2 ⊃ L3.
- L0 ⊇ L1: recursively enumerable languages contain the context-sensitive ones, since the context-sensitive productions αAβ → αωβ have a variable A in the head and are therefore allowed Type 0 rules.
- L2 ⊇ L3: context-free languages contain the regular ones, since productions A → ωB and A → ω are special cases of A → α with α ∈ (V ∪ T)*.
- L1 ⊇ L2: context-sensitive languages contain the context-free ones; we first have to eliminate rules A → λ, which we can do (shown later).

Derivations Using a Grammar

Definition (One-step derivation)
Suppose G = (V, T, P, S) is a grammar. Let α, ω, η, ν ∈ (V ∪ T)* and let α → ω be a production rule of G. Then one derivation step is ηαν ⇒G ηων, or just ηαν ⇒ ηων.

We extend ⇒ to any number of derivation steps as follows.

Definition (Derivation ⇒*)
Let G = (V, T, P, S) be a CFG.
Basis: any string α ∈ (V ∪ T)* derives itself, α ⇒*G α.
Induction: if α ⇒*G β and β ⇒G γ, then α ⇒*G γ.
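To make the derivation relation concrete, here is a minimal Python sketch (the function name derive_step and the dictionary PAL_GRAMMAR are our own choices, not from the slides) that applies one production at a time to the leftmost occurrence of the variable P in the palindrome grammar above, reproducing the derivation P ⇒ 0P0 ⇒ 01P10 ⇒ 0110.

```python
# A minimal sketch: one derivation step  ηAν ⇒ ηων  for a CFG, applied to the
# palindrome grammar P -> λ | 0 | 1 | 0P0 | 1P1.  Names are illustrative only.

PAL_GRAMMAR = {"P": ["", "0", "1", "0P0", "1P1"]}

def derive_step(sentential: str, var: str, body: str, grammar=PAL_GRAMMAR) -> str:
    """Apply the production var -> body to the leftmost occurrence of var."""
    assert body in grammar[var], "not a production of the grammar"
    i = sentential.index(var)
    return sentential[:i] + body + sentential[i + 1:]

# P ⇒ 0P0 ⇒ 01P10 ⇒ 0110   (the last step uses P -> λ)
s = "P"
for body in ("0P0", "1P1", ""):
    s = derive_step(s, "P", body)
    print(s or "λ")   # prints: 0P0, 01P10, 0110
```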
If the grammar G is understood, we write ⇒* in place of ⇒*G.

Example (derivation E ⇒* a ∗ (a + b00), using the expression grammar with productions E → E+E | E∗E | (E) | I and I → a | b | Ia | Ib | I0 | I1):
E ⇒ E∗E ⇒ I∗E ⇒ a∗E ⇒ a∗(E) ⇒ a∗(E+E) ⇒ a∗(I+E) ⇒ a∗(a+E) ⇒ a∗(a+I) ⇒ a∗(a+I0) ⇒ a∗(a+I00) ⇒ a∗(a+b00)

The Language of a Grammar, Notation Convention for CFG Derivations

Notation convention:
- a, b, c: terminals
- A, B, C: variables
- w, z: strings of terminals
- X, Y: either terminals or variables
- α, β, γ: strings of terminals and/or variables

Definition (The Language of a Grammar)
Let G = (V, T, P, S) be a CFG. The language L(G) of G is the set of terminal strings that have derivations from the start symbol:
L(G) = {w ∈ T* | S ⇒*G w}.
The language of a variable A ∈ V is defined as L(A) = {w ∈ T* | A ⇒*G w}.

Example (not a CFL)
L = {0^n 1^n 2^n | n ≥ 1} is not context-free; no CFG generates it.

Type 3 grammars and regular languages

Productions have the form A → wB, A → w, A, B ∈ V, w ∈ T*.
An example of a derivation, with P: S → 0S | 1A | λ, A → 0A | 1B, B → 0B | 1S:
S ⇒ 0S ⇒ 01A ⇒ 011B ⇒ 0110B ⇒ 01101S ⇒ 01101

Observations:
- each sentential form contains exactly one variable (except the last one)
- the variable is always at the rightmost position
- the production A → w is the last one used in the derivation
- every step generates a terminal string and (possibly) changes the variable

The relation between such a grammar and a finite automaton:
- variable = state of the finite automaton
- productions = transition function

Example of the reduction of an FA to a grammar

Example (G and FA for binary numbers divisible by 5)
L = {w | w ∈ {0, 1}* and w, read as a binary number, is divisible by 5}

(The slide shows the transition diagram of a five-state automaton; states A, B, C, D, E correspond to remainders 0, 1, 2, 3, 4.)

A → 1B | 0A | λ
B → 0C | 1D
C → 0E | 1A
D → 0B | 1C
E → 0D | 1E

Derivation examples:
A ⇒ 0A ⇒ 0 (0)
A ⇒ 1B ⇒ 10C ⇒ 101A ⇒ 101 (5)
A ⇒ 1B ⇒ 10C ⇒ 101A ⇒ 1010A ⇒ 1010 (10)
A ⇒ 1B ⇒ 11D ⇒ 111C ⇒ 1111A ⇒ 1111 (15)

FA to Grammar reduction

Theorem (L regular ⇒ L ∈ L3)
For any language recognized by a finite automaton there exists a Type 3 grammar generating that language.

Proof: FA to Grammar reduction
Let L = L(A) for some automaton A = (Q, Σ, δ, q0, F). We define a grammar G = (Q, Σ, P, q0) with productions P:
- p → aq iff δ(p, a) = q
- p → λ iff p ∈ F

Is L(A) = L(G)?
λ ∈ L(A) ⇔ q0 ∈ F ⇔ (q0 → λ) ∈ P ⇔ λ ∈ L(G)
a1 … an ∈ L(A) ⇔ there exist q0, …, qn ∈ Q such that δ(qi, ai+1) = qi+1 and qn ∈ F ⇔ (q0 ⇒ a1q1 ⇒ … ⇒ a1 … an qn ⇒ a1 … an) is a derivation of a1 … an in G ⇔ a1 … an ∈ L(G)

Grammar to FA reduction

For the opposite direction we aim to construct an automaton from a grammar:
- productions A → aB are encoded into the transition function
- productions A → λ define the accepting states
- productions A → a1 … anB and A → a1 … an with more than one terminal are first rewritten: we introduce new variables H2, …, Hn and define productions A → a1H2, H2 → a2H3, …, Hn → anB, or A → a1H2, H2 → a2H3, …, Hn → an
- productions A → B correspond to λ-transitions
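The construction in the FA-to-grammar proof is mechanical: every transition δ(p, a) = q becomes a production p → aq, and every accepting state p contributes p → λ. The following Python sketch (all names are our own, assumed for illustration) builds the right-linear grammar for the divisible-by-5 automaton above and checks words by following the unique derivation that mirrors the automaton's run.

```python
# A minimal sketch of the FA-to-grammar construction, shown on the
# divisible-by-5 automaton: state = remainder mod 5, on bit b go to (2r + b) % 5.

delta = {
    ("A", "0"): "A", ("A", "1"): "B",
    ("B", "0"): "C", ("B", "1"): "D",
    ("C", "0"): "E", ("C", "1"): "A",
    ("D", "0"): "B", ("D", "1"): "C",
    ("E", "0"): "D", ("E", "1"): "E",
}
accepting = {"A"}

productions = {}
for (p, a), q in delta.items():
    productions.setdefault(p, []).append(a + q)   # p -> aq  iff  delta(p, a) = q
for p in accepting:
    productions.setdefault(p, []).append("")      # p -> λ   iff  p is accepting

print(productions["A"])   # ['0A', '1B', ''], i.e. A -> 0A | 1B | λ

def generates(word: str, state: str = "A") -> bool:
    """Follow the right-linear derivation that mirrors the (deterministic) run."""
    if word == "":
        return "" in productions[state]           # the last rule must be p -> λ
    return generates(word[1:], delta[(state, word[0])])

assert generates("101") and generates("1111") and not generates("111")
```

Because the underlying automaton is deterministic, each word has exactly one candidate derivation, so the membership check is a simple recursion rather than a search.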
Standard form of a Type 3 grammar

Lemma
For any Type 3 grammar there exists a Type 3 grammar generating the same language in which all productions have the form A → aB, A → B, A → λ, where A, B ∈ V and a ∈ T.

Proof.
We define G' = (V', T, P', S); for each rule of P we introduce new variables Y2, …, Yn or Z1, …, Zn as needed, and define P' as follows:
- a rule A → aB stays A → aB
- a rule A → λ stays A → λ
- a rule A → a1 … anB becomes A → a1Y2, Y2 → a2Y3, …, Yn → anB
- a rule Z → a1 … an becomes Z → a1Z1, Z1 → a2Z2, …, Zn−1 → anZn, Zn → λ

We may also eliminate the rules A → B: compute the transitive closure U(A) = {B | B ∈ V and A ⇒* B} and add the rule A → w for every Z ∈ U(A) with (Z → w) ∈ P'.

Theorem (Reduction of a Type 3 grammar to a λ-NFA)
For any language L generated by a Type 3 grammar there exists a λ-NFA recognizing the same language.

Proof: Reduction of a Type 3 grammar to a λ-NFA
We take a grammar G = (V, T, P, S) generating L with all productions of the form A → aB, A → B, A → λ, A, B ∈ V, a ∈ T (by the previous lemma), and define a non-deterministic λ-NFA A = (V, T, δ, {S}, F) where:
- F = {A | (A → λ) ∈ P}
- δ(A, a) = {B | (A → aB) ∈ P}
- δ(A, λ) = {B | (A → B) ∈ P}

Then L(G) = L(A):
λ ∈ L(G) ⇔ (S → λ) ∈ P ⇔ S ∈ F ⇔ λ ∈ L(A)
a1 … an ∈ L(G) ⇔ there exists a derivation S ⇒ a1H1 ⇒ … ⇒ a1 … anHn ⇒ a1 … an ⇔ there exist H0, …, Hn ∈ V such that H0 = S, Hn ∈ F, and Hi+1 ∈ δ(Hi, ak) for a step a1 … ak−1 Hi ⇒ a1 … ak−1 ak Hi+1, or Hi+1 ∈ δ(Hi, λ) for a step a1 … ak Hi ⇒ a1 … ak Hi+1 ⇔ a1 … an ∈ L(A)
A small code sketch of this construction is given at the end of this section.

Left (and right) linear grammars

Definition (Left and right linear grammars)
Type 3 grammars are also called right-linear (the variable is always on the right). A grammar G is left-linear iff all productions have the form A → Bw or A → w, A, B ∈ V, w ∈ T*.
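Conversely, the reduction of a standard-form Type 3 grammar to a λ-NFA from the theorem above is just as mechanical. The sketch below (helper names such as closure and accepts are ours) builds F and δ from the productions of the example grammar S → 0S | 1A | λ, A → 0A | 1B, B → 0B | 1S used earlier and decides membership via λ-closures, as in the proof.

```python
# A minimal sketch of the grammar-to-λ-NFA reduction for a standard-form grammar
# (rules A -> aB, A -> B, A -> λ): states = variables, F = {A | A -> λ},
# B ∈ δ(A, a) iff A -> aB, and B ∈ δ(A, λ) iff A -> B.
from itertools import chain

rules = [("S", "0S"), ("S", "1A"), ("S", ""),   # S -> 0S | 1A | λ
         ("A", "0A"), ("A", "1B"),               # A -> 0A | 1B
         ("B", "0B"), ("B", "1S")]               # B -> 0B | 1S

final = {a for a, body in rules if body == ""}
delta = {}                                       # (state, symbol or "") -> set of states
for a, body in rules:
    if len(body) == 2:                           # A -> aB
        delta.setdefault((a, body[0]), set()).add(body[1])
    elif len(body) == 1 and body.isupper():      # A -> B  (λ-transition)
        delta.setdefault((a, ""), set()).add(body)

def closure(states):
    """λ-closure of a set of states."""
    todo, seen = list(states), set(states)
    while todo:
        q = todo.pop()
        for r in delta.get((q, ""), ()):
            if r not in seen:
                seen.add(r); todo.append(r)
    return seen

def accepts(word: str) -> bool:
    current = closure({"S"})
    for ch in word:
        current = closure(set(chain.from_iterable(
            delta.get((q, ch), ()) for q in current)))
    return bool(current & final)

assert accepts("01101") and accepts("") and not accepts("011")
```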