Towards more complex grammar systems: Some basic theory

Detmar Meurers: Intro to Computational Linguistics I
OSU, LING 684.01, 15. January 2004

Grammars

A grammar is a 4-tuple (N, Σ, S, P) where

• N is a finite set of non-terminals

• Σ is a finite set of terminal symbols, with N ∩ Σ = ∅

• S ∈ N is a distinguished start symbol

• P is a finite set of rewrite rules of the form α → β, with α, β ∈ (N ∪ Σ)∗ and α including at least one non-terminal symbol.

Overview

• Grammars, or: how to specify linguistic knowledge

• Automata, or: how to process with linguistic knowledge

• Levels of complexity in grammars and automata

A simple example

N = {S, NP, VP, Vi, Vt, Vs}
Σ = {John, Mary, laughs, loves, thinks}
S = S

P = { S → NP VP,    NP → John,    NP → Mary,
      VP → Vi,      Vi → laughs,
      VP → Vt NP,   Vt → loves,
      VP → Vs S,    Vs → thinks }
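Read concretely, these rules are a recipe for generating sentences. A minimal sketch in Python (the dict encoding and the function name are my own, not from the slides): starting from S, replace each non-terminal by one of its right-hand sides until only terminals remain.

```python
import random

# The example grammar from above, encoded as a dict from non-terminal
# to its alternative right-hand sides (a symbol is terminal iff it is
# not a key of the dict).
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["John"], ["Mary"]],
    "VP": [["Vi"], ["Vt", "NP"], ["Vs", "S"]],
    "Vi": [["laughs"]],
    "Vt": [["loves"]],
    "Vs": [["thinks"]],
}

def generate(symbol="S"):
    """Expand `symbol` by randomly chosen rules until only terminals remain."""
    if symbol not in GRAMMAR:              # terminal symbol: emit it
        return [symbol]
    rhs = random.choice(GRAMMAR[symbol])   # pick one rewrite rule
    return [tok for sym in rhs for tok in generate(sym)]

print(" ".join(generate()))  # e.g. "Mary thinks John loves Mary"
```

Note that the rule VP → Vs S makes the grammar recursive, so the language is infinite and a random derivation can in principle grow long (it terminates with probability 1).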

How does a grammar define a language?

Assume α, β ∈ (N ∪ Σ)∗, with α containing at least one non-terminal.

• A sentential form for a grammar G is defined as:
  − The start symbol S of G is a sentential form.
  − If αβγ is a sentential form and there is a rewrite rule β → δ, then αδγ is a sentential form.

• α (directly or immediately) derives β if α → β ∈ P. One writes:
  − α ⇒∗ β if β is derived from α in zero or more steps
  − α ⇒+ β if β is derived from α in one or more steps

• A sentence is a sentential form consisting only of terminal symbols.

• The language L(G) generated by the grammar G is the set of all sentences which can be derived from the start symbol S, i.e., L(G) = {γ | S ⇒∗ γ}

Different levels of complexity in grammars and automata

Let A, B ∈ N, x ∈ Σ, α, β, γ ∈ (Σ ∪ N)∗, and δ ∈ (Σ ∪ N)+, then:

Type   Automaton               Grammar
       Memory      Name        Rule             Name
0      Unbounded   TM          α → β            General rewrite
1      Bounded     LBA         βAγ → βδγ        Context-sensitive
2      Stack       PDA         A → β            Context-free
3      None        FSA         A → xB, A → x    Right linear

Abbreviations:
– TM: Turing Machine
– LBA: Linear-Bounded Automaton
– PDA: Push-Down Automaton
– FSA: Finite-State Automaton
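As an illustration of the table (not part of the slides' own machinery), rule shapes can be classified mechanically. A rough sketch in Python, using the monotonicity condition |α| ≤ |β| as the type-1 test and ignoring ε-rule special cases:

```python
def rule_type(lhs, rhs, nonterminals):
    """Chomsky type of a single rule lhs -> rhs (tuples of symbols).

    Type 1 is approximated by the monotonicity condition |lhs| <= |rhs|;
    epsilon-rule special cases are ignored in this sketch."""
    if len(lhs) == 1 and lhs[0] in nonterminals:
        nts = [s for s in rhs if s in nonterminals]
        if not nts or (len(nts) == 1 and rhs and rhs[-1] in nonterminals):
            return 3   # at most one non-terminal, as rightmost symbol
        return 2       # single non-terminal on the left: context-free
    if len(lhs) <= len(rhs):
        return 1       # monotonic rule: context-sensitive form
    return 0           # anything else: general rewrite

def grammar_type(rules, nonterminals):
    # a grammar is only as restricted as its least restricted rule
    return min(rule_type(l, r, nonterminals) for l, r in rules)
```

For example, the simple grammar above comes out as type 2: rules such as NP → John are right-linear on their own, but S → NP VP already needs the context-free format.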

Processing with grammars: automata

An automaton in general has three components:

• an input tape, divided into squares, with a read-write head positioned over one of the squares

• an auxiliary memory, characterized by two functions
  − fetch: memory configuration → symbols
  − store: memory configuration × symbol → memory configuration

• and a finite-state control relating the two components.

Type 3: Right-Linear Grammars and FSAs

A right-linear grammar is a 4-tuple (N, Σ, S, P) with P a finite set of rewrite rules of the form α → β, with α ∈ N and β ∈ {γδ | γ ∈ Σ∗, δ ∈ N ∪ {ε}}, i.e.:

− left-hand side of rule: a single non-terminal, and
− right-hand side of rule: a string containing at most one non-terminal, as the rightmost symbol

Right-linear grammars are formally equivalent to left-linear grammars.

A finite-state automaton consists of
– a tape
– a finite-state control
– no auxiliary memory

An example: (ab|c)ab*(a|cb)?

Right-linear grammar:

N = {Expr, X, Y, Z}
Σ = {a, b, c}
S = Expr

P = { Expr → abX,   X → aY,   Z → a,
      Expr → cX,    Y → bY,   Z → cb,
                    Y → Z,    Z → ε }

Finite-state transition network: (diagram with states 0–5, accepting the same language as the regular expression above)

Type 2: Context-Free Grammars and Push-Down Automata

A context-free grammar is a 4-tuple (N, Σ, S, P) with P a finite set of rewrite rules of the form α → β, with α ∈ N and β ∈ (Σ ∪ N)∗, i.e.:

− left-hand side of rule: a single non-terminal, and
− right-hand side of rule: a string of terminals and/or non-terminals

A push-down automaton is a
− finite-state automaton, with a
− stack as auxiliary memory
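The grammar–automaton correspondence for type 3 can be made concrete: each right-linear rule A → wB is read as a transition "in state A, consume w, go to state B", and rules without a non-terminal lead to acceptance. A minimal sketch (encoding and names are my own, not from the slides):

```python
# Each right-linear rule A -> wB (w a terminal string, B a non-terminal
# or absent) becomes a transition; None marks an accepting rule.
RULES = {
    "Expr": [("ab", "X"), ("c", "X")],
    "X":    [("a", "Y")],
    "Y":    [("b", "Y"), ("", "Z")],
    "Z":    [("a", None), ("cb", None), ("", None)],
}

def accepts(s, state="Expr"):
    """Nondeterministically follow the rules; True iff s is in the language."""
    for prefix, nxt in RULES[state]:
        if s.startswith(prefix):
            rest = s[len(prefix):]
            if nxt is None and rest == "":
                return True                 # accepting rule used up the input
            if nxt is not None and accepts(rest, nxt):
                return True
    return False

print(accepts("cab"), accepts("ab"))  # True False
```

Since the grammar was written for the regular expression (ab|c)ab*(a|cb)?, the recognizer should agree with that expression on every string.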


Thinking about regular languages

− A language is regular iff one can define an FSA (or regular expression) for it.

− An FSA only has a fixed amount of memory, namely its number of states.

− Strings longer than the number of states, in particular the arbitrarily long strings of any infinite language, must result from a loop in the FSA.

− Pumping Lemma: if for such strings there is no loop that can produce them, the language cannot be regular.

A context-free language example: aⁿbⁿ

Context-free grammar:

N = {S}
Σ = {a, b}
S = S

P = { S → aSb,
      S → ε }

Push-down automaton: two states, 0 and 1; reading an a pushes x onto the stack, reading a b pops x.
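The two-state push-down automaton just described can be simulated directly. A minimal sketch (assuming the design above: state 0 reads a's and pushes, the first b switches to state 1, which only pops):

```python
def accepts_anbn(s):
    """Two-state PDA sketch for a^n b^n (n >= 0)."""
    stack, state = [], 0
    for ch in s:
        if state == 0 and ch == "a":
            stack.append("x")        # a: push x
        elif ch == "b" and stack:
            state = 1                # the first b ends the a-reading phase
            stack.pop()              # b: pop x
        else:
            return False             # symbol out of order, or unmatched b
    return not stack                 # accept iff every a found its b

print(accepts_anbn("aaabbb"), accepts_anbn("aab"))  # True False
```

The stack is exactly the "counting" memory an FSA lacks: it grows with the input, which is why aⁿbⁿ is context-free but not regular.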

Type 1: Context-Sensitive Grammars and Linear-Bounded Automata

A rule of a context-sensitive grammar

– rewrites at most one non-terminal from the left-hand side.

– has a right-hand side at least as long as the left-hand side, i.e. the grammar only contains rules of the form α → β with |α| ≤ |β|, optionally together with S → ε, provided the start symbol S does not occur in any β.

A linear-bounded automaton is a
– finite-state automaton, with an
– auxiliary memory which cannot exceed the length of the input string.

Type 0: General Rewrite Grammars and Turing Machines

• In a general rewrite grammar there are no restrictions on the form of a rewrite rule.

• A Turing machine has an unbounded auxiliary memory.

• Any language for which there is a recognition procedure can be defined, but the recognition problem is not decidable.

A context-sensitive language example: aⁿbⁿcⁿ

Context-sensitive grammar:

N = {S, B, C}
Σ = {a, b, c}
S = S

P = { S → aSBC,
      S → abC,
      bB → bb,
      bC → bc,
      cC → cc,
      CB → BC }

Properties of different language classes

Languages are sets of strings, so that one can apply set operations to languages and investigate the results for particular language classes.

Some closure properties:

− All language classes are closed under union with themselves.

− All language classes are closed under intersection with regular languages.

− The class of context-free languages is not closed under intersection with itself.

  Proof: The intersection of the two context-free languages L1 and L2 is not context-free:
  − L1 = {aⁿbⁿcⁱ | n ≥ 1 and i ≥ 0}
  − L2 = {aʲbⁿcⁿ | n ≥ 1 and j ≥ 0}
  − L1 ∩ L2 = {aⁿbⁿcⁿ | n ≥ 1}
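A derivation in the context-sensitive grammar for aⁿbⁿcⁿ can be traced mechanically by string rewriting. A small sketch (helper name is my own) deriving a²b²c², applying the rules above in one workable order:

```python
def rewrite(form, lhs, rhs):
    """Apply the rule lhs -> rhs at the leftmost occurrence of lhs."""
    i = form.index(lhs)                  # fails if the rule is not applicable
    return form[:i] + rhs + form[i + len(lhs):]

# S => aSBC => aabCBC => aabBCC => aabbCC => aabbcC => aabbcc
form = "S"
for lhs, rhs in [("S", "aSBC"), ("S", "abC"), ("CB", "BC"),
                 ("bB", "bb"), ("bC", "bc"), ("cC", "cc")]:
    form = rewrite(form, lhs, rhs)
print(form)  # aabbcc
```

The CB → BC rule does the real work: it sorts the non-terminals so that bB and bC rules can turn them into terminals in the right order.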

Criteria under which to evaluate grammar formalisms

There are three kinds of criteria:
– linguistic naturalness
– mathematical power
– computational effectiveness and efficiency

The weaker the type of grammar:
– the stronger the claim made about possible languages
– the greater the potential efficiency of the parsing procedure

Reasons for choosing a stronger grammar class:
– to capture the empirical reality of actual languages
– to provide for elegant analyses capturing more generalizations (→ more “compact” grammars)

Language classes and natural languages
Natural languages are not regular

(1) a. The mouse escaped.
    b. The mouse that the cat chased escaped.
    c. The mouse that the cat that the dog saw chased escaped.
    d. …

(2) a. aa
    b. abba
    c. abccba
    d. …

Center-embedding of arbitrary depth needs to be captured to characterize language competence.
→ Not possible with a finite-state automaton.

Language classes and natural languages (cont.)

• Any finite language is a regular language.

• The argument that natural languages are not regular relies on competence as an idealization, not performance.

• Note that even if English were regular, a context-free grammar characterization could be preferable on the grounds that it is more transparent than one using only finite-state methods.

Accounting for the facts vs. linguistically sensible analyses

Looking at grammars from a linguistic perspective, one can distinguish their

− weak generative capacity, considering only the set of strings generated by a grammar

− strong generative capacity, considering the set of strings and their syntactic analyses generated by a grammar

Two grammars can be strongly or weakly equivalent.

Example for weakly equivalent grammars

Example string:

if x then if y then a else b

Grammar 1:

P = { S → if T then S else S,
      S → if T then S,
      S → a,
      S → b,
      T → x,
      T → y }

First analysis (as labeled bracketing):

[S if [T x] then [S if [T y] then [S a]] else [S b]]

Second analysis:

[S if [T x] then [S if [T y] then [S a] else [S b]]]

Grammar 2: a weakly equivalent grammar eliminating the ambiguity (it licenses only the second structure).

P = { S1 → if T then S1,
      S1 → if T then S2 else S1,
      S1 → a,
      S1 → b,
      S2 → if T then S2 else S2,
      S2 → a,
      S2 → b,
      T → x,
      T → y }

Reading assignment

• Chapter 2 “Basic Formal Language Theory” of our Lecture Notes

• Chapter 3 “Formal Languages and Natural Languages” of our Lecture Notes

• Chapter 13 “Language and complexity” of Jurafsky and Martin (2000)
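As a closing illustration, the ambiguity claim about the two grammars can be checked mechanically. A brute-force sketch in Python (my own encoding, fine for toy grammars only): enumerate every parse of the example string and count complete ones.

```python
def parses(sym, toks, grammar):
    """All ways to parse `sym` at the front of `toks`: (tree, rest) pairs."""
    if sym in grammar:                       # non-terminal: try each rule
        results = []
        for rhs in grammar[sym]:
            partial = [([], toks)]
            for part in rhs:                 # match the rhs left to right
                partial = [(trees + [t], rest2)
                           for trees, rest in partial
                           for t, rest2 in parses(part, rest, grammar)]
            results += [((sym, tuple(trees)), rest) for trees, rest in partial]
        return results
    # terminal: must equal the next token
    return [((sym,), toks[1:])] if toks and toks[0] == sym else []

def all_trees(grammar, start, toks):
    """Parses that consume the whole token list."""
    return [tree for tree, rest in parses(start, toks, grammar) if not rest]

G1 = {"S": [["if", "T", "then", "S", "else", "S"],
            ["if", "T", "then", "S"], ["a"], ["b"]],
      "T": [["x"], ["y"]]}

G2 = {"S1": [["if", "T", "then", "S1"],
             ["if", "T", "then", "S2", "else", "S1"], ["a"], ["b"]],
      "S2": [["if", "T", "then", "S2", "else", "S2"], ["a"], ["b"]],
      "T": [["x"], ["y"]]}

toks = "if x then if y then a else b".split()
print(len(all_trees(G1, "S", toks)),
      len(all_trees(G2, "S1", toks)))  # prints "2 1"
```

Grammar 1 yields two trees for the example string, Grammar 2 exactly one: the two grammars are weakly but not strongly equivalent.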