Theorem Proving in Classical Logic

Total Pages: 16

File Type: PDF, Size: 1020 KB

Theorem Proving in Classical Logic
MEng Individual Project
Imperial College London, Department of Computing
Author: David Davies
Supervisor: Dr. Steffen van Bakel
Second Marker: Dr. Nicolas Wu
June 16, 2021

Abstract

It is well known that functional programming and logic are deeply intertwined. This has led to many systems capable of expressing both propositional and first order logic, that also operate as well-typed programs. What currently ties popular theorem provers together is their basis in intuitionistic logic, where one cannot prove the law of the excluded middle, ‘A ∨ ¬A’ – that any proposition is either true or false. In classical logic this notion is provable, and the corresponding programs turn out to be those with control operators. In this report, we explore and expand upon the research about calculi that correspond with classical logic, and the problems that occur for those relating to first order logic. To see how these calculi behave in practice, we develop and implement functional languages for propositional and first order logic, expressing classical calculi in the setting of a theorem prover, much like Agda and Coq. In the first order language, users are able to define inductive data and record types; importantly, they are able to write computable programs that have a correspondence with classical propositions.

Acknowledgements

I would like to thank Steffen van Bakel, my supervisor, for his support throughout this project and for helping me find a topic of study based on my interests, for which I am incredibly grateful. His insight and advice have been invaluable. I would also like to thank my second marker, Nicolas Wu, for introducing me to the world of dependent types and for suggesting useful resources that have aided me greatly during this report. Finally, I’d like to thank my friends and family for supporting me throughout my time at Imperial College.

Contents

1 Introduction 1
  1.1 Report Outline 1
  1.2 Contributions 2
2 Logic 3
  2.1 Propositional Logic 3
  2.2 First Order Logic 4
  2.3 Natural Deduction 4
    2.3.1 Logical Subsystems 4
    2.3.2 Conjunction and Disjunction 6
    2.3.3 First Order Natural Deduction 7
  2.4 Sequent Calculus 8
    2.4.1 Structural Rules 8
    2.4.2 LK and LJ 8
3 λ-Calculus 10
  3.1 The Untyped λ-Calculus 10
  3.2 The Simply Typed λ-Calculus 11
  3.3 Principal Types 12
    3.3.1 Unification 12
    3.3.2 Principal Type 13
  3.4 The Curry-Howard Isomorphism 13
  3.5 Proof-Term Syntax for Intuitionistic Propositional Logic 14
    3.5.1 Inhabiting the Logic with Syntax 14
    3.5.2 STLC×+ for IPL 15
4 λµ-Calculus 16
  4.1 Evaluation Contexts 16
  4.2 λµ-Terms 16
  4.3 Type Assignments for the λµ-Calculus 18
    4.3.1 Types and Contexts 18
    4.3.2 Type Assignments 18
  4.4 A Proof-Term Syntax for Classical Propositional Logic 19
    4.4.1 Translating Derivations 20
    4.4.2 Towards a Complete Logic 21
  4.5 Sums and Products 22
  4.6 λµ-Calculus as a Natural Deduction System 23
5 Dependent Types 25
  5.1 Intuitionistic Type Theory 25
  5.2 The Problem of Control 28
  5.3 Avoiding the Degeneracy 28
    5.3.1 dPAω 28
    5.3.2 Reductions 30
    5.3.3 Type Assignments 30
  5.4 Using dPAω 31
  5.5 Inductive Families 32
    5.5.1 Defining Inductive Families 32
6 Simple Types with Name Polymorphism in λµ 35
  6.1 Typing Algorithms for λµ-calculus 35
    6.1.1 Towards a Principal Pairing Algorithm 35
    6.1.2 Principal Pairing Definitions 36
  6.2 A Theorem Prover for Classical Propositional Logic 38
    6.2.1 Syntax 38
    6.2.2 Type Assignments for λµN 38
7 ECCλµ: Dependently Typed λµ-calculus 42
  7.1 Some formalisms 42
  7.2 Type System 42
    7.2.1 Syntax 43
    7.2.2 Reductions 44
    7.2.3 Type Assignments 44
    7.2.4 Properties 46
  7.3 Dependent Algebraic Data Types 47
    7.3.1 Inductive Families 48
    7.3.2 Inductive Records 50
    7.3.3 Reductions for (Co)Inductive Types 51
  7.4 Typing Algorithm 52
    7.4.1 Weak Head Normal Form 52
    7.4.2 Bidirectional Algorithm 53
    7.4.3 nef Rules 54
8 Implementation 56
  8.1 Variables and Representing Terms 58
    8.1.1 De Bruijn Indices 58
    8.1.2 Unbound – Locally Nameless 58
    8.1.3 Using Unbound 59
    8.1.4 Structural Substitution 59
  8.2 Syntax 60
    8.2.1 Parsing 60
    8.2.2 User Syntax 60
    8.2.3 AST and Parser Errors 61
  8.3 Type Checking Monad 61
  8.4 Simply Typed Theorem Prover 62
    8.4.1 Name Polymorphism and Type Instantiation 62
    8.4.2 Expressing Logic 63
    8.4.3 Diagram 63
  8.5 Dependently Typed Theorem Prover 63
    8.5.1 Evaluation 63
    8.5.2 (Co)Data 63
  8.6 REPL 64
9 Evaluation 66
  9.1 λµN 66
    9.1.1 Theory 66
    9.1.2 Implementation 66
  9.2 ECCλµ 66
    9.2.1 Theory 66
    9.2.2 Implementation 68
10 Ethical Discussion 69
11 Conclusion 70
  11.1 Review 70
  11.2 Future Work 70
    11.2.1 Simple Types 70
    11.2.2 Dependent Types 71
A λµN 73
  A.1 Proofs 73
  A.2 Implementation 82
    A.2.1 Natural Deduction 82
B ECCλµ 84
  B.1 Proofs 84
  B.2 Type Systems 93
  B.3 Bidirectional Algorithms for ECCλµ 94
  B.4 Derivations 97
  B.5 Implementation 98
    B.5.1 Syntax 98
References 98

1 | Introduction

Proof assistants are gaining in popularity. From helping to teach students how to write mathematical proofs to large-scale projects like checking compiler correctness [13], they are seeing more use in the computing world. Proof assistants are programming languages that correspond with a formal logic, and they come in different shapes and forms; Coq [72] has a particular focus on the theorem proving aspect, where proofs can be written with intuitive tactics, whereas the language Agda [59] is first and foremost a functional programming language, like Haskell. Under the hood, theorem provers ensure proof correctness by a strong type system. The mainstream languages are all based on so-called intuitionistic type theory [46]. They all capitalise on the astonishing fact that functions in a functional programming language correspond to proofs in intuitionistic logic, and the types of these functions correspond to the propositions that are derived by said proofs. Under this correspondence, type-checking a function is the same operation as checking a proof of a proposition [77]. Intuitionism inescapably ties these languages to the fact that they are unable to prove a simple logical notion: that any proposition is either true or false. This is known as the law of the excluded middle (lem), and it characterises the logic we call classical logic [23]. Intuitionistic logic rejects this notion, and is instead based on the idea that a proposition holds only when we can construct a proof of it.
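To make the link between the excluded middle and control operators concrete before the excerpts below, here is a small illustrative Haskell sketch (not taken from the report). It uses the continuation monad and callCC from the mtl library; the name lem and the encoding of "A or not A" as Either a (a -> Cont r b) are our own choices for the example.

    import Control.Monad.Cont (Cont, callCC, runCont)

    -- Read 'Either a (a -> Cont r b)' as "A or not A", where "not A" is
    -- "A implies an arbitrary result via the current continuation".
    -- callCC plays the role of the control operator.
    lem :: Cont r (Either a (a -> Cont r b))
    lem = callCC $ \k ->
      -- First claim "not A"; if evidence for A ever arrives, jump back
      -- through the captured continuation k and answer "A" instead.
      return (Right (\a -> k (Left a)))

    -- Running it just shows which branch the caller observes first.
    demo :: String
    demo = runCont lem $ \answer -> case answer of
      Left  _ -> "a proof of A"
      Right _ -> "a refutation of A (revocable if A is ever supplied)"

Intuitionistic systems such as Agda and Coq can only postulate such a term as an axiom; calculi with control operators, such as the λµ-calculus studied in this report, can compute with it.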
Recommended publications
  • Chapter 2 Introduction to Classical Propositional
    CHAPTER 2 INTRODUCTION TO CLASSICAL PROPOSITIONAL LOGIC 1 Motivation and History The origins of the classical propositional logic, classical propositional calculus, as it was, and still often is called, go back to antiquity and are due to the Stoic school of philosophy (3rd century B.C.), whose most eminent representative was Chrysippus. But the real development of this calculus began only in the mid-19th century and was initiated by the research done by the English mathematician G. Boole, who is sometimes regarded as the founder of mathematical logic. The classical propositional calculus was first formulated as a formal axiomatic system by the eminent German logician G. Frege in 1879. The assumptions underlying the formalization of classical propositional calculus are the following. Logical sentences We deal only with sentences that can always be evaluated as true or false. Such sentences are called logical sentences or propositions. Hence the name propositional logic. A statement: 2 + 2 = 4 is a proposition, as we assume that it is a well known and agreed upon truth. A statement: 2 + 2 = 5 is also a proposition (a false one). A statement: I am pretty is modeled, if needed, as a logical sentence (proposition). We assume that it is false, or true. A statement: 2 + n = 5 is not a proposition; it might be true for some n, for example n = 3, false for other n, for example n = 2, and moreover, we don't know what n is. Sentences of this kind are called propositional functions. We model propositional functions within propositional logic by treating propositional functions as propositions.
    [Show full text]
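The chapter's distinction between propositions and propositional functions maps directly onto types in a functional language. A tiny Haskell sketch of the excerpt's own examples (the names are ours, not from the chapter):

    -- A proposition has a definite truth value.
    twoPlusTwoIsFour :: Bool
    twoPlusTwoIsFour = 2 + 2 == (4 :: Integer)       -- True

    -- "2 + n = 5" is a propositional function: it only becomes a
    -- proposition once n is fixed or quantified.
    twoPlusNIsFive :: Integer -> Bool
    twoPlusNIsFive n = 2 + n == 5

    -- Quantifying over a finite range yields a proposition again.
    existsSmallSolution :: Bool
    existsSmallSolution = any twoPlusNIsFive [0 .. 9]  -- True, witnessed by n = 3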
  • Aristotle's Assertoric Syllogistic and Modern Relevance Logic*
    Aristotle’s assertoric syllogistic and modern relevance logic* Abstract: This paper sets out to evaluate the claim that Aristotle’s Assertoric Syllogistic is a relevance logic or shows significant similarities with it. I prepare the grounds for a meaningful comparison by extracting the notion of relevance employed in the most influential work on modern relevance logic, Anderson and Belnap’s Entailment. This notion is characterized by two conditions imposed on the concept of validity: first, that some meaning content is shared between the premises and the conclusion, and second, that the premises of a proof are actually used to derive the conclusion. Turning to Aristotle’s Prior Analytics, I argue that there is evidence that Aristotle’s Assertoric Syllogistic satisfies both conditions. Moreover, Aristotle at one point explicitly addresses the potential harmfulness of syllogisms with unused premises. Here, I argue that Aristotle’s analysis allows for a rejection of such syllogisms on formal grounds established in the foregoing parts of the Prior Analytics. In a final section I consider the view that Aristotle distinguished between validity on the one hand and syllogistic validity on the other. Following this line of reasoning, Aristotle’s logic might not be a relevance logic, since relevance is part of syllogistic validity and not, as modern relevance logic demands, of general validity. I argue that the reasons to reject this view are more compelling than the reasons to accept it and that we can, cautiously, uphold the result that Aristotle’s logic is a relevance logic. Keywords: Aristotle, Syllogism, Relevance Logic, History of Logic, Logic, Relevance 1.
    [Show full text]
  • UNIT-I Mathematical Logic Statements and Notations
    UNIT-I Mathematical Logic Statements and notations: A proposition or statement is a declarative sentence that is either true or false (but not both). For instance, the following are propositions: “Paris is in France” (true), “London is in Denmark” (false), “2 < 4” (true), “4 = 7” (false). However the following are not propositions: “what is your name?” (this is a question), “do your homework” (this is a command), “this sentence is false” (neither true nor false), “x is an even number” (it depends on what x represents), “Socrates” (it is not even a sentence). The truth or falsehood of a proposition is called its truth value. Connectives: Connectives are used for making compound propositions. The main ones are the following (p and q represent given propositions): Negation ¬p (“not p”); Conjunction p ∧ q (“p and q”); Disjunction p ∨ q (“p or q (or both)”); Exclusive Or p ⊕ q (“either p or q, but not both”); Implication p → q (“if p then q”); Biconditional p ↔ q (“p if and only if q”). Truth Tables: Logical identity Logical identity is an operation on one logical value, typically the value of a proposition, that produces a value of true if its operand is true and a value of false if its operand is false. The truth table for the logical identity operator is: p = T gives T; p = F gives F. Logical negation Logical negation is an operation on one logical value, typically the value of a proposition, that produces a value of true if its operand is false and a value of false if its operand is true.
    [Show full text]
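As a companion to the list of connectives in the excerpt above, here is a small Haskell sketch (ours, not part of the unit notes) of a formula type together with an evaluator; the constructor and function names are illustrative only.

    import Data.Maybe (fromMaybe)

    -- Propositional formulas over named atoms.
    data Prop
      = Atom String
      | Not Prop
      | Prop :/\: Prop    -- conjunction
      | Prop :\/: Prop    -- disjunction
      | Prop :<+>: Prop   -- exclusive or
      | Prop :=>: Prop    -- implication
      | Prop :<=>: Prop   -- biconditional
      deriving Show

    -- Evaluate a formula under a truth assignment for its atoms.
    eval :: [(String, Bool)] -> Prop -> Bool
    eval env f = case f of
      Atom x     -> fromMaybe False (lookup x env)  -- unbound atoms default to False
      Not a      -> not (eval env a)
      a :/\: b   -> eval env a && eval env b
      a :\/: b   -> eval env a || eval env b
      a :<+>: b  -> eval env a /= eval env b
      a :=>: b   -> not (eval env a) || eval env b
      a :<=>: b  -> eval env a == eval env b

    -- The law of the excluded middle holds under every assignment:
    excludedMiddle :: Bool
    excludedMiddle = all (\v -> eval [("p", v)] (Atom "p" :\/: Not (Atom "p")))
                         [True, False]

Evaluating a formula under all 2^n assignments to its n atoms is exactly the truth-table method the notes go on to describe.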
  • 6.5 The Recursion Theorem
    6.5 The Recursion Theorem The Recursion Theorem, due to Kleene, is a fundamental result in recursion theory. Theorem 6.5.1 (Recursion Theorem, Version 1) Let ϕ_0, ϕ_1, . . . be any acceptable indexing of the partial recursive functions. For every total recursive function f, there is some n such that ϕ_n = ϕ_{f(n)}. The Recursion Theorem can be strengthened as follows. Theorem 6.5.2 (Recursion Theorem, Version 2) Let ϕ_0, ϕ_1, . . . be any acceptable indexing of the partial recursive functions. There is a total recursive function h such that for all x ∈ N, if ϕ_x is total, then ϕ_{ϕ_x(h(x))} = ϕ_{h(x)}. A third version of the Recursion Theorem is given below. Theorem 6.5.3 (Recursion Theorem, Version 3) For all n ≥ 1, there is a total recursive function h of n + 1 arguments, such that for all x ∈ N, if ϕ_x is a total recursive function of n + 1 arguments, then ϕ_{ϕ_x(h(x,x_1,...,x_n),x_1,...,x_n)} = ϕ_{h(x,x_1,...,x_n)}, for all x_1, . . . , x_n ∈ N. As a first application of the recursion theorem, we can show that there is an index n such that ϕ_n is the constant function with output n. Loosely speaking, ϕ_n prints its own name. Let f be the recursive function such that f(x, y) = x for all x, y ∈ N. By the s-m-n Theorem, there is a recursive function g such that ϕ_{g(x)}(y) = f(x, y) = x for all x, y ∈ N.
    [Show full text]
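For orientation (this reasoning is not part of the excerpt, but is the standard argument): Version 1 follows from the s-m-n Theorem. There is a total recursive d with ϕ_{d(x)} = ϕ_{ϕ_x(x)} whenever ϕ_x(x) is defined; given a total recursive f, choose an index e with ϕ_e = f ∘ d, and set n = d(e). Then ϕ_n = ϕ_{d(e)} = ϕ_{ϕ_e(e)} = ϕ_{f(d(e))} = ϕ_{f(n)}, as required.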
  • Sequent Calculus and the Specification of Computation
    Sequent Calculus and the Specification of Computation Lecture Notes Draft: 5 September 2001 These notes are being used with lectures for CSE 597 at Penn State in Fall 2001. These notes had been prepared to accompany lectures given at the Marktoberdorf Summer School, 29 July – 10 August 1997 for a course titled “Proof Theoretic Specification of Computation”. An earlier draft was used in courses given between 27 January and 11 February 1997 at the University of Siena and between 6 and 21 March 1997 at the University of Genoa. A version of these notes was also used at Penn State for CSE 522 in Fall 2000. For the most recent version of these notes, contact the author. © Dale Miller 321 Pond Lab Computer Science and Engineering Department The Pennsylvania State University University Park, PA 16802-6106 USA Office: +(814)865-9505 [email protected] http://www.cse.psu.edu/~dale Contents 1 Overview and Motivation 5 1.1 Roles for logic in the specification of computations . 5 1.2 Desiderata for declarative programming . 6 1.3 Relating algorithm and logic . 6 2 First-Order Logic 8 2.1 Types and signatures . 8 2.2 First-order terms and formulas . 8 2.3 Sequent calculus . 10 2.4 Classical, intuitionistic, and minimal logics . 10 2.5 Permutations of inference rules . 12 2.6 Cut-elimination . 12 2.7 Consequences of cut-elimination . 13 2.8 Additional readings . 13 2.9 Exercises . 13 3 Logic Programming Considered Abstractly 15 3.1 Problems with proof search . 15 3.2 Interpretation as goal-directed search .
    [Show full text]
  • Chapter 1 Preliminaries
    Chapter 1 Preliminaries 1.1 First-order theories In this section we recall some basic definitions and facts about first-order theories. Most proofs are postponed to the end of the chapter, as exercises for the reader. 1.1.1 Definitions Definition 1.1 (First-order theory) — A first-order theory T is defined by: • A first-order language L, whose terms (notation: t, u, etc.) and formulas (notation: φ, ψ, etc.) are constructed from a fixed set of function symbols (notation: f, g, h, etc.) and of predicate symbols (notation: P, Q, R, etc.) using the following grammar:
      Terms     t ::= x | f(t1, . . . , tk)
      Formulas  φ, ψ ::= t1 = t2 | P(t1, . . . , tk) | ⊤ | ⊥ | ¬φ | φ ⇒ ψ | φ ∧ ψ | φ ∨ ψ | ∀x φ | ∃x φ
    (assuming that each use of a function symbol f or of a predicate symbol P matches its arity). As usual, we call a constant symbol any function symbol of arity 0. • A set of closed formulas of the language L, written Ax(T), whose elements are called the axioms of the theory T. Given a closed formula φ of the language of T, we say that φ is derivable in T — or that φ is a theorem of T — and write T ⊢ φ when there are finitely many axioms φ1, . . . , φn ∈ Ax(T) such that the sequent φ1, . . . , φn ⊢ φ (or the formula φ1 ∧ · · · ∧ φn ⇒ φ) is derivable in a deduction system for classical logic. The set of all theorems of T is written Th(T). Conventions 1.2 (1) In this course, we only work with first-order theories with equality.
    [Show full text]
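The grammar in the definition above transcribes almost directly into an algebraic data type. A minimal Haskell sketch (constructor names are ours, not from the chapter):

    type Name = String

    -- Terms: variables, and function symbols applied to argument lists
    -- (a constant is a function symbol of arity 0).
    data Term
      = Var Name
      | Fun Name [Term]
      deriving Show

    -- Formulas of a first-order language with equality.
    data Formula
      = Equal Term Term
      | Pred Name [Term]
      | Top | Bottom
      | Neg Formula
      | Imp Formula Formula
      | And Formula Formula
      | Or  Formula Formula
      | Forall Name Formula
      | Exists Name Formula
      deriving Show

    -- Example closed formula: ∀x ∀y (x = y ⇒ y = x).
    symmetry :: Formula
    symmetry =
      Forall "x" (Forall "y"
        (Imp (Equal (Var "x") (Var "y"))
             (Equal (Var "y") (Var "x"))))

On this reading, an axiom set Ax(T) is just a collection of such closed Formula values.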
  • April 22 7.1 Recursion Theorem
    CSE 431 Theory of Computation Spring 2014 Lecture 7: April 22 Lecturer: James R. Lee Scribe: Eric Lei Disclaimer: These notes have not been subjected to the usual scrutiny reserved for formal publications. They may be distributed outside this class only with the permission of the Instructor. An interesting question about Turing machines is whether they can reproduce themselves. A Turing machine cannot be defined in terms of itself, but can it still somehow print its own source code? The answer to this question is yes, as we will see in the recursion theorem. Afterward we will see some applications of this result. 7.1 Recursion Theorem Our end goal is to have a Turing machine that prints its own source code and operates on it. Last lecture we proved the existence of a Turing machine called SELF that ignores its input and prints its source code. We construct a similar proof for the recursion theorem. We will also need the following lemma proved last lecture. Lemma 7.1 There exists a computable function q : Σ* → Σ* such that q(w) = ⟨P_w⟩, where P_w is a Turing machine that prints w and halts. Theorem 7.2 (Recursion theorem) Let T be a Turing machine that computes a function t : Σ* × Σ* → Σ*. There exists a Turing machine R that computes a function r : Σ* → Σ*, where for every w, r(w) = t(⟨R⟩, w). The theorem says that for an arbitrary computable function t, there is a Turing machine R that computes t on ⟨R⟩ and some input. Proof: We construct a Turing Machine R in three parts, A, B, and T, where T is given by the statement of the theorem.
    [Show full text]
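The machine SELF from the excerpt has a direct analogue in ordinary programming languages: a quine, a program that prints its own source. A one-line Haskell sketch (ours; like SELF it ignores its input, and it reproduces its source up to a trailing newline):

    main = putStr (s ++ show s) where s = "main = putStr (s ++ show s) where s = "

The string s plays the role of the lemma's P_w (a machine that prints a fixed string), and the code around it corresponds to the part of R that computes with its own description.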
  • Intuitionism on Gödel's First Incompleteness Theorem
    The Yale Undergraduate Research Journal, Volume 2, Issue 1, Spring 2021, Article 31. Incomplete? Or Indefinite? Intuitionism on Gödel’s First Incompleteness Theorem. Quinn Crawford, Yale University. Follow this and additional works at: https://elischolar.library.yale.edu/yurj Part of the Mathematics Commons, and the Philosophy Commons. Recommended Citation: Crawford, Quinn (2021) "Incomplete? Or Indefinite? Intuitionism on Gödel’s First Incompleteness Theorem," The Yale Undergraduate Research Journal: Vol. 2, Iss. 1, Article 31. Available at: https://elischolar.library.yale.edu/yurj/vol2/iss1/31 This Article is brought to you for free and open access by EliScholar – A Digital Platform for Scholarly Publishing at Yale. It has been accepted for inclusion in The Yale Undergraduate Research Journal by an authorized editor of EliScholar – A Digital Platform for Scholarly Publishing at Yale. For more information, please contact [email protected]. Cover Page Footnote: Written for Professor Sun-Joo Shin’s course PHIL 437: Philosophy of Mathematics. Incomplete? Or Indefinite? Intuitionism on Gödel’s First Incompleteness Theorem. By Quinn Crawford, Department of Philosophy, Yale University. ABSTRACT This paper analyzes two natural-looking arguments that seek to leverage Gödel’s first incompleteness theorem for and against intuitionism, concluding in both cases that the argument is unsound because it equivocates on the meaning of “proof,” which differs between formalism and intuitionism.
    [Show full text]
  • Chapter 1 Logic and Set Theory
    Chapter 1 Logic and Set Theory To criticize mathematics for its abstraction is to miss the point entirely. Abstraction is what makes mathematics work. If you concentrate too closely on too limited an application of a mathematical idea, you rob the mathematician of his most important tools: analogy, generality, and simplicity. – Ian Stewart, Does God play dice? The mathematics of chaos In mathematics, a proof is a demonstration that, assuming certain axioms, some statement is necessarily true. That is, a proof is a logical argument, not an empirical one. One must demonstrate that a proposition is true in all cases before it is considered a theorem of mathematics. An unproven proposition for which there is some sort of empirical evidence is known as a conjecture. Mathematical logic is the framework upon which rigorous proofs are built. It is the study of the principles and criteria of valid inference and demonstrations. Logicians have analyzed set theory in great detail, formulating a collection of axioms that affords a broad enough and strong enough foundation for mathematical reasoning. The standard form of axiomatic set theory is denoted ZFC and it consists of the Zermelo-Fraenkel (ZF) axioms combined with the axiom of choice (C). Each of the axioms included in this theory expresses a property of sets that is widely accepted by mathematicians. It is unfortunately true that careless use of set theory can lead to contradictions. Avoiding such contradictions was one of the original motivations for the axiomatization of set theory. A rigorous analysis of set theory belongs to the foundations of mathematics and mathematical logic.
    [Show full text]
  • Folktale Documentation Release 1.0
    Folktale Documentation, Release 1.0. Quildreen Motta, Nov 05, 2017. Contents: 1 Guides; 2 Indices and tables; 3 Other resources (3.1 Getting started, 3.2 Folktale by Example, 3.3 API Reference, 3.4 How do I..., 3.5 Glossary); Python Module Index. Folktale is a suite of libraries for generic functional programming in JavaScript that allows you to write elegant modular applications with fewer bugs, and more reuse. Guides: • Getting Started: A series of quick tutorials to get you up and running quickly with the Folktale libraries. • API reference: A quick reference of Folktale’s libraries, including usage examples and cross-references. Indices and tables: • Global Module Index: Quick access to all modules. • General Index: All functions, classes, terms, sections. • Search page: Search this documentation. Other resources: • Licence information. 3.1 Getting started This guide will cover everything you need to start using the Folktale project right away, from giving you a brief overview of the project, to installing it, to creating a simple example. Once you get the hang of things, the Folktale By Example guide should help you understand the concepts behind the library, and map them to real use cases. 3.1.1 So, what’s Folktale anyways? Folktale is a suite of libraries for allowing a particular style of functional programming in JavaScript.
    [Show full text]
  • John P. Burgess Department of Philosophy Princeton University Princeton, NJ 08544-1006, USA [email protected]
    John P. Burgess Department of Philosophy Princeton University Princeton, NJ 08544-1006, USA [email protected] LOGIC & PHILOSOPHICAL METHODOLOGY Introduction For present purposes “logic” will be understood to mean the subject whose development is described in Kneale & Kneale [1961] and of which a concise history is given in Scholz [1961]. As the terminological discussion at the beginning of the latter reference makes clear, this subject has at different times been known by different names, “analytics” and “organon” and “dialectic”, while inversely the name “logic” has at different times been applied much more broadly and loosely than it will be here. At certain times and in certain places — perhaps especially in Germany from the days of Kant through the days of Hegel — the label has come to be used so very broadly and loosely as to threaten to take in nearly the whole of metaphysics and epistemology. Logic in our sense has often been distinguished from “logic” in other, sometimes unmanageably broad and loose, senses by adding the adjectives “formal” or “deductive”. The scope of the art and science of logic, once one gets beyond elementary logic of the kind covered in introductory textbooks, is indicated by two other standard references, the Handbooks of mathematical and philosophical logic, Barwise [1977] and Gabbay & Guenthner [1983-89], though the latter includes also parts that are identified as applications of logic rather than logic proper. The term “philosophical logic” as currently used, for instance, in the Journal of Philosophical Logic, is a near-synonym for “nonclassical logic”. There is an older use of the term as a near-synonym for “philosophy of language”.
    [Show full text]
  • Learning Dependency-Based Compositional Semantics
    Learning Dependency-Based Compositional Semantics Percy Liang∗ University of California, Berkeley Michael I. Jordan∗∗ University of California, Berkeley Dan Klein† University of California, Berkeley Suppose we want to build a system that answers a natural language question by representing its semantics as a logical form and computing the answer given a structured database of facts. The core part of such a system is the semantic parser that maps questions to logical forms. Semantic parsers are typically trained from examples of questions annotated with their target logical forms, but this type of annotation is expensive. Our goal is to instead learn a semantic parser from question–answer pairs, where the logical form is modeled as a latent variable. We develop a new semantic formalism, dependency-based compositional semantics (DCS) and define a log-linear distribution over DCS logical forms. The model parameters are estimated using a simple procedure that alternates between beam search and numerical optimization. On two standard semantic parsing benchmarks, we show that our system obtains comparable accuracies to even state-of-the-art systems that do require annotated logical forms. 1. Introduction One of the major challenges in natural language processing (NLP) is building systems that both handle complex linguistic phenomena and require minimal human effort. The difficulty of achieving both criteria is particularly evident in training semantic parsers, where annotating linguistic expressions with their associated logical forms is expensive but until recently, seemingly unavoidable. Advances in learning latent-variable models, however, have made it possible to progressively reduce the amount of supervision ∗ Computer Science Division, University of California, Berkeley, CA 94720, USA.
    [Show full text]