Presupposition Projection and Entailment Relations
PDF, 16 pages, 1020 KB
Recommended publications
UNIT-I Mathematical Logic Statements and Notations
UNIT-I Mathematical Logic. Statements and notations: A proposition or statement is a declarative sentence that is either true or false (but not both). For instance, the following are propositions: “Paris is in France” (true), “London is in Denmark” (false), “2 < 4” (true), “4 = 7” (false). However, the following are not propositions: “What is your name?” (this is a question), “Do your homework” (this is a command), “This sentence is false” (neither true nor false), “x is an even number” (it depends on what x represents), “Socrates” (it is not even a sentence). The truth or falsehood of a proposition is called its truth value. Connectives: connectives are used for making compound propositions. The main ones are the following (p and q represent given propositions): negation ¬p (“not p”); conjunction p ∧ q (“p and q”); disjunction p ∨ q (“p or q (or both)”); exclusive or p ⊕ q (“either p or q, but not both”); implication p → q (“if p then q”); biconditional p ↔ q (“p if and only if q”). Truth tables: logical identity is an operation on one logical value, typically the value of a proposition, that produces a value of true if its operand is true and a value of false if its operand is false; its truth table maps T to T and F to F. Logical negation is an operation on one logical value, typically the value of a proposition, that produces a value of true if its operand is false and a value of false if its operand is true.
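For readability, here is a combined truth table for these connectives, reconstructed in standard notation (our illustration, not copied from the unit):

\[
\begin{array}{cc|cccccc}
p & q & \lnot p & p \land q & p \lor q & p \oplus q & p \to q & p \leftrightarrow q \\
\hline
T & T & F & T & T & F & T & T \\
T & F & F & F & T & T & F & F \\
F & T & T & F & T & T & T & F \\
F & F & T & F & F & F & T & T
\end{array}
\]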
Chapter 1 Logic and Set Theory
Chapter 1: Logic and Set Theory. “To criticize mathematics for its abstraction is to miss the point entirely. Abstraction is what makes mathematics work. If you concentrate too closely on too limited an application of a mathematical idea, you rob the mathematician of his most important tools: analogy, generality, and simplicity.” – Ian Stewart, Does God Play Dice? The Mathematics of Chaos. In mathematics, a proof is a demonstration that, assuming certain axioms, some statement is necessarily true. That is, a proof is a logical argument, not an empirical one. One must demonstrate that a proposition is true in all cases before it is considered a theorem of mathematics. An unproven proposition for which there is some sort of empirical evidence is known as a conjecture. Mathematical logic is the framework upon which rigorous proofs are built. It is the study of the principles and criteria of valid inference and demonstrations. Logicians have analyzed set theory in great detail, formulating a collection of axioms that affords a broad enough and strong enough foundation for mathematical reasoning. The standard form of axiomatic set theory is denoted ZFC; it consists of the Zermelo–Fraenkel (ZF) axioms combined with the axiom of choice (C). Each of the axioms included in this theory expresses a property of sets that is widely accepted by mathematicians. It is unfortunately true that careless use of set theory can lead to contradictions; avoiding such contradictions was one of the original motivations for the axiomatization of set theory. A rigorous analysis of set theory belongs to the foundations of mathematics and mathematical logic.
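The canonical illustration of such a contradiction (our addition: Russell's paradox, the kind of inconsistency ZFC is designed to block) considers the collection of all sets that are not members of themselves:

\[
R = \{x : x \notin x\} \quad\Longrightarrow\quad R \in R \iff R \notin R,
\]

a contradiction, which is why unrestricted set comprehension is replaced by the more careful ZFC axioms.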
Entailment, Presupposition, and Implicature in the Work of Ernest Hemingway and Tim O'Brien
Louisiana State University, LSU Digital Commons, LSU Historical Dissertations and Theses, Graduate School, 1994. The Stylistic Mechanics of Implicitness: Entailment, Presupposition, and Implicature in the Work of Ernest Hemingway and Tim O'Brien. Donna Glee Williams, Louisiana State University and Agricultural & Mechanical College. Follow this and additional works at: https://digitalcommons.lsu.edu/gradschool_disstheses. Recommended Citation: Williams, Donna Glee, "The Stylistic Mechanics of Implicitness: Entailment, Presupposition, and Implicature in the Work of Ernest Hemingway and Tim O'Brien." (1994). LSU Historical Dissertations and Theses. 5768. https://digitalcommons.lsu.edu/gradschool_disstheses/5768. This dissertation is brought to you for free and open access by the Graduate School at LSU Digital Commons. It has been accepted for inclusion in LSU Historical Dissertations and Theses by an authorized administrator of LSU Digital Commons. For more information, please contact [email protected]. INFORMATION TO USERS: This manuscript has been reproduced from the microfilm master. UMI films the text directly from the original or copy submitted. Thus, some thesis and dissertation copies are in typewriter face, while others may be from any type of computer printer. The quality of this reproduction is dependent upon the quality of the copy submitted. Broken or indistinct print, colored or poor quality illustrations and photographs, print bleedthrough, substandard margins, and improper alignment can adversely affect reproduction. In the unlikely event that the author did not send UMI a complete manuscript and there are missing pages, these will be noted. Also, if unauthorized copyright material had to be removed, a note will indicate the deletion. Oversize materials (e.g., maps, drawings, charts) are reproduced by sectioning the original, beginning at the upper left-hand corner and continuing from left to right in equal sections with small overlaps.
Folktale Documentation Release 1.0
Folktale Documentation, Release 1.0. Quildreen Motta, Nov 05, 2017. Contents: 1 Guides; 2 Indices and tables; 3 Other resources (3.1 Getting started, 3.2 Folktale by Example, 3.3 API Reference, 3.4 How do I…, 3.5 Glossary); Python Module Index. Folktale is a suite of libraries for generic functional programming in JavaScript that allows you to write elegant modular applications with fewer bugs, and more reuse. Chapter 1, Guides: Getting Started – a series of quick tutorials to get you up and running quickly with the Folktale libraries; API reference – a quick reference of Folktale’s libraries, including usage examples and cross-references. Chapter 2, Indices and tables: Global Module Index – quick access to all modules; General Index – all functions, classes, terms, sections; Search page – search this documentation. Chapter 3, Other resources: licence information. 3.1 Getting started: This guide will cover everything you need to start using the Folktale project right away, from giving you a brief overview of the project, to installing it, to creating a simple example. Once you get the hang of things, the Folktale by Example guide should help you understand the concepts behind the library and map them to real use cases. 3.1.1 So, what’s Folktale anyways? Folktale is a suite of libraries for allowing a particular style of functional programming in JavaScript.
Theorem Proving in Classical Logic
MEng Individual Project, Imperial College London, Department of Computing. Theorem Proving in Classical Logic. Author: David Davies. Supervisor: Dr. Steffen van Bakel. Second Marker: Dr. Nicolas Wu. June 16, 2021. Abstract: It is well known that functional programming and logic are deeply intertwined. This has led to many systems capable of expressing both propositional and first order logic that also operate as well-typed programs. What currently ties popular theorem provers together is their basis in intuitionistic logic, where one cannot prove the law of the excluded middle, ‘A ∨ ¬A’ – that any proposition is either true or false. In classical logic this notion is provable, and the corresponding programs turn out to be those with control operators. In this report, we explore and expand upon the research about calculi that correspond with classical logic, and the problems that occur for those relating to first order logic. To see how these calculi behave in practice, we develop and implement functional languages for propositional and first order logic, expressing classical calculi in the setting of a theorem prover, much like Agda and Coq. In the first order language, users are able to define inductive data and record types; importantly, they are able to write computable programs that have a correspondence with classical propositions. Acknowledgements: I would like to thank Steffen van Bakel, my supervisor, for his support throughout this project and for helping me find a topic of study based on my interests, for which I am incredibly grateful. His insight and advice have been invaluable. I would also like to thank my second marker, Nicolas Wu, for introducing me to the world of dependent types, and suggesting useful resources that have aided me greatly during this report.
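As background (our addition, not part of the report's abstract): the law of the excluded middle, and Peirce's law – the propositional formula standardly used to type control operators such as call/cc – can be written as

\[
A \lor \lnot A, \qquad ((A \to B) \to A) \to A .
\]

Neither is derivable intuitionistically, but both are classical theorems; under the Curry–Howard correspondence, a program inhabiting Peirce's law behaves as a control operator, which is the sense in which "the corresponding programs turn out to be those with control operators."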
Learning Dependency-Based Compositional Semantics
Learning Dependency-Based Compositional Semantics. Percy Liang, Michael I. Jordan, and Dan Klein (University of California, Berkeley). Suppose we want to build a system that answers a natural language question by representing its semantics as a logical form and computing the answer given a structured database of facts. The core part of such a system is the semantic parser that maps questions to logical forms. Semantic parsers are typically trained from examples of questions annotated with their target logical forms, but this type of annotation is expensive. Our goal is to instead learn a semantic parser from question–answer pairs, where the logical form is modeled as a latent variable. We develop a new semantic formalism, dependency-based compositional semantics (DCS), and define a log-linear distribution over DCS logical forms. The model parameters are estimated using a simple procedure that alternates between beam search and numerical optimization. On two standard semantic parsing benchmarks, we show that our system obtains comparable accuracies to even state-of-the-art systems that do require annotated logical forms. 1. Introduction: One of the major challenges in natural language processing (NLP) is building systems that both handle complex linguistic phenomena and require minimal human effort. The difficulty of achieving both criteria is particularly evident in training semantic parsers, where annotating linguistic expressions with their associated logical forms is expensive but, until recently, seemingly unavoidable. Advances in learning latent-variable models, however, have made it possible to progressively reduce the amount of supervision. (Computer Science Division, University of California, Berkeley, CA 94720, USA.)
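A worked form of this latent-variable setup (our reconstruction in standard notation, not necessarily the paper's exact formulation): with question x, answer y, latent logical form z whose denotation is written [\![z]\!], and feature vector φ, the log-linear model and the quantity being maximized are

\[
p_\theta(z \mid x) \;=\; \frac{\exp\{\theta^{\top}\phi(x,z)\}}{\sum_{z'}\exp\{\theta^{\top}\phi(x,z')\}},
\qquad
p_\theta(y \mid x) \;=\; \sum_{z \,:\, [\![z]\!] = y} p_\theta(z \mid x),
\]

so training rewards logical forms whose denotation matches the observed answer; beam search approximates the sums over logical forms, and numerical optimization updates θ.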
Logical Vs. Natural Language Conjunctions in Czech: a Comparative Study
ITAT 2016 Proceedings, CEUR Workshop Proceedings Vol. 1649, pp. 68–73, http://ceur-ws.org/Vol-1649, Series ISSN 1613-0073, © 2016 K. Přikrylová, V. Kuboň, K. Veselovská. Logical vs. Natural Language Conjunctions in Czech: A Comparative Study. Katrin Přikrylová, Vladislav Kuboň, and Kateřina Veselovská, Charles University in Prague, Faculty of Mathematics and Physics, Czech Republic, {prikrylova,vk,veselovska}@ufal.mff.cuni.cz. Abstract: This paper studies the relationship between conjunctions in a natural language (Czech) and their logical counterparts. It shows that the process of transformation of a natural language expression into its logical representation is not straightforward. The paper concentrates on the most frequently used logical connectives, ∧ and ∨, and it analyzes the natural language phenomena which influence their transformation into logical conjunction and disjunction. The phenomena discussed in the paper are temporal sequence, expressions describing mutual relationship, and the consequences of using plural. 1 Introduction and motivation: The endeavor to express natural language sentences in the form of logical expressions is probably as old as logic itself. … natural language has exceptions and irregularities which do not abide by the rules as strictly as is the case in logic. Primarily due to this difference, the transformation of natural language sentences into their logical representation constitutes a complex issue. As we are going to show in the subsequent sections, there are no simple rules which would allow automation of the process – the majority of problematic cases requires an individual approach. In the following text we are going to restrict our observations to the two most frequently used conjunctions, namely a (and) and nebo (or). 2 Sentences containing the conjunction a (and): The initial assumption about complex sentences containing …
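A standard illustration of the "temporal sequence" phenomenon mentioned in the abstract (an English example of our own, not taken from the paper): logical conjunction is commutative, but natural-language "and" often is not,

\[
p \land q \;\equiv\; q \land p, \qquad \text{yet } \text{``They got married and had a child''} \;\neq\; \text{``They had a child and got married.''}
\]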
The Peripatetic Program in Categorical Logic: Leibniz on Propositional Terms
THE REVIEW OF SYMBOLIC LOGIC, Page 1 of 65. The Peripatetic Program in Categorical Logic: Leibniz on Propositional Terms. Marko Malink, Department of Philosophy, New York University, and Anubav Vasudevan, Department of Philosophy, University of Chicago. Abstract: Greek antiquity saw the development of two distinct systems of logic: Aristotle’s theory of the categorical syllogism and the Stoic theory of the hypothetical syllogism. Some ancient logicians argued that hypothetical syllogistic is more fundamental than categorical syllogistic on the grounds that the latter relies on modes of propositional reasoning such as reductio ad absurdum. Peripatetic logicians, by contrast, sought to establish the priority of categorical over hypothetical syllogistic by reducing various modes of propositional reasoning to categorical form. In the 17th century, this Peripatetic program of reducing hypothetical to categorical logic was championed by Gottfried Wilhelm Leibniz. In an essay titled Specimina calculi rationalis, Leibniz develops a theory of propositional terms that allows him to derive the rule of reductio ad absurdum in a purely categorical calculus in which every proposition is of the form A is B. We reconstruct Leibniz’s categorical calculus and show that it is strong enough to establish not only the rule of reductio ad absurdum, but all the laws of classical propositional logic. Moreover, we show that the propositional logic generated by the nonmonotonic variant of Leibniz’s categorical calculus is a natural system of relevance logic known as RMI¬→. From its beginnings in antiquity up until the late 19th century, the study of formal logic was shaped by two distinct approaches to the subject. The first approach was primarily concerned with simple propositions expressing predicative relations between terms.
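For orientation (a standard formulation, not Leibniz's own categorical notation), the rule of reductio ad absurdum that the calculus is shown to derive can be stated as

\[
\frac{\Gamma,\ \lnot A \vdash B \qquad \Gamma,\ \lnot A \vdash \lnot B}{\Gamma \vdash A} ,
\]

i.e. if assuming ¬A leads to a contradiction, then A follows.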
Introduction to Logic 1
Introduction to logic. 1. Logic and Artificial Intelligence: some historical remarks. One of the aims of AI is to reproduce human features by means of a computer system. One of these features is reasoning. Reasoning can be viewed as the process of having a knowledge base (KB) and manipulating it to create new knowledge. Reasoning can be seen as comprising three processes: 1. perceiving stimuli from the environment; 2. translating the stimuli into components of the KB; 3. working on the KB to increase it. We are going to deal with how to represent information in the KB and how to reason about it. We use logic as a device to pursue this aim. These notes are an introduction to modern logic, whose origin can be found in George Boole’s and Gottlob Frege’s works in the XIX century. However, logic in general traces back to the ancient Greek philosophers who studied the conditions under which an argument is valid. An argument is any set of statements – explicit or implicit – one of which is the conclusion (the statement being defended) and the others are the premises (the statements providing the defense). The relationship between the conclusion and the premises is such that the conclusion follows from the premises (Lepore 2003). Modern logic is a powerful tool to represent and reason about knowledge, and AI has traditionally adopted it as a working tool. Within logic, one research tradition that strongly influenced AI is that started by the German philosopher and mathematician Gottfried Leibniz. His aim was to formulate a Lingua Rationalis, a universal language based only on rationality, which could solve all the controversies between different parties.
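A minimal worked example of an argument in this sense (the classic Socrates syllogism, our illustration rather than one from these notes), with the premises to the left of the turnstile and the conclusion to the right:

\[
\forall x\,\big(\mathit{Man}(x) \to \mathit{Mortal}(x)\big),\ \ \mathit{Man}(\mathit{socrates}) \;\vdash\; \mathit{Mortal}(\mathit{socrates}).
\]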
Lecture 7: Semantics and Pragmatics. Entailments, Presuppositions, Conversational and Conventional Implicatures. Grice's Conversational Maxims
Formal Semantics, Lecture 7. B. H. Partee, RGGU, April 1, 2004. Lecture 7: Semantics and Pragmatics. Entailments, presuppositions, conversational and conventional implicatures. Grice’s conversational maxims. Outline: 1. Grice’s Conversational Implicatures; 1.1. Motivation. Questions about the meanings of logical words; 1.2. Truth-conditional content (semantics) vs. Conversational Implicatures (pragmatics); 1.3. Conversational maxims (“Gricean maxims”); 1.4. Generating implicatures. General principles. Examples; 1.4.1. Characterization of conversational implicature; 1.4.2. More Examples. … (intentionally or unintentionally) much more than what is literally said by the words of her sentence. An example: (1) A: How is C getting along in his new job at the bank? B: Oh, quite well, I think; he likes his colleagues, and he hasn’t been to prison yet. What B implied, suggested, or meant is distinct from what B said. All B said was that C had not been to prison yet.
The Composition of Logical Constants
the morphosemantic makeup of exclusive-disjunctive markers∗. Moreno Mitrović, University of Cyprus & Bled Institute. (Draft manuscript, ver. 6, April 15, 2017; do not cite without consultation.) Abstract: The spirit of this paper is strongly decompositional and its aim to meditate on the idea that natural language conjunction and disjunction markers do not necessarily (directly) incarnate Boolean terms like ‘0’ and ‘1’, respectively. Drawing from a rich collection of (mostly dead) languages (Ancient Anatolian, Homeric Greek, Tocharian, Old Church Slavonic, and North-East Caucasian), I will examine the morphosemantics of ‘XOR’ (‘exclusive or’) markers, i.e. exclusive (or strong/enriched) disjunction markers of the “either ... or ...”-type, and demonstrate that the morphology of the XOR marker does not only contain the true (generalised) disjunction marker (I dub it κ), as one would expect on the null (Boolean) hypothesis, but that the XOR-marker also contains the (generalised) conjunction marker (I dub it μ). After making the case for a fine-structure of the Junction Phrase (JP), a common structural denominator for con- and dis-junction, the paper proposes a new syntax for XOR constructions involving five functional heads (two pairs of κ and μ markers, forming the XOR-word and combining with the respective coordinand, and a J-head pairing up the coordinands). I then move on to compose the semantics of the syntactically decomposed structure by providing a compositional account obtaining the exclusive component, qua the semantic/pragmatic signature of these markers. To do so, I rely on an exhaustification-based system of ‘grammaticised implicatures’ (Chierchia, 2013b) in assuming silent exhaustification operators in the narrow syntax, which (in concert with the presence of alternative-triggering κ- and μ-operators) trigger local exclusive (scalar) implicature (SI) computation.
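For orientation, the classical Boolean rendering of exclusive disjunction that the "exclusive component" strengthens plain disjunction into is (standard textbook material, not the paper's own formalism)

\[
p \oplus q \;\equiv\; (p \lor q) \land \lnot (p \land q),
\]

i.e. exclusive disjunction involves both a disjunction and a negated conjunction, which is at least suggestive of the κ/μ decomposition the abstract argues for on morphological grounds.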
Presupposition and Entailment Yuanli Fan School of Foreign Studies, Xi’An University, Shaanxi Xi’An 710065
Advances in Computer Science Research (ACSR), volume 76. 7th International Conference on Education, Management, Information and Mechanical Engineering (EMIM 2017). Presupposition and Entailment. Yuanli Fan, School of Foreign Studies, Xi’an University, Shaanxi, Xi’an 710065. Keywords: Presupposition; Entailment; Negation test; Information focus. Abstract: This paper aims at the distinction between presupposition and entailment, since various definitions of the two terms are used confusedly and the distinction remains a much-discussed question. The author tries to distinguish presupposition from entailment within the semantic scope by means of the negation test. The traditional negation test has its limitations, since semantic negation has its own scope of application. Therefore, the notion of information focus is introduced to distinguish the semantic presupposition and entailment of a multi-element sentence. By analyzing the presuppositions and entailments of English sentences, their semantic and pragmatic essence can be grasped more easily, which helps promote English communicative competence. However, some sentence patterns, such as the imperative, need further research. Introduction: Entailment and presupposition express different relationships between sentences. Both are meaning and information deduced from the sentence itself. The distinction between the two has long been a debated question and no agreement has been reached. The key point is that different referring terms are used for the same notion, which causes misunderstanding about presupposition and entailment. A clear distinction between presupposition and entailment is essential for the development of pragmatic and semantic research. In this paper, we will discuss semantic presupposition and the distinction between semantic presupposition and entailment. The Philosophical Origin of Presupposition: Presupposition originates with debates in philosophy, specifically debates about the nature of reference and referring expressions.
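A standard illustration of the negation test the abstract refers to (a textbook-style example of our own, not taken from the paper): presuppositions survive negation, while entailments do not.

\[
\begin{aligned}
&\text{(a) John stopped smoking.} &&\text{presupposes: John used to smoke; entails: John does not smoke now.}\\
&\text{(b) John did not stop smoking.} &&\text{still presupposes: John used to smoke; no longer entails: John does not smoke now.}
\end{aligned}
\]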