Introduction to Logic Study Guide

Leeward Community College ~ Philosophy 110 ~ Spring 2010

The final exam is meant to assess the knowledge and skill you have acquired throughout the semester. The first thing to realize is that you have acquired some knowledge and skill, and preparing for the exam is just recognizing that fact. The downside is that, like any test, the scale must be steep enough that everyone registers at a definite point, which means the test must be difficult enough that not everyone (or anyone) will score 100%. And a test is only a sample of everything that you have learned, and thus is somewhat random. This means you should review everything to be ready for whatever appears on the exam. But this is a good thing too, since reviewing is the best way to acquire the knowledge and skill you will be tested on, even if it is not on the test!

What have we learned? What logic is. What an argument is: it isn't just saying "no it isn't."

Most importantly we learned what PROPOSITIONS are: statements that can be affirmed or denied, or can be true or false, and that can serve as premisses or conclusions in an argument. We learned the difference between DEDUCTIVE ARGUMENTS, where the conclusion follows necessarily, and INDUCTIVE ARGUMENTS, which only establish a conclusion with probability. We learned all about VALIDITY, and a good thing too, because there are lots of arguments out there that don't have it. Arguments that are invalid (or otherwise flawed) but slip by are called FALLACIES, and we now know their names and where they live. (How many? What are their names?)

In symbolic logic we learned about TRUTH FUNCTIONALITY, which is a really cool thing. It means that the truth value of a complex proposition is a function of the truth values of the simple statements in the proposition and of the truth-functional CONNECTIVES which link those statements together. SIMPLE STATEMENTS for symbolic logic are propositions, that is, statements that are capable of being true or false (even if some of them are only ever one or the other), that are not analyzable into any more basic propositions.

NEGATION: negation is a truth function, not a quality of a proposition, which means that the truth value of the negation depends on the truth value of the simple statement which it connects (to itself, sort of), as in ~p.

DOUBLE NEGATION: if the value of the negation depends on the truth value of the simple statement, then the negation of the negation depends on the value of the negation, which depends on the value of the simple statement! OK, it's not that great a discovery, but it works.

The rest of the LOGICAL CONNECTIVES require two simple statements, and each defines a truth value that depends upon the truth values of both of the simple statements (usually). The connectives are CONJUNCTION, DISJUNCTION, CONDITIONAL, and BI-CONDITIONAL. We could repeat the complete definitions of these, but that would be pointless since you already have them memorized, right? But we might remind ourselves of the significant parts:

p • q CONJUNCTION: only true if both conjuncts are true.
p ∨ q DISJUNCTION: only false if both disjuncts are false.
p ⊃ q CONDITIONAL: only false if the antecedent is true and the consequent false.
p ≡ q BI-CONDITIONAL: only true if the values of both simple statements are the same, or only false if the values of the simple statements are different.
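
The idea that each connective is a truth function can be made very concrete by writing the functions out. Here is a minimal sketch in Python (my own illustration, not part of the course materials); each function takes the truth values of its component statements and returns the truth value of the compound.

    # Each connective is a function from truth values to a truth value.
    def negation(p):          # ~p
        return not p

    def conjunction(p, q):    # p . q : only true if both conjuncts are true
        return p and q

    def disjunction(p, q):    # p v q : only false if both disjuncts are false
        return p or q

    def conditional(p, q):    # p > q : only false if antecedent true, consequent false
        return (not p) or q

    def biconditional(p, q):  # p = q : only true if both values are the same
        return p == q

    # Double negation: the value of ~~p depends on the value of ~p,
    # which depends on the value of p.
    assert negation(negation(True)) is True
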
And the most wondrous thing: we have a mechanism to help us! It appears in the form of a truth table that exhaustively lists all the possible combinations of the simple statements, and so the values of any connectives, and ultimately the value of any proposition. So we can use a TRUTH TABLE to determine, precisely, the truth value of any proposition, and so the value of any group of propositions. And if we have the truth values of a group of propositions which just happen, shall we say, to stand in the relation of premisses to a conclusion, we can test for the validity of the argument.

TRUTH-TABLE TEST FOR VALIDITY: identify all the basic or simple statements, label them, and put them into a matrix (remember, 2^n rows where n = the number of simple statements; the first column is half true/half false, the second alternates in quarters, and so on, until the last column of a simple statement alternates T/F. Got it?). Then we lay out the propositions and determine their truth values in each row, and finally we identify the premisses and the conclusion and look for lines where ALL the premisses are true and the conclusion false. If we find one, the argument is invalid. If we find none, the argument is valid.

The BACKWARD METHOD is a short-cut way to use the truth tables, isolating those lines of a possible truth table by assigning values that either make the premisses true or make the conclusion false, and then seeing whether it is still possible to make the conclusion false or the premisses true (respectively).

Once we have the truth tables, we can prove the validity of arguments. BUT WAIT! If we can prove the validity of an argument, and validity is a structural rather than a semantic relation (it concerns the form of the argument, not its content), then if we show an argument is valid, every other argument that has the same form must also be valid! This means that we can identify the FORM of an argument by substituting STATEMENT VARIABLES for the simple statements in the argument; and we can reverse this by substituting statements for statement variables, which gives us a SUBSTITUTION INSTANCE of an ARGUMENT FORM. But the difficulty is that we do not have to substitute only simple statements to produce an argument form; we could substitute a complex proposition for a statement variable, and vice versa. If we want to capture a specific argument, we have to use the finest net available, and be sure to substitute statement variables only for simple statements, so that we obtain the SPECIFIC FORM of that argument. And then we can test for validity.

And once we can show that an ARGUMENT FORM is valid, we can use it in any circumstance and be assured of the correctness of our reasoning. In fact, we can use these VALID ARGUMENT FORMS in the process of evaluating the validity of longer, more complex arguments!! Oh goody! We have nine (9) such argument forms, which we will henceforth refer to as RULES OF INFERENCE.

A further tool that is necessary for us to do deductions is logical equivalence, which allows us to replace propositions with their logically equivalent counterparts. A LOGICAL EQUIVALENCE is a material equivalence that is a TAUTOLOGY.
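
As a rough illustration of the truth-table test (a sketch of my own in Python, not the course's worksheet layout), the argument p ⊃ q, p, therefore q can be checked by generating all 2^n rows and hunting for a row with every premiss true and the conclusion false; the same table also settles logical equivalence, since two statements are equivalent exactly when their biconditional is true on every row.

    from itertools import product

    def valid(premisses, conclusion, letters):
        # Invalid exactly when some row makes every premiss true and the conclusion false.
        for values in product([True, False], repeat=len(letters)):   # the 2^n rows
            row = dict(zip(letters, values))
            if all(p(row) for p in premisses) and not conclusion(row):
                return False    # counterexample row found: invalid
        return True             # no counterexample row: valid

    # Modus ponens: p > q, p, therefore q.
    premisses = [lambda r: (not r["p"]) or r["q"],    # p > q
                 lambda r: r["p"]]                    # p
    conclusion = lambda r: r["q"]                     # q
    print(valid(premisses, conclusion, ["p", "q"]))   # True: valid

    # Logical equivalence as a tautologous biconditional, e.g. De Morgan:
    # ~(p . q) is equivalent to ~p v ~q.
    de_morgan = lambda r: (not (r["p"] and r["q"])) == ((not r["p"]) or (not r["q"]))
    print(valid([], de_morgan, ["p", "q"]))           # True: a tautology
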
This means that not only will the statement of material equivalence (which is double material implication) be true when both statements have the same truth value, but that both statements will always have the same truth value, which means that they must have the same meaning and so can be substituted for one another!! Thus we can add to our nine RULES OF INFERENCE another ten RULES OF REPLACEMENT.

Now we can take any argument, and if we can apply our rules of inference to the premisses, and to any statements thus derived from the premisses, to arrive at the conclusion of the argument, then the argument must be valid. This is because VALIDITY means that IF THE PREMISSES ARE TRUE, THE CONCLUSION NECESSARILY IS TRUE. And validity means that if the conclusion can be false when the premisses are true, the argument is invalid. It also means that if the premisses are incapable of all being true (INCONSISTENT), the argument is necessarily valid (but not sound), and conversely, that if the conclusion is incapable of being false (a TAUTOLOGY), the argument must be valid as well. These results are related to the PARADOX OF MATERIAL IMPLICATION we encountered with the conditional, only more so, since here it is the PARADOX OF STRICT (or logical) IMPLICATION. But we are not bothered by this, since if everything follows from inconsistent premisses, that is as good as nothing in particular following from them; and likewise, if a tautology follows from any premisses, it follows from none in particular, so we might as well deduce it from itself! Enough of that.

Reductio ad absurdum: If we can, using our rules of inference, derive a contradiction from a given set of premisses, then if those premisses are true, that contradiction must be true; but contradictions cannot be true! So we know the premisses are inconsistent, the argument is necessarily unsound, and it can't prove the conclusion to be true. But we can also use this to do an Indirect Proof. We can assume something, another proposition in addition to the premisses of the argument, and see if we can reach a contradiction with our rules. If we can, we know that the assumption cannot be true, so we are justified in deriving its negation. Usually our assumption is the negation of the conclusion, so we can get the conclusion itself by indirect proof.

Review the strategy for doing deductive proofs, since it is not mechanical like the truth-table method.
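
The point about inconsistent premisses can be seen concretely with the same brute-force table (again a sketch of my own, using made-up letters p and q): since no row makes p and ~p both true, no row can be a counterexample, so the test certifies any conclusion at all.

    from itertools import product

    def valid(premisses, conclusion, letters):
        # Same truth-table test as above: invalid iff some row makes all
        # premisses true and the conclusion false.
        for values in product([True, False], repeat=len(letters)):
            row = dict(zip(letters, values))
            if all(p(row) for p in premisses) and not conclusion(row):
                return False
        return True

    # Inconsistent premisses: p and ~p. No row makes both true, so no row can be
    # a counterexample, and an argument to ANY conclusion comes out valid
    # (though never sound).
    inconsistent = [lambda r: r["p"], lambda r: not r["p"]]
    print(valid(inconsistent, lambda r: r["q"], ["p", "q"]))       # True: q "follows"
    print(valid(inconsistent, lambda r: not r["q"], ["p", "q"]))   # True: so does ~q
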
Recommended publications
  • 7.1 Rules of Implication I
    Natural Deduction is a method for deriving the conclusion of valid arguments expressed in the symbolism of propositional logic. The method consists of using sets of Rules of Inference (valid argument forms) to derive either a conclusion or a series of intermediate conclusions that link the premises of an argument with the stated conclusion.
    The First Four Rules of Inference:
    ◦ Modus Ponens (MP): p ⊃ q / p // q
    ◦ Modus Tollens (MT): p ⊃ q / ~q // ~p
    ◦ Pure Hypothetical Syllogism (HS): p ⊃ q / q ⊃ r // p ⊃ r
    ◦ Disjunctive Syllogism (DS): p v q / ~p // q
    Common strategies for constructing a proof involving the first four rules:
    ◦ Always begin by attempting to find the conclusion in the premises. If the conclusion is not present in its entirety in the premises, look at the main operator of the conclusion. This will provide a clue as to how the conclusion should be derived.
    ◦ If the conclusion contains a letter that appears in the consequent of a conditional statement in the premises, consider obtaining that letter via modus ponens.
    ◦ If the conclusion contains a negated letter and that letter appears in the antecedent of a conditional statement in the premises, consider obtaining the negated letter via modus tollens.
    ◦ If the conclusion is a conditional statement, consider obtaining it via pure hypothetical syllogism.
    ◦ If the conclusion contains a letter that appears in a disjunctive statement in the premises, consider obtaining that letter via disjunctive syllogism.
    Four Additional Rules of Inference:
    ◦ Constructive Dilemma (CD): (p ⊃ q) • (r ⊃ s) / p v r // q v s
    ◦ Simplification (Simp): p • q // p
    ◦ Conjunction (Conj): p / q // p • q
    ◦ Addition (Add): p // p v q
    Common strategies involving the additional rules of inference:
    ◦ If the conclusion contains a letter that appears in a conjunctive statement in the premises, consider obtaining that letter via simplification.
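    As a rough illustration of how the first four rules operate mechanically, here is a small Python sketch under a toy tuple encoding of my own (it is not Hurley's notation or any standard package): each pass scans the derived lines and adds whatever MP, MT, HS or DS licenses.

        # Toy natural-deduction step: apply MP, MT, HS and DS to a set of lines.
        # Encoding (my own): atoms are strings; ("not", A), ("cond", A, B), ("or", A, B).
        def apply_rules(lines):
            new = set(lines)
            for x in lines:
                for y in lines:
                    if x[0] == "cond":
                        a, b = x[1], x[2]
                        if y == a:                           # Modus ponens
                            new.add(b)
                        if y == ("not", b):                  # Modus tollens
                            new.add(("not", a))
                        if y[0] == "cond" and y[1] == b:     # Hypothetical syllogism
                            new.add(("cond", a, y[2]))
                    if x[0] == "or" and y == ("not", x[1]):  # Disjunctive syllogism
                        new.add(x[2])
            return new

        # Premisses: p > q, q > r, s v p, ~s.
        premisses = {("cond", "p", "q"), ("cond", "q", "r"), ("or", "s", "p"), ("not", "s")}
        step1 = apply_rules(premisses)     # adds p (by DS) and p > r (by HS)
        step2 = apply_rules(step1)         # adds q and r (by MP)
        print("r" in step2)                # True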
  • Two Sources of Explosion
    Two sources of explosion. Eric Kao, Computer Science Department, Stanford University, Stanford, CA 94305, United States of America. Abstract. In pursuit of enhancing the deductive power of Direct Logic while avoiding explosiveness, Hewitt has proposed including the law of excluded middle and proof by self-refutation. In this paper, I show that the inclusion of either one of these inference patterns causes paraconsistent logics such as Hewitt's Direct Logic and Besnard and Hunter's quasi-classical logic to become explosive. 1 Introduction. A central goal of a paraconsistent logic is to avoid explosiveness – the inference of any arbitrary sentence β from an inconsistent premise set {p, ¬p} (ex falso quodlibet). Hewitt's [2] Direct Logic and Besnard and Hunter's quasi-classical logic (QC) [1, 5, 4] both seek to preserve the deductive power of classical logic "as much as possible" while still avoiding explosiveness. Their work fits into the ongoing research program of identifying some "reasonable" and "maximal" subsets of classically valid rules and axioms that do not lead to explosiveness. To this end, it is natural to consider which classically sound deductive rules and axioms one can introduce into a paraconsistent logic without causing explosiveness. Hewitt [3] proposed including the law of excluded middle and the proof by self-refutation rule (a very special case of proof by contradiction) but did not show whether the resulting logic would be explosive. In this paper, I show that for quasi-classical logic and its variant, the addition of either the law of excluded middle or the proof by self-refutation rule in fact leads to explosiveness.
  • Contradiction Or Non-Contradiction?  Hegel’S Dialectic Between Brandom and Priest
    CONTRADICTION OR NON-CONTRADICTION? HEGEL'S DIALECTIC BETWEEN BRANDOM AND PRIEST, by Michela Bordignon. Abstract. The aim of the paper is to analyse Brandom's account of Hegel's conception of determinate negation and the role this structure plays in the dialectical process with respect to the problem of contradiction. After having shown both the merits and the limits of Brandom's account, I will refer to Priest's dialetheistic approach to contradiction as an alternative contemporary perspective from which it is possible to capture essential features of Hegel's notion of contradiction, and I will test the equation of Hegel's dialectic with Priest's dialetheism. 1. Introduction. According to Horstmann, «Hegel thinks of his new logic as being in part incompatible with traditional logic»1. The strongest expression of this new conception of logic is the first thesis of the work Hegel wrote in 1801 in order to earn his teaching habilitation: «contradictio est regula veri, non contradictio falsi»2. Hegel seems to claim that contradictions are true. The Hegelian thesis of the truth of contradiction is highly problematic. This is shown by Popper's critique based on the principle of ex falso quodlibet: «if a theory contains a contradiction, then it entails everything, and therefore, indeed, nothing […]. A theory which involves a contradiction is therefore entirely useless». I thank Graham Wetherall for kindly correcting a previous English translation of this paper and for his suggestions and helpful remarks. Of course, all remaining errors are mine.
  • Notes on Proof Theory
    Notes on Proof Theory. Master 1 "Informatique", Univ. Paris 13; Master 2 "Logique Mathématique et Fondements de l'Informatique", Univ. Paris 7. Damiano Mazza, November 2016 (last edit: March 29, 2021). Contents: 1 Propositional Classical Logic (1.1 Formulas and truth semantics; 1.2 Atomic negation); 2 Sequent Calculus (2.1 Two-sided formulation; 2.2 One-sided formulation); 3 First-order Quantification (3.1 Formulas and truth semantics; 3.2 Sequent calculus; 3.3 Ultrafilters); 4 Completeness (4.1 Exhaustive search; 4.2 The completeness proof); 5 Undecidability and Incompleteness (5.1 Informal computability; 5.2 Incompleteness: a road map; 5.3 Logical theories; 5.4 Arithmetical theories; 5.5 The incompleteness theorems); 6 Cut Elimination; 7 Intuitionistic Logic (7.1 Sequent calculus; 7.2 The relationship between intuitionistic and classical logic; 7.3 Minimal logic); 8 Natural Deduction (8.1 Sequent presentation; 8.2 Natural deduction and sequent calculus; 8.3 Proof tree presentation: 8.3.1 Minimal natural deduction, 8.3.2 Intuitionistic natural deduction, 8.3.3 Classical natural deduction; 8.4 Normalization (cut-elimination in natural deduction)); 9 The Curry-Howard Correspondence (9.1 The simply typed λ-calculus; 9.2 Product and sum types); 10 System F (10.1 Intuitionistic second-order propositional logic; 10.2 Polymorphic types; 10.3 Programming in System F: 10.3.1 Free structures).
  • Chapter 1 Negation in a Cross-Linguistic Perspective
    Chapter 1. Negation in a cross-linguistic perspective. 0. Chapter summary. This chapter introduces the empirical scope of our study on the expression and interpretation of negation in natural language. We start with some background notions on negation in logic and language, and continue with a discussion of more linguistic issues concerning negation at the syntax-semantics interface. We zoom in on cross-linguistic variation, both in a synchronic perspective (typology) and in a diachronic perspective (language change). Besides expressions of propositional negation, this book analyzes the form and interpretation of indefinites in the scope of negation. This raises the issue of negative polarity and its relation to negative concord. We present the main facts, criteria, and proposals developed in the literature on this topic. The chapter closes with an overview of the book. We use Optimality Theory to account for the syntax and semantics of negation in a cross-linguistic perspective. This theoretical framework is introduced in Chapter 2. 1 Negation in logic and language. The main aim of this book is to provide an account of the patterns of negation we find in natural language. The expression and interpretation of negation in natural language has long fascinated philosophers, logicians, and linguists. Horn's (1989) Natural history of negation opens with the following statement: "All human systems of communication contain a representation of negation. No animal communication system includes negative utterances, and consequently, none possesses a means for assigning truth value, for lying, for irony, or for coping with false or contradictory statements." A bit further on the first page, Horn states: "Despite the simplicity of the one-place connective of propositional logic (¬p is true if and only if p is not true) and of the laws of inference in which it participates (e.g.
  • A Non-Classical Refinement of the Interpolation Property for Classical Propositional Logic
    – Accepted for publication in Logique & Analyse – A non-classical refinement of the interpolation property for classical propositional logic. Peter Milne. Abstract. We refine the interpolation property of the {∧, ∨, ¬}-fragment of classical propositional logic, showing that if ⊭ ¬φ, ⊭ ψ and φ ⊨ ψ then there is an interpolant χ, constructed using at most atomic formulas occurring in both φ and ψ and negation, conjunction and disjunction, such that (i) φ entails χ in Kleene's strong three-valued logic and (ii) χ entails ψ in Priest's Logic of Paradox. Keywords: Interpolation theorem for classical propositional logic · Kleene's strong 3-valued logic · Priest's Logic of Paradox. 1 Introduction. Suppose that φ classically entails ψ, that φ is not a classical contradiction and that ψ is not a classical tautology. Then φ and ψ must share non-logical vocabulary, for else one could make φ true and ψ false at the same time. That there must be some overlap in non-logical vocabulary between premise and conclusion is obvious. Possession of the interpolation property takes this line of thought much further: If ⊭ ¬φ, ⊭ ψ and φ ⊨ ψ then there is a formula χ containing only atomic formulas common to φ and ψ and such that φ ⊨ χ and χ ⊨ ψ.[1] [1] Surprisingly few textbooks prove this theorem. (Hodges 2001) and (Hunter 1971) are exceptions. They prove it in the slightly stronger form: if φ ⊨ ψ and at least one atomic formula is common to φ and ψ, then there is a formula χ containing only atomic formulas common to φ and ψ and such that φ ⊨ χ and χ ⊨ ψ.
  • Chapter 10: Symbolic Trails and Formal Proofs of Validity, Part 2
    Essential Logic, Ronald C. Pine. CHAPTER 10: SYMBOLIC TRAILS AND FORMAL PROOFS OF VALIDITY, PART 2. Introduction. In the previous chapter there were many frustrating signs that something was wrong with our formal proof method that relied on only nine elementary rules of validity. Very simple, intuitive valid arguments could not be shown to be valid. For instance, the following intuitively valid arguments cannot be shown to be valid using only the nine rules. Somalia and Iran are both foreign policy risks. Therefore, Iran is a foreign policy risk. S • I / I. Either Obama or McCain was President of the United States in 2009.[1] McCain was not President in 2010. So, Obama was President of the United States in 2010. (O v C) • ~(O • C), ~C / O. If the computer networking system works, then Johnson and Kaneshiro will both be connected to the home office. Therefore, if the networking system works, Johnson will be connected to the home office. N ⊃ (J • K) / N ⊃ J. Either the Start II treaty is ratified or this landmark treaty will not be worth the paper it is written on. Therefore, if the Start II treaty is not ratified, this landmark treaty will not be worth the paper it is written on. R v ~W / ~R ⊃ ~W. If the light is on, then the light switch must be on. So, if the light switch is not on, then the light is not on. L ⊃ S / ~S ⊃ ~L. Thus, the nine elementary rules of validity covered in the previous chapter must be only part of a complete system for constructing formal proofs of validity. [1] This or statement is obviously exclusive, so note the translation.
  • List of Rules of Inference
    List of rules of inference. This is a list of rules of inference, logical laws that relate to mathematical formulae. Introduction. Rules of inference are syntactical transform rules which one can use to infer a conclusion from a premise to create an argument. A set of rules can be used to infer any valid conclusion if it is complete, while never inferring an invalid conclusion, if it is sound. A sound and complete set of rules need not include every rule in the following list, as many of the rules are redundant, and can be proven with the other rules. Discharge rules permit inference from a subderivation based on a temporary assumption. Rules for classical sentential calculus (sentential calculus is also known as propositional calculus). Rules for negations: Reductio ad absurdum (or Negation Introduction); Reductio ad absurdum (related to the law of excluded middle); Noncontradiction (or Negation Elimination); Double negation elimination; Double negation introduction. Rules for conditionals: Deduction theorem (or Conditional Introduction); Modus ponens (or Conditional Elimination); Modus tollens. Rules for conjunctions: Adjunction (or Conjunction Introduction); Simplification (or Conjunction Elimination). Rules for disjunctions: Addition (or Disjunction Introduction); Separation of Cases (or Disjunction Elimination); Disjunctive syllogism. Rules for biconditionals: Biconditional introduction; Biconditional elimination. Rules of classical predicate calculus: Universal Introduction (or Universal Generalization), subject to a restriction on where the generalized variable may occur.
  • The Law of Non-Contradiction as a Metaphysical Principle
    The Law of Non-Contradiction as a Metaphysical Principle. Tuomas E. Tahko, Durham University, http://www.ttahko.net/, [email protected]. Received by Greg Restall. Published June 11, 2009. http://www.philosophy.unimelb.edu.au/ajl/2009 © 2009 Tuomas E. Tahko. Abstract: The goals of this paper are two-fold: I wish to clarify the Aristotelian conception of the law of non-contradiction as a metaphysical rather than a semantic or logical principle, and to defend the truth of the principle in this sense. First I will explain what it in fact means that the law of non-contradiction is a metaphysical principle. The core idea is that the law of non-contradiction is a general principle derived from how things are in the world. For example, there are certain constraints as to what kind of properties an object can have, and especially: some of these properties are mutually exclusive. Given this characterisation, I will advance to examine what kind of challenges the law of non-contradiction faces—the main opponent here is Graham Priest. I will consider these challenges and conclude that they do not threaten the truth of the law of non-contradiction understood as a metaphysical principle. 1 Introduction. The purpose of this paper is to defend the idea that the law of non-contradiction (LNC) is a metaphysical rather than a logical principle.[1] I will also defend the status of LNC as the best candidate for a fundamental metaphysical principle—if there are any principles which constrain the structure of reality, then LNC is certainly our most likely candidate.[2] Some challenges to this view [1] The idea has its roots in Aristotle's Metaphysics, see also Politis (2004: ch.
  • Mathematical Semantics of Intuitionistic Logic
    MATHEMATICAL SEMANTICS OF INTUITIONISTIC LOGIC. SERGEY A. MELIKHOV. Abstract. This work is a mathematician's attempt to understand intuitionistic logic. It can be read in two ways: as a research paper interspersed with lengthy digressions into rethinking of standard material; or as an elementary (but highly unconventional) introduction to first-order intuitionistic logic. For the latter purpose, no training in formal logic is required, but a modest literacy in mathematics, such as topological spaces and posets, is assumed. The main theme of this work is the search for a formal semantics adequate to Kolmogorov's informal interpretation of intuitionistic logic (whose simplest part is more or less the same as the so-called BHK interpretation). This search goes beyond the usual model theory, based on Tarski's notion of semantic consequence, and beyond the usual formalism of first-order logic, based on schemata. Thus we study formal semantics of a simplified version of Paulson's meta-logic, used in the Isabelle prover. By interpreting the meta-logical connectives and quantifiers constructively, we get a generalized model theory, which covers, in particular, realizability-type interpretations of intuitionistic logic. On the other hand, by analyzing Kolmogorov's notion of semantic consequence (which is an alternative to Tarski's standard notion), we get an alternative model theory. By using an extension of the meta-logic, we further get a generalized alternative model theory, which suffices to formalize Kolmogorov's semantics. On the other hand, we also formulate a modification of Kolmogorov's interpretation, which is compatible with the usual, Tarski-style model theory.
  • Defining Double Negation Elimination
    Defining Double Negation Elimination. GREG RESTALL, Department of Philosophy, Macquarie University, Sydney 2109, Australia. Email: [email protected]. Web: http://www.phil.mq.edu.au/staff/grestall/ Abstract. In his paper "Generalised Ortho Negation" [2] J. Michael Dunn mentions a claim of mine to the effect that there is no condition on 'perp frames' equivalent to the holding of double negation elimination ∼∼A ⊢ A. That claim is wrong. In this paper I correct my error and analyse the behaviour of conditions on frames for negations which verify a number of different theses.[1] 1 Compatibility Frames. Dunn's work on general models for negation has been a significant advance in our understanding of negation in non-classical logics [1, 2]. These models generalise Kripke models for intuitionistic logic and Routley–Meyer models for relevant implication. I will recount the essential details of these models for negation here before we look at the behaviour of inference patterns such as double negation elimination. Definition 1.1. A frame is a triple ⟨P, C, ⊑⟩ consisting of a set P of points, and two binary relations C and ⊑ on P such that: • ⊑ is a partial order on P. That is, ⊑ is reflexive, transitive and antisymmetric on P. • C is antitone in both places. That is, for any x and y in P, if xCy, x′ ⊑ x and y′ ⊑ y, then x′Cy′. The relation ⊑ between points may be interpreted as one of information inclusion. A point x is informationally included in y if everything warranted by x (or encoded by x or made true by x or included in x or however else information is thought to be related to points) is also warranted by y.
  • A Variant of the Double-Negation Translation∗
    A variant of the double-negation translation. Jeremy Avigad, August 21, 2006. Abstract. An efficient variant of the double-negation translation explains the relationship between Shoenfield's and Gödel's versions of the Dialectica interpretation. Fix a classical first-order language, based on the connectives ∨, ∧, ¬, and ∀. We will define a translation to intuitionistic (even minimal) logic, based on the usual connectives. The translation sends each formula ϕ to ¬ϕ∗, where ϕ∗ is supposed to represent an intuitionistic version of the negation of ϕ. The map from ϕ to ϕ∗ is defined recursively, as follows: ϕ∗ = ¬ϕ, when ϕ is atomic; (¬ϕ)∗ = ¬ϕ∗; (ϕ ∨ ψ)∗ = ϕ∗ ∧ ψ∗; (ϕ ∧ ψ)∗ = ϕ∗ ∨ ψ∗; (∀x ϕ)∗ = ∃x ϕ∗. Note that we can eliminate either ∨ or ∧ and retain a complete set of connectives. If Γ is the set of classical formulas {ϕ1, . . . , ϕk}, let Γ∗ denote the set of translated formulas {¬ϕ1∗, . . . , ¬ϕk∗}. The main theorem of this note is the following: Theorem 0.1. 1. Classical logic proves ϕ ↔ ¬ϕ∗. 2. If ϕ is provable from Γ in classical logic, then ¬ϕ∗ is provable from Γ∗ in minimal logic. Note that both these claims hold for the usual Gödel–Gentzen translation ϕ ↦ ϕ^N. Thus the theorem is a consequence of the following lemma: Lemma 0.2. For every ϕ, minimal logic proves ¬ϕ∗ ↔ ϕ^N. (This note was written in response to a query from Grigori Mints. After circulating a draft, I learned that Ulrich Kohlenbach and Thomas Streicher had hit upon the same solution, and that the version of the double-negation translation described below is due to Jean-Louis Krivine; see [3].)
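    The recursive clauses above translate directly into a few lines of code. The sketch below uses a tuple encoding of my own (nothing here comes from the note itself beyond the clauses): star computes ϕ∗ and translate returns ¬ϕ∗.

        # A sketch of the map phi -> phi* from the note, under my own encoding:
        # atoms are strings; ("not", A), ("or", A, B), ("and", A, B), ("forall", x, A).
        def star(phi):
            if isinstance(phi, str):                 # atomic: phi* = ~phi
                return ("not", phi)
            op = phi[0]
            if op == "not":                          # (~phi)* = ~(phi*)
                return ("not", star(phi[1]))
            if op == "or":                           # (phi v psi)* = phi* & psi*
                return ("and", star(phi[1]), star(phi[2]))
            if op == "and":                          # (phi & psi)* = phi* v psi*
                return ("or", star(phi[1]), star(phi[2]))
            if op == "forall":                       # (forall x phi)* = exists x phi*
                return ("exists", phi[1], star(phi[2]))
            raise ValueError("unknown connective: %r" % (op,))

        def translate(phi):
            # The full translation sends phi to ~(phi*).
            return ("not", star(phi))

        # Example: forall x (P(x) v ~P(x)) becomes ~ exists x (~P(x) & ~~P(x)).
        print(translate(("forall", "x", ("or", "P(x)", ("not", "P(x)")))))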