Proofs as Cryptography: A New Interpretation of the Curry-Howard Isomorphism for Software Certificates
Amrit Kumar, Pierre-Alain Fouque, Thomas Genet, Mehdi Tibouchi
16 pages, PDF, 1020 KB
Recommended publications
-
Ecumenical Modal Logic EK
Sonia Marin (Department of Computer Science, University College London, UK), Luiz Carlos Pereira (Philosophy Department, PUC-Rio, Brazil), Elaine Pimentel (Department of Mathematics, UFRN, Brazil), and Emerson Sales (Graduate Program of Applied Mathematics and Statistics, UFRN, Brazil). arXiv:2005.14325 [cs.LO], 28 May 2020.
Abstract: The discussion about how to put together Gentzen's systems for classical and intuitionistic logic in a single unified system is back in fashion. Indeed, recently Prawitz and others have been discussing the so-called Ecumenical systems, where connectives from these logics can coexist in peace. In Prawitz's system, proposed in [21], the classical logician and the intuitionistic logician would share the universal quantifier, conjunction, negation, and the constant for the absurd (the neutral connectives), but they would each have their own existential quantifier, disjunction, and implication, with different meanings. Prawitz's main idea is that these different meanings are given by a semantical framework that can be accepted by both parties. In a recent work, Ecumenical sequent calculi and a nested system were presented, and some very interesting proof-theoretical properties of the systems were established. In this work we extend Prawitz's Ecumenical idea to alethic K-modalities.
-
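Because Prawitz's classical connectives are defined from the shared neutral fragment, the idea admits a short executable sketch. The encoding below is an illustrative assumption, not the paper's calculus: classical implication is read as not(A and not B) and classical disjunction as not(not A and not B), while the intuitionistic connectives would remain primitive.

```python
# Hypothetical encoding of the Ecumenical idea: classical implication and
# disjunction are *defined* from the shared neutral fragment (negation and
# conjunction). An illustration only, not the paper's exact system.
from dataclasses import dataclass

@dataclass(frozen=True)
class Var:
    name: str

@dataclass(frozen=True)
class And:
    left: object
    right: object

@dataclass(frozen=True)
class Not:
    arg: object

@dataclass(frozen=True)
class ImpC:  # classical implication, to be expanded
    left: object
    right: object

@dataclass(frozen=True)
class OrC:   # classical disjunction, to be expanded
    left: object
    right: object

def expand(f):
    """Rewrite the classical connectives into the neutral fragment."""
    if isinstance(f, Var):
        return f
    if isinstance(f, And):
        return And(expand(f.left), expand(f.right))
    if isinstance(f, Not):
        return Not(expand(f.arg))
    if isinstance(f, ImpC):  # A ->c B  :=  not(A and not B)
        return Not(And(expand(f.left), Not(expand(f.right))))
    if isinstance(f, OrC):   # A orc B  :=  not(not A and not B)
        return Not(And(Not(expand(f.left)), Not(expand(f.right))))
    raise TypeError(f"unknown formula: {f!r}")
```

The intuitionistic logician rejects these definitions as meanings for her own implication and disjunction, which is exactly why the Ecumenical system keeps separate connectives.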
An Information Theoretic Approach to the Expressiveness of Programming Languages
Joseph Ray Davidson. PhD thesis, School of Computing Science, College of Science and Engineering, University of Glasgow, July 2015.
Abstract: The conciseness conjecture is a longstanding notion in computer science that programming languages with more built-in operators, that is, more expressive languages with larger semantics, produce smaller programs on average. Chaitin defines the related concept of an elegant program: one such that no smaller program in some language produces the same output when run. This thesis investigates the conciseness conjecture in an empirical manner. Influenced by the concept of elegant programs, we investigate several models of computation and implement a set of functions in each programming model. The programming models are Turing Machines, λ-calculus, SKI, RASP, RASP2, and RASP3. The information content of the programs and models is measured as characters. They are compared to investigate hypotheses relating to how the mean program size changes as the size of the semantics changes, and how the relationship of mean program sizes between two models compares to that between the sizes of their semantics. We show that the amount of information present in models of the same paradigm, or model family, is a good indication of relative expressivity and average program size.
-
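The thesis's measurement (character counts as a proxy for information content, averaged over a corpus of functions per model) can be sketched in miniature. The corpus below is an illustrative assumption, not the thesis's benchmark set; the SKI encodings of the λ-terms are the standard ones (I, K, KI, and B = S(KS)K for composition).

```python
# Toy corpus: the same four functions written in two models, with mean
# character count standing in for mean program size. Illustrative only.
programs = {
    "identity": {"lambda": r"\x.x",           "ski": "I"},
    "true":     {"lambda": r"\x.\y.x",        "ski": "K"},
    "false":    {"lambda": r"\x.\y.y",        "ski": "KI"},
    "compose":  {"lambda": r"\f.\g.\x.f(gx)", "ski": "S(KS)K"},
}

def mean_size(model):
    """Mean program size, in characters, for one model of computation."""
    sizes = [len(impl[model]) for impl in programs.values()]
    return sum(sizes) / len(sizes)
```

On this tiny corpus the SKI programs are shorter on average, consistent with SKI's larger built-in operator set doing work that λ-terms must spell out.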
Deductive Systems in Traditional and Modern Logic
Edited by Urszula Wybraniec-Skardowska (Department of Philosophy, Cardinal Stefan Wyszyński University in Warsaw, Poland) and Alex Citkin (Metropolitan Telecommunications, USA). Printed edition of the Special Issue published in the open access journal Axioms (ISSN 2075-1680), MDPI, Basel, 2020. ISBN 978-3-03943-358-2 (Pbk), ISBN 978-3-03943-359-9 (PDF). Articles are Open Access under the Creative Commons Attribution (CC BY) license; the book as a whole is distributed under CC BY-NC-ND.
-
The Deduction Theorem (Before and After Herbrand)
Curtis Franks.
1. Preview. Attempts to articulate the real meaning or ultimate significance of a famous theorem comprise a major vein of philosophical writing about mathematics. The subfield of mathematical logic has supplied more than its fair share of case studies to this genre, Gödel's (1931) incompleteness theorem being probably the most frequent exhibit. This note is about the Deduction Theorem, another result from mathematical logic, of roughly the same vintage. I aim to make clear, in the simplest possible terms, what the theorem says in the several guises it has taken, how to prove it in each associated framework, and especially the role it has played in logic's development over nearly two centuries. But do not expect the theorem to submit, here, to anything like a final analysis. I intend the exercise to serve as an ancillary to a thesis contrary to such ambitions: that the meaning of important mathematics is unfixed, expanding over time, informed by new perspectives and theoretical advances, and in turn giving rise to new developments. I want to avoid any impression that the Deduction Theorem is the sort of thing that can be fully understood. Ideally, familiarity with the ways in which it has taken on new meaning over time will prepare readers for its continued reinvention, even to be the authors of its future embellishments. Other histories of the Deduction Theorem (Pogorzelski 1968, Porte 1982, Czelakowski 1985) have been written, but our focus will differ from theirs. Rather than document when the theorem was proved for different systems, when its minimal conditions were specified, or how it was generalized from the setting of sentential logic to larger classes of algebraic structures, we will not look beyond the basic phenomenon.
-
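In its classical propositional guise the theorem says that Γ ∪ {A} ⊢ B exactly when Γ ⊢ A → B. Its semantic counterpart can be checked by brute force over all valuations; the sketch below does that (it illustrates the statement, not any of the proof-theoretic arguments the note surveys, and the formula representation is our own assumption).

```python
# Semantic counterpart of the Deduction Theorem for classical propositional
# logic: G, A |= B  iff  G |= A -> B, checked over all valuations.
from itertools import product

def ev(f, v):
    """Evaluate a formula (a variable name or an (op, args...) tuple)."""
    if isinstance(f, str):
        return v[f]
    op, *args = f
    if op == "not":
        return not ev(args[0], v)
    if op == "and":
        return ev(args[0], v) and ev(args[1], v)
    if op == "imp":
        return (not ev(args[0], v)) or ev(args[1], v)
    raise ValueError(op)

def variables(f, acc=None):
    acc = set() if acc is None else acc
    if isinstance(f, str):
        acc.add(f)
    else:
        for a in f[1:]:
            variables(a, acc)
    return acc

def entails(gamma, b):
    """True when every valuation satisfying all of gamma satisfies b."""
    vs = sorted(set().union(*(variables(f) for f in gamma + [b])))
    for bits in product([False, True], repeat=len(vs)):
        v = dict(zip(vs, bits))
        if all(ev(g, v) for g in gamma) and not ev(b, v):
            return False
    return True

def deduction_theorem_holds(gamma, a, b):
    return entails(gamma + [a], b) == entails(gamma, ("imp", a, b))
```

The equivalence holds in both directions: discharging a hypothesis into an implication never changes what is entailed.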
Enriched Lawvere Theories for Operational Semantics
John C. Baez and Christian Williams, Department of Mathematics, U. C. Riverside, Riverside, CA 92521, USA. [email protected], [email protected]
Abstract: Enriched Lawvere theories are a generalization of Lawvere theories that allow us to describe the operational semantics of formal systems. For example, a graph-enriched Lawvere theory describes structures that have a graph of operations of each arity, where the vertices are operations and the edges are rewrites between operations. Enriched theories can be used to equip systems with operational semantics, and maps between enriching categories can serve to translate between different forms of operational and denotational semantics. The Grothendieck construction lets us study all models of all enriched theories in all contexts in a single category. We illustrate these ideas with the SKI-combinator calculus, a variable-free version of the lambda calculus.
1. Introduction. Formal systems are not always explicitly connected to how they operate in practice. Lawvere theories [20] are an excellent formalism for describing algebraic structures obeying equational laws, but they do not specify how to compute in such a structure, for example taking a complex expression and simplifying it using rewrite rules. Recall that a Lawvere theory is a category with finite products T generated by a single object t, for "type", and morphisms t^n → t representing n-ary operations, with commutative diagrams specifying equations. There is a theory for groups, a theory for rings, and so on. We can specify algebraic structures of a given kind in some category C with finite products by a product-preserving functor m : T → C.
-
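The SKI rewrites the paper treats as edges of a graph-enriched theory are the standard ones: I x → x, K x y → x, and S x y z → x z (y z). A minimal rewriter (our own sketch, not the paper's categorical machinery) iterates these rules to a normal form.

```python
# Minimal SKI-combinator rewriter. Terms are the strings "S", "K", "I"
# (or inert variables) and 2-tuples (f, x) for application.
def step(t):
    """Return one reduct of t, or None if t contains no redex."""
    if isinstance(t, tuple):
        f, x = t
        if f == "I":                                  # I x -> x
            return x
        if isinstance(f, tuple) and f[0] == "K":      # K a b -> a
            return f[1]
        if (isinstance(f, tuple) and isinstance(f[0], tuple)
                and f[0][0] == "S"):                  # S g y x -> g x (y x)
            g, y = f[0][1], f[1]
            return ((g, x), (y, x))
        for i, sub in enumerate(t):                   # otherwise recurse
            r = step(sub)
            if r is not None:
                return (r, x) if i == 0 else (f, r)
    return None

def normalize(t, fuel=1000):
    """Iterate step() to a normal form, with a fuel bound for safety."""
    while fuel > 0:
        r = step(t)
        if r is None:
            return t
        t, fuel = r, fuel - 1
    raise RuntimeError("no normal form within fuel")
```

For instance S K K behaves as the identity: applied to an inert variable "x" it normalizes back to "x", the classic derivation of I from S and K.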
The Proscriptive Principle and Logics of Analytic Implication
Thomas Macaulay Ferguson. A dissertation submitted to the Graduate Faculty in Philosophy in partial fulfillment of the requirements for the degree of Doctor of Philosophy, The Graduate Center, City University of New York, February 2017. Adviser: Professor Graham Priest; examining committee chaired by Professor Sergei Artemov, with Professors Heinrich Wansing and Kit Fine. Available via CUNY Academic Works: https://academicworks.cuny.edu/gc_etds/1882
Abstract: The analogy between inference and mereological containment goes at least back to Aristotle, whose discussion in the Prior Analytics motivates the validity of the syllogism by way of talk of parts and wholes. On this picture, the application of syllogistic is merely the analysis of concepts, a term that presupposes, through the root ἀνά + λύω, a mereological background.
-
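The containment picture becomes concrete in Parry's proscriptive principle for analytic implication: an implication A → B is admitted only when every propositional variable of B already occurs in A, so the consequent is "contained in" the antecedent. The checker below is a simplified rendering of that idea (classical validity plus variable containment), our own sketch rather than any system from the dissertation.

```python
# Parry-style analytic implication, simplified: classical entailment plus
# the proscriptive requirement vars(B) <= vars(A). Illustrative only.
from itertools import product

def vars_of(f):
    if isinstance(f, str):
        return {f}
    return set().union(*(vars_of(a) for a in f[1:]))

def holds(f, v):
    if isinstance(f, str):
        return v[f]
    op, *args = f
    if op == "not":
        return not holds(args[0], v)
    if op == "and":
        return holds(args[0], v) and holds(args[1], v)
    if op == "or":
        return holds(args[0], v) or holds(args[1], v)
    raise ValueError(op)

def classically_implies(a, b):
    vs = sorted(vars_of(a) | vars_of(b))
    return all(holds(b, dict(zip(vs, bits)))
               for bits in product([False, True], repeat=len(vs))
               if holds(a, dict(zip(vs, bits))))

def analytically_implies(a, b):
    return vars_of(b) <= vars_of(a) and classically_implies(a, b)
```

The signature effect is that disjunction introduction fails: p classically implies p or q, but not analytically, since q is no part of p.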
Optimal Program Variant Generation for Hybrid Manycore Systems
Urlea, Cristian (2021). PhD thesis, School of Engineering, College of Science and Engineering, University of Glasgow, January 2021. http://theses.gla.ac.uk/81928/
Abstract: Field Programmable Gate Arrays (FPGAs) promise to deliver superior energy efficiency in heterogeneous high performance computing (HPC), as compared to multicore CPUs and GPUs. The rate of adoption is however hampered by the relative difficulty of programming FPGAs. High-level synthesis tools such as Xilinx Vivado, Altera OpenCL or Intel's HLS address a large part of the programmability issue by synthesizing a Hardware Description Language (HDL) representation from a high-level specification of the application, given in programming languages such as OpenCL C, typically used to program CPUs and GPUs. Although HLS solutions make programming easier, they fail to also lighten the burden of optimization.
-
Double-Negation Elimination in Some Propositional Logics
Michael Beeson (San Jose State University), Robert Veroff (University of New Mexico), and Larry Wos (Mathematics and Computer Science Division, Argonne National Laboratory). December 19, 2004.
Abstract: This article answers two questions (posed in the literature), each concerning the guaranteed existence of proofs free of double negation. A proof is free of double negation if none of its deduced steps contains a term of the form n(n(t)) for some term t, where n denotes negation. The first question asks for conditions on the hypotheses that, if satisfied, guarantee the existence of a double-negation-free proof when the conclusion is free of double negation. The second question asks about the existence of an axiom system for classical propositional calculus whose use, for theorems with a conclusion free of double negation, guarantees the existence of a double-negation-free proof. After giving conditions that answer the first question, we answer the second question by focusing on the Łukasiewicz three-axiom system. We then extend our studies to infinite-valued sentential calculus and to intuitionistic logic and generalize the notion of being double-negation free. The double-negation proofs of interest rely exclusively on the inference rule condensed detachment, a rule that combines modus ponens with an appropriately general rule of substitution. The automated reasoning program Otter played an indispensable role in this study.
1. Origin of the Study. This article features the culmination of a study whose origin rests equally with two questions, the first posed in Studia Logica [2] and the second (closely related to the first) posed in the Journal of Automated Reasoning [15].
-
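The property the article studies, that no deduced step contains a subterm n(n(t)), is directly checkable on term representations of proof steps. A small sketch (the tuple encoding is our own assumption):

```python
# A proof step is double-negation free when no subterm has the form n(n(t)).
# Terms are variable names or (op, args...) tuples, with "n" for negation.
def double_negation_free(t):
    if not isinstance(t, tuple):
        return True
    op, *args = t
    if op == "n" and isinstance(args[0], tuple) and args[0][0] == "n":
        return False  # found n(n(t))
    return all(double_negation_free(a) for a in args)
```

A double-negation-free proof is then simply one all of whose deduced steps pass this check.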
Visualizing the Turing Tarpit
Jason Hemann and Eric Holk, Indiana University. {jhemann,eholk}@cs.indiana.edu
Abstract: Minimalist programming languages like Jot have limited interest outside of the community of languages enthusiasts. This is a shame, as the simplicity of these languages ascribes to them an inherent beauty and provides deep insight into the nature of computation. We present a fun way of visualizing the behavior of many Jot programs at once, providing interesting images and also hinting at somewhat non-obvious relationships between programs. In the same way that research into fractals yielded new mathematical insights, visualizations such as those presented here could yield new insights into the structure and nature of computation.
In this paper we begin by refreshing some of the theory behind this work, including the λ-calculus and its various reduction strategies, universal combinators and their application to languages like Iota and Jot (Section 2). Using this, we devise several strategies for plotting the behavior of large numbers of Jot programs (Section 3). We make several observations about the resulting visualizations (Section 4), describe related work in the field (Section 5), and suggest possibilities for future work (Section 6).
Categories and Subject Descriptors: F.4.1 [Mathematical Logic]: Lambda calculus and related systems; D.2.3 [Coding Tools and Techniques]: Pretty printers. General Terms: language, visualization. Keywords: Jot, reduction, tarpit, Iota.
1. Introduction. A Turing tarpit is a programming language that is Turing complete, but so bereft of features that it is impractical for use in actual programming tasks, often even for those considered trivial in more conventional languages.
-
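Jot's appeal for this kind of mass visualization is that every binary string is a valid program. Under the usual definition as we recall it (an assumption worth double-checking against the paper: the empty program denotes the identity, F0 denotes [F] applied to S then K, and F1 denotes λx.λy.[F](x y), i.e. S (K [F])), decoding a program into a combinator expression is a one-pass fold. The sketch below builds the expression symbolically as a string, without reducing it.

```python
# Decode a Jot program (a string over {0,1}) into a symbolic SKI-style
# expression, under the standard reading of Jot as we recall it:
#   []   = I,   [F0] = ((F S) K),   [F1] = (S (K F)).
def jot_to_ski(program):
    term = "I"
    for bit in program:
        if bit == "0":
            term = f"(({term} S) K)"
        elif bit == "1":
            term = f"(S (K {term}))"
        else:
            raise ValueError("Jot programs are strings over {0,1}")
    return term
```

Enumerating all n-bit strings and normalizing (or timing out) each decoded term is then one plausible route to plotting large numbers of Jot programs at once.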
Logic as a Distributive Law
Michael Stay (Pyrofex Corp.) and L.G. Meredith (Synereo, Ltd). arXiv:1610.02247 [cs.LO], 16 Oct 2016.
Abstract: We present an algorithm for deriving a spatial-behavioral type system from a formal presentation of a computational calculus. Given a 2-monad Calc : Cat → Cat for the free calculus on a category of terms and rewrites and a 2-monad BoolAlg for the free Boolean algebra on a category, we get a 2-monad Form = BoolAlg + Calc for the free category of formulae and proofs. We also get the 2-monad BoolAlg ◦ Calc for subsets of terms. The interpretation of formulae is a natural transformation ⟦−⟧ : Form ⇒ BoolAlg ◦ Calc defined by the units and multiplications of the monads and a distributive law transformation δ : Calc ◦ BoolAlg ⇒ BoolAlg ◦ Calc. This interpretation is consistent both with the Curry-Howard isomorphism and with realizability. We give an implementation of the "possibly" modal operator parametrized by a two-hole term context and show that, surprisingly, the arrow type constructor in the λ-calculus is a specific case. We also exhibit nontrivial formulae encoding confinement and liveness properties for a reflective higher-order variant of the π-calculus.
1. Introduction. Sylvester coined the term "universal algebra" to describe the idea of expressing a mathematical structure as a set equipped with functions satisfying equations; the idea itself was first developed by Hamilton and de Morgan [7]. Most modern programming languages have a notion of a "module" or an "interface" with this same information; for example, consider this Agda definition of a monoid.
-
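The "possibly" modality that interprets formulae over terms-with-rewrites can be sketched as a predicate transformer: ⟨⟩φ holds of a term that can reach, in zero or more rewrite steps, some term satisfying φ. The toy finite rewrite relation below (a plain dict) is our own assumption, standing in for the calculus; the paper's construction via monads and a distributive law is far more general.

```python
# "Possibly" as a predicate transformer over a rewrite relation, given as
# a dict mapping each term to its one-step reducts. Toy sketch only.
def diamond(phi, rewrites, term, seen=None):
    """True when term can rewrite (in >= 0 steps) to a term satisfying phi."""
    seen = set() if seen is None else seen
    if term in seen:
        return False  # already explored; avoids looping on cyclic rewrites
    seen.add(term)
    if phi(term):
        return True
    return any(diamond(phi, rewrites, t2, seen)
               for t2 in rewrites.get(term, ()))
```

The dual "necessarily" would quantify over all rewrite paths instead of asking for one witness.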
Download/Pdf/81105779.Pdf
Enriched Lawvere Theories for Operational Semantics (UC Riverside Previously Published Works)
John C. Baez and Christian Williams. Published 2020-09-15. DOI: 10.4204/eptcs.323.8. Permalink: https://escholarship.org/uc/item/3fz1k4k1. Peer reviewed; hosted by eScholarship, California Digital Library. This record carries the same abstract as the "Enriched Lawvere Theories for Operational Semantics" entry listed above.
-
A Formal System: Rigorous Constructions of Computer Models
G. Pantelis. arXiv:1510.04469 [cs.LO], 13 Apr 2017.
Preface: This book draws upon a number of converging ideas that have emerged over recent decades from various researchers involved in the construction of computer models. These ideas are challenging the dominant paradigm where a computer model is constructed as an attempt to provide a discrete approximation of some continuum theory. For reasons discussed in the first chapter, there is an argument that supports a departure from this paradigm towards the construction of discrete models based on simple deterministic rules. Although still limited in their use in the sciences, these models are producing results that show promise and cannot be easily dismissed. But one can take this one step further and argue that such discrete models not only provide alternative tools for simulation but in themselves can be used as a new language that describes real world systems. The question arises as to how one can build a solid foundation for validating such discrete models, both as a simulation tool as well as a language that describes the laws that govern the application. It appears that these two aspects of the model are highly linked and rely heavily upon a single overriding property, namely that of computability. Encouraged by current trends in theoretical physics, we are particularly interested in dynamical systems that model the flow and interaction of information. The state variables of such systems can only take on a finite number of assigned integer or rational values and are subject to the law of conservation of information.
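For a deterministic rule on a finite state space, conservation of information amounts to the update map being injective (hence bijective), so distinct states never merge and the past stays recoverable. The particular rule below, an invertible affine map modulo a prime, is our own illustrative assumption, not a system from the book.

```python
# Information-conserving dynamics in miniature: a deterministic update on a
# finite state space conserves information exactly when it is a bijection.
P = 11  # prime modulus, so multiplication by a nonzero constant is invertible

def update(x):
    return (3 * x + 5) % P

def update_inverse(y):
    # 4 is the inverse of 3 modulo 11, since 3 * 4 = 12 = 1 (mod 11)
    return (4 * (y - 5)) % P

def conserves_information(f, states):
    """A finite map loses no information iff it maps states one-to-one."""
    return len({f(s) for s in states}) == len(states)
```

A lossy rule such as x -> x // 2 would fail the check: two predecessor states collapse into one, and the prior state cannot be reconstructed.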