The History of Standard ML


DAVID MACQUEEN, University of Chicago, USA
ROBERT HARPER, Carnegie Mellon University, USA
JOHN REPPY, University of Chicago, USA
Shepherd: Kim Bruce, Pomona College, USA

The ML family of strict functional languages, which includes F#, OCaml, and Standard ML, evolved from the Meta Language of the LCF theorem proving system developed by Robin Milner and his research group at the University of Edinburgh in the 1970s. This paper focuses on the history of Standard ML, which plays a central rôle in this family of languages, as it was the first to include the complete set of features that we now associate with the name “ML” (i.e., polymorphic type inference, datatypes with pattern matching, modules, exceptions, and mutable state).

Standard ML, and the ML family of languages, have had enormous influence on the world of programming language design and theory. ML is the foremost exemplar of a functional programming language with strict evaluation (call-by-value) and static typing. The use of parametric polymorphism in its type system, together with the automatic inference of such types, has influenced a wide variety of modern languages (where polymorphism is often referred to as generics). It has popularized the idea of datatypes with associated case analysis by pattern matching. The module system of Standard ML extends the notion of type-level parameterization to large-scale programming with the notion of parametric modules, or functors. Standard ML also set a precedent by being a language whose design included a formal definition with an associated metatheory of mathematical proofs (such as soundness of the type system). A formal definition was one of the explicit goals from the beginning of the project. While some previous languages had rigorous definitions, these definitions were not integral to the design process, and the formal part was limited to the language syntax and possibly dynamic semantics or static semantics, but not both.

The paper covers the early history of ML, the subsequent efforts to define a standard ML language, and the development of its major features and its formal definition. We also review the impact that the language had on programming-language research.

CCS Concepts: • Software and its engineering → Functional languages; Formal language definitions; Polymorphism; Data types and structures; Modules / packages; Abstract data types.

Additional Key Words and Phrases: Standard ML, Language design, Operational semantics, Type checking

ACM Reference Format: David MacQueen, Robert Harper, and John Reppy. 2020. The History of Standard ML. Proc. ACM Program. Lang. 4, HOPL, Article 86 (June 2020), 100 pages. https://doi.org/10.1145/3386336

Authors’ addresses: David MacQueen, Computer Science, University of Chicago, 5730 S. Ellis Avenue, Chicago, IL, 60637, USA, [email protected]; Robert Harper, Computer Science, Carnegie Mellon University, 5000 Forbes Avenue, Pittsburgh, PA, 15213, USA, [email protected]; John Reppy, Computer Science, University of Chicago, 5730 S. Ellis Avenue, Chicago, IL, 60637, USA, [email protected].
© 2020 Copyright held by the owner/author(s). https://doi.org/10.1145/3386336

Contents

Abstract
Contents
1 Introduction: Why Is Standard ML important?
2 Background
2.1 LCF/ML — The Original ML Embedded in the LCF Theorem Prover
2.1.1 Control Structures
2.1.2 Polymorphism and Assignment
2.1.3 Failure and Failure Trapping
2.1.4 Types and Type Operators
2.2 HOPE
2.3 Cardelli ML
2.3.1 Labelled Records and Unions
2.3.2 The ref Type
2.3.3 Declaration Combinators
2.3.4 Modules
3 An Overview of the History of Standard ML
3.1 Early Meetings
3.1.1 November 1982
3.1.2 April 1983
3.1.3 June 1983
3.1.4 June 1984
3.1.5 May 1985
3.2 The Formal Definition
3.3 Standard ML Evolution
3.3.1 ML 2000
3.3.2 The Revised Definition (Standard ML ’97)
4 The SML Type System
4.1 Early History of Type Theory
4.2 Milner’s Type Inference Algorithm
4.3 Type Constructors
4.3.1 Primitive Type Constructors
4.3.2 Type Abbreviations
4.3.3 Datatypes
4.4 The Problem of Effects
4.5 Equality Types
4.6 Overloading
4.7 Understanding Type Errors
5 Modules
5.1 The Basic Design
5.2 The Evolution of the Design of Modules
5.3 Modules and the Definition
5.4 Module Innovations in Version 0.93 of SML/NJ
5.5 Module Changes in Definition (Revised)
5.6 Some Problems and Anomalies in the Module System
5.7 Modules and Separate Compilation
6 The Definition of Standard ML
6.1 Early Work on Language Formalization
6.2 Overview of the Definition
6.3 Early Work on the Semantics
6.4 The Definition
6.5 The Revised Definition
7 Type Theory and New Definitions of Standard ML
7.1 Reformulating the Definition Using Types
7.2 A Mechanized Definition
7.3 Lessons of Mechanization
8 The SML Basis Library
8.1 A Brief History of the Basis Library
8.2 Design Philosophy
8.3 Interaction with the REPL
8.4 Numeric Types
8.4.1 Supporting Multiple Precisions
8.4.2 Conversions Between Types
8.4.3 Real Numbers
8.5 Input/Output
8.6 Sockets
8.7 Slices
8.8 Internationalization and Unicode
8.9 Iterators
8.10 Publication and Future Evolution
8.11 Retrospective
8.12 Participants
9 Conclusion
9.1 Mistakes in the Design of Standard ML
9.2 The Impact of Standard ML
9.2.1 Applications of SML
9.2.2 Language Design
9.2.3 Language Implementation
9.2.4 Concurrency and Parallelism
9.3 The Non-evolution of SML
9.4 Summary
Acknowledgments
A Standard ML Implementations
A.1 Alice
A.2 Edinburgh ML
A.3 HaMLeT
A.4 The ML Kit
A.5 MLton
A.6 MLWorks
A.7 Moscow ML
A.8 Poly ML
A.9 Poplog/SML
A.10 RML
A.11 MLj and SML.NET
A.12 Standard ML of New Jersey
A.13 SML#
A.14 TIL and TILT
References

1 INTRODUCTION: WHY IS STANDARD ML IMPORTANT?

The ML family of languages started with the meta language (hence ML) of the LCF theorem proving system developed by Robin Milner and his research group at Edinburgh University from 1973 through 1978 [Gordon et al. 1978b].
Its design was based on experience with an earlier version of the LCF system that Milner had worked on at Stanford University [Milner 1972]. Here is how Milner succinctly describes ML, the LCF metalanguage, in the introduction to the book Edinburgh LCF [Gordon et al. 1979]:

    … ML is a general purpose programming language. It is derived in different aspects from ISWIM, POP2 and GEDANKEN, and contains perhaps two new features. First, it has an escape and escape trapping mechanism, well-adapted to programming strategies which may be (in fact usually are) inapplicable to certain goals. Second, it has a polymorphic type discipline which combines the flexibility of programming in a typeless language with the security of compile-time type checking (as in other languages, you may also define your own types, which may be abstract and/or recursive); this is what ensures that a well-typed program cannot perform faulty proofs.

This original ML was an embedded language within the interactive theorem proving system LCF, serving as a logically-secure scripting language. Its type system enforced the logical validity of proved theorems, and it enabled the partial automation of the proof process through the definition of proof tactics. It also served as the interactive command language of LCF. The basic framework of the language design was inspired by Peter Landin's ISWIM language from the mid-1960s [1966b], which was, in turn, a sugared syntax for Church's lambda calculus. The ISWIM framework provided higher-order functions, which were used to express proof tactics and their composition. To ISWIM were added a static type system, which could guarantee that programs that purport to produce theorems in the object language actually produce logically valid theorems, and an exception mechanism for working with proof tactics that could fail. ML followed ISWIM in using strict evaluation and allowing impure features, such as assignment and exceptions. Initially the ML language was only available to users of the LCF system, but its features soon evoked interest in a wider community as a potentially general-purpose programming language.
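Milner's summary highlights the two mechanisms that remained central as ML evolved into Standard ML: the escape and escape-trapping construct, which became exceptions and handlers, and the polymorphic type discipline, later joined by datatypes with pattern matching. The fragment below is a small illustrative sketch written in Standard ML syntax rather than LCF/ML syntax (which differed); the function names are invented for the example.

    (* A polymorphic datatype with case analysis by pattern matching. *)
    datatype 'a tree = Leaf | Node of 'a tree * 'a * 'a tree

    (* Type inference assigns treeMap the polymorphic type
       ('a -> 'b) -> 'a tree -> 'b tree without any annotations. *)
    fun treeMap f Leaf = Leaf
      | treeMap f (Node (l, x, r)) = Node (treeMap f l, f x, treeMap f r)

    (* Failure and failure trapping, the descendant of LCF/ML's escape
       and escape-trapping mechanism. *)
    exception Failure of string

    fun head []       = raise Failure "head of empty list"
      | head (x :: _) = x

    (* Trap a failure and try an alternative, loosely in the style of the
       LCF ORELSE tactical. *)
    fun orElse (f, g) x = f x handle Failure _ => g x

    val firstOrZero = orElse (head, fn _ => 0)
    val a = firstOrZero []        (* 0: the failure was trapped *)
    val b = firstOrZero [5, 6]    (* 5 *)
    val t = treeMap (fn n => n + 1) (Node (Leaf, 1, Node (Leaf, 2, Leaf)))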
Recommended publications
  • Functional Programming Is a Style of Programming in Which the Basic Method of Computation Is the Application of Functions to Arguments
    PROGRAMMING IN HASKELL, Chapter 1 - Introduction. What is a Functional Language? Opinions differ, and it is difficult to give a precise definition, but generally speaking: • Functional programming is a style of programming in which the basic method of computation is the application of functions to arguments; • A functional language is one that supports and encourages the functional style. Example: summing the integers 1 to 10 in Java: int total = 0; for (int i = 1; i <= 10; i++) total = total + i; The computation method is variable assignment. Summing the integers 1 to 10 in Haskell: sum [1..10] The computation method is function application. Historical background: 1930s: Alonzo Church develops the lambda calculus, a simple but powerful theory of functions. 1950s: John McCarthy develops Lisp, the first functional language, with some influences from the lambda calculus, but retaining variable assignments. 1960s: Peter Landin develops ISWIM, the first pure functional language, based strongly on the lambda calculus, with no assignments. 1970s: John Backus develops FP, a functional language that emphasizes higher-order functions and reasoning about programs. 1970s: Robin Milner and others develop ML, the first modern functional language, which introduced type inference and polymorphic types. 1970s–1980s: David Turner develops a number of lazy functional languages, culminating in the Miranda system. 1987: An international committee starts the development of Haskell, a standard lazy functional language. 1990s: Phil Wadler and others develop type classes and monads, two of the main innovations of Haskell.
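    For comparison with the Java and Haskell versions above, the same computation in Standard ML, the language whose history the main paper traces (an illustrative addition, not part of the excerpt):

        (* Summing the integers 1 to 10 by folding addition over [1, ..., 10]. *)
        val total = List.foldl (op +) 0 (List.tabulate (10, fn i => i + 1))
        (* total = 55; as in the Haskell version, the computation method is function application. *)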
  • The Effect of Compression and Expansion on Stochastic Reaction Networks
    IMT School for Advanced Studies, Lucca, Italy. The Effect of Compression and Expansion on Stochastic Reaction Networks. PhD Program in Institutions, Markets and Technologies, Curriculum in Computer Science and Systems Engineering (CSSE), XXXI Cycle. By Tabea Waizmann, 2021. The dissertation of Tabea Waizmann is approved. Program Coordinator: Prof. Rocco De Nicola, IMT Lucca. Supervisor: Prof. Mirco Tribastone, IMT Lucca. The dissertation of Tabea Waizmann has been reviewed by: Dr. Catia Trubiani, Gran Sasso Science Institute; Prof. Michele Loreti, University of Camerino. IMT School for Advanced Studies, Lucca, 2021. To everyone who believed in me. Contents: List of Figures; List of Tables; Acknowledgements; Vita and Publications; Abstract; 1 Introduction; 2 Background (2.1 Multisets, 2.2 Reaction Networks, 2.3 Ordinary Lumpability, 2.4 Layered Queuing Networks, 2.5 PEPA); 3 Coarse graining mass-action stochastic reaction networks by species equivalence (3.1 Species Equivalence, 3.1.1 Species equivalence as a generalization of Markov chain ordinary lumpability, 3.1.2 Characterization of SE for mass-action networks, 3.1.3 Computation of the maximal SE and reduced network, 3.2 Applications, 3.2.1 Computational systems biology, 3.2.2 Epidemic processes in networks, 3.3 Discussion); 4 DiffLQN: Differential Equation Analysis of Layered Queuing Networks (4.1 DiffLQN, 4.1.1 Architecture, 4.1.2 Capabilities, 4.1.3 Syntax, 4.2 Case Study: Client-Server dynamics, 4.3 Discussion).
  • Type-Safe Multi-Tier Programming with Standard ML Modules
    Experience Report: Type-Safe Multi-Tier Programming with Standard ML Modules. Martin Elsman, Philip Munksgaard, Ken Friis Larsen (Department of Computer Science, University of Copenhagen, Denmark; iAlpha AG, Switzerland). Abstract: We describe our experience with using Standard ML modules and Standard ML's core language typing features for ensuring type-safety across physically separated tiers in an extensive web application built around a database-backed server platform and advanced client code that accesses the server through asynchronous server requests. 1. Introduction: The iAlpha Platform is an advanced asset management system featuring a number of analytics modules for combining trading […] shared module, it is now possible, on the server, to implement a module that for each function responds to a request by first deserialising the argument into a value, calls the particular function and replies to the client with a serialised version of the result value. Dually, on the client side, for each function, the client code can serialise the argument and send the request asynchronously to the server. When the server responds, the client will deserialise the result value and apply the handler function to the value. Both for the server part and for the client part, the code can be implemented in a type-safe fashion. In particular, if an interface type changes, the programmer will be notified about all type-inconsistencies at compile
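    The pattern described above, a shared module whose functions are wrapped with serialisation on one tier and deserialisation on the other, can be sketched with Standard ML functors. The sketch below is illustrative only: the signature, the wire helpers, and the send function are hypothetical placeholders, not the iAlpha platform's actual API.

        (* A shared interface that both tiers compile against. *)
        signature SHARED_API =
          sig
            val lookupPrice : string -> real
          end

        (* Hypothetical wire helpers; a real system would use proper
           (de)serialisation and asynchronous requests. *)
        fun serialiseReal r = Real.toString r
        fun deserialiseReal s =
              case Real.fromString s of
                  SOME r => r
                | NONE   => raise Fail "malformed response"

        (* Server side: the argument of lookupPrice is already textual, so no
           decoding is needed here; the result is encoded before replying. *)
        functor ServerStub (Impl : SHARED_API) =
          struct
            fun handleRequest (arg : string) : string =
                  serialiseReal (Impl.lookupPrice arg)
          end

        (* Client side: send the textual argument and decode the reply;
           send stands in for an asynchronous server request. *)
        functor ClientStub (val send : string -> string) : SHARED_API =
          struct
            fun lookupPrice sym = deserialiseReal (send sym)
          end

        (* Example instantiation with a stubbed transport. *)
        structure Client = ClientStub (val send = fn _ => "42.0")
        val price = Client.lookupPrice "ACME"   (* 42.0 *)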
  • Foundational Proof Checkers with Small Witnesses
    Foundational Proof Checkers with Small Witnesses. Dinghao Wu, Andrew W. Appel (Princeton University, {dinghao,appel}@cs.princeton.edu), Aaron Stump (Washington University in St. Louis, [email protected]). ABSTRACT: Proof checkers for proof-carrying code (and similar) systems can suffer from two problems: huge proof witnesses and untrustworthy proof rules. No previous design has addressed both of these problems simultaneously. We show the theory, design, and implementation of a proof-checker that permits small proof witnesses and machine-checkable proofs of the soundness of the system. 1. INTRODUCTION: In a proof-carrying code (PCC) system [10], or in other proof-carrying applications [3], an untrusted prover must convince a trusted checker of the validity of a theorem by sending a proof. Two of the potential problems with this approach are that the proofs might be too large, and that the checker might not be trustworthy. […]tion of the proof witness to the size of the binary machine-language program. Necula's LFi proof witnesses were about 4 times as big as the programs whose properties they proved. Pfenning's Elf and Twelf systems [14, 15] are implementations of the Edinburgh Logical Framework. In these systems, proof-search engines can be represented as logic programs, much like (dependently typed, higher-order) Prolog. Elf and Twelf build proof witnesses automatically; if each logic-program clause is viewed as an inference rule, then the proof witness is a trace of the successful goals and subgoals executed by the logic program. That is, the proof witness is a tree whose nodes are the names of clauses and whose subtrees correspond to the subgoals of these clauses. Necula's theorem provers were written in this style, origi-
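    The excerpt describes a proof witness as a tree whose nodes are clause names and whose subtrees correspond to the clause's subgoals. That shape is easy to write down as a datatype; the Standard ML sketch below is a schematic rendering of the idea (with invented rule names), not code from the paper, and the size function is exactly the quantity such systems try to keep small.

        (* A proof witness: the name of the clause (inference rule) used at this
           step, together with witnesses for each of its subgoals. *)
        datatype witness = Step of string * witness list

        (* The witness size that proof-carrying-code systems try to keep small. *)
        fun size (Step (_, subs)) = 1 + List.foldl (fn (w, n) => size w + n) 0 subs

        (* Example: a two-subgoal step whose subgoals are closed by leaf rules. *)
        val example = Step ("and_intro", [Step ("axiom", []), Step ("refl", [])])
        val n = size example   (* 3 *)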
  • The Common HOL Platform
    The Common HOL Platform. Mark Adams (Proof Technologies Ltd, UK; Radboud University, Nijmegen, The Netherlands). The Common HOL project aims to facilitate porting source code and proofs between members of the HOL family of theorem provers. At the heart of the project is the Common HOL Platform, which defines a standard HOL theory and API that aims to be compatible with all HOL systems. So far, HOL Light and hol90 have been adapted for conformance, and HOL Zero was originally developed to conform. In this paper we provide motivation for a platform, give an overview of the Common HOL Platform’s theory and API components, and show how to adapt legacy systems. We also report on the platform’s successful application in the hand-translation of a few thousand lines of source code from HOL Light to HOL Zero. 1 Introduction The HOL family of theorem provers started in the 1980s with HOL88 [5], and has since grown to include many systems, most prominently HOL4 [16], HOL Light [8], ProofPower HOL [3] and Isabelle/HOL [12]. These four main systems have developed their own advanced proof facilities and extensive theory libraries, and have been successfully employed in major projects in the verification of critical hardware and software [1, 11] and the formalisation of mathematics [7]. It would clearly be of benefit if these systems could “talk” to each other, specifically if theory, proofs and source code could be exchanged in a relatively seamless manner. This would reduce the considerable duplication of effort otherwise required for one system to benefit from the major projects and advanced capabilities developed on another.
  • Type Checking Physical Frames of Reference for Robotic Systems
    PhysFrame: Type Checking Physical Frames of Reference for Robotic Systems. Sayali Kate (Purdue University, USA), Michael Chinn (University of Virginia, USA), Hongjun Choi (Purdue University, USA), Xiangyu Zhang (Purdue University, USA), Sebastian Elbaum (University of Virginia, USA). […] Engineering (ESEC/FSE ’21), August 23–27, 2021, Athens, Greece. ACM, New York, NY, USA, 16 pages. https://doi.org/10.1145/3468264.3468608 ABSTRACT: A robotic system continuously measures its own motions and the external world during operation. Such measurements are with respect to some frame of reference, i.e., a coordinate system. A nontrivial robotic system has a large number of different frames and data have to be translated back-and-forth from a frame to another. The onus is on the developers to get such translation right. However, this is very challenging and error-prone, evidenced by the large number of questions and issues related to frame uses on developers' forum. Since any state variable can be associated with some frame, reference frames can be naturally modeled as variable types. We hence develop a novel type system that can automatically infer […] 1 INTRODUCTION: Robotic systems have rapidly growing applications in our daily life, enabled by the advances in many areas such as AI. Engineering such systems becomes increasingly important. Due to the unique characteristics of such systems, e.g., the need of modeling the physical world and satisfying the real time and resource constraints, robotic system engineering poses new challenges to developers. One of the prominent challenges is to properly use physical frames of reference.
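    PhysFrame itself targets C++/ROS code, but the core idea, treating the frame of reference that a value is expressed in as part of its type, can be sketched in any typed language. The Standard ML fragment below uses phantom type parameters and an opaque signature so that mixing frames is a compile-time error; the frame names and the transform are hypothetical and only illustrate the general idea, not the paper's type system.

        (* Phantom tags for two hypothetical frames of reference. *)
        datatype world  = World
        datatype sensor = Sensor

        signature FRAMED_VEC =
          sig
            type 'frame vec                          (* a 3-vector tagged with its frame *)
            val inWorld  : real * real * real -> world vec
            val inSensor : real * real * real -> sensor vec
            val add      : 'frame vec * 'frame vec -> 'frame vec
            val toWorld  : sensor vec -> world vec   (* an explicit frame transform *)
          end

        structure FramedVec :> FRAMED_VEC =
          struct
            type 'frame vec = real * real * real
            fun inWorld v  = v
            fun inSensor v = v
            fun add ((x1, y1, z1), (x2, y2, z2)) = (x1 + x2, y1 + y2, z1 + z2)
            (* A stand-in transform; a real system would apply a rotation/translation. *)
            fun toWorld v = v
          end

        val p  = FramedVec.inWorld (1.0, 0.0, 0.0)
        val q  = FramedVec.inSensor (0.0, 2.0, 0.0)
        val ok = FramedVec.add (p, FramedVec.toWorld q)   (* frames agree: type checks *)
        (* val bad = FramedVec.add (p, q)   rejected: world vec vs. sensor vec *)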
  • Dynamic Extension of Typed Functional Languages
    Dynamic Extension of Typed Functional Languages. Don Stewart. PhD Dissertation, School of Computer Science and Engineering, University of New South Wales, 2010. Supervisor: Assoc. Prof. Manuel M. T. Chakravarty; Co-supervisor: Dr. Gabriele Keller. Abstract: We present a solution to the problem of dynamic extension in statically typed functional languages with type erasure. The presented solution retains the benefits of static checking, including type safety, aggressive optimizations, and native code compilation of components, while allowing extensibility of programs at runtime. Our approach is based on a framework for dynamic extension in a statically typed setting, combining dynamic linking, runtime type checking, first class modules and code hot swapping. We show that this framework is sufficient to allow a broad class of dynamic extension capabilities in any statically typed functional language with type erasure semantics. Uniquely, we employ the full compile-time type system to perform runtime type checking of dynamic components, and emphasize the use of native code extension to ensure that the performance benefits of static typing are retained in a dynamic environment. We also develop the concept of fully dynamic software architectures, where the static core is minimal and all code is hot swappable. Benefits of the approach include hot swappable code and sophisticated application extension via embedded domain specific languages. We instantiate the concepts of the framework via a full implementation in the Haskell programming language: providing rich mechanisms for dynamic linking, loading, hot swapping, and runtime type checking in Haskell for the first time. We demonstrate the feasibility of this architecture through a number of novel applications: an extensible text editor; a plugin-based network chat bot; a simulator for polymer chemistry; and xmonad, an extensible window manager.
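    The dissertation's mechanisms (dynamic linking, hot swapping, runtime type checking of plugins) are realised in Haskell and have no direct counterpart in the SML Basis Library. Purely as a flavour of the runtime-type-checking ingredient, here is a hand-rolled universal type in Standard ML; it is a drastic simplification for illustration, not the dissertation's approach.

        (* A tiny "dynamic" type: values paired with a runtime tag. *)
        datatype dyn = DInt of int | DString of string | DFun of dyn -> dyn

        (* Runtime type checks: project a dyn back to a specific type, or fail. *)
        exception TypeError of string
        fun asInt (DInt n) = n
          | asInt _        = raise TypeError "expected an int"

        fun apply (DFun f) arg = f arg
          | apply _        _   = raise TypeError "expected a function"

        (* A "plugin" received at runtime as an untyped dyn value. *)
        val plugin : dyn = DFun (fn d => DInt (asInt d + 1))

        val result = asInt (apply plugin (DInt 41))   (* 42 *)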
  • Comparative Studies of Programming Languages; Course Lecture Notes
    Comparative Studies of Programming Languages, COMP6411 Lecture Notes, Revision 1.9. Joey Paquet, Serguei A. Mokhov (Eds.). August 5, 2010. arXiv:1007.2123v6 [cs.PL] 4 Aug 2010. Preface: Lecture notes for the Comparative Studies of Programming Languages course, COMP6411, taught at the Department of Computer Science and Software Engineering, Faculty of Engineering and Computer Science, Concordia University, Montreal, QC, Canada. These notes include a compiled book of primarily related articles from the Wikipedia, the Free Encyclopedia [24], as well as the Comparative Programming Languages book [7] and other resources, including our own. The original notes were compiled by Dr. Paquet [14]. Contents: 1 Brief History and Genealogy of Programming Languages (1.1 Introduction, 1.1.1 Subreferences, 1.2 History, 1.2.1 Pre-computer era, 1.2.2 Subreferences, 1.2.3 Early computer era, 1.2.4 Subreferences, 1.2.5 Modern/Structured programming languages, 1.3 References); 2 Programming Paradigms (2.1 Introduction, 2.2 History, 2.2.1 Low-level: binary, assembly, 2.2.2 Procedural programming, 2.2.3 Object-oriented programming, 2.2.4 Declarative programming); 3 Program Evaluation (3.1 Program analysis and translation phases, 3.1.1 Front end, 3.1.2 Back end, 3.2 Compilation vs. interpretation, 3.2.1 Compilation, 3.2.2 Interpretation, 3.2.3 Subreferences, 3.3 Type System, 3.3.1 Type checking, 3.4 Memory management).
  • Interactive Theorem Proving (ITP) Course Web Version
    Interactive Theorem Proving (ITP) Course, Web Version. Thomas Tuerk ([email protected]). Except where otherwise noted, this work is licensed under the Creative Commons Attribution-ShareAlike 4.0 International License. Version 3c35cc2 of Mon Nov 11 10:22:43 2019. Contents: Part I Preface; Part II Introduction; Part III HOL 4 History and Architecture; Part IV HOL's Logic; Part V Basic HOL Usage; Part VI Forward Proofs; Part VII Backward Proofs; Part VIII Basic Tactics; Part IX Induction Proofs; Part X Basic Definitions; Part XI Good Definitions; Part XII Deep and Shallow Embeddings; Part XIII Rewriting; Part XIV Advanced Definition Principles; Part XV Maintainable Proofs; Part XVI Overview of HOL 4; Part XVII Other Interactive Theorem Provers. Preface: these slides originate from a course for advanced master students; it was given by the PROSPER group at KTH in Stockholm in 2017 (see https://www.kth.se/social/group/interactive-theorem-); the course focused on how to use HOL 4; students taking the course were expected to know functional programming, esp. SML, understand predicate logic, and have some experience with pen and paper proofs; the course consisted of 9 lectures, which each took 90 minutes; there were 19 supervised practical sessions, which each took 2 h; usually there was 1 lecture and 2 practical sessions each week; students were expected to work about 10 h each
  • Four Lectures on Standard ML
    Mads Tofte, March. Laboratory for Foundations of Computer Science, Department of Computer Science, Edinburgh University. Four Lectures on Standard ML. The following notes give an overview of Standard ML with emphasis placed on the Modules part of the language. The notes are, to the best of my knowledge, faithful to The Definition of Standard ML, Version, as regards syntax, semantics and terminology. They have been written so as to be independent of any particular implementation. The exercises in the first lectures can be tackled without the use of a machine, although having access to an implementation will no doubt be beneficial. The project in Lecture presupposes access to an implementation of the full language including modules. At present the Edinburgh compiler does not fall into this category; the author used the New Jersey Standard ML compiler. Lecture gives an introduction to ML aimed at the reader who is familiar with some programming language but does not know ML. Both the Core Language and the Modules are covered by way of example. Lecture discusses the use of ML modules in the development of large programs. A useful methodology for programming with functors, signatures and structures is presented. Lecture gives a fairly detailed account of the static semantics of ML modules for those who really want to understand the crucial notions of sharing and signature matching. Lecture presents a one-day project intended to give the student an opportunity of modifying a nontrivial piece of software using functors, signatures and structures.
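    Tofte's lectures revolve around programming with signatures, structures, and functors, and around signature matching. As a reminder of what those constructs look like, here is a small illustrative Standard ML example (not taken from the lecture notes): a signature, a structure that matches it, and a functor parameterised over it.

        (* A signature: the static "interface" that structures are matched against. *)
        signature ORD =
          sig
            type t
            val compare : t * t -> order
          end

        (* A structure matching ORD; signature matching checks that IntOrd provides
           at least the components ORD specifies, at the specified types. *)
        structure IntOrd : ORD =
          struct
            type t = int
            val compare = Int.compare
          end

        (* A functor: a structure parameterised over any structure matching ORD. *)
        functor SortedSet (Ord : ORD) =
          struct
            type elem = Ord.t
            type set  = elem list                 (* kept sorted, without duplicates *)
            val empty : set = []
            fun insert (x, [])      = [x]
              | insert (x, y :: ys) =
                  case Ord.compare (x, y) of
                      LESS    => x :: y :: ys
                    | EQUAL   => y :: ys
                    | GREATER => y :: insert (x, ys)
          end

        structure IntSet = SortedSet (IntOrd)
        val s = IntSet.insert (3, IntSet.insert (1, IntSet.insert (3, IntSet.empty)))
        (* s = [1, 3] : IntSet.set *)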
  • HOL-Testgen Version 1.8 USER GUIDE
    HOL-TestGen Version 1.8 USER GUIDE. Achim Brucker, Lukas Brügger, Abderrahmane Feliachi, Chantal Keller, Matthias Krieger, Delphine Longuet, Yakoub Nemouchi, Frédéric Tuong, Burkhart Wolff. To cite this version: Achim Brucker, Lukas Brügger, Abderrahmane Feliachi, Chantal Keller, Matthias Krieger, et al. HOL-TestGen Version 1.8 USER GUIDE. [Technical Report] Université Paris-Saclay; LRI - CNRS, University Paris-Sud. 2016. hal-01765526. HAL Id: hal-01765526, https://hal.inria.fr/hal-01765526, submitted on 12 Apr 2018. HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers. LRI Research Report (Rapport de Recherche) N° 1586, 04/2016: HOL-TestGen Version 1.8 USER GUIDE. BRUCKER A. D. / BRUGGER L. / FELIACHI A. / KELLER C. / KRIEGER M. P. / LONGUET D. / NEMOUCHI Y. / TUONG F. / WOLFF B. Unité Mixte de Recherche 8623, CNRS-Université Paris Sud-LRI. CNRS – Université de Paris Sud, Centre d'Orsay, Laboratoire de Recherche en Informatique, Bâtiment 650, 91405 ORSAY Cedex (France). HOL-TestGen 1.8.0 User Guide, http://www.brucker.ch/projects/hol-testgen/. Achim D. Brucker, Lukas Brügger, Abderrahmane Feliachi, Chantal Keller, Matthias P.
  • Functions-As-Constructors Higher-Order Unification
    Functions-as-constructors Higher-order Unification. Tomer Libal and Dale Miller. Inria Saclay & LIX/École Polytechnique, Palaiseau, France. Abstract: Unification is a central operation in the construction of a range of computational logic systems based on first-order and higher-order logics. First-order unification has a number of properties that dominates the way it is incorporated within such systems. In particular, first-order unification is decidable, unary, and can be performed on untyped term structures. None of these three properties hold for full higher-order unification: unification is undecidable, unifiers can be incomparable, and term-level typing can dominate the search for unifiers. The so-called pattern subset of higher-order unification was designed to be a small extension to first-order unification that respected the basic laws governing λ-binding (the equalities of α, β, and η-conversion) but which also satisfied those three properties. While the pattern fragment of higher-order unification has been popular in various implemented systems and in various theoretical considerations, it is too weak for a number of applications. In this paper, we define an extension of pattern unification that is motivated by some existing applications and which satisfies these three properties. The main idea behind this extension is that the arguments to a higher-order, free variable can be more than just distinct bound variables: they can also be terms constructed from (sufficient numbers of) such variables using term constructors and where no argument is a subterm of any other argument. We show that this extension to pattern unification satisfies the three properties mentioned above.
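    In the classic pattern fragment that this paper relaxes, a free (unification) variable may be applied only to distinct bound variables. The Standard ML sketch below checks that restriction for a single flexible application F a1 ... an over a toy term representation; the datatype and the checker are hypothetical illustrations of the restriction being extended, not the authors' algorithm.

        (* A toy term representation: free (unification) variables, bound variables
           named by strings, constants, and application. *)
        datatype term =
            Free  of string          (* a unification variable *)
          | Bound of string          (* a λ-bound variable *)
          | Const of string
          | App   of term * term

        (* Collect the arguments of a spine F a1 ... an headed by a free variable. *)
        fun spine (App (f, a)) acc = spine f (a :: acc)
          | spine (Free x)     acc = SOME (x, acc)
          | spine _            _   = NONE

        (* The classic pattern restriction: every argument is a bound variable
           and all of them are distinct. *)
        fun isPattern t =
              case spine t [] of
                  NONE => false
                | SOME (_, args) =>
                    let
                      fun distinctBound ([], _) = true
                        | distinctBound (Bound v :: rest, seen) =
                            not (List.exists (fn w => w = v) seen)
                            andalso distinctBound (rest, v :: seen)
                        | distinctBound (_, _) = false
                    in
                      distinctBound (args, [])
                    end

        val yes = isPattern (App (App (Free "F", Bound "x"), Bound "y"))   (* true *)
        val no  = isPattern (App (Free "F", Const "c"))                    (* false *)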