Henk: a Typed Intermediate Language


Henk: a typed intermediate language

Simon Peyton Jones, University of Glasgow and Oregon Graduate Institute
Erik Meijer, University of Utrecht and Oregon Graduate Institute

May 1997

Abstract

There is growing interest in the use of richly-typed intermediate languages in sophisticated compilers for higher-order, typed source languages. These intermediate languages are typically stratified, involving terms, types, and kinds. As the sophistication of the type system increases, these three levels begin to look more and more similar, so an attractive approach is to use a single syntax, and a single data type in the compiler, to represent all three.

The theory of so-called pure type systems makes precisely such an identification. This paper describes Henk, a new typed intermediate language based closely on a particular pure type system, the lambda cube. On the way we give a tutorial introduction to the lambda cube.

(This paper will be presented at the June 1997 Workshop on Types in Compilation.)

Overview

Many compilers can be divided into three main stages. The front end translates the source language into an intermediate language; the middle end transforms the intermediate language into a more efficient form; and the back end translates the intermediate language into the target language.

In the past, intermediate languages for source languages with rich type systems have usually been untyped. The compiler first typechecks the source program, and then translates the program to the intermediate language, discarding all the type information. After all, the type checking simply ensured that the program would not go wrong at runtime, and once that is checked there is no further use for types.

Recently, however, there has been increasing interest in typed intermediate languages [Peyton Jones; Peyton Jones et al.; Shao & Appel; Tarditi et al.]. There are several motivations for such an approach: the compiler may be able to take advantage of type information to generate better code; it may be desirable to treat types as values at runtime, in which case it is necessary to maintain accurate compile-time types; and the compiler can check its own activity, if desired, by checking the type-correctness of the intermediate program. We elaborate each of these points in the section on background and motivation below.

This paper describes the design of a new typed intermediate language, Henk, designed for compilers for purely-functional languages. It has the following distinctive features:

- Henk is based directly on the lambda cube, an expressive family of typed lambda calculi. We have found a shortage of introductory material about the lambda cube, so we present a tutorial later in the paper.
- Henk is a small language: there are only seven constructors in the data type of expressions. Even so, it is a real language, in the sense that it is rich enough to use as a compiler intermediate language.
- Because of its lambda-cube heritage, Henk uses a single syntax for terms, types, and kinds. Better still, compilers for Henk can use a single data type to represent all three levels. This leads to considerable economy in both the syntax of the language and the utility functions of the compiler itself.
- Henk has an explicit concrete syntax. Intermediate languages are typically expressed only as a data type in a particular compiler. Giving a concrete syntax is an apparently trivial step, but we believe it is an important one, and it is one to which compiler-writers often pay little attention. A compiler front end can produce Henk to be consumed by the back end of a different compiler, or to be transformed by some external program before being fed back into the original compiler.
- We are the first to suggest using the lambda cube as the basis for a typed intermediate language. Why bother? We see two persuasive reasons:
  - It dramatically reduces the number of data types, and the volume of code, required in the compiler. For example, the Glasgow Haskell Compiler (GHC) has separate data types for terms, types, and kinds, and separate functions for parsing, printing, typing, and transforming them. Collapsing the three levels gets rid of all this duplication.
  - It is easier to accommodate new developments in the type system of the source language, because the lambda cube's type language is already as expressive as its term language.
- We give a tutorial on the lambda cube, emphasising aspects relevant to compiler builders. Most of the literature is relatively recent and written from the point of view of theorists.

This paper introduces no new technical results. Rather, our main contribution is to build a bridge between recently-developed type theory and the compiler research community.

Our initial focus is on non-strict languages, but we hope to make Henk neutral with respect to the strict/non-strict question. That is, rather than having two variants of Henk, one strict and one non-strict, we hope to have a single language that treats both styles as first-class citizens and allows free mixing of the two in a single program.

We assume familiarity with the lambda calculus in general, and with the second-order lambda calculus (also called F) in particular.

Background and motivation

Type-directed compilation

It has been recognized for some time that maintaining type information right through to code generation, and beyond, can be beneficial. Specifically:

- Accurate type information can guide compiler analyses and transformations. For example:
  - Strictness analysis over non-flat domains has to be guided by type information [Peyton Jones & Partain].
  - A compiler may be able to use more efficient representations of data values if it knows their types. For example, an integer can be represented by the integer itself, rather than a pointer to a box containing the integer, either in a strict language, or in a lazy one in contexts where the integer is certainly evaluated. The price to be paid is that such unboxed integers cannot be passed to polymorphic functions, since their representation differs from the simple pointer that polymorphic code typically expects. Guided by type information, one can specialise the polymorphic function to the unboxed types at which it is used [Peyton Jones & Launchbury; Tarditi et al.].
  - Transformations that remove intermediate data structures, often called deforestation or fusion transformations, rely heavily on types to guide the transformation [Gill, Launchbury & Peyton Jones; Hu, Iwasaki & Takeichi; Launchbury & Sheard]. In some cases the very correctness of the transformation relies on parametricity, a property of well-typed polymorphic functions.

- Traditional static type checking is performed completely at compile time. In more sophisticated settings, however, it may be useful to postpone some type checking until run time. For example:
  - Some reasonable programs cannot readily be expressed in a static type system, notably ones involving meta-programming. There are many proposals for incorporating a type Dynamic in an otherwise statically-typed language, but all involve some runtime check that the type of a value matches an expected type.
  - Rather than statically specialise a polymorphic function for the types at which it is called, one can pass the type as an explicit argument, so that the function can behave appropriately [Harper & Morrisett].
  - Tag-free garbage collectors need some runtime type information to guide them [Tarditi; Tolmach].
  All these applications require accurate type information to be available at runtime, and hence at compile time too.

- Debugging a compiler can be a nightmare. The only evidence of an incorrect transformation can be a segmentation fault in a large program compiled by the compiler. Identifying the cause of the crash, and tracing it back to the faulty transformation, is a long, slow business. If instead the compiler typechecks the intermediate-language program after every transformation, the huge majority of transformation bugs can be nailed in short order. It is quite difficult to write an incorrect transformation that is type correct. Furthermore, the program cannot crash if it is type correct, so even incorrect transformations can only give an unexpected result, not a crash, which is usually much easier to trace. We call the Henk typechecker Core Lint. If the compiler is correct, Core Lint does nothing useful, and is switched off by default. In our experience, its ability to localise compiler bugs would by itself justify the use of a typed intermediate language.

Compilers that maintain and use type information throughout the compiler have come to be called type-directed. Standard ML of New Jersey has been type-directed for some time [Shao & Appel], but its internal type system is monomorphic, so it cannot track types through polymorphic functions. The Glasgow Haskell Compiler (GHC) has for some years used a language based on the second-order lambda calculus as its intermediate language [Peyton Jones; Peyton Jones et al.]. More recently, the TIL compiler uses a considerably more sophisticated (and complicated) intermediate language, capable of expressing intensional polymorphism, including recursive functions at the level of types [Morrisett; Tarditi et al.]. TIL uses a stratified type system, which simplifies the proof theory, but it does lead to considerable duplication in compiler transformations [Tarditi]. We speculate that the lambda cube might provide a theoretically sound way to get the best of both worlds. Shao is also developing a typed intermediate language, FLINT, with similar goals to TIL's [Shao].

Specifically, Henk goes beyond the second-order lambda calculus in the following ways. It is elegantly parameterised: simply by selecting or ...
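To make the "single data type for terms, types, and kinds" idea concrete, here is a minimal Haskell sketch of what a pure-type-system expression type might look like. The constructor names and exact set of constructors are our own illustration, not Henk's actual definition:

```haskell
-- A sketch of a PTS-style expression data type in which terms, types,
-- and kinds all share one representation.  Constructors are illustrative.
data Expr
  = Var  String            -- variables, at any level
  | Lit  Literal           -- literals
  | App  Expr Expr         -- application, at any level
  | Lam  String Expr Expr  -- \x : t . e   (lambda abstraction)
  | Pi   String Expr Expr  -- dependent product; t -> e when x is unused
  | Star                   -- the kind of types, *
  | Box                    -- the sort of kinds
  deriving Show

newtype Literal = IntLit Integer deriving Show

-- The polymorphic identity, built from the same constructors as its type:
--   /\a:*. \x:a. x   :   forall a:*. a -> a
identity :: Expr
identity = Lam "a" Star (Lam "x" (Var "a") (Var "x"))

identityTy :: Expr
identityTy = Pi "a" Star (Pi "_" (Var "a") (Var "a"))
```

A single traversal, printer, or typechecker over this one type then covers all three levels at once, which is the economy the paper argues for.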
Recommended publications
  • Formalizing the Curry-Howard Correspondence
Formalizing the Curry-Howard Correspondence. Juan F. Meleiro, Hugo L. Mariano. [email protected], [email protected]. 2019. (arXiv:1912.10961v1 [math.LO], 23 Dec 2019)

Abstract: The Curry-Howard correspondence has a long history, and is still a topic of active research. Though there are extensive investigations into the subject, there doesn't seem to be a definitive formulation of this result at the level of generality that it deserves. In the current work, we introduce the formalism of p-institutions, which could unify previous approaches. We restate the traditional correspondence between typed λ-calculi and propositional logics inside this formalism, and indicate possible directions in which it could foster new and more structured generalizations. Furthermore, we indicate part of a formalization of the subject in the programming language Idris, as a demonstration of how such theorem-proving environments could serve mathematical research.

Keywords: Curry-Howard correspondence, p-institutions, proof theory.

Contents: 1 Some things to note (1.1 What we are talking about; 1.2 Notes on methodology; 1.3 Take a map); 2 What is a logic? (2.1 As for the literature; 2.1.1 Deductive systems; 2.2 Relations; 2.2.1 p-institutions; 2.2.2 Deductive systems as p-institutions); 3 What is the Curry-Howard correspondence? (3.1 Propositional logic; 3.2 λ-calculus; 3.3 The traditional correspondence, revisited); 4 Future developments (4.1 Polarity; 4.2 Universal formulation; 4.3 Other proof systems; 4.4 Other constructions).

1 Some things to note. This work is the conclusion of two years of research, the first of them informal, and the second regularly enrolled in the course MAT0148 Introdução ao Trabalho Científico.
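As a tiny concrete instance of the correspondence discussed in the abstract above, here is our own illustrative Haskell fragment (not from the paper): each definition is a proof of the proposition its type expresses, reading `->` as implication.

```haskell
-- Propositions-as-types in miniature.

-- A -> (B -> A): weakening / the K combinator
proofK :: a -> b -> a
proofK x _ = x

-- (A -> (B -> C)) -> ((A -> B) -> (A -> C)): the S combinator
proofS :: (a -> b -> c) -> (a -> b) -> a -> c
proofS f g x = f x (g x)

-- Modus ponens is just function application
modusPonens :: (a -> b) -> a -> b
modusPonens f x = f x
```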
  • Proofs Are Programs: 19th Century Logic and 21st Century Computing
    Proofs are Programs: 19th Century Logic and 21st Century Computing Philip Wadler Avaya Labs June 2000, updated November 2000 As the 19th century drew to a close, logicians formalized an ideal notion of proof. They were driven by nothing other than an abiding interest in truth, and their proofs were as ethereal as the mind of God. Yet within decades these mathematical abstractions were realized by the hand of man, in the digital stored-program computer. How it came to be recognized that proofs and programs are the same thing is a story that spans a century, a chase with as many twists and turns as a thriller. At the end of the story is a new principle for designing programming languages that will guide computers into the 21st century. For my money, Gentzen's natural deduction and Church's lambda calculus are on a par with Einstein's relativity and Dirac's quantum physics for elegance and insight. And the maths are a lot simpler. I want to show you the essence of these ideas. I'll need a few symbols, but not too many, and I'll explain as I go along. To simplify, I'll present the story as we understand it now, with some asides to fill in the history. First, I'll introduce Gentzen's natural deduction, a formalism for proofs. Next, I'll introduce Church's lambda calculus, a formalism for programs. Then I'll explain why proofs and programs are really the same thing, and how simplifying a proof corresponds to executing a program.
  • The Theory of Parametricity in Lambda Cube
RIMS Kôkyûroku 1217 (2001), pp. 143-157. The Theory of Parametricity in Lambda Cube. Takeuti Izumi (竹内泉), Graduate School of Informatics, Kyoto University, 606-8501, Japan. [email protected]

Abstract. This paper defines the theories of parametricity for all systems of the lambda cube, and shows their consistency. These theories are defined by axiom sets in the formal theories. These theories prove various important semantical properties in the formal systems. Especially, the theory for a system of the lambda cube proves some kind of adjoint functor theorem internally.

1 Introduction
1.1 Basic Motivation
In the studies of informatics, it is important to construct new data types and find out the properties of such data types. For example, let T and T′ be data types which are already known. Then we can construct a new data type T × T′, which is the direct product of T and T′. We have functions left, right and pair which satisfy the following equations: left(pair x y) = x and right(pair x y) = y for any x : T and y : T′, and pair (left z) (right z) = z for any z : T × T′. As another example, we can construct List(T), which is a type of lists whose components are elements of T. We have the empty list () as an element of List(T), the function (-,-) (the so-called cons pair), and listrec. These elements and functions satisfy the following equations: listrec () f e = e and listrec (x, l) f e = f x (listrec l f e) for any x : T, l : List(T), e : D and f : T → D → D, where D is another type.
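The equations quoted in the abstract above are exactly the laws of ordinary pairs and of a right fold over lists; here is a small Haskell rendering (our sketch, with names mirroring the abstract rather than the paper's formal systems):

```haskell
-- Products: pair, left, right and their equations.
pair :: a -> b -> (a, b)
pair x y = (x, y)

left :: (a, b) -> a
left (x, _) = x

right :: (a, b) -> b
right (_, y) = y
-- left (pair x y) == x,  right (pair x y) == y,  pair (left z) (right z) == z

-- Lists: listrec is the usual right fold.
listrec :: [a] -> (a -> d -> d) -> d -> d
listrec []      _ e = e
listrec (x : l) f e = f x (listrec l f e)
-- listrec [] f e == e,  listrec (x:l) f e == f x (listrec l f e)
```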
  • Polymorphism All the Way Up! from System F to the Calculus of Constructions
Programming = proving? The Curry-Howard correspondence today. Second lecture: Polymorphism all the way up! From System F to the Calculus of Constructions. Xavier Leroy, Collège de France, 2018-11-21.

Curry-Howard in 1970: an isomorphism between simply-typed λ-calculus and intuitionistic logic that connects types and propositions; terms and proofs; reductions and cut elimination. This second lecture shows how this correspondence extends to more expressive type systems and to more powerful logics, and how it inspired formal systems that are simultaneously a logic and a programming language (Martin-Löf type theory, Calculus of Constructions, Pure Type Systems).

Polymorphism and second-order logic

Static typing vs. genericity. Static typing with simple types (as in simply-typed λ-calculus, but also as in Algol, Pascal, etc.) sometimes forces us to duplicate code.

Example: a sorting algorithm applies to any list list(t) of elements of type t, provided it also receives the function t → t → bool that compares two elements of type t. With simple types, to sort lists of integers and lists of strings we need two functions with different types, even if they implement the same algorithm:

sort_list_int : (int → int → bool) → list(int) → list(int)
sort_list_string : (string → string → bool) → list(string) → list(string)

There is a tension between static typing on the one hand and reusable implementations of generic algorithms on the other hand. Some languages elect to weaken static typing, e.g. by introducing a universal type "any" or "?" with run-time type checking, or even by turning typing off:

void qsort(void *base, size_t nmemb, size_t size,
           int (*compar)(const void *, const void *));

Instead, polymorphic typing extends the algebra of types and the typing rules so as to give a precise type to a generic function.
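The duplication described in the lecture excerpt above disappears once the element type becomes a parameter. In Haskell, for instance, one polymorphic signature covers both cases; the following uses the standard library's sortBy purely to illustrate the point:

```haskell
import Data.List (sortBy)
import Data.Ord (comparing)

-- One generic signature replaces sort_list_int and sort_list_string:
--   sortBy :: (a -> a -> Ordering) -> [a] -> [a]

sortInts :: [Int] -> [Int]
sortInts = sortBy compare

sortStrings :: [String] -> [String]
sortStrings = sortBy (comparing length)   -- e.g. order strings by length
```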
  • The Girard–Reynolds Isomorphism
Information and Computation 186 (2003) 260-284. The Girard–Reynolds isomorphism. Philip Wadler, Avaya Laboratories, 233 Mount Airy Road, Room 2C05, Basking Ridge, NJ 07920-2311, USA. Received 8 February 2002; revised 30 August 2002.

Abstract: The second-order polymorphic lambda calculus, F2, was independently discovered by Girard and Reynolds. Girard additionally proved a Representation Theorem: every function on natural numbers that can be proved total in second-order intuitionistic predicate logic, P2, can be represented in F2. Reynolds additionally proved an Abstraction Theorem: for a suitable notion of logical relation, every term in F2 takes related arguments into related results. We observe that the essence of Girard's result is a projection from P2 into F2, that the essence of Reynolds's result is an embedding of F2 into P2, and that the Reynolds embedding followed by the Girard projection is the identity. The Girard projection discards all first-order quantifiers, so it seems unreasonable to expect that the Girard projection followed by the Reynolds embedding should also be the identity. However, we show that in the presence of Reynolds's parametricity property this is indeed the case, for propositions corresponding to inductive definitions of naturals or other algebraic types. © 2003 Elsevier Science (USA). All rights reserved.

1. Introduction. Double-barrelled names in science may be special for two reasons: some belong to ideas so subtle that they required two collaborators to develop; and some belong to ideas so sublime that they possess two independent discoverers.
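To give a feel for "representation in F2", here is the standard Church encoding of the natural numbers in System F, written with Haskell's RankNTypes. This is our illustration of the general idea, not of the paper's constructions:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church naturals: a natural number is what it can iterate.
type Nat = forall a. (a -> a) -> a -> a

zero :: Nat
zero _ z = z

succN :: Nat -> Nat
succN n s z = s (n s z)

plus :: Nat -> Nat -> Nat
plus m n s z = m s (n s z)

toInt :: Nat -> Int
toInt n = n (+ 1) 0   -- toInt (plus (succN zero) (succN zero)) == 2
```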
  • TinkerType: a Language for Playing with Formal Systems
University of Pennsylvania ScholarlyCommons, Departmental Papers (CIS), Department of Computer & Information Science, March 2003. TinkerType: a language for playing with formal systems. Michael Y. Levin, University of Pennsylvania; Benjamin C. Pierce, University of Pennsylvania, [email protected].

Recommended citation: Michael Y. Levin and Benjamin C. Pierce, "TinkerType: a language for playing with formal systems", March 2003. Copyright Cambridge University Press. Reprinted from Journal of Functional Programming, Volume 13, Issue 2, March 2003, pages 295-316. Posted at ScholarlyCommons: https://repository.upenn.edu/cis_papers/152

Abstract: TinkerType is a pragmatic framework for compact and modular description of formal systems (type systems, operational semantics, logics, etc.). A family of related systems is broken down into a set of clauses (individual inference rules) and a set of features controlling the inclusion of clauses in particular systems. Simple static checks are used to help maintain consistency of the generated systems. We present TinkerType and its implementation and describe its application to two substantial repositories of typed λ-calculi. The first repository covers a broad range of typing features, including subtyping, polymorphism, type operators and kinding, computational effects, and dependent types. It describes both declarative and algorithmic aspects of the systems, and can be used with our tool, the TinkerType Assembler, to generate calculi either in the form of typeset collections of inference rules or as executable ML typecheckers. The second repository addresses a smaller collection of systems, and provides modularized proofs of basic safety properties.
  • Notes: Barendregt's Cube and Programming with Dependent Types
    Notes: Barendregt's cube and programming with dependent types Eric Lu I. ANNOTATED BIBLIOGRAPHY A. Lambda Calculi with Types @incollection{barendregt:1992 author = {Barendregt, Henk}, title = {Lambda Calculi with Types}, booktitle = {Handbook of Logic in Computer Science}, publisher = {Oxford University Press}, year = {1992}, editor = {Abramsky, S. and Gabbay, D.M. and Maibaum, T.S.E.}, volume = {2}, pages = {117--309}, address = {New York, New York} } Summary: As setup for introducing the lambda cube, Barendregt introduces the type-free lambda calculus and reviews some of its properties. Subsequently he presents accounts of typed lambda calculi from Curry and Church, including some Curry-style systems not belonging to the lambda cube (λU and λµ). He then describes the lambda cube construction that was first noted by Barendregt in 1991. The lambda cube describes an inclusion relation amongst eight typed lambda calculi. Moreover it explains a fine structure for the calculus of constructions arising from the presence or absence of three axiomatic additions to the simply-typed lambda calculus. The axes of the lambda cube are intuitively justified by the notion of dependency. After introducing the lambda cube, Barendregt gives properties of the languages belonging to the lambda cube, as well as some properties of pure type systems, which generalize the means of description of the axioms of the systems in the lambda cube. Evaluation: This work originates as a reference from the logical/mathematical community, but its content was important in providing a firm foundation for types in programming languages. The lambda cube explains prior work on a number of type systems by observing a taxonomy and relating them to logical systems via the Curry-Howard isomorphism.
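The "three axiomatic additions" mentioned in the notes above generate the cube's eight systems. The following small Haskell enumeration is ours, purely to spell out that combinatorial structure:

```haskell
import Data.List (subsequences)

-- The three features that can be added to the simply typed lambda calculus.
data Feature
  = TermsOnTypes   -- polymorphism      (the lambda-2 axis)
  | TypesOnTypes   -- type operators    (the lambda-omega axis)
  | TypesOnTerms   -- dependent types   (the lambda-P axis)
  deriving (Eq, Show)

-- Each corner of the cube is a subset of the three features: the empty
-- set is the simply typed lambda calculus, the full set is the calculus
-- of constructions.
cube :: [[Feature]]
cube = subsequences [TermsOnTypes, TypesOnTypes, TypesOnTerms]

-- length cube == 8
```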
  • Basics of Type Theory and Coq
Basics of type theory and Coq. Michael Shulman, January 31, 2012.

Type-theoretic foundations. In set theory the layers are Logic (∧, ∨, ⇒, ¬, ∀, ∃) and then Sets (×, +, →, ∏, ∑); in type theory they are Types (×, +, →, ∏, ∑) and then Logic (∧, ∨, ⇒, ¬, ∀, ∃). In set theory, "x ∈ A" is a proposition; in type theory, "x : A" is a typing judgment.

Type theory is programming. For now, think of type theory as a programming language. It is closely related to functional programming languages like ML, Haskell, Lisp, Scheme; it is more expressive and powerful; and it can manipulate "mathematical objects".

Typing judgments. Type theory consists of rules for manipulating judgments. The most important judgment is a typing judgment:

x1 : A1, x2 : A2, ..., xn : An ⊢ b : B

The turnstile ⊢ binds most loosely, followed by commas. This should be read as: in the context of variables x1 of type A1, x2 of type A2, ..., and xn of type An, the expression b has type B. Examples:

⊢ 0 : ℕ
x : ℕ, y : ℕ ⊢ x + y : ℕ
f : ℝ → ℝ, x : ℝ ⊢ f(x) : ℝ
f : C^∞(ℝ, ℝ), n : ℕ ⊢ f^(n) : C^∞(ℝ, ℝ)

Type constructors. The basic rules tell us how to construct valid typing judgments, i.e. how to write programs with given input and output types. This includes: (1) how to construct new types (judgments Γ ⊢ A : Type); (2) how to construct terms of these types; (3) how to use such terms to construct terms of other types.

Example (function types): (1) if A : Type and B : Type, then A → B : Type; (2) if x : A ⊢ b : B, then λx:A. b : A → B.
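A couple of the judgments above have direct Haskell counterparts. This is a loose analogy of ours (Haskell's Int and Double are not ℕ and ℝ), meant only to show the shape of "in this context, this expression has this type":

```haskell
-- x1:A1, ..., xn:An |- b : B  corresponds to a definition whose
-- right-hand side uses the variables bound on the left.

addNat :: Int -> Int -> Int               -- x : N, y : N |- x + y : N
addNat x y = x + y

applyReal :: (Double -> Double) -> Double -> Double
applyReal f x = f x                       -- f : R -> R, x : R |- f x : R

-- Function types: if  x : A |- b : B  then  \x -> b : A -> B
constFn :: a -> b -> a
constFn x = \_ -> x
```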
  • Lambda Calculus: A → B, λx:A.M, M N
Last time: simply typed lambda calculus — A → B, λx:A.M, M N; with products — A × B, ⟨M, N⟩, fst M, snd M; and sums — A + B, inl [B] M, inr [A] M, case L of x.M | y.N.

Polymorphic lambda calculus — ∀α::K.A, Λα::K.M, M[A]; with existentials — ∃α::K.A, pack B,M as ∃α::K.A, open L as α,x in M.

Typing rules for existentials:

∃-intro: if Γ ⊢ M : A[α := B] and Γ ⊢ ∃α::K.A :: ∗, then Γ ⊢ pack B, M as ∃α::K.A : ∃α::K.A
∃-elim: if Γ ⊢ M : ∃α::K.A and Γ, α :: K, x : A ⊢ M′ : B, then Γ ⊢ open M as α, x in M′ : B

Unit in OCaml:

type u = Unit

Encoding data types in System F: unit. The unit type has one inhabitant. We can represent it as the type of the identity function: Unit = ∀α::∗. α → α. The unit value is the single inhabitant: Λα::∗. λa:α. a. We can package the type and value as an existential: pack (∀α::∗. α → α, Λα::∗. λa:α. a) as ∃U::∗. u. We'll write 1 for the unit type and ⟨⟩ for its inhabitant.

Booleans in OCaml. A boolean data type:

type bool = False | True

A destructor for bool:

val _if_ : bool -> 'a -> 'a -> 'a
let _if_ b _then_ _else_ = match b with False -> _else_ | True -> _then_

Encoding data types in System F: booleans. The boolean type has two inhabitants: false and true.
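The System F encodings on these slides transcribe directly into Haskell with RankNTypes. Here is our sketch of the unit and boolean encodings and the if_ destructor:

```haskell
{-# LANGUAGE RankNTypes #-}

-- Church encoding of unit: the type of the identity function.
type Unit = forall a. a -> a

unit :: Unit
unit x = x

-- Church booleans: a boolean selects one of two alternatives.
type CBool = forall a. a -> a -> a

true, false :: CBool
true  t _ = t
false _ e = e

-- The destructor corresponding to _if_ in the OCaml fragment above.
if_ :: CBool -> a -> a -> a
if_ b thenBranch elseBranch = b thenBranch elseBranch

-- if_ true  "yes" "no"  ==  "yes"
-- if_ false "yes" "no"  ==  "no"
```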
  • The Taming of the Rew: a Type Theory with Computational Assumptions Jesper Cockx, Nicolas Tabareau, Théo Winterhalter
The Taming of the Rew: A Type Theory with Computational Assumptions. Jesper Cockx (TU Delft, Netherlands), Nicolas Tabareau (Inria, France), Théo Winterhalter (Inria, France). Proceedings of the ACM on Programming Languages, ACM, 2020, POPL 2021, 10.1145/3434341. HAL Id: hal-02901011, https://hal.archives-ouvertes.fr/hal-02901011v2, submitted on 8 Oct 2020.

Abstract: Dependently typed programming languages and proof assistants such as Agda and Coq rely on computation to automatically simplify expressions during type checking. To overcome the lack of certain programming primitives or logical principles in those systems, it is common to appeal to axioms to postulate their existence. However, one can only postulate the bare existence of an axiom, not its computational behaviour. Instead, users are forced to postulate equality proofs and appeal to them explicitly to simplify expressions, making axioms dramatically more complicated to work with than built-in primitives.
  • Explicit Convertibility Proofs in Pure Type Systems
Explicit Convertibility Proofs in Pure Type Systems. Floris van Doorn (Utrecht University), Herman Geuvers (Radboud University Nijmegen), Freek Wiedijk (Radboud University Nijmegen). fl[email protected], [email protected], [email protected]

Abstract: We define type theory with explicit conversions. When typechecking a term in normal type theory, the system searches for convertibility paths between types. The results of these searches are not stored in the term, and need to be reconstructed every time again. In our system, this information is also represented in the term. The system we define has the property that the type derivation of a term has exactly the same structure as the term itself. This has the consequence that there exists a natural LF encoding of such a system in which the encoded type is a dependent parameter of the type of the encoded term.

In dependent type systems, terms occur in types, so one can do (computational) reduction steps inside types. This includes β-reduction steps but, in case one has inductive types, also ι-reduction and, in case one has definitions, also δ-reduction. A type B that is obtained from A by a (computational) reduction step is considered to be "equal" to A. A common approach to deal with this equality between types is to use an externally defined notion of conversion. In case one has only β-reduction, this is β-conversion, which is denoted by A ≃β B. This is the least congruence containing the β-reduction step. Then there is a conversion rule of the ...
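For reference, the implicit conversion rule alluded to at the end of the excerpt has the following standard PTS form (our rendering of the textbook rule; the paper's system additionally records a proof object for the A ≃β B premise inside the term):

```latex
\[
\frac{\Gamma \vdash M : A \qquad \Gamma \vdash B : s \qquad A \simeq_\beta B}
     {\Gamma \vdash M : B}
\;(\mathrm{conv})
\]
```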
  • Barendregt's Lambda Cube
Barendregt's Lambda Cube. Classification of type systems by dependency. Starting point: the simply typed lambda calculus. Extensions: 1. polymorphic types, 2. type operators, 3. dependent types — all variations over the same theme: functions. [Cube diagram: λ→ at the base, the three extensions λ2, λω, λP along the axes, and their combinations up to λPω at the top.]

Dependencies between types and terms:
- Terms depending on terms: normal functions
- Terms depending on types: polymorphism
- Types depending on types: type operators
- Types depending on terms: dependent types

Terms depending on terms. Syntax: M, N ::= x | (λx:τ.N) | (M N); τ, σ ::= α | τ → σ. Rules:
- Var: if Γ(x) = τ then Γ ⊢ x : τ
- Abs: if Γ, x:τ ⊢ M : σ then Γ ⊢ λx:τ.M : τ → σ
- App: if Γ ⊢ M : τ and Γ ⊢ N : τ → σ then Γ ⊢ N M : σ

About the simply typed lambda calculus: not very expressive; no general identity function; corresponds to propositional logic.

Terms depending on types. Syntax: M, N ::= x | (λx:τ.N) | (M N) | Λα.M; τ, σ ::= α | τ → σ | ∀α.τ. Two new rules:
- Inst: if Γ ⊢ M : ∀α.σ then Γ ⊢ M τ : σ[α := τ]
- Gen: if Γ ⊢ M : τ and α ∉ Γ then Γ ⊢ Λα.M : ∀α.τ

About the polymorphic lambda calculus: it can express the parametric identity function, ⊢ Λα.λx:α.x : ∀α. α → α; it corresponds to second-order logic; it has many names, among others System F; type inference is not computable; it is impredicative (ML/Haskell are predicative).

Types depending on types. Syntax: a, b, A, B, F, T ::= x | ∗ | □ | T T | λx:T.T | T → T. Rules (s ranges over {∗, □}):
- Axiom: ⊢ ∗ : □
- Intro →: if Γ ⊢ A : s and Γ ⊢ B : s then Γ ⊢ A → B : s
- Var: if Γ ⊢ A : s and x ∉ Γ then Γ, x:A ⊢ x : A
- App: if Γ ⊢ F : A → B and Γ ⊢ a : A then Γ ⊢ F a : B
- Weak: if Γ ⊢ a : A and Γ ⊢ B : s and x ∉ Γ then Γ, x:B ⊢ a : A
- Abs: if Γ, x:A ⊢ b : B and Γ ⊢ A → B : s then Γ ⊢ λx:A.b : A → B
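Three of the four dependencies listed on these slides are already visible in everyday Haskell; the following small illustration is ours (dependent types, the fourth axis, have no direct counterpart here):

```haskell
-- Terms depending on terms: ordinary functions.
double :: Int -> Int
double n = n + n

-- Terms depending on types: polymorphism (the parametric identity from
-- the slides; the "forall a" is implicit in the signature).
identity :: a -> a
identity x = x

-- Types depending on types: type operators.
type Pair a = (a, a)

swapPair :: Pair a -> Pair a
swapPair (x, y) = (y, x)
```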