The Formal Description of Programming Languages Using Predicate Logic


The Formal Description of Programming Languages using Predicate Logic

by Christopher D.S. Moss

Submitted for the Ph.D. Degree
Department of Computing
Imperial College, London
July 1981

ABSTRACT

Metamorphosis grammars and the Horn Clause subset of first-order predicate logic, as used in the Prolog language, provide a powerful formalism for describing all aspects of conventional programming languages. Colmerauer's M-grammars generalise the traditional grammar rewriting rules to apply to strings of function symbols with parameters. Expressed in first-order logic and using resolution, these provide the facilities of other formalisms such as W-grammars and Attribute Grammars with greatly improved ease of use and comprehension, and the added advantage that they may be run directly on a computer.

The thesis provides a methodology for expressing both the syntax and semantics of programming languages in a coherent framework. Unlike some formalisms, which attempt to give most of the definition in terms of either syntax or semantics, this one tries to preserve a natural balance between the two. The syntax separates lexical and grammar parts and generates an abstract syntax which includes the semantics of 'static' objects such as numbers. The semantics is expressed by means of relations which express state transformations using logic rather than the more traditional lambda calculus. Prolog has a well-defined fixpoint or denotational semantics as well as its proof-theoretic semantics, which gives the definitions an adequate mathematical basis. The traditional axiomatic method can also be used to express the semantics, using a metalevel proof system in which the proof rules become axioms of the system.

To demonstrate these principles, descriptions of three example languages are presented.
These are: ASPLE, a small language which has been used to compare other methods; the Prolog language itself (a non-deterministic applicative language); and a subset of Algol 68 including full jumps and procedures. The definition of the latter uses a method similar to the continuation method. An extensive survey is given of methods of syntax and semantic definition, and several applications of the method are suggested, including language prototyping systems, compilers and program-proving systems.

CONTENTS

1. Introduction
2. Grammars and Logic
   2.1. Metamorphosis Grammars
   2.2. The Development of Syntax Descriptions
3. Semantics
   3.1. Relational Semantics
   3.2. Axiomatic Semantics
   3.3. The Development of Semantics
4. Examples of Formal Definitions
   4.1. ASPLE
   4.2. Prolog
   4.3. Mini-Algol 68
5. Applications of Formal Definitions
   5.1. Prototyping of Languages
   5.2. Towards a Logic Compiler-Compiler
   5.3. Program Proving and Transformation
References
Appendices
   A. The Definition of a Subset of Algol 68
   B. A Compiler for ASPLE
   C. The Conversion of M-grammars to Prolog

   At the still point of the turning world. Neither flesh nor fleshless;
   Neither from nor towards; at the still point, there the dance is,
   But neither arrest nor movement. And do not call it fixity,
   Where past and future are gathered. Neither movement from nor towards,
   Neither ascent nor decline. Except for the point, the still point,
   There would be no dance, and there is only the dance.

   T. S. Eliot (1935), The Four Quartets: Burnt Norton

Thanks

I would like to express my appreciation to everyone who has helped me in so many ways: by providing inspiration and frustration; by encouragement in chatting over issues and criticizing inane notions; by making life worth living in the real world that exists outside the thesis factory; and by practical help in many ways.
In particular I must thank Bob Kowalski for his continual inspiration as my supervisor; Keith Clark and Maarten van Emden for discussions of tricky questions; Ian Moor and Moez Agha Hosseini for acting as sounding boards and being extremely hospitable room-mates; Sarah Bellows and Ellen Haigh, who managed to locate the most obscure reports in the library; Diane Reeve and Sandra Evans for typing large sections of the thesis; and Karen King for being patient with me when the whole exercise seemed futile.

I was supported during this time by a studentship from the Science Research Council. They have my deep gratitude.

Chris.

Chapter 1
Introduction and Summary

The aim of providing an entirely formal specification for a programming language is a quest which has attracted a great deal of attention over the past twenty years. Although the majority of the problems have now been solved using a variety of techniques, what is still lacking is a common formalism with which to draw these together and make them readily comprehensible to the average practitioner of computing.

The easiest part of a language to formalise is the context-free syntax. In this area BNF and its variants have gradually prevailed over alternatives such as those used to define COBOL. The context-sensitive parts were solved in principle by van Wijngaarden in the definition of Algol 68, but other related formalisms, such as attribute grammars, have been attracting more attention because of their increased readability and amenability to computer implementation compared with W-grammars.

The definition of semantics has taken much longer to establish, and there is still considerable variation in the style of presentation, although the main lines are more generally agreed. Early definitions were essentially "operational" in nature, based on simple automata which could "execute" programs.
These are unsatisfactory on several counts: they cannot easily be used for many of the basic tasks for which semantics are required, such as proving properties of programs, or input-output relationships and equivalence; they have no way of describing non-terminating programs; and they are too "low-level" to provide an easy conceptualisation of the "meaning" of program constructs.

Later methods have been much more abstract, with a mathematical or logical basis. Currently the most complete and widely used method is that of denotational semantics, introduced by Strachey and Scott, which describes a language in functional notation. The lambda calculus is used as a metalanguage, and various mappings, of identifiers to stores and stores to values, are described in terms of it.

There are two other popular methods which are more abstract than denotational semantics. One is the axiomatic or inductive assertion method of Floyd and Hoare, which is widely used in program proving but requires the user to supply the inductive assertions along with the program, and also has difficulty with such intrinsic programming constructs as jumps and functions with side effects. The other is the algebraic method, which characterises the semantics of programs by a set of properties which are required of programs. It is not clear at this point how well this deals with the more complicated parts of programming languages, or how easy it is to show that an algebraic definition is complete. The axiomatic method is probably best considered as a set of theorems or lemmas derived from the denotational definition, useful for specific purposes such as program proving.

In this thesis we demonstrate a logic programming approach to the definition of programming languages. The basis of this is the Prolog language, which uses the Horn-clause subset of predicate logic linked with the resolution method for matching clauses.
Each clause is composed of predicates in the form:

   A <- B1 & B2 & ... & Bn.

where n >= 0, A and the Bi are predicates, and '<-' stands for 'if'. This may be regarded as an assertion if n = 0, and either an implication or a procedure if n > 0. If A is absent, the clause may be regarded as either a denial or a goal. Any variables in the clause are regarded as universally quantified over the clause. An example of a complete Prolog program (including a goal statement) is:

   Human(Turing).
   Human(Socrates).
   Fallible(x) <- Human(x).
   Greek(Socrates).
   <- Fallible(y) & Greek(y).

for which the only valid solution is y = Socrates.

The procedural interpretation involves matching goals with the heads (left-hand sides) of procedures and replacing them by the bodies (right-hand sides) of the procedures, in a manner very similar to the productions of a grammar, with each branch terminating in an assertion. This process is non-deterministic, since more than one head may match a goal. It can also be interpreted or compiled on a computer with remarkable efficiency.

An integral feature of Prolog systems is the use of metamorphosis grammars, originally envisaged by Colmerauer. These may be regarded as a regularised form of W-grammars, and can be applied directly to the definition of both the context-free and context-sensitive portions of programming languages. They are much simpler to comprehend than W-grammars because of the type-free nature of the logic used, and compare favourably with the use of attribute grammars for this purpose. In addition, they can be run directly on a computer to parse or generate a program, and for one-pass languages this requires no modification to the normal definition.

Colmerauer's original definition followed the style of Chomsky's production rules in allowing several symbols on the left hand side, although only following a non-terminal.
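The procedural interpretation just described — matching a goal against clause heads, replacing it by the corresponding body, and backtracking over the non-deterministic choices — can be illustrated with a minimal backward-chaining interpreter. The following Python sketch is an illustration only, not the Prolog system discussed in the thesis; the representation (terms as tuples, variables as strings prefixed with '?') is an assumption made for brevity.

```python
# A minimal Horn-clause interpreter: terms are tuples (functor, args...),
# variables are strings beginning with '?'.  All names here are invented
# for illustration.

def is_var(t):
    return isinstance(t, str) and t.startswith('?')

def walk(t, subst):
    # Follow variable bindings to their current value.
    while is_var(t) and t in subst:
        t = subst[t]
    return t

def unify(a, b, subst):
    # Return an extended substitution unifying a and b, or None on failure.
    a, b = walk(a, subst), walk(b, subst)
    if a == b:
        return subst
    if is_var(a):
        return {**subst, a: b}
    if is_var(b):
        return {**subst, b: a}
    if isinstance(a, tuple) and isinstance(b, tuple) and len(a) == len(b):
        for x, y in zip(a, b):
            subst = unify(x, y, subst)
            if subst is None:
                return None
        return subst
    return None

def rename(clause, n):
    # Rename a clause's variables apart so each use gets fresh names.
    def r(t):
        if is_var(t):
            return t + '_' + str(n)
        if isinstance(t, tuple):
            return tuple(r(x) for x in t)
        return t
    head, body = clause
    return r(head), [r(g) for g in body]

def solve(goals, clauses, subst, depth=0):
    # Goal reduction: try every clause head against the first goal,
    # replacing the goal by the clause body.  Non-deterministic: every
    # successful substitution is yielded in turn.
    if not goals:
        yield subst
        return
    first, rest = goals[0], goals[1:]
    for i, clause in enumerate(clauses):
        head, body = rename(clause, depth * len(clauses) + i)
        s = unify(first, head, subst)
        if s is not None:
            yield from solve(body + rest, clauses, s, depth + 1)

# The example program above: three assertions and one implication.
program = [
    (('Human', 'Turing'), []),
    (('Human', 'Socrates'), []),
    (('Fallible', '?x'), [('Human', '?x')]),
    (('Greek', 'Socrates'), []),
]

# Goal statement:  <- Fallible(y) & Greek(y).
goal = [('Fallible', '?y'), ('Greek', '?y')]
solutions = [walk('?y', s) for s in solve(goal, program, {})]
print(solutions)  # ['Socrates']
```

Backtracking is visible in the trace: the first branch binds y to Turing via Human(Turing), fails on Greek(Turing), and the interpreter then tries Human(Socrates), which succeeds.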
However, these grammars can be systematically transformed into rules which are similar to context-free rules in that they have only a single non-terminal on the left-hand side. These are slightly more general than the "definite clause grammars" defined by Pereira and Warren, since they allow non-terminals as parameters, and they have the same power as W-grammars.
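The standard way such grammar rules run as clauses is by threading the token stream through the body as a pair of extra arguments (a difference list): a rule such as sentence --> noun_phrase, verb_phrase becomes the clause sentence(S0, S) <- noun_phrase(S0, S1) & verb_phrase(S1, S). The sketch below mimics that translation in Python, with each non-terminal as a generator mapping an input token tuple to every possible remainder; the tiny grammar is invented for illustration and is not from the thesis.

```python
# Sketch of the grammar-to-clause translation: each non-terminal takes
# the token stream and yields every remainder it can leave behind,
# mirroring the S0 -> S1 -> ... -> S threading of the Prolog clauses.
# Grammar and names are illustrative assumptions.

def terminal(word):
    def parse(tokens):
        if tokens and tokens[0] == word:
            yield tokens[1:]
    return parse

def seq(*parsers):
    # Clause body: thread the stream through each goal in turn.
    def parse(tokens):
        def go(ps, rest):
            if not ps:
                yield rest
                return
            for r1 in ps[0](rest):
                yield from go(ps[1:], r1)
        yield from go(parsers, tokens)
    return parse

def alt(*parsers):
    # Several clauses for one non-terminal: non-deterministic choice.
    def parse(tokens):
        for p in parsers:
            yield from p(tokens)
    return parse

# determiner --> [the].      noun --> [dance] ; [point].
determiner = terminal('the')
noun = alt(terminal('dance'), terminal('point'))
# noun_phrase --> determiner, noun.
noun_phrase = seq(determiner, noun)
# sentence --> noun_phrase, [is].
sentence = seq(noun_phrase, terminal('is'))

# A parse succeeds when some branch consumes the whole stream.
ok = any(rest == () for rest in sentence(('the', 'dance', 'is')))
print(ok)  # True
```

Because each non-terminal simply relates an input stream to an output stream, the same definitions can be extended with parameters (as M-grammars allow) by closing the parser functions over extra arguments.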