Refinement Calculus


Refinement Calculus: A Systematic Introduction
Ralph-Johan Back and Joakim von Wright
Draft, December 31, 1997

Preface

The refinement calculus is a framework for reasoning about correctness and refinement of programs. The primary application of the calculus is the derivation of traditional imperative programs, where programs are constructed by stepwise refinement. Each refinement step is required to preserve the correctness of the previous version of the program. We study the basic rules for this kind of program derivation in detail, looking at specification statements and their implementation, how to construct recursive and iterative program statements, and how to use procedures in program derivations.

The refinement calculus is an extension of Dijkstra's weakest precondition calculus, where program statements are modeled as predicate transformers. We extend the traditional interpretation of predicate transformers as executable program statements to that of contracts that regulate the behavior of competing agents and to games that are played between two players. The traditional interpretation of predicate transformers as programs falls out of this as a special case, where one of the two players has only a very rudimentary role to play. Our extension yields a number of new potential applications for the refinement calculus theory, like reasoning about correctness of interactive program statements and analyzing two-person games and their winning strategies.

Program refinement involves reasoning in many domains. We have to reason about properties of functions, predicates and sets, relations, and program statements described as predicate transformers. These different entities are in fact mathematically all very similar, and they form the refinement calculus hierarchy. In this book we examine the ways in which these different domains are related to each other and how one can move between them. Lattice theory forms a unifying conceptual basis that captures similarities between the domains, while higher-order logic serves as a common logic that we can use to reason about entities in these domains.

We present a new foundation for the refinement calculus based on lattice theory and higher-order logic, together with a simple theory of program variables. These topics are covered in the first part of the book. Higher-order logic is described in a new way that is strongly influenced by lattice theory. This part also introduces the notions needed for imperative programs: program variables, expressions, assignment statements, blocks with local variables, and procedures.

The second part of the book describes the predicate transformer approach to programming logic and program semantics, as well as the notion of program refinement. We also study the operational semantics of contracts, games, and program statements, and show how it is related to the predicate transformer interpretation of these.

The third part of the book shows how to handle recursion and iteration in the refinement calculus and also describes how to use the calculus to reason about games. It also presents case studies of program refinement. The last part of the book studies more specific issues related to program refinement, such as implementing specification statements, making refinements in context, and transforming iterative structures in a correctness-preserving way.
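As a minimal illustration of the predicate transformer view just described (standard definitions in the Dijkstra/refinement-calculus tradition, written here in our own notation rather than quoted from the book):

```latex
% Weakest precondition of an assignment (Dijkstra):
\[
  wp(x := e,\; q) \;=\; q[x := e]
\]
% Refinement ordering on statements S, S' viewed as predicate transformers
% (predicates ordered by inclusion/implication):
\[
  S \sqsubseteq S' \;\equiv\; \forall q.\; wp(S, q) \subseteq wp(S', q)
\]
% Read: S' refines S when S' establishes every postcondition that S does,
% from at least as many initial states.
```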
The refinement calculus has been a topic of investigation for nearly twenty years now, and a great number of papers have been published in this area. However, a systematic introduction to the mathematical and logical basis for program refinement has been missing. The information required to study this field is scattered through many journal and conference papers and has never been presented as a coherent whole. This book addresses this problem by giving a systematic introduction to the field. The book partly summarizes work that has been published earlier, by ourselves and by other authors, but most of the material is new and has not been published before.

The emphasis in this book is on the mathematical and logical basis of program refinement. Although we do give examples and case studies of program construction, this is not primarily a hands-on book on how to construct programs in practice. For reasons of space we have been forced to omit a great deal of material that we had wanted to include in the book. In particular, we decided not to include the important topics of data refinement and parallel and reactive system refinement here (though a simple approach to data refinement is described in Chapter 17). We plan to treat these topics in a comprehensive way in a separate book.

The book is intended for graduate students and for advanced undergraduate students interested in the mathematics and logic needed for program derivations, as well as for programmers and researchers interested in a deeper understanding of these issues. The book provides new insight into the methods for program derivations, as well as new understanding of program semantics and of the logic for reasoning about programs.

The book should be easy to use in teaching. We have tried to explain things quite carefully, and proofs are often given in reasonable detail. We have also included quite a number of examples and exercises. A previous understanding of logic and programming methodology is helpful but not necessary in order to understand the material in the book. The first, second, and third parts of the book can be given as successive courses, with suitable material added from the fourth part.

Acknowledgments

This book was inspired and influenced by a number of other researchers in the field. In particular, Edsger Dijkstra's work on predicate transformers and his systematic approach to constructing programs has been an important source of inspiration, as has the work of Tony Hoare on the logic of programming. Jaco de Bakker's work on programming language semantics has formed another strong influence, in particular the stringent way in which he has approached the topic. We very much appreciate Carroll Morgan's work on the refinement calculus, which has provided many ideas and a friendly competitive environment that has been much appreciated. Finally, Michael Gordon's work has shown the usefulness of higher-order logic as a framework for reasoning about both hardware and software and about formal theories in general.

Our close coworkers Reino Kurki-Suonio and Kaisa Sere deserve a special thanks for many years of fruitful and inspiring collaboration in the area of programming methodology and the logic for systematic program construction. We are indebted to many other colleagues for numerous enlightening discussions on the topics treated in this book.
Our thanks go to Sten Agerholm, Roland Backhouse, Eike Best, Michael Butler, Rutger Dijkstra, Wim Feijen, Nissim Francez, Marcel van de Goot, David Gries, Jim Grundy, John Harrison, Ian Hayes, Eric Hehner, Wim Hesselink, Peter Hofstee, Cliff Jones, Bengt Jonsson, He Jifeng, Joost Kok, Rustan Leino, Alain Martin, Tom Melham, David Nauman, Greg Nelson, John Tucker, Amir Pnueli, Xu Qiwen, Emil Sekerinski, Jan van de Snepscheut, Mark Staples, Reino Vainio, and Wolfgang Weck. We also want to thank our students, who have provided an inspiring environment for discussing these topics and have read and given detailed comments on many previous versions of the book: Thomas Beijar, Martin Büchi, Eric Hedman, Philipp Heuberger, Linas Laibinis, Thomas Långbacka, Leonid Mikhajlov, Anna Mikhajlova, Lasse Nielsen, Rimvydas Rukšėnas, Mauno Rönkkö, and Marina Waldén.

The Department of Computer Science at Åbo Akademi University and Turku Centre for Computer Science (TUCS) have provided a most stimulating working environment. The sabbaticals at the California Institute of Technology, Cambridge University, and Utrecht University have also been very fruitful and are hereby gratefully acknowledged. The generous support of the Academy of Finland and the Ministry of Education of Finland for the research reported in this book is also much appreciated.

Last but not least, we want to thank our (sometimes more, sometimes less) patient wives and children, who have been long waiting for this book to be finished: Barbro Back, Pia, Rasmus, and Ida Back, Birgitta Snickars von Wright, Heidi, Minea, Elin, and Julius von Wright.

Contents

1 Introduction
  1.1 Contracts
  1.2 Using Contracts
  1.3 Computers as Agents
  1.4 Algebra of Contracts
  1.5 Programming Constructs
  1.6 Specification Constructs
  1.7 Correctness
  1.8 Refinement of Programs
  1.9 Background
  1.10 Overview of the Book

I Foundations

2 Posets, Lattices, and Categories
  2.1 Partially Ordered Sets
  2.2 Lattices
  2.3 Product and Function Spaces
  2.4 Lattice Homomorphisms
  2.5 Categories
  2.6 Summary and Discussion
  2.7 Exercises

3 Higher-Order Logic
  3.1 Types and Terms
  3.2 Semantics
  3.3 Deductive Systems
  3.4 Theories
Recommended publications
  • Proofs for the Working Engineer
    Research Collection, Doctoral Thesis. Proofs for the Working Engineer. Author(s): Mehta, Farhad Dinshaw. Publication Date: 2008. Permanent Link: https://doi.org/10.3929/ethz-a-005635243. Rights / License: In Copyright - Non-Commercial Use Permitted. DISS. ETH NO. 17671. A dissertation submitted to ETH ZURICH for the degree of Doctor of Sciences, presented by Farhad Dinshaw Mehta, Master of Science, Technische Universität München; Bachelor of Technology, Indian Institute of Technology Delhi; born 11.01.1980, citizen of India; accepted on the recommendation of Prof. Jean-Raymond Abrial, Prof. Peter Müller, Prof. Cliff Jones, 2008.
    Abstract: Over the last couple of decades the advantages of including formal proof within the development process for computer based systems have become increasingly clear. This has led to a plethora of logics and proof tools that propose to fulfill this need. Nevertheless, the inclusion of theorem proving within the development process, even in domains where clear benefits can be expected, is rather an exception than the rule. One of the main goals of the formal methods endeavour is to bring the activity of formal theorem proving closer to the engineer developing computer based systems. This thesis makes some important practical contributions towards realising this goal. It hopes to show that proper tool support can not only ease theorem proving, but also strengthen its role as a design aid. It shows that it is feasible to integrate interactive proof within a reactive development environment for formal systems.
  • Constructive Design of a Hierarchy of Semantics of a Transition System by Abstract Interpretation
    Constructive Design of a Hierarchy of Semantics of a Transition System by Abstract Interpretation. Patrick Cousot, Département d'Informatique, École Normale Supérieure, 45 rue d'Ulm, 75230 Paris cedex 05, France, [email protected], http://www.di.ens.fr/~cousot. We construct a hierarchy of semantics by successive abstract interpretations. Starting from the maximal trace semantics of a transition system, we derive the big-step semantics, termination and nontermination semantics, Plotkin's natural, Smyth's demoniac and Hoare's angelic relational semantics and equivalent nondeterministic denotational semantics (with alternative powerdomains to the Egli-Milner and Smyth constructions), D. Scott's deterministic denotational semantics, the generalized and Dijkstra's conservative/liberal predicate transformer semantics, the generalized/total and Hoare's partial correctness axiomatic semantics and the corresponding proof methods. All the semantics are presented in a uniform fixpoint form and the correspondences between these semantics are established through composable Galois connections, each semantics being formally calculated by abstract interpretation of a more concrete one using Kleene and/or Tarski fixpoint approximation transfer theorems.
    Contents
    1 Introduction
    2 Abstraction of Fixpoint Semantics
      2.1 Fixpoint Semantics
      2.2 Fixpoint Semantics Approximation
      2.3 Fixpoint Semantics Transfer
      2.4 Semantics Abstraction
      2.5 Fixpoint Semantics Fusion
      2.6 Fixpoint Iterates Reordering
    3 Transition/Small-Step Operational Semantics
    4 Finite and Infinite Sequences
      4.1 Sequences
      4.2 Concatenation of Sequences
      4.3 Junction of Sequences
    5 Maximal Trace Semantics
      5.1 Fixpoint Finite Trace Semantics
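The abstract above leans on Galois connections and fixpoint transfer; as a hedged reminder, here are the standard definitions (our notation, not quoted from the paper):

```latex
% A Galois connection between complete lattices (C, <=) and (A, <=):
% abstraction alpha : C -> A and concretization gamma : A -> C such that
\[
  \forall c \in C,\ a \in A:\qquad \alpha(c) \sqsubseteq a \iff c \sqsubseteq \gamma(a)
\]
% Fixpoint transfer (informal, under suitable continuity assumptions on alpha):
% if F : C -> C and F^\# : A -> A are monotone and alpha \circ F = F^\# \circ alpha, then
\[
  \alpha(\mathrm{lfp}\,F) \;=\; \mathrm{lfp}\,F^{\#}
\]
% This is the pattern by which one semantics is "formally calculated" from a
% more concrete one.
```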
  • A Game Theoretical Semantics for a Logic of Formal Inconsistency
    A game theoretical semantics for a logic of formal inconsistency. CAN BAŞKENT, Department of Computer Science, University of Bath, BA2 7AY, England. PEDRO HENRIQUE CARRASQUEIRA, Center for Logic, Epistemology and the History of Science (CLE), Institute of Philosophy and the Humanities (IFCH), State University of Campinas (UNICAMP), Campinas 13083-896, Brazil.
    Abstract: This paper introduces a game theoretical semantics for a particular logic of formal inconsistency called mbC. Keywords: Game theoretical semantics, logic of formal inconsistency, mbC, non-deterministic runs, oracles.
    1 Motivation. Paraconsistent logics are the logics of inconsistencies. They are designed to reason about and with inconsistencies and generate formal systems that do not get trivialized under the presence of inconsistencies. Formally, a logic is paraconsistent if the explosion principle does not hold: in paraconsistent systems, there exist some formulas ϕ, ψ such that ϕ, ¬ϕ ⊬ ψ. Within the broad class of paraconsistent logics, Logics of Formal Inconsistency (LFIs, for short) present an original take on inconsistencies. Similar to the da Costa systems, LFIs form a large class of paraconsistent logics and make use of a special operator to control inconsistencies [8, 9]. Another interesting property of LFIs is that they internalize consistency and inconsistency at the object level. To date, the semantics for LFIs has been suggested using truth tables and algebraic structures. What we aim for in this paper is to present a game theoretical semantics for a particular logic within the class of LFIs. By achieving this, we attempt both to present a wider perspective on the semantics of paraconsistency and to offer a broader, more nuanced understanding of semantic games.
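For readers unfamiliar with the explosion principle mentioned in the excerpt above, here is the textbook classical derivation that paraconsistent logics such as mbC are designed to block (a standard example, not taken from the paper):

```latex
% Ex falso quodlibet (explosion) in classical logic:
%   1.  \varphi                    assumption
%   2.  \neg\varphi                assumption
%   3.  \varphi \lor \psi          from 1, disjunction introduction
%   4.  \psi                       from 2 and 3, disjunctive syllogism
% Hence classically  \varphi, \neg\varphi \vdash \psi  for arbitrary \psi.
% Paraconsistency is exactly the failure of this principle:
\[
  \exists \varphi, \psi:\quad \varphi,\ \neg\varphi \;\nvdash\; \psi
\]
```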
  • Game Semantics of Homotopy Type Theory
    Game Semantics of Homotopy Type Theory. Norihiro Yamada ([email protected]), University of Minnesota. Homotopy Type Theory Electronic Seminar Talks (HoTTEST), Department of Mathematics, Western University, February 11, 2021.
    Introduction / Background: MLTT vs. HoTT. Just like axiomatic set theory is explained by sets in an informal sense, the conceptual foundation of Martin-Löf type theory (MLTT) is computations in an informal sense (a.k.a. the BHK-interpretation): proofs/objects as computations (e.g., succ, max on N); MLTT as a foundation of constructive maths. On the other hand, homotopy type theory (HoTT) is motivated by the homotopical interpretation of MLTT: HoTT = MLTT + univalence + higher inductive types (HITs); the homotopical interpretation reads formulas as spaces, proofs/objects as points, and higher proofs/objects as paths/homotopies.
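As a compact reminder of the univalence ingredient mentioned in the slides above (the standard formulation, stated here as background rather than quoted from the talk):

```latex
% Univalence (Voevodsky), informally: for types A, B in a universe U, the
% canonical map  idtoeqv : (A =_U B) -> (A \simeq B)  is itself an equivalence, so
\[
  (A =_{\mathcal{U}} B) \;\simeq\; (A \simeq B)
\]
% Homotopically: identifications of types are paths in the universe, and
% univalence says such paths correspond exactly to equivalences of types.
```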
  • Fiendish Designs
    Fiendish Designs: A Software Engineering Odyssey. © Tim Denvir 2011.
    Preface. These are notes, incomplete but extensive, for a book which I hope will give a personal view of the first forty years or so of Software Engineering. Whether the book will ever see the light of day, I am not sure. These notes have come, I realise, to be a memoir of my working life in SE. I want to capture not only the evolution of the technical discipline which is software engineering, but also the climate of social practice in the industry, which has changed hugely over time. To what extent, if at all, others will find this interesting, I have very little idea. I mention other, real people by name here and there. If anyone prefers me not to refer to them, or wishes to offer corrections on any item, they can email me (see Contact on Home Page).
    Introduction. Everybody today encounters computers. There are computers inside petrol pumps, in cash tills, behind the dashboard instruments in modern cars, and in libraries, doctors' surgeries and beside the dentist's chair. A large proportion of people have personal computers in their homes and may use them at work, without having to be specialists in computing. Most people have at least some idea that computers contain software, lists of instructions which drive the computer and enable it to perform different tasks. The term "software engineering" wasn't coined until 1968, at a NATO-funded conference, but the activity that it stands for had been carried out for at least ten years before that.
  • The Game Semantics of Game Theory
    The game semantics of game theory. Jules Hedges. We use a reformulation of compositional game theory to reunite game theory with game semantics, by viewing an open game as the System and its choice of contexts as the Environment. Specifically, the system is jointly controlled by n ≥ 0 noncooperative players, each independently optimising a real-valued payoff. The goal of the system is to play a Nash equilibrium, and the goal of the environment is to prevent it. The key to this is the realisation that lenses (from functional programming) form a dialectica category, which have an existing game-semantic interpretation. In the second half of this paper, we apply these ideas to build a compact closed category of 'computable open games' by replacing the underlying dialectica category with a wave-style geometry of interaction category, specifically the Int-construction applied to the traced cartesian category of directed-complete partial orders.
    1 Introduction. Although the mathematics of games shares a common ancestor in Zermelo's work on backward induction, it split early on into two subjects that are essentially disjoint: game theory and game semantics. Game theory is the applied study of modelling real-world interacting agents, for example in economics or artificial intelligence. Game semantics, by contrast, uses agents to model – this time in the sense of semantics rather than mathematical modelling – situations in which a system interacts with an environment, but neither would usually be thought of as agents in a philosophical sense. On a technical level, game semantics not only restricts to the two-player zero-sum case, but moreover promotes one of the players to be the Player, and demotes the other to mere Opponent.
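The abstract above leans on "lenses (from functional programming)". The following Haskell sketch shows the bidirectional shape being referred to; the type and function names are our own illustrative definitions, not code from the paper:

```haskell
-- A concrete lens: a forward "view" from a source s to a focus a, and a
-- backward "update" that feeds a new focus b back through the source to
-- produce a new target t.  Open games compose such bidirectional maps.
data Lens s t a b = Lens
  { view   :: s -> a        -- forward pass: observe the focus
  , update :: s -> b -> t   -- backward pass: propagate information back
  }

-- Lenses compose like the games they model: forward passes compose left to
-- right, backward passes compose right to left.
compose :: Lens s t a b -> Lens a b x y -> Lens s t x y
compose outer inner = Lens
  { view   = view inner . view outer
  , update = \s y -> update outer s (update inner (view outer s) y)
  }

-- Identity lens, the unit of composition.
identityLens :: Lens s t s t
identityLens = Lens { view = id, update = \_ b -> b }
```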
  • Oral History of Sir Antony Hoare
    Oral History of Sir Antony Hoare Interviewed by: Jonathan P. Bowen Recorded: September 8, 2006 Cambridge, United Kingdom CHM Reference number: X3698.2007 © 2006 Computer History Museum Oral History of Sir Antony Hoare Jonathan Bowen: Hello, Tony. Would you like to introduce yourself briefly? Sir Antony Hoare: I’m Tony Hoare, principal researcher at Microsoft Research Limited in Cambridge. Thank you for coming here to talk to me. Bowen: Thank you, Tony. I’m looking forward to our talk together. It would be interesting to know, first of all, how you grew up, and what your mother and father did. Hoare: My father was a colonial civil servant, and my mother was the daughter of a tea planter in Ceylon. She was called out to Ceylon to act as social secretary for my grandfather, and they met in Ceylon, married there, and I was born there. Bowen: And do you have any memories of Ceylon? Hoare: Oh, yes, I have quite vivid memories of going to school there. In those days it was still quite a wild place, and we used to go out to the country -- indeed into the forest -- to see animals and elephants and tigers. Had quite exciting adventures there in the school party. Bowen: And you had brothers and sisters? Hoare: I have two younger brothers and two younger sisters. My second brother was also born in Ceylon. Bowen: And you all got on well together? You were a happy family? Hoare: Oh, yes, in the end anyway. Bowen: Yes, like all families. Yes. Hoare: We still have the opportunity to meet quite frequently.
  • Game Theoretical Semantics for Paraconsistent Logics
    Game Theoretical Semantics for Paraconsistent Logics. Can Başkent, Department of Computer Science, University of Bath, England. [email protected], www.canbaskent.net/logic
    1 Introduction. Game theoretical semantics suggests a very intuitive approach to formal semantics. The semantic verification game for classical logic is played by two players, verifier and falsifier, whom we call Heloise and Abelard respectively. The goal of Heloise in the game is to verify the truth of a given formula in a given model, whereas for Abelard it is to falsify it. The rules are specified syntactically based on the form of the formula. During the game, the given formula is broken into subformulas step by step by the players. The game terminates when it reaches the propositional literals and when there is no move to make. If the game ends up with a propositional literal which is true in the model in question, then Heloise wins the game. Otherwise, Abelard wins. Conjunction is associated with Abelard, disjunction with Heloise. That is, when the main connective is a conjunction, it is Abelard's turn to choose and make a move, and similarly, disjunction yields a choice for Heloise. The negation operator switches the roles of the players: Heloise becomes the falsifier, Abelard becomes the verifier. The major result of this approach states that Heloise has a winning strategy if and only if the given formula is true in the given model. The semantic verification game and its rules are shaped by classical logic and consequently by its restrictions. In this work, we first observe how the verification games change in non-classical, especially propositional paraconsistent logics, and give Hintikka-style game theoretical semantics for them.
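As a hedged illustration of the classical verification game described above, here is an informal Haskell rendering of the standard rules (our own sketch, not code from the paper): under optimal play, the verifier-role holder wins exactly when the formula is true in the model.

```haskell
-- Propositional formulas over named atoms; a model is a valuation of atoms.
data Formula = Atom String | Neg Formula | And Formula Formula | Or Formula Formula

data Player = Verifier | Falsifier deriving (Eq, Show)

swapRole :: Player -> Player
swapRole Verifier = Falsifier
swapRole Falsifier = Verifier

-- Who wins the game on a formula, given the valuation and the player
-- currently holding the verifier role, assuming both play optimally.
winner :: (String -> Bool) -> Player -> Formula -> Player
winner val v (Atom p) = if val p then v else swapRole v
winner val v (Neg f)  = winner val (swapRole v) f   -- negation switches roles
winner val v (And f g)                              -- falsifier picks a conjunct
  | winner val v f == swapRole v || winner val v g == swapRole v = swapRole v
  | otherwise                                                    = v
winner val v (Or f g)                               -- verifier picks a disjunct
  | winner val v f == v || winner val v g == v = v
  | otherwise                                  = swapRole v

-- Heloise (the initial verifier) has a winning strategy iff the formula is
-- true under val:  winner val Verifier f == Verifier  <=>  f is true.
```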
  • Roadmap for Enhanced Languages and Methods to Aid Verification Gary T
    Computer Science Technical Reports, 7-2006. Roadmap for Enhanced Languages and Methods to Aid Verification. Gary T. Leavens, Iowa State University; Jean-Raymond Abrial, ETH Zürich; Don Batory, University of Texas; Michael Butler, University of Southampton; Alessandro Coglio, Kestrel Institute; and additional authors as listed in the citation below.
    Recommended Citation: Leavens, Gary T.; Abrial, Jean-Raymond; Batory, Don; Butler, Michael; Coglio, Alessandro; Fisler, Kathi; Hehner, Eric; Jones, Cliff; Miller, Dale; Peyton-Jones, Simon; Sitaraman, Murali; Smith, Douglas R.; and Stump, Aaron, "Roadmap for Enhanced Languages and Methods to Aid Verification" (2006). Computer Science Technical Reports. 6. https://lib.dr.iastate.edu/cs_techreports/6
    Abstract: This roadmap describes ways that researchers in four areas -- specification languages, program generation, correctness by construction, and programming languages -- might help further the goal of verified software. It also describes what advances the "verified software" grand challenge might anticipate or demand from work in these areas. That is, the roadmap is intended to help foster collaboration between the grand challenge and these research areas. A common goal for research in these areas is to establish language designs and tool architectures that would allow multiple annotations and tools to be used on a single program.
  • A Game Theoretical Semantics for Logics of Nonsense
    A Game Theoretical Semantics for Logics of Nonsense. Can Başkent, Department of Computer Science, Middlesex University, London, UK. [email protected]
    Abstract: Logics of nonsense allow a third truth value to express propositions that are nonsense. These logics are ideal formalisms to understand how errors are handled in programs and how they propagate throughout the programs once they appear. In this paper, we give a Hintikkan game semantics for logics of nonsense and prove its correctness. We also discuss how a known solution method in game theory, the iterated elimination of strictly dominated strategies, relates to semantic games for logics of nonsense. Finally, we extend the logics of nonsense only by means of semantic games, developing a new logic of nonsense, and propose a new game semantics for Priest's Logic of Paradox.
    Keywords: Game semantics, logics of nonsense, logic of paradox, iterated elimination of strictly dominated strategies.
    1 Introduction. Logics of nonsense allow a third truth value to express propositions that are nonsense. The initial motivation behind introducing nonsensical propositions was to capture the logical behavior of semantic paradoxes that were thought to be nonsensical. As argued by Ferguson, nonsense is "infectious" [8]: "Meaninglessness [...] propagates through the language; that a subformula is meaningless entails that any complex formula in which it finds itself is likewise meaningless." Logics of nonsense, for this reason, are intriguing. Nonsense appears in various contexts. In mathematics, the expression "1/x" is considered meaningless when x = 0. Certain approaches to truth disregard paradoxes of self-reference and exclude them as meaningless.
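A small Haskell sketch of the "infectious" behaviour described above, in the style of Bochvar's weak three-valued tables (our own illustration under that assumption; the specific logics studied in the paper may differ in detail):

```haskell
-- Three truth values: true, false, and nonsense.
data TV = T | F | N deriving (Eq, Show)

-- Nonsense is infectious: any compound containing N is itself N.
notTV :: TV -> TV
notTV N = N
notTV T = F
notTV F = T

andTV :: TV -> TV -> TV
andTV N _ = N
andTV _ N = N
andTV T T = T
andTV _ _ = F

orTV :: TV -> TV -> TV
orTV N _ = N
orTV _ N = N
orTV F F = F
orTV _ _ = T

-- Example: once a subformula is nonsense, the whole formula is nonsense:
--   andTV N T == N,   orTV N F == N
```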
  • Application of Link Integrity Techniques from Hypermedia to the Semantic Web
    UNIVERSITY OF SOUTHAMPTON, Faculty of Engineering and Applied Science, Department of Electronics and Computer Science. A mini-thesis submitted for transfer from MPhil to PhD. Supervisors: Prof. Wendy Hall and Dr Les Carr. Examiner: Dr Nick Gibbins. Application of Link Integrity Techniques from Hypermedia to the Semantic Web, by Rob Vesse, February 10, 2011.
    Abstract: As the Web of Linked Data expands it will become increasingly important to preserve data and links such that the data remains available and usable. In this work I present a method for locating linked data to preserve which functions even when the URI the user wishes to preserve does not resolve (i.e. is broken/not RDF) and an application for monitoring and preserving the data. This work is based upon the principle of adapting ideas from hypermedia link integrity in order to apply them to the Semantic Web.
    Contents
    1 Introduction
      1.1 Hypothesis
      1.2 Report Overview
    2 Literature Review
      2.1 Problems in Link Integrity
        2.1.1 The 'Dangling-Link' Problem
        2.1.2 The Editing Problem
        2.1.3 URI Identity & Meaning
        2.1.4 The Coreference Problem
      2.2 Hypermedia
        2.2.1 Early Hypermedia
          2.2.1.1 Halasz's 7 Issues
        2.2.2 Open Hypermedia
          2.2.2.1 Dexter Model
        2.2.3 The World Wide Web
  • Semantics of Interaction Samson Abramsky
    Semantics of Interaction. Samson Abramsky.
    Abstract: The "classical" paradigm for denotational semantics models data types as domains, i.e. structured sets of some kind, and programs as (suitable) functions between domains. The semantic universe in which the denotational modelling is carried out is thus a category with domains as objects, functions as morphisms, and composition of morphisms given by function composition. A sharp distinction is then drawn between denotational and operational semantics. Denotational semantics is often referred to as "mathematical semantics" because it exhibits a high degree of mathematical structure; this is in part achieved by the fact that denotational semantics abstracts away from the dynamics of computation—from time. By contrast, operational semantics is formulated in terms of the syntax of the language being modelled; it is highly intensional in character; and it is capable of expressing the dynamical aspects of computation. The classical denotational paradigm has been very successful, but has some definite limitations. Firstly, fine-structural features of computation, such as sequentiality, computational complexity, and optimality of reduction strategies, have either not been captured at all denotationally, or not in a fully satisfactory fashion. Moreover, once languages with features beyond the purely functional are considered, the appropriateness of modelling programs by functions is increasingly open to question. Neither concurrency nor "advanced" imperative features such as local references have been captured denotationally in a fully convincing fashion. This analysis suggests a desideratum of Intensional Semantics, interpolating between denotational and operational semantics as traditionally conceived. This should combine the good mathematical structural properties of denotational semantics with the ability to capture dynamical aspects and to embody computational intuitions of operational semantics.
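As a toy illustration of the "classical paradigm" the abstract describes (data types as structured sets, programs as functions between them), here is a minimal compositional denotational semantics for an arithmetic expression language. This is an illustrative sketch of our own, not drawn from the paper:

```haskell
-- Syntax of a tiny expression language with variables.
data Expr = Lit Int | Var String | Add Expr Expr | Mul Expr Expr

-- Semantic domains: environments and integers; programs denote functions.
type Env = String -> Int

-- The denotation function is purely compositional and timeless -- exactly
-- the abstraction from the dynamics of computation that intensional/game
-- semantics, as the abstract argues, seeks to go beyond.
denote :: Expr -> (Env -> Int)
denote (Lit n)   = \_   -> n
denote (Var x)   = \env -> env x
denote (Add a b) = \env -> denote a env + denote b env
denote (Mul a b) = \env -> denote a env * denote b env

-- Example: denote (Add (Var "x") (Lit 1)) (\_ -> 41) == 42
```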