Chapter 3 Semantics

Topics

- Introduction
- Static Semantics
- Attribute Grammars
- Dynamic Semantics: Operational Semantics, Axiomatic Semantics, Denotational Semantics

Introduction

Language implementors
- Must understand how all the constructs of the language are formed and what their intended effect is when executed.

Language users
- Determine how to encode a possible solution of a problem (a program) using the reference manual of the programming language.
- Need less knowledge of how to correctly define the semantics of a language.

Well-designed programming language
- Semantics should follow directly from syntax: the form of a statement should strongly suggest what the statement is meant to accomplish.

Definition of a programming language
- Complete: semantics and syntax are fully defined.
- A language should provide a variety of different constructs, each one with a precise definition.

Language manuals
- The definition of semantics is given in ordinary natural language.
- For each construct, the syntax is a rule (or set of rules) from a BNF or other formal grammar; the semantics is a few paragraphs and some examples.

Natural language description
- Ambiguous in its meaning: different readers come away with different interpretations of the semantics of a language construct.
- A method is needed for giving a readable, precise, and concise definition of the semantics of an entire language.

Static Semantics

BNFs cannot describe all of the syntax of programming languages.
- Some context-specific parts are left out.
- Example: is there a way to generate L = {a^n b^n c^n} using a context-free grammar or a BNF? An attempt:

    <string> ::= <aseq> <bseq> <cseq>
    <aseq>   ::= a | <aseq> a
    <bseq>   ::= b | <bseq> b
    <cseq>   ::= c | <cseq> c

- This grammar generates L' = {a^k b^m c^n | k >= 1, m >= 1, n >= 1}, not L. In fact, no context-free grammar generates L.

Some problems have nothing to do with "meaning" in the sense of run-time behavior.
- They concern the legal form of the program: static semantics refers to type checking and resolving declarations.
- Examples:
  - All variables must be declared before they are referenced.
  - Ada: the name at the end of a procedure must match the procedure's name.
  - Both sides of an assignment must be of the same type.

The earliest attempts to add semantics to a programming language added extensions to the BNF grammar that defined the language: given a parse tree for a program, additional information could be extracted from that tree.

Attribute Grammars: Basic Concepts

- An attribute grammar is a context-free grammar extended to provide context-sensitive information by appending attributes to each node of a parse tree.
- Each distinct symbol in the grammar has associated with it a finite, possibly empty, set of attributes.
- Each attribute has a domain of possible values.
- An attribute may be assigned values from its domain during parsing.
- Attributes can be evaluated in assignments and conditions.

Attribute Grammars: Generalities

Two classes of attributes:
- Synthesized attribute: gets its value from the attributes attached to its children (the subtree below the node). Used to pass semantic information up a parse tree.
- Inherited attribute: gets its value from the attributes attached to its parent (the subtree above the node). Used to pass semantic information down and across a tree.

[Figure: a parse tree with a node A below the start symbol S; inherited attributes flow down into A from the subtree above it, synthesized attributes flow up into A from the subtree below it.]
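To make the two directions of flow concrete, here is a minimal sketch (not taken from the slides; the names Node, depth, and size, and the choice of attributes, are hypothetical) in which an inherited attribute is passed down from parent to child while a synthesized attribute is computed bottom-up from the children.

    # A minimal sketch of attribute flow on a parse tree (hypothetical names).
    # Each node receives an inherited attribute from its parent (here: depth)
    # and computes a synthesized attribute from its children (here: size).

    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Node:
        label: str
        children: List["Node"] = field(default_factory=list)
        depth: int = 0   # inherited: flows down from the parent
        size: int = 0    # synthesized: flows up from the children


    def evaluate(node: Node, depth: int = 0) -> int:
        node.depth = depth                      # inherited attribute: set by the parent
        node.size = 1 + sum(evaluate(c, depth + 1) for c in node.children)
        return node.size                        # synthesized attribute: built from the children


    tree = Node("S", [Node("A"), Node("B", [Node("b")])])
    evaluate(tree)
    print(tree.size)               # 4: the whole tree has four nodes
    print(tree.children[1].depth)  # 1: node B sits one level below S

Any attribute that only ever looks downward can be computed this way in a single traversal; attributes that look both ways may need several passes, which is exactly what the machinery in the following slides formalizes.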
Attribute Grammar Definition

- Associate with each production in the grammar functions that compute the values of the attributes.
- These local definitions, associated with each production of the grammar, define the values of the attributes for all parse trees.
- Given the definitions and a parse tree, algorithms exist to compute the attributes of all the nodes in the tree.

Attribute Grammars

Start with the underlying context-free grammar G = <N, T, P, S>. For every production p in P:
- n(p) is the number of terminal and nonterminal symbols in the right-hand side string α; if α is the empty string, then n(p) = 0.
- Sometimes each symbol of a production is considered individually: a production p ∈ P, A → α, is written p0 → p1 p2 … pn(p).

Augment the context-free grammar with attributes and semantic rules.
- Set of attributes: At.
  - For each attribute a ∈ At there is an associated set of values Domain(a); an attribute is just a name for a set of values.
  - At is split into two disjoint classes, the inherited attributes In and the synthesized attributes Syn (At = In ∪ Syn and In ∩ Syn = ∅).

Attribute Grammars: attributes

- There is a set of attributes At(x) ⊆ At for every grammar symbol x ∈ N ∪ T.
  - At(x) can be seen as additional information about the symbol x.
  - In(x) = { a ∈ At(x) | a ∈ In } and Syn(x) = { a ∈ At(x) | a ∈ Syn }.
- Requirements:
  - In(S) = ∅ (the start symbol can inherit no information).
  - For all t ∈ T, Syn(t) = ∅ (there is no structure beneath a terminal from which to synthesize information).

Attribute Grammars: rules

- The same attribute can be associated with different symbols appearing in the same grammar rule.
  - Example: for S → A B, all three symbols could have an inherited attribute int associated with them: In(S) = In(A) = In(B) = {int}.
- It is therefore impossible to consider the set of attributes associated with all the symbols of a production without losing track of which attributes appear more than once. It is even more confusing for productions in which a nonterminal appears more than once, as in S → A S A.

Attribute Grammars: attribute occurrences

- An attribute occurrence of a rule p is an ordered pair <a, j> of an attribute and a natural number, representing the attribute a at position j in production p.
- For a particular rule p ∈ P, the attribute occurrence at position j is written pj.a.
- The set of attribute occurrences for a production p is defined as AO(p) = { pj.a | a ∈ At(pj), 0 ≤ j ≤ n(p) }.

Attribute Grammars: flow of attribute occurrences

The set of attribute occurrences for a rule is divided into two disjoint subsets.
- Defined occurrences for a production p:
      DO(p) = { p0.s | s ∈ Syn(p0) } ∪ { pj.i | i ∈ In(pj), 1 ≤ j ≤ n(p) }
  In a parse tree, DO(p) is the information flowing out of the rule at the node labeled p0: the occurrences that the production's semantic rules must define.
- Used occurrences for a production p:
      UO(p) = { p0.i | i ∈ In(p0) } ∪ { pj.s | s ∈ Syn(pj), 1 ≤ j ≤ n(p) }
  In a parse tree, UO(p) is the information flowing into the rule at the node labeled p0.

[Figure: for the rule S → A B, the inherited attributes In(S) arrive from the subtree above S, and the synthesized attributes Syn(A) and Syn(B) arrive from the subtrees below A and B; the rule in turn defines Syn(S), In(A), and In(B).]

Example (rule S → A B): the used attribute occurrences (the information flowing in) are In(S), Syn(A), and Syn(B); the defined attribute occurrences (the information flowing out) are Syn(S), In(A), and In(B).

                 S         A         B
    used         In(S)     Syn(A)    Syn(B)
    defined      Syn(S)    In(A)     In(B)
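The DO(p) and UO(p) definitions translate directly into a small computation. The following sketch is only an illustration under stated assumptions: a production is given as a left-hand-side symbol plus a list of right-hand-side symbols, and the attribute names "i" and "s" are made up for the S → A B example; none of these names come from the slides.

    # Sketch: computing defined (DO) and used (UO) attribute occurrences
    # for a production p0 -> p1 ... pn, given In(x) and Syn(x) per symbol.
    # Occurrences are written "pj.a", with j the position in the production.

    def defined_occurrences(prod, In, Syn):
        lhs, rhs = prod
        out = {f"p0.{a}" for a in Syn.get(lhs, set())}           # Syn of the LHS
        out |= {f"p{j}.{a}" for j, x in enumerate(rhs, start=1)  # In of each RHS symbol
                for a in In.get(x, set())}
        return out

    def used_occurrences(prod, In, Syn):
        lhs, rhs = prod
        out = {f"p0.{a}" for a in In.get(lhs, set())}             # In of the LHS
        out |= {f"p{j}.{a}" for j, x in enumerate(rhs, start=1)   # Syn of each RHS symbol
                for a in Syn.get(x, set())}
        return out

    # The S -> A B rule from the example, with illustrative attribute sets.
    In  = {"S": {"i"}, "A": {"i"}, "B": {"i"}}
    Syn = {"S": {"s"}, "A": {"s"}, "B": {"s"}}
    p = ("S", ["A", "B"])

    print(sorted(defined_occurrences(p, In, Syn)))  # ['p0.s', 'p1.i', 'p2.i']
    print(sorted(used_occurrences(p, In, Syn)))     # ['p0.i', 'p1.s', 'p2.s']

The output matches the table above: the rule defines Syn(S), In(A), In(B) and uses In(S), Syn(A), Syn(B).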
Attribute Grammars: semantic functions

- For every attribute occurrence v ∈ DO(p) there is a semantic function fp,v.
  - fp,v defines the value of v in terms of the values of the attribute occurrences in UO(p).
  - There is no requirement that all the attribute occurrences of UO(p) are used by fp,v.
  - The dependency set Dp,v of fp,v is the set of attribute occurrences actually used (a subset of UO(p)).
  - Dp,v can be empty: the value of the attribute is then computed without any additional information, and fp,v is a constant.

Attribute Grammar

An attribute grammar is a context-free grammar together with two disjoint sets of attributes (inherited and synthesized) and semantic functions for all defined attribute occurrences.

Attribute Grammar: binary digits example

A context-free grammar that generates strings of binary digits:

    p: B → D
    q: B → D B
    r: D → 0
    s: D → 1

Attributes:
- B has the synthesized attributes pos and val; D has the synthesized attribute val and the inherited attribute pow.
- val accumulates the value of the binary number.
- pow and pos keep track of the position and the power of 2.

Compute the defined and the used occurrences for each production. The defined occurrences are the synthesized attributes of the left-hand side plus the inherited attributes of all the grammar symbols of the right-hand side.

    production   Defined                   Used
    p            B.pos, B.val, D.pow       D.val
    q            B1.pos, B1.val, D.pow     B2.pos, B2.val, D.val
    r            D.val                     D.pow
    s            D.val                     D.pow

Function definitions for the eight defined attribute occurrences:

    p: B → D        B.pos := 1
                    B.val := D.val
                    D.pow := 0
    q: B1 → D B2    B1.pos := B2.pos + 1
                    B1.val := B2.val + D.val
                    D.pow  := B2.pos
    r: D → 0        D.val := 0
    s: D → 1        D.val := 2^D.pow

[Figure: evaluation of the parse tree for 1010. Working bottom-up, the B nodes get (pos, val) = (1, 0), (2, 2), (3, 2), (4, 10); the D nodes get pow = 0, 1, 2, 3 and val = 0, 2, 0, 8. The value at the root is 10.]
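This grammar can be evaluated in a single traversal: B2 is evaluated first so that its synthesized pos can be handed to D as the inherited pow, and val and pos are then synthesized on the way back up. The sketch below is a hand-written illustration of that order, not an implementation from the slides; the tuple encoding of the parse tree and the helper names eval_B, eval_D, and tree_for are invented for the example.

    # Sketch: evaluating the binary-digits attribute grammar by hand.
    # A B node is encoded as (digit, rest), where rest is the nested B node
    # or None; this mirrors the productions p: B -> D and q: B -> D B.

    def eval_B(digit, rest):
        """Return the synthesized attributes (pos, val) of a B node."""
        if rest is None:                       # production p: B -> D
            pow_ = 0                           # D.pow := 0
            pos = 1                            # B.pos := 1
            val = eval_D(digit, pow_)          # B.val := D.val
        else:                                  # production q: B1 -> D B2
            pos2, val2 = eval_B(*rest)         # evaluate B2 first: its pos is needed below
            pow_ = pos2                        # D.pow := B2.pos
            pos = pos2 + 1                     # B1.pos := B2.pos + 1
            val = val2 + eval_D(digit, pow_)   # B1.val := B2.val + D.val
        return pos, val

    def eval_D(digit, pow_):
        """Productions r and s: D.val := 0, or D.val := 2 ** D.pow."""
        return 0 if digit == "0" else 2 ** pow_

    def tree_for(bits):
        """Build the nested (digit, rest) structure; the rightmost digit is innermost."""
        node = (bits[-1], None)
        for d in reversed(bits[:-1]):
            node = (d, node)
        return node

    print(eval_B(*tree_for("1010")))   # (4, 10): four digits, value 10

The printed pair (4, 10) reproduces the root of the evaluated parse tree in the figure above: pos = 4 and val = 10 for the string 1010.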
Dynamic Semantics

- The semantics of a programming language is the definition of the meaning of any program that is syntactically valid.
- Another way to view program meaning is to start with a formal …