An Abstract Machine for Unification Grammars


An Abstract Machine for Unification Grammars, with Applications to an HPSG Grammar for Hebrew

Research Thesis submitted in partial fulfillment of the requirements for the degree of Doctor of Science

Shalom Wintner

arXiv:cmp-lg/9709013v1 23 Sep 1997

Submitted to the Technion – Israel Institute of Technology
Haifa, Tevet 5757 (January 1997)

The work described herein was supervised by Prof. Nissim Francez, under the auspices of the Faculty of Computer Science.

Acknowledgments

This work could never have been what it is without the support I received from my advisor, Nissim Francez. I am grateful to Nissim for introducing me to this subject, leading and advising me throughout the project, bearing with me when I messed things up and encouraging me all along the way. Many thanks are due to the members of my Thesis Committee: Bob Carpenter, Michael Elhadad, Alon Itai, Uzzi Ornan and Mori Rimon. I am especially indebted to Bob for his major help, mostly in the preliminary stages of this project. I also want to thank Uzzi for replacing Nissim when he was on sabbatical.

While working on this thesis I spent a fruitful summer at the University of Tübingen, Seminar für Sprachwissenschaft, during which I learned a lot from many discussions. I wish to thank Paul King for his generosity and his wisdom. Thanks are also due to Erhard Hinrichs, Dale Gerdemann, Thilo Götz, Detmar Meurers, John Griffith and Frank Morawietz. I am grateful to Evgeniy Gabrilovich for many fruitful discussions and for going over my code. Special thanks to Holger Maier and Katrine Kirk for their hospitality and their company. Finally, I want to thank Yifat and Galia for being there when I needed it.

The generous financial help of the Minerva Stipendien Komitee and Intel Israel Ltd. is gratefully acknowledged.
The research was supported by a grant from the Israeli Ministry of Science: "Programming Languages Induced Computational Linguistics" and by the Fund for the Promotion of Research in the Technion. The project was carried out under the auspices of the Laboratory for Computational Linguistics of the Technion.

Contents

Abstract 1
1 Introduction 3
  1.1 Motivation 3
  1.2 Literature Survey 4
    1.2.1 Grammatical Formalisms 4
    1.2.2 The Current Role of HPSG 5
    1.2.3 Abstract Machine Techniques 6
    1.2.4 Processing HPSG 7
    1.2.5 Computational Grammars for Hebrew 8
  1.3 Achievements of the Thesis 9
  1.4 Structure of this Document 10
2 Parsing with Typed Feature Structures 11
  2.1 Theory of Feature Structures 11
  2.2 Subsumption 14
  2.3 Well-Foundedness of Subsumption 14
  2.4 Abstract Feature Structures 15
  2.5 Unification 19
  2.6 A Linear Representation of Feature Structures 21
  2.7 Multi-rooted Structures 23
  2.8 Rules and Grammars 27
  2.9 Parsing as Operational Semantics 32
  2.10 Proof of Correctness 34
    2.10.1 Soundness 35
    2.10.2 Completeness 36
    2.10.3 Subsumption Check 37
    2.10.4 Termination 37
3 AMALIA – An Abstract Machine for Linguistic Applications 40
  3.1 The Input Language 40
    3.1.1 Type Specification 40
    3.1.2 Rules and Grammars 41
  3.2 A TFS Unification Engine 41
    3.2.1 First-Order Terms vs. Feature Structures 41
    3.2.2 Processing Scheme 42
    3.2.3 Memory Representation of Feature Structures 43
    3.2.4 Flattening Feature Structures 43
    3.2.5 Processing of a Query 44
    3.2.6 Compilation of the Type Hierarchy 45
    3.2.7 Processing of a Program 47
  3.3 Parsing 49
    3.3.1 A Parsing Algorithm 50
    3.3.2 Data Structures 52
    3.3.3 Compilation 52
    3.3.4 Effect of the Machine Instructions 54
  3.4 Optimizations and Extensions 58
    3.4.1 Lazy Evaluation of Feature Structures 58
    3.4.2 Partial Descriptions 59
    3.4.3 Empty Categories 60
    3.4.4 Lexical Ambiguity 60
    3.4.5 Functional Attachments 61
  3.5 Implementation 61
4 An HPSG-based Grammar for Hebrew 63
  4.1 Phrase Structure Schemata 63
  4.2 The Structure of Noun Phrases in Hebrew 64
  4.3 The Status of the Definite Article in Hebrew 67
    4.3.1 The Data 68
    4.3.2 HPSG Approach 68
    4.3.3 An Analysis of Hebrew Definites 69
  4.4 Noun-Noun Constructs 71
5 Conclusion 74
A List of Machine Instructions 76
B The Hebrew Grammar 77
Bibliography 85

List of Figures

2.1 A feature structure 12
2.2 An infinite decreasing sequence of TFSs 16
2.3 An abstract feature structure 18
2.4 A graph- and AVM-representation of a MRS 24
2.5 An example grammar 28
2.6 Strong derivation 30
2.7 A leftmost derivation 31
3.1 An example type hierarchy 41
3.2 An example grammar in ALE format 42
3.3 Heap representation of the TFS b(b([1] d, [1]), d) 43
3.4 Feature structures as sets of equations 44
3.5 The effect of the put instructions 44
3.6 Compiled code for the query b(b([1] d, [1]), d) 45
3.7 Type unification tables 45
3.8 unify_type[a,b] 46
3.9 The effect of the type unification instructions 47
3.10 Compiled code for the program a([3] d1, [3]) 47
3.11 Implementation of the get/unify instructions 48
3.12 The code of the unify function 49
3.13 Parsing – informal description 51
3.14 The effect of the proceed instruction 53
3.15 The effect of put_rule 53
3.16 Compiled rule 54
3.17 Compiled code for a grammar with k rules 55
3.18 Compiled code obtained for the example grammar 56
3.19 The effect of the key-manipulation instructions 57
3.20 The effect of the edge traversal instructions 57
3.21 The effect of the call instruction 58
3.22 The effect of the copy instructions 58
3.23 The fail function 59
3.24 Performance comparison of AMALIA and ALE 62
4.1 Subject-Head schema 65
4.2 Head-Complement schema 65
4.3 Head-Marker schema 66
4.4 Head-Adjunct schema 66
4.5 The lexical entry of the noun sepr 67
4.6 The effect of the Definite Lexical Rule 70
4.7 The effect of the 'nismak' lexical rule 72

Abstract

Contemporary linguistic formalisms have become so rigorous that it is now possible to view them as very high level declarative programming languages. Consequently, grammars for natural languages can be viewed as programs, and this view enables the application of methods and techniques that have proved useful for programming languages to the study of natural languages. One of the most successful implementation techniques for logic programming languages involves the use of an abstract machine. In this approach one defines an abstract machine with the following properties: it is close enough to the high-level language to allow efficient compilation to the abstract machine language, and it is sufficiently low-level to allow efficient interpretation of the machine instructions on a variety of host architectures. Abstract machines have been used for processing procedural and functional languages, but they gained much popularity for logic programming languages after the introduction of the Warren Abstract Machine (WAM); most current implementations of Prolog, as well as of other logic languages, are based on abstract machines. The incorporation of such techniques usually leads to compilers that are very efficient in terms of both space and time requirements.

In this work we have designed and implemented an abstract machine, AMALIA, for the linguistic formalism ALE, which is based on typed feature structures. This formalism is one of the most widely accepted in computational linguistics and has been used for designing grammars in various linguistic theories, most notably HPSG. AMALIA is composed of data structures and a set of instructions, augmented by a compiler from the grammatical formalism to the abstract instructions, and a (portable) interpreter of the abstract instructions. The effect of each instruction is defined using a low-level language that can be executed on ordinary hardware. The advantages of the abstract machine approach are twofold.
From a theoretical point of view, the abstract machine gives
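The core operation such a machine must make fast is the unification of feature structures. As a rough illustration of the destructive, pointer-based style used by WAM-like engines, here is a minimal untyped sketch; it is not AMALIA's actual data layout, and it assumes acyclic inputs (no occurs-check or cycle handling):

```python
# Minimal sketch of destructive feature structure unification with
# dereferenced forwarding pointers, in the style of WAM-derived engines.
# Illustrative only: untyped, acyclic inputs assumed.

class Node:
    """A feature structure node: a sort label plus named features."""
    def __init__(self, sort, features=None):
        self.sort = sort
        self.features = features or {}   # feature name -> Node
        self.forward = None              # set when this node is bound to another

def deref(node):
    # Follow forwarding pointers installed by earlier unifications.
    while node.forward is not None:
        node = node.forward
    return node

def unify(a, b):
    """Destructively unify two feature structures; False on a clash."""
    a, b = deref(a), deref(b)
    if a is b:
        return True
    if a.sort != b.sort:                 # a typed engine consults the type lattice here
        return False
    a.forward = b                        # bind a to b
    for feat, val in a.features.items():
        if feat in b.features:
            if not unify(val, b.features[feat]):
                return False
        else:
            b.features[feat] = val       # b absorbs a's extra information
    return True

# Hypothetical example: two agreement structures that share number.
x = Node("agr", {"num": Node("sg"), "per": Node("3rd")})
y = Node("agr", {"num": Node("sg")})
print(unify(x, y))                       # True: y absorbs x's 'per' feature
```

A real engine would replace the sort-equality test with a lookup in the compiled type hierarchy and would represent nodes as heap cells addressed by machine registers.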
Recommended publications
  • Lecture Notes in Computer Science
    D. Dolev, Z. Galil, M. Rodeh (Eds.), Theory of Computing and Systems: ISTCS '92, Israel Symposium, Haifa, Israel, May 27-28, 1992, Proceedings. Springer-Verlag: Berlin, Heidelberg, New York, London, Paris, Tokyo, Hong Kong, Barcelona, Budapest.
    Series Editors: Gerhard Goos (Universität Karlsruhe, Vincenz-Priessnitz-Straße 1, Postfach 69 80, W-7500 Karlsruhe, FRG) and Juris Hartmanis (Department of Computer Science, Cornell University, 5149 Upson Hall, Ithaca, NY 14853, USA).
    Volume Editors: Danny Dolev (Hebrew University, Givat Ram, Jerusalem, Israel); Zvi Galil (Columbia University, New York, NY 10027, USA, and Tel Aviv University, Ramat Aviv, Tel Aviv, Israel); Michael Rodeh (IBM Israel Ltd., Science and Technology, Technion City, Haifa 32000, Israel).
    CR Subject Classification (1991): F.1-4, B.3, C.2, G.1-2, I.1, E.4, D.2-3. ISBN 3-540-55553-6 Springer-Verlag Berlin Heidelberg New York; ISBN 0-387-55553-6 Springer-Verlag New York Berlin Heidelberg.
    This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in any other way, and storage in data banks. Duplication of this publication or parts thereof is permitted only under the provisions of the German Copyright Law of September 9, 1965, in its current version, and permission for use must always be obtained from Springer-Verlag. Violations are liable for prosecution under the German Copyright Law. © Springer-Verlag Berlin Heidelberg 1992. Printed in Germany. Typesetting: camera-ready by author/editor. Printing and binding: Druckhaus Beltz, Hemsbach/Bergstr.
  • Unification Grammars, by Nissim Francez and Shuly Wintner (Frontmatter)
    Cambridge University Press, 978-1-107-01417-6: Unification Grammars, Nissim Francez and Shuly Wintner, Frontmatter.
    Like computer programs, the grammars of natural languages can be expressed as mathematical objects. Such a formal presentation of grammars facilitates mathematical reasoning with grammars (and the languages they denote), on the one hand, and the computational implementation of grammar processors, on the other. This book presents one of the most commonly used grammatical formalisms, unification grammars, which underlies such contemporary linguistic theories as lexical-functional grammar (LFG) and head-driven phrase structure grammar (HPSG). The book provides a robust and rigorous exposition of the formalism that is both mathematically well founded and linguistically motivated. Although the material is presented formally, and much of the text is mathematically oriented, a core chapter of the book addresses linguistic applications and the implementation of several linguistic insights in unification grammars. The authors provide dozens of examples and numerous exercises (many with solutions) to illustrate key points. Graduate students and researchers in both computer science and linguistics will find this book a valuable resource.
    NISSIM FRANCEZ is professor emeritus of computer science at the Technion-Israel Institute of Technology. His research for the last twenty years has focused on computational linguistics in general, and on the formal semantics of natural language in particular, mainly within the framework of type-logical grammar. His most recent research topic is proof-theoretic semantics for natural language. He is the author of several books and approximately 150 scientific articles. He regularly serves on editorial boards and program committees of several major conferences, including the committee of the FoLLI (Association for Logic, Language, and Information) Beth Dissertation Award.
  • Chp:10.1007/3-54
    PART III: PROJECT SUMMARIES. Parallel Unification Machine Design and Simulation. Contact person: Professor Fadi N. Sibai, Dept. of Electrical Engineering, University of Akron, Akron, OH 44325-3904, U.S.A. E-mail: [email protected]
    Keywords: Parallel Unification, Term Matching, Consistency Check, Shared and Distributed Memory Organizations.
    Summary: A parallel machine architecture for first-order unification was designed by F. N. Sibai. This machine can be used as a hardware unifier by itself, or as a co-processor to a host PROLOG computer. The machine takes as input two terms to be unified, and outputs the most general unifier in case of unification success, or FAIL in case of failure. The machine is composed of an array of match processors (MPs) in charge of matching the arguments of the two terms in parallel, and can therefore detect early failures at the match level. If no match failure exists, the MPs send the generated bindings to a consistency check processor (CCP) in charge of checking the consistency of the bindings. The CCP is a powerful unification processor which maintains the bindings in a content-addressable memory for fast search and retrieval. Thus the operations of the array of MPs and the CCP are pipelined in the machine. The performance of two versions of PUM has been evaluated by simulation: the first version is based on a shared-memory organization, whereas the second uses a distributed-memory organization.
    Reference: Sibai, Fadi N., "A Parallel Machine for the Unification Algorithm: Design, Simulation, and Performance Evaluation," Ph.D.
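The MP/CCP pipeline described in the summary can be simulated sequentially. In this sketch (an illustration, not Sibai's design), `match` plays the role of a match processor, returning raw bindings or an early failure, and `consistency_check` plays the role of the CCP; for brevity the CCP merely requires repeated variables to receive identical terms, rather than fully unifying their values:

```python
# Sequential sketch of the match-then-check pipeline: terms are tuples
# ("functor", arg1, ...), variables are capitalized strings.

def is_var(t):
    return isinstance(t, str) and t[0].isupper()

def match(s, t):
    """One 'match processor': match an argument pair, returning a list of
    (variable, term) bindings, or None on an early structural failure."""
    if is_var(s):
        return [(s, t)]
    if is_var(t):
        return [(t, s)]
    if s[0] != t[0] or len(s) != len(t):      # functor/arity clash
        return None
    out = []
    for a, b in zip(s[1:], t[1:]):
        sub = match(a, b)
        if sub is None:
            return None
        out += sub
    return out

def consistency_check(bindings):
    """The 'CCP': repeated variables must receive identical terms (a
    simplification of the full unification a real CCP performs)."""
    store = {}
    for var, term in bindings:
        if var in store and store[var] != term:
            return None
        store[var] = term
    return store

def unify_terms(s, t):
    pairs = match(s, t)
    return None if pairs is None else consistency_check(pairs)

# f(X, g(a), X) unified with f(b, g(Y), b)
print(unify_terms(("f", "X", ("g", ("a",)), "X"),
                  ("f", ("b",), ("g", "Y"), ("b",))))
```

Because each top-level argument pair is matched independently, a clash in any single pair ("early failure at the match level") is detected before the consistency stage runs, which is exactly what the parallel array of MPs exploits.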
  • Amalia--A Unified Platform for Parsing and Generation
    Amalia – A Unified Platform for Parsing and Generation. Evgeniy Gabrilovich and Nissim Francez (Laboratory for Computational Linguistics, Computer Science Department, Technion – Israel Institute of Technology, Haifa, Israel) and Shuly Wintner (Seminar für Sprachwissenschaft, Universität Tübingen, Kl. Wilhelmstr., Tübingen, Germany).
    Abstract: Contemporary linguistic theories, in particular HPSG, are declarative in nature: they specify constraints on permissible structures, not how such structures are to be computed. Grammars designed under such theories are therefore suitable for both parsing and generation. However, practical implementations of such theories don't usually support bidirectional processing of grammars. We present a grammar development system that includes a compiler of grammars (for parsing and generation) to abstract machine instructions, and an interpreter for the abstract machine language. The generation compiler inverts input grammars designed … a grammar and a string, and parses the string according to the grammar. For generation, the input is a semantic formula from which a phrase is generated. The earliest HPSG parsers (e.g., Proudian & Pollard, Franz) were designed in this way. A slightly more elaborate technique is the use of some high-level unification-based logic programming language (e.g., Prolog, or LIFE, Aït-Kaci & Podelski) for specifying the grammar. Further along this line lies compilation of grammars directly into Prolog, using Prolog's internal mechanisms for performing unification. This
  • Semantics of Nondeterminism, Concurrency, and Communication*
    JOURNAL OF COMPUTER AND SYSTEM SCIENCES 19, 290-308 (1979). Semantics of Nondeterminism, Concurrency, and Communication*
    NISSIM FRANCEZ, Department of Computer Science, The Technion, Haifa, Israel; C. A. R. HOARE, Programming Research Group, Oxford University, Oxford, OX2 6PE, England; DANIEL J. LEHMANN, Department of Computer Science, The Hebrew University, Jerusalem, Israel; and WILLEM P. DE ROEVER, Department of Computer Science, University of Utrecht, Utrecht, Holland. Received September 13, 1978; revised July 25, 1979.
    I. INTRODUCTION. 1. Background and Motivation. One of the more important and active areas in the theory of programming languages is that of concurrent programs, specifically their design, definition, analysis, and verification. Due to recent developments in the technology of microprocessors, there is a trend toward languages supporting distributed activities involving communication rather than concurrent activities on some shared resources, mainly memory. Thus, it becomes very important to supply adequate tools for the definition and analysis of such programs and programming languages. One recent attempt to design such a language was made by Hoare [9], where the language CSP (communicating sequential processes) was presented informally. This is a language for the expression of nondeterministic, concurrent, and communicating programs.
    * Based on work carried out by Francez, Hoare, and de Roever at Queen's University, Belfast, during the academic year 1976-1977, supported by SRC Grant B/RG/74082. The completion of the paper was supported by NSF Grants MCS-78-673 and MCS-78-7461, during the stay of Francez and Lehmann at the University of Southern California, Los Angeles, and of de Roever at the University of California, Berkeley, 1977-1978.
  • David S. Brée
    David S. Brée, Curriculum Vitae as of 16 December 2010. Date of birth: 2 December 1939. Nationality: British. Marital status: married. Children: one. Address from 1 July 2010: Eerste Weteringdwarsstraat 32, 1017 TN, Amsterdam, The Netherlands.
    Present position: 2006 Emeritus Professor of Artificial Intelligence, School of Computer Science, University of Manchester, Manchester, M13 9PL, United Kingdom.
    Previous full-time posts:
    2010 Senior Researcher, Center for Complex Systems, Institute for Social Studies, University of Warsaw, ul. Stawki 5/7, 00-183 Warsaw. Worked with Professor Andrzej Nowak on a book about the Complex Systems Science approach to Social Psychology, to be published in 2013 by Springer.
    2006-2010 Senior Researcher, Institute for Scientific Interchange, Villa Gualino, via Settimio Severo 65, 10133 Torino, Italy. Scientific management of three FP6 projects in the field of Complex Systems, with a total budget of 5 M€ over 3 to 4 years; local supervision of 5 PhD students, all of whom have completed.
    1990-2006 Professor of Artificial Intelligence, School of Computer Science, University of Manchester, United Kingdom. Set up and ran their first PhD program, raising the annual intake to over 30 students; head of the Artificial Intelligence group.
    1988-91 Scientific Director (part-time), Foundation for Knowledge-Based Systems, Rotterdam, The Netherlands. Purpose: strategic research in the area of knowledge-based systems. Set up by SPIN, the Dutch equivalent of Alvey, with an injection of Hfl 10,000,000 (€4,500,000) over five years. Raised an equivalent
  • Off-Line Parsability and the Well-Foundedness of Subsumption
    Off-line Parsability and the Well-foundedness of Subsumption*. Shuly Wintner and Nissim Francez, Department of Computer Science, Technion, Israel Institute of Technology, 32000 Haifa, Israel. arXiv:cmp-lg/9709011v1 23 Sep 1997.
    Abstract: Typed feature structures are used extensively for the specification of linguistic information in many formalisms. The subsumption relation orders TFSs by their information content. We prove that subsumption of acyclic TFSs is well-founded, whereas in the presence of cycles general TFS subsumption is not well-founded. We show an application of this result to parsing, where the well-foundedness of subsumption is used to guarantee termination for grammars that are off-line parsable. We define a new version of off-line parsability that is less strict than the existing one; thus termination is guaranteed for parsing with a larger set of grammars.
    Keywords: Computational Linguistics, Parsing, Feature structures, Unification. This paper has not been submitted elsewhere in identical or similar form. (* To appear in the Journal of Logic, Language and Information.)
    1 Introduction. Feature structures serve as a means for the specification of linguistic information in current linguistic formalisms such as LFG (Kaplan and Bresnan, 1982), HPSG (Pollard and Sag, 1994) or (some versions of) Categorial Grammar (Haddock, Klein, and Morill, 1987). This paper focuses on typed feature structures (TFSs), which are a generalization of their untyped counterparts. TFSs are related by subsumption (see Carpenter, 1992b) according to their information content. We show that the subsumption relation is well-founded for acyclic TFSs, but not for cyclic ones. We use this result to prove that parsing terminates for grammars that are off-line parsable; this proposition is cited, but not proved, in (Shieber, 1992).
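The subsumption ordering at the heart of this result can be illustrated with a small sketch. Here feature structures are nested dicts with a reserved "sort" key; reentrancy (shared substructures) is ignored, so this is strictly weaker than full TFS subsumption, and a typed system would compare sorts via the type ordering rather than by equality:

```python
# Illustrative subsumption test for acyclic, sort-labelled feature
# structures represented as nested dicts. A structure 'a' subsumes 'b'
# when 'b' carries at least all of 'a's information.

def subsumes(a, b):
    if a["sort"] != b["sort"]:        # a typed system tests the sort ordering here
        return False
    for feat, sub in a.items():
        if feat == "sort":
            continue
        if feat not in b or not subsumes(sub, b[feat]):
            return False
    return True

general  = {"sort": "agr", "num": {"sort": "sg"}}
specific = {"sort": "agr", "num": {"sort": "sg"}, "per": {"sort": "3rd"}}
print(subsumes(general, specific))    # True: 'specific' adds information
print(subsumes(specific, general))    # False
```

With cyclic structures this recursion would not terminate, which hints at why cycles complicate the well-foundedness argument the excerpt discusses.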
  • Proceedings of the 15th Meeting on the Mathematics of Language (MOL 2017), held at Queen Mary University of London on July 13–14, 2017
    MOL 2017, The 15th Meeting on the Mathematics of Language: Proceedings of the Conference. July 13-14, 2017, London, UK.
    © 2017 The Association for Computational Linguistics. Association for Computational Linguistics (ACL), 209 N. Eighth Street, Stroudsburg, PA 18360, USA. Tel: +1-570-476-8006, Fax: +1-570-476-0860, [email protected]. ISBN 978-1-945626-72-2.
    Introduction: This is the proceedings of the 15th Meeting on the Mathematics of Language (MOL 2017), held at Queen Mary University of London, on July 13-14, 2017. The volume contains eleven regular papers and two invited papers. It also includes an abstract of a third invited talk. The regular papers were selected from a total of 23 submissions, using the EasyChair conference management system. The conference benefited from the financial support of the British Logic Colloquium (http://www.blc-logic.org) and of AYLIEN (http://aylien.com), which we gratefully acknowledge. Last but not least, we would like to express our sincere gratitude to all the reviewers for MOL 2017 and to all the people who helped with the local organization. Makoto Kanazawa, Philippe de Groote, and Mehrnoosh Sadrzadeh (editors).
    Program Chairs: Makoto Kanazawa, National Institute of Informatics (Japan); Philippe de Groote, INRIA Nancy (France). Local Chair: Mehrnoosh Sadrzadeh, Queen Mary University of London (UK). Local Organization: Sophie Chesney, Queen Mary University of London (UK); Maximilian Droog-Hayes, Queen Mary University of London (UK); Dimitri Kartsaklis, University of Cambridge (UK); Shalom Lappin, University
  • Notes on Computational Linguistics
    Notes on computational linguistics. E. Stabler, UCLA, Winter 2003 (under revision). Stabler - Lx 185/209 2003.
    Contents
    1 Setting the stage: logics, prolog, theories 4
      1.1 Summary 4
      1.2 Propositional prolog 6
      1.3 Using prolog 10
      1.4 Some distinctions of human languages 11
      1.5 Predicate Prolog 13
      1.6 The logic of sequences 18
    2 Recognition: first idea 26
      2.1 A provability predicate 27
      2.2 A recognition predicate 28
      2.3 Finite state recognizers 31
    3 Extensions of the top-down recognizer 41
      3.1 Unification grammars 41
      3.2 More unification grammars: case features 42
      3.3 Recognizers: time and space 44
      3.4 Trees, and parsing: first idea 46
      3.5 The top-down parser 47
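The "Recognition: first idea" and "top-down parser" entries above refer to a simple top-down recognizer. The notes develop it in Prolog; the same idea can be sketched in Python with a hypothetical toy grammar (both the grammar and the function are illustrative assumptions, not Stabler's code):

```python
# A minimal top-down recognizer: expand the leftmost nonterminal and
# backtrack over alternative rules. Toy grammar assumed for illustration.

GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["dog"], ["cat"]],
    "V":  [["chased"], ["slept"]],
}

def recognize(symbols, words):
    """Can this sequence of grammar symbols derive exactly this word list?"""
    if not symbols:
        return not words
    first, rest = symbols[0], symbols[1:]
    if first in GRAMMAR:                       # nonterminal: try each rule
        return any(recognize(rhs + rest, words) for rhs in GRAMMAR[first])
    # terminal: must match the next input word
    return bool(words) and words[0] == first and recognize(rest, words[1:])

print(recognize(["S"], "the dog chased the cat".split()))   # True
print(recognize(["S"], "the dog the".split()))              # False
```

As the notes' later chapters discuss, this naive strategy loops on left-recursive rules; the toy grammar above avoids them.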
  • Elephant 2000: a Programming Language Based on Speech Acts
    Elephant 2000: A Programming Language Based on Speech Acts. John McCarthy, Stanford University.
    I meant what I said, and I said what I meant. An elephant's faithful, one hundred percent! Moreover, an elephant never forgets!
    Abstract: Elephant 2000 is a vehicle for some ideas about programming language features. We expect these features to be valuable in writing and verifying programs that interact with people (e.g. transaction processing) or interact with programs belonging to other organizations (e.g. electronic data interchange).
    1. Communication inputs and outputs are in an I-O language whose sentences are meaningful speech acts, approximately in the sense of philosophers and linguists. These include questions, answers, offers, acceptances, declinations, requests, permissions and promises.
    2. The correctness of programs is partly defined in terms of proper performance of the speech acts. Answers should be truthful and responsive, and promises should be kept. Sentences of logic expressing these forms of correctness can be generated automatically from the form of the program.
    3. Elephant source programs may not need data structures, because they can refer directly to the past. Thus a program can say that an airline passenger has a reservation if he has made one and hasn't cancelled it.
    4. Elephant programs themselves can be represented as sentences of logic. Their extensional properties follow from this representation without an intervening theory of programming or anything like Hoare axioms.
    5. Elephant programs that interact non-trivially with the outside world can have both input-output specifications, relating the program's inputs and outputs, and accomplishment specifications concerning what the program accomplishes in the world.
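Point 3, the ability to "refer directly to the past", can be mimicked by deriving predicates from an append-only event history instead of from updated state. The event encoding and names below are illustrative assumptions, not McCarthy's notation:

```python
# Sketch of Elephant's point 3: 'has a reservation' is computed from the
# raw history of acts, never stored as a mutable flag. Encoding assumed
# for illustration.

history = []   # the program's entire past, never summarized away

def commit(act, passenger):
    history.append((act, passenger))

def has_reservation(passenger):
    """True iff a 'reserve' act occurred with no later 'cancel'."""
    status = False
    for act, p in history:
        if p == passenger:
            if act == "reserve":
                status = True
            elif act == "cancel":
                status = False
    return status

commit("reserve", "smith")
commit("cancel", "smith")
commit("reserve", "jones")
print(has_reservation("smith"), has_reservation("jones"))   # False True
```

The design point is that the predicate's definition reads like the English sentence in the excerpt: "has made one and hasn't cancelled it", with no reservation table to keep consistent.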
  • Technion - Israel Institute of Technology, Computer Science Department
    TECHNION - Israel Institute of Technology, Computer Science Department.
    A WEAKEST PRECONDITION SEMANTICS FOR COMMUNICATING PROCESSES, by Tzilla Elrad and Nissim Francez. Technical Report #244, May 1982. Technion - Computer Science Department - Technical Report CS0244 - 1982.
    SUMMARY: A weakest precondition semantics for communicating processes is presented, based on a centralized one-level approach. Semantic equations are given for the CSP constructs and their continuity is proved. The representation of various operational concepts, including delay, is discussed. Several examples of applying the rules are given.
    Key Words and Concepts: weakest precondition, semantics, communicating processes, distributed programming, nondeterminism, termination, deadlock. CR Categories: 5.24, 4.32.
    (1) The work of the first author was supported by NSF grant MCS-80-17577. (2) The work of the second author was supported by a grant by IBM-Israel. (*) A preliminary version of this paper was presented at the 5th International Conference on Programming, Torino, April 1982.
    1. INTRODUCTION. The importance of the axiomatic approach to formal definitions of the semantics of languages for concurrent programming is by now widely recognized [OG, AFR, FS, LS, LG]. The purpose of this paper is to investigate the use of WP (Weakest Precondition) semantics as a tool for the formal definition of the semantics of languages for concurrent and distributed programming.
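The weakest-precondition idea itself (though not the paper's CSP equations) can be sketched by modelling a nondeterministic program as a function from a state to the set of its possible outcomes, with None standing in for a nonterminating branch:

```python
# Illustrative sketch of wp semantics for nondeterministic programs.
# A program maps a state (dict) to a list of outcome states; None marks
# a branch that fails to terminate. Encoding assumed for illustration.

def wp(program, post):
    """wp(S, Q) holds in s iff every execution of S from s terminates in
    a state satisfying Q: the 'demonic' reading of nondeterminism."""
    return lambda s: all(t is not None and post(t) for t in program(s))

def assign_x_plus_1(s):            # deterministic: a single outcome, x := x + 1
    return [dict(s, x=s["x"] + 1)]

def choose_x_0_or_1(s):            # nondeterministic choice: x := 0 [] x := 1
    return [dict(s, x=0), dict(s, x=1)]

positive = lambda s: s["x"] > 0

print(wp(assign_x_plus_1, positive)({"x": 0}))   # True: wp(x:=x+1, x>0) = x+1>0
print(wp(choose_x_0_or_1, positive)({"x": 5}))   # False: the x:=0 branch fails
```

Treating nontermination as automatic failure of the postcondition is what makes wp a *total*-correctness notion, which is why continuity of the semantic equations matters in the report's setting.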
  • Categorial Grammar and the Semantics of Contextual Prepositional Phrases
    Categorial Grammar and the Semantics of Contextual Prepositional Phrases. Nissim Francez and Mark Steedman. September 12, 2004.
    The paper proposes a semantics for contextual (i.e., temporal and locative) prepositional phrases (CPPs) like "during every meeting", "in the garden", "when Harry met Sally" and "where I'm calling from". The semantics is embodied in a multi-modal extension of Combinatory Categorial Grammar (CCG). The grammar allows the strictly monotonic compositional derivation of multiple correct interpretations for "stacked" or multiple CPPs, including interpretations whose scope relations are not what would be expected on standard assumptions about surface-syntactic command and monotonic derivation. A type-hierarchy of functional modalities plays a crucial role in the specification of the fragment.
    1 Introduction. In [25], a temporal semantics for temporal preposition phrases (TPPs) was introduced, free of any syntactic analysis and syntax-semantics interface. This semantics employed some unorthodox operations, referred to as pseudo-applications, as well as semantic operations not triggered by any underlying syntactic manipulation. Furthermore, it used two types of sentential (and, hence, verbal) meanings. Here, we attempt to incorporate a semantics inspired by that of [25], but cast into the rigid syntax-semantics interface imposed by Combinatory Categorial Grammar (CCG, [27, 29]). In addition, we extend the semantics to two varieties of contextual preposition phrases (CPPs): the temporal preposition phrases (TPPs) of the earlier account, and locative preposition phrases (LPPs), discussed in a related approach by [5] but not treated before in this framework. Of particular interest is mixed modification by both kinds of CPP.
The fact that CCG is a strongly lexicalized grammatical theory allows us to bring all operations relating distinct sentential and verbal meanings under the control of lexical rules and morphemes, such as tense inflections and the temporal prepositions themselves.
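The two CCG application rules that drive such derivations can be sketched with toy categories; the encoding below is an illustrative assumption, and the modalities of the CPP fragment are omitted:

```python
# Toy CCG categories: atoms are strings; a functor category is a tuple
# (slash, result, argument), e.g. ("/", X, Y) for X/Y and ("\\", X, Y)
# for X\Y. Only the two application rules are sketched.

def forward_apply(left, right):
    """X/Y  Y  =>  X"""
    if isinstance(left, tuple) and left[0] == "/" and left[2] == right:
        return left[1]
    return None

def backward_apply(left, right):
    """Y  X\\Y  =>  X"""
    if isinstance(right, tuple) and right[0] == "\\" and right[2] == left:
        return right[1]
    return None

NP = "NP"
S  = "S"
VP = ("\\", S, NP)             # S\NP: a verb phrase seeks its subject to the left
TV = ("/", VP, NP)             # (S\NP)/NP: a transitive verb seeks its object

vp = forward_apply(TV, NP)     # "met Sally"
print(vp == VP)                # True
print(backward_apply(NP, vp))  # "Harry met Sally" => S
```

In the paper's fragment, the slashes additionally carry modalities drawn from a type hierarchy, which restrict which combinatory rules may apply; the plain slashes above correspond to the most permissive case.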