Situations and Syntactic Structures

Total Pages: 16

File Type: PDF, Size: 1020 KB

Situations and Syntactic Structures
Rethinking Auxiliaries and Order in English
Gillian Ramchand
The MIT Press, Cambridge, Massachusetts / London, England

© 2016 Massachusetts Institute of Technology
All rights reserved. No part of this book may be reproduced in any form or by any electronic or mechanical means (including photocopying, recording, or information storage and retrieval) without permission in writing from the publisher. For information about special quantity discounts, please email special [email protected]. This book was set in Syntax and Times Roman by the author. Printed and bound in the United States of America.

Library of Congress Cataloging-in-Publication Data
Situations and Structure / Gillian Ramchand.
p. cm. Includes bibliographical references and index. ISBN ????
10 9 8 7 6 5 4 3 2 1

Contents

Preface
1 Introduction to Events and Situations in Grammar
  1.1 Linguistic Generalizations and Constraints on the Syn-Sem Mapping
  1.2 Event Properties and Event Instantiations
    1.2.1 Philosophical Antecedents
    1.2.2 A Quotational Semantics for Natural Language
  1.3 The Grammar of Auxiliation
    1.3.1 Ordering
    1.3.2 Lexical Specification and Polysemy
  1.4 Morphology and Spanning
  1.5 Roadmap
2 The Progressive in English
  2.1 The Syntax of the Progressive: The Progressive is in the First Phase
    2.1.1 Expletive Associates
    2.1.2 VP fronting and pseudoclefts
    2.1.3 British nonfinite do-substitution
  2.2 The Semantics of the Progressive
  2.3 The Present Proposal
    2.3.1 The Progressive as an ‘Identifying State’
    2.3.2 Relating Event Properties to Situational Particulars
  2.4 Conclusion
3 The Passive and its Participle
  3.1 The Stative Participle in -en/ed
    3.1.1 Event Implications
    3.1.2 An Implementation in Terms of Reduced Spans
  3.2 The Eventive Passive
    3.2.1 Blocking
    3.2.2 Interaction of Passive and Progressive within EvtP
    3.2.3 Adjectivalization
4 The Participle, the Perfect and the Spatiotemporal Domain
  4.1 Semantic Background to the English Perfect
    4.1.1 Aktionsart Sensitivity
    4.1.2 Temporal Modification and the Present Perfect Puzzle
    4.1.3 Lifetime Effects and Current Relevance
  4.2 The Proposal
    4.2.1 Accounting for the Pragmatic/Lifetime Effects
    4.2.2 Temporal Modification and the Perfect
  4.3 Summary
5 Modals and the Spatiotemporal Domain
  5.1 Introduction
  5.2 What We Know: Syntax
    5.2.1 Thematic Relations: ‘Raising’ vs. ‘Control’
    5.2.2 Scope with respect to a Quantified Subject
    5.2.3 Modals and Symmetric Predicates
    5.2.4 Ordering and Typology
    5.2.5 Taking Stock
  5.3 Interaction with Negation
    5.3.1 English Modals are Idiosyncratic
    5.3.2 A Semantic Account of the Idiosyncrasy?
  5.4 Semantic Properties of Modals
    5.4.1 The Classical Theory
  5.5 Circumstantial Modals as Modifiers of Spatiotemporal Properties
  5.6 Conclusion
6 Modals and Generalized Anchoring
  6.1 Temporal Properties of Modals
  6.2 Epistemic Modals as Modifiers of Assertions
    6.2.1 Aktionsart Sensitivity of Epistemic Modals
    6.2.2 Modals Embedding the Perfect
  6.3 Evidence for Choice among Live-Alternatives as Basic to Modal Meaning
    6.3.1 Relation to Dynamic Modal Interpretations
    6.3.2 The Semantic ‘weakness’ of Universal Modals
    6.3.3 The Puzzle of Deontic Modality and its Interaction with Disjunction (The Paradox of Free Choice Disjunction, Ross 1941)
  6.4 Conclusion
7 Summary and Future Prospects
  7.1 Architecture and Semantic Zones
  7.2 Insertion: Lexical vs. Functional Items
  7.3 Summary of the Pieces
  7.4 Auxiliary Ordering Revisited
  7.5 Open Questions and Further Research
    7.5.1 The Nominal Domain
    7.5.2 On the Universal vs. the Language Specific
    7.5.3 The Future and the Search for Explanations
Bibliography

Preface

In some ways I have been writing this book in my head for twenty years, working on various aspects of the syntax and semantics of the verbal extended projection in English and other languages. The immediate impetus came from the collaborative paper I wrote with Peter Svenonius on reconciling minimalist and cartographic approaches to phrase structure (Ramchand and Svenonius 2014), and I knew that the compositional semantics component of that paper was a huge promissory note that had to be redeemed if the enterprise were to succeed. The present book is motivated by a conviction about what the relationship between phrase structure and semantic interpretation should look like, a conviction I found that many shared, but which was rather difficult to actually implement. Implementing the intuition required some radical changes to the assumed semantic ontology for natural language, ones which I believe are more in line with internalist intuitions (cf. Chomsky 1995, Pietroski 2005), without giving up on a system that grounds interpretation in truth. The formal semantic framework I have in mind to underpin the kind of system I propose is a version of Kit Fine’s truthmaker semantics (Fine 2014), using situations as exact verifiers for natural language clauses. The system is quite different from the kind of semantics that takes worlds as its foundation; in it, possible and impossible situations are instead primitives of the ontology. The intuition that is important to me is that the syntax of natural languages gives evidence on the meaning side for a natural language ontology which might be quite different from the one that seems most compelling from a purely metaphysical point of view. Like Fine (and Moltmann 2018), I am more concerned with how language puts meaning together than with how truthmakers are connected to a metaphysics of the real world (if indeed such a thing is even possible).
One crucial innovation in the system requires reifying the linguistic symbol itself as an object in the ontology. I do not think this is a ‘trick’; in fact it simply acknowledges a feature of the natural language system that is very important: self-referentiality and a deep indexicality (relativization to the particular speaker), which I believe has desirable consequences that extend far beyond the scope of the present monograph.

The first few chapters of this book were written while I was on sabbatical at the University of Edinburgh in the second half of 2015. I thank the University of Tromsø for its generous sabbatical provision and its support of pure theoretical research, and the University of Edinburgh for being a welcoming host. In the early stages of writing I benefitted a great deal from correspondence with Lucas Champollion and from discussions with Ronnie Cann, to whom I am very grateful. I would also like to thank audiences at the LOT Winter School in Nijmegen in January 2017 and at a mini course at the University of Budapest in February 2017; I thank Marcel den Dikken and Eva Dekány for inviting me to teach at the latter event. I also benefitted from discussion and feedback at Daniel Altshuler’s UMass semantics seminar in Spring 2017. I would further like to thank Daniel Altshuler personally and Miriam Butt for very useful detailed comments on an intermediate draft, and Sergey Minor for invaluable and detailed feedback on the whole prefinal manuscript. Special thanks go to Robert Henderson for turning up in Tromsø in late 2016 and giving a talk at our colloquium series on ideophones, which contributed the final piece of the puzzle. I remember that eureka moment very well, when I realised that the exoticism of ideophones was just ‘the truth standing on its head to get attention’. A big thank you goes to my colleagues at the University of Tromsø, and in particular the CASTLFish milieu (Tarald Taraldsen, Antonio Fábregas, Sergey Minor, Peter Svenonius and Björn Lundquist deserving special mention in this regard), for reacting and commenting on all things related to syntax and morphology, for providing the intellectual frame for the kinds of questions I find myself asking, and for providing standards for the kinds of answers that satisfy.
Recommended publications
  • Operational Semantics
Operational Semantics. Class notes for a lecture given by Mooly Sagiv, Tel Aviv University, 24/5/2007. By Roy Ganor and Uri Juhasz. Reference: Semantics with Applications, H. Nielson and F. Nielson, Chapter 2. http://www.daimi.au.dk/~bra8130/Wiley_book/wiley.html

Introduction. Semantics is the study of the meaning of languages. Formal semantics for programming languages is the study of the formalization of the practice of computer programming. A computer language consists of a (formal) syntax, describing the actual structure of programs, and a semantics, which describes the meaning of programs. Many tools exist for programming languages (compilers, interpreters, verification tools, …); the basis on which these tools can cooperate is the formal semantics of the language.

Formal Semantics. When designing a programming language, formal semantics gives a (hopefully) unambiguous definition of what a program written in the language should do. This definition has several uses:
• People learning the language can understand the subtleties of its use
• The model over which the semantics is defined (the semantic domain) can indicate what the requirements are for implementing the language (as a compiler/interpreter/…)
• Global properties of any program written in the language, and any state occurring in such a program, can be understood from the formal semantics
• Implementers of tools for the language (parsers, compilers, interpreters, debuggers etc) have a formal reference for their tool and a formal definition of its correctness/completeness
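The split the notes draw between syntax (the structure of programs) and semantics (their meaning) can be made concrete with a small sketch. The following Python fragment is an editorial illustration, not part of Sagiv's notes: it defines an abstract syntax for arithmetic expressions over a state and a semantic function loosely in the spirit of Nielson and Nielson's A⟦a⟧s; all type and function names here are assumptions.

```python
# A minimal sketch (not from the notes): abstract syntax and a denotation-style
# evaluator for arithmetic expressions, in the spirit of Nielson & Nielson's A[[a]]s.
from dataclasses import dataclass
from typing import Dict, Union

State = Dict[str, int]  # a state maps variable names to integer values

@dataclass
class Num:              # numerals: n
    value: int

@dataclass
class Var:              # variables: x
    name: str

@dataclass
class BinOp:            # compound expressions: a1 + a2, a1 - a2, a1 * a2
    op: str
    left: "Aexp"
    right: "Aexp"

Aexp = Union[Num, Var, BinOp]

def A(a: Aexp, s: State) -> int:
    """Meaning of an arithmetic expression relative to a state s."""
    if isinstance(a, Num):
        return a.value
    if isinstance(a, Var):
        return s[a.name]
    ops = {"+": lambda x, y: x + y, "-": lambda x, y: x - y, "*": lambda x, y: x * y}
    return ops[a.op](A(a.left, s), A(a.right, s))

# Example: the meaning of x * 3 + 1 in a state where x = 5 is 16.
print(A(BinOp("+", BinOp("*", Var("x"), Num(3)), Num(1)), {"x": 5}))
```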
  • Extensible Transition System Semantics
Extensible Transition System Semantics. Casper Bach Poulsen. Submitted to Swansea University in fulfilment of the requirements for the Degree of Doctor of Philosophy. Department of Computer Science, Swansea University, 2015.

Declaration: This work has not been previously accepted in substance for any degree and is not being concurrently submitted in candidature for any degree. Signed (candidate); Date.

Statement 1: This thesis is the result of my own investigations, except where otherwise stated. Other sources are acknowledged by footnotes giving explicit references. A bibliography is appended. Signed (candidate); Date.

Statement 2: I hereby give my consent for my thesis, if accepted, to be available for photocopying and for inter-library loan, and for the title and summary to be made available to outside organisations. Signed (candidate); Date.

Abstract: Structural operational semantics (SOS) come in two main styles: big-step and small-step. Each style has its merits and drawbacks, and it is sometimes useful to maintain specifications in both styles. But it is both tedious and error-prone to maintain multiple specifications of the same language. Additionally, big-step SOS has poor support for
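To make the big-step versus small-step contrast in the abstract concrete, here is an editorial sketch, not taken from the thesis: the same toy language of integer additions is given both a big-step evaluator and a small-step reduction function, and the overlap between the two is exactly the kind of duplication the abstract describes as tedious and error-prone to maintain. All names are assumptions.

```python
# A minimal sketch (not from the thesis): one toy language, two styles of semantics.
from dataclasses import dataclass
from typing import Union

@dataclass
class Num:
    value: int

@dataclass
class Plus:
    left: "Exp"
    right: "Exp"

Exp = Union[Num, Plus]

def eval_big(e: Exp) -> int:
    """Big-step: relate an expression directly to its final value."""
    if isinstance(e, Num):
        return e.value
    return eval_big(e.left) + eval_big(e.right)

def step_small(e: Exp) -> Exp:
    """Small-step: perform one step of computation (left operand first)."""
    if isinstance(e, Plus):
        if isinstance(e.left, Num) and isinstance(e.right, Num):
            return Num(e.left.value + e.right.value)   # reduce a redex
        if isinstance(e.left, Num):
            return Plus(e.left, step_small(e.right))   # step inside the right operand
        return Plus(step_small(e.left), e.right)       # step inside the left operand
    raise ValueError("values do not step")

def eval_small(e: Exp) -> int:
    """Drive the small-step relation until a value is reached."""
    while not isinstance(e, Num):
        e = step_small(e)
    return e.value

e = Plus(Plus(Num(1), Num(2)), Num(3))
assert eval_big(e) == eval_small(e) == 6   # the two specifications must agree
```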
  • Scaffolding Learning of Language Structures with Visual‐Syntactic Text Formatting
UC Irvine, UC Irvine Previously Published Works.
Title: Scaffolding learning of language structures with visual-syntactic text formatting
Permalink: https://escholarship.org/uc/item/6235t25b
Journal: British Journal of Educational Technology, 50(4)
ISSN: 0007-1013
Authors: Park, Y; Xu, Y; Collins, P; et al.
Publication Date: 2018
DOI: 10.1111/bjet.12689
Peer reviewed. eScholarship.org, powered by the California Digital Library, University of California.

British Journal of Educational Technology, Vol 0 No 0 2018, 1–17, doi:10.1111/bjet.12689. Scaffolding learning of language structures with visual-syntactic text formatting. Youngmin Park, Ying Xu, Penelope Collins, George Farkas and Mark Warschauer.

Youngmin Park is a Lecturer at Pusan National University, Korea. She received her Ph.D. degree from the University of California, Irvine, specializing in Language, Literacy and Technology. Ying Xu is a Ph.D. student at the University of California, Irvine, with a specialization in Language, Literacy and Technology. Penelope Collins is an associate professor at the University of California, Irvine. Her research examines the development of language and literacy skills for children from linguistically diverse backgrounds. George Farkas is a professor at the University of California, Irvine. He has employed a range of statistical approaches and databases to examine the causes and consequences of the reading achievement gap across varying age groups and educational settings. Mark Warschauer is a professor at the University of California, Irvine. He works on a range of research projects related to digital media in education. Address for correspondence: Ying Xu, University of California Irvine, 3200 Education Bldg, Irvine, CA 92697, USA.
  • A. Detailed Course Structure of MA (Linguistics)
A. Detailed Course Structure of M.A. (Linguistics)
(Columns: Course Code | Course Title | Status | Module & Marks | Credits)

Semester I
  LIN 101 | Introduction to Linguistics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 102 | Levels of Language Study | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 103 | Phonetics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 104 | Basic Morphology & Basic Syntax | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 105 | Indo-European Linguistics & Schools of Linguistics | Core | 20(M1)+20(M2)+10(IA) | 4

Semester II
  LIN 201 | Phonology | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 202 | Introduction to Semantics & Pragmatics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 203 | Historical Linguistics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 204 | Indo-Aryan Linguistics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 205 | Lexicography | Core | 20(M1)+20(M2)+10(IA) | 4

Semester III
  LIN 301 | Sociolinguistics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 302 | Psycholinguistics | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 303 | Old Indo-Aryan | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 304 | Middle Indo-Aryan | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 305 | Bengali Linguistics | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 306 | Stylistics | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 307 | Discourse Analysis | Generic Elective | 20(M1)+20(M2)+10(IA) | 4

Semester IV
  LIN 401 | Advanced Morphology & Advanced Syntax | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 402 | Field Methods | Core | 20(M1)+20(M2)+10(IA) | 4
  LIN 403 | New Indo-Aryan | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 404 | Language & the Nation | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 405 | Language Teaching | Discipline Specific Elective | 20(M1)+20(M2)+10(IA) | 4
  LIN 406 | Term Paper | Discipline Specific Elective | 50 | 6
  LIN 407 | Language Classification & Typology | Generic Elective | 20(M1)+20(M2)+10(IA) | 4

B.
  • Chapter 7: Linguistics as a Science of Structure, Ryan M. Nefdt
Chapter 7: Linguistics as a science of structure. Ryan M. Nefdt, University of the Western Cape.

Generative linguistics has rapidly changed during the course of a relatively short period. This has caused many to question its scientific status as a realist scientific theory (Stokhof & van Lambalgen 2011; Lappin et al. 2000). In this chapter, I argue against this conclusion. Specifically, I claim that the mathematical foundations of the science present a different story below the surface. I agree with critics that due to the major shifts in theory over the past 80 years, linguistics is indeed opened up to the problem of pessimistic meta-induction or radical theory change. However, I further argue that a structural realist approach (Ladyman 1998; French 2006) can save the field from this problem and at the same time capture its structural nature. I discuss particular historical instances of theory change in generative grammar as evidence for this interpretation and finally attempt to extend it beyond the generative tradition to encompass previous frameworks in linguistics.

1 Introduction. The generativist revolution in linguistics started in the mid-1950s, inspired in large part by insights from mathematical logic and in particular proof theory. Since then, generative linguistics has become a dominant paradigm, with many connections to both the formal and natural sciences. At the centre of the newly established discipline was the syntactic or formal engine, the structures of which were revealed through modelling grammatical form. The generativist paradigm in linguistics initially relied heavily upon the proof-theoretic techniques introduced by Emil Post and other formal logicians to model the form language takes (Tomalin 2006; Pullum 2011; 2013). Yet despite these aforementioned formal beginnings, the generative theory of linguistics has changed its commitments quite

Footnote 1: Here my focus will largely be on the formal history of generative syntax, but I will make some comments on other aspects of linguistics along the way.
  • Syntactic Structures and Their Symbiotic Guests. Notes on Analepsis from the Perspective of On-Line Syntax
Pragmatics 24:3.533-560 (2014). International Pragmatics Association. DOI: 10.1075/prag.24.3.05aue

Syntactic Structures and Their Symbiotic Guests: Notes on Analepsis from the Perspective of On-Line Syntax*
Peter Auer

Abstract: The empirical focus of this paper is on utterances that re-use syntactic structures from a preceding syntactic unit. Next utterances of this type are usually treated as (coordination) ellipsis. It is argued that from an on-line perspective on spoken syntax, they are better described as structural latency: a grammatical structure already established remains available and can therefore be made use of with one or more of its slots being filled by new material. A variety of cases of this particular kind of conversational symbiosis are discussed. It is argued that they should receive a common treatment. A number of features of the general host/guest relationship are discussed.

Keywords: Analepsis; On-line syntax; Structural latency.

"Ein Blick in die erste beste Erzählung und eine einfache Ueberlegung muss beweisen, dass jede frühere Aeusserung des Erzählenden die Exposition aller nachfolgenden Prädikate bildet." (Wegener 1885: 46) ["A glance at the first narrative that comes to hand and a simple consideration must prove that every earlier utterance of the narrator forms the exposition for all subsequent predicates."]

1. Introduction. The notion of 'ellipsis' is often regarded with some skepticism by Interactional Linguists: the orientation to 'full' sentences is all too obvious (cf. Selting 1997), and the idea that speakers produce complete sentences just in order to delete some parts of them afterwards surely fails to account for the processual dynamics of sentence production and understanding in time (cf. Kindt 1985). On the other hand, there can be little doubt that speakers often produce utterance units that could not be understood

* I wish to thank Elizabeth Couper-Kuhlen and Susanne Günthner for their helpful comments on a previous version of this paper.
  • LING83600: Context-Free Grammars
LING83600: Context-free grammars. Kyle Gorman.

1 Introduction. Context-free grammars (or CFGs) are a formalization of what linguists call "phrase structure grammars". The formalism is originally due to Chomsky (1956), though it has been independently discovered several times. The insight underlying CFGs is the notion of constituency. Most linguists believe that human languages cannot be even weakly described by CFGs (e.g., Shieber 1985). And in formal work, context-free grammars have long been superseded by "transformational" approaches which introduce movement operations (in some camps), or by "generalized phrase structure" approaches which add more complex phrase-building operations (in other camps). However, unlike more expressive formalisms, there exist relatively efficient cubic-time parsing algorithms for CFGs. Nearly all programming languages are described by, and parsed using, a CFG, and CFG grammars of human languages are widely used as a model of syntactic structure in natural language processing and understanding tasks.

Bibliographic note: This handout covers the formal definition of context-free grammars; the Cocke-Younger-Kasami (CYK) parsing algorithm will be covered in a separate assignment. The definitions here are loosely based on chapter 11 of Jurafsky & Martin, who in turn draw from Hopcroft and Ullman 1979.

2 Definitions. 2.1 Context-free grammars. A context-free grammar G is a four-tuple ⟨N, Σ, R, S⟩ such that:
• N is a set of non-terminal symbols, corresponding to phrase markers in a syntactic tree.
• Σ is a set of terminal symbols, corresponding to words (i.e., X⁰s) in a syntactic tree.
• R is a set of production rules.
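As an editorial illustration of the four-tuple definition just quoted (not part of Gorman's handout), the sketch below spells out a tiny toy grammar as ⟨N, Σ, R, S⟩ in plain Python; the grammar fragment and all identifiers are assumptions.

```python
# A minimal sketch (not from the handout): a CFG as a four-tuple (N, Sigma, R, S),
# using a toy fragment of English phrase structure.
N = {"S", "NP", "VP", "V", "D", "N"}                 # non-terminals (phrase markers)
Sigma = {"the", "a", "dog", "cat", "chased", "saw"}  # terminals (words)
R = {                                                # production rules: (lhs, rhs-tuple)
    ("S",  ("NP", "VP")),
    ("NP", ("D", "N")),
    ("VP", ("V", "NP")),
    ("D",  ("the",)), ("D", ("a",)),
    ("N",  ("dog",)), ("N", ("cat",)),
    ("V",  ("chased",)), ("V", ("saw",)),
}
S = "S"                                              # start symbol

def expand(symbol: str):
    """Return every right-hand side that R allows for a given non-terminal."""
    return [rhs for (lhs, rhs) in R if lhs == symbol]

# One left-most derivation step from the start symbol, then from NP:
print(expand(S))      # [('NP', 'VP')]
print(expand("NP"))   # [('D', 'N')]
```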
  • Axiomatic Semantics I
Spring 2016, Program Analysis and Verification. Lecture 3: Axiomatic Semantics I. Roman Manevich, Ben-Gurion University.

Warm-up exercises:
1. Define program state.
2. Define structural semantics configurations.
3. Define the form of structural semantics transitions.

Tentative syllabus (course topic grid): program analysis basics, operational semantics, control flow graphs, equation systems, collecting semantics, chaotic iteration, Galois connections, domain constructors, widening/narrowing, program verification, axiomatic verification, abstract interpretation fundamentals, lattices, fixed-points, analysis techniques, numerical domains, alias analysis, interprocedural analysis, shape analysis, using Soot, CEGAR.

Agenda:
• Basic concepts of correctness
• Axiomatic semantics (pages 175-183): motivation, first-order logic reminder, Hoare Logic

Program correctness concepts:
• Specification = a certain relationship between initial state and final state
• Partial correctness = specifications that hold if the program terminates (the main focus of this course)
• Termination = the program always terminates, i.e., for every input state
Partial correctness + termination = total correctness. Other correctness concepts exist: liveness, resource usage, …

Verifying factorial with structural semantics. Structural semantics for While:
[ass_sos]   ⟨x := a, σ⟩ ⇒ σ[x ↦ A⟦a⟧σ]
[skip_sos]  ⟨skip, σ⟩ ⇒ σ
[comp1_sos] if ⟨S1, σ⟩ ⇒ ⟨S1′, σ′⟩ then ⟨S1; S2, σ⟩ ⇒ ⟨S1′; S2, σ′⟩
[comp2_sos] if ⟨S1, σ⟩ ⇒ σ′ then ⟨S1; S2, σ⟩ ⇒ ⟨S2, σ′⟩
[if_tt_sos] ⟨if b then S1 else S2, σ⟩ ⇒ ⟨S1, σ⟩, if B⟦b⟧σ = tt
[if_ff_sos] ⟨if b then S1 else S2, σ⟩ ⇒ ⟨S2, σ⟩, if B⟦b⟧σ = ff
while
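The excerpt breaks off just as the lecture reaches Hoare logic, so for orientation here are the standard partial-correctness proof rules in their textbook form (reconstructed from the usual presentation, e.g., Nielson and Nielson; not copied from these slides):

```latex
% Standard Hoare-logic rules for partial correctness (textbook form, not from the slides).
\begin{align*}
&\text{[ass]}   && \{\,P[a/x]\,\}\; x := a \;\{\,P\,\} \\
&\text{[skip]}  && \{\,P\,\}\; \texttt{skip} \;\{\,P\,\} \\
&\text{[comp]}  && \frac{\{P\}\, S_1 \,\{Q\} \qquad \{Q\}\, S_2 \,\{R\}}{\{P\}\, S_1; S_2 \,\{R\}} \\
&\text{[if]}    && \frac{\{P \wedge b\}\, S_1 \,\{Q\} \qquad \{P \wedge \neg b\}\, S_2 \,\{Q\}}
                        {\{P\}\, \texttt{if } b \texttt{ then } S_1 \texttt{ else } S_2 \,\{Q\}} \\
&\text{[while]} && \frac{\{P \wedge b\}\, S \,\{P\}}
                        {\{P\}\, \texttt{while } b \texttt{ do } S \,\{P \wedge \neg b\}} \\
&\text{[cons]}  && \frac{P \Rightarrow P' \qquad \{P'\}\, S \,\{Q'\} \qquad Q' \Rightarrow Q}
                        {\{P\}\, S \,\{Q\}}
\end{align*}
```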
  • Generative Linguistics and Neural Networks at 60: Foundation, Friction, and Fusion*
Generative linguistics and neural networks at 60: foundation, friction, and fusion*
Joe Pater, University of Massachusetts Amherst. October 3, 2018.

Abstract: The birthdate of both generative linguistics and neural networks can be taken as 1957, the year of the publication of foundational work by both Noam Chomsky and Frank Rosenblatt. This paper traces the development of these two approaches to cognitive science, from their largely autonomous early development in their first thirty years, through their collision in the 1980s around the past tense debate (Rumelhart and McClelland 1986, Pinker and Prince 1988), and their integration in much subsequent work up to the present. Although this integration has produced a considerable body of results, the continued general gulf between these two lines of research is likely impeding progress in both: on learning in generative linguistics, and on the representation of language in neural modeling. The paper concludes with a brief argument that generative linguistics is unlikely to fulfill its promise of accounting for language learning if it continues to maintain its distance from neural and statistical approaches to learning.

1. Introduction. At the beginning of 1957, two men nearing their 29th birthdays published work that laid the foundation for two radically different approaches to cognitive science. One of these men, Noam Chomsky, continues to contribute sixty years later to the field that he founded, generative linguistics. The book he published in 1957, Syntactic Structures, has been ranked as the most influential work in cognitive science from the 20th century. The other one, Frank Rosenblatt, had by the late 1960s largely moved on from his research on perceptrons (now called neural networks) and died tragically young in 1971.
  • Lexical and Semantics Literatures and Their Trends Mohana Dass Ramasamy & Devi Vadiveloo Department of Indian Studies University of Malaya
Lexical and Semantics Literatures and Their Trends. Mohana Dass Ramasamy & Devi Vadiveloo, Department of Indian Studies, University of Malaya.

Introduction. […] with his imagination gives birth to new word formations whenever there is a need and necessity. The process of making or coining a new term is sometimes easy and sometimes not. By adding an affix or suffix to a root word, an intended word can be formed to fill the blank of a missing message delivery. The only criterion of success is that it has to be the best term that can represent the abstract mental message or image borne in the mind […] as a special innate mental module. The fields of study known as lexical and lexical semantics have been well established in the literature, but they have been less studied in the Tamil background. This study, therefore, is aimed at giving an impressive look at the existing literatures in these fields, and possibly arguing for how they could be matched in the Tamil context.

Word Formation Literatures. Words have different connections with various academic fields in this world. However, the strong interconnection between words and linguistics has created an academic discipline of its own, known as […] today. The lexicon is viewed from a different perspective by each discipline in the academic world [Palmer (1981), Trask (1993), Crystal (1997)]. Morphology is the study of lexical structure and semantics is the study of lexical meaning. Lexical participation in every field has played a vital role in creating great success in academia today. Lexical semantics has also garnered extensive interest in the scholarly world. Since lexical semantics is defined as the study of word meaning, lexical semanticists are interested in what words mean, why they mean what they mean, how they are […] and discourse (Carita Paradis 2013: 1).
  • Axiomatic Semantics II
Spring 2016, Program Analysis and Verification. Lecture 5: Axiomatic Semantics II. Roman Manevich, Ben-Gurion University.

Tentative syllabus (course topic grid, as in Lecture 3): program analysis basics, operational semantics, control flow graphs, equation systems, collecting semantics, chaotic iteration, Galois connections, domain constructors, widening/narrowing, Hoare logic, applying Hoare logic, weakest precondition calculus, proving termination, abstract interpretation fundamentals, lattices, fixed-points, numerical domains, alias analysis, interprocedural analysis, shape analysis, data structures, using Soot, CEGAR.

Previously:
• Basic notions of correctness
• Formalizing Hoare triples
• FO logic: free variables, substitutions
• Hoare logic rules

Warm-up exercises:
1. Define program state.
2. Define state predicate.
3. Define P
4. Formalize {P} C {Q} via structural semantics.
5. FV(m. x=k+1 0mx-1 nums(m)res) = { }
6. (m. x=k+1 0mx-1 nums(m)res)[x+1/x] =

Agenda:
• Inference system
• Annotating programs with proofs
• Properties of the semantics (Chapter 6)
• Predicate transformer calculus: weakest precondition calculus, strongest postcondition calculus

Axiomatic semantics as an inference system. Inference trees:
• Trees describing rule applications that ultimately prove a program correct
• Leaves are axiom applications
• Internal nodes correspond to rule applications over triples inferred from sub-trees
• An inference tree is called simple if the tree is only an axiom, and composite otherwise

Factorial proof inference tree. Goal: { x=n } y:=1; while
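The weakest precondition calculus named in the agenda can be summarised by the standard equations below; this is the usual textbook formulation (for partial correctness, i.e., weakest liberal preconditions), given here for orientation rather than copied from the slides:

```latex
% Standard weakest (liberal) precondition equations; textbook form, not from the slides.
\begin{align*}
wlp(\texttt{skip},\, Q)  &= Q \\
wlp(x := a,\, Q)         &= Q[a/x] \\
wlp(S_1;\, S_2,\, Q)     &= wlp(S_1,\, wlp(S_2,\, Q)) \\
wlp(\texttt{if } b \texttt{ then } S_1 \texttt{ else } S_2,\, Q)
                         &= (b \Rightarrow wlp(S_1, Q)) \wedge (\neg b \Rightarrow wlp(S_2, Q))
\end{align*}
% For while-loops there is no closed form: one supplies a loop invariant $I$ and shows
% $I \wedge b \Rightarrow wlp(S, I)$ and $I \wedge \neg b \Rightarrow Q$, which yields
% $\{I\}\ \texttt{while } b \texttt{ do } S\ \{Q\}$.
```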
  • Ogden and Richards' The Meaning of Meaning and Early Analytic Philosophy
Ogden and Richards' The Meaning of Meaning and early analytic philosophy.
James McElvenny, [email protected]

This is the green open access copy of my paper in Language Sciences (2014, 41.212-221). http://dx.doi.org/10.1016/j.langsci.2013.10.001 This version silently corrects a few small errors in the published paper.

Abstract: C.K. Ogden (1889–1957) and I.A. Richards' (1893–1979) The Meaning of Meaning is widely recognised as a classic text of early twentieth-century linguistic semantics and semiotics, but less well known are its links to the 'logical atomism' of Bertrand Russell (1872–1970), one of the foundational doctrines of analytic philosophy. In this paper a detailed comparison of The Meaning of Meaning and logical atomism is made, in which several key similarities between the two theories in subject matter and approach are identified: both attempt to describe meaning in terms of the latest psychological doctrines and both pursue a normative program aimed at rectifying the perceived deficiencies of language. But there are also a number of differences between the theories. Ogden and Richards – most probably inspired by Victoria Lady Welby (1837–1912) – offered a pragmatically oriented account of ordinary language, while Russell sought a 'logically perfect language' beyond interpretation, and rejected the work of Welby and her allies. These differences contributed significantly to Russell's largely negative opinion of The Meaning of Meaning. Despite this, several ideas pioneered in The Meaning of Meaning re-appear in Russell's later writings. The Meaning of Meaning, it would seem, not only drew inspiration from Russell's philosophy but may have also contributed to its further development.