Schreiner, Sylvia L.R. (In Press). The Syntax-Semantics/Pragmatics Interface. For A. Carnie, Y. Sato, and D. Siddiqi, Eds. Routledge Handbook of Syntax

Schreiner, Sylvia L.R. (In press). The syntax-semantics/pragmatics interface. For A. Carnie, Y. Sato, and D. Siddiqi, eds. Routledge Handbook of Syntax. London: Routledge.

1. INTRODUCTION

A number of phenomena important to our understanding of the structures and meanings of natural language lie at the juncture between the two. This overview considers the major phenomena at the interface between syntax and semantics/pragmatics, as well as the major theoretical questions that have arisen around these phenomena and around the interface itself. There is only an interface to talk about between syntax and semantics inasmuch as the two are considered to be separate components (as has generally been the case in the generative tradition). We can talk about this “interface” in at least two ways: on the one hand, we can talk about the location in a model of language competence and/or performance where the syntactic and semantic modules meet and interact. On the other hand, we can talk about phenomena that seem to be driven by both syntactic and semantic mechanisms or principles. Both perspectives will be considered here.

Studies of phenomena at the interface seek to answer questions such as the following: Does each part of a syntactic structure play an equal role in determining the meaning? Which parts of the meaning have overt reflexes in the structure? Can the overall meaning be easily deduced from the summed meaning of the parts? And which kinds of meaning are instantiated with a piece of morphosyntax, and which merely have a syntactic effect (i.e., on ordering relations, co-occurrence restrictions, limitations on movement, etc.)? Several approaches to overarching versions of these questions are discussed below.

This article is structured as follows: Section 2 presents some of the major issues at the interface between syntax and semantics, with special attention paid to compositionality, theta theory, and functional heads; the final subsection is devoted to phenomena at the interface with pragmatics. Section 3 describes major models of the interface in syntactic and semantic theory, with the last subsection focusing on approaches to the interface(s) with pragmatics. Section 4 concludes and suggests avenues for future work.

2. ISSUES AT THE INTERFACE OF SYNTAX AND SEMANTICS

Here I present some of the major topics that seem to occur naturally at the syntax-semantics interface, along with examples of work in each area.

2.1 Interpretation and compositionality

The issue that underlies most if not all work at the syntax-semantics interface is how to arrive at the meaning of a structure. Many approaches have come to the same conclusion: that the structure is built first, and the meaning is then obtained from the structure in one way or another. This is the case in the Principles & Parameters framework in general (from Deep Structures, Surface Structures, or Logical Form), and in the Minimalist Program (from LF), but not, for instance, in Muskens’ (2001) non-compositional λ-grammar account (in which semantics is strictly parallel to syntax, rather than secondary to it in any way). In Lexical Functional Grammar, as well, syntactic and semantic levels of representation exist in parallel with mappings between them; meaning is read from the semantic representation.
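For orientation, the derivational picture assumed in the Principles & Parameters line of work just mentioned can be sketched as follows. This is a schematic summary in the style of standard Government and Binding presentations, not a diagram taken from the chapter:

\[
\text{D-Structure} \longrightarrow \text{S-Structure} \longrightarrow \{\text{PF},\ \text{LF}\},
\qquad \llbracket\,\cdot\,\rrbracket \text{ applies at LF}
\]

On this view the semantic module sees only the designated level (LF) that the syntax hands over; the parallel architectures mentioned above (Muskens’ λ-grammars, LFG) reject precisely this hand-over step.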
In mainstream generative syntax, the basic picture of the grammar has been one in which the syntax is responsible for building structures, and the semantics is responsible for assigning interpretations to those structures. In early views (the “Standard Theory”), syntactic Deep Structures were the input to the semantics. (In Generative Semantics, on the other hand, the interpretations were actually generated there.) In the “Extended Standard Theory”, semantic interpretation occurred at two points—once at Deep Structure, and once at Surface Structure (this was in response to issues with the interpretation of scope, as discussed below). This was followed by the move to an LF-input view. In much current Minimalist thinking, chunks of structure are interpreted piece-by-piece, e.g., at phase edges.

Compositionality is the concept of assembling the meaning of a larger constituent from the meaning of its component parts via some principles of combination. A number of pragmatic or discourse-level (context-dependent) phenomena present problems even for non-strict interpretations of compositionality; it is difficult, for instance, to see how conversational implicatures or the meaning lent by sarcasm could be computed by the same mechanism that determines the interpretation of verb phrases.

At the syntax-semantics interface, there are several levels at which compositionality might be expected to hold: with sentence-level modification like negation; at clause level, from the composition of the external argument with the verb phrase; within the VP, to account for the composition of the verb with its internal argument; and within (the equivalent of) determiner phrases, adjective phrases, adverb phrases, etc. Depending on one’s theory of morphology, the syntax may also be responsible for producing the input to the lexical(-level) semantics—see e.g. Distributed Morphology (Halle & Marantz 1993, Harley 1999, etc.) for a view of morphology where word-building is done in the syntax.

In formal semantics, Frege’s concept of semantic composition as the “saturation” of functions (i.e., as functional application) has remained in the fore, with Heim & Kratzer’s (1998) work being an important contribution. The concept of composition as functional application has been used in both extensional and intensional semantics. It is based on the idea that the meanings of words (and larger constituents) need to be “completed” with something else. (For example, the meaning of a transitive verb is incomplete without its direct object.) Sentence meanings are arrived at by a series of applications of functions to their arguments (which the functions need in order to be “saturated”). At the sentence level, the output is no longer a function but whatever the theory holds to be the meaning of a sentence—in extensional truth-conditional semantics, a truth value.

Early formalisms based on Montague Grammar (Montague 1974) worked from the perspective that each phrase level’s syntactic rule had a separate mechanism for semantic interpretation. Klein and Sag (1985) proposed that each constituent needing an interpretation was of a certain basic type; in their theory it was these types that had rules for interpretation rather than the syntactic rules themselves. Klein & Sag used the types of individuals, truth values, and situations to form their higher types; work in event semantics (following Davidson 1967) has also proposed a type for events.
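To make the function-application picture concrete, here is a minimal worked derivation in the style of Heim & Kratzer (1998) for the made-up sentence Anna greets Ben, using the basic types e (individuals) and t (truth values). The lexical entries are illustrative assumptions, not examples drawn from the chapter:

\[
\begin{aligned}
\llbracket \textit{Ben} \rrbracket &= \mathrm{Ben} &&: e\\
\llbracket \textit{greets} \rrbracket &= \lambda x.\,\lambda y.\,\mathrm{greet}(y,x) &&: \langle e,\langle e,t\rangle\rangle\\
\llbracket \textit{greets Ben} \rrbracket &= \llbracket \textit{greets} \rrbracket\bigl(\llbracket \textit{Ben} \rrbracket\bigr) = \lambda y.\,\mathrm{greet}(y,\mathrm{Ben}) &&: \langle e,t\rangle\\
\llbracket \textit{Anna greets Ben} \rrbracket &= \llbracket \textit{greets Ben} \rrbracket\bigl(\llbracket \textit{Anna} \rrbracket\bigr) = \mathrm{greet}(\mathrm{Anna},\mathrm{Ben}) &&: t
\end{aligned}
\]

Each application saturates one argument of the verb: the VP denotes a function of type ⟨e,t⟩ that still needs the external argument, and the final step returns something of type t, a truth value, in line with the extensional truth-conditional view described above.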
Other rules of composition have been introduced, such as predicate modification (e.g., Heim & Kratzer 1998). This allows the meaning of intersective adjective phrases to be computed: it essentially lets us say that the meaning of brown house is the same as the meaning of brown plus the meaning of house. Non-intersective adjectives present some trouble for predicate modification.

In addition to the mechanics of compositionality, theories of semantic interpretation differ in terms of how homomorphic they assert the syntax and the semantics to be—that is, how much of the interpretation is allowed outside the confines of the compositional meaning. Sentence meaning in strictly compositional theories (e.g., Montague’s 1970 approach) is derived only from the meaning of the syntactic parts and the way they are combined; in non-strictly compositional theories there are also rules that operate on the semantics itself, without a syntactic rule involved (as in some of Partee’s work, e.g. Partee & Rooth 1983).

2.2 Theta Theory

The interaction between theta roles (e.g., external argument) and their associated thematic roles (e.g., agent) sits naturally at the syntax-semantics interface. The aim is to discover the connections between syntactic arguments and the semantic part(s) they play in sentences.

The Theta Criterion has been the focus of much work in government and binding theory and its successors. The original formulation (Chomsky 1981) said that each theta role must be realized by one argument and each argument must be assigned one theta role. This was recast in Chomsky (1986) in terms of chains. An argument (e.g., a subject noun phrase in a passive) undergoes movement, and the coindexed positions it occupies before and after this movement make up a chain; the chain itself gets the theta role. The formal representation of theta roles also underwent changes—the early “theta grid” only represented the theta roles themselves with an indication of their status as internal or external, while later conceptions (e.g. as laid out in Haegeman’s 1991 textbook) also include argument structure information.

Thematic roles and relations themselves have also received much attention. Work on “thematic hierarchies” (e.g. Larson 1988; Grimshaw 1990) attempts to explain the assignment of thematic role participants to their positions in the syntax. Dowty’s work (beginning with 1991) on proto-roles (Proto-Agent and Proto-Patient) was a reaction to the difficulty researchers were having in finding cross-linguistically reliable role categories. A notably different approach to defining roles is seen in Jackendoff’s work on thematic relations. Jackendoff (e.g., 1983) approaches the task from the semantics (or, rather, “conceptual structure”) side only—thematic relations are defined by conceptual structure primitives in different configurations.

A number of researchers have also concerned themselves with the relation between thematic roles, argument structure/selection, and event structure. Krifka (1989) and Verkuyl (e.g. 1989) both propose aspectually specifying features to better account for particular thematic relationships (see Ramchand 1993 for an implementation). More recently, Ramchand (in her 2008 monograph) lays out primitives for decomposing verb meaning. She argues that in order to discover and understand thematic roles, we must first have the correct features that make up events, “since participants in the event will only be definable via the role they play in the event or subevent” (p.
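For concreteness, a theta grid of the later, argument-structure-enriched kind might look roughly as follows for the ditransitive verb give. This is a generic textbook-style illustration under common assumptions (underlining marks the external argument), not an analysis taken from the works cited:

\[
\textit{give}:\;
\begin{array}{|c|c|c|}
\hline
\underline{\text{Agent}} & \text{Theme} & \text{Goal}\\
\hline
\text{DP}_i & \text{DP}_j & \text{PP}_k\\
\hline
\end{array}
\qquad
[\,\text{Alice}_i\ \text{gave}\ [\text{the book}]_j\ [\text{to Ben}]_k\,]
\]

Each role in the grid is discharged by exactly one indexed argument and each argument bears exactly one role, satisfying the Theta Criterion in its original formulation; under the chain-based recasting, the index would instead be carried by the chain of positions that a moved argument occupies.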