The Computational Lexical Semantics of Syntagmatic Expressions

Total Pages: 16

File Type: pdf, Size: 1020 KB

The Computational Lexical Semantics of Syntagmatic Relations
Evelyne Viegas, Stephen Beale and Sergei Nirenburg
New Mexico State University, Computing Research Lab, Las Cruces, NM 88003, USA
viegas, sb, sergei@crl.nmsu.edu

Abstract

In this paper, we address the issue of syntagmatic expressions from a computational lexical semantic perspective. From a representational viewpoint, we argue for a hybrid approach combining linguistic and conceptual paradigms, in order to account for the continuum we find in natural languages from free combining words to frozen expressions. In particular, we focus on the place of lexical and semantic restricted co-occurrences. From a processing viewpoint, we show how to generate/analyze syntagmatic expressions by using an efficient constraint-based processor, well fitted for a knowledge-driven approach.

1 Introduction

You can take advantage of the chambermaid¹ is not a collocation one would like to generate in the context of a hotel to mean "use the services of." This is why collocations should constitute an important part in the design of Machine Translation or Multilingual Generation systems.

In this paper, we address the issue of syntagmatic expressions from a computational lexical semantic perspective. From a representational viewpoint, we argue for a hybrid approach combining linguistic and conceptual paradigms, in order to account for the continuum we find in natural languages from free combining words to frozen expressions (such as in idioms kick the (proverbial) bucket). In particular, we focus on the representation of restricted semantic and lexical co-occurrences, such as heavy smoker and professor ... students respectively, that we define later. From a processing viewpoint, we show how to generate/analyze syntagmatic expressions by using an efficient constraint-based processor, well fitted for a knowledge-driven approach. In the following, we first compare different approaches to collocations. Second, we present our approach in terms of representation and processing. Finally, we show how to facilitate the acquisition of co-occurrences by using 1) the formalism of lexical rules (LRs), 2) an inheritance hierarchy of Lexical Semantic Functions (LSFs).

2 Approaches to Syntagmatic Relations

Syntagmatic relations, also known as collocations, are used differently by lexicographers, linguists and statisticians, denoting almost similar but not identical classes of expressions.

The traditional approach to collocations has been lexicographic. Here dictionaries provide information about what is unpredictable or idiosyncratic. Benson (1989) synthesizes Hausmann's studies on collocations, calling expressions such as commit murder, compile a dictionary, inflict a wound, etc. "fixed combinations, recurrent combinations" or "collocations". In Hausmann's terms (1979) a collocation is composed of two elements, a base ("Basis") and a collocate ("Kollokator"); the base is semantically autonomous whereas the collocate cannot be semantically interpreted in isolation. In other words, the set of lexical collocates which can combine with a given basis is not predictable and therefore collocations must be listed in dictionaries.

It is hard to say that there has been a real focus on collocations from a linguistic perspective. The lexicon has been broadly sacrificed by both English-speaking schools and continental European schools. The scientific agenda of the former has been largely dominated by syntactic issues until recently, whereas the latter was more concerned with pragmatic aspects of natural languages. The focus has been on grammatical collocations such as adapt to, aim at, look for. Lakoff (1970) distinguishes a class of expressions which cannot undergo certain operations, such as nominalization, causativization: the problem is hard; *the hardness of the problem; *the problem hardened. The restriction on the application of certain syntactic operations can help define collocations such as hard problem, for example. Mel'čuk's treatment of collocations will be detailed below.

In recent years, there has been a resurgence of statistical approaches applied to the study of natural languages. Sinclair (1991) states that "a word which occurs in close proximity to a word under investigation is called a collocate of it. ... Collocation is the occurrence of two or more words within a short space of each other in a text". The problem is that with such a definition of collocations, even when improved,² one identifies not only collocations but free-combining pairs frequently appearing together such as lawyer-client; doctor-hospital. However, nowadays, researchers seem to agree that combining statistical with symbolic approaches leads to quantifiable improvements (Klavans and Resnik, 1996).

The Meaning Text Theory Approach. The Meaning Text Theory (MTT) is a generator-oriented lexical grammatical formalism. Lexical knowledge is encoded in an entry of the Explanatory Combinatorial Dictionary (ECD), each entry being divided into three zones: the semantic zone (a semantic network representing the meaning of the entry in terms of more primitive words), the syntactic zone (the grammatical properties of the entry) and the lexical combinatorics zone (containing the values of the Lexical Functions (LFs)³). LFs are central to the study of collocations:

  A lexical function F is a correspondence which associates a lexical item L, called the key word of F, with a set of lexical items F(L), the value of F. (Mel'čuk, 1988)⁴

We focus here on syntagmatic LFs describing co-occurrence relations such as pay attention, legitimate complaint; from a distance.⁵

Heylen et al. (1993) have worked out some cases which help license a starting point for assigning LFs. They distinguish four types of syntagmatic LFs:

• evaluative qualifier: Magn(bleed) = profusely
• distributional qualifier: Mult(sheep) = flock
• co-occurrence: Loc-in(distance) = at a distance
• verbal operator: Oper1(attention) = pay

The MTT approach is very interesting as it provides a model of production well suited for generation with its different strata and also a lot of lexical-semantic information. It seems nevertheless that all the collocational information is listed in a static way. We believe that one of the main drawbacks of the approach is the lack of any predictable calculi on the possible expressions which can collocate with each other semantically.

3 The Computational Lexical Semantic Approach

In order to account for the continuum we find in natural languages, we argue for a continuum perspective, spanning the range from free-combining words to idioms, with semantic collocations and idiosyncrasies in between, as defined in (Viegas and Bouillon, 1994):

• free-combining words (the girl ate candies)
• semantic collocations (fast car; long book)⁶
• idiosyncrasies (large coke; green jealousy)
• idioms (to kick the (proverbial) bucket)

Formally, we go from a purely compositional approach in "free-combining words" to a non-compositional approach in idioms. In between, a (semi-)compositional approach is still possible. (Viegas and Bouillon, 1994) showed that we can reduce the set of what are conventionally considered as idiosyncrasies by differentiating "true" idiosyncrasies (difficult to derive or calculate) from expressions which have well-defined calculi, being compositional in nature, and that have been called semantic collocations. In this paper, we further distinguish their idiosyncrasies into:

• restricted semantic co-occurrence, where the meaning of the co-occurrence is semi-compositional between the base and the collocate (strong coffee, pay attention, heavy smoker, ...)
• restricted lexical co-occurrence, where the meaning of the collocate is compositional but has a lexical idiosyncratic behavior (lecture ... student; rancid butter; sour milk).

We provide below examples of restricted semantic co-occurrences in (1), and restricted lexical co-occurrences in (2).

Restricted semantic co-occurrence. The semantics of the combination of the entries is semi-compositional. In other words, there is an entry in the lexicon for the base (the semantic collocate is encoded inside the base), whereas we cannot directly refer to the sense of the semantic collocate in the lexicon, as it is not part of its senses. We assign the co-occurrence a new semi-compositional sense, where the sense of the base is composed with a new sense for the collocate.

(1a) #0=[key: "smoker",
         rel: [syntagmatic: LSFIntensity
                [base: #0,
                 collocate: [key: "heavy",
                             gram: [subCat: Attributive,
                                    freq: [value: 8]]]]] ...]

(1b) #0=[key: "attention",
         rel: [syntagmatic: LSFOper
                [base: #0,
                 collocate: [key: "pay",
                             gram: [subCat: SupportVerb,
                                    freq: [value: 5]]]]] ...]

In examples (1), the LSFs (LSFIntensity, LSFOper, ...) are equivalent (and some identical) to the LFs provided in the ECD.

Restricted lexical co-occurrence. The semantics of the combination of the entries is compositional. In other words, there are entries in the lexicon for the base and the collocate, with the same senses as in the co-occurrence. Therefore, we can directly refer to the senses of the co-occurring words. What we are capturing here is a lexical idiosyncrasy, or in other words, we specify that we should prefer this particular combination of words. This is useful for analysis, where it can help disambiguate a sense, and is most relevant for generation; it can be viewed as a preference among the paradigmatic family of the co-occurrence.

(2a) #0=[key: "truth",
         rel: [syntagmatic: LSFSyn
                [base: #0,
                 collocate: [key: "plain", sense: adj2,
                             lr: [comp: no, superl: no]]]] ...]

(2b) #0=[key: "pupil",

Footnotes:
¹ Lederer, R. 1990. Anguished English. A Laurel Book, Dell Publishing.
² Church and Hanks (1989) and Smadja (1993) use statistics in their algorithms to extract collocations from texts.
³ See (Iordanskaja et al., 1991) and (Ramos et al., 1994) for their use of LFs in MTT and NLG respectively.
⁴ (Heid, 1989) contrasts Hausmann's base and collocate with Mel'čuk's keyword and LF values.
⁵ There are about 60 LFs listed, said to be universal; the lexicographic approach of Mel'čuk and Zolkovsky has been applied, among other languages, to Russian, French, German and English.
⁶ See (Pustejovsky, 1995) for his account of such expressions using a coercion operator.
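The entry structures in (1) and (2) suggest a straightforward typed encoding of a base together with its restricted collocates. The sketch below is only an illustration of that idea: the class and field names (LexEntry, Collocate, lsf, freq) are hypothetical and do not reproduce the authors' lexicon formalism or their constraint-based processor.

```python
# Minimal sketch of base/collocate entries carrying Lexical Semantic Functions (LSFs).
# Illustrative only: class and field names are hypothetical, not the paper's system.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Collocate:
    key: str                       # the collocating word, e.g. "heavy"
    lsf: str                       # Lexical Semantic Function, e.g. "LSFIntensity"
    sub_cat: Optional[str] = None  # e.g. "Attributive", "SupportVerb"
    freq: Optional[int] = None     # preference weight, as in freq: [value: 8]
    sense: Optional[str] = None    # used for restricted lexical co-occurrence

@dataclass
class LexEntry:
    key: str
    syntagmatic: List[Collocate] = field(default_factory=list)

    def collocates_for(self, lsf: str) -> List[str]:
        """Return the lexical values of a given LSF for this base."""
        return [c.key for c in self.syntagmatic if c.lsf == lsf]

# Entry (1a): restricted semantic co-occurrence "heavy smoker"
smoker = LexEntry("smoker", [Collocate("heavy", "LSFIntensity",
                                       sub_cat="Attributive", freq=8)])
# Entry (2a): restricted lexical co-occurrence "plain truth"
truth = LexEntry("truth", [Collocate("plain", "LSFSyn", sense="adj2")])

print(smoker.collocates_for("LSFIntensity"))  # ['heavy']
```

With a record of this kind, generation can prefer the listed collocate for a base (heavy for smoker), and analysis can use the same information to help disambiguate a sense, as the paper notes for restricted lexical co-occurrences.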
Recommended publications
  • The Generative Lexicon
    The Generative Lexicon
    James Pustejovsky, Computer Science Department, Brandeis University
    In this paper, I will discuss four major topics relating to current research in lexical semantics: methodology, descriptive coverage, adequacy of the representation, and the computational usefulness of representations. In addressing these issues, I will discuss what I think are some of the central problems facing the lexical semantics community, and suggest ways of best approaching these issues. Then, I will provide a method for the decomposition of lexical categories and outline a theory of lexical semantics embodying a notion of cocompositionality and type coercion, as well as several levels of semantic description, where the semantic load is spread more evenly throughout the lexicon. I argue that lexical decomposition is possible if it is performed generatively. Rather than assuming a fixed set of primitives, I will assume a fixed number of generative devices that can be seen as constructing semantic expressions. I develop a theory of Qualia Structure, a representation language for lexical items, which renders much lexical ambiguity in the lexicon unnecessary, while still explaining the systematic polysemy that words carry. Finally, I discuss how individual lexical structures can be integrated into the larger lexical knowledge base through a theory of lexical inheritance. This provides us with the necessary principles of global organization for the lexicon, enabling us to fully integrate our natural language lexicon into a conceptual whole.
    1. Introduction. I believe we have reached an interesting turning point in research, where linguistic studies can be informed by computational tools for lexicology as well as an appreciation of the computational complexity of large lexical databases.
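As a concrete gloss on the Qualia Structure mentioned in the abstract, the following toy sketch (my own illustration, not Pustejovsky's formalism) encodes the four qualia roles as a small data structure and hints at how type coercion draws on them.

```python
# Toy sketch of a qualia structure: the four roles discussed in the Generative
# Lexicon literature (formal, constitutive, telic, agentive). Field names and
# the example entry are illustrative assumptions, not the actual formalism.
from dataclasses import dataclass

@dataclass
class Qualia:
    formal: str        # what kind of thing it is
    constitutive: str  # what it is made of / its parts
    telic: str         # its purpose or function
    agentive: str      # how it comes into being

book = Qualia(formal="physical_object, information",
              constitutive="pages, text",
              telic="read",
              agentive="write")

# Type-coercion intuition: "begin a book" retrieves an event from the telic or
# agentive role (begin reading / begin writing) rather than the object itself.
print(f"begin a book -> begin to {book.telic} / begin to {book.agentive}")
```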
  • Acquaintance Inferences As Evidential Effects
    The acquaintance inference as an evidential effect
    Abstract. Predications containing a special restricted class of predicates, like English tasty, tend to trigger an inference when asserted, to the effect that the speaker has had a specific kind of 'direct contact' with the subject of predication. This 'acquaintance inference' has typically been treated as a hard-coded default effect, derived from the nature of the predicate together with the commitments incurred by assertion. This paper reevaluates the nature of this inference by examining its behavior in 'Standard' Tibetan, a language that grammatically encodes perceptual evidentiality. In Tibetan, the acquaintance inference triggers not as a default, but rather when, and only when, marked by a perceptual evidential. The acquaintance inference is thus a grammaticized evidential effect in Tibetan, and so it cannot be a default effect in general cross-linguistically. An account is provided of how the semantics of the predicate and the commitment to perceptual evidentiality derive the inference in Tibetan, and it is suggested that the inference ought to be seen as an evidential effect generally, even in evidential-less languages, which invoke evidential notions without grammaticizing them.
    1 Introduction: the acquaintance inference. A certain restricted class of predicates, like English tasty, exhibit a special sort of behavior when used in predicative assertions. In particular, they require as a robust default that the speaker of the assertion has had direct contact of a specific sort with the subject of predication, as in (1).
    (1) This food is tasty.
        → The speaker has tasted the food.
        → The speaker liked the food's taste.
  • Compositional and Lexical Semantics • Compositional Semantics: The
    Compositional and lexical semantics
    • Compositional semantics: the construction of meaning (generally expressed as logic) based on syntax. This lecture:
      – Semantics with FS grammars
    • Lexical semantics: the meaning of individual words. This lecture:
      – lexical semantic relations and WordNet
      – one technique for word sense disambiguation
    Simple compositional semantics in feature structures
    • Semantics is built up along with syntax
    • Subcategorization 'slot' filling instantiates syntax
    • Formally equivalent to logical representations (below: predicate calculus with no quantifiers)
    • Alternative FS encodings possible
    Objective: obtain the following semantics for they like fish: pron(x) ∧ (like_v(x, y) ∧ fish_n(y))
    Feature structure encoding: [PRED and, ARG1 [PRED pron, ARG1 [1]], ARG2 [PRED and, ARG1 [PRED like_v, ARG1 [1], ARG2 [2]], ARG2 [PRED fish_n, ARG1 [2]]]]
    Noun entry for fish: [HEAD [CAT noun, AGR []], COMP filled, SPR filled, SEM [INDEX [1], PRED fish_n, ARG1 [1]]]. Corresponds to fish(x), where the INDEX points to the characteristic variable of the noun (that is, x).
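The objective above (building pron(x) ∧ (like_v(x, y) ∧ fish_n(y)) alongside the syntax) can be mimicked with a tiny nested-record encoding. The sketch below is an illustrative stand-in using plain Python dicts with PRED/ARG slots and shared variable names; it is not the lecture's feature-structure grammar.

```python
# Minimal sketch of the target semantics for "they like fish" as a nested
# PRED/ARG structure, flattened back into a predicate-calculus style string.
# Illustrative stand-in only, not a real feature-structure implementation.

def pred(name, **args):
    return {"PRED": name, **args}

# Shared indices: x is the variable of "they", y the variable of "fish".
x, y = "x", "y"

sem = pred("and",
           ARG1=pred("pron", ARG1=x),
           ARG2=pred("and",
                     ARG1=pred("like_v", ARG1=x, ARG2=y),
                     ARG2=pred("fish_n", ARG1=y)))

def to_logic(fs):
    """Flatten the nested structure into a logical-form string."""
    if fs["PRED"] == "and":
        return f'({to_logic(fs["ARG1"])} & {to_logic(fs["ARG2"])})'
    args = ", ".join(v for k, v in fs.items() if k.startswith("ARG"))
    return f'{fs["PRED"]}({args})'

print(to_logic(sem))  # (pron(x) & (like_v(x, y) & fish_n(y)))
```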
  • Deriving Programming Theories from Equations by Functional Predicate Calculus
    Calculational Semantics: Deriving Programming Theories from Equations by Functional Predicate Calculus
    RAYMOND T. BOUTE, INTEC, Ghent University
    The objects of programming semantics, namely, programs and languages, are inherently formal, but the derivation of semantic theories is all too often informal, deprived of the benefits of formal calculation "guided by the shape of the formulas." Therefore, the main goal of this article is to provide for the study of semantics an approach with the same convenience and power of discovery that calculus has given for many years to applied mathematics, physics, and engineering. The approach uses functional predicate calculus and concrete generic functionals; in fact, a small part suffices. Application to a semantic theory proceeds by describing program behavior in the simplest possible way, namely by program equations, and discovering the axioms of the theory as theorems by calculation. This is shown in outline for a few theories, and in detail for axiomatic semantics, fulfilling a second goal of this article. Indeed, a chafing problem with classical axiomatic semantics is that some axioms are unintuitive at first, and that justifications via denotational semantics are too elaborate to be satisfactory. Derivation provides more transparency. Calculation of formulas for ante- and postconditions is shown in general, and for the major language constructs in particular. A basic problem reported in the literature, whereby relations are inadequate for handling nondeterminacy and termination, is solved here through appropriately defined program equations. Several variants and an example in mathematical analysis are also presented. One conclusion is that formal calculation with quantifiers is one of the most important elements for unifying continuous and discrete mathematics in general, and traditional engineering with computing science, in particular.
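To make the "axioms as theorems" idea concrete, here is a standard worked example recast in this style (an illustration in my own notation, not Boute's own calculation): the assignment statement is described by a program equation on states, and the familiar axiomatic-semantics rule for assignment then follows by substitution.

```latex
% Standard illustration (not Boute's derivation): take the program equation for
% assignment as the definition of its behavior on states, and obtain the
% familiar axiomatic-semantics rule as a theorem. Assumes amsmath; s is a state.
\[
  \mathsf{run}\,[x := e]\;s \;=\; s[x \mapsto e\,s]
  \qquad \text{(program equation for assignment)}
\]
\[
  \{P[e/x]\}\; x := e \;\{P\}
  \quad\text{because}\quad
  P[e/x]\,s \;\equiv\; P\,(s[x \mapsto e\,s]) \;=\; P\,(\mathsf{run}\,[x := e]\;s).
\]
```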
  • Sentential Negation and Negative Concord
    Sentential Negation and Negative Concord
    Published by LOT, Trans 10, 3512 JK Utrecht, The Netherlands. Phone: +31.30.2536006, fax: +31.30.2536000, email: [email protected], http://wwwlot.let.uu.nl/
    Cover illustration: Kasimir Malevitch, Black Square. State Hermitage Museum, St. Petersburg, Russia.
    ISBN 90-76864-68-3, NUR 632. Copyright © 2004 by Hedde Zeijlstra. All rights reserved.
    Sentential Negation and Negative Concord. Academic dissertation submitted for the degree of doctor at the University of Amsterdam, under the authority of the Rector Magnificus Prof. Mr P.F. van der Heijden, to be defended in public before a committee appointed by the Doctorate Board, in the Aula of the University on Wednesday 15 December 2004 at 10:00, by Hedzer Hugo Zeijlstra, born in Rotterdam.
    Doctoral committee: Supervisors: Prof. Dr H.J. Bennis, Prof. Dr J.A.G. Groenendijk. Co-supervisor: Dr J.B. den Besten. Members: Dr L.C.J. Barbiers (Meertens Instituut, Amsterdam), Dr P.J.E. Dekker, Prof. Dr A.C.J. Hulk, Prof. Dr A. von Stechow (Eberhard Karls Universität Tübingen), Prof. Dr F.P. Weerman. Faculty of Humanities.
    For Petra
    Table of Contents: Acknowledgements; 1 Introduction; 1.1 Four Issues in the Study of Negation
  • Chapter 1 Negation in a Cross-Linguistic Perspective
    Chapter 1: Negation in a cross-linguistic perspective
    0. Chapter summary. This chapter introduces the empirical scope of our study on the expression and interpretation of negation in natural language. We start with some background notions on negation in logic and language, and continue with a discussion of more linguistic issues concerning negation at the syntax-semantics interface. We zoom in on cross-linguistic variation, both in a synchronic perspective (typology) and in a diachronic perspective (language change). Besides expressions of propositional negation, this book analyzes the form and interpretation of indefinites in the scope of negation. This raises the issue of negative polarity and its relation to negative concord. We present the main facts, criteria, and proposals developed in the literature on this topic. The chapter closes with an overview of the book. We use Optimality Theory to account for the syntax and semantics of negation in a cross-linguistic perspective. This theoretical framework is introduced in Chapter 2.
    1 Negation in logic and language. The main aim of this book is to provide an account of the patterns of negation we find in natural language. The expression and interpretation of negation in natural language has long fascinated philosophers, logicians, and linguists. Horn's (1989) Natural history of negation opens with the following statement: "All human systems of communication contain a representation of negation. No animal communication system includes negative utterances, and consequently, none possesses a means for assigning truth value, for lying, for irony, or for coping with false or contradictory statements." A bit further on the first page, Horn states: "Despite the simplicity of the one-place connective of propositional logic (¬p is true if and only if p is not true) and of the laws of inference in which it participates (e.g.
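For concreteness, the laws of inference in which propositional negation participates include standard patterns such as the following (textbook examples added for illustration, not drawn from the chapter itself).

```latex
% Standard propositional laws involving negation; illustrative textbook
% examples only, not taken from the chapter. Assumes amsmath.
\[
  \neg\neg p \leftrightarrow p \quad\text{(double negation)}
  \qquad
  \neg(p \wedge q) \leftrightarrow (\neg p \vee \neg q) \quad\text{(De Morgan)}
\]
\[
  \frac{p \rightarrow q \qquad \neg q}{\neg p} \quad\text{(modus tollens)}
\]
```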
  • Synchronous Dependency Insertion Grammars a Grammar Formalism for Syntax Based Statistical MT
    Synchronous Dependency Insertion Grammars: A Grammar Formalism for Syntax Based Statistical MT
    Yuan Ding and Martha Palmer, Department of Computer and Information Science, University of Pennsylvania, Philadelphia, PA 19104, USA. {yding, mpalmer}@linc.cis.upenn.edu
    Abstract. This paper introduces a grammar formalism specifically designed for syntax-based statistical machine translation. The synchronous grammar formalism we propose in this paper takes into consideration the pervasive structure divergence between languages, which many other synchronous grammars are unable to model. A Dependency Insertion Grammar (DIG) is a generative grammar formalism that captures word order phenomena within the dependency representation. Synchronous Dependency Insertion Grammars (SDIG) is the synchronous version of DIG which aims at capturing structural divergences across the languages. While both DIG and SDIG have
    In the early 1990s, (Brown et al. 1993) introduced the idea of statistical machine translation, where the word to word translation probabilities and sentence reordering probabilities are estimated from a large set of parallel sentence pairs. By having the advantage of leveraging large parallel corpora, the statistical MT approach outperforms the traditional transfer based approaches in tasks for which adequate parallel corpora is available (Och, 2003). However, a major criticism of this approach is that it is void of any internal representation for syntax or semantics. In recent years, hybrid approaches, which aim at applying statistical learning to structured data, began to emerge. Syntax based statistical MT approaches began with (Wu 1997), who introduced a polynomial-time solution for the alignment problem based on synchronous binary trees.
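As a rough illustration of the insertion idea behind a dependency-based formalism like DIG (a simplified sketch under my own assumptions, not the formalism's actual definition), an elementary tree anchored by a word can be composed by inserting other elementary trees as its dependents.

```python
# Rough sketch of dependency-insertion composition: each elementary tree is a
# word with (possibly empty) dependents, and derivation proceeds by inserting
# one elementary tree under a node of another. Class and method names are
# illustrative assumptions, not the DIG/SDIG formalism itself.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DepTree:
    word: str
    dependents: List["DepTree"] = field(default_factory=list)

    def insert(self, child: "DepTree") -> "DepTree":
        """Attach another elementary tree as a dependent of this node."""
        self.dependents.append(child)
        return self

    def linearize(self) -> str:
        body = " ".join(d.linearize() for d in self.dependents)
        return f"({self.word} {body})".replace(" )", ")")

# A verb-anchored tree receives its argument trees by insertion.
ate = DepTree("ate").insert(DepTree("girl", [DepTree("the")])).insert(DepTree("candies"))
print(ate.linearize())  # (ate (girl (the)) (candies))
```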
  • Functional Unification Grammar: a Formalism for Machine Translation
    FUNCTIONAL UNIFICATION GRAMMAR: A FORMALISM FOR MACHINE TRANSLATION
    Martin Kay, Xerox Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, California 94304, and CSLI, Stanford
    Abstract. Functional Unification Grammar provides an opportunity to encompass within one formalism and computational system the parts of machine translation systems that have usually been treated separately, notably analysis, transfer, and synthesis. Many of the advantages of this formalism come from the fact that it is monotonic, allowing data structures to grow differently as different nondeterministic alternatives in a computation are pursued, but never to be modified in any way. A striking feature of this system is that it is fundamentally reversible, allowing a to translate as b only if b could translate as a.
    I Overview. A. Machine Translation.
    ... language (morphological, syntactic, semantic, or whatever) could be stated. A formalism powerful enough to accommodate the various different kinds of linguistic phenomena with equal facility might be unappealing to theoretical linguists because powerful formal systems do not make powerful claims. But the engineering advantages are clear to see. A single formalism would straightforwardly reduce the number of interpreters to two, one for analysis and one for synthesis. Furthermore, the explanatory value of a theory clearly rests on a great deal more than the restrictiveness of its formal base. In particular, the possibility of encompassing what had hitherto been thought to require altogether different kinds of treatment within a single framework could be theoretically interesting. Another clear improvement on the classical design would result from merging the two interpreters associated with a formalism.
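The monotonicity highlighted in the abstract is the hallmark of unification: combining two descriptions can only add information or fail, never overwrite it. The following minimal sketch is my own simplification for illustration, not Functional Unification Grammar itself.

```python
# Minimal sketch of monotonic feature-structure unification: merging two
# descriptions either adds information or fails on a clash.
# Illustrative simplification only, not Kay's formalism.

def unify(a: dict, b: dict):
    """Return the merge of two feature structures, or None on a clash."""
    result = dict(a)
    for feat, val in b.items():
        if feat not in result:
            result[feat] = val
        elif isinstance(result[feat], dict) and isinstance(val, dict):
            sub = unify(result[feat], val)
            if sub is None:
                return None
            result[feat] = sub
        elif result[feat] != val:
            return None  # conflicting atomic values: unification fails
    return result

np = {"cat": "NP", "agr": {"num": "sg"}}
subj_slot = {"cat": "NP", "agr": {"num": "sg", "per": "3"}}
print(unify(np, subj_slot))     # {'cat': 'NP', 'agr': {'num': 'sg', 'per': '3'}}
print(unify(np, {"cat": "VP"})) # None
```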
  • The Logic of Argument Structure
    San Diego Linguistic Papers 3 (2008) 32-125
    Circumstances and Perspective: The Logic of Argument Structure
    Jean Mark Gawron, San Diego State University
    1 Introduction
    The fox knows many things but the hedgehog knows one big thing. (Archilochus, cited by Isaiah Berlin; Berlin 1997, "The Hedgehog and the Fox")
    The last couple of decades have seen substantial progress in the study of lexical semantics, particularly in contributing to the understanding of how lexical semantic properties interact with syntactic properties, but many open questions await resolution before a consensus of what a theory of lexical semantics looks like is achieved. Two properties of word meanings contribute to the difficulty of the problem. One is the openness of word meanings. The variety of word meanings is the variety of human experience. Consider defining words such as ricochet, barber, alimony, seminal, amputate, and brittle. One needs to make reference to diverse practices, processes, and objects in the social and physical world: the impingement of one object against another, grooming and hair, marriage and divorce, discourse about concepts and theories, and events of breaking. Before this seemingly endless diversity, semanticists have in the past stopped short, excluding it from the semantic enterprise, and attempting to draw a line between a small linguistically significant set of concepts and the openness of the lexicon. The other problem is the closely related problem of the richness of word meanings. Words are hard to define, not so much because they invoke fine content specific distinctions, but because they invoke vast amounts of background information. The concept of buying presupposes the complex social fact of a commercial event.
  • Grounding Lexical Meaning in Core Cognition
    Grounding Lexical Meaning in Core Cognition
    Noah D. Goodman, December 2012 (Updated June 2013)
    Abstract. Author's note: This document is a slightly updated and reformatted extract from a grant proposal to the ONR. As a proposal, it aims to describe useful directions while reviewing existing and pilot work; it has no pretensions to being a systematic, rigorous, or entirely coherent scholarly work. On the other hand, I've found that it provides a useful overview of a few ideas on the architecture of natural language that haven't yet appeared elsewhere. I provide it for those interested, but with all due caveats.
    Words are potentially one of the clearest windows on human knowledge and conceptual structure. But what do words mean? In this project we aim to construct and explore a formal model of lexical semantics grounded, via pragmatic inference, in core conceptual structures. Flexible human cognition is derived in large part from our ability to imagine possible worlds. A rich set of concepts, intuitive theories, and other mental representations support imagining and reasoning about possible worlds; together we call these core cognition. Here we posit that the collection of core concepts also forms the set of primitive elements available for lexical semantics: word meanings are built from pieces of core cognition. We propose to study lexical semantics in the setting of an architecture for language understanding that integrates literal meaning with pragmatic inference. This architecture supports underspecified and uncertain lexical meaning, leading to subtle interactions between meaning, conceptual structure, and context. We will explore several cases of lexical semantics where these interactions are particularly important: indexicals, scalar adjectives, generics, and modals.
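As a toy rendering of the proposed direction (a sketch under strong simplifying assumptions of my own, not the project's model), a lexical meaning can be treated as a predicate over simple structured "worlds", with a literal listener filtering worlds by the meaning of an utterance.

```python
# Toy sketch: a lexical meaning as a predicate over simple "possible world"
# representations, plus a literal listener that conditions a uniform prior on
# the utterance being true. Purely illustrative, not the proposed architecture.

worlds = [
    {"height_cm": 150},
    {"height_cm": 175},
    {"height_cm": 195},
]

# A scalar adjective's literal meaning, relative to a contextual threshold.
def tall(world, threshold_cm=180):
    return world["height_cm"] >= threshold_cm

def literal_listener(meaning, worlds):
    """Uniform prior over worlds, conditioned on the utterance being true."""
    consistent = [w for w in worlds if meaning(w)]
    p = 1 / len(consistent) if consistent else 0
    return {tuple(w.items()): p for w in consistent}

print(literal_listener(tall, worlds))  # only the 195 cm world survives
```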
  • A Survey of Computational Semantics: Representation, Inference and Knowledge in Wide-Coverage Text Understanding Johan Bos* University of Groningen
    Language and Linguistics Compass 5/6 (2011): 336–366, 10.1111/j.1749-818x.2011.00284.x
    A Survey of Computational Semantics: Representation, Inference and Knowledge in Wide-Coverage Text Understanding
    Johan Bos, University of Groningen
    Abstract. The aim of computational semantics is to capture the meaning of natural language expressions in representations suitable for performing inferences, in the service of understanding human language in written or spoken form. First-order logic is a good starting point, both from the representation and inference point of view. But even if one makes the choice of first-order logic as representation language, this is not enough: the computational semanticist needs to make further decisions on how to model events, tense, modal contexts, anaphora and plural entities. Semantic representations are usually built on top of a syntactic analysis, using unification, techniques from the lambda-calculus or linear logic, to do the book-keeping of variable naming. Inference has many potential applications in computational semantics. One way to implement inference is using algorithms from automated deduction dedicated to first-order logic, such as theorem proving and model building. Theorem proving can help in finding contradictions or checking for new information. Finite model building can be seen as a complementary inference task to theorem proving, and it often makes sense to use both procedures in parallel. The models produced by model generators for texts not only show that the text is contradiction-free; they also can be used for disambiguation tasks and linking interpretation with the real world. To make interesting inferences, often additional background knowledge is required (not expressed in the analysed text or speech parts).
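As a tiny, self-contained illustration of the model-building side of inference mentioned above (not any of the tools the survey discusses), a brute-force search over interpretations on a small finite domain can exhibit a model, showing that a set of statements is contradiction-free.

```python
# Tiny finite model builder: search all interpretations of a unary predicate P
# over a two-element domain and return one that satisfies every constraint, or
# None. Finding a model shows the statements are jointly consistent; finding
# none (on this domain) signals a contradiction. Illustrative sketch only.
from itertools import product

DOMAIN = ["a", "b"]

def find_model(constraints):
    for values in product([True, False], repeat=len(DOMAIN)):
        P = dict(zip(DOMAIN, values))   # one candidate interpretation of P
        if all(c(P) for c in constraints):
            return P
    return None

# "Something is P" and "not everything is P": satisfiable together.
constraints = [
    lambda P: any(P[x] for x in DOMAIN),      # exists x. P(x)
    lambda P: not all(P[x] for x in DOMAIN),  # not forall x. P(x)
]
print(find_model(constraints))  # e.g. {'a': True, 'b': False}

# Adding "everything is P" makes the set contradictory: no model is found.
print(find_model(constraints + [lambda P: all(P[x] for x in DOMAIN)]))  # None
```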
  • Wittgenstein and Musical Formalism: a Case Revisited
    Ápeiron. Estudios de filosofía. Monográfico «Wittgenstein. Música y arquitectura»
    Wittgenstein and Musical Formalism: A Case Revisited
    Hanne Appelqvist, University of Helsinki, [email protected]
    Abstract: This article defends a formalist interpretation of Wittgenstein's later thought on music by comparing it with Eduard Hanslick's musical formalism. In doing so, it returns to a disagreement I have had with Bela Szabados who, in his book Wittgenstein as a Philosophical Tone-Poet, claims that the attribution of formalism obscures the role that music played in the development of Wittgenstein's thought. The paper scrutinizes the four arguments Szabados presents to defend his claim, pertaining to alleged differences between Wittgenstein and Hanslick on their accounts of theory, beauty, rules, and the broader significance of music. I will argue that in each case the similarities between Wittgenstein's and Hanslick's respective views outshine possible differences. Ultimately, I will argue that instead of rendering music a marginal phenomenon suited for mere entertainment, formalism (as presented by Hanslick and Wittgenstein, whom I read as influenced by Kant's aesthetics) underscores music's ability to show fundamental features of reality and our relation to it. Music does this precisely as a sensuous yet structured medium that is irreducible to any conceptually determined domain.
    Keywords: Wittgenstein, Hanslick, Kant, formalism, music.
    Resumen: This article defends a formalist interpretation of Wittgenstein's later thought on music by comparing it with Eduard Hanslick's musical formalism. To that end, it reconsiders a disagreement I have had with Bela Szabados.