Why Modern Logic Took So Long to Arrive: Three Lectures

Wilfrid Hodges
Herons Brook, Sticklepath, Okehampton EX20 2PY
March 2009
http://wilfridhodges.co.uk/history11.pdf

Lecture One: What traditional logic covered, and in particular the place of relational reasoning

A potted history in five periods

1. The classical Greeks, 4th century BC
Aristotle introduces the idea of checking arguments by showing that they conform to valid argument patterns. He describes a systematic collection of valid argument patterns (syllogisms) and some sporadic ones (topics). Chrysippus describes some propositional argument patterns.

2. The Roman Empire, 2nd to 6th centuries AD
Aristotle's works, edited by Andronicus of Rhodes, become the basis of liberal education. Late 3rd century, Porphyry protects Aristotle's logic from ideological disputes by separating it from metaphysics and proposing a basis in terms of meanings. Details worked out by his followers and the Alexandrian school of Ammonius. 6th century, Boethius paraphrases in Latin Porphyry's views on logic.

3. The Arabs, 8th to 13th centuries
Excellent translations made of Aristotle. Translations of Roman Empire commentators become available in libraries as far east as Afghanistan. 10th century, Al-Fārābī writes a major commentary (now mostly lost) on Aristotle's logic. 11th century, Ibn Sīnā, an independent thinker in the Aristotelian tradition (compare Leibniz and Frege, as we will). One of his textbooks of logic is over 2000 pages. Later writings on logic (e.g. Ibn Rushd, 12th century; Tusi, 13th century) show less independence.

4. The Scholastics, 12th to 15th centuries
12th century, Abelard virtually reinvents logic on the basis of Boethius and an incomplete set of texts of Aristotle. 12th to 13th centuries, Terminists develop the theory of supposition. (Unclear whether it's about the notion of reference or about infinitary proof rules.) Early 14th century, Jean Buridan, probably the most technically proficient of the Scholastics. Throughout this period, the theory of logic was confined to a handful of universities.

5. Renaissance and Enlightenment, 15th to 19th centuries
Logic emerges from the universities and becomes education for barristers and young ladies. 17th century, attempts to adapt logic to the spirit of the age of Descartes and Newton; the main figures are Arnauld and Nicole (the Port-Royal Logic) and above all Leibniz. 19th century, the old idea of logic as protection against error is revived in more sophisticated form, in particular by Frege. With Peano (1890s) the connection to Aristotle is finally lost.

Aristotle's syllogisms

A typical syllogism, as Aristotle wrote it (Prior Analytics i.6):

    If R belongs to every S, and P to no S, there will be a deduction that P will necessarily not belong to some R.

Here 'R', 'P', 'S' stand for nouns. (Proper names are treated as common nouns.) Thus for example:

    If every animal is mobile, and no animal is eternal, then necessarily some mobile thing is not eternal.

Later logicians saw that to make this example work, we need to assume something about existence. The usual assumption was that if there are no Ss then
• 'Some S is an R' is false;
• 'Every S is an R' is false;
• 'Some S is not an R' (negation of 'Every S is an R') is true;
• 'No S is an R' is true.
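For comparison, here is the usual modern first-order reading of the four sentence forms, together with the point at which the existence assumption enters. This is a sketch in present-day notation, not part of the lecture text; S, R and P are the term letters used above.

\[
\begin{array}{ll}
\text{Every } S \text{ is an } R: & \forall x\,(S(x) \to R(x))\\
\text{No } S \text{ is an } R: & \forall x\,(S(x) \to \neg R(x))\\
\text{Some } S \text{ is an } R: & \exists x\,(S(x) \land R(x))\\
\text{Some } S \text{ is not an } R: & \exists x\,(S(x) \land \neg R(x))
\end{array}
\]

On this reading the syllogism of Prior Analytics i.6 is valid only with the extra premise that some S exists:

\[
\forall x\,(S(x) \to R(x)),\quad \forall x\,(S(x) \to \neg P(x)),\quad \exists x\,S(x) \;\vdash\; \exists x\,(R(x) \land \neg P(x)),
\]

since without \(\exists x\,S(x)\) a model with no animals and nothing mobile satisfies both premises vacuously and refutes the conclusion. The traditional convention in the bullet list builds that assumption into the sentence forms themselves, fixing the truth values as listed whenever the subject term is empty.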
Topics

Also known as places. The ones for use in general situations are called common places; Renaissance scholars suggested keeping a commonplace book for them. The most general topics are called maximal places, abbreviated to maxims. Example:

    If one of the correlated things is posited, the other is posited. (Peter of Spain, Tractatus V para. 28.)

This is the pattern behind the inference

    If there is a father then there is a child.

Hilbert used this topic as an argument for adopting first-order logic:

    "If there is a son, then there is a father," is certainly a logically self-evident assertion, and we may demand of any satisfactory logical calculus that it make obvious this self-evidence, in the sense that the asserted connection will be seen, by means of the symbolic representation, to be a consequence of simple logical principles.

This is from Hilbert's Göttingen lectures of 1917–1922, published in 1928 in his textbook with Ackermann. In these lectures Hilbert created the syntax and semantics of first-order logic.

What's so hot about first-order logic, if this topic was already known to Boethius in the 6th century?

Answer (in Hilbert's text): First-order logic is a logical calculus that analyses both syllogisms and relational arguments like this one down to 'simple logical principles'.
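A minimal sketch of the kind of analysis Hilbert is pointing at, in modern notation rather than anything quoted from his text: write F(x, y) for 'x is the father of y', so that a son (or child) is anything that has a father.

\[
\exists y\,\exists x\,F(x,y) \;\to\; \exists x\,\exists y\,F(x,y)
\]

'There is a son' and 'there is a father' both say that the relation F is non-empty, and the implication follows from the rules for the existential quantifier alone, which is the sense in which the self-evidence is reduced to 'simple logical principles'. A syllogistic analysis that treats 'son' and 'father' as impenetrable one-place terms has no way to exhibit this connection.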
For the rest of this lecture, we ask why nobody before the end of the 19th century seriously tried to integrate syllogisms and relational topics into a calculus covering both. One meets three views:
• (Joachim Jungius) Relational topics are sui generis.
• (Leibniz) Relational topics can be reduced to syllogisms plus paraphrasing.
• (Buridan, De Morgan) Syllogisms and some relational topics are justified by a higher-level principle called dictum de omni et nullo.

Jungius is chiefly memorable for having stimulated Leibniz. The other views, plus relevant opinions of Ibn Sīnā, all revolve around the question whether one can really reason with anything beyond syllogistic sentences

    (Some/Every) A (is/isn't) a B,

where the terms in place of A and B are taken as impenetrable. The view that one can't is what I call Top-Level Processing, TLP for short. (Some refinements will follow.)

Some Basics

How are syllogisms supposed to be used, in practice, for validating arguments? There are some scattered remarks about this at the end of Aristotle's Prior Analytics i. They show that Aristotle's practice was probably almost identical to modern elementary logic courses.

Given an argument in Greek, Aristotle shows that it fits some valid syllogism by showing what terms in the argument correspond to the term symbols in the syllogism. E.g. (Prior Analytics i.35):

    A: having internal angles that sum to two right angles.
    B: triangle.
    C: isosceles triangle.

He calls this setting out the terms.

Setting out terms (in this sense) disappears after Aristotle and reappears only in Boole 1847. What takes its place? To check an argument, we have to verify that (for example) the first premise means the same as 'Every B is an A', where 'A', 'B' are interpreted as in the setting out of terms. Instead, the traditionals rewrote 'Every B is an A' using the expressions in the setting out of terms, and checked that the result means the same as the premise. So they made a natural language paraphrase of the premise.

Example (Aristotle, Prior Analytics i.36):

    God doesn't have times that need to be set aside for action. God does have right moments for action. Therefore some right moment for action is not a time that needs to be set aside for action.

Setting out:
    A: thing that God has.
    B: time needing to be set aside for action.
    C: right moment for action.

Syllogism: No B is an A. Some C is an A. Therefore some C is not a B.

Natural language paraphrase:

    No (time needing to be set aside for action) is a (thing that God has). Some (right moment for action) is a (thing that God has). Therefore some (right moment for action) is not a (time needing to be set aside for action).

The subject and predicate terms are marked with brackets. I refer to the corresponding parts of the original argument as the eigenterms (adapting Gentzen).

Why does the paraphrase of each sentence have to be a syllogistic sentence? From a modern perspective, a no-brainer: we are validating by syllogisms, and this just is the form of the sentences in syllogisms. In another logic, other forms would be used. But traditional logicians often seem to want to show that reasoning can only use syllogistic (or very similar) sentences. This blocks any attempt to find a more powerful logic.

Ibn Sīnā has a complex and highly integrated theory that uses logic to explain the workings of the rational mind, including the kinds of error that one can make. When we realise for the first time that something is true, on the basis of a rational argument, this is because our minds analyse the data, notice common features between the propositions expressing the data, and by identifying these features, produce a new combination of ideas.

'Recombinant' syllogisms (Aristotle's syllogisms and their adaptations to propositional logic) express this kind of argument. Our minds are a kind of PROLOG inference engine. In short, the basic ingredient of reasoning is to bring together two ideas. The syllogistic sentences express such a union of ideas. Hence Top-Level Processing.

Leibniz believes that the key to reasoning is that we can paraphrase the premises so as to bring the eigenterms to nominative case. Opuscules p. 287:

    A reading of poets is an act by which a poet is read. ... Paris is a lover of Helen, i.e. Paris is a lover, and thereby Helen is loved. So there are two propositions packed into one. If you don't resolve oblique cases into several propositions, you will never avoid being forced (like Jungius) to devise new-fangled ways of reasoning.

Remark

Leibniz's basic strategy here is that of Peirce, 'The reader is introduced to relatives' (1892), and Davidson, 'The logical form of action sentences' (1967): quantify over actions or events, and treat other parts of the situation as attached to the action or event in standard ways (e.g. as AGENT or OBJECT). This doesn't get rid of relations, but it limits them to standard ones built into the language.
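To make this strategy concrete, here is a schematic event-style rendering of Leibniz's example in modern notation; the formula and the predicate names are an illustration, not something in the lecture text.

\[
\text{Paris is a lover of Helen:}\qquad \exists e\,\bigl(\mathrm{Loving}(e) \;\land\; \mathrm{AGENT}(e, \mathrm{Paris}) \;\land\; \mathrm{OBJECT}(e, \mathrm{Helen})\bigr)
\]

Dropping one conjunct gives 'Paris is a lover', dropping the other gives 'Helen is loved', which is Leibniz's 'two propositions packed into one'. The only relation symbols left are the fixed role predicates AGENT and OBJECT, which is exactly the sense in which the strategy does not get rid of relations but limits them to standard ones built into the language.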