Two Dogmas of Empiricism
Recommended publications
The Twilight of Determinism: At Least in Biophysical Novelties Amihud Gilead
The Twilight of Determinism: At Least in Biophysical Novelties
Amihud Gilead, Department of Philosophy, Eshkol Tower, University of Haifa, Haifa 3498838. Email: [email protected]

Abstract. In the 1990s, Richard Lewontin referred to what appeared to be the twilight of determinism in biology. He pointed out that DNA determines only a small part of life phenomena, which are very complex. In fact, organisms determine the environment and vice versa in a nonlinear way. Very recently, the biophysicists Shimon Marom and Erez Braun have demonstrated that controlled biophysical systems show a relative autonomy and flexibility in response which could not be predicted. Within the boundaries of some restraints, most of them genetic, this freedom from determinism is well maintained. Marom and Braun have challenged not only biophysical determinism but also reverse-engineering, naive reductionism, mechanism, and systems biology. Metaphysically possibly, anything actual is contingent.1 Of course, such a possibility is entirely excluded in Spinoza's philosophy as well as in many philosophical views at present. Kant believed that anything phenomenal, namely, anything that is experimental or empirical, is inescapably subject to space, time, and categories (such as causality) which entail the undeniable validity of determinism. Both Kant and Laplace assumed that modern science, such as Newton's physics, must rely upon such a deterministic view. According to Kant, free will is possible not in the phenomenal world, the empirical world in which we …

1 This is an assumption of a full-blown metaphysics—panenmentalism—whose author is Amihud Gilead. See Gilead 1999, 2003, 2009 and 2011, including literary, psychological, and natural applications and examples—especially in chemistry—in Gilead 2009, 2010, 2013, 2014a–c, and 2015a–d.
Logical Truth, Contradictions, Inconsistency, and Logical Equivalence
Logical Truth, Contradictions, Inconsistency, and Logical Equivalence

Logical Truth
• The semantical concepts of logical truth, contradiction, inconsistency, and logical equivalence in Predicate Logic are straightforward adaptations of the corresponding concepts in Sentence Logic.
• A closed sentence X of Predicate Logic is logically true, ⊨ X, if and only if X is true in all interpretations.
• The logical truth of a sentence is proved directly using general reasoning in semantics.
• Given soundness, one can also prove the logical truth of a sentence X by providing a derivation with no premises.
• The result of the derivation is that X is a theorem, ⊢ X.

An Example
• ⊨ (∀x)Fx ⊃ (∃x)Fx.
  – Suppose d satisfies '(∀x)Fx'.
  – Then all x-variants of d satisfy 'Fx'.
  – Since the domain D is non-empty, some x-variant of d satisfies 'Fx'.
  – So d satisfies '(∃x)Fx'.
  – Therefore d satisfies '(∀x)Fx ⊃ (∃x)Fx', QED.
• ⊢ (∀x)Fx ⊃ (∃x)Fx.
  1  (∀x)Fx   P
  2  Fa       1 ∀E
  3  (∃x)Fx   2 ∃I

Contradictions
• A closed sentence X of Predicate Logic is a contradiction if and only if X is false in all interpretations.
• A sentence X is false in all interpretations if and only if its negation ∼X is true in all interpretations.
• Therefore, one may directly demonstrate that a sentence is a contradiction by proving that its negation is a logical truth.
• If the negation ∼X of a sentence X is a logical truth, then given completeness, it is a theorem, and hence ∼X can be derived from no premises.
• If a sentence X is such that if it is true in any interpretation, both Y and ∼Y are true in that interpretation, then X cannot be true on any interpretation.
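To supplement the worked example above with something machine-checkable (this is not part of the original handout), here is a minimal Lean 4 sketch of the same theorem; the hypothesis Nonempty D stands in for the handout's assumption that the domain D is non-empty, and the theorem name and variable names are ours.

  theorem forall_imp_exists {D : Type} (h : Nonempty D) (F : D → Prop) :
      (∀ x, F x) → (∃ x, F x) :=
    fun hAll =>
      -- the domain is non-empty, so pick some element a of D
      let a : D := Classical.choice h
      -- a is a witness: F a holds because every element satisfies F
      Exists.intro a (hAll a)

This mirrors the semantic argument step for step: the universal assumption gives F at every element, non-emptiness supplies an element, and that element witnesses the existential.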
Pluralisms About Truth and Logic Nathan Kellen University of Connecticut - Storrs, [email protected]
University of Connecticut, OpenCommons@UConn
Doctoral Dissertations, University of Connecticut Graduate School, 8-9-2019

Pluralisms about Truth and Logic
Nathan Kellen, University of Connecticut - Storrs, [email protected]

Follow this and additional works at: https://opencommons.uconn.edu/dissertations

Recommended Citation: Kellen, Nathan, "Pluralisms about Truth and Logic" (2019). Doctoral Dissertations. 2263. https://opencommons.uconn.edu/dissertations/2263

Pluralisms about Truth and Logic
Nathan Kellen, PhD, University of Connecticut, 2019

Abstract: In this dissertation I analyze two theories, truth pluralism and logical pluralism, as well as the theoretical connections between them, including whether they can be combined into a single, coherent framework. I begin by arguing that truth pluralism is a combination of realist and anti-realist intuitions, and that we should recognize these motivations when categorizing and formulating truth pluralist views. I then introduce logical functionalism, which analyzes logical consequence as a functional concept. I show how one can both build theories from the ground up and analyze existing views within the functionalist framework. One upshot of logical functionalism is a unified account of logical monism, pluralism and nihilism. I conclude with two negative arguments. First, I argue that the most prominent form of logical pluralism faces a serious dilemma: it either must give up on one of the core principles of logical consequence, and thus fail to be a theory of logic at all, or it must give up on pluralism itself. I call this "The Normative Problem for Logical Pluralism", and argue that it is unsolvable for the most prominent form of logical pluralism. Second, I examine an argument given by multiple truth pluralists that purports to show that truth pluralists must also be logical pluralists.
Willard Van Orman Quine
WILLARD VAN ORMAN QUINE

The philosopher's task differs from the others'…in detail, but in no such drastic way as those suppose who imagine for the philosopher a vantage point outside the conceptual scheme he takes in charge. There is no such cosmic exile. He cannot study and revise the fundamental conceptual scheme of science and common sense without having some conceptual scheme, whether the same or another no less in need of philosophical scrutiny, in which to work. He can scrutinize and improve the system from within, appealing to coherence and simplicity, but this is the theoretician's method generally.
Quine, Word & Object, pp. 275–6

Quine's "Two Dogmas"
1. The analytic/synthetic distinction
2. The idea that everything reduces to sense-data.
Quine thinks both are bogus.

Two kinds of meaningful sentences:
• Synthetic sentences (a sentence is synthetic if it passes the verifiability test: some possible experiences would either confirm it or disconfirm it), e.g. statements about physical things, other people, their minds, the self, my own sensations.
• Analytic sentences (a sentence is analytic if its truth or falsity is guaranteed by the rules of language alone; it is true in virtue of its meaning), e.g. propositions of logic, math, and definitions for translating empirical sentences into sentences about sense-data.

"The problem of giving an actual rule for translating sentences about a material thing into sentences about sense-contents, which may be called the problem of the 'reduction' of material things to sense-contents, is the main philosophical part of the traditional problem of perception." —Ayer, Language, Truth, and Logic, Ch. 3

Theoretical Statements
The table is beige.
Ingenuity in Mathematics Ross Honsberger 10.1090/Nml/023
AMS / MAA ANNELI LAX NEW MATHEMATICAL LIBRARY VOL 23
Ingenuity in Mathematics
Ross Honsberger
10.1090/nml/023

INGENUITY IN MATHEMATICS

NEW MATHEMATICAL LIBRARY
PUBLISHED BY THE MATHEMATICAL ASSOCIATION OF AMERICA

Editorial Committee
Basil Gordon, Chairman (1975-76), University of California, L.A.
Anneli Lax, Editor, New York University
Ivan Niven (1975-77), University of Oregon
M. M. Schiffer (1975-77), Stanford University

The New Mathematical Library (NML) was begun in 1961 by the School Mathematics Study Group to make available to high school students short expository books on various topics not usually covered in the high school syllabus. In a decade the NML matured into a steadily growing series of some twenty titles of interest not only to the originally intended audience, but to college students and teachers at all levels. Previously published by Random House and L. W. Singer, the NML became a publication series of the Mathematical Association of America (MAA) in 1975. Under the auspices of the MAA the NML will continue to grow and will remain dedicated to its original and expanded purposes.

INGENUITY IN MATHEMATICS
by Ross Honsberger, University of Waterloo, Canada
23
MATHEMATICAL ASSOCIATION OF AMERICA
Illustrated by George H. Buehler

Sixth Printing
© Copyright 1970, by the Mathematical Association of America
All rights reserved under International and Pan-American Copyright Conventions. Published in Washington by The Mathematical Association of America
Library of Congress Catalog Card Number: 77-134351
Print ISBN 978-0-88385-623-9
Electronic ISBN 978-0-88385-938-4
Manufactured in the United States of America

Note to the Reader
This book is one of a series written by professional mathematicians in order to make some important mathematical ideas interesting and understandable to a large audience of high school students and laymen.
Why Quine Is Not a Postmodernist
University of Chicago Law School, Chicago Unbound
Journal Articles, Faculty Scholarship, 1997

Why Quine Is Not a Postmodernist
Brian Leiter

Follow this and additional works at: https://chicagounbound.uchicago.edu/journal_articles
Part of the Law Commons

Recommended Citation: Brian Leiter, "Why Quine Is Not a Postmodernist," 50 SMU Law Review 1739 (1997).
This Article is brought to you for free and open access by the Faculty Scholarship at Chicago Unbound. It has been accepted for inclusion in Journal Articles by an authorized administrator of Chicago Unbound. For more information, please contact [email protected].

WHY QUINE IS NOT A POSTMODERNIST
Brian Leiter*

TABLE OF CONTENTS
I. INTRODUCTION .......................................... 1739
II. LEGITIMACY IN ADJUDICATION, TRUTH IN LAW ....... 1740
III. PATTERSON'S QUINE VERSUS QUINE THE NATURALIST ........ 1746

I. INTRODUCTION
Dennis Patterson's wide-ranging book Law and Truth1 has the great virtue of locating questions of legal theory within their broader (and rightful) philosophical context-that is, as special instances of more general problems in metaphysics and the philosophy of language. The book also sets out a position in jurisprudence that has some undeniable attractions.2 Although I have a number of disagreements with Patterson's treatment of the substantive philosophical issues at stake, there can be no doubt that he has performed a useful service in forcing legal philosophers to think seriously about the distinctively philosophical problems that define the discipline of jurisprudence. I organize my discussion around one topic in particular-namely, Patterson's identification of the great American philosopher Willard van Orman Quine (born 1908) as a pivotal figure in the transition from "modernity" to "postmodernity."3 This characterization, I will argue, involves an important misunderstanding of Quine's thought.
Defense of Reductionism About Testimonial Justification of Beliefs
Forthcoming in Noûs

A Defense of Reductionism about Testimonial Justification of Beliefs
TOMOJI SHOGENJI
Rhode Island College

Abstract
This paper defends reductionism about testimonial justification of beliefs against two influential arguments. One is the empirical argument to the effect that the reductionist justification of our trust in testimony is either circular since it relies on testimonial evidence or else there is scarce evidence in support of our trust in testimony. The other is the transcendental argument to the effect that trust in testimony is a prerequisite for the very existence of testimonial evidence since without the presumption of people's truthfulness we cannot interpret their utterances as testimony with propositional contents. This paper contends that the epistemic subject can interpret utterances as testimony with propositional contents without presupposing the credibility of testimony, and that evidence available to the normal epistemic subject can justify her trust in testimony.

I. Introduction
There has recently been a considerable interest in anti-reductionism about testimonial justification of beliefs, according to which we cannot justify our trust in testimony by perceptual and memorial evidence.1 The reason for the interest is not the enticement of skepticism. Recent anti-reductionists hold that we are prima facie justified in trusting testimony simply because it is testimony. This means that there is a presumption in favor of testimony that it is credible unless contrary evidence is available. I will use the term "anti-reductionism" to refer to this non-skeptical version of anti-reductionism about testimonial justification. The more traditional position is reductionism, of which the most prominent advocate is David Hume.
You Are Not Your Brain: Against Teaching to the Brain
You Are Not Your Brain: Against Teaching to the Brain
Gregory M. Nixon

Gregory Nixon is an Assistant Professor in the School of Education at the University of Northern British Columbia in Prince George. He took his doctorate with William F. Pinar at Louisiana State University and taught at various universities in the USA for 12 years before returning home to Canada. He publishes widely on learning theory and philosophy of mind. He is editor-at-large for the Journal of Consciousness Exploration & Research.

Abstract
Since educators are always looking for ways to improve their practice, and since empirical science is now accepted in our worldview as the final arbiter of truth, it is no surprise they have been lured toward cognitive neuroscience in hopes that discovering how the brain learns will provide a nutshell explanation for student learning in general. I argue that identifying the person with the brain is scientism (not science), that the brain is not the person, and that it is the person who learns. In fact the brain only responds to the learning of embodied experience within the extra-neural network of intersubjective communications. Learning is a dynamic, cultural activity, not a neural program. Brain-based learning is unnecessary for educators and may be dangerous in that a culturally narrow ontology is taken for granted, thus restricting our creativity and imagination, and narrowing the human community.

[keywords: selfhood, neuroscience, cognitive science, brain-based learning, intersubjectivity, consciousness, philosophy of mind, explanatory gap, cultural construction, reductionism, scientism, education, learning theory, curriculum theory]

Introduction
Human experience is a dance that unfolds in the world and with others.
The Univocity of Substance and the Formal Distinction of Attributes: the Role of Duns Scotus in Deleuze's Reading of Spinoza Nathan Widder
parrhesia 33 · 2020 · 150-176

the univocity of substance and the formal distinction of attributes: the role of duns scotus in deleuze's reading of spinoza
nathan widder

This paper examines the role played by medieval theologian John Duns Scotus in Gilles Deleuze's reading of Spinoza's philosophy of expressive substance; more generally, it elaborates a crucial moment in the development of Deleuze's philosophy of sense and difference. Deleuze contends that Spinoza adapts and extends Duns Scotus's two most influential theses, the univocity of being and formal distinction, despite neither appearing explicitly in Spinoza's writings. "It takes nothing away from Spinoza's originality," Deleuze declares, "to place him in a perspective that may already be found in Duns Scotus" (Deleuze, 1992, 49).1 Nevertheless, the historiographic evidence is clearly lacking, leaving Deleuze to admit that "it is hardly likely that" Spinoza had even read Duns Scotus (359n28). Indeed, the only support he musters for his speculation is Spinoza's obvious interests in scholastic metaphysical and logical treatises, the "probable influence" of the Scotist-informed Franciscan priest Juan de Prado on his thought, and the fact that the problems Duns Scotus addresses need not be confined to Christian thought (359–360n28). The paucity of evidence supporting this "use and abuse" of history, however, does not necessarily defeat the thesis. Like other lineages Deleuze proposes, the one he traces from Duns Scotus to Spinoza, and subsequently to Nietzsche, turns not on establishing intentional references by one thinker to his predecessor, but instead on showing how the borrowings and adaptations asserted to create the connection make sense of the way the second philosopher surmounts blockages he faces while responding to issues left unaddressed by the first.
Quantification of Damages, in 3 ISSUES IN COMPETITION LAW AND POLICY 2331 (ABA Section of Antitrust Law 2008)
Theon van Dijk & Frank Verboven, Quantification of Damages, in 3 ISSUES IN COMPETITION LAW AND POLICY 2331 (ABA Section of Antitrust Law 2008)

Chapter 93
QUANTIFICATION OF DAMAGES
Theon van Dijk and Frank Verboven*

This chapter provides a general economic framework for quantifying damages or lost profits in price-fixing cases. We begin with a conceptual framework that explicitly distinguishes between the direct cost effect of a price overcharge and the subsequent indirect pass-on and output effects of the overcharge. We then discuss alternative methods to quantify these various components of damages, ranging from largely empirical approaches to more theory-based approaches. Finally, we discuss the law and economics of price-fixing damages in the United States and Europe, with the focus being the legal treatment of the various components of price-fixing damages, as well as the legal standing of the various groups harmed by collusion.

1. Introduction
Economic damages can be caused by various types of competition law violations, such as price-fixing or market-dividing agreements, exclusionary practices, or predatory pricing by a dominant firm. Depending on the violation, there are different parties that may suffer damages, including customers (direct purchasers and possibly indirect purchasers located downstream in the distribution chain), suppliers, and competitors. Even though the parties damaged and the amount of the damage may be clear from an economic perspective, in the end it is the legal framework within a jurisdiction that determines which violations can lead to compensation, which parties have standing to sue for damages, and the burden and standard of proof that must be met by plaintiffs and defendants.
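As a stylized illustration of the damage decomposition described in the abstract above, and using notation of our own rather than the chapter's, the loss D suffered by a direct purchaser can be sketched as the direct overcharge effect, minus what the purchaser recoups by passing the overcharge on to its own customers, plus the profit lost on the sales volume it forgoes:

\[
D = \underbrace{(p_1 - p_0)\,q_1}_{\text{direct cost (overcharge) effect}}
  \;-\; \underbrace{(r_1 - r_0)\,q_1}_{\text{pass-on effect}}
  \;+\; \underbrace{m\,(q_0 - q_1)}_{\text{output effect}}
\]

Here \(p\) is the input price the purchaser pays, \(r\) the price it charges downstream, \(q\) the quantity it sells, \(m\) its but-for profit margin per unit, and the subscripts 1 and 0 denote the actual (cartel) and but-for (competitive) scenarios; which of these components are legally recoverable, and by whom, is precisely the question the chapter goes on to address.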
Defining Data Science by a Data-Driven Quantification of the Community
machine learning & knowledge extraction
Article
Defining Data Science by a Data-Driven Quantification of the Community
Frank Emmert-Streib 1,2,* and Matthias Dehmer 3,4,5

1 Predictive Medicine and Data Analytics Lab, Department of Signal Processing, Tampere University of Technology, FI-33101 Tampere, Finland
2 Institute of Biosciences and Medical Technology, FI-33101 Tampere, Finland
3 Institute for Intelligent Production, Faculty for Management, University of Applied Sciences Upper Austria, Steyr Campus, A-4400 Steyr, Austria; [email protected]
4 Department of Mechatronics and Biomedical Computer Science, UMIT, A-6060 Hall in Tyrol, Austria
5 College of Computer and Control Engineering, Nankai University, Tianjin 300071, China
* Correspondence: [email protected]; Tel.: +358-50-301-5353

Received: 4 December 2018; Accepted: 17 December 2018; Published: 19 December 2018

Abstract: Data science is a new academic field that has received much attention in recent years. One reason for this is that our increasingly digitalized society generates more and more data in all areas of our lives and science, and we are desperately seeking solutions to deal with this problem. In this paper, we investigate the academic roots of data science. We use data from Google Scholar on scientists who have an interest in data science, together with their citations, to perform a quantitative analysis of the data science community. Furthermore, for decomposing the data science community into its major defining factors, corresponding to the most important research fields, we introduce a statistical regression model that is fully automatic and robust with respect to a subsampling of the data.
DUHEM, QUINE and the OTHER DOGMA Alexander Afriat
DUHEM, QUINE AND THE OTHER DOGMA
Alexander Afriat

ABSTRACT: By linking meaning and analyticity (through synonymy), Quine rejects both "dogmas of empiricism" together, as "two sides of a single dubious coin." His rejection of the second ("reductionism") has been associated with Duhem's argument against crucial experiments—which relies on fundamental differences, brought up again and again, between mathematics and physics. The other dogma rejected by Quine is the "cleavage between analytic and synthetic truths"; but aren't the truths of mathematics analytic, those of physics synthetic? Exploiting Quine's association of essences, meaning, synonymy and analyticity, and appealing to a 'model-theoretical' notion of abstract test derived from Duhem and Quine—which can be used to overcome their holism by separating essences from accidents—I reconsider the 'crucial experiment,' the aforementioned "cleavage," and the differences Duhem attributed to mathematics and physics; and propose a characterisation of the meaning and reference of sentences, which extends, in a natural way, the distinction as it applies to words.

1. Introduction
A resemblance1 between positions held by Duhem and Quine has led to the conjunction of their names: one speaks of "Duhem-Quine." Whether the conjunction—amid many differences2 of period, provenance, profession, subject-matter, style and generality—is entirely justified is debatable, but not really the issue here. Quine's position is famously expressed in "Two dogmas of empiricism"; it was by disputing the second3 that he came to be associated with Duhem. But there is also the first, the "cleavage between analytic and synthetic truths."4 Quine claims they are equivalent, indeed "two sides of the same dubious coin," and contests both together.