On Popper on Truth
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
"There Is No Right to Be Stupid": a Voegelinian Analysis of Islamist
"There is No Right to Be Stupid": A Voegelinian Analysis of Islamist Terrorism with Reference to the "Elements of Reality" Copyright 2005 Celestino Perez, Jr. In the following essay I make three claims. First, in agreement with Eric Voegelin and Barry Cooper, I illustrate why a comprehensive study of politics requires that attention be paid to the symbols "God and man," "the Good", and "the soul" (Section I). I then briefly wonder if Voegelin and Cooper, despite offering cogent analyses of pernicious ideologies, ask too much of their readers? If so, what are the prospects for a competent political science and a healthy democratic populace (Section II)? In Section III, I suggest a different way of applying Voegelin's theory to an analysis of Islamist ideology, wherein I lay out ten "elements of reality" and five distilled "fundamental truths" derived from these elements. To the degree that someone distorts or neglects these elements of reality, he is a pneumopath. He may be very smart. But, alas, he is a "philosophical and religious ignoramus,"1 [1] and hence stupid. And, as Voegelin reminds us in Hitler and the Germans, "There is no right to be stupid." To be sure, if we allow ourselves to submit to stupidity, the gangster animals will be sure to mockingly let us know: 1 [1] Barry Cooper, New Political Religions, or An Analysis of Modern Terrorism ( Columbia : University of Missouri Press , 2004), xii. Cooper describes Sayyid Qutb's and other terrorists' "dogmatic certainty" as resulting from their being "philosophical and religious ignoramuses." "Yoohoo, silly ass!"2 [2] In the Concluding Notes, I lay out what I consider to be the gist of Cooper's argument, and I briefly comment on his Appendix in New Political Religions. -
Truth, Rationality, and the Growth of Scientific Knowledge
...sense similar to that in which processes or things may be said to be parts of the world; that the world consists of facts in a sense in which it may be said to consist of (four-dimensional) processes or of (three-dimensional) things. They believe that, just as certain nouns are names of things, sentences are names of facts. And they sometimes even believe that sentences are something like pictures of facts, or that they are projections of facts. But all this is mistaken. The fact that there is no elephant in this room is not one of the processes or parts of the world; nor is the fact that a hailstorm in Newfoundland occurred exactly 111 years after a tree collapsed in the New Zealand bush. Facts are something like a common product of language and reality; they are reality pinned down by descriptive statements. They are like abstracts from a book, made in a language which is different from that of the original, and determined not only by the original book but nearly as much by the principles of selection and by other methods of abstracting, and by the means of which the new language disposes. New linguistic means not only help us to describe new kinds of facts; in a way, they even create new kinds of facts. In a certain sense, these facts obviously existed before the new means were created which were indispensable for their description; I say, 'obviously' because a calculation, for example, of the movements of the planet Mercury of 100 years ago, carried out today with the help of the calculus of the theory of relativity, may...

TRUTH, RATIONALITY, AND THE GROWTH OF SCIENTIFIC KNOWLEDGE

1. THE GROWTH OF KNOWLEDGE: THEORIES AND PROBLEMS

I

My aim in this lecture is to stress the significance of one particular aspect of science: its need to grow, or, if you like, its need to progress.
Religious Language and Verificationism
© Michael Lacewing

Religious language and verificationism

AYER'S ARGUMENT

In the 1930s, a school of philosophy arose called logical positivism, concerned with the foundations and possibility of knowledge. It developed a criterion for meaningful statements, called the principle of verification. On A. J. Ayer's version (Language, Truth and Logic), the principle of verification states that a statement only has meaning if it is either analytic or empirically verifiable (i.e. can be shown by experience to be probably true or false). 'God exists', and so all other talk of God, is such a statement, claims Ayer. Despite the best attempts of the ontological argument, we cannot prove 'God exists' from a priori premises using deduction alone. So 'God exists' is not analytically true. Therefore, to be meaningful, 'God exists' must be empirically verifiable. Ayer argues it is not. If a statement is an empirical hypothesis, it predicts that our experience will be different depending on whether it is true or false. But 'God exists' makes no such predictions. So it is meaningless.

Some philosophers argue that religious language attempts to capture something of religious experience, although it is 'inexpressible' in literal terms. Ayer responds that whatever religious experiences reveal, they cannot be said to reveal any facts. Facts are the content of statements that purport to be intelligible and can be expressed literally. If talk of God is non-empirical, it is literally unintelligible, hence meaningless.

Responses

We can object that many people do think that 'God exists' has empirical content. For example, the argument from design argues that the design of the universe is evidence for the existence of God.
Understanding the Progress of Science
Understanding the Progress of Science

C. D. McCoy, 30 May 2018

The central problem of epistemology has always been and still is the problem of the growth of knowledge. And the growth of knowledge can be studied best by studying the growth of scientific knowledge. (Popper, 2002b, xix)

Merely fact-minded sciences make merely fact-minded people. (Husserl, 1970, 6)

The growth of human knowledge is realized most dramatically in the growth of scientific knowledge. What we have come to know through science about the workings of our human bodies and the workings of heavenly bodies, among much else, astounds, especially in comparison to our accumulating knowledge of the commonplace and everyday. Impressed by the evident epistemic progress made by science, many philosophers during the 20th century thought that studying how scientific knowledge increases is a crucial task of epistemological inquiry, and, indeed, the progress of science has been a topic of considerable interest. Yet, eventually, late in the century, attention waned and for the most part shifted to other tasks. Recently, however, a provocative article by Bird has re-ignited philosophical interest in the topic of scientific progress (Bird, 2007). Bird criticizes what he takes to have been the most prominent accounts advanced in the 20th century and presses for what he views as a venerable but lately overlooked one. He claims that realist philosophers of science have often presumed that truth is the goal of science and progress to be properly characterized by an accumulation of truth or truths, and notes that anti-realists have often rejected truth as a goal of science and characterized progress in terms of, among other things, "problem-solving effectiveness."[1] He criticizes the former for overlooking the relevance of justification to progress and the latter for giving up on truth.
Criticism and the Growth of Knowledge
CRITICISM AND THE GROWTH OF KNOWLEDGE

Proceedings of the International Colloquium in the Philosophy of Science, London, 1965, volume 4

Edited by IMRE LAKATOS, Professor of Logic, University of London, and ALAN MUSGRAVE, Professor of Philosophy, University of Otago

CAMBRIDGE AT THE UNIVERSITY PRESS, 1970

Published by the Syndics of the Cambridge University Press, Bentley House, 200 Euston Road, London N.W.1. American Branch: 32 East 57th Street, New York, N.Y. 10022. © Cambridge University Press 1970. Library of Congress Catalogue Card Number 78-105496. Standard Book Number: 521 07826 1. Printed in Great Britain at the University Press, Aberdeen.

CONTENTS

Preface vii
T. S. KUHN: Logic of Discovery or Psychology of Research? 1
Discussion:
J. W. N. WATKINS: Against 'Normal Science' 25
S. E. TOULMIN: Does the Distinction between Normal and Revolutionary Science Hold Water? 39
L. PEARCE WILLIAMS: Normal Science, Scientific Revolutions and the History of Science 49
K. R. POPPER: Normal Science and its Dangers 51
MARGARET MASTERMAN: The Nature of a Paradigm 59
I. LAKATOS: Falsification and the Methodology of Scientific Research Programmes 91
P. K. FEYERABEND: Consolations for the Specialist 197
T. S. KUHN: Reflections on my Critics 231
Index 279

PREFACE

This book constitutes the fourth volume of the Proceedings of the 1965 International Colloquium in the Philosophy of Science held at Bedford College, Regent's Park, London, from 11 to 17 July 1965. The Colloquium was organized jointly by the British Society for the Philosophy of Science and the London School of Economics and Political Science, under the auspices of the Division of Logic, Methodology and Philosophy of Science of the International Union of History and Philosophy of Science.
The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning
Philosophies (Article)

The P–T Probability Framework for Semantic Communication, Falsification, Confirmation, and Bayesian Reasoning

Chenguang Lu, School of Computer Engineering and Applied Mathematics, Changsha University, Changsha 410003, China; [email protected]

Received: 8 July 2020; Accepted: 16 September 2020; Published: 2 October 2020

Abstract: Many researchers want to unify probability and logic by defining logical probability or probabilistic logic reasonably. This paper tries to unify statistics and logic so that we can use both statistical probability and logical probability at the same time. For this purpose, this paper proposes the P–T probability framework, which is assembled from Shannon's statistical probability framework for communication, Kolmogorov's probability axioms for logical probability, and Zadeh's membership functions used as truth functions. The two kinds of probabilities are connected by an extended Bayes' theorem, with which we can convert a likelihood function and a truth function from one to the other. Hence, we can train truth functions (in logic) by sampling distributions (in statistics). This probability framework was developed in the author's long-term studies on semantic information, statistical learning, and color vision. The paper first proposes the P–T probability framework and explains the different probabilities in it by its applications to semantic information theory. Then, this framework and the semantic information methods are applied to statistical learning, statistical mechanics, hypothesis evaluation (including falsification), confirmation, and Bayesian reasoning. Theoretical applications illustrate the reasonability and practicability of this framework. The framework is helpful for interpretable AI; interpreting neural networks requires further study.

Keywords: statistical probability; logical probability; semantic information; rate-distortion; Boltzmann distribution; falsification; verisimilitude; confirmation measure; Raven Paradox; Bayesian reasoning
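The abstract describes converting between a truth (membership) function and a likelihood via an extended Bayes' theorem. The minimal Python sketch below illustrates one plausible reading of that conversion; the discrete domain, the prior, the toy "warm" predicate, and all variable names are assumptions made for the example, not the paper's own code or exact formulation.

```python
import numpy as np

# Toy discrete domain (e.g. temperature readings) with a statistical prior P(x).
x = np.array([10.0, 15.0, 20.0, 25.0, 30.0])
p_x = np.array([0.1, 0.2, 0.4, 0.2, 0.1])

# Fuzzy truth (membership) function T(theta|x) for a predicate theta = "x is warm".
t_theta_given_x = np.array([0.0, 0.2, 0.6, 0.9, 1.0])

# Logical probability of theta: T(theta) = sum_x T(theta|x) * P(x).
t_theta = float(np.sum(t_theta_given_x * p_x))

# Extended-Bayes-style conversion of the truth function into a likelihood:
# P(x|theta) = T(theta|x) * P(x) / T(theta).
p_x_given_theta = t_theta_given_x * p_x / t_theta

print(t_theta)                # logical probability of "warm" (0.56 here)
print(p_x_given_theta.sum())  # 1.0: a proper sampling distribution
```

The point of the toy example is only that a fuzzy truth function plus a statistical prior yields a proper sampling distribution, which is the bridge between the two kinds of probability the abstract describes; on the same reading, training a truth function from a sampling distribution would run the identity in the opposite direction.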
Degrees of Riskiness, Falsifiability, and Truthlikeness: A Neo-Popperian Account Applicable to Probabilistic Theories
Degrees of riskiness, falsifiability, and truthlikeness. A neo-Popperian account applicable to probabilistic theories.

Leander Vignero ([email protected]) and Sylvia Wenmackers ([email protected])
KU Leuven, Institute of Philosophy, Centre for Logic and Philosophy of Science, Kardinaal Mercierplein 2, bus 3200, 3000 Leuven, Belgium.

July 7, 2021 (Forthcoming in Synthese.) arXiv:2107.03772v1 [physics.hist-ph] 8 Jul 2021

Abstract: In this paper, we take a fresh look at three Popperian concepts: riskiness, falsifiability, and truthlikeness (or verisimilitude) of scientific hypotheses or theories. First, we make explicit the dimensions that underlie the notion of riskiness. Secondly, we examine if and how degrees of falsifiability can be defined, and how they are related to various dimensions of the concept of riskiness as well as the experimental context. Thirdly, we consider the relation of riskiness to (expected degrees of) truthlikeness. Throughout, we pay special attention to probabilistic theories and we offer a tentative, quantitative account of verisimilitude for probabilistic theories.

"Modern logic, as I hope is now evident, has the effect of enlarging our abstract imagination, and providing an infinite number of possible hypotheses to be applied in the analysis of any complex fact." Russell (1914, p. 58).

A theory is falsifiable if it allows us to deduce predictions that can be compared to evidence, which according to our background knowledge could show the prediction, and hence the theory, to be false. In his defense of falsifiability as a demarcation criterion, Popper has stressed the importance of bold, risky claims that can be tested, for they allow science to make progress.
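The abstract promises a quantitative account of verisimilitude for probabilistic theories but does not state it here. Purely as an illustration of what a degree of truthlikeness for a probabilistic theory could look like (a generic stand-in, not the authors' measure), the sketch below scores theories over a finite outcome space by one minus their total variation distance from the true distribution; the example distributions, the choice of metric, and the function name are all assumptions.

```python
import numpy as np

def closeness(theory: np.ndarray, truth: np.ndarray) -> float:
    """Toy truthlikeness score for a probabilistic theory over a finite
    outcome space: 1 minus the total variation distance to the true
    distribution (1 = maximally truthlike, 0 = maximally far off)."""
    return 1.0 - 0.5 * float(np.abs(theory - truth).sum())

true_dist    = np.array([0.7, 0.2, 0.1])    # unknown data-generating distribution
bold_theory  = np.array([0.9, 0.05, 0.05])  # sharp, risky, easier to falsify
timid_theory = np.array([0.4, 0.3, 0.3])    # hedged, close to uniform

print(closeness(bold_theory, true_dist))    # 0.8
print(closeness(timid_theory, true_dist))   # 0.7
```

On such a toy measure, a bolder theory that happens to land near the truth scores higher than a hedged, near-uniform one, which is the kind of trade-off between riskiness and expected truthlikeness the abstract gestures at.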
Reexamining the Problem of Demarcating Science and Pseudoscience
Reexamining the Problem of Demarcating Science and Pseudoscience

By Evan Westre, B.A., Vancouver Island University, 2010

A Thesis Submitted in Partial Fulfillment of the Requirements for the Degree of MASTER OF ARTS

© Evan Westre, 2014. All Rights Reserved. This thesis may not be reproduced in whole or in part, by photocopy or other means, without the permission of the author.

Supervisory Committee
Dr. Audrey Yap: Supervisor (Department of Philosophy)
Dr. Jeffrey Foss: Departmental Member (Department of Philosophy)

Abstract

The demarcation problem aims to articulate the boundary between science and pseudoscience. Solutions to the problem have been notably raised by the logical positivists (verificationism), Karl Popper (falsificationism), and Imre Lakatos (methodology of research programmes). Due, largely, to the conclusions drawn by Larry Laudan in a pivotal 1981 paper, which dismissed the problem of demarcation as a "pseudo-problem", the issue was brushed aside for years. Recently, however, there has been a revival of attempts to reexamine the demarcation problem and synthesize new solutions. My aim is to survey two of the contemporary attempts and to assess these approaches over and against the broader historical trajectory of the demarcation problem. These are the efforts of Nicholas Maxwell (aim-oriented empiricism) and Paul Hoyningen-Huene (systematicity). I suggest that the main virtue of the new attempts is that they promote a self-reflexive character within the sciences.
Phil. 4400 Notes #1: The Problem of Induction
Phil. 4400 Notes #1: The problem of induction

I. Basic concepts

The problem of induction:
• Philosophical problem concerning the justification of induction.
• Due to David Hume (1748).

Induction: A form of reasoning in which
a) the premises say something about a certain group of objects (typically, observed objects)
b) the conclusion generalizes from the premises: says the same thing about a wider class of objects, or about further objects of the same kind (typically, the unobserved objects of the same kind).

• Examples:
All observed ravens so far have been black. So (probably) all ravens are black.
The sun has risen every day for the last 300 years. So (probably) the sun will rise tomorrow.

Non-demonstrative (non-deductive) reasoning:
• Reasoning that is not deductive.
• A form of reasoning in which the premises are supposed to render the conclusion more probable (but not to entail the conclusion).

Cogent vs. valid, confirm vs. entail: 'Cogent' arguments have premises that confirm (render probable) their conclusions. 'Valid' arguments have premises that entail their conclusions.

The importance of induction:
• All scientific knowledge, and almost all knowledge, depends on induction.
• The problem had a great influence on Popper and other philosophers of science.

Inductive skepticism: Philosophical thesis that induction provides no justification for (no reason to believe) its conclusions.

II. An argument for inductive skepticism

1. There are (at most) 3 kinds of knowledge/justified belief:
a. Observations
b. A priori knowledge
c. Conclusions based on induction
2. All inductive reasoning presupposes the "Inductive Principle" (a.k.a. the "uniformity principle"): "The course of nature is uniform", "The future will resemble the past", "Unobserved objects will probably be similar to observed objects"
3.
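As an aside appended to this excerpt (not part of the original course notes), the two example inferences above have a standard probabilistic gloss in Laplace's rule of succession, which assigns a probability to the next case given uniformly favourable observed cases. The sketch assumes independent trials and a uniform prior over the unknown success rate; the counts are illustrative.

```python
def rule_of_succession(successes: int, trials: int) -> float:
    """Laplace's rule of succession: probability that the next trial is a
    success, given `successes` successes in `trials` independent trials and
    a uniform prior over the unknown success rate."""
    return (successes + 1) / (trials + 2)

# "The sun has risen every day for the last 300 years" (~109,500 days):
days = 300 * 365
print(rule_of_succession(days, days))      # ~0.999991

# "All observed ravens so far have been black" (say 1,000 observations):
print(rule_of_succession(1000, 1000))      # ~0.999002
```

Note that this gloss already presupposes the kind of uniformity assumption the notes go on to question, so it illustrates inductive reasoning rather than answering Hume's problem.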
Theories of Truth
7. Theories of truth

A summary sketch [1]

The object of this section is to sketch the main kinds of theories of truth which have been proposed, and to indicate how they relate to each other. (Subsequent sections will discuss some theories in detail.) Coherence theories take truth to consist in relations of coherence among a set of beliefs. Coherence theories were proposed e.g. by Bradley 1914, and also by some positivist opponents of idealism, such as Neurath 1932; more recently, Rescher 1973 and Dauer 1974 have defended this kind of approach. Correspondence theories take the truth of a proposition to consist, not in its relations to other propositions, but in its relation to the world, its correspondence to the facts. Theories of this kind were held by both Russell 1918 and Wittgenstein 1922, during the period of their adherence to logical atomism; Austin defended a version of the correspondence theory in 1950. The pragmatist theory, developed in the works of Peirce (see e.g. 1877), Dewey (see e.g. 1901) and James (see e.g. 1909), has affinities with both coherence and correspondence theories, allowing that the truth of a belief derives from its correspondence with reality, but stressing also that it is manifested by the belief's survival of test by experience, its coherence with other beliefs; the account of truth proposed in Dummett 1959 has, in turn, quite strong affinities with the pragmatist view.

[1] Proponents of the theories I shall discuss take different views about what kinds of items are truth-bearers.
Phenomenological Verificationism
Husserlian Verificationism

Gregor Bös

Tue 29th Sept, 2020. Research Seminar, Centre for Subjectivity Research, University of Copenhagen

Abstract

Verificationism is the name for a family of claims connecting meaning and epistemic access, mostly known from logical empiricism. The widespread rejection of verificationism was a necessary step for the revival of metaphysics in analytic philosophy, and has for this reason received much attention. Much less discussed are verificationist claims in the early phenomenological tradition. I aim to show in what sense and to what end Husserl commits to a form of verificationism. I then present the Church-Fitch argument as a general challenge to such a commitment.

Outline
1 Introduction: Unlikely Fellows
2 Verificationism in the Phenomenological Tradition
3 Verificationism in the Logical Empiricist Tradition
4 The Church-Fitch Argument against Epistemic Notions of Truth

Introduction: Context and Background Assumptions

Manifest and Scientific Images can get in conflict. Phenomenology promises to ground the scientific in the manifest. Overarching Goal: an explicit proposal for a phenomenological
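The abstract invokes the Church-Fitch argument without stating it. For reference, here is a standard reconstruction of the knowability paradox; this is the generic textbook version, not necessarily the formulation used in the talk.

```latex
% A standard reconstruction of the Church-Fitch knowability paradox.
% Kp = "it is known (by someone, at some time) that p"; \Diamond = "possibly".
\begin{align*}
\text{(KP)}\quad   & p \rightarrow \Diamond K p && \text{knowability: every truth is knowable} \\
\text{(NonO)}\quad & q \wedge \neg K q && \text{assumption: some truth } q \text{ is unknown} \\
(1)\quad & \Diamond K(q \wedge \neg K q) && \text{from (KP) with } p := q \wedge \neg K q \\
(2)\quad & K(q \wedge \neg K q) \rightarrow K q \wedge K \neg K q && \text{knowledge distributes over } \wedge \\
(3)\quad & K \neg K q \rightarrow \neg K q && \text{knowledge is factive} \\
(4)\quad & \neg K(q \wedge \neg K q) && \text{by (2), (3): otherwise } K q \wedge \neg K q \\
(5)\quad & \neg \Diamond K(q \wedge \neg K q) && \text{(4) is a theorem, so it holds of necessity} \\
& \therefore\ \neg\text{(KP)} \text{ or } \forall p\,(p \rightarrow K p) && \text{(1) and (5) contradict; (NonO) must fail}
\end{align*}
```

The upshot for any epistemic notion of truth is that the seemingly modest knowability principle collapses into the implausibly strong claim that every truth is actually known.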
The Criterion of Truth
THE CRITERION OF TRUTH

An article published in The Theosophical Movement magazine, February 2017

The stated object of science is to discover truths about the natural world through strictly objective observation, experimental verification and theoretical explanations, free from all subjective biases. Testability and reproducibility of the laws so discovered are the criterion of truth so far as modern science is concerned. Though science has made remarkable progress in unravelling laws of the physical world by empirical and inductive methods, and in their practical applications, leading scientists nevertheless confess that they have failed to develop a unified science with a single set of assumptions and terms to explain all observations as aspects of one coherent whole. In other words, they are still in search of one single universal principle whose theoretical formulation will reconcile many different specialized observations into one grand synthetic whole which explains all the mysteries of the natural world. They admit that such a universal unifying principle alone is truth and that its discovery, by means of the inductive method they adhere to, is still a far cry. Each one of the many sects of the world religions, resting on "Revelation," lays claim to be the sole possessor of truth. Inasmuch as there is no other fertile source of mutual hatred and strife in the world than differences between the various religious sects, it is evident that none of them has the whole truth. Whatever truth they had originally possessed has been so mixed up with human error and superstition over the centuries that they either repel the reasoning and thinking portions of humanity and cause them to fall into agnosticism, or induce blind faith and superstition in unthinking portions of the public who willingly submit to priestly authority.