Nature, Science, Bayes' Theorem, and the Whole of Reality
Recommended publications
- The Open Handbook of Formal Epistemology
Richard Pettigrew & Jonathan Weisberg, Eds. Published open access by PhilPapers, 2019. All entries copyright © their respective authors and licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Contributors: R. A. Briggs (Stanford University), Michael Caie (University of Toronto), Kenny Easwaran (Texas A&M University), Konstantin Genin (University of Toronto), Franz Huber (University of Toronto), Jason Konek (University of Bristol), Hanti Lin (University of California, Davis), Anna Mahtani (London School of Economics), Johanna Thoma (London School of Economics), Michael G. Titelbaum (University of Wisconsin, Madison), Sylvia Wenmackers (Katholieke Universiteit Leuven). "Overall, and ultimately, mathematical methods are necessary for philosophical progress." — Hannes Leitgeb. "There is no mathematical substitute for philosophy." — Saul Kripke. Preface: In formal epistemology, we use mathematical methods to explore the questions of epistemology and rational choice. What can we know? What should we believe, and how strongly? How should we act based on our beliefs and values? We begin by modelling phenomena like knowledge, belief, and desire using mathematical machinery, just as a biologist might model the fluctuations of a pair of competing populations, or a physicist might model the turbulence of a fluid passing through a small aperture. Then we explore, discover, and justify the laws governing those phenomena, using the precision that mathematical machinery affords. For example, we might represent a person by the strengths of their beliefs, and we might measure these using real numbers, which we call credences. Having done this, we might ask what the norms are that govern that person when we represent them in that way.
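One classic norm studied in this tradition is probabilism: credences should obey the probability axioms. As a toy illustration (our sketch, not from the handbook; the worlds and numbers are made up), the following checks whether a credence assignment over a small possibility space is coherent, i.e. whether each credence equals the total probability of the worlds where the proposition holds.

```python
from itertools import product

# Possibility space: truth assignments to two atomic propositions p and q.
worlds = list(product([True, False], repeat=2))

# A hypothetical probability distribution over worlds (illustrative numbers).
world_prob = dict(zip(worlds, [0.3, 0.2, 0.4, 0.1]))

# Credences in some propositions; each proposition is a function of a world.
credences = {"p": 0.5, "q": 0.7, "p or q": 0.9}
propositions = {
    "p": lambda w: w[0],
    "q": lambda w: w[1],
    "p or q": lambda w: w[0] or w[1],
}

# Probabilism: the credence in A should equal the total probability of the
# worlds in which A is true.
for name, cred in credences.items():
    implied = sum(pr for w, pr in world_prob.items() if propositions[name](w))
    print(f"{name}: credence = {cred}, implied probability = {implied:.2f}")
```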
- The Bayesian Approach to the Philosophy of Science
Michael Strevens, for the Macmillan Encyclopedia of Philosophy, second edition. The posthumous publication, in 1763, of Thomas Bayes' "Essay Towards Solving a Problem in the Doctrine of Chances" inaugurated a revolution in the understanding of the confirmation of scientific hypotheses—two hundred years later. Such a long period of neglect, followed by such a sweeping revival, ensured that it was the inhabitants of the latter half of the twentieth century above all who determined what it was to take a "Bayesian approach" to scientific reasoning. Like most confirmation theorists, Bayesians alternate between a descriptive and a prescriptive tone in their teachings: they aim both to describe how scientific evidence is assessed, and to prescribe how it ought to be assessed. This double message will be made explicit at some points, but passed over quietly elsewhere. Subjective Probability: The first of the three fundamental tenets of Bayesianism is that the scientist's epistemic attitude to any scientifically significant proposition is, or ought to be, exhausted by the subjective probability the scientist assigns to the proposition. A subjective probability is a number between zero and one that reflects in some sense the scientist's confidence that the proposition is true. (Subjective probabilities are sometimes called degrees of belief or credences.) A scientist's subjective probability for a proposition is, then, more a psychological fact about the scientist than an observer-independent fact about the proposition. Very roughly, it is not a matter of how likely the truth of the proposition actually is, but of how likely the scientist thinks it to be.
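In this approach, updating on evidence is governed by Bayes' theorem; in the standard notation (our gloss of the background machinery, not a formula quoted from the entry), with h a hypothesis and e evidence:

```latex
P(h \mid e) = \frac{P(e \mid h)\, P(h)}{P(e)}
```

The subjective probabilities described above supply the prior P(h) that this rule updates.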
- Most Honourable Remembrance. The Life and Work of Thomas Bayes
Andrew I. Dale. Springer-Verlag, New York, 2003. 668 pages with 29 illustrations. The author of this book is a professor in the Department of Mathematical Statistics at the University of Natal in South Africa. Andrew I. Dale is a widely known expert in the history of mathematics, and he is also the author of the book "A History of Inverse Probability: From Thomas Bayes to Karl Pearson" (Springer-Verlag, 2nd ed., 1999). The book is very erudite and reflects the wide experience and knowledge of the author, not only concerning the history of science but also the social history of Europe during the eighteenth century. The book is appropriate for statisticians and mathematicians, and also for students with an interest in the history of science. Chapters 4 to 8 contain the texts of the main works of Thomas Bayes. Each of these chapters has an introduction, the corresponding tract, and commentaries. The main works of Thomas Bayes are the following. Chapter 4: Divine Benevolence, or an attempt to prove that the principal end of the Divine Providence and Government is the happiness of his creatures. Being an answer to a pamphlet, entitled, "Divine Rectitude; or, An Inquiry concerning the Moral Perfections of the Deity". With a refutation of the notions therein advanced concerning Beauty and Order, the reason of punishment, and the necessity of a state of trial antecedent to perfect Happiness. This is a work of a theological kind published in 1731; the approaches developed in it are not far from rationalist thinking.
- Maty's Biography of Abraham De Moivre, Translated, Annotated and Augmented
David R. Bellhouse and Christian Genest. Statistical Science, 2007, Vol. 22, No. 1, 109–136. DOI: 10.1214/088342306000000268. © Institute of Mathematical Statistics, 2007. Abstract: November 27, 2004, marked the 250th anniversary of the death of Abraham De Moivre, best known in statistical circles for his famous large-sample approximation to the binomial distribution, whose generalization is now referred to as the Central Limit Theorem. De Moivre was one of the great pioneers of classical probability theory. He also made seminal contributions in analytic geometry, complex analysis and the theory of annuities. The first biography of De Moivre, on which almost all subsequent ones have since relied, was written in French by Matthew Maty. It was published in 1755 in the Journal britannique. The authors provide here, for the first time, a complete translation into English of Maty's biography of De Moivre. New material, much of it taken from modern sources, is given in footnotes, along with numerous annotations designed to provide additional clarity to Maty's biography for contemporary readers. Introduction: Matthew Maty (1718–1776) was born of Huguenot parentage in the city of Utrecht, in Holland. He studied medicine and philosophy at the University of Leiden before immigrating to England in 1740. After a decade in London, he edited for six years the Journal britannique, a French-language publication. [...] émigrés that both of them are known to have frequented. In the weeks prior to De Moivre's death, Maty began to interview him in order to write his biography. De Moivre died shortly after giving his reminiscences up to the late 1680s, and Maty completed the task using only his own knowledge of the man and De Moivre's published work.
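De Moivre's approximation mentioned above is easy to see numerically. A minimal sketch (ours; the numbers are illustrative) compares the exact Binomial(n, p) probability at k with the normal density having mean np and variance np(1-p), per the de Moivre-Laplace theorem:

```python
import math

# Compare the exact binomial pmf with De Moivre's normal approximation.
n, p, k = 100, 0.5, 55

exact = math.comb(n, k) * p**k * (1 - p) ** (n - k)
mu, var = n * p, n * p * (1 - p)
approx = math.exp(-((k - mu) ** 2) / (2 * var)) / math.sqrt(2 * math.pi * var)

print(f"exact binomial pmf:   {exact:.5f}")   # ~0.04847
print(f"normal approximation: {approx:.5f}")  # ~0.04839
```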
- The Theory That Would Not Die, Reviewed by Andrew I. Dale
Book Review, reviewed by Andrew I. Dale. The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy, by Sharon Bertsch McGrayne. Yale University Press, April 2011. US$27.50, 336 pages. ISBN-13: 978-0-300-16969-0. In the early 1730s Thomas Bayes (1701?–1761) was appointed minister at the Presbyterian Meeting House on Mount Sion, Tunbridge Wells, a town that had developed around the restorative chalybeate spring discovered there by Dudley, Lord North, in 1606. Apparently not a particularly popular preacher, Bayes would be recalled today, if at all, merely as one of the minor clergy of eighteenth-century England who also dabbled in mathematics. The solution, given more geometrico as Proposition 10 in [2], can be written today in a somewhat compressed form as: posterior probability ∝ likelihood × prior probability. Price himself added an appendix in which he used the proposition in a prospective sense to find the probability of the sun's rising tomorrow given that it has arisen daily a million times. Later use was made by Laplace, to whom one perhaps really owes modern Bayesian methods. The degree to which one uses, or even supports, Bayes's Theorem (in some form or other) depends to a large extent on one's views on the nature of probability. Setting this point aside, one finds that the Theorem is generally used to update (to justify the updating of?) information in the light of new evidence as the latter is received.
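The compressed form of Bayes' rule quoted in the review is easy to make concrete. A minimal sketch (ours, with made-up numbers): deciding between two hypotheses about a coin after observing a single head.

```python
# Discrete Bayes update: posterior ∝ likelihood × prior, then normalize.
# Hypothetical hypotheses: the coin is fair (P(heads) = 0.5) or biased (0.8).
priors = {"fair": 0.5, "biased": 0.5}
likelihood_heads = {"fair": 0.5, "biased": 0.8}

unnorm = {h: priors[h] * likelihood_heads[h] for h in priors}
total = sum(unnorm.values())
posterior = {h: v / total for h, v in unnorm.items()}
print(posterior)  # {'fair': 0.3846..., 'biased': 0.6153...}
```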
- A Modern History of Probability Theory
Kevin H. Knuth, Departments of Physics and Informatics, University at Albany (SUNY), Albany NY USA. Bayes Forum, 4/29/2016. A Long History: The History of Probability Theory, Anthony J.M. Garrett, MaxEnt 1997, pp. 223-238; Hájek, Alan, "Interpretations of Probability", The Stanford Encyclopedia of Philosophy (Winter 2012 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/win2012/entries/probability-interpret/>. "… la théorie des probabilités n'est, au fond, que le bon sens réduit au calcul …" ("… the theory of probabilities is basically just common sense reduced to calculation …") — Pierre Simon de Laplace, Théorie Analytique des Probabilités. [Slide reproduces a passage taken from Harold Jeffreys, "Theory of Probability".] "The terms certain and probable describe the various degrees of rational belief about a proposition which different amounts of knowledge authorise us to entertain. All propositions are true or false, but the knowledge we have of them depends on our circumstances; and while it is often convenient to speak of propositions as certain or probable, this expresses strictly a relationship in which they stand to a corpus of knowledge, actual or hypothetical, and not a characteristic of the propositions in themselves. A proposition is capable at the same time of varying degrees of this relationship, depending upon the knowledge to which it is related, so that it is without significance to call a proposition probable unless we specify the knowledge to which we are relating it." — John Maynard Keynes
- History of Probability (Part 4): Inverse Probability and the Determination of Causes of Observed Events
Thomas Bayes (c. 1702-1761). By the early 1700s the work of Pascal, Fermat, and Huygens was well known, mainly couched in terms of odds and fair bets in gambling. Jacob Bernoulli and Abraham De Moivre had made efforts to broaden the scope and interpretation of probability. Bernoulli had put probability measures on a scale between zero and one, and De Moivre had defined probability as a fraction of chances. But then there was a 50-year lull in further development in probability theory. This is surprising, but the "brains" of the day were busy applying the newly invented calculus to problems of physics, especially astronomy. At the beginning of the 18th century a probability exercise like the following one was not too difficult. Suppose there are two boxes with white and black balls. Box A has 4 white and 1 black; Box B has 2 white and 3 black. You pick one box at random: with probability 1/3 it will be Box A, and with probability 2/3 it will be Box B. Then you randomly pick one ball from the box. What is the probability you will end up with a white ball? We can model this scenario by a tree diagram. We also have the advantage of efficient symbolism: expressions like P(W) = 8/15 had not been developed by 1700. The idea of the probability of a specific event was just becoming useful in addition to the older emphasis on odds of one outcome versus another. Using the multiplication rule for independent events together with the addition rule, we find that the probability of picking a white ball is 8/15.
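A short sketch of the box exercise (ours, matching the numbers above), including the inverse question this part of the series is about: given that a white ball was observed, which box was its likely cause?

```python
# Two-box exercise: law of total probability, then the inverse (Bayes) question.
p_box = {"A": 1 / 3, "B": 2 / 3}    # probability of picking each box
p_white = {"A": 4 / 5, "B": 2 / 5}  # P(white | box): A has 4W/1B, B has 2W/3B

# Forward question: P(white), by the multiplication and addition rules.
p_w = sum(p_box[b] * p_white[b] for b in p_box)
print(p_w)  # 0.5333... = 8/15

# Inverse question: P(box | white), the cause of the observed white ball.
posterior = {b: p_box[b] * p_white[b] / p_w for b in p_box}
print(posterior)  # {'A': 0.5, 'B': 0.5}
```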
Bayes Rule in Perception, Action and Cognition
Daniel M. Wolpert and Zoubin Ghahramani, Department of Engineering, University of Cambridge, Trumpington Street, Cambridge. Cognition and intelligent behaviour are fundamentally tied to the ability to survive in an uncertain and changing environment. Our only access to knowledge about the world is through our senses, which provide information that is usually corrupted by random fluctuations, termed noise, and may provide ambiguous information about the possible states of the environment. Moreover, when we act on the world through our motor system, the commands we send to our muscles are also corrupted by variability or noise. This combined sensory and motor variability limits the precision with which we can perceive and act on the world. Here we will review the framework of Bayesian decision theory, which has emerged as a principled approach to handling uncertain information in an attempt to behave optimally, and how this framework can be used to understand sensory, motor and cognitive processes. Bayesian theory is named after Thomas Bayes (Figure 1), an 18th-century English Presbyterian minister. He is known to have published only two works during his life, only one of which dealt with mathematics; in it he defended the logical foundation of Isaac Newton's methods against contemporary criticism. After Bayes' death, his friend Richard Price found an interesting mathematical proof among Bayes' papers and sent the paper to the editor of the Philosophical Transactions of the Royal Society, stating: "I now send you an essay which I have found among the papers of our deceased friend Mr Bayes, and which, in my opinion, has great merit...." The paper was subsequently published in 1764 as "Essay towards solving a problem in the doctrine of chances." In the latter half of the 20th century, Bayesian approaches became a mainstay of statistics, and a more general framework has now emerged, termed Bayesian decision theory (BDT).
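A standard illustration of this framework (our sketch with invented numbers, not a figure from the paper): combining a Gaussian prior over some state of the world with a noisy Gaussian measurement gives a Gaussian posterior whose mean is a precision-weighted average of the two.

```python
# Gaussian prior × Gaussian likelihood → Gaussian posterior (conjugate update).
# Hypothetical values: estimating an object's position from one noisy cue.
prior_mean, prior_var = 0.0, 4.0  # prior belief about the position
obs, obs_var = 2.0, 1.0           # noisy sensory measurement

# Precisions (inverse variances) add; the means combine weighted by precision.
post_var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
post_mean = post_var * (prior_mean / prior_var + obs / obs_var)

print(post_mean, post_var)  # 1.6, 0.8: pulled toward the more reliable cue
```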
- A Brief Overview of Probability Theory in Data Science, by Geert Verdoolaege
Geert Verdoolaege, Department of Applied Physics, Ghent University, Ghent, Belgium, and Laboratory for Plasma Physics, Royal Military Academy (LPP–ERM/KMS), Brussels, Belgium. Tutorial, 3rd IAEA Technical Meeting on Fusion Data Processing, Validation and Analysis, 27-05-2019. Overview: 1. Origins of probability; 2. Frequentist methods and statistics; 3. Principles of Bayesian probability theory; 4. Monte Carlo computational methods; 5. Applications (classification, regression analysis); 6. Conclusions and references. Early history of probability: the earliest traces in Western civilization are in Jewish writings and Aristotle; the notion of probability appears in law (based on evidence), in finance, and in gambling. Middle Ages: the world is knowable, but uncertainty is due to human ignorance; William of Ockham and Ockham's razor; "probabilis" as a supposedly 'provable' opinion, established by counting authorities; later, a degree of truth on a scale. Quantification: law and faith → the Bayesian notion; gaming → the frequentist notion. 17th century: Pascal, Fermat, Huygens; comparative testing of hypotheses; population statistics. 1713: Ars Conjectandi by Jacob Bernoulli, with the weak law of large numbers and the principle of indifference; De Moivre (1718): The Doctrine of Chances. Bayes and Laplace: paper by Thomas Bayes (1763): inversion.
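Since the outline pairs Bayesian probability theory with Monte Carlo computation, here is the basic Monte Carlo idea in a minimal sketch (ours, not from the slides): estimate a probability by simulating and averaging.

```python
import random

# Monte Carlo estimate of P(X > 2) for X ~ N(0, 1); exact value ≈ 0.0228.
random.seed(1)
n = 1_000_000
hits = sum(1 for _ in range(n) if random.gauss(0.0, 1.0) > 2.0)
print(hits / n)
```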
- Bayesian Statistics: Thomas Bayes to David Blackwell
Kathryn Chaloner, Department of Biostatistics and Department of Statistics & Actuarial Science, University of Iowa ([email protected], or [email protected]). Field of Dreams, Arizona State University, Phoenix AZ, November 2013. Probability: (1) What is the probability of "heads" on a toss of a fair coin? (2) What is the probability of "six" uppermost on a roll of a fair die? (3) What is the probability that the 100th digit after the decimal point of the decimal expansion of π equals 3? (4) What is the probability that Rome, Italy, is north of Washington DC, USA? (5) What is the probability that the sun rises tomorrow? (Laplace) My answers: (1) 1/2, (2) 1/6, (3) 1/10, (4) 0.99; Laplace's answer to (5): 0.9999995. Interpretations of Probability: There are several interpretations of probability, and the interpretation leads to methods for inference under uncertainty. The two most common interpretations are: (1) as a long-run frequency (often the only interpretation in an introductory statistics course); (2) as a subjective degree of belief. You cannot put a long-run frequency on an event that cannot be repeated: the 100th digit of π is or is not 3, and it is constant no matter how often you calculate it; similarly, Rome is north or south of Washington DC. The Mathematical Concept of Probability, first the sample space: probability theory is derived from a set of rules and definitions. Define a sample space S, and A a set of subsets of S (events) with specific properties.
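Laplace's answer to question (5) comes from his rule of succession: after n consecutive successes, the probability of a further success is (n+1)/(n+2). A quick check (our sketch; Laplace counted roughly 5000 years of recorded daily sunrises):

```python
# Laplace's rule of succession: P(success on trial n+1 | n successes) = (n+1)/(n+2).
n = 5000 * 365.2426       # about 5000 years of daily sunrises
p_sunrise = (n + 1) / (n + 2)
print(f"{p_sunrise:.7f}")  # 0.9999995
```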
- Maximum Entropy: The Universal Method for Inference
By Adom Giffin. A dissertation submitted to the University at Albany, State University of New York, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, College of Arts & Sciences, Department of Physics, 2008. Abstract: In this thesis we start by providing some detail regarding how we arrived at our present understanding of probabilities and how we manipulate them, namely the product and addition rules of Cox. We also discuss the modern view of entropy and how it relates to known entropies such as the thermodynamic entropy and the information entropy. Next, we show that Skilling's method of induction leads us to a unique general theory of inductive inference, the ME method, and precisely how other entropies such as those of Rényi or Tsallis are ruled out for problems of inference. We then explore the compatibility of Bayes and ME updating. After pointing out the distinction between Bayes' theorem and the Bayes updating rule, we show that Bayes' rule is a special case of ME updating by translating information in the form of data into constraints that can be processed using ME. This implies that ME is capable of reproducing every aspect of orthodox Bayesian inference and proves the complete compatibility of Bayesian and entropy methods. We illustrate this by showing that ME can be used to derive two results traditionally in the domain of Bayesian statistics: Laplace's succession rule and Jeffrey's conditioning rule. The realization that the ME method incorporates Bayes' rule as a special case allows us to go beyond Bayes' rule and to process both data and expected value constraints simultaneously.
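The expected value constraints mentioned in the abstract can be made concrete with the textbook case (our sketch, not Giffin's code): Jaynes' Brandeis dice problem. The maximum entropy distribution over a finite set subject to a fixed mean is exponential in a Lagrange multiplier, which can be found numerically.

```python
import math

# Maximum entropy over die faces x = 1..6 subject to E[x] = 4.5: the solution
# has the form p_i ∝ exp(lam * x_i). Find lam by bisection; the constrained
# mean is monotone increasing in lam.
xs = [1, 2, 3, 4, 5, 6]
target = 4.5

def mean_given(lam):
    w = [math.exp(lam * x) for x in xs]
    return sum(x * wi for x, wi in zip(xs, w)) / sum(w)

lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    lo, hi = (mid, hi) if mean_given(mid) < target else (lo, mid)

lam = (lo + hi) / 2
z = sum(math.exp(lam * x) for x in xs)
p = [math.exp(lam * x) / z for x in xs]
print([round(pi, 4) for pi in p])  # weights skewed toward high faces; mean 4.5
```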
- Bounds on the Entropy of a Binary System with Known Mean and Pairwise Constraints
Badr Faisal Albanna, UC Berkeley Electronic Theses and Dissertations, 2013. Permalink: https://escholarship.org/uc/item/1sx6w3qg. A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Physics in the Graduate Division of the University of California, Berkeley. Committee in charge: Professor Michael R. DeWeese (Chair), Professor Ahmet Yildiz, Professor David Presti. Fall 2013. Abstract: Maximum entropy models are increasingly being used to describe the collective activity of neural populations with measured mean neural activities and pairwise correlations, but the full space of probability distributions consistent with these constraints has not been explored. In this dissertation, I provide lower and upper bounds on the entropy for both the minimum and maximum entropy distributions over binary units with any fixed set of mean values and pairwise correlations, and I construct distributions for several relevant cases. Surprisingly, the minimum entropy solution has entropy scaling logarithmically with system size, unlike the possible linear behavior of the maximum entropy solution, for any set of first- and second-order statistics consistent with arbitrarily large systems.