UC Berkeley Electronic Theses and Dissertations
Recommended publications
- The Teleodynamics of Language, Culture, Technology and Science (LCT&S)
Information 2013, 4, 94-116; doi:10.3390/info4010094. Open access, ISSN 2078-2489, www.mdpi.com/journal/information. Review. Robert K. Logan, Department of Physics, University of Toronto, 60 St. George Street, Toronto, ON M5S 1A7, Canada, and Strategic Innovation Lab, OCAD University, Toronto, ON M5T 1W1, Canada; E-Mail: [email protected]; Tel.: +1-416-361-5928. Received: 8 November 2012; in revised form: 30 January 2013; accepted: 2 February 2013; published: 7 February 2013.

Abstract: Logan [1] in his book The Extended Mind developed the hypothesis that language, culture, technology and science can be treated as organisms that evolve and reproduce themselves. This idea is extended here by making use of the notion of teleodynamics that Deacon [2] introduced and developed in his book Incomplete Nature to explain the nature of life, sentience, mind, and a self that acts in its own interest. It is suggested that language, culture, technology and science (LCT&S), like living organisms, also act in their own self-interest, are self-correcting, and are to a certain degree autonomous, even though they are obligate symbionts with their human hosts. Specifically, it will be argued that LCT&S are essentially teleodynamic systems, which Deacon defines as "self-creating, self-maintaining, self-reproducing, individuated systems" [2] (p. 325).

Keywords: language; culture; technology; science; teleodynamics; morphodynamics; thermodynamics; organism; obligate symbiont.

1. Introduction. "Although [teleodynamics] is the distinguishing characteristic of living processes, it is not necessarily limited to the biological." (Deacon) Terrence Deacon [2], in his book Incomplete Nature: How Mind Emerged from Matter, attempts to develop a scientific theory of how properties such as information, value, purpose, meaning, and end-directed behavior emerged from physics and chemistry.
- A Modern History of Probability Theory
Kevin H. Knuth, Depts. of Physics and Informatics, University at Albany (SUNY), Albany NY USA. Bayes Forum, 4/29/2016.

A Long History: The History of Probability Theory, Anthony J.M. Garrett, MaxEnt 1997, pp. 223-238; Hájek, Alan, "Interpretations of Probability", The Stanford Encyclopedia of Philosophy (Winter 2012 Edition), Edward N. Zalta (ed.), URL = <http://plato.stanford.edu/archives/win2012/entries/probability-interpret/>.

"… la théorie des probabilités n'est, au fond, que le bon sens réduit au calcul …" ("… the theory of probabilities is basically just common sense reduced to calculation …"), Pierre Simon de Laplace, Théorie Analytique des Probabilités.

(Slide: excerpt taken from Harold Jeffreys, Theory of Probability.)

"The terms certain and probable describe the various degrees of rational belief about a proposition which different amounts of knowledge authorise us to entertain. All propositions are true or false, but the knowledge we have of them depends on our circumstances; and while it is often convenient to speak of propositions as certain or probable, this expresses strictly a relationship in which they stand to a corpus of knowledge, actual or hypothetical, and not a characteristic of the propositions in themselves. A proposition is capable at the same time of varying degrees of this relationship, depending upon the knowledge to which it is related, so that it is without significance to call a proposition probable unless we specify the knowledge to which we are relating it." (John Maynard Keynes)
- A Defense of Equilibrium Reasoning in Economics, by Jennifer Soyun Jhun
Jennifer Soyun Jhun, BA in Philosophy and Economics, Northwestern University, 2008. Submitted to the Graduate Faculty of the Kenneth P. Dietrich School of Arts and Sciences in partial fulfillment of the requirements for the degree of Doctor of Philosophy, University of Pittsburgh, 2016. Defended on May 27, 2016. Committee: Robert Batterman, Professor of Philosophy, University of Pittsburgh; Sheldon Smith, Professor of Philosophy, University of California, Los Angeles. Dissertation advisors: Mark Wilson, Distinguished Professor of Philosophy, University of Pittsburgh, and James Woodward, Distinguished Professor of History and Philosophy of Science, University of Pittsburgh. Copyright © Jennifer Jhun, 2016.

Abstract: Critics both within and outside of philosophy have challenged economics wholesale as unscientific. In particular, economics seems unable to predict future events because it relies on assumptions like equilibrium conditions, which stipulate that the economy tends to stay in its current state absent external forces. The popular background view that gives rise to this criticism is that the job of science is to uncover laws of nature, by appeal to which we can determine (usually deductively) the future behavior of a dynamical system as it evolves. I argue that lawlike statements in economics have a very different role than this: they provide a means of understanding in terms of how efficient a particular system is.
- Macroscopic Time Evolution and MaxEnt Inference for Closed Systems with Hamiltonian Dynamics
Domagoj Kuić, Paško Županović, and Davor Juretić, University of Split, Faculty of Science, N. Tesle 12, 21000 Split, Croatia.

Abstract: The MaxEnt inference algorithm and information theory are relevant for the time evolution of macroscopic systems considered as a problem of incomplete information. Two different MaxEnt approaches are introduced in this work, both applied to the prediction of time evolution for closed Hamiltonian systems. The first is based on the Liouville equation for the conditional probability distribution, introduced as a strict microscopic constraint on time evolution in phase space. The conditional probability distribution is defined for the set of microstates associated with the set of phase space paths determined by solutions of Hamilton's equations. The MaxEnt inference algorithm, with Shannon's concept of the conditional information entropy, is then applied to prediction, consistently with this strict microscopic constraint on time evolution in phase space. The second approach is based on the same concepts, with the difference that the Liouville equation for the conditional probability distribution is introduced as a macroscopic constraint given by a phase space average. We consider the incomplete nature of our information about microscopic dynamics in a rational way that is consistent with Jaynes' formulation of predictive statistical mechanics and with the concept of macroscopic reproducibility for time-dependent processes. Maximization of the conditional information entropy subject to this macroscopic constraint leads to a loss of correlation between the initial phase space paths and final microstates. The information entropy is the theoretical upper bound on the conditional information entropy, with the upper bound attained only in the case of a complete loss of correlation.
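For orientation, the constrained maximization that any MaxEnt algorithm performs can be written out in its generic textbook form (following Jaynes); this is background for the abstract above, not the paper's specific Liouville-equation constraints:

```latex
% Standard MaxEnt setup (Jaynes); generic form, not the paper's constraints.
% Maximize the information entropy subject to normalization and a set of
% expectation-value constraints F_k on functions f_k:
\max_{p}\; S[p] = -\sum_{i} p_i \log p_i
\quad\text{s.t.}\quad \sum_i p_i = 1, \qquad \sum_i p_i \, f_k(x_i) = F_k .
% Introducing Lagrange multipliers \lambda_k yields the canonical solution
p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big),
\qquad Z(\lambda) = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(x_i)\Big).
```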
- The Transition from Constraint to Regulation at the Origin of Life
[Frontiers in Bioscience 19, 945-957, June 1, 2014] Terrence W. Deacon, Alok Srivastava, and Joshua Augustus Bacigalupi, Department of Anthropology, University of California, Berkeley, CA 94720.

Table of contents: 1. Abstract; 2. Introduction (2.1. Living thermodynamics; 2.2. Context-limited versus self-limiting dissipation; 2.3. The autopoietic dilemma; 2.4. The formal versus physical logic of biological regulation); 3. Autogenesis (3.1. Constraints on constraint-production; 3.2. From synergistic constraint generation to regulation); 4. Conditional autogenesis (4.1. The emergence of cybernetic regulation; 4.2. Template regulated autogenesis: an imaginative scenario); 5. Conclusions; 6. References.

Abstract: The origin of living dynamics required a local evasion of thermodynamic degradation by maintaining critical dynamical and structural constraints. Scenarios for life's origin that fail to distinguish between constrained chemistry and regulated metabolism do not address the question of how living processes first emerge from simpler constraints on molecular interactions. We describe a molecular model system consisting of coupled reciprocal catalysis and self-assembly in which one of the catalytic bi-products tends to spontaneously self-assemble into a containing shell (analogous to a viral capsule).

2. Introduction. 2.1. Living thermodynamics: Living organisms are thermodynamically and biochemically open physicochemical systems. They constantly exchange matter and energy with their physicochemical environment and yet are constrained within physical boundaries and structures that are maintained through dynamic processes. The physicochemical processes that constitute living organisms tend to persist in states maintained far from thermodynamic and chemical equilibrium, whereas non-living
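To give a feel for what "coupled reciprocal catalysis and self-assembly" could look like dynamically, here is a deliberately toy simulation sketch; every rate law, constant, and variable name below is an assumption invented for illustration and is not the authors' model:

```python
# Toy sketch of autogenesis-like dynamics, for illustration only:
# two reciprocal catalysts (a, b) promote each other's production, and a
# by-product (c) of their coupled reactions aggregates into shell material (s).
# All rate laws and constants are assumptions, not the paper's model system.
from scipy.integrate import solve_ivp

def rates(t, y, k1=0.8, k2=0.8, k3=0.5, decay=0.3):
    a, b, c, s = y
    da = k1 * b - decay * a              # a is produced in b-catalyzed reactions
    db = k2 * a - decay * b              # b is produced in a-catalyzed reactions
    dc = k3 * a * b - c**2 - decay * c   # by-product c accumulates, pairs up...
    ds = c**2                            # ...and irreversibly joins the shell s
    return [da, db, dc, ds]

sol = solve_ivp(rates, (0.0, 10.0), [0.1, 0.1, 0.0, 0.0])
a, b, c, s = sol.y[:, -1]
print(f"final state: a={a:.3f}, b={b:.3f}, c={c:.3f}, shell={s:.3f}")
```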
- A Brief Overview of Probability Theory in Data Science, by Geert Verdoolaege
Geert Verdoolaege, Department of Applied Physics, Ghent University, Ghent, Belgium, and Laboratory for Plasma Physics, Royal Military Academy (LPP-ERM/KMS), Brussels, Belgium. Tutorial, 3rd IAEA Technical Meeting on Fusion Data Processing, Validation and Analysis, 27-05-2019.

Overview: 1. Origins of probability; 2. Frequentist methods and statistics; 3. Principles of Bayesian probability theory; 4. Monte Carlo computational methods; 5. Applications (classification, regression analysis); 6. Conclusions and references.

Early history of probability: earliest traces in Western civilization in Jewish writings and in Aristotle; a notion of probability in law, based on evidence; usage in finance; usage and demonstration in gambling. Middle Ages: the world is knowable, but uncertainty remains due to human ignorance; William of Ockham and Ockham's razor; probabilis, a supposedly 'provable' opinion; counting of authorities; later, degree of truth on a scale. Quantification: law and faith → the Bayesian notion; gaming → the frequentist notion. 17th century: Pascal, Fermat, Huygens; comparative testing of hypotheses; population statistics. 1713: Ars Conjectandi by Jacob Bernoulli, with the weak law of large numbers and the principle of indifference. De Moivre (1718): The Doctrine of Chances. Bayes and Laplace: paper by Thomas Bayes (1763): inversion
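Since the deck singles out Bernoulli's weak law of large numbers, its modern statement is worth recording here; this is the standard formulation, not text taken from the slides:

```latex
% Weak law of large numbers (standard modern statement).
% For i.i.d. X_1, X_2, \ldots with mean \mu and finite variance,
% the sample mean converges to \mu in probability:
\lim_{n \to \infty}
\Pr\!\left( \Big| \tfrac{1}{n}\textstyle\sum_{i=1}^{n} X_i - \mu \Big| > \varepsilon \right) = 0
\quad \text{for every } \varepsilon > 0 .
```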
- Maximum Entropy: The Universal Method for Inference, by Adom Giffin
Adom Giffin. A dissertation submitted to the University at Albany, State University of New York, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, College of Arts & Sciences, Department of Physics, 2008.

Abstract: In this thesis we start by providing some detail regarding how we arrived at our present understanding of probabilities and how we manipulate them: the product and addition rules of Cox. We also discuss the modern view of entropy and how it relates to known entropies such as the thermodynamic entropy and the information entropy. Next, we show that Skilling's method of induction leads us to a unique general theory of inductive inference, the ME method, and precisely how other entropies such as those of Renyi or Tsallis are ruled out for problems of inference. We then explore the compatibility of Bayes and ME updating. After pointing out the distinction between Bayes' theorem and Bayes' updating rule, we show that Bayes' rule is a special case of ME updating by translating information in the form of data into constraints that can be processed using ME. This implies that ME is capable of reproducing every aspect of orthodox Bayesian inference, and it proves the complete compatibility of Bayesian and entropy methods. We illustrate this by showing that ME can be used to derive two results traditionally in the domain of Bayesian statistics: Laplace's succession rule and Jeffrey's conditioning rule. The realization that the ME method incorporates Bayes' rule as a special case allows us to go beyond Bayes' rule and to process both data and expected value constraints simultaneously.
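The claim that Bayes' rule falls out of ME updating can be summarized compactly; the following sketch states the standard result (in the spirit of the Caticha-Giffin formulation) rather than reproducing the thesis's own derivation:

```latex
% ME updating: given a joint prior q(x, \theta) and an observed datum x = x',
% maximize the relative entropy of the posterior p with respect to q,
%   S[p, q] = -\int p(x,\theta) \log \frac{p(x,\theta)}{q(x,\theta)} \, dx \, d\theta,
% subject to the data constraint p(x) = \delta(x - x').
% The maximizer reproduces Bayes' rule for the marginal over \theta:
p(\theta) = q(\theta \mid x') = \frac{q(\theta)\, q(x' \mid \theta)}{q(x')} .
```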
- Thermodynamics of Surface Phenomena
J. C. Melrose, Mobil Research and Development Corporation, P.O. Box 900, Dallas, Texas 75221, U.S.A.

Abstract: The thermodynamic treatment of the interfacial region corresponding to two fluid phases in contact is discussed. The features of this analysis which are reviewed include the classical treatment of Gibbs and the extensions to this treatment which are due to Buff. It is shown that these extensions are essential if the logical structure of the analysis is to be regarded as complete.

1. Introduction. The thin, nonhomogeneous region separating two homogeneous bulk phases in contact constitutes an interface. It is generally recognized that an adequate thermodynamic treatment of such a region must be based on the work of Gibbs [1]. Nevertheless, the literature contains a number of proposals for modifying various features of this treatment. It is, in fact, remarkable that no other important contribution of Gibbs to the understanding of the equilibrium states of heterogeneous substances has given rise to so many reservations and attempts to develop alternative treatments. The proposed modifications are usually concerned with one or more of several concepts which are characteristic of the Gibbsian treatment. The first of these concepts involves the notion of a mathematical dividing or reference surface, located within or very near the nonhomogeneous interfacial region. This surface serves to define the geometrical configuration of the interfacial region and also partitions the volume of the system between the two bulk phases. A second feature of the Gibbs treatment which is occasionally challenged is the definition of the chemical potentials appropriate to the components present in an interfacial region.
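For context, the central working relation that the dividing-surface construction supports is the Gibbs adsorption equation; this is the standard isothermal textbook form, not an equation quoted from Melrose's paper:

```latex
% Gibbs adsorption isotherm (standard form, constant temperature).
% \gamma: interfacial tension; \Gamma_i: surface excess of component i per
% unit area, defined relative to the chosen dividing surface; \mu_i: chemical
% potential of component i. Shifting the dividing surface changes the \Gamma_i
% individually but leaves the physical content of the relation intact.
d\gamma = -\sum_{i} \Gamma_i \, d\mu_i
```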
- Nature, Science, Bayes' Theorem, and the Whole of Reality
Moorad Alexanian, Department of Physics and Physical Oceanography, University of North Carolina Wilmington, Wilmington, NC 28403-5606.

Abstract: A fundamental problem in science is how to make logical inferences from scientific data. Mere data does not suffice, since additional information is necessary to select a domain of models or hypotheses and thus determine the likelihood of each model or hypothesis. Thomas Bayes' theorem relates the data and prior information to the posterior probabilities associated with differing models or hypotheses, and thus is useful in identifying the roles played by the known data and the assumed prior information when making inferences. Scientists, philosophers, and theologians accumulate knowledge when analyzing different aspects of reality and search for particular hypotheses or models to fit their respective subject matters. Of course, a main goal is then to integrate all kinds of knowledge into an all-encompassing worldview that would describe the whole of reality. A generous description of the whole of reality would span, in order of complexity, from the purely physical to the supernatural. These two extreme aspects of reality are bridged by a nonphysical realm, which would include elements of life, man, consciousness, rationality, mental and mathematical abstractions, etc. An urgent problem in the theory of knowledge is what science is and what it is not. Albert Einstein's notion of science in terms of sense perception is refined by defining operationally the data that makes up the subject matter of science. It is shown, for instance, that the theological considerations included in the prior information assumed by Isaac Newton are irrelevant in relating the data logically to the model or hypothesis.
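Since the argument turns on how prior information enters an inference, it may help to display Bayes' theorem with the prior information made explicit; this is the standard form, added here for reference:

```latex
% Bayes' theorem with the prior information I shown explicitly.
% H: hypothesis or model; D: data; I: assumed prior information.
P(H \mid D, I) = \frac{P(D \mid H, I)\, P(H \mid I)}{P(D \mid I)}
```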
- Distribution Transformer Semantics for Bayesian Machine Learning
Johannes Borgström (Microsoft Research), Andrew D. Gordon (Microsoft Research), Michael Greenberg (University of Pennsylvania), James Margetson (Microsoft Research), and Jurgen Van Gael (Microsoft FUSE Labs).

Abstract: The Bayesian approach to machine learning amounts to inferring posterior distributions of random variables from a probabilistic model of how the variables are related (that is, a prior distribution) and a set of observations of variables. There is a trend in machine learning towards expressing Bayesian models as probabilistic programs. As a foundation for this kind of programming, we propose a core functional calculus with primitives for sampling prior distributions and observing variables. We define novel combinators for distribution transformers, based on theorems in measure theory, and use these to give a rigorous semantics to our core calculus. The original features of our semantics include its support for discrete, continuous, and hybrid distributions, and observations of zero-probability events. We compile our core language to a small imperative language that in addition to the distribution transformer semantics also has a straightforward semantics via factor graphs, data structures that enable many efficient inference algorithms. We then use an existing inference engine for efficient approximate inference of posterior marginal distributions, treating thousands of observations per second for large instances of realistic models.

1. Introduction. In the past 15 years, statistical machine learning has unified many seemingly unrelated methods through the Bayesian paradigm. With a solid understanding of the theoretical foundations, advances in algorithms for inference, and numerous applications, the Bayesian paradigm is now the state of the art for learning from data.
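To illustrate the sample/observe programming style the abstract describes, here is a minimal, self-contained sketch; it is a generic importance-sampling interpretation written for this summary, not the paper's calculus, its measure-theoretic combinators, or its factor-graph compilation, and all names are invented for the sketch:

```python
# Minimal sample/observe probabilistic program, run by self-normalized
# importance sampling: draw parameters from the prior, weight each trace
# by the likelihood of the observations, then average.
import math
import random

def model(data):
    """Prior: bias ~ Uniform(0, 1); observe each coin flip in `data`."""
    bias = random.random()                 # sample: draw from the prior
    log_w = 0.0
    for flip in data:                      # observe: weight the trace by
        p = bias if flip else 1.0 - bias   # the likelihood of each datum
        log_w += math.log(p) if p > 0.0 else float("-inf")
    return bias, log_w

def posterior_mean(data, n_samples=100_000):
    """Estimate E[bias | data] from likelihood-weighted prior samples."""
    total_w = total_wx = 0.0
    for _ in range(n_samples):
        bias, log_w = model(data)
        w = math.exp(log_w)
        total_w += w
        total_wx += w * bias
    return total_wx / total_w

# Eight flips with six heads: the exact posterior is Beta(7, 3), mean 0.7.
print(posterior_mean([1, 1, 1, 1, 1, 1, 0, 0]))
```

Engines like the one the paper targets avoid this brute-force weighting by exploiting factor-graph structure, which is what makes thousands of observations per second feasible.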
- Report from the Chair, by Robert H. Romer
History of Physics Newsletter, A Forum of the American Physical Society, Volume IX, No. 5, Fall 2005. Report from the Chair, by Robert H. Romer, Amherst College, Forum Chair.

2005, the World Year of Physics, has been a good one for the History Forum. I want to take advantage of this opportunity to describe some of FHP's activities during recent months and to look forward to the coming year. The single most important forum event of 2005 was the presentation of the first Pais Prize in the History of Physics to Martin Klein of Yale University. It was only shortly before the award ceremony, at the Tampa meeting in April, that funding reached the level at which this honor could be promoted from "Award" to "Prize." We are all indebted to the many generous donors and to the members of the Pais Award Committee and the Pais Selection Committee for their hard work over the last several years that

The Forum sponsored several sessions of invited lectures at the March meeting (in Los Angeles) and the April meeting (in Tampa), which are more fully described elsewhere in this Newsletter. At Los Angeles we had two invited sessions under the general rubric of "Einstein and Friends." At Tampa, we had a third such Einstein session, as well as a good session on "Quantum Optics Through the Lens of History" and then a final series of talks on "The Rise of Megascience." A new feature of our invited sessions this year is the "named lecture." The purpose of naming a lecture is to pay tribute to a distinguished physicist while simultaneously encouraging donations to support the travel expenses of speakers.
- Biological Evolution, Cultural Evolution and the Evolution of Language: Review of Daniel Dennett's From Bacteria to Bach and Back, by Przemysław Żywiczyński
THEORIA ET HISTORIA SCIENTIARUM, Vol. XVI, Nicolaus Copernicus University, 2019. DOI: http://dx.doi.org/10.12775/ths.2019.010. Przemysław Żywiczyński, Center for Language Evolution Studies, Nicolaus Copernicus University; [email protected].

Abstract: Daniel Dennett is one of the giants of contemporary philosophy. His new book, From Bacteria to Bach and Back, does reiterate old motifs, such as the "strange inversion of reasoning" and "production without comprehension". But it is first and foremost a new project, whose goal is to calibrate the theory of universal Darwinism to the very recent developments in science, technology and our lifestyles, the most important of which is the coming of Artificial Intelligence. What Dennett does in the new book offers us "thinking tools" (his own phrase) to understand this changing reality by means of basic Darwinian principles.

Keywords: universal Darwinism; AI; cognitive science; Darwinian Spaces; biological evolution; cultural evolution; evolution of language; memetics.

Introduction. Daniel Dennett is the type of writer who does not produce works of minor importance. All his books, starting with the collection of essays Brainstorms (1978), grapple with the grandest philosophical problems: mind and free will, morality, or the nature and scope of evolutionary processes. However, some of his titles stand out, not only due to the breath-taking range of subjects they cover and the subtlety of exposition these subjects are given, but also due to the impact they have exerted on contemporary philosophical debate.