Thermodynamics ≠ Information Theory: Science's Greatest Sokal Affair


Journal of Human Thermodynamics, 2012, 8(1):1–120
Open Access Journal of Human Thermodynamics
ISSN 1559-386X | HumanThermodynamics.com/Journal.html

Article

Libb Thims
Institute of Human Thermodynamics, Chicago, IL; [email protected]

Received: 26 Jun 2012; Reviewed: 14 Dec 2012 – 18 Dec 2012; Published: 19 Dec 2012

Abstract

This short article is a long-overdue, seven decades (1940 to present) delayed, inter-science departmental memorandum—though not the first—that INFORMATION THEORY IS NOT THERMODYNAMICS and thermodynamics is not information theory. We repeat again: information theory—the mathematical study of the transmission of information in binary format, and/or the study of the probabilistic decoding of keys and cyphers in cryptograms—is not thermodynamics! This point cannot be emphasized enough, nor restated in enough different ways. Information theory is not statistical mechanics; information theory is not statistical thermodynamics; information theory is not a branch of physics. Claude Shannon is not a thermodynamicist. Heat is not a binary digit. A telegraph operator is not a Maxwell's demon. The statistical theory of radiography and electrical signal transmission is NOT gas theory. The equations used in communication theory have absolutely nothing to do with the equations used in thermodynamics, statistical mechanics, or statistical thermodynamics. The logarithm is not a panacea. Boltzmann was not thinking of entropy in 1894 as 'missing information'. Concisely, the truncated abstract of this article is: thermodynamics ≠ information theory. In pictorial form, the abstract of this article is given as a figure (not reproduced here).

Neumann-Shannon anecdote

"Neumann's proposal to call Shannon's function defining information by the name 'entropy' opened a Pandora's box of intellectual confusion."
— Antoine Danchin (2001), French geneticist1

The roots of this article trace to what is known as the Neumann-Shannon anecdote: a famous, or rather infamous, depending on point of view, discussion between American mathematician and communications engineer Claude Shannon and Hungarian-born American mathematician and chemical engineer John Neumann, which occurred in the period between fall 1940 and spring 1941, during Shannon's postdoctoral year at the Institute for Advanced Study, Princeton, New Jersey. During that year Shannon was wavering on whether or not to call his new logarithm formulation of communications in signal transmissions by the name information—as in binary-coded information sent via a telegraph transmission—or uncertainty—of the Heisenberg uncertainty type, i.e. 'uncertainty' in the mind of someone about to receive a message.2

Several alternately worded versions of the anecdote exist.3 A popular version, accompanied by photographs of Claude Shannon and John Neumann, appears in the 2010 book Biomedical Informatics by German computer scientist and cognitive researcher Andreas Holzinger.4

Shannon, prior to this famously repeated 1940 conversation, had developed a long-standing passion for signal transmission. The earliest sign of this was when, as a boy, circa 1926, he used a telegraph machine to send Morse code messages to his neighborhood friend a ½-mile away through the barbed wire that ran along the road, as summarized below:5
American Claude Shannon (1916–2001), as a boy, circa 1926, in Gaylord, Michigan, hooked up a telegraph machine to the barbed wire fence (left) that ran along his country road so that he could communicate with his neighborhood friend a ½-mile away, using Morse code (center): an 1836 communication system developed by American Samuel Morse, according to which the telegraph system (right) could be used to send coded pulses—dots (short hold) or dashes (long hold)—of electric current along a wire, which controlled an electromagnet located at the receiving end of the telegraph system.5

The barbed wire photo shown, to note, is of a Texas-area farm rather than a Michigan-area farm—but the point is still the same: tapping out 'hey, what are you doing' on a single-button telegraph board—or, for that matter, typing your daily email on a modern 61-button Dvorak Simplified Keyboard—and the various coding formulas underlying the transmission of these communications are NOT measures of Boltzmann entropy.

To clarify, however, before proceeding further: there is a real physical-chemical perspective in which entropy—of the real thermodynamic variety—seems to be a quantitative factor in the sending of messages between people, whether via in-person verbal speech, post, telegraph, email, text messaging, Skype, etc. This, though, is an advanced subject of human chemical thermodynamics, which holds firstly that each person is a surface-attached reactive molecule, a twenty-six-element human molecule to be specific, and secondly that the binding and/or debinding of molecules—people—is completely 'determined', in the words of American-born Canadian biophysical chemist Julie Forman-Kay, by 'the free energy change of the interaction, composed of both enthalpic and entropic terms'.6 Hence, any interaction between two people, such as typing out 'LOL' in a text message, is a factor—composed of enthalpic and/or entropic terms—involved in the reactionary chemical bonding or debonding of two people.196

To clarify further: in this sense we are not referring to the 'physical irregularities and functional degradation, an unreliability resulting from the physical effect of the second law in the real world of signal transmissions'—the way in which, in the 2011 words of American anthropological neuroscientist Terrence Deacon, REAL entropy differs from SHANNON bit entropy—but rather to the way in which entropy plays a role in the spontaneous direction of chemical reactions, atomic or human.220 This, however, is a subject beyond the scope of the present article.7
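For reference, the free-energy criterion Forman-Kay describes above is conventionally written as the Gibbs relation below; this is the standard textbook form, not an equation reproduced from the original article:

```latex
% Gibbs free energy change of an interaction (standard textbook form, not from the article):
% enthalpic term \Delta H, entropic term -T\Delta S (S in J/K);
% the interaction is thermodynamically favorable (spontaneous) when \Delta G < 0
\[ \Delta G = \Delta H - T\,\Delta S \]
```

The ΔS appearing here is thermodynamic entropy, carrying units of joules per kelvin (the sense in which 'entropic terms' is meant in the quotation), rather than a count of bits.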
The objective herein is to show that so-called Shannon entropy (information-theoretic entropy)—specifically, the one use Shannon makes of the term 'entropy' in his 1945 article 'A Mathematical Theory of Cryptography' and the 152 uses he makes of it in his 1948 article 'A Mathematical Theory of Communication', namely: conditional entropy, entropy of two possibilities, entropy of a source, entropy of a joint event, the uncertainty (or entropy) of a joint event—here we see Shannon vacillating on terminology (an inconsistency point we will dig into in the following article)—entropy of a source per symbol of text, entropy per second, entropy per symbol of blocks, relative entropy, entropy of the approximations to English, entropy per unit time, entropy of symbols on a channel, entropy of the channel input per second, true entropy, actual entropy, maximum possible entropy, zero entropy, maximum entropy source, joint entropy of input and output, the entropy of the output when the input is known and conversely, entropy as a measure of uncertainty, input entropy to a channel, entropy of a continuous distribution, entropy of a discrete set of probabilities, entropy of an averaged distribution, entropy of a one-dimensional Gaussian distribution, the entropy as a measure in an absolute way of the randomness of a chance variable, new entropy, old entropy, entropy of an ensemble of functions, entropy of an ensemble per degree of freedom, maximum possible entropy of white noise, entropy of a continuous stochastic process, entropy power, maximum entropy for a given power, entropy power of a noise, entropy loss in linear filters, change of entropy with a change of coordinates, entropy power factor, entropy power gain (in decibels), final power entropy, initial power entropy, output power entropy, entropy power loss, entropy of a sum of two ensembles, entropy power of a distribution, definite entropy of noise, entropy of a received signal less the entropy of noise, maximum entropy of a received signal, greatest possible entropy for a power, entropy (per second) of a received ensemble, noise entropy, noise entropy power, entropy power of the noise, maximum entropy for the power, entropy power of the received signal, entropy power of the sum of two ensembles, entropy powers, individual entropy powers, maximum entropy power of a sum, entropy of the maximum distribution, entropy of the transmitted ensemble, entropy of the white noise, entropy of an underlying stochastic process, and maximum possible entropy for a power—have absolutely, positively, unequivocally NOTHING to do with thermodynamics, specifically the original entropy of Clausius, the ideal gas entropy of Boltzmann, or the statistical mechanical entropy of Gibbs. The above usages are but a cesspool of misadopted terminological confusion.

Historically, to continue, the amount of 'Pandora's box' confusion, as French geneticist Antoine Danchin puts it, surrounding the falsely-assumed notion that Shannon entropy equals Boltzmann entropy is immense.1 American evolutionary chemist Jeffrey Wicken, in 1987, cogently summarized things as follows (his italics, quote slightly rewritten, square brackets added):8

"It is a mistaken belief that Shannon's function—called 'entropy' by namesake misadoption in information theory (Shannon and Weaver 1949)—has resulted in a true generalization of the Carnot-Clausius state function [dQ/T] treatment and the Boltzmann-Gibbs statistical [H function] treatment of the original [thermodynamic] entropy formulation of heat [Q], thus, in a sense, freeing it from disciplinary framework of thermodynamics for use in probability distributions in general—which is a major impediment to productive communication [in science]."
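For the reader's reference, the quantities Wicken contrasts in the passage above take the following standard textbook forms; they are reproduced here from the general literature, not from the original article's own displayed equations:

```latex
% Standard textbook forms (general literature, not the article's own equations)

\[ dS = \frac{\delta Q_{\mathrm{rev}}}{T} \]        % Clausius entropy (1865): state function of heat, units of J/K

\[ S = k_{B} \ln W \]                               % Boltzmann entropy: W microstates of a macrostate, units of J/K

\[ S = -k_{B} \sum_{i} p_{i} \ln p_{i} \]           % Gibbs statistical-mechanical entropy over microstate probabilities, units of J/K

\[ H = -K \sum_{i} p_{i} \log p_{i} \]              % Shannon's H function over message-symbol probabilities;
                                                    % with K = 1 and a base-2 logarithm, H is measured in bits (a pure number)
```

The last two expressions differ only by a constant factor and by what the probabilities range over (molecular microstates versus message symbols); it is exactly this formal resemblance, the article argues, that was mistaken for a physical identity.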
In addition, this is not the first time this 'mistaken belief' issue has been pointed out. In 1953, British electrical engineer and cognitive scientist Colin Cherry, in his On Human Communication: A Review, a Survey, and a Criticism, gave his reflective opinion:9

"I had heard of 'entropies' of language, social systems, and economic systems and its use in various method-starved studies."
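As a concrete illustration of what an 'entropy of language' of the Shannon variety actually computes, the following minimal Python sketch (illustrative only; the message string and function name are not from the original article) evaluates the per-symbol bit entropy of the boyhood telegraph message quoted earlier:

```python
from collections import Counter
from math import log2

def shannon_bits_per_symbol(message: str) -> float:
    """Shannon H = -sum(p_i * log2(p_i)) over the symbol frequencies of `message`.

    The result is a pure number (bits per symbol); it carries no units of
    joules per kelvin and says nothing about heat, temperature, or gas theory.
    """
    counts = Counter(message)
    total = len(message)
    return -sum((n / total) * log2(n / total) for n in counts.values())

if __name__ == "__main__":
    msg = "hey, what are you doing"  # the boyhood telegraph message quoted above
    h = shannon_bits_per_symbol(msg)
    print(f"H = {h:.3f} bits/symbol, about {h * len(msg):.1f} bits for the whole message")
```

The result is a statistical property of the message alone, measured in bits; it is not a thermodynamic state function and carries no units of joules per kelvin, which is precisely the distinction the article insists on.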
Recommended publications
  • On Entropy, Information, and Conservation of Information
  • Bernard M. Oliver Oral History Interview
  • ENERGY, ENTROPY, and INFORMATION (Jean Thoma, June 1977)
  • Lecture 2: The First Law of Thermodynamics (Ch. 1)
  • Terms for Talking About Information and Communication
  • Great Physicists
  • Lecture 4: 09.16.05 Temperature, Heat, and Entropy
  • Systemic View of Parts of the World
  • Claude Elwood Shannon (1916–2001), by Solomon W. Golomb et al.
  • Thermodynamics
  • Secret Sharing and Algorithmic Information Theory Tarik Kaced
  • The Teleodynamics of Language, Culture, Technology and Science (LCT&S)