What Is Shannon Information?
PDF, 16 pages, 1020 KB
Recommended publications
Information in Reality: Logic and Metaphysics
tripleC 9(2): 332-341, 2011. ISSN 1726-670X. http://www.triple-c.at. Joseph E. Brenner, Chemin du Collège, Les Diablerets, Switzerland. Abstract: The recent history of information theory and science shows a trend in emphasis from quantitative measures to qualitative characterizations. In parallel, aspects of information are being developed, for example by Pedro Marijuan, Wolfgang Hofkirchner and others, that are extending the notion of qualitative, non-computational information in the biological and cognitive domain to include meaning and function. However, there is as yet no consensus on whether a single acceptable definition or theory of the concept of information is possible, leading to many attempts to view it as a complex, a notion with varied meanings, or a group of different entities. In my opinion, the difficulties in developing a Unified Theory of Information (UTI) that would include its qualitative and quantitative aspects and their relation to meaning are a consequence of implicit or explicit reliance on the principles of standard, truth-functional bivalent or multivalent logics. In reality, information processes, like those of time, change and human consciousness, are contradictory: they are regular and irregular; consistent and inconsistent; continuous and discontinuous. Since the indicated logics cannot accept real contradictions, they have been incapable of describing the multiple but interrelated characteristics of information. The framework for the discussion of information in this paper will be the new extension of logic to real complex processes that I have made, Logic in Reality (LIR), which is grounded in the dualities and self-dualities of quantum physics and cosmology.
Librarianship and the Philosophy of Information
Herold, Ken R., "Librarianship and the Philosophy of Information" (2005). Library Philosophy and Practice (e-journal), 27, DigitalCommons@University of Nebraska-Lincoln, July 2005. https://digitalcommons.unl.edu/libphilprac/27. Also: Library Philosophy and Practice Vol. 3, No. 2 (Spring 2001), www.uidaho.edu/~mbolin/lppv3n2.htm, ISSN 1522-0222. Ken R. Herold, Systems Manager, Burke Library, Hamilton College, Clinton, NY 13323. "My purpose is to tell of bodies which have been transformed into shapes of a different kind." Ovid, Metamorphoses. Part I. Library Philosophy. Provocation: Information seems to be ubiquitous, diaphanous, a-categorical, discrete, a-dimensional, and knowing. · Ubiquitous. Information is ever-present and pervasive in our technology and beyond, in our thinking about the world, appearing to be a generic 'thing' arising from all of our contacts with each other and our environment, whether thought of in terms of communication or cognition. For librarians, information is a universal concept, at its greatest extent total in content and comprehensive in scope, even though we may not agree that all information is library information. · Diaphanous. Due to its virtuality, the manner in which information has the capacity to make an effect, information is freedom. In many aspects it exhibits a transparent quality, a window-like clarity, as between source and patron in an ideal interface or a perfect exchange without bias.
Package 'infotheo'
Package 'infotheo', February 20, 2015. Title: Information-Theoretic Measures. Version: 1.2.0. Author and maintainer: Patrick E. Meyer <[email protected]>. License: GPL (>= 3). URL: http://homepage.meyerp.com/software. Repository: CRAN. Date/Publication: 2014-07-26. Description: This package implements various measures of information theory based on several entropy estimators. R topics documented: condentropy, condinformation, discretize, entropy, infotheo, interinformation, multiinformation, mutinformation, natstobits. condentropy: conditional entropy computation. Description: condentropy takes two random vectors, X and Y, as input and returns the conditional entropy, H(X|Y), in nats (base e), according to the entropy estimator method. If Y is not supplied, the function returns the entropy of X (see entropy). Usage: condentropy(X, Y=NULL, method="emp"). Arguments: X, a data.frame denoting a random variable or random vector, where columns contain variables/features and rows contain outcomes/samples; Y, a data.frame denoting a conditioning random variable or random vector, with the same layout; method, the name of the entropy estimator. The package implements four estimators, "emp", "mm", "shrink" and "sg" (default: "emp"); these estimators require discrete data values (see discretize). Details: "emp" computes the entropy of the empirical probability distribution; "mm" is the Miller-Madow asymptotic bias-corrected empirical estimator; "shrink" is a shrinkage estimate of the entropy of a Dirichlet probability distribution; "sg" is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.
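The "emp" estimator is the plain plug-in estimate. The package itself is an R library; purely as an illustration of the concept, and not the package's code, the following minimal Python sketch computes that empirical estimate of H(X|Y) in nats as H(X,Y) - H(Y) from discrete samples:

```python
from collections import Counter
from math import log

def cond_entropy_emp(x, y):
    """Empirical ("emp"-style) estimate of the conditional entropy H(X|Y)
    in nats, computed as H(X, Y) - H(Y) from discrete samples."""
    n = len(x)
    joint = Counter(zip(x, y))   # empirical joint distribution of (X, Y)
    marg_y = Counter(y)          # empirical marginal distribution of Y
    h_xy = -sum((c / n) * log(c / n) for c in joint.values())
    h_y = -sum((c / n) * log(c / n) for c in marg_y.values())
    return h_xy - h_y

# X is fully determined by Y here, so H(X|Y) should be ~0 nats.
x = [0, 0, 1, 1, 2, 2]
y = [5, 5, 6, 6, 7, 7]
print(cond_entropy_emp(x, y))
```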
The Logic of Being Informed
Logique & Analyse 196 (2006). Luciano Floridi. Abstract: One of the open problems in the philosophy of information is whether there is an information logic (IL), different from epistemic logic (EL) and doxastic logic (DL), which formalises the relation "a is informed that p" (Iap) satisfactorily. In this paper, the problem is solved by arguing that the axiom schemata of the normal modal logic (NML) KTB (also known as B or Br or Brouwer's system) are well suited to formalise the relation of "being informed". After having shown that IL can be constructed as an informational reading of KTB, four consequences of a KTB-based IL are explored: information overload; the veridicality thesis (Iap → p); the relation between IL and EL; and the Kp → Bp principle or entailment property, according to which knowledge implies belief. Although these issues are discussed later in the article, they are the motivations behind the development of IL. Introduction: As anyone acquainted with modal logic (ML) knows, epistemic logic (EL) formalises the relation "a knows that p" (Kap), whereas doxastic logic (DL) formalises the relation "a believes that p" (Bap). One of the open problems in the philosophy of information (Floridi, 2004c) is whether there is also an information logic (IL), different from EL and from DL, that formalises the relation "a is informed that p" (Iap) equally well. The keyword here is "equally", not "well". One may contend that EL and DL do not capture the relevant relations very well, or even not well at all.
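For readers without the KTB schemata to hand, they read as follows under the informational reading, with Ia as the modal operator (a standard recap of the modal-logic background, not text from the excerpt itself):

```latex
% KTB axiom schemata under the informational reading I_a ("a is informed that ...")
\begin{align*}
\text{(K)}\quad & I_a(\varphi \to \psi) \to (I_a\varphi \to I_a\psi) \\
\text{(T)}\quad & I_a\varphi \to \varphi \qquad \text{(the veridicality thesis)} \\
\text{(B)}\quad & \varphi \to I_a \lnot I_a \lnot \varphi
\end{align*}
```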
Information Theory and the Length Distribution of All Discrete Systems
Les Hatton (Faculty of Science, Engineering and Computing, Kingston University, London, UK) and Gregory Warr (Medical University of South Carolina, Charleston, South Carolina, USA). arXiv:1709.01712 [q-bio.OT], 6 Sep 2017. Abstract: We begin with the extraordinary observation that the length distribution of 80 million proteins in UniProt, the Universal Protein Resource, measured in amino acids, is qualitatively identical to the length distribution of large collections of computer functions measured in programming language tokens, at all scales. That two such disparate discrete systems share important structural properties suggests that yet other apparently unrelated discrete systems might share the same properties, and certainly invites an explanation. We demonstrate that this is inevitable for all discrete systems of components built from tokens or symbols. Departing from existing work by embedding the Conservation of Hartley-Shannon Information (CoHSI) in a classical statistical mechanics framework, we identify two kinds of discrete system, heterogeneous and homogeneous. Heterogeneous systems contain components built from a unique alphabet of tokens and yield an implicit CoHSI distribution with a sharp unimodal peak asymptoting to a power-law. Homogeneous systems contain components each built from just one kind of token unique to that component and yield a CoHSI distribution corresponding to Zipf's law. This theory is applied to heterogeneous systems (proteome, computer software, music); homogeneous systems (language texts, abundance of the elements); and to systems in which both heterogeneous and homogeneous behaviour co-exist (word frequencies and word length frequencies in language texts).
Chapter 4 Information Theory
From W. D. Penny's Signal Processing Course lecture notes. This lecture covers entropy, joint entropy, mutual information and minimum description length; see the texts by Cover and MacKay for a more comprehensive treatment. Measures of information: Information on a computer is represented by binary bit strings. Decimal numbers can be represented using a binary encoding in which the position of the binary digit indicates its decimal equivalent, such that if there are N bits, the i-th bit represents the decimal number 2^(N-i). Bit 1 is referred to as the most significant bit and bit N as the least significant bit. To encode M different messages requires log2(M) bits. (The accompanying table of binary encodings, with columns Bit 3, Bit 2, Bit 1, Bit 0 and Decimal, is not recoverable from this excerpt.) Entropy: A second table gives the probability of occurrence p(x_i), to two decimal places, of selected letters x_i of the English alphabet (a, e, j, q, t, z), taken from MacKay's book on information theory, together with the information content of each letter, h(x_i) = log2(1/p(x_i)); the numeric values are not reproduced here. The information content is a measure of surprise: if we had to guess what a randomly chosen letter of the English alphabet was going to be, we would say it was an A, E, T or other frequently occurring letter; if it turned out to be a Z, we would be surprised. The letter E is so common that it is unusual to find a sentence without one. An exception is the novel Gadsby by Ernest Vincent Wright, which was written without using the letter E.
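As a quick illustration of these definitions, the short Python sketch below computes the surprisal h(x) = log2(1/p(x)) and the entropy of a distribution; the letter probabilities used are made-up round numbers for demonstration, not MacKay's values:

```python
from math import log2

def surprisal(p):
    """Information content h(x) = log2(1/p(x)) in bits."""
    return log2(1.0 / p)

def entropy(probs):
    """Shannon entropy H = sum_i p_i * log2(1/p_i) in bits."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# Hypothetical letter probabilities (illustrative only):
# common letters carry little surprise, rare letters carry a lot.
p = {"e": 0.12, "t": 0.09, "z": 0.001}
for letter, prob in p.items():
    print(letter, round(surprisal(prob), 2), "bits")

# Entropy of a uniform distribution over 8 outcomes: log2(8) = 3 bits.
print(entropy([1 / 8] * 8))
```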
Guide for the Use of the International System of Units (SI)
NIST Special Publication 811, 2008 Edition. Ambler Thompson (Technology Services) and Barry N. Taylor (Physics Laboratory), National Institute of Standards and Technology, Gaithersburg, MD 20899. Supersedes NIST Special Publication 811, April 1995 Edition. March 2008. U.S. Department of Commerce, Carlos M. Gutierrez, Secretary; National Institute of Standards and Technology, James M. Turner, Acting Director. Natl. Inst. Stand. Technol. Spec. Publ. 811, 2008 Ed., 85 pages (March 2008; 2nd printing November 2008), CODEN: NSPUE3. Note on 2nd printing: the 2nd printing, dated November 2008, corrects a number of minor typographical errors present in the 1st printing dated March 2008. Preface: The International System of Units, universally abbreviated SI (from the French Le Système International d'Unités), is the modern metric system of measurement. Long the dominant measurement system used in science, the SI is becoming the dominant measurement system used in international commerce. The Omnibus Trade and Competitiveness Act of August 1988 [Public Law (PL) 100-418] changed the name of the National Bureau of Standards (NBS) to the National Institute of Standards and Technology (NIST) and gave to NIST the added task of helping U.S. …
On Measures of Entropy and Information
Gavin E. Crooks, Tech. Note 009 v0.7, http://threeplusone.com/info, 2018-09-22. Contents: 0. Notes on notation and nomenclature. 1. Entropy: entropy; joint entropy; marginal entropy; conditional entropy. 2. Mutual information: mutual information; multivariate mutual information; interaction information; conditional mutual information; binding information; residual entropy; total correlation; lautum information; uncertainty coefficient. 3. Relative entropy: relative entropy; cross entropy. 5. Csiszár f-divergences: Csiszár f-divergence; dual f-divergence; symmetric f-divergences; K-divergence; fidelity; Hellinger discrimination; Pearson divergence; Neyman divergence; LeCam discrimination; skewed K-divergence; alpha-Jensen-Shannon-entropy. 6. Chernoff divergence: Chernoff divergence; Chernoff coefficient; Rényi divergence; alpha-divergence; Cressie-Read divergence; Tsallis divergence; Sharma-Mittal divergence.
On the Irresistible Efficiency of Signal Processing Methods in Quantum Computing
Andreas Klappenecker (1,2) and Martin Rötteler (2). (1) Department of Computer Science, Texas A&M University, College Station, TX 77843-3112, USA; (2) Institut für Algorithmen und Kognitive Systeme, Universität Karlsruhe, Am Fasanengarten 5, D-76128 Karlsruhe, Germany. arXiv:quant-ph/0111039v1, 7 Nov 2001. Abstract: We show that many well-known signal transforms allow highly efficient realizations on a quantum computer. We explain some elementary quantum circuits and review the construction of the Quantum Fourier Transform. We derive quantum circuits for the Discrete Cosine and Sine Transforms, and for the Discrete Hartley Transform. We show that at most O(log^2 N) elementary quantum gates are necessary to implement any of those transforms for input sequences of length N. 1. Introduction: Quantum computers have the potential to solve certain problems at much higher speed than any classical computer. Some evidence for this statement is given by Shor's algorithm to factor integers in polynomial time on a quantum computer. A crucial part of Shor's algorithm depends on the discrete Fourier transform. The time complexity of the quantum Fourier transform is polylogarithmic in the length of the input signal. It is natural to ask whether other signal transforms allow for similar speed-ups. We briefly recall some properties of quantum circuits and construct the quantum Fourier transform. The main part of this paper is concerned with the construction of quantum circuits for the discrete Cosine transforms, for the discrete Sine transforms, and for the discrete Hartley transform.
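For context, the Quantum Fourier Transform referred to above acts on the computational basis states of an N-dimensional register as follows (a standard textbook definition, not quoted from the excerpt):

```latex
% Standard definition of the quantum Fourier transform on N basis states
\mathrm{QFT}_N \colon \; |x\rangle \;\longmapsto\; \frac{1}{\sqrt{N}} \sum_{y=0}^{N-1} e^{2\pi i x y / N}\, |y\rangle ,
\qquad x \in \{0, 1, \dots, N-1\}.
```

For N = 2^n this map can be realized with O(n^2) = O(log^2 N) Hadamard and controlled-phase gates, which is the same order of gate count the paper establishes for the cosine, sine, and Hartley transforms.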
Religious Fundamentalism in Eight Muslim-Majority Countries: Reconceptualization and Assessment
Journal for the Scientific Study of Religion. Mansoor Moaddel (Department of Sociology, University of Maryland) and Stuart A. Karabenick (Combined Program in Education and Psychology, University of Michigan). Abstract: To capture the common features of diverse fundamentalist movements, overcome etymological variability, and assess predictors, religious fundamentalism is conceptualized as a set of beliefs about and attitudes toward religion, expressed in a disciplinarian deity, literalism, exclusivity, and intolerance. Evidence from representative samples of over 23,000 adults in Egypt, Iraq, Jordan, Lebanon, Pakistan, Saudi Arabia, Tunisia, and Turkey supports the conclusion that fundamentalism is stronger in countries where religious liberty is lower, religion less fractionalized, state structure less fragmented, regulation of religion greater, and the national context less globalized. Among individuals within countries, fundamentalism is linked to religiosity, confidence in religious institutions, belief in religious modernity, belief in conspiracies, xenophobia, fatalism, weaker liberal values, trust in family and friends, reliance on less diverse information sources, lower socioeconomic status, and membership in an ethnic majority or dominant religion/sect. We discuss implications of these findings for understanding fundamentalism and the need for further research. Keywords: fundamentalism, Islam, Christianity, Sunni, Shia, Muslim-majority countries.
A Flow-Based Entropy Characterization of a NATed Network and Its Application on Intrusion Detection
J. Crichigno (1), E. Kfoury (1), E. Bou-Harb (2), N. Ghani (3), Y. Prieto (4), C. Vega (4), J. Pezoa (4), C. Huang (1), D. Torres (5). (1) Integrated Information Technology Department, University of South Carolina, Columbia, SC, USA; (2) Cyber Threat Intelligence Laboratory, Florida Atlantic University, Boca Raton, FL, USA; (3) Electrical Engineering Department, University of South Florida, Tampa, FL, USA; (4) Electrical Engineering Department, Universidad de Concepción, Concepción, Chile; (5) Department of Mathematics, Northern New Mexico College, Española, NM, USA. Abstract: This paper presents a flow-based entropy characterization of a small/medium-sized campus network that uses network address translation (NAT). Although most networks follow this configuration, their entropy characterization has not been previously studied. Measurements from a production network show that the entropies of flow elements (external IP address, external port, campus IP address, campus port) and tuples have particular characteristics. Findings include: i) entropies may widely vary in the course of a day; for example, in a typical weekday, the entropies of the campus and external ports may vary from below 0.2 to above 0.8 (in a normalized scale) … flow information is collected by the device and then exported for storage and analysis; thus, the performance impact is minimal and no additional capturing devices are needed [3]-[5]. Entropy has been used in the past to detect anomalies without requiring payload inspection. Its use is appealing because it provides more information on flow elements (ports, addresses, tuples) than traffic volume analysis. Entropy has also been used for anomaly detection in backbones and large networks.
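The normalized entropies quoted above (values between 0 and 1) can be reproduced in a few lines. The sketch below is not the paper's code and the port values are hypothetical; it divides the Shannon entropy of the empirical distribution of a flow element by its maximum, log2 of the number of distinct values observed:

```python
from collections import Counter
from math import log2

def normalized_entropy(values):
    """Shannon entropy of the empirical distribution of `values`,
    normalized by log2 of the number of distinct values, so the
    result lies in [0, 1] (0 = fully concentrated, 1 = uniform)."""
    counts = Counter(values)
    n = sum(counts.values())
    h = -sum((c / n) * log2(c / n) for c in counts.values())
    k = len(counts)
    return h / log2(k) if k > 1 else 0.0

# Example: external ports observed in sampled flows (hypothetical values).
ports = [443, 443, 80, 53, 443, 8080, 53, 443]
print(round(normalized_entropy(ports), 3))
```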
Still Minding the Gap? Reflecting on Transitions Between Concepts of Information in Varied Domains
City Research Online, City, University of London Institutional Repository. Citation: Bawden, D. (ORCID: 0000-0002-0478-6456) and Robinson, L. (ORCID: 0000-0001-5202-8206) (2020). Still minding the gap? Reflecting on transitions between concepts of information in varied domains. Information. This is the accepted version of the paper and may differ from the final published version. Permanent repository link: https://openaccess.city.ac.uk/id/eprint/23561/. Review. David Bawden and Lyn Robinson, Centre for Information Science, City, University of London, Northampton Square, London EC1V 0HB, United Kingdom. Abstract: This conceptual paper, a contribution to the tenth anniversary special issue of Information, gives a cross-disciplinary review of general and unified theories of information.