Shannon's Formula W log(1 + SNR): A Historical Perspective

Total Pages: 16

File Type: PDF, Size: 1020 KB

Shannon's Formula W log(1 + SNR): A Historical Perspective
On the occasion of Shannon's Centenary, Oct. 26th, 2016
Olivier Rioul <[email protected]>

Outline
  • Who is Claude Shannon?
  • Shannon's Seminal Paper
  • Shannon's Main Contributions
  • Shannon's Capacity Formula
      – Hartley's rule C0 = log2(1 + A/∆) is not Hartley's
      – Many authors independently derived C = (1/2) log2(1 + P/N) in 1948
      – Hartley's rule is exact: C0 = C (a coincidence?)
      – C0 is the capacity of the "uniform" channel
  • Shannon's Conclusion

Who is Claude Shannon?
Claude Shannon (1916–2001): 100th birthday in 2016. On April 30, 1916, Claude Elwood Shannon was born in Petoskey, Michigan, USA. On April 30, 2016, the centennial day was celebrated by a Google doodle in which Shannon juggles with bits (1, 0, 0) in his communication scheme: the "father of the information age".

Do you know Claude Shannon? "The most important man... you've never heard of."

Well-Known Scientific Heroes: Alan Turing (1912–1954); John Nash (1928–2015).

The Quiet and Modest Life of Shannon: Shannon with juggling props; Shannon's toys room. Shannon is known for riding through the halls of Bell Labs on a unicycle while simultaneously juggling four balls.

Crazy Machines
  • Theseus, the labyrinth mouse
  • a calculator in Roman numerals
  • a machine for playing the switching game Hex (1951)
  • a Rubik's cube solver
  • a 3-ball juggling machine
  • a wearable computer to predict roulette in casinos (with Edward Thorp)
  • the ultimate useless machine

"Serious" Work: at the same time, Shannon made decisive theoretical advances in logic & circuits, cryptography, artificial intelligence, stock investment, wearable computing... and information theory!

Shannon's Seminal Paper
One article (written 1940–48): A REVOLUTION! "The Mathematical Theory of Communication" (Bell System Technical Journal, 1948).

Shannon's Theorems: yes, it's maths! 1.
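As a quick numerical illustration of the title formula (an added sketch; the channel numbers below are illustrative assumptions, not values from the slides), Shannon's capacity C = W log2(1 + SNR) can be evaluated directly:

```python
import math

def shannon_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon's formula C = W * log2(1 + SNR), in bits per second."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative (assumed) values: a 3 kHz telephone-grade channel at 30 dB SNR.
snr_linear = 10 ** (30.0 / 10)               # 30 dB -> a linear SNR of 1000
print(shannon_capacity(3000.0, snr_linear))  # ~29,900 bit/s
```

Note that doubling the bandwidth doubles the capacity, while at high SNR doubling the SNR adds only about one bit per second per hertz, which is why the formula is usually quoted with the SNR on a logarithmic (dB) scale.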
Recommended publications
  • Information Theory and the Length Distribution of All Discrete Systems
    Information Theory and the Length Distribution of all Discrete Systems. Les Hatton (Faculty of Science, Engineering and Computing, Kingston University, London, UK) and Gregory Warr (Medical University of South Carolina, Charleston, South Carolina, USA). arXiv:1709.01712v1 [q-bio.OT], 6 Sep 2017.
    Abstract: We begin with the extraordinary observation that the length distribution of 80 million proteins in UniProt, the Universal Protein Resource, measured in amino acids, is qualitatively identical to the length distribution of large collections of computer functions measured in programming-language tokens, at all scales. That two such disparate discrete systems share important structural properties suggests that yet other apparently unrelated discrete systems might share the same properties, and certainly invites an explanation. We demonstrate that this is inevitable for all discrete systems of components built from tokens or symbols. Departing from existing work by embedding the Conservation of Hartley-Shannon Information (CoHSI) in a classical statistical-mechanics framework, we identify two kinds of discrete system, heterogeneous and homogeneous. Heterogeneous systems contain components built from a unique alphabet of tokens and yield an implicit CoHSI distribution with a sharp unimodal peak asymptoting to a power law. Homogeneous systems contain components each built from just one kind of token unique to that component, and yield a CoHSI distribution corresponding to Zipf's law. This theory is applied to heterogeneous systems (proteome, computer software, music); homogeneous systems (language texts, abundance of the elements); and to systems in which both heterogeneous and homogeneous behaviour co-exist (word frequencies and word-length frequencies in language texts).
  • Guide for the Use of the International System of Units (SI)
    Guide for the Use of the International System of Units (SI). NIST Special Publication 811, 2008 Edition. Ambler Thompson (Technology Services) and Barry N. Taylor (Physics Laboratory), National Institute of Standards and Technology, Gaithersburg, MD 20899. (Supersedes NIST Special Publication 811, April 1995 Edition.) March 2008. U.S. Department of Commerce, Carlos M. Gutierrez, Secretary; National Institute of Standards and Technology, James M. Turner, Acting Director. Natl. Inst. Stand. Technol. Spec. Publ. 811, 2008 Ed., 85 pages (March 2008; 2nd printing November 2008). CODEN: NSPUE3. Note on 2nd printing: this 2nd printing, dated November 2008, corrects a number of minor typographical errors present in the 1st printing dated March 2008.
    Preface: The International System of Units, universally abbreviated SI (from the French Le Système International d'Unités), is the modern metric system of measurement. Long the dominant measurement system used in science, the SI is becoming the dominant measurement system used in international commerce. The Omnibus Trade and Competitiveness Act of August 1988 [Public Law (PL) 100-418] changed the name of the National Bureau of Standards (NBS) to the National Institute of Standards and Technology (NIST) and gave to NIST the added task of helping U.S.
  • On the Irresistible Efficiency of Signal Processing Methods in Quantum Computing
    On the Irresistible Efficiency of Signal Processing Methods in Quantum Computing. Andreas Klappenecker (Department of Computer Science, Texas A&M University, College Station, TX 77843-3112, USA) and Martin Rötteler (Institut für Algorithmen und Kognitive Systeme, Universität Karlsruhe, Am Fasanengarten 5, D-76128 Karlsruhe, Germany). arXiv:quant-ph/0111039v1, 7 Nov 2001.
    Abstract: We show that many well-known signal transforms allow highly efficient realizations on a quantum computer. We explain some elementary quantum circuits and review the construction of the Quantum Fourier Transform. We derive quantum circuits for the Discrete Cosine and Sine Transforms, and for the Discrete Hartley Transform. We show that at most O(log² N) elementary quantum gates are necessary to implement any of those transforms for input sequences of length N.
    §1 Introduction: Quantum computers have the potential to solve certain problems at much higher speed than any classical computer. Some evidence for this statement is given by Shor's algorithm to factor integers in polynomial time on a quantum computer. A crucial part of Shor's algorithm depends on the discrete Fourier transform. The time complexity of the quantum Fourier transform is polylogarithmic in the length of the input signal. It is natural to ask whether other signal transforms allow for similar speed-ups. We briefly recall some properties of quantum circuits and construct the quantum Fourier transform. The main part of this paper is concerned with the construction of quantum circuits for the discrete Cosine transforms, for the discrete Sine transforms, and for the discrete Hartley transform.
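For readers who want to see the transform concretely, here is a minimal numpy sketch of the quantum Fourier transform as a matrix. This illustrates only the transform's definition (the unitary DFT), not the paper's circuit constructions:

```python
import numpy as np

def qft_matrix(n_qubits: int) -> np.ndarray:
    """Unitary DFT matrix of size N = 2**n_qubits: F[j, k] = omega**(j*k) / sqrt(N)."""
    N = 2 ** n_qubits
    omega = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N), indexing="ij")
    return omega ** (j * k) / np.sqrt(N)

F = qft_matrix(3)  # 8x8 matrix, i.e. a 3-qubit QFT
# Unitarity check: F times its conjugate transpose is the identity.
assert np.allclose(F @ F.conj().T, np.eye(8))
```

The paper's point is that, although this matrix has size 2^n x 2^n, a quantum circuit can apply it with only O(n²) elementary gates.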
  • Corrections: "Conformational Transition in Immunoglobulin MOPC 460" and the Membership List of the National Academy of Sciences
    Corrections. Proc. Natl. Acad. Sci. USA 74 (1977) 1301.
    Correction. In the article "Kinetic evidence for hapten-induced conformational transition in immunoglobulin MOPC 460" by D. Lancet and I. Pecht, which appeared in the October 1976 issue of Proc. Natl. Acad. Sci. USA 73, 3549-3553, the authors have requested the following changes. On p. 3550, right-hand column, second line from bottom, and p. 3551, left-hand column, fourth line from the top, "Fig. 2" should be "Fig. 1A." In the legend of Table 2, third line, note (f) should read "ΔG1 = −RT ln K1." On p. 3553, left-hand column, third paragraph, fifth line, "k0" should be replaced by "K0."
    Correction. In the membership list of the National Academy of Sciences that appeared in the October 1976 issue of Proc. Natl. Acad. Sci. USA 73, 3750-3781, please note the following corrections: H. E. Carter, Britton Chance, Seymour S. Cohen, E. A. Doisy, Gerald M. Edelman, and John T. Edsall are affiliated with the Section of Biochemistry (21), not the Section of Botany (25).
    Correction. In the Author Index to Volume 73, January–December 1976, which appeared in the December 1976 issue of Proc. Natl. Acad. Sci. USA 73, 4781-4788, the limitations of computer alphabetization resulted in the listing of one person as the author of another's paper. On p. 4786, it should indicate that James Christopher Phillips had an article beginning on p.
  • Using and Saving Randomness (PhD dissertation, Xue Chen, 2018)
    Copyright by Xue Chen, 2018. The Dissertation Committee for Xue Chen certifies that this is the approved version of the following dissertation: Using and Saving Randomness. Committee: David Zuckerman (Supervisor), Dana Moshkovitz, Eric Price, Yuan Zhou. Using and Saving Randomness, by Xue Chen. Dissertation presented to the Faculty of the Graduate School of The University of Texas at Austin in partial fulfillment of the requirements for the degree of Doctor of Philosophy, The University of Texas at Austin, May 2018. Dedicated to my parents Daiguang and Dianhua.
    Acknowledgments: First, I am grateful to my advisor, David Zuckerman, for his unwavering support and encouragement. David has been a constant resource for providing me with great advice and insightful feedback about my ideas. Besides these and many fruitful discussions we had, his wisdom, way of thinking, and optimism guided me through the last six years. I also thank him for arranging a visit to the Simons Institute in Berkeley, where I enriched my understanding and made new friends. I could not have hoped for a better advisor. I would like to thank Eric Price. His viewpoints and intuitions opened a different gate for me, which is important for the line of research presented in this thesis. I benefited greatly from our long research meetings and discussions. Besides learning a lot of technical material from him, I hope I have absorbed some of his passion and attitude toward research. Apart from David and Eric, I have had many mentors during my time as a student. I especially thank Yuan Zhou, who has been an amazing collaborator and friend since college.
  • Bits, Bans and Nats: Units of Measurement of Quantity of Information
    Bits, bans and nats: units of measurement of quantity of information. June 2017. Flores Asenjo, Santiago J. (sfl[email protected]), Dep. de Comunicaciones, EPS de Gandia.
    Abstract: Everyone knows the bit as a unit of measurement of quantity of information, but few use the other units that are equally valid both for this quantity and for entropy in the field of Information Theory. This article presents the nat and the ban (also called the hartley) and relates them to the bit.
    Objectives and prior knowledge: The learning objectives of this teaching article are presented in Table 1. Although this is an introductory article, any prior knowledge of Information Theory will help the reader assimilate the content. Table 1, learning objectives: 1. Show alternative units to the bit for measuring quantity of information and entropy. 2. Present the units ban (or hartley) and nat. 3. Place each unit in the context of the history of Information Theory. 4. Use Wikipedia as a reference tool. 5. Encourage editing Wikipedia, completing the articles in Spanish, once the presented concepts and their historical context have been learned.
    Introduction: bits. The bit is universally used as the basic unit of measurement of quantity of information. Its name comes from combining the two words "binary digit", as it was historically used to denote each of the two possible binary values used in computing (0 and 1).
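As a quick companion to the units above (an added sketch, not part of the article): information measured in base b is the base-b logarithm of the number of equally likely alternatives, so the three units differ only by constant factors.

```python
import math

# Quantity of information for N equally likely alternatives, in three units:
#   bits = log2(N), nats = ln(N), bans (hartleys) = log10(N).
N = 1000
print(f"{math.log2(N):.4f} bits")    # ~9.9658
print(f"{math.log(N):.4f} nats")     # ~6.9078
print(f"{math.log10(N):.4f} bans")   # 3.0000

# Conversion factors between the units:
print(f"1 nat = {math.log2(math.e):.4f} bits")  # ~1.4427
print(f"1 ban = {math.log2(10):.4f} bits")      # ~3.3219
```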
  • Quantum Computing and a Unified Approach to Fast Unitary Transforms
    Quantum Computing and a Unified Approach to Fast Unitary Transforms. Sos S. Agaian (The City University of New York and University of Texas at San Antonio, [email protected]) and Andreas Klappenecker (Texas A&M University, Department of Computer Science, College Station, TX 77843-3112, [email protected]).
    Abstract: A quantum computer directly manipulates information stored in the state of quantum mechanical systems. The available operations have many attractive features but also underlie severe restrictions, which complicate the design of quantum algorithms. We present a divide-and-conquer approach to the design of various quantum algorithms. The class of algorithms includes many transforms which are well known in classical signal processing applications. We show how fast quantum algorithms can be derived for the discrete Fourier transform, the Walsh-Hadamard transform, the Slant transform, and the Hartley transform. All these algorithms use at most O(log² N) operations to transform a state vector of a quantum computer of length N.
    1. Introduction: Discrete orthogonal transforms and discrete unitary transforms have found various applications in signal, image, and video processing, in pattern recognition, in biocomputing, and in numerous other areas [1-11]. Well-known examples of such transforms include the discrete Fourier transform, the Walsh-Hadamard transform, the trigonometric transforms such as the Sine and Cosine transform, the Hartley transform, and the Slant transform. All these different transforms find applications in signal and image processing, because the great variety of signal classes occurring in practice cannot be handled by a single transform. On a classical computer, the straightforward way to compute a discrete orthogonal transform of a signal vector of length N takes in general O(N²) operations.
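To ground the O(N²) baseline mentioned above, here is a small sketch of my own (not from the paper): the Walsh-Hadamard transform of size N = 2^n is the n-fold Kronecker power of the 2x2 Hadamard matrix, and applying it naively as a matrix-vector product costs O(N²) operations, which is exactly what the fast classical and quantum algorithms improve on.

```python
import numpy as np

def walsh_hadamard_matrix(n_qubits: int) -> np.ndarray:
    """N x N Walsh-Hadamard matrix (N = 2**n_qubits), a Kronecker power of H2."""
    H2 = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
    H = np.array([[1.0]])
    for _ in range(n_qubits):
        H = np.kron(H, H2)
    return H

H = walsh_hadamard_matrix(3)             # 8x8 transform
x = np.random.default_rng(0).normal(size=8)
y = H @ x                                # naive O(N^2) matrix-vector product
assert np.allclose(H @ H.T, np.eye(8))   # orthogonal, hence trivially invertible
```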
  • Revision of Lecture 5
    ELEC3203 Digital Coding and Transmission: Overview & Information Theory (S. Chen). Revision of Lecture 5:
    • Information transfer across channels: channel characteristics and the binary symmetric channel; average mutual information. Average mutual information tells us what happens to information transmitted across the channel, i.e. it "characterises" the channel. But average mutual information is a bit too mathematical (too abstract); as an engineer, one would rather characterise the channel by its physical quantities, such as bandwidth, signal power and noise power (or SNR). Also, intuitively, given a source with information rate R, one would like to know whether the channel is capable of "carrying" that amount of information across it. In other words, what is the channel capacity? This lecture answers this fundamental question.
    • A closer look at the channel. Schematic of the communication system: source {Xi} -> X(k) -> x(t) -> analogue channel -> y(t) -> Y(k) -> sink {Yi}, with the MODEM marking the boundary between them. Depending on which part of the system is considered, the channel is discrete-time or continuous-time; we will start with the discrete-time channel, then the continuous-time channel. The source has an information rate; we would like to know the "capacity" of the channel. Moreover, we would like to know under what condition we can achieve error-free transfer of information across the channel, i.e. no information loss. According to Shannon: information rate ≤ capacity → error-free transfer.
    • Review of channel assumptions: no amplitude or phase distortion by the channel, and the only disturbance is due to additive white Gaussian noise (AWGN), i.e.
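The lecture's relation "information rate ≤ capacity → error-free transfer" can be made concrete for the binary symmetric channel it mentions, whose capacity has the standard closed form C = 1 − H2(p). A small sketch (mine, not the lecture's code):

```python
import math

def binary_entropy(p: float) -> float:
    """H2(p) = -p*log2(p) - (1-p)*log2(1-p), with H2(0) = H2(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def bsc_capacity(p: float) -> float:
    """Capacity of the binary symmetric channel with crossover probability p."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.01, 0.11, 0.5):
    print(f"p = {p:4.2f}  ->  C = {bsc_capacity(p):.4f} bit/channel use")
# p = 0.5 gives C = 0: the output is then independent of the input.
```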
  • Maxwell's Demon: Entropy, Information, Computing
    Complex Social Systems: a guided exploration of concepts and methods. Information Theory. Martin Hilbert (Dr., PhD). Today's questions: I. What are the formal notions of data, information & knowledge? II. What is the role of information theory in complex systems? (...the 2nd law and life...) III. What is the relation between information and growth?
    Complex systems handle information: "Although they [complex systems] differ widely in their physical attributes, they resemble one another in the way they handle information. That common feature is perhaps the best starting point for exploring how they operate." (Gell-Mann, 1995, p. 21.) Source: Gell-Mann, M. (1995). The Quark and the Jaguar: Adventures in the Simple and the Complex. New York: St. Martin's Griffin.
    Complex systems science as computer science: "What we are witnessing now is... actually the beginning of an amazing convergence of theoretical physics and theoretical computer science." Chaitin, G. J. (2002). Meta-Mathematics and the Foundations of Mathematics. EATCS Bulletin, 77 (June), 167-179.
    Society computes. Numerical: agent-based models; analytical: information theory. Why now? [Slide figures pairing Thermodynamics, Aerodynamics, Hydrodynamics, Electricity, etc. with the machines they explain.] Claude Shannon, Alan Turing, John von Neumann.
    Bell, D. (1973). The Coming of Post-Industrial Society: A Venture in Social Forecasting. New York, NY: Basic Books. "Industrial organization of goods" => "industrial organization of information". (15,000 citations; 35,000 citations.) Beniger, J. (1986). The Control Revolution: Technological and Economic Origins of the Information Society. Harvard University Press. "Exploitation of information." Castells, M. (1999). The Information Age, Volumes 1-3: Economy, Society and Culture. Cambridge (Mass.); Oxford: Wiley-Blackwell.
  • Information Basics: Observer and Choice
    Information basics: observer and choice. Information is defined as "a measure of the freedom from choice with which a message is selected from the set of all possible messages". The bit (short for binary digit) is the most elementary choice one can make: between two items, "0" and "1", "heads" or "tails", "true" or "false", etc. A bit is equivalent to the choice between two equally likely alternatives. For example, if we know that a coin is to be tossed, but are unable to see it as it falls, a message telling whether the coin came up heads or tails gives us one bit of information: one bit of uncertainty (heads or tails?) is removed in a choice between 2 symbols, and the information gained is recognized by an observer.
    Fathers of uncertainty-based information: information is transmitted through noisy communication channels. Ralph Hartley and Claude Shannon (at Bell Labs), the fathers of Information Theory, worked on the problem of efficiently transmitting information, i.e. decreasing the uncertainty in the transmission of information. Hartley, R. V. L., "Transmission of Information", Bell System Technical Journal, July 1928, p. 535. C. E. Shannon (1948), "A mathematical theory of communication", Bell System Technical Journal, 27: 379-423 and 623-656. C. E. Shannon, "A symbolic analysis of relay and switching circuits", MS thesis (unpublished), MIT, 1937. C. E. Shannon, "An algebra for theoretical genetics", PhD dissertation, MIT, 1940.
    Let's talk about choices. The multiplication principle: "if some choice can be made in M different ways, and some subsequent choice can be made in N different ways, then there are M x N different ways these choices can be made in succession" [Paulos]. 3 shirts and 4 pants = 3 x 4 = 12 outfit choices.
    Hartley uncertainty (nonspecificity): the Hartley measure, over a set A of alternatives, is the amount of uncertainty associated with that set of alternatives (e.g.
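A tiny sketch of the Hartley measure the excerpt introduces (my own illustration, using the standard log2 form of the measure): for a set of N equally likely alternatives the uncertainty is log2(N) bits, and the multiplication principle shows up as additivity of the measure.

```python
import math

def hartley_measure_bits(num_alternatives: int) -> float:
    """Hartley measure of a set of equally likely alternatives, in bits."""
    return math.log2(num_alternatives)

shirts, pants = 3, 4
outfits = shirts * pants   # multiplication principle: 12 outfit choices
print(hartley_measure_bits(2))         # 1.0 bit: a coin flip
print(hartley_measure_bits(outfits))   # ~3.585 bits
# Additivity mirrors the multiplication principle:
assert math.isclose(
    hartley_measure_bits(outfits),
    hartley_measure_bits(shirts) + hartley_measure_bits(pants),
)
```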
  • Nyquist, Shannon and the Information Carrying Capacity of Signals
    Nyquist, Shannon and the information carrying capacity of signals. [Figure 1: The information highway.] There is a whole science called information theory. As far as a communications engineer is concerned, information is defined as a quantity called a bit. This is a pretty easy concept to intuit. A yes or a no, in or out, up or down, a 0 or a 1: these are all forms of information bits. In communications, a bit is a unit of information that can be conveyed by some change in the signal, be it amplitude, phase or frequency. We might issue a pulse of a fixed amplitude. A pulse of positive amplitude can represent a 0 bit and one of negative amplitude, a 1 bit. Let's think about the transmission of information using the analogy of a highway, as shown in Fig. 1. The width of this highway, which is really the full electromagnetic spectrum, is infinite. The useful part of this highway is divided into lanes by regulatory bodies. We have very many lanes of different widths, which separate the multitude of users, such as satellites, microwave towers, Wi-Fi, etc. In communications, we need to confine these signals to lanes. The specific lanes (i.e. of a certain center frequency) and their widths are assigned internationally by the ITU and represent the allocated bandwidth. Within this width, the users can do what they want. In our highway analogy, we want our transporting truck to be the width of the lane; in other words, we want the signal to fill the whole allocated bandwidth.
  • Maximal Repetition and Zero Entropy Rate
    Maximal Repetition and Zero Entropy Rate. Łukasz Dębowski (Institute of Computer Science, Polish Academy of Sciences, ul. Jana Kazimierza 5, 01-248 Warszawa, Poland; e-mail: [email protected]). arXiv:1609.04683v4 [cs.IT], 26 Jul 2017.
    Abstract: Maximal repetition of a string is the maximal length of a repeated substring. This paper investigates maximal repetition of strings drawn from stochastic processes. Strengthening previous results, two new bounds for the almost sure growth rate of maximal repetition are identified: an upper bound in terms of conditional Rényi entropy of order γ > 1 given a sufficiently long past, and a lower bound in terms of unconditional Shannon entropy (γ = 1). Both the upper and the lower bound can be proved using an inequality for the distribution of recurrence time. We also supply an alternative proof of the lower bound which makes use of an inequality for the expectation of subword complexity. In particular, it is shown that a power-law logarithmic growth of maximal repetition with respect to the string length, recently observed for texts in natural language, may hold only if the conditional Rényi entropy rate given a sufficiently long past equals zero. According to this observation, natural language cannot be faithfully modeled by a typical hidden Markov process, which is a class of basic language models used in computational linguistics. Keywords: maximal repetition, Rényi entropies, entropy rate, recurrence time, subword complexity, natural language.
    I. Motivation and main results: Maximal repetition L(x_1^n) of a string x_1^n = (x_1, x_2, ..., x_n) is the maximal length of a repeated substring.
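A brute-force sketch of the quantity defined above, based only on the excerpt's definition of maximal repetition (cubic time in the worst case; suffix-tree methods would do this in linear time):

```python
def maximal_repetition(s: str) -> int:
    """Maximal length of a substring that occurs at least twice in s
    (occurrences may overlap), per the definition in the excerpt."""
    n = len(s)
    best = 0
    # For each pair of starting positions, extend their common prefix.
    for i in range(n):
        for j in range(i + 1, n):
            k = 0
            while j + k < n and s[i + k] == s[j + k]:
                k += 1
            best = max(best, k)
    return best

print(maximal_repetition("abcabcab"))  # 5: "abcab" occurs at positions 0 and 3
print(maximal_repetition("abcd"))      # 0: no repeated substring
```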