A History of the Theory of Information


Paper No. 1177
621.391(091)

RADIO SECTION

A HISTORY OF THE THEORY OF INFORMATION

By E. COLIN CHERRY, M.Sc., Associate Member.

(The paper was first received 1st February, and in revised form 28th May, 1951.)

[This is an "integrating" paper. Members are invited to submit papers in this category, giving the full perspective of the developments leading to the present practice in a particular part of one of the branches of electrical science. Written contributions on papers published without being read at meetings are invited for consideration with a view to publication. The paper is a modified version of one presented by the author to the Symposium on Information Theory held under the auspices of the Royal Society in September, 1950. The original paper was given only restricted publication. Mr. Cherry is Reader in Telecommunications, University of London.]

SUMMARY

The paper mentions first some essential points about the early development of languages, codes and symbolism, picking out those fundamental points in human communication which have recently been summarized by precise mathematical theory. A survey of telegraphy and telephony development leads to the need for "economy," which has given rise to various systems of signal compression. Hartley's early theory of communication is summarized, and Gabor's theory of signal structure is described. Modern statistical theory of Wiener and Shannon, by which "information" may be expressed quantitatively, is shown to be a logical extension of Hartley's work. A Section on calculating machines and brains attempts to clear up popular misunderstandings and to separate definite accomplishments in mechanization of thought processes from mere myths. Finally, a generalization of the work over the whole field of scientific observation is shown, and evidence which supports the view that "information plus entropy is an important invariant of a physical system" is included.

(1) INTRODUCTION

In recent years much has been heard of the "theory of communication" or, in a wider scientific circle, the "theory of information": in scientific books, articles and broadcast talks, many concepts, words and expressions are used which are part of the communication engineer's stock-in-trade. Although electrical communication systems have been in use for over a century, it is only comparatively recently that any attempt has been made to define what is the commodity that is being communicated and how it may be measured. Modern communication theory has evolved from the attempts of a few mathematically-minded communication engineers to establish such a quantitative basis to this most fundamental aspect of the subject. Since the idea of "communication of information" has a scope which extends far beyond electrical techniques of transmission and reception, it is not surprising that the mathematical work is proving to be of value in many, widely diverse, fields of science.

It may therefore be of interest to communication engineers to consider historically the interrelation between various activities and fields of scientific investigation on which their subject is beginning to impinge. The scope of information theory is extensive, and no attempt will be made here to present an exposition of the theory; it is in any case still in process of development. This paper should be regarded purely as a history; the references appended may be found useful by those wishing to study the subject in detail.

(1.1) Languages and Codes

Man's development and the growth of civilizations have depended in the main on progress in a few activities, one of the most important of which has been his ability to receive, to communicate and to record his knowledge. Communication essentially involves a language, a symbolism, whether this is a spoken dialect, a stone inscription, a cryptogram, a Morse-code signal or a chain of numbers in binary digital form in a modern computing machine. It is interesting to observe that as technical applications have increased in complexity with the passage of time, languages have increased in simplicity, until to-day we are considering the ultimate compression of information in the simplest possible forms. It is important to emphasize, at the start, that we are not concerned with the meaning or the truth of messages; semantics lies outside the scope of mathematical information theory. The material which is to be communicated between a transmitting mind and a receiving mind must, however, be prearranged into an agreed language. We are not concerned with the "higher" mental processes which have enabled man to find a word for a concept, even less with the formation of a concept; we start only from the point when he already has a dictionary.

A detailed history of spoken and written languages would be irrelevant to our present subject, but nevertheless there are certain matters of interest which may be taken as a starting-point. The early writings of Mediterranean civilizations were in picture, or "logographic," script; simple pictures were used to represent objects and also, by association, ideas, actions, names, and so on. Also, what is much more important, phonetic writing was developed, in which sounds were given symbols. With the passage of time, the pictures were reduced to more formal symbols, as determined by the difficulty of using a chisel or a reed brush, while the phonetic writing simplified into a set of two or three dozen alphabetic letters, divided into consonants and vowels.1,2

In Egyptian hieroglyphics we have a supreme example of what is now called redundancy in languages and code; one of the difficulties in deciphering the Rosetta stone lay in the fact that a polysyllabic word might give each syllable not one symbol but a number of different ones in common use, in order that the word should be thoroughly understood.3 (The effect when literally transcribed into English is one of stuttering.) On the other hand, the Semitic languages show an early recognition of redundancy. Ancient Hebrew script had no vowels; modern Hebrew has none, too, except in children's books. Many other ancient scripts show no vowels. Slavonic Russian went a step further in condensation: in religious texts, commonly used words were abbreviated to a few letters, in a manner similar to our present-day use of the ampersand, abbreviations such as lb, and the increasing use of initials, e.g. U.S.A., Unesco, O.K.

But the Romans were responsible for a very big step in speeding up the recording of information. The freed slave Tyro invented shorthand in about 63 B.C. in order to record, verbatim, the speeches of Cicero. This is not unlike modern shorthand in appearance (Fig. 1), and it is known to have been used in Europe until the Middle Ages.4

Related to the structure of language is the theory of cryptograms or ciphers, certain aspects of which are of interest in our present study, in connection with the problem of coding.24
… this has assumed increasing importance. Certain types of message are of very common occurrence in telegraphy, e.g. some commercial expressions and birthday greetings, and these should be coded into very short symbols. By the year 1825 a number of such codes were in use. The modern view is that messages having a high probability of occurrence contain little information, and that any mathematical definition we adopt for the expression "information" must conform with this idea: the information conveyed by a symbol, a message or an observation, in a set of such events, must decrease as their frequency of occurrence in the set increases.

Shannon has emphasized the importance of the statistics of language in his recent publications,24 and has referred to the relative frequencies of letters in a language, of the digrams (or letter-pairs) such as ed, st, er, and of the trigrams ing, ter, and so on, which occur in the English language. His estimate is that English has a redundancy of at least 50%, meaning that if instead of writing every letter of an English sentence (or its coded symbol) we were suitably to symbolize the digrams, trigrams, etc., the possible ultimate compression would be at least two to one. If such a perfect code were used, any mistake which might occur in transmission of a message could not be corrected by guessing. Language statistics3 have been essential for centuries, for the purpose of deciphering secret codes and cryptograms.

[Fig. 1. Roman shorthand (orthographic). The example reads: "Ne fideliter diligit quem fastidit et calamitas querula."]

Ciphering, of vital importance for military and diplomatic secrecy, is as old as the Scriptures. The simple displaced-alphabet code, known to every schoolboy, was most probably used by Julius Caesar.3 There are many historic uses of cipher; e.g. Samuel Pepys's diary4 was entirely ciphered "to secrete it from his servants and the World"; also certain of Roger Bacon's scripts have as yet resisted all attempts at deciphering. A particularly important cipher is one known as "Francis Bacon's Biliteral Code": Bacon suggested the possibility of printing seemingly innocent lines of verse or prose, using two slightly different founts (called, say, 1 and 2). The order of the 1's and 2's was used for coding a secret message: each letter of the alphabet was coded into five units, founts 1 and 2 being used as these units.3 Now, this code illustrates an important principle which seems to have been understood since the early civilizations: information may be coded into a two-symbol code.* There are …

[Fig. 2. The Ogam (Celtic) script.]
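The "modern view" quoted above can be stated compactly in present-day notation; the following is a sketch using the standard definitions that Shannon's 1948 papers made precise, not formulas appearing in Cherry's text. The information conveyed by a symbol of probability \(p\), and the redundancy \(R\) of a language with average information \(H\) per letter, are

\[
I(p) = -\log_2 p \ \text{bits}, \qquad R = 1 - \frac{H}{H_{\max}}, \qquad H_{\max} = \log_2 26 \approx 4.7 \ \text{bits per letter},
\]

so that \(I(p)\) decreases as the frequency of occurrence \(p\) increases, as the text requires; and a redundancy \(R \geq 0.5\) corresponds to an ultimate compression factor of \(1/(1-R) \geq 2\), i.e. the "at least two to one" quoted for English.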
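Bacon's biliteral principle, five two-valued units per letter, is in effect a five-bit binary code and is easy to demonstrate. The sketch below is illustrative only: it assumes a plain 26-letter A-Z indexing rather than Bacon's historical 24-letter table (in which I/J and U/V shared symbols), and the function names are invented for the example.

```python
# A sketch of Bacon's "biliteral" cipher: each letter of the alphabet is
# coded into five two-valued units, each unit being a choice between two
# founts (typefaces), written here as "1" and "2".
# Assumption: plain 26-letter A-Z indexing; Bacon's own table had 24 letters.

from string import ascii_uppercase

TO_FOUNTS = str.maketrans("01", "12")  # binary digit -> fount number
TO_BITS = str.maketrans("12", "01")    # fount number -> binary digit

def encode(message: str) -> str:
    """Turn each letter into five fount choices, e.g. 'F' -> '11212'."""
    groups = [
        format(ascii_uppercase.index(ch), "05b").translate(TO_FOUNTS)
        for ch in message.upper() if ch in ascii_uppercase
    ]
    return " ".join(groups)

def decode(founts: str) -> str:
    """Read each group of five fount choices back into one letter."""
    return "".join(
        ascii_uppercase[int(group.translate(TO_BITS), 2)]
        for group in founts.split()
    )

if __name__ == "__main__":
    hidden = encode("Flee")
    print(hidden)          # 11212 12122 11211 11211
    print(decode(hidden))  # FLEE
```

In Bacon's scheme these fount choices would be imposed on the letters of an innocent covering text, so that five printed letters conceal one letter of the secret message; the point of the example is only the two-symbol principle itself.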