Information Theory and Entropy
Total pages: 16
File type: PDF, size: 1020 KB
Recommended publications
- Information Theory, Pattern Recognition and Neural Networks: Part III Physics Exams 2006
Part III Physics exams 2004–2006. Information Theory, Pattern Recognition and Neural Networks: Part III Physics exams 2006.

1. A channel has a 3-bit input, x ∈ {000, 001, 010, 011, 100, 101, 110, 111}, and a 2-bit output y ∈ {00, 01, 10, 11}. Given an input x, the output y is generated by deleting exactly one of the three input bits, selected at random. For example, if the input is x = 010 then P(y|x) is 1/3 for each of the outputs 00, 10, and 01; if the input is x = 001 then P(y=01|x) = 2/3 and P(y=00|x) = 1/3.

Write down the conditional entropies H(Y|x=000), H(Y|x=010), and H(Y|x=001). [3]

Assuming an input distribution of the form

x      000      001  010  011  100  101  110  111
P(x)   (1-p)/2  p/4  p/4  0    0    p/4  p/4  (1-p)/2

work out the conditional entropy H(Y|X) and show that

H(Y) = 1 + H2(2p/3),

where H2(x) = x log2(1/x) + (1-x) log2(1/(1-x)). [3]

Sketch H(Y) and H(Y|X) as a function of p ∈ (0, 1) on a single diagram. [5]

Sketch the mutual information I(X; Y) as a function of p. [2]

[H2(1/3) ≃ 0.92.]

Another channel with a 3-bit input x ∈ {000, 001, 010, 011, 100, 101, 110, 111} erases exactly one of its three input bits, marking the erased symbol by a ?.
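As a quick check of the identity quoted in this excerpt, the following sketch (plain Python written for this listing, not part of the exam or its solutions) enumerates the one-bit-deletion channel, applies the stated input distribution, and confirms numerically that H(Y) = 1 + H2(2p/3).

```python
from collections import defaultdict
from math import log2

def H2(x):
    """Binary entropy H2(x) = x log2(1/x) + (1-x) log2(1/(1-x))."""
    return 0.0 if x in (0.0, 1.0) else -x * log2(x) - (1 - x) * log2(1 - x)

def channel(x):
    """P(y|x) for the channel that deletes one of the three input bits at random."""
    out = defaultdict(float)
    for i in range(3):
        out[x[:i] + x[i + 1:]] += 1 / 3
    return out

def H_Y(p):
    """Entropy of the output Y under the input distribution given in the question."""
    P_x = {'000': (1 - p) / 2, '001': p / 4, '010': p / 4, '011': 0.0,
           '100': 0.0, '101': p / 4, '110': p / 4, '111': (1 - p) / 2}
    P_y = defaultdict(float)
    for x, px in P_x.items():
        for y, pyx in channel(x).items():
            P_y[y] += px * pyx
    return -sum(q * log2(q) for q in P_y.values() if q > 0)

for p in (0.1, 0.3, 0.5, 0.9):
    assert abs(H_Y(p) - (1 + H2(2 * p / 3))) < 1e-12
print("H(Y) = 1 + H2(2p/3) verified numerically")
```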
- Pioneers in U.S. Cryptology II
PIONEERS IN U.S. CRYPTOLOGY II. This brochure was produced by the Center for Cryptologic History.

Herbert O. Yardley. Herbert O. Yardley was born in 1889 in Worthington, Indiana. After working as a railroad telegrapher and spending a year taking an English course at the University of Chicago, he became a code clerk for the Department of State. In June 1917, Yardley received a commission in the Signal Officers Reserve Corps; in July Colonel Ralph Van Deman appointed him chief of the new cryptanalytic unit, MI-8, in the Military Intelligence division. MI-8, or the Cipher Bureau, consisted of Yardley and two clerks. At MI-8's peak in November 1918, Yardley had 18 officers, 24 civilians, and 109 typists. The section had expanded to include secret inks, code and cipher compilation, communications, and shorthand. This was the first formally organized cryptanalytic unit in the history of the U.S. government. When World War I ended, the Army was considering disbanding MI-8. Yardley presented a persuasive argument for retaining it for peacetime use. His plan called for the permanent retention of a code and cipher organization funded jointly by the State and War Departments. He demonstrated that in the past eighteen months MI-8 had read almost 11,000 messages in 579 cryptographic systems, in addition to everything that had been examined in connection with postal censorship. On 17 May Acting Secretary of State Frank L. Polk approved the plan, and two days later the Army Chief of Staff, General Peyton C. ...
- On Measures of Entropy and Information
On Measures of Entropy and Information. Tech. Note 009 v0.7, http://threeplusone.com/info. Gavin E. Crooks, 2018-09-22.

Contents (excerpt):
0. Notes on notation and nomenclature
1. Entropy: entropy; joint entropy; marginal entropy; conditional entropy
2. Mutual information: mutual information; multivariate mutual information; interaction information; conditional mutual information; binding information; residual entropy; total correlation; lautum information; uncertainty coefficient
3. Relative entropy: relative entropy; cross entropy ...
5. Csiszár f-divergences: Csiszár f-divergence; dual f-divergence; symmetric f-divergences; K-divergence; fidelity; Hellinger discrimination; Pearson divergence; Neyman divergence; LeCam discrimination; skewed K-divergence; alpha-Jensen-Shannon-entropy
6. Chernoff divergence: Chernoff divergence; Chernoff coefficient; Rényi divergence; alpha-divergence; Cressie-Read divergence; Tsallis divergence; Sharma-Mittal divergence
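The note above is a catalogue of definitions. As a small illustration (my own sketch, not code from the note; conventions such as the 1/2 factor in the Hellinger discrimination and the logarithm base may differ from the ones Crooks uses), here are three of the listed quantities computed for a pair of discrete distributions.

```python
import numpy as np

p = np.array([0.5, 0.3, 0.2])
q = np.array([0.4, 0.4, 0.2])

# Relative entropy (KL divergence), here in bits.
kl = np.sum(p * np.log2(p / q))

# Hellinger discrimination, using the convention (1/2) * sum (sqrt(p) - sqrt(q))^2.
hellinger = 0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2)

# Chernoff coefficient for a mixing parameter alpha in (0, 1).
alpha = 0.5
chernoff = np.sum(p ** alpha * q ** (1 - alpha))

print(f"KL(p||q)  = {kl:.4f} bits")
print(f"Hellinger = {hellinger:.4f}")
print(f"Chernoff  = {chernoff:.4f}")
```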
- Lecture Notes (Chapter 12: Information Theory)
Chapter 12: Information Theory. So far in this course, we have discussed notions such as "complexity" and "information", but we have not grounded these ideas in formal detail. In this part of the course, we will introduce key concepts from the area of information theory, such as formal definitions of information and entropy, as well as how these definitions relate to concepts previously discussed in the course. As we will see, many of the ideas introduced in previous chapters—such as the notion of maximum likelihood—can be re-framed through the lens of information theory. Moreover, we will discuss how machine learning methods can be derived from an information-theoretic perspective, based on the idea of maximizing the amount of information gain from training data. In this chapter, we begin with key definitions from information theory, as well as how they relate to previous concepts in this course. In the next chapter, we will introduce a new supervised learning technique—termed decision trees—which can be derived from an information-theoretic perspective.

12.1 Entropy and Information. Suppose we have a discrete random variable x. The key intuition of information theory is that we want to quantify how much information this random variable conveys. In particular, we want to know how much information we obtain when we observe a particular value for this random variable. This idea is often motivated via the notion of surprise: the more surprising an observation is, the more information it contains. Put another way, if we observe a random event that is completely predictable, then we do not gain any information, since we could already predict the result.
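A minimal sketch of the surprise intuition described above (my own illustration, not taken from the chapter): the information content of an outcome is -log2 of its probability, and entropy is the expected information content, so a perfectly predictable event contributes zero information.

```python
from math import log2

def information_content(p):
    """Self-information ("surprise") of an outcome with probability p, in bits."""
    return -log2(p)

def entropy(probs):
    """Expected self-information of a discrete random variable, in bits."""
    return sum(p * information_content(p) for p in probs if p > 0)

print(information_content(0.5))    # 1.0 bit: a fair coin flip
print(information_content(0.25))   # 2.0 bits: a 1-in-4 outcome is more surprising
print(entropy([0.5, 0.5]))         # 1.0 bit: maximally unpredictable binary variable
print(entropy([0.99, 0.01]))       # ~0.08 bits: nearly deterministic, little information
print(entropy([1.0]))              # 0.0 bits: a certain event carries no information
```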
- The Legendary William F. Friedman
THE LEGENDARY WILLIAM F. FRIEDMAN. By Lambros D. Callimahos, February 1974. Unclassified. Approved for release by NSA on 05-12-2014 pursuant to E.O. 13526. REF ID: A4045961.

"To my disciple, colleague, and friend, Lambros Demetrios Callimahos -- who, having at an early age become one of the great flutists of the world, deserted the profession of musician, in favor of another in which he is making noteworthy contributions -- From William F. Friedman -- who, having at an early age aspired to become noted in the field of biology, deserted the profession of geneticist, in favor of another, upon which he hopes he has left some impression by his contributions. In short: From deserter to deserter, with affectionate greetings! W.F.F. Washington, 21 November 1958" (Facing page 1)

PROLOGUE. When I joined the U.S. Army to enter the cryptologic service of my adopted country on February 11, 1941, William F. Friedman was already a legendary figure in my eyes. I had read two early papers of his, "L'indice de coincidence et ses applications en cryptographie" and "Application des methodes de la statistique a la cryptographie," when I was living in Paris in 1934, concertizing as a flute soloist throughout Europe while privately pursuing an active hobby of cryptology, which I felt would be my niche when the time came for me to enter the army. (Besides the U.S. Army, I was also liable for service in the Greek, Egyptian, and Turkish armies, and barely missed being in one of the latter.) As a member of the original group of 28 students in the Cryptographic School at Fort Monmouth, N. ...
- Cryptologic Quarterly, 2019-01
Cryptologic Quarterly. NSA's Friedman Conference Center. PLUS: Vint Hill Farms Station; STONEHOUSE of East Africa; The Evolution of Signals Security; Mysteries of Linguistics. 2019-01, Vol. 38. Center for Cryptologic History.

Contacts. Please feel free to address questions or comments to Editor, CQ, at [email protected].

Disclaimer. All opinions expressed in Cryptologic Quarterly are those of the authors. They do not necessarily reflect the official views of the National Security Agency/Central Security Service. Copies of Cryptologic Quarterly can be obtained by sending an email to [email protected].

PUBLISHER: Center for Cryptologic History. CHIEF: John A. Tokar. EXECUTIVE EDITOR: Pamela F. Murray. MANAGING EDITOR: Laura Redcay. ASSOCIATE EDITOR: Jennie Reinhardt.

Editorial Policy. Cryptologic Quarterly is the professional journal for the National Security Agency/Central Security Service. Its mission is to advance knowledge of all aspects of cryptology by serving as a forum for issues related to cryptologic theory, doctrine, operations, management, and history. The primary audience for Cryptologic Quarterly is NSA/CSS professionals, but CQ is also distributed to personnel in other United States intelligence organizations as well as cleared personnel in other federal agencies and departments. Cryptologic Quarterly is published by the Center for Cryptologic History, NSA/CSS. The publication is designed as a working aid and is not subject to receipt, control, or accountability.

Cover: "Father of American cryptology" William Friedman's retirement ceremony in the Arlington Hall Post Theater, Arlington, VA, 1955. Lieutenant General Ralph Canine is at left, Solomon Kullback is seated left, Friedman is second from right, and Director of Central Intelligence Allen Dulles is at far right.
- Stan Modeling Language
Stan Modeling Language: User's Guide and Reference Manual. Stan Development Team. Stan Version 2.12.0. Tuesday 6th September, 2016. mc-stan.org

Stan Development Team (2016) Stan Modeling Language: User's Guide and Reference Manual. Version 2.12.0. Copyright © 2011–2016, Stan Development Team. This document is distributed under the Creative Commons Attribution 4.0 International License (CC BY 4.0). For full details, see https://creativecommons.org/licenses/by/4.0/legalcode. The Stan logo is distributed under the Creative Commons Attribution-NoDerivatives 4.0 International License (CC BY-ND 4.0). For full details, see https://creativecommons.org/licenses/by-nd/4.0/legalcode

Stan Development Team: Currently Active Developers. This is the list of current developers in order of joining the development team (see the next section for former development team members).
• Andrew Gelman (Columbia University): chief of staff, chief of marketing, chief of fundraising, chief of modeling, max marginal likelihood, expectation propagation, posterior analysis, RStan, RStanARM, Stan
• Bob Carpenter (Columbia University): language design, parsing, code generation, autodiff, templating, ODEs, probability functions, constraint transforms, manual, web design / maintenance, fundraising, support, training, Stan, Stan Math, CmdStan
• Matt Hoffman (Adobe Creative Technologies Lab): NUTS, adaptation, autodiff, memory management, (re)parameterization, optimization C++
• Daniel Lee (Columbia University): chief of engineering, CmdStan, builds, continuous integration, testing, templates, ...
- Information Theory: MATH34600
Information Theory: MATH34600. Professor Oliver Johnson, [email protected]. Twitter: @BristOliver. School of Mathematics, University of Bristol. Teaching Block 1, 2019-20.

Why study information theory? Information theory was introduced by Claude Shannon in 1948. His paper "A Mathematical Theory of Communication" is available via the course webpage. It forms the basis for our modern world. Information theory is used every time you make a phone call, take a selfie, download a movie, stream a song, or save files to your hard disk. Represent randomly generated "messages" by sequences of 0s and 1s. Key idea: some sources of randomness are more random than others. Can compress random information (remove redundancy) to save space when saving it. Can pad out random information (introduce redundancy) to avoid errors when transmitting it. A quantity called entropy quantifies how well we can do.

Relationship to other fields (diagram from Cover and Thomas): now also quantum information, security, algorithms, AI/machine learning (privacy), neuroscience, bioinformatics, ecology.

Course outline: 15 lectures, 3 exercise classes, 5 mandatory HW sets. Printed notes are minimal; lectures will provide motivation and help. IT IS YOUR RESPONSIBILITY TO ATTEND LECTURES AND TO ENSURE YOU HAVE A FULL SET OF NOTES AND SOLUTIONS. Course webpage for notes, problem sheets, links, textbooks, etc.: https://people.maths.bris.ac.uk/~maotj/IT.html. Drop-in sessions: 11.30-12.30 on Mondays, G83 Fry Building.
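To illustrate the point in this excerpt that "some sources of randomness are more random than others" and that entropy quantifies how well compression can do, the following sketch (my own, not part of the course notes) prints the entropy of a Bernoulli(p) source, i.e., the best achievable rate in bits per symbol, for several biases.

```python
from math import log2

def bernoulli_entropy(p):
    """Entropy in bits per symbol of a source emitting 1 with probability p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

# A fair source cannot be compressed below 1 bit/symbol; a heavily biased
# source can, which is why some sources are "more random" than others.
for p in (0.5, 0.25, 0.1, 0.01):
    print(f"p = {p:<5} entropy = {bernoulli_entropy(p):.3f} bits/symbol")
```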
- SIS and Cipher Machines: 1930–1940
SIS and Cipher Machines: 1930–1940. John F. Dooley, Knox College. Presented at the 14th Biennial NSA CCH History Symposium, October 2013. This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 3.0 United States License.

The Results of Friedman's Training
• The initial training regimen as it related to cipher machines was cryptanalytic.
• But this detailed analysis of the different machine types informed the team's cryptographic imaginations when it came to creating their own machines.

The Machines
• Wheatstone/Plett Machine • M-94 • AT&T machine • M-138 and M-138-A • Hebern cipher machine • M-209 • Kryha • Red • IT&T (Parker Hitt) • Purple • Enigma • SIGABA (M-134 and M-134-C) • B-211 (and B-21)

The Wheatstone/Plett Machine
• A polyalphabetic cipher disk with a gearing mechanism that rotates the inner alphabet.
• Plett's improvement is to add a second key and mixed alphabet to the inner ring.
• Friedman broke this in 1918.
Principles: (a) The inner workings of a mechanical cryptographic device can be worked out using a paper-and-pencil analog of the device. (b) If there is a cycle in the mechanical device (say, for particular cipher alphabets), then that cycle can be discovered by analysis of the paper-and-pencil analog.

The Army M-94
• Traces its roots back to Jefferson and Bazeries.
• Used by the US Army from 1922 to circa 1942.
• 25 mixed alphabets; the disk order is the key.
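The M-94 described in the last slide works like the Jefferson cylinder: each disk carries a mixed alphabet, the key is the order of the disks, and the ciphertext is read from a row other than the one on which the plaintext is aligned. The toy sketch below (my own illustration with five randomly generated disks, not the real 25-disk M-94 wheel set) shows that idea; the common `offset` parameter is a stand-in for choosing which row to read off.

```python
import random
import string

# Toy disks: each is a mixed alphabet. The real M-94 used 25 fixed disks;
# these are randomly generated stand-ins for illustration only.
random.seed(0)
DISKS = [''.join(random.sample(string.ascii_uppercase, 26)) for _ in range(5)]

def encrypt(text, disk_order, offset):
    """Align each letter on its disk, then read the row `offset` positions below."""
    out = []
    for ch, disk_index in zip(text.upper(), disk_order):
        disk = DISKS[disk_index]
        row = disk.index(ch)
        out.append(disk[(row + offset) % 26])
    return ''.join(out)

def decrypt(ciphertext, disk_order, offset):
    # Shifting by 26 - offset undoes the encryption shift on each disk.
    return encrypt(ciphertext, disk_order, 26 - offset)

key = [2, 0, 4, 1, 3]   # the key is the order of the disks
ct = encrypt("HELLO", key, offset=7)
print(ct, decrypt(ct, key, offset=7))   # prints the ciphertext and "HELLO"
```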
- The Cultural Contradictions of Cryptography: A History of Secret Codes in Modern America
The Cultural Contradictions of Cryptography: A History of Secret Codes in Modern America. Charles Berret. Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy under the Executive Committee of the Graduate School of Arts and Sciences, Columbia University, 2019. © 2018 Charles Berret. All rights reserved.

Abstract. This dissertation examines the origins of the political and scientific commitments that currently frame cryptography, the study of secret codes, arguing that these commitments took shape over the course of the twentieth century. Looking back to the nineteenth century, cryptography was rarely practiced systematically, let alone scientifically, nor was it the contentious political subject it has become in the digital age. Beginning with the rise of computational cryptography in the first half of the twentieth century, this history identifies a quarter-century gap beginning in the late 1940s, when cryptography research was classified and tightly controlled in the US. Open research in cryptography reemerged in the early 1970s, a course of events that was directly opposed by many members of the US intelligence community; at the same time, a wave of political scandals unrelated to cryptography during the Nixon years made the secrecy surrounding cryptography appear untenable, weakening the official capacity to enforce this classification. Today, the subject of cryptography remains highly political and adversarial, with many proponents gripped by the conviction that widespread access to strong cryptography is necessary for a free society in the digital age, while opponents contend that strong cryptography in fact presents a danger to society and the rule of law.
- The Legendary William F. Friedman
The Legendary William F. Friedman, by Lambros D. Callimahos. Reprinted from Cryptologic Spectrum (Vol. 4, No. 1), Winter 1974.

Prologue. When I joined the U.S. Army to enter the cryptologic service of my adopted country on February 11, 1941, William F. Friedman was already a legendary figure in my eyes. I had read two early papers of his, "L'indice de coincidence et ses applications en cryptographie" and "Application des methodes de la statistique a la cryptographie," when I was living in Paris in 1934, concertizing as a flute soloist throughout Europe while privately pursuing an active hobby of cryptology, which I felt would be my niche when the time came for me to enter the army. (Besides the U.S. Army, I was also liable for service in the Greek, Egyptian, and Turkish armies ...)

[...] cryptanalysis, and since I was presumably indispensable on weekdays, pulled KP only on Saturdays and Sundays. Tired of waiting, I went to Officer Candidate School and graduated with my class in August 1942. It was at Vint Hill that Mr. Friedman first paid us a visit, and we were all properly impressed at the dapper figure with the Adolphe Menjou moustache, the characteristic bow tie, and the two-tone black and white shoes -- the cryptologic giant who asked the most searching questions and understood our answers even before we had finished our explanations. Having been at Vint Hill for 14 months and thinking that I might be stuck there for the duration ...
- Word forms -- not just their lengths -- are optimized for efficient communication (arXiv:1703.01694v2 [cs.CL], 31 May 2017)
Word forms -- not just their lengths -- are optimized for efficient communication. Stephan C. Meylan and Thomas L. Griffiths, University of California, Berkeley, Psychology.

Abstract. The inverse relationship between the length of a word and the frequency of its use, first identified by G.K. Zipf in 1935, is a classic empirical law that holds across a wide range of human languages. We demonstrate that length is one aspect of a much more general property of words: how distinctive they are with respect to other words in a language. Distinctiveness plays a critical role in recognizing words in fluent speech, in that it reflects the strength of potential competitors when selecting the best candidate for an ambiguous signal. Phonological information content, a measure of a word's string probability under a statistical model of a language's sound or character sequences, concisely captures distinctiveness. Examining large-scale corpora from 13 languages, we find that distinctiveness significantly outperforms word length as a predictor of frequency. This finding provides evidence that listeners' processing constraints shape fine-grained aspects of word forms across languages.

Despite their apparent diversity, natural languages display striking structural regularities [1-3]. How such regularities relate to human cognition remains an open question with implications for linguistics, psychology, and neuroscience [2, 4-6]. Prominent among these regularities is the well-known relationship between word length and frequency: across languages, frequently used words tend to be short [7]. In a classic work, Zipf [7] posited that this pattern emerges from speakers minimizing total articulatory effort by using the shortest form for words that are used most often, following what he later called the Principle of Least Effort [8].
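One way to make "phonological information content" concrete is a character n-gram model: a word's information content is the negative log probability of its character string, so words built from rarer transitions are more distinctive. The sketch below is schematic only (a toy word list and an add-one-smoothed bigram model); it is not the authors' code, which uses large corpora and richer sequence models.

```python
from collections import Counter
from math import log2

# Toy corpus standing in for a lexicon; for illustration only.
words = ["the", "of", "and", "to", "in", "that", "information", "distinctiveness"]

# Fit a character bigram model over word-boundary-padded strings.
alphabet = set("abcdefghijklmnopqrstuvwxyz#")
bigrams = Counter()
contexts = Counter()
for w in words:
    padded = "#" + w + "#"
    for a, b in zip(padded, padded[1:]):
        bigrams[(a, b)] += 1
        contexts[a] += 1

def string_information(word):
    """Negative log2 probability of a word under the smoothed bigram model (higher = more distinctive)."""
    padded = "#" + word + "#"
    total = 0.0
    for a, b in zip(padded, padded[1:]):
        p = (bigrams[(a, b)] + 1) / (contexts[a] + len(alphabet))  # add-one smoothing
        total += -log2(p)
    return total

for w in ["the", "and", "information"]:
    print(f"{w:12s} {string_information(w):6.2f} bits")
```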