Symposium in Remembrance of Rudolf Ahlswede
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
Digital Communication Systems 2.2 Optimal Source Coding
Digital Communication Systems EES 452. Asst. Prof. Dr. Prapun Suksompong ([email protected]). 2. Source Coding, 2.2 Optimal Source Coding: Huffman Coding: Origin, Recipe, MATLAB Implementation.
Examples of prefix codes: nonsingular fixed-length code, Shannon–Fano code, Huffman code.
Prof. Robert Fano (1917-2016), Shannon Award (1976).
Shannon–Fano code: proposed in Shannon's "A Mathematical Theory of Communication" in 1948. The method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949). "The transmission of information". Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. Should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding.
Claude E. Shannon Award recipients: Claude E. Shannon (1972), David S. Slepian (1974), Robert M. Fano (1976), Peter Elias (1977), Mark S. Pinsker (1978), Jacob Wolfowitz (1979), W. Wesley Peterson (1981), Irving S. Reed (1982), Robert G. Gallager (1983), Solomon W. Golomb (1985), William L. Root (1986), Elwyn R. Berlekamp (1993), Aaron D. Wyner (1994), G. David Forney, Jr. (1995), Imre Csiszár (1996), Jacob Ziv (1997), Neil J. A. Sloane (1998), Tadao Kasami (1999), Thomas Kailath (2000), Jack Keil Wolf (2001), Toby Berger (2002), Lloyd R. Welch (2003), Sergio Verdu (2007), Robert M. Gray (2008), Jorma Rissanen (2009), Te Sun Han (2010), Shlomo Shamai (Shitz) (2011), Abbas El Gamal (2012), Katalin Marton (2013), János Körner (2014), Arthur Robert Calderbank (2015), Alexander S. Holevo (2016), David Tse (2017), James L.
-
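The Huffman "recipe" this lecture outline refers to can be sketched in a few lines. The following is an illustrative Python version (the course's own MATLAB implementation is not reproduced here): build the code by repeatedly merging the two least-frequent groups and prepending a bit to every symbol in each merged group.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a Huffman code (symbol -> bitstring) from a frequency map."""
    # Each heap entry: (weight, unique tiebreak, {symbol: code-so-far}).
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate single-symbol source
        (_, _, table), = heap
        return {sym: "0" for sym in table}
    tiebreak = len(heap)
    while len(heap) > 1:
        w1, _, t1 = heapq.heappop(heap)   # two least-frequent groups
        w2, _, t2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in t1.items()}
        merged.update({s: "1" + c for s, c in t2.items()})
        heapq.heappush(heap, (w1 + w2, tiebreak, merged))
        tiebreak += 1
    return heap[0][2]

text = "abracadabra"
code = huffman_code(Counter(text))
encoded = "".join(code[ch] for ch in text)
```

Because the code is prefix-free, the encoded bitstream can be decoded greedily bit by bit; for this input the most frequent symbol "a" always gets a 1-bit codeword.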
Principles of Communications ECS 332
Principles of Communications ECS 332. Asst. Prof. Dr. Prapun Suksompong ([email protected]). 1. Intro to Communication Systems. Office hours: check the Google Calendar on the course website. Dr. Prapun's office: 6th floor of Sirindhralai building, BKD.
Remark 1: If the downloaded file crashed your device/browser, try another one posted on the course website.
Remark 2: There are also three more sections from the appendices of the lecture notes.
Shannon's insight: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." (Shannon, Claude. A Mathematical Theory of Communication, 1948.)
Shannon: Father of the Information Age. Documentary co-produced by the Jacobs School, UCSD-TV, and the California Institute for Telecommunications and Information Technology. [http://www.uctv.tv/shows/Claude-Shannon-Father-of-the-Information-Age-6090] [http://www.youtube.com/watch?v=z2Whj_nL-x8]
C. E. Shannon (1916-2001): "Hello. I'm Claude Shannon, a mathematician here at the Bell Telephone Laboratories." He didn't create the compact disc, the fax machine, digital wireless telephones, or MP3 files, but in 1948 Claude Shannon paved the way for all of them with the basic theory underlying digital communications and storage; he called it information theory. [https://www.youtube.com/watch?v=47ag2sXRDeU] One of the most influential minds of the 20th century, yet when he died on February 24, 2001, Shannon was virtually unknown to the public at large.
-
arXiv:1306.1586v4 [quant-ph]
Strong converse for the classical capacity of entanglement-breaking and Hadamard channels via a sandwiched Rényi relative entropy. Mark M. Wilde, Andreas Winter, Dong Yang. May 23, 2014.
Abstract: A strong converse theorem for the classical capacity of a quantum channel states that the probability of correctly decoding a classical message converges exponentially fast to zero in the limit of many channel uses if the rate of communication exceeds the classical capacity of the channel. Along with a corresponding achievability statement for rates below the capacity, such a strong converse theorem enhances our understanding of the capacity as a very sharp dividing line between achievable and unachievable rates of communication. Here, we show that such a strong converse theorem holds for the classical capacity of all entanglement-breaking channels and all Hadamard channels (the complementary channels of the former). These results follow by bounding the success probability in terms of a "sandwiched" Rényi relative entropy, by showing that this quantity is subadditive for all entanglement-breaking and Hadamard channels, and by relating this quantity to the Holevo capacity. Prior results regarding strong converse theorems for particular covariant channels emerge as a special case of our results.
1 Introduction
One of the most fundamental tasks in quantum information theory is the transmission of classical data over many independent uses of a quantum channel, such that, for a fixed rate of communication, the error probability of the transmission decreases to zero in the limit of many channel uses. The maximum rate at which this is possible for a given channel is known as the classical capacity of the channel.
-
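The "sandwiched" Rényi relative entropy the abstract refers to, D̃_α(ρ‖σ) = (α−1)⁻¹ log Tr[(σ^((1−α)/2α) ρ σ^((1−α)/2α))^α], can be evaluated numerically for small density matrices. The sketch below is for illustration only (it assumes σ is full rank, and it is not code from the paper); for commuting (diagonal) states it reduces to the classical Rényi divergence.

```python
import numpy as np

def mpow(H, p):
    """Fractional power of a Hermitian PSD matrix via eigendecomposition."""
    vals, vecs = np.linalg.eigh(H)
    vals = np.clip(vals, 0.0, None)  # guard tiny negative eigenvalues
    return (vecs * vals**p) @ vecs.conj().T

def sandwiched_renyi(rho, sigma, alpha):
    """Sandwiched Renyi relative entropy of density matrices rho, sigma."""
    s = mpow(sigma, (1 - alpha) / (2 * alpha))   # the "sandwich" factor
    inner = s @ rho @ s
    return np.log(np.trace(mpow(inner, alpha)).real) / (alpha - 1)
```

For diagonal ρ = diag(p) and σ = diag(q) with α = 2 this gives log Σᵢ pᵢ²/qᵢ, the classical Rényi divergence of order 2, which makes a convenient sanity check.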
Jacob Wolfowitz 1910–1981
NATIONAL ACADEMY OF SCIENCES. Jacob Wolfowitz, March 19, 1910–July 16, 1981. A Biographical Memoir by Shelemyahu Zacks. Any opinions expressed in this memoir are those of the author and do not necessarily reflect the views of the National Academy of Sciences. Biographical Memoirs, Volume 82, published 2002 by the National Academy Press, Washington, D.C.
Jacob Wolfowitz, a giant among the founders of modern statistics, will always be remembered for his originality, deep thinking, clear mind, excellence in teaching, and vast contributions to statistical and information sciences. I met Wolfowitz for the first time in 1957, when he spent a sabbatical year at the Technion, Israel Institute of Technology. I was at the time a graduate student and a statistician at the building research station of the Technion. I had read papers of Wald and Wolfowitz before, and for me the meeting with Wolfowitz was a great opportunity to associate with a great scholar who was very kind to me and most helpful. I took his class at the Technion on statistical decision theory. Outside the classroom we used to spend time together over a cup of coffee or in his office discussing statistical problems. He gave me a lot of his time, as though I was his student. His advice on the correct approach to the theory of statistics accompanied my development as a statistician for many years to come. Later we kept in touch, mostly by correspondence and in meetings of the Institute of Mathematical Statistics. I saw him the last time in his office at the University of South Florida in Tampa, where he spent the last years of his life.
-
Information Theory and Statistics: a Tutorial
Foundations and Trends™ in Communications and Information Theory, Volume 1, Issue 4, 2004.
Editorial Board. Editor-in-Chief: Sergio Verdú, Department of Electrical Engineering, Princeton University, Princeton, New Jersey 08544, USA ([email protected]).
Editors: Venkat Anantharam (Berkeley), Amos Lapidoth (ETH Zurich), Ezio Biglieri (Torino), Bob McEliece (Caltech), Giuseppe Caire (Eurecom), Neri Merhav (Technion), Roger Cheng (Hong Kong), David Neuhoff (Michigan), K.C. Chen (Taipei), Alon Orlitsky (San Diego), Daniel Costello (Notre Dame), Vincent Poor (Princeton), Thomas Cover (Stanford), Kannan Ramchandran (Berkeley), Anthony Ephremides (Maryland), Bixio Rimoldi (EPFL), Andrea Goldsmith (Stanford), Shlomo Shamai (Technion), Dave Forney (MIT), Amin Shokrollahi (EPFL), Georgios Giannakis (Minnesota), Gadiel Seroussi (HP-Palo Alto), Joachim Hagenauer (Munich), Wojciech Szpankowski (Purdue), Te Sun Han (Tokyo), Vahid Tarokh (Harvard), Babak Hassibi (Caltech), David Tse (Berkeley), Michael Honig (Northwestern), Ruediger Urbanke (EPFL), Johannes Huber (Erlangen), Steve Wicker (Georgia Tech), Hideki Imai (Tokyo), Raymond Yeung (Hong Kong), Rodney Kennedy (Canberra), Bin Yu (Berkeley), Sanjeev Kulkarni (Princeton).
Editorial Scope. Foundations and Trends™ in Communications and Information Theory will publish survey and tutorial articles in the following topics:
• Coded modulation
• Multiuser detection
• Coding theory and practice
• Multiuser information theory
• Communication complexity
• Optical communication channels
• Communication system design
• Pattern recognition and learning
• Cryptology
-
Network Information Theory
Network Information Theory This comprehensive treatment of network information theory and its applications pro- vides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon’s point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless com- munication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia. Abbas El Gamal is the Hitachi America Chaired Professor in the School of Engineering and the Director of the Information Systems Laboratory in the Department of Electri- cal Engineering at Stanford University. In the field of network information theory, he is best known for his seminal contributions to the relay, broadcast, and interference chan- nels; multiple description coding; coding for noisy networks; and energy-efficient packet scheduling and throughput–delay tradeoffs in wireless networks. He is a Fellow of IEEE and the winner of the 2012 Claude E. Shannon Award, the highest honor in the field of information theory. Young-Han Kim is an Assistant Professor in the Department of Electrical and Com- puter Engineering at the University of California, San Diego. -
Rakesh V. Vohra
Rakesh V. Vohra
Department of Economics and the Department of Electrical and Systems Engineering, University of Pennsylvania, Philadelphia, PA 19104. Electronic mail: [email protected], [email protected]. August 15th, 2020.
Experience
• (2013 to date) George A. Weiss and Lydia Bravo Weiss University Professor, Department of Economics and the Department of Electrical and Systems Engineering, University of Pennsylvania.
• (2013 to date) Secondary appointment, Department of Computer and Information Sciences, University of Pennsylvania.
• (1999 to 2013) John L. and Helen Kellogg Professor of Managerial Economics and Decision Sciences, Kellogg School of Management, Northwestern University.
• (2010 to 2013) Professor (Courtesy), Department of Economics, Weinberg College of Arts and Sciences, Northwestern University.
• (2006 to 2013) Professor (Courtesy), Electrical Engineering and Computer Science, McCormick School, Northwestern University.
• (1998 to 1999) Professor of Managerial Economics and Decision Sciences, Kellogg School of Management, Northwestern University.
• (1991 to 1998) Associate Professor of Management Science and Industrial Engineering, Fisher College of Business, The Ohio State University.
• (1985 to 1991) Assistant Professor of Management Science, Fisher College of Business, The Ohio State University.
• (Summer 2011, 2012, 2014) Visiting Researcher, Microsoft Research, New England.
• (Summer 2006) Visiting Researcher, Microsoft Research, Redmond.
• (Winter 2005) Visiting Scholar, California Institute of Technology.
• (Spring 2004, 2006, 2011, 2016) Visiting Professor, Indian School of Business.
• (Summer 2003) METEOR Fellow, Department of Quantitative Economics, University of Maastricht.
• (September 2000) Cycle and Carriage Visiting Professor, Department of Decision Sciences, College of Business, National University of Singapore.
• (1997-1998) Visiting Associate Professor, The Kellogg School of Management, Northwestern University.
• (1992-1993, Summer '96) Visiting Associate Professor, The Sloan School, MIT.
-
Reflections on Compressed Sensing, by E
IEEE Information Theory Society Newsletter, Vol. 58, No. 4, December 2008. Editor: Daniela Tuninetti. ISSN 1059-2362.
Source Coding and Simulation. XXIX Shannon Lecture, presented at the 2008 IEEE International Symposium on Information Theory, Toronto, Canada. Robert M. Gray.
Prologue. A unique aspect of the Shannon Lecture is the daunting fact that the lecturer has a year to prepare for (obsess over?) a single lecture. The experience begins with the comfort of a seemingly infinite time horizon and ends with a relativity-like speedup of time as the date approaches. I early on adopted a few guidelines: I had (1) a great excuse to review my more than four decades of information theoretic activity and historical threads extending even further back, (2) a strong desire to avoid repeating the topics and content I had worn thin during 2006–07 as an intercontinental itinerant lecturer for the Signal Processing Society, (3) an equally strong desire to revive some of my favorite topics from the mid 1970s—my most active and focused period doing unadulterated information theory, and (4) a strong wish to add something new taking advantage of hindsight and experience, preferably a new twist on some old ideas that had not been previously fully exploited.
[Figure 1: Source coding and simulation. Source coding/compression/quantization: a source {Xn} is mapped by an encoder to bits and by a decoder to a reproduction {X̂n}. Simulation/synthesis/fake process: random bits are mapped by a coder to a synthesized process {X̃n}.]
The goal of source coding [1] is to communicate or transmit the source through a constrained, discrete, noiseless com-
-
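The source-coding pipeline in Figure 1 (source to encoder to bits to decoder to reproduction) can be mimicked with a toy fixed-rate uniform scalar quantizer. This sketch is for illustration only; the parameters are invented and this is not Gray's construction.

```python
import numpy as np

def encode(x, rate_bits, lo=-4.0, hi=4.0):
    """Encoder: map each sample to an integer index (rate_bits bits/sample)."""
    levels = 2 ** rate_bits
    step = (hi - lo) / levels
    return np.clip(((x - lo) / step).astype(int), 0, levels - 1)

def decode(idx, rate_bits, lo=-4.0, hi=4.0):
    """Decoder: map each index back to the midpoint of its quantization cell."""
    step = (hi - lo) / 2 ** rate_bits
    return lo + (idx + 0.5) * step

rng = np.random.default_rng(0)
x = rng.standard_normal(10_000)          # the source {Xn}
xhat = decode(encode(x, 6), 6)           # the reproduction {X̂n} at 6 bits/sample
mse = np.mean((x - xhat) ** 2)           # distortion of the reproduction
```

At 6 bits per sample over [-4, 4] the cell width is 0.125, so every in-range sample is reproduced to within 0.0625 and the mean squared error is roughly step²/12.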
IEEE Information Theory Society Newsletter
IEEE Information Theory Society Newsletter, Vol. 66, No. 3, September 2016. Editor: Michael Langberg. ISSN 1059-2362. Editorial committee: Frank Kschischang, Giuseppe Caire, Meir Feder, Tracey Ho, Joerg Kliewer, Anand Sarwate, Andy Singer, Sergio Verdú, and Parham Noorzad.
President's Column. Alon Orlitsky.
ISIT in Barcelona was true joy. Like a Gaudi masterpiece, it was artfully planned and flawlessly executed. Five captivating plenaries covered the gamut from communication and coding to graphs and satisfiability, while Alexander Holevo's Shannon Lecture reviewed the fascinating development of quantum channels from his early theoretical contributions to today's near-practical innovations. The remaining 620 talks were presented in 9 parallel sessions and, extrapolating from those I attended, were all superb. Two records were broken: with 888 participants this was the largest ISIT outside the US, and at 540 Euro registration, the most economical in a decade. Apropos records, and with the recent Rio Games, isn't ISIT like the Olympics, except even better? As with the ultimate sports event, we convene to show our best results, further our
sion for our community along with extensive experience as board member, associate editor, and a first-class researcher. Be-earlied congratulations, Madam President. Other decisions and announcements concerned the society's major awards, including three Jack Wolf student paper awards and the James Massey Young Scholars Award that went to Andrea Montanari. Surprisingly, the IT Paper Award was given to two papers, "The Capacity Region of the Two-Receiver Gaussian Vector Broadcast Channel With Private and Common Messages" by Yanlin Geng and Chandra Nair, and "Fundamental Limits of Caching" by Mohammad Maddah-Ali and Urs Niesen. While Paper Awards were given to two related papers recently, not since 1972 were papers on different topics jointly awarded.
-
Nonparametric Statistics: A Brief Introduction
A Brief Introduction. Curtis Greene, December 4, 2003. Statistics Across the Curriculum – p.1.
Nonparametric tests: examples
• Sign Test
• Mann-Whitney (= Wilcoxon Rank Sum) Test
• Wilcoxon Signed Rank Test
• Tukey's Quick Test
• Wald-Wolfowitz Runs Test
• Kruskal-Wallis Test
• Friedman Test
• Spearman's Rho / Kendall's Tau
• Chi-Squared Test
• Kolmogorov-Smirnov Test
• Log Rank Test
Nonparametric tests: examples (with the parametric settings they address)
• Sign Test (t-test/location)
• Mann-Whitney (= Wilcoxon Rank Sum) Test
• Wilcoxon Signed Rank Test (t-test/location)
• Tukey's Quick Test (t-test/location)
• Wald-Wolfowitz Runs Test (t-test/location)
• Kruskal-Wallis Test (ANOVA)
• Friedman Test (ANOVA)
• Spearman's Rho / Kendall's Tau (Correlation)
• Chi-Squared Test (Goodness of fit)
• Kolmogorov-Smirnov Test (Goodness of fit)
• Log Rank Test (Survival data)
And possibly even more... Some statisticians would add to this list:
• Binomial Test (comparing sample proportions)
This despite the obvious presence of "parameters" being compared.
Definition of "nonparametric"
• There is not complete agreement about the precise meaning of "nonparametric test". Some statisticians are more generous than others in what they would include.
• Fortunately, it's not a distinction worth arguing about. Nobody seems to care much about the semantics here.
• The next slide contains a definition that seems to work pretty well. It is paraphrased from Practical Nonparametric Statistics, by W. J. Conover, which is an excellent book on this subject.
Definition of "nonparametric": A test is nonparametric if it satisfies at least one of the following conditions:
• It may be applied to categorical (nominal) data.
-
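The simplest entry on this list, the sign test, is easy to carry out by hand or in code: discard ties, count the signs of the paired differences, and compute an exact binomial p-value. A minimal sketch (the data below are invented for illustration, not from the slides):

```python
from math import comb

def sign_test_p(diffs):
    """Two-sided exact sign test for 'median difference is zero'."""
    signs = [d for d in diffs if d != 0]     # ties are discarded
    n = len(signs)
    k = sum(d > 0 for d in signs)            # number of positive differences
    k = min(k, n - k)                        # fold to the smaller tail
    tail = sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, 2 * tail)                # double the one-sided tail

p = sign_test_p([1.2, 0.4, -0.3, 2.1, 0.9, 1.5, 0.8, -0.2, 1.1, 0.6])
```

With 8 positive and 2 negative differences out of 10, the two-sided p-value is 2·(C(10,0)+C(10,1)+C(10,2))/2¹⁰ = 0.109, so this small sample would not reject a zero median at the 5% level.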
Engineering Theory and Mathematics in the Early Development of Information Theory
Engineering Theory and Mathematics in the Early Development of Information Theory. Lav R. Varshney, School of Electrical and Computer Engineering, Cornell University.
Abstract—A historical study of the development of information theory advancements in the first few years after Claude E. Shannon's seminal work of 1948 is presented. The relationship between strictly mathematical advances, information theoretic evaluations of existing communications systems, and the development of engineering theory for applications is explored. It is shown that the contributions of American communications engineering theorists are directly tied to the socially constructed […] the role of American mathematical scientists, and it is also shown that the early advancement of information theory is linked to the mutual interactions between these two social groups.
I. INTRODUCTION
[…] information theory in the American engineering community and also touch on the course that information theory took in the American mathematics community. As the early development of information theory was being made, social meanings of the nascent field were just being constructed and incorporated by the two different social groups, mathematical scientists and communications engineering theorists. The meanings constructed had significant influence on the direction that future developments took, both in goals and in methodology. Mutual interaction between the two social groups was also highly influenced by these social meanings. In the half century since its birth, Shannon information theory has paved the way for electronic systems such as compact discs, cellular phones, and modems, all built upon the developments and social meanings of its first few years.
-
Multivariate Decoding of Reed-Solomon Codes
UC San Diego Electronic Theses and Dissertations. Title: Algebraic List-Decoding of Error-Correcting Codes. Permalink: https://escholarship.org/uc/item/68q346tn. Author: Parvaresh, Farzad. Publication Date: 2007. Peer reviewed | Thesis/dissertation. eScholarship.org, powered by the California Digital Library, University of California.
UNIVERSITY OF CALIFORNIA, SAN DIEGO. Algebraic List-Decoding of Error-Correcting Codes. A dissertation submitted in partial satisfaction of the requirements for the degree Doctor of Philosophy in Electrical Engineering (Communications Theory and Systems) by Farzad Parvaresh. Committee in charge: Professor Alexander Vardy (Chair), Professor Ilya Dumer, Professor Alon Orlitsky, Professor Paul H. Siegel, Professor Jack K. Wolf. 2007. Copyright Farzad Parvaresh, 2007. All rights reserved. The dissertation of Farzad Parvaresh is approved, and it is acceptable in quality and form for publication on microfilm. To my parents.
Table of contents: Signature Page; Dedication; Table of Contents; List of Figures; Acknowledgements; Vita and Publications; Abstract of the Dissertation. Chapter 1, Introduction: 1.1 Contributions; 1.2 Waiting for a bus in front of a pub (1.2.1 Reed-Solomon codes; 1.2.2 Guruswami-Sudan algorithm; 1.2.3 Multivariate interpolation decoding algorithm); 1.3 Waiting for a bus on an April Fool's day; 1.4 A new kind of Tic-Tac-Toe with multiplicity. Chapter 2, Multivariate decoding of Reed-Solomon codes: 2.1 Introduction; 2.2 Preliminaries; 2.3 Error-correction radius of the MID algorithm
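The Reed-Solomon codes at the heart of this thesis can be sketched in a few lines over a small prime field: a message of k symbols defines a polynomial of degree less than k, and the codeword is its evaluations at n distinct points, so any k error-free positions determine the message. The toy below (using the invented prime 929 and plain Lagrange interpolation) only illustrates encoding and erasure recovery; it is not the Guruswami-Sudan or multivariate list-decoding algorithm itself.

```python
P = 929  # a small prime; all arithmetic is in the toy field GF(929)

def poly_mul(a, b):
    """Multiply two coefficient lists mod P (lowest-degree coefficient first)."""
    out = [0] * (len(a) + len(b) - 1)
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            out[i + j] = (out[i + j] + x * y) % P
    return out

def rs_encode(msg, n):
    """Evaluate the message polynomial (coefficients in msg) at points 0..n-1."""
    return [sum(c * pow(x, i, P) for i, c in enumerate(msg)) % P for x in range(n)]

def interpolate(points, k):
    """Lagrange interpolation mod P: recover k coefficients from k (x, y) pairs."""
    coeffs = [0] * k
    for j, (xj, yj) in enumerate(points):
        num, denom = [1], 1                 # basis polynomial for point j
        for m, (xm, _) in enumerate(points):
            if m != j:
                num = poly_mul(num, [-xm % P, 1])
                denom = denom * (xj - xm) % P
        scale = yj * pow(denom, P - 2, P) % P  # Fermat inverse of the denominator
        for i, c in enumerate(num):
            coeffs[i] = (coeffs[i] + scale * c) % P
    return coeffs

msg = [3, 1, 4, 1, 5]        # k = 5 message symbols
cw = rs_encode(msg, 12)      # n = 12 codeword symbols, so 7 erasures are tolerable
```

Interpolating from any 5 of the 12 codeword positions reproduces the original coefficients, which is exactly the redundancy that list-decoding algorithms push further by also tolerating erroneous (not just missing) positions.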