IEEE Information Theory Society Newsletter


Vol. 59, No. 3, September 2009. Editor: Tracey Ho. ISSN 1059-2362

Optimal Estimation
XXX Shannon Lecture, presented in 2009 in Seoul, South Korea (extended abstract)
J. Rissanen

1 Prologue

The first quest for optimal estimation by Fisher, [2], Cramer, Rao and others, [1], dates back over half a century and has changed remarkably little. The covariance of the estimated parameters was taken as the quality measure of estimators, for which the main result, the Cramer-Rao inequality, sets a lower bound. It is reached by the maximum likelihood (ML) estimator for a restricted subclass of models and asymptotically for a wider class. The covariance, which is just one property of models, is too weak a measure to permit extension to estimation of the number of parameters, which is handled by various ad hoc criteria too numerous to list here.

Soon after I had studied Shannon's formal definition of information in random variables and his other remarkable performance bounds for communication, [4], I wanted to apply them to other fields, in particular to estimation and statistics in general. After all, the central problem in statistics is to extract information from data. After having worked on data compression and introduced arithmetic coding, it seemed evident that both estimation and data compression have a common goal: in data compression the shortest code length cannot be achieved without taking advantage of the regular features in data, while in estimation it is these regular features, the underlying mechanism, that we want to learn. This led me to introduce the MDL or Minimum Description Length principle, and I thought that the job was done.

However, when starting to prepare this lecture I found that it was difficult, for I could not connect the several in themselves meaningful results to form a nice coherent picture. It was like a jigsaw puzzle where the pieces almost fit but not quite, and, moreover, vital pieces were missing. After considerable struggle I was able to get the pieces to fit, but to do so I had to alter them all, and ignore the means and concepts introduced by the masters mentioned above. The result was separation of estimation from data compression, and we can now define optimality for all amounts of data and not just asymptotically.

2 Modeling Problem

The modeling problem begins with a set of observed data $Y = \{y_t : t = 1, 2, \ldots, n\}$, often together with other so-called explanatory data $(Y, X) = \{(y_t, x_{1,t}, x_{2,t}, \ldots)\}$. The objective is to learn properties in $Y$ expressed by a set of distributions as models

$$\{f(Y \mid X_s; \theta),\ s\}.$$

Here $s$ is a structure parameter and $\theta = \theta_1, \ldots, \theta_{k(s)}$ are real-valued parameters, whose number depends on the structure. The structure is simply a subset of the models; typically it is used to indicate the most important variables in $X$. (The traditional name for the set of models is "likelihood function", although no such concept exists in probability theory.)

The most important problem is the selection of the model class, but since its optimal selection is non-computable there is little we can say about it. It is clear that when picking it we must take into account the general type of the data, the sensitivity of the models with respect to the parameters, which is an issue in so-called robust statistics, and so on.

To simplify the notation we write the data as $x$, with the understanding that the models are distributions on $Y$, or conditional distributions on $Y$ given the explanatory data $X$ if they too are observed. (On a first reading put just $x = Y$.) We also consider only structures determined by the number of parameters, so that we already fix the sequence $\theta^k = \theta_1, \ldots, \theta_k$, also written as $\theta$. We then have the two model classes

$$\mathcal{M}_k = \{f(x; \theta, k) : \theta \in \Omega^k\}, \quad k \le n,$$

and

$$\mathcal{M} = \bigcup_{k=0}^{K} \mathcal{M}_k, \quad K \le n,$$

depending on whether we are considering the number of parameters fixed or whether it, too, is to be estimated. The latter class... [continued on page 6]

From the Editor
Tracey Ho

Dear IT Society members,

It was very nice to see many of you at ISIT in June. Many thanks to the organizers for a fantastic program and meticulously-run conference. This issue includes the reports from ISIT and ITW Volos, and announcements of awards from ISIT and elsewhere; warmest congratulations to all the award winners. I'm also delighted to include in this issue the first-ever published article on 2009 Shannon Lecturer Jorma Rissanen's new theory of optimal estimation. Jorma said that as he was preparing his Shannon Lecture, he unexpectedly encountered difficulties incorporating some of the basic results of traditional estimation theory in a coherent framework, so he set about creating a new framework for optimal estimation. His article in this issue summarizes his lecture and gives an exciting first look at these new ideas.

This issue also comes with sad news of the passing of Wesley Peterson, whose many achievements and honors include the Shannon Award in 1981. He is fondly remembered by his friends Ned Weldon and Shu Lin.

For many of us, the summer is drawing to an end, and fall term is starting soon. Do take some time to relax with our fun regular features, the Historian's Column and Golomb's Puzzle Column. Also, an article commemorating the hundredth anniversary of telecommunications pioneer Vladimir Kotelnikov is included in this issue.

Please help to make the newsletter as interesting and informative as possible by offering suggestions and contributing news. The deadlines for the next few issues are:

Issue            Deadline
December 2009    October 10, 2009
March 2010       January 10, 2010
June 2010        April 10, 2010

Announcements, news and events intended for both the printed newsletter and the website, such as award announcements, calls for nominations and upcoming conferences, can now be submitted jointly at the IT Society website http://www.itsoc.org/, using the quick links "Share News" and "Announce an Event". For more details please see the article on the new website in this issue. Articles and columns intended only for the printed newsletter should be e-mailed to me at [email protected].

Please submit ASCII, LaTeX or Word source files; do not worry about fonts or layout as this will be taken care of by IEEE layout specialists. Electronic photos and graphics should be in high resolution and sent as separate files.

I look forward to your contributions and suggestions for future issues of the newsletter.

Tracey Ho

IEEE Information Theory Society Newsletter (USPS 360-350) is published quarterly by the Information Theory Society of the Institute of Electrical and Electronics Engineers, Inc. Headquarters: 3 Park Avenue, 17th Floor, New York, NY 10016-5997. Cost is $1.00 per member per year (included in Society fee) for each member of the Information Theory Society. Printed in the U.S.A. Periodicals postage paid at New York, NY and at additional mailing offices. Postmaster: Send address changes to IEEE Information Theory Society Newsletter, IEEE, 445 Hoes Lane, Piscataway, NJ 08854. © 2009 IEEE. Information contained in this newsletter may be copied without permission provided that the copies are not made or distributed for direct commercial advantage, and the title of the publication and its date appear.

Table of Contents
Optimal Estimation: XXX Shannon Lecture (p. 1)
From the Editor (p. 2)
President's Column (p. 3)
Annual Information Theory Society Awards Announced (p. 4)
In Memoriam, Wes Peterson (p. 5)
The Historian's Column (p. 8)
IT Society Members Honored (p. 9)
Symposium Report: The 2009 IEEE International Symposium on Information Theory, Seoul, Korea (p. 9)
Workshop Report: Networking and Information Theory, Volos, Greece (p. 11)
Golomb's Puzzle Column: Finding Sums (p. 12)
The Life Filled with Cognition and Action: 100th Anniversary of Academician V.A. Kotelnikov (p. 13)
Golomb's Puzzle Column: Some Partition Problems Solutions (p. 15)
Call for Papers (p. 18)
Conference Calendar (p. 20)

President's Column
Andrea Goldsmith

I have just returned from the main annual event of our society, the International Symposium on Information Theory (ISIT). This year ISIT was held in Seoul, Korea, returning to mainland Asia for the first time since 1973, when Shannon gave the first of what would become the Shannon lectures in Ashkelon, Israel. The symposium was superb in both its organization and technical content. A diverse set of stimulating and thought-provoking plenary talks by Rich Baraniuk, David Tse, Raymond Yeung, and Noga Alon set an excellent tone for each day's sessions. The technical program was outstanding, with a broad range of...

[The column continues:] ...the model associated with a given set of observed data is the one that leads to the largest compression of the data. MDL is particularly well-suited for highly complex models where overfitting the data is a serious concern. MDL has had significant impact on the fields of inductive inference, statistical modeling, pattern recognition, and machine learning, and has been widely applied to a broad cross-section of problems in engineering as well as biology and medicine. It continues to be a very active area of research more than 20 years after its invention, including recent work by Jorma himself on MDL denoising, currently in press.
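The one-line description of MDL above (pick the model that compresses the data most) is often written as a two-part code length. The display below is a standard textbook formulation, not taken from Rissanen's extended abstract, reusing the model classes $\mathcal{M}_k$ defined in the article:

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Two-part MDL criterion (standard form): total description length =
% codelength of the data under the model + codelength of the model itself.
\[
(\hat{k}, \hat{\theta}) \;=\; \operatorname*{arg\,min}_{k \le n,\; \theta \in \Omega^k}
\Bigl\{ \underbrace{-\log f(x; \theta, k)}_{\text{data, given the model}}
\;+\; \underbrace{L(\theta, k)}_{\text{the model itself}} \Bigr\}.
\]
% A common asymptotic approximation of the model cost for k real
% parameters and n data points is L(theta, k) ~ (k/2) log n.
\end{document}
\]
```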
Recommended publications
  • Group Testing
Group Testing. Amit Kumar Sinhababu and Vikraman Choudhury, Department of Computer Science and Engineering, Indian Institute of Technology Kanpur. April 21, 2013.

1 Motivation. Our original motivation in this project was to study "coding theory in data streaming", which has two aspects: applications of error-correcting codes to efficiently solve problems in the data-streaming model, and solving coding-theory problems in the data-streaming model (for example, "Can one recognize a Reed-Solomon codeword in one pass using only poly-log space?" [1]). As we started, we were directed to a related combinatorial problem, "group testing", which is important on its own, having connections with compressed sensing, data streaming, coding theory, expanders, and derandomization. This project report surveys some of these interesting connections.

2 Group Testing. The group testing problem is to identify the set of "positives" ("defectives", or "infected", or 1s) from a large population of items, using as few tests as possible.

2.1 Definition. There is an unknown stream x ∈ {0,1}^n with at most d ones in it. We are allowed to test any subset S of the indices; the answer to the test tells whether x_i = 0 for all i ∈ S, or not (at least one x_i = 1). The objective is to design as few tests as possible (t tests) such that x can be identified as fast as possible. Group testing strategies can be either adaptive or non-adaptive: a group testing algorithm is non-adaptive if all tests must be specified without knowing the outcomes of the other tests. A decoding sketch for the non-adaptive case follows below.
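A minimal sketch of the non-adaptive setting (my own illustration, not code from the report; the pool size and test count are assumptions): items are pooled at random, and the standard COMP decoder clears every item that appears in some negative test, declaring the rest positive. COMP never misses a true positive; with enough tests it also rarely over-reports.

```python
import random

def comp_decode(tests, outcomes, n):
    """COMP rule: an item in any negative test is clean; the rest are declared positive."""
    candidates = set(range(n))
    for pool, positive in zip(tests, outcomes):
        if not positive:
            candidates -= pool          # everyone in a negative pool is cleared
    return candidates

random.seed(1)
n, d, t = 200, 4, 80                    # items, defectives, number of tests (assumed sizes)
defectives = set(random.sample(range(n), d))

# Non-adaptive design: each test pools every item independently with probability ~1/d.
tests = [{i for i in range(n) if random.random() < 1 / d} for _ in range(t)]
outcomes = [bool(pool & defectives) for pool in tests]  # positive iff the pool hits a defective

decoded = comp_decode(tests, outcomes, n)
print(decoded >= defectives, len(decoded))  # always a superset of the defectives; ideally exactly d items
```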
  • DRASIC: Distributed Recurrent Autoencoder for Scalable Image Compression
DRASIC: Distributed Recurrent Autoencoder for Scalable Image Compression. Enmao Diao (Duke University, Durham, NC, USA), Jie Ding (University of Minnesota-Twin Cities, Minneapolis, MN, USA), and Vahid Tarokh (Duke University).

Abstract. We propose a new architecture for distributed image compression from a group of distributed data sources. The work is motivated by practical needs of data-driven codec design, low power consumption, robustness, and data privacy. The proposed architecture, which we refer to as Distributed Recurrent Autoencoder for Scalable Image Compression (DRASIC), is able to train distributed encoders and one joint decoder on correlated data sources. Its compression capability is much better than the method of training codecs separately. Meanwhile, the performance of our distributed system with 10 distributed sources is only within 2 dB peak signal-to-noise ratio (PSNR) of the performance of a single codec trained with all data sources. We experiment with distributed sources of different correlations and show how well our data-driven methodology matches the Slepian-Wolf Theorem in Distributed Source Coding (DSC). To the best of our knowledge, this is the first data-driven DSC framework for general distributed code design with deep learning.

1 Introduction. It has been shown by a variety of previous works that deep neural networks (DNN) can achieve results comparable to classical image compression techniques [1-9]. Most of these methods are based on autoencoder networks and quantization of bottleneck representations. These models usually rely on an entropy codec to further compress codes. Moreover, to achieve different compression rates it is unavoidable to train multiple models with different regularization parameters separately, which is often computationally intensive. (A toy sketch of the distributed-encoder/joint-decoder structure follows below.)
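To make the structural idea concrete, here is a toy stand-in, my own sketch rather than the paper's method: linear maps replace the recurrent networks, a uniform scalar quantizer replaces the learned binarizer, and all dimensions are assumptions. It shows the key pattern of several per-source encoders trained against one shared decoder.

```python
import numpy as np

rng = np.random.default_rng(0)
m, p, K, n = 16, 4, 3, 500            # signal dim, code dim, sources, samples (assumed)

# Correlated sources: a shared component plus per-source noise.
shared = rng.normal(size=(n, m))
X = [shared + 0.3 * rng.normal(size=(n, m)) for _ in range(K)]

# One linear encoder per source, one joint decoder shared by all sources.
E = [0.1 * rng.normal(size=(m, p)) for _ in range(K)]
D = 0.1 * rng.normal(size=(p, m))

lr = 1e-2
for step in range(3000):
    for k in range(K):
        Z = X[k] @ E[k]               # encode source k
        err = Z @ D - X[k]            # reconstruction error through the shared decoder
        gD = (Z.T @ err) / n          # gradient of mean-squared error w.r.t. decoder
        gE = (X[k].T @ (err @ D.T)) / n
        D -= lr * gD
        E[k] -= lr * gE

def quantize(Z, step=0.5):
    """Uniform scalar quantizer applied to the bottleneck before 'transmission'."""
    return step * np.round(Z / step)

mse = np.mean([np.mean((quantize(X[k] @ E[k]) @ D - X[k]) ** 2) for k in range(K)])
print(f"reconstruction MSE with quantized codes: {mse:.3f}")
```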
  • Minimax Redundancy for Markov Chains with Large State Space
Minimax redundancy for Markov chains with large state space. Kedar Shriram Tatwawadi, Jiantao Jiao, Tsachy Weissman. May 8, 2018.

Abstract. For any Markov source, there exist universal codes whose normalized codelength approaches the Shannon limit asymptotically as the number of samples goes to infinity. This paper investigates how fast the gap between the normalized codelength of the "best" universal compressor and the Shannon limit (i.e., the compression redundancy) vanishes non-asymptotically, in terms of the alphabet size and mixing time of the Markov source. We show that, for Markov sources whose relaxation time is at least $1 + (2+c)/\sqrt{k}$, where $k$ is the state space size (and $c > 0$ is a constant), the phase transition for the number of samples required to achieve vanishing compression redundancy is precisely $\Theta(k^2)$.

1 Introduction. For any data source that can be modeled as a stationary ergodic stochastic process, it is well known in the literature of universal compression that there exist compression algorithms which, without any knowledge of the source distribution, approach the fundamental limit of the source, also known as the Shannon entropy, as the number of observations tends to infinity. The existence of universal data compressors has spurred a huge wave of research. A large fraction of practical lossless compressors are based on the Lempel-Ziv algorithms [ZL77, ZL78] and their variants, and the normalized codelength of a universal source code is also widely used to measure the compressibility of the source, based on the idea that the normalized codelength is "close" to the true entropy rate given a moderate number of samples. (The redundancy notion used above is written out below.)
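For readers meeting the term for the first time, the redundancy in the abstract is the standard minimax quantity; the display below is the usual textbook definition (my rendering, not the paper's notation):

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Worst-case (minimax) average redundancy of the best universal code Q_n
% over a model class M; the paper takes M = Markov chains on k states.
\[
\bar{R}_n(\mathcal{M}) \;=\; \min_{Q_n}\; \max_{P \in \mathcal{M}}\;
\mathbb{E}_P\!\left[\log \frac{P(X_1^n)}{Q_n(X_1^n)}\right].
\]
% "Vanishing compression redundancy" then means that the best universal
% codelength approaches the entropy rate per symbol: R_n(M)/n -> 0.
\end{document}
```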
  • Digital Communication Systems 2.2 Optimal Source Coding
Digital Communication Systems, EES 452. Asst. Prof. Dr. Prapun Suksompong, [email protected]. 2. Source Coding; 2.2 Optimal Source Coding: Huffman Coding: Origin, Recipe, MATLAB Implementation.

Examples of prefix codes: the nonsingular fixed-length code, the Shannon-Fano code, and the Huffman code.

Shannon-Fano code (Prof. Robert Fano, 1917-2016; Shannon Award, 1976): proposed in Shannon's "A Mathematical Theory of Communication" in 1948, the method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949), "The transmission of information", Technical Report No. 65, Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. It should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon-Fano-Elias coding (also known as Elias coding), the precursor to arithmetic coding. (A runnable version of the Huffman recipe follows below.)

Claude E. Shannon Award recipients: Claude E. Shannon (1972), David S. Slepian (1974), Robert M. Fano (1976), Peter Elias (1977), Mark S. Pinsker (1978), Jacob Wolfowitz (1979), W. Wesley Peterson (1981), Irving S. Reed (1982), Robert G. Gallager (1983), Solomon W. Golomb (1985), William L. Root (1986), James L. ..., Elwyn R. Berlekamp (1993), Aaron D. Wyner (1994), G. David Forney, Jr. (1995), Imre Csiszár (1996), Jacob Ziv (1997), Neil J. A. Sloane (1998), Tadao Kasami (1999), Thomas Kailath (2000), Jack Keil Wolf (2001), Toby Berger (2002), Lloyd R. Welch (2003), Sergio Verdú (2007), Robert M. Gray (2008), Jorma Rissanen (2009), Te Sun Han (2010), Shlomo Shamai (Shitz) (2011), Abbas El Gamal (2012), Katalin Marton (2013), János Körner (2014), Arthur Robert Calderbank (2015), Alexander S. Holevo (2016), David Tse (2017)
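The Huffman recipe named in the slide titles is short enough to show in full. A minimal sketch (the course materials use MATLAB; this Python port is mine): repeatedly merge the two lowest-weight subtrees, prefixing "0" to one side's codewords and "1" to the other's.

```python
import heapq
from collections import Counter

def huffman_code(freqs):
    """Build a binary prefix code from {symbol: weight} (assumes >= 2 symbols)."""
    heap = [(w, i, {sym: ""}) for i, (sym, w) in enumerate(freqs.items())]
    heapq.heapify(heap)
    tie = len(heap)                      # tiebreaker so tuples never compare dicts
    while len(heap) > 1:
        w1, _, c1 = heapq.heappop(heap)  # pop the two least-frequent subtrees...
        w2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + b for s, b in c1.items()}
        merged.update({s: "1" + b for s, b in c2.items()})
        heapq.heappush(heap, (w1 + w2, tie, merged))  # ...merge and reinsert
        tie += 1
    return heap[0][2]

freqs = Counter("this is an example for huffman coding")
code = huffman_code(freqs)
avg = sum(freqs[s] * len(code[s]) for s in code) / sum(freqs.values())
print(code["f"], code[" "], f"average {avg:.3f} bits/symbol")
```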
  • Principles of Communications ECS 332
Principles of Communications, ECS 332. Asst. Prof. Dr. Prapun Suksompong (ผศ.ดร.ประพันธ์ สุขสมปอง), [email protected]. 1. Intro to Communication Systems. Office hours: check the Google Calendar on the course website. Dr. Prapun's office: 6th floor of the Sirindhralai building, BKD.

Remark 1: If the downloaded file crashed your device/browser, try another one posted on the course website. Remark 2: There are also three more sections from the appendices of the lecture notes.

Shannon's insight: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." (Shannon, Claude, "A Mathematical Theory of Communication", 1948.)

Shannon: Father of the Information Age. A documentary co-produced by the Jacobs School, UCSD-TV, and the California Institute for Telecommunications and Information Technology. [http://www.uctv.tv/shows/Claude-Shannon-Father-of-the-Information-Age-6090] [http://www.youtube.com/watch?v=z2Whj_nL-x8]

C. E. Shannon (1916-2001): "Hello. I'm Claude Shannon, a mathematician here at the Bell Telephone Laboratories." He didn't create the compact disc, the fax machine, digital wireless telephones, or MP3 files, but in 1948 Claude Shannon paved the way for all of them with the basic theory underlying digital communications and storage; he called it information theory. [https://www.youtube.com/watch?v=47ag2sXRDeU] One of the most influential minds of the 20th century, yet when he died on February 24, 2001, Shannon was virtually unknown to the public at large.
  • Nested Tailbiting Convolutional Codes for Secrecy, Privacy, and Storage
Nested Tailbiting Convolutional Codes for Secrecy, Privacy, and Storage. Thomas Jerkovits (German Aerospace Center, Weßling, Germany), Onur Günlü (TU Berlin, Berlin, Germany), Vladimir Sidorenko and Gerhard Kramer (TU Munich, Munich, Germany).

ABSTRACT. A key agreement problem is considered that has a biometric or physical identifier, a terminal for key enrollment, and a terminal for reconstruction. A nested convolutional code design is proposed that performs vector quantization during enrollment and error control during reconstruction. Physical identifiers with small bit error probability illustrate the gains of the design. One variant of the nested convolutional codes improves on the best known key vs. storage rate ratio, but it has high complexity. A second variant with lower complexity performs similarly to nested polar codes. The results suggest that the choice of code for key agreement with identifiers depends primarily on the complexity constraint.

[From the introduction:] ...them as physical "one-way functions" that are easy to compute and difficult to invert [33]. There are several security, privacy, storage, and complexity constraints that a PUF-based key agreement method should fulfill. First, the method should not leak information about the secret key (negligible secrecy leakage). Second, the method should leak as little information about the identifier as possible (minimum privacy leakage). The privacy leakage constraint can be considered as an upper bound on the secrecy leakage via the public information of the first enrollment of a PUF about the secret key generated by the second enrollment of the same PUF [12]. Third, one should limit the stor...
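The constraints listed in the excerpt have a standard information-theoretic form. As a hedged summary (the usual model for key agreement with identifiers, not equations quoted from this paper): let X^n be the enrollment measurement, Y^n the reconstruction measurement, K the secret key, and W the public helper data; then one asks for

```latex
\documentclass{article}
\usepackage{amsmath}
\begin{document}
% Common requirements for PUF/biometric key agreement (standard
% formalization; delta -> 0 with the block length n).
\begin{align*}
\Pr[\hat{K} \neq K] &\le \delta && \text{(reliable key reconstruction)}\\
\tfrac{1}{n}\, I(K; W) &\le \delta && \text{(negligible secrecy leakage)}\\
\tfrac{1}{n}\, I(X^n; W) &\le L + \delta && \text{(privacy leakage at most $L$)}\\
\tfrac{1}{n} \log_2 \lvert\mathcal{W}\rvert &\le R_{\mathrm{w}} + \delta && \text{(storage rate at most $R_{\mathrm{w}}$)}
\end{align*}
\end{document}
```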
  • MASTER of ADVANCED STUDY: New Professional Degrees for Engineers, University of California, San Diego
Pulse: Jacobs School of Engineering News, Summer 2011. In this issue: Entrepreneurism Center; Research Expo 2011; In Memory of Jack Wolf. Cover story: Master of Advanced Study, New Professional Degrees for Engineers. University of California, San Diego.

Dean's column: New Interdisciplinary Degree Programs for Engineering Professionals. The most exciting and innovative engineering often occurs on the interface between traditional disciplines. We are extending our interdisciplinary collaborations, which have always been at the core of the Jacobs School culture, to new graduate education programs for engineering professionals. Beginning this fall, the Jacobs School will offer four new interdisciplinary Master of Advanced Study (MAS) programs for working engineers: Wireless Embedded Systems, Medical Device Engineering, Structural Health Monitoring, and Simulation-Based Engineering. These master degree programs are engineering equivalents of MBA programs at business management schools. Geared to early- to mid-career engineers with practical work experience, our new MAS programs align faculty research strengths with industry workforce needs. The curricula are always jointly offered by two academic departments, so that the training focuses in a practical way on industry-specific application areas that are not available through traditional master degree programs.

Jacobs School of Engineering Leadership. Dean: Frieder Seible. Associate Deans: Jeanne Ferrante, Charles Tu. Associate Dean for Administration and Finance: Steve Ross. Executive Director of External Relations: Denine Hagen. Academic Departments: Bioengineering: Shankar Subramanian, Chair; Computer Science and Engineering: Rajesh Gupta, Chair; Electrical and Computer Engineering: Yeshaiahu Fainman, Chair; Mechanical and Aerospace Engineering: Sutanu Sarkar, Chair; NanoEngineering: Kenneth Vecchio, Chair.
  • Information Theory and Statistics: a Tutorial
Foundations and Trends™ in Communications and Information Theory, Volume 1, Issue 4, 2004.

Editorial Board. Editor-in-Chief: Sergio Verdú, Department of Electrical Engineering, Princeton University, Princeton, New Jersey 08544, USA.

Editors: Venkat Anantharam (Berkeley), Ezio Biglieri (Torino), Giuseppe Caire (Eurecom), Roger Cheng (Hong Kong), K.C. Chen (Taipei), Daniel Costello (Notre Dame), Thomas Cover (Stanford), Anthony Ephremides (Maryland), Andrea Goldsmith (Stanford), Dave Forney (MIT), Georgios Giannakis (Minnesota), Joachim Hagenauer (Munich), Te Sun Han (Tokyo), Babak Hassibi (Caltech), Michael Honig (Northwestern), Johannes Huber (Erlangen), Hideki Imai (Tokyo), Rodney Kennedy (Canberra), Sanjeev Kulkarni (Princeton), Amos Lapidoth (ETH Zurich), Bob McEliece (Caltech), Neri Merhav (Technion), David Neuhoff (Michigan), Alon Orlitsky (San Diego), Vincent Poor (Princeton), Kannan Ramchandran (Berkeley), Bixio Rimoldi (EPFL), Shlomo Shamai (Technion), Amin Shokrollahi (EPFL), Gadiel Seroussi (HP-Palo Alto), Wojciech Szpankowski (Purdue), Vahid Tarokh (Harvard), David Tse (Berkeley), Ruediger Urbanke (EPFL), Steve Wicker (Georgia Tech), Raymond Yeung (Hong Kong), Bin Yu (Berkeley).

Editorial Scope. Foundations and Trends™ in Communications and Information Theory will publish survey and tutorial articles in the following topics: coded modulation; coding theory and practice; communication complexity; communication system design; cryptology; multiuser detection; multiuser information theory; optical communication channels; pattern recognition and learning; ...
  • IEEE Information Theory Society Newsletter
IEEE Information Theory Society Newsletter, Vol. 63, No. 3, September 2013. Editor: Tara Javidi. ISSN 1059-2362. Editorial committee: Ioannis Kontoyiannis, Giuseppe Caire, Meir Feder, Tracey Ho, Joerg Kliewer, Anand Sarwate, Andy Singer, and Sergio Verdú.

Annual Awards Announced. The main annual awards of the IEEE Information Theory Society were announced at the 2013 ISIT in Istanbul this summer.

• The 2014 Claude E. Shannon Award goes to János Körner. He will give the Shannon Lecture at the 2014 ISIT in Hawaii.

• The 2013 Claude E. Shannon Award was given to Katalin Marton in Istanbul. Katalin presented her Shannon Lecture on the Wednesday of the Symposium. If you wish to see her slides again or were unable to attend, a copy of the slides has been posted on our Society website.

• The 2013 Aaron D. Wyner Distinguished Service Award goes to Daniel J. Costello.

• The 2013 IT Society Paper Award was given to Shrinivas Kudekar, Tom Richardson, and Rüdiger Urbanke for their paper "Threshold Saturation via Spatial Coupling: ..."

• The 2013 IEEE Jack Keil Wolf ISIT Student Paper Awards were selected and announced at the banquet of the Istanbul Symposium. The winners were: 1) Mohammad H. Yassaee, for the paper "A Technique for Deriving One-Shot Achievability Results in Network Information Theory", co-authored with Mohammad R. Aref and Amin A. Gohari; and 2) Mansoor I. Yousefi, for the paper "Integrable Communication Channels and the Nonlinear Fourier Transform", co-authored with Frank R. Kschischang.

• Several members of our community became IEEE Fellows or received IEEE Medals; please see our website for more information: www.itsoc.org/honors
  • Network Information Theory
Network Information Theory. This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

Abbas El Gamal is the Hitachi America Chaired Professor in the School of Engineering and the Director of the Information Systems Laboratory in the Department of Electrical Engineering at Stanford University. In the field of network information theory, he is best known for his seminal contributions to the relay, broadcast, and interference channels; multiple description coding; coding for noisy networks; and energy-efficient packet scheduling and throughput-delay tradeoffs in wireless networks. He is a Fellow of IEEE and the winner of the 2012 Claude E. Shannon Award, the highest honor in the field of information theory.

Young-Han Kim is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, San Diego.
  • Itnl0908.Pdf
IEEE Information Theory Society Newsletter, Vol. 58, No. 3, September 2008. Editor: Daniela Tuninetti. ISSN 1059-2362.

Annual IT Awards Announced. The principal annual awards of the Information Theory Society were announced at the Toronto ISIT. The 2009 Shannon Award goes to Jorma Rissanen. The 2008 Wyner Award goes to Vincent Poor. The winners of the 2008 IT Paper Award are two 2006 IT Transactions papers on compressed sensing, by David Donoho and by Emmanuel Candes and Terence Tao, respectively (with acknowledgment of an earlier paper by Candes, Romberg and Tao). The winner of the 2008 Joint IT/ComSoc Paper Award is a Communications Transactions paper on Accumulate-Repeat-Accumulate Codes by Aliazam Abbasfar, Dariush Divsalar and Kung Yao. Three student authors of ISIT papers received ISIT Student Paper Awards: Paul Cuff of Stanford, Satish Korada of EPFL, and Yury Polyanskiy of Princeton. Finally, the 2008 Chapter of the Year Award was presented to the Kitchener-...

[The second column continues:] ...of the U.S. National Academy of Engineering (2007 to date). He will be General Co-Chair of the 2009 ISIT in Seoul. The Information Theory Society Paper Award is given annually to an outstanding publication in the fields of interest to the Society appearing anywhere during the preceding two calendar years. The winners of the 2008 award are "Compressed sensing," by David Donoho, which appeared in the April 2006 IEEE Transactions on Information Theory, and "Near-optimal signal recovery from random projections: universal encoding strategies," by Emmanuel Candes and Terence Tao, which appeared in the December 2006 Transactions. These two overlapping papers are the foundations of the exciting new field of compressed sensing.