Universal Source Coding for Multiple Decoders with Side Information

Total Pages: 16

File Type: pdf, Size: 1020 KB

2010 IEEE International Symposium on Information Theory Proceedings (ISIT 2010)
Austin, Texas, USA, 13–18 June 2010
Pages 1–908 · IEEE Catalog Number: CFP10SIF-PRT · ISBN: 978-1-4244-7890-3 · Volume 1/3

TABLE OF CONTENTS

Plenary Sessions
  Monday 8:30–9:30: Completely Random Measures for Bayesian Nonparametrics (xlv) – Michael I. Jordan
  Tuesday 8:30–9:30: Coding for Noisy Networks (xlv) – Abbas El Gamal
  Wednesday 8:30–9:30: THE AUDACITY OF THROUGHPUT – A Trilogy of Rates (xlv) – Anthony Ephremides
  Thursday 8:30–9:30: Musing upon Information Theory (xliv) – Te Sun Han
  Friday 8:30–9:30: Can Structure Beat Shannon? – The Secrets of Lattice-Codes (xlv) – Ram Zamir

S-Mo-1: MULTI-TERMINAL SOURCE CODING
  S-Mo-1.1: UNIVERSAL SOURCE CODING FOR MULTIPLE DECODERS WITH SIDE INFORMATION (p. 1) – Shigeaki Kuzuoka, Akisato Kimura, Tomohiko Uyematsu
  S-Mo-1.2: ROBUST MULTIRESOLUTION CODING WITH HAMMING DISTORTION MEASURE (p. 6) – Jun Chen, Sorina Dumitrescu, Ying Zhang, Jia Wang
  S-Mo-1.3: ON ZERO-ERROR SOURCE CODING WITH FEEDBACK (p. 11) – Mayank Bakshi, Michelle Effros
  S-Mo-1.4: RATE DISTORTION WITH SIDE-INFORMATION AT MANY RECEIVERS (p. 16) – Roy Timo, Terence Chan, Alex Grant

S-Mo-2: GAUSSIAN MULTI-TERMINAL SOURCE CODING I
  S-Mo-2.1: THE GENERALIZED QUADRATIC GAUSSIAN CEO PROBLEM: NEW CASES WITH TIGHT RATE REGION AND APPLICATIONS (p. 21) – Yang Yang, Yifu Zhang, Zixiang Xiong
  S-Mo-2.2: SUM RATE OF THE VACATIONING CEO PROBLEM (p. 26) – Rajiv Soundararajan, Aaron Wagner, Sriram Vishwanath
  S-Mo-2.3: CASCADE AND TRIANGULAR SOURCE CODING WITH SIDE INFORMATION AT THE FIRST TWO NODES (p. 31) – Haim Permuter, Tsachy Weissman
  S-Mo-2.4: ON THE EXCESS DISTORTION EXPONENT OF THE QUADRATIC-GAUSSIAN WYNER-ZIV PROBLEM (p. 36) – Yuval Kochman, Gregory W. Wornell

S-Mo-3: GAUSSIAN MULTI-TERMINAL SOURCE CODING II
  S-Mo-3.1: ON THE SUM-RATE LOSS OF QUADRATIC GAUSSIAN MULTITERMINAL SOURCE CODING (p. 41) – Yang Yang, Yifu Zhang, Zixiang Xiong
  S-Mo-3.2: ON THE SUM RATE OF VECTOR GAUSSIAN MULTITERMINAL SOURCE CODING (p. 46) – Jia Wang, Jun Chen
  S-Mo-3.3: DISTRIBUTED SOURCE CODING OF CORRELATED MEMORYLESS GAUSSIAN OBSERVATIONS (p. 51) – Yasutada Oohama
  S-Mo-3.4: RATE REGION OF THE GAUSSIAN SCALAR-HELP-VECTOR SOURCE-CODING PROBLEM (p. 56) – Md Saifur Rahman, Aaron Wagner

S-Mo-4: RATE-DISTORTION THEORY
  S-Mo-4.1: INTERACTION STRICTLY IMPROVES THE WYNER-ZIV RATE-DISTORTION FUNCTION (p. 61) – Nan Ma, Prakash Ishwar
  S-Mo-4.2: ON THE RATE DISTORTION FUNCTION OF BERNOULLI GAUSSIAN SEQUENCES (p. 66) – Cheng Chang
  S-Mo-4.3: ON THE PHYSICS OF RATE-DISTORTION THEORY (p. 71) – Neri Merhav
  S-Mo-4.4: IMPROVED UPPER BOUNDS TO THE CAUSAL QUADRATIC RATE-DISTORTION FUNCTION FOR GAUSSIAN STATIONARY SOURCES (p. 76) – Milan Derpich, Jan Ostergaard

S-Tu-1: SOURCE CODING UNDER CONSTRAINTS
  S-Tu-1.1: ON OPTIMAL CAUSAL CODING OF PARTIALLY OBSERVED MARKOV SOURCES UNDER CLASSICAL AND NON-CLASSICAL INFORMATION STRUCTURES (p. 81) – Serdar Yuksel
  S-Tu-1.2: STRUCTURE THEOREM FOR REAL-TIME VARIABLE-RATE LOSSY SOURCE ENCODERS AND MEMORY-LIMITED DECODERS WITH SIDE INFORMATION (p. 86) – Yonatan Kaspi, Neri Merhav
  S-Tu-1.3: SOURCE CODING WITH SIDE INFORMATION USING LIST DECODING (p. 91) – Mortuza Ali, Margreta Kuijper
  S-Tu-1.4: SOURCE CODING WITH FEEDFORWARD USING THE POSTERIOR MATCHING SCHEME (p. 96) – Hani-James Ebeid, Todd Coleman

S-Tu-2: CODING FOR SOURCES I
  S-Tu-2.1: ASYMPTOTIC OPTIMALITY OF ANTIDICTIONARY CODES (p. 101) – Takahiro Ota, Hiroyoshi Morita
  S-Tu-2.2: FRACTAL COMPRESSION RATE CURVES IN LOSSLESS COMPRESSION OF BALANCED TREES (p. 106) – Sang-Youn Oh, John Kieffer
  S-Tu-2.3: LOCAL COORDINATES OF A TRELLIS SOURCE CODE (p. 111) – John Kieffer, John Marcos
  S-Tu-2.4: CASES WHERE FINDING THE MINIMUM ENTROPY COLORING OF A CHARACTERISTIC GRAPH IS A POLYNOMIAL TIME PROBLEM (p. 116) – Soheil Feizi, Muriel Medard

S-Tu-3: CODING FOR SOURCES II
  S-Tu-3.1: ON THE ERROR EXPONENT TO REDUNDANCY RATIO OF INTERACTIVE ENCODING AND DECODING (p. 121) – Jin Meng, En-Hui Yang
  S-Tu-3.2: ON THE ACHIEVABLE REDUNDANCY RATE OF FIXED LENGTH SOURCE CODE FOR GENERAL SOURCES (p. 126) – Mitsuharu Arimura, Ken-Ichi Iwata
  S-Tu-3.3: ON THE OVERFLOW PROBABILITY OF LOSSLESS CODES WITH SIDE INFORMATION (p. 131) – Ryo Nomura, Toshiyasu Matsushima

S-Tu-4: JOINT SOURCE-CHANNEL CODING
  S-Tu-4.1: A TIGHT BOUND ON THE PERFORMANCE OF A MINIMAL DELAY JOINT SOURCE-CHANNEL CODING SCHEME (p. 136) – Marius Kleiner, Bixio Rimoldi
  S-Tu-4.2: ON OPTIMALITY OF A HYBRID DIGITAL/ANALOG SCHEME FOR WYNER-ZIV CODING OVER BROADCAST CHANNELS (p. 141) – Yang Gao, Ertem Tuncel
  S-Tu-4.3: THE ACHIEVABLE DISTORTION REGION OF BIVARIATE GAUSSIAN SOURCE ON GAUSSIAN BROADCAST CHANNEL (p. 146) – Chao Tian, Suhas Diggavi, Shlomo Shamai (Shitz)
  S-Tu-4.4: DISTRIBUTED SIGNAL CANCELATION INSPIRED BY WITSENHAUSEN'S COUNTEREXAMPLE (p. 151) – Pulkit Grover, Anant Sahai
  S-Tu-4.5: DISTORTION EXPONENTS OF TWO-WAY RELAYING COOPERATIVE NETWORKS (p. 156) – Jing Wang, Jie Liang

S-We-2: CODING FOR SOURCE NETWORKS I
  S-We-2.1: EXACT-REPAIR MDS CODES FOR DISTRIBUTED STORAGE USING INTERFERENCE ALIGNMENT (p. 161) – Changho Suh, Kannan Ramchandran
  S-We-2.2: INTERFERENCE NEUTRALIZATION IN DISTRIBUTED LOSSY SOURCE CODING (p. 166) – Mohammad Ali Maddah-Ali, David Tse
  S-We-2.3: HAMMING CODING FOR MULTIPLE SOURCES (p. 171) – Rick Ma, Samuel Cheng
  S-We-2.4: TWO-WAY SOURCE CODING THROUGH A RELAY (p. 176) – Han-I Su, Abbas El Gamal

S-Th-1: CODING FOR SOURCE NETWORKS II
  S-Th-1.1: OPTIMAL MULTIRESOLUTION QUANTIZATION FOR BROADCAST CHANNELS WITH RANDOM INDEX ASSIGNMENT (p. 181) – Fei Teng, En-Hui Yang, Xiang Yu
  S-Th-1.2: UNIVERSAL SLEPIAN-WOLF SOURCE CODES USING LOW-DENSITY PARITY-CHECK MATRICES (p. 186) – Tetsunao Matsuta, Tomohiko Uyematsu, Ryutaroh Matsumoto
  S-Th-1.3: IDENTIFICATION AND LOSSY RECONSTRUCTION IN NOISY DATABASES (p. 191) – Ertem Tuncel, Deniz Gunduz
  S-Th-1.4: ON THE MEMORY REQUIRED TO COMPUTE FUNCTIONS OF STREAMING DATA (p. 196) – Krishnamurthy Viswanathan

S-Th-2: CAPACITY
  S-Th-2.1: THE CAPACITY OF THE UNIFORM NOISE REWRITABLE CHANNEL WITH AVERAGE COST (p. 201) – Luis A. Lastras-Montaño, Michele M. Franceschini, Thomas Mittelholzer
  S-Th-2.2: A FOKKER-PLANCK DIFFERENTIAL EQUATION APPROACH FOR THE ...
Recommended publications
  • Group Testing
    Group Testing. Amit Kumar Sinhababu and Vikraman Choudhury, Department of Computer Science and Engineering, Indian Institute of Technology Kanpur, April 21, 2013.

    1 Motivation. Our original motivation in this project was to study "coding theory in data streaming", which has two aspects. • Applications of error-correcting codes to efficiently solve problems in the model of data streaming. • Solving coding theory problems in the model of data streaming. For example, "Can one recognize a Reed-Solomon codeword in one pass using only poly-log space?" [1]. As we started, we were directed to a related combinatorial problem, "group testing", which is important on its own, having connections with "Compressed Sensing", "Data Streaming", "Coding Theory", "Expanders", and "Derandomization". This project report surveys some of these interesting connections.

    2 Group Testing. The group testing problem is to identify the set of "positives" ("defectives", or "infected", or 1) from a large set of population/items, using as few tests as possible.

    2.1 Definition. There is an unknown stream x ∈ {0, 1}^n with at most d ones in it. We are allowed to test any subset S of the indices. The answer to the test tells whether x_i = 0 for all i ∈ S, or not (at least one x_i = 1). The objective is to design as few tests as possible (t tests) such that x can be identified as fast as possible. Group testing strategies can be either adaptive or non-adaptive. A group testing algorithm is non-adaptive if all tests must be specified without knowing the outcomes of the other tests.
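A minimal sketch of the non-adaptive setting described above (not from the report): a random pooled-test design combined with the simple "COMP" decoding rule, which declares an item non-defective as soon as it appears in a negative test. The parameters and the Bernoulli design are illustrative assumptions.

```python
import random

def run_tests(x, tests):
    """Pooled outcomes: a test is positive iff it contains at least one defective item."""
    return [any(x[i] for i in S) for S in tests]

def comp_decode(n, tests, outcomes):
    """COMP rule: any item appearing in a negative test is declared non-defective."""
    candidate = [True] * n
    for S, positive in zip(tests, outcomes):
        if not positive:
            for i in S:
                candidate[i] = False
    return [i for i in range(n) if candidate[i]]

if __name__ == "__main__":
    random.seed(0)
    n, d, t = 100, 3, 60                          # items, defectives, number of tests
    defectives = set(random.sample(range(n), d))
    x = [1 if i in defectives else 0 for i in range(n)]
    # Bernoulli test design: each item joins each test independently with prob ~ 1/d.
    tests = [[i for i in range(n) if random.random() < 1.0 / d] for _ in range(t)]
    print(sorted(defectives))
    print(comp_decode(n, tests, run_tests(x, tests)))  # always a superset of the true defectives
```

COMP never misses a defective; with enough tests the false positives it may report become unlikely.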
  • DRASIC Distributed Recurrent Autoencoder for Scalable
    DRASIC: Distributed Recurrent Autoencoder for Scalable Image Compression. Enmao Diao and Vahid Tarokh (Duke University, Durham, NC 27701, USA); Jie Ding (University of Minnesota-Twin Cities, Minneapolis, MN 55455, USA).

    Abstract. We propose a new architecture for distributed image compression from a group of distributed data sources. The work is motivated by practical needs of data-driven codec design, low power consumption, robustness, and data privacy. The proposed architecture, which we refer to as Distributed Recurrent Autoencoder for Scalable Image Compression (DRASIC), is able to train distributed encoders and one joint decoder on correlated data sources. Its compression capability is much better than the method of training codecs separately. Meanwhile, the performance of our distributed system with 10 distributed sources is only within 2 dB peak signal-to-noise ratio (PSNR) of the performance of a single codec trained with all data sources. We experiment with distributed sources of different correlations and show how our data-driven methodology well matches the Slepian-Wolf Theorem in Distributed Source Coding (DSC). To the best of our knowledge, this is the first data-driven DSC framework for general distributed code design with deep learning.

    1 Introduction. It has been shown by a variety of previous works that deep neural networks (DNN) can achieve comparable results as classical image compression techniques [1-9]. Most of these methods are based on autoencoder networks and quantization of bottleneck representations. These models usually rely on entropy codec to further compress codes. Moreover, to achieve different compression rates it is unavoidable to train multiple models with different regularization parameters separately, which is often computationally intensive.
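The structural idea in the abstract is one encoder per data source feeding a single shared decoder, trained jointly. The toy PyTorch skeleton below illustrates only that idea on synthetic correlated sources; it is an assumption-laden simplification, with none of the paper's recurrent units, bottleneck quantization, or entropy coding.

```python
import torch
import torch.nn as nn

class Encoder(nn.Module):
    """One encoder per source: maps an input vector to a short code."""
    def __init__(self, dim_in: int, dim_code: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_in, 64), nn.ReLU(), nn.Linear(64, dim_code))

    def forward(self, x):
        return self.net(x)

class JointDecoder(nn.Module):
    """Single decoder shared by all sources: reconstructs a source from its code."""
    def __init__(self, dim_code: int, dim_out: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(dim_code, 64), nn.ReLU(), nn.Linear(64, dim_out))

    def forward(self, z):
        return self.net(z)

num_sources, dim, dim_code = 4, 32, 8
encoders = [Encoder(dim, dim_code) for _ in range(num_sources)]
decoder = JointDecoder(dim_code, dim)
params = [p for e in encoders for p in e.parameters()] + list(decoder.parameters())
opt = torch.optim.Adam(params, lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(200):
    # Correlated toy sources: a shared component plus small per-source noise.
    common = torch.randn(128, dim)
    batches = [common + 0.1 * torch.randn(128, dim) for _ in range(num_sources)]
    # Every encoder compresses its own source; one joint decoder reconstructs all of them.
    loss = sum(loss_fn(decoder(enc(x)), x) for enc, x in zip(encoders, batches))
    opt.zero_grad()
    loss.backward()
    opt.step()
```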
  • Digital Communication Systems 2.2 Optimal Source Coding
    Digital Communication Systems, EES 452. Asst. Prof. Dr. Prapun Suksompong. 2. Source Coding. 2.2 Optimal Source Coding: Huffman Coding: Origin, Recipe, MATLAB Implementation.

    Examples of Prefix Codes: Nonsingular Fixed-Length Code; Shannon–Fano code; Huffman Code.

    Prof. Robert Fano (1917-2016), Shannon Award (1976). Shannon–Fano Code: Proposed in Shannon's "A Mathematical Theory of Communication" in 1948. The method was attributed to Fano, who later published it as a technical report: Fano, R.M. (1949). "The transmission of information". Technical Report No. 65. Cambridge (Mass.), USA: Research Laboratory of Electronics at MIT. Should not be confused with Shannon coding, the coding method used to prove Shannon's noiseless coding theorem, or with Shannon–Fano–Elias coding (also known as Elias coding), the precursor to arithmetic coding.

    Claude E. Shannon Award recipients: Claude E. Shannon (1972), David S. Slepian (1974), Robert M. Fano (1976), Peter Elias (1977), Mark S. Pinsker (1978), Jacob Wolfowitz (1979), W. Wesley Peterson (1981), Irving S. Reed (1982), Robert G. Gallager (1983), Solomon W. Golomb (1985), William L. Root (1986), James L. ..., Elwyn R. Berlekamp (1993), Aaron D. Wyner (1994), G. David Forney, Jr. (1995), Imre Csiszár (1996), Jacob Ziv (1997), Neil J. A. Sloane (1998), Tadao Kasami (1999), Thomas Kailath (2000), Jack Keil Wolf (2001), Toby Berger (2002), Lloyd R. Welch (2003), Sergio Verdu (2007), Robert M. Gray (2008), Jorma Rissanen (2009), Te Sun Han (2010), Shlomo Shamai (Shitz) (2011), Abbas El Gamal (2012), Katalin Marton (2013), János Körner (2014), Arthur Robert Calderbank (2015), Alexander S. Holevo (2016), David Tse (2017).
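The slides point to a MATLAB implementation; as a language-neutral illustration of the Huffman recipe (repeatedly merge the two least probable groups, prefixing one branch with 0 and the other with 1), here is a small Python sketch. It is not the course's code, and the example distribution is made up.

```python
import heapq
from typing import Dict

def huffman_code(probs: Dict[str, float]) -> Dict[str, str]:
    """Build a binary Huffman code for a finite probability distribution."""
    # Heap entries: (group probability, tie-break counter, {symbol: partial codeword}).
    heap = [(p, i, {sym: ""}) for i, (sym, p) in enumerate(probs.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        p0, _, code0 = heapq.heappop(heap)   # least probable group
        p1, _, code1 = heapq.heappop(heap)   # second least probable group
        # Prefix one group's codewords with '0', the other's with '1', then merge.
        merged = {s: "0" + c for s, c in code0.items()}
        merged.update({s: "1" + c for s, c in code1.items()})
        heapq.heappush(heap, (p0 + p1, counter, merged))
        counter += 1
    return heap[0][2]

print(huffman_code({"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}))
# e.g. {'a': '0', 'b': '10', 'c': '110', 'd': '111'}: lengths 1, 2, 3, 3 match -log2(p)
```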
  • Principles of Communications ECS 332
    Principles of Communications, ECS 332. Asst. Prof. Dr. Prapun Suksompong. 1. Intro to Communication Systems. Office Hours: Check Google Calendar on the course website. Dr. Prapun's Office: 6th floor of Sirindhralai building, BKD.

    Remark 1: If the downloaded file crashed your device/browser, try another one posted on the course website. Remark 2: There are also three more sections from the Appendices of the lecture notes.

    Shannon's insight: "The fundamental problem of communication is that of reproducing at one point either exactly or approximately a message selected at another point." Shannon, Claude. A Mathematical Theory of Communication (1948).

    Shannon: Father of the Information Age. Documentary co-produced by the Jacobs School, UCSD-TV, and the California Institute for Telecommunications and Information Technology. [http://www.uctv.tv/shows/Claude-Shannon-Father-of-the-Information-Age-6090] [http://www.youtube.com/watch?v=z2Whj_nL-x8]

    C. E. Shannon (1916-2001): "Hello. I'm Claude Shannon, a mathematician here at the Bell Telephone laboratories." He didn't create the compact disc, the fax machine, digital wireless telephones, or mp3 files, but in 1948 Claude Shannon paved the way for all of them with the basic theory underlying digital communications and storage; he called it information theory. [https://www.youtube.com/watch?v=47ag2sXRDeU] One of the most influential minds of the 20th century, yet when he died on February 24, 2001, Shannon was virtually unknown to the public at large.
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 59, No. 3, September 2009. Editor: Tracey Ho. ISSN 1059-2362.

    Optimal Estimation. XXX Shannon Lecture, presented in 2009 in Seoul, South Korea (extended abstract). J. Rissanen.

    1 Prologue. The first quest for optimal estimation by Fisher [2], Cramer, Rao and others [1], dates back to over half a century and has changed remarkably little. The covariance of the estimated parameters was taken as the quality measure of estimators, for which the main result, the Cramer-Rao inequality, sets a lower bound. It is reached by the maximum likelihood (ML) estimator for a restricted subclass of models and asymptotically for a wider class. The covariance, which is just one property of models, is too weak a measure to permit extension to estimation of the number of parameters, which is handled by various ad hoc criteria too numerous to list here. Soon after I had studied Shannon's formal definition of information in random variables and his other remarkable performance bounds for communication [4], I wanted to apply them to other fields – in particular to estimation and statistics in general.

    2 Modeling Problem. The modeling problem begins with a set of observed data Y = {y_t : t = 1, 2, ..., n}, often together with other so-called explanatory data Y, X = {(y_t, x_{1,t}, x_{2,t}, ...)}. The objective is to learn properties in Y expressed by a set of distributions as models f(Y | X_s; θ). Here s is a structure parameter and θ = θ_1, ..., θ_{k(s)} real-valued parameters, whose number depends on the structure. The structure is simply a subset of the models. Typically it is used to indicate the most important variables in X. (The traditional name for the set of models is 'likelihood function' although no such concept exists in probability theory.) The most important problem is the selection of the model ...
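For reference, the Cramer-Rao inequality invoked in the Prologue can be stated in its standard scalar form (a textbook statement, not taken from the lecture): for any unbiased estimator of θ,

```latex
\operatorname{Var}_{\theta}\!\big(\hat{\theta}\big) \;\ge\; \frac{1}{I(\theta)},
\qquad
I(\theta) = \mathbb{E}_{\theta}\!\left[\left(\frac{\partial}{\partial \theta}\log f(Y;\theta)\right)^{2}\right],
```

where I(θ) is the Fisher information; as the excerpt notes, the ML estimator attains the bound for a restricted subclass of models and asymptotically for a wider class.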
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 53, No. 4, December 2003. Editor: Lance C. Pérez. ISSN 1059-2362.

    The Shannon Lecture: Hidden Markov Models and the Baum-Welch Algorithm. Lloyd R. Welch.

    Content of This Talk. The lectures of previous Shannon Lecturers fall into several categories such as introducing new areas of research, resuscitating areas of research, surveying areas identified with the lecturer, or reminiscing on the career of the lecturer. In this talk I decided to restrict the subject to the Baum-Welch "algorithm" and some of the ideas that led to its development. I am sure that most of you are familiar with Markov chains and Markov processes. They are natural models for various communication channels in which channel conditions change with time. In many cases it is not the state sequence of the model which is observed but the effects of the process on a signal. That is, the states are not observable but some functions, possibly random, of the states are observed.

    [...] what the 'running variable' is. Of particular use will be the concept of conditional probability and recursive factorization. The recursive factorization idea says that the joint probability of a collection of events can be expressed as a product of conditional probabilities, where each is the probability of an event conditioned on all previous events. For example, let A, B, and C be three events. Then Pr(A ∩ B ∩ C) = Pr(A) Pr(B | A) Pr(C | A ∩ B). Using the bracket notation, we can display the recursive factorization of the joint probability distribution of a sequence of discrete random variables:
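The excerpt is cut off just before the displayed equation; the standard chain-rule factorization it introduces is, for discrete random variables X_1, ..., X_n,

```latex
\Pr(X_1 = x_1, \ldots, X_n = x_n)
  = \Pr(X_1 = x_1)\prod_{k=2}^{n}\Pr\big(X_k = x_k \mid X_1 = x_1, \ldots, X_{k-1} = x_{k-1}\big),
```

which is the form that the Markov property then simplifies in hidden Markov models.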
  • Nested Tailbiting Convolutional Codes for Secrecy, Privacy, and Storage
    Nested Tailbiting Convolutional Codes for Secrecy, Privacy, and Storage. Thomas Jerkovits (German Aerospace Center, Weßling, Germany), Onur Günlü (TU Berlin, Berlin, Germany), Vladimir Sidorenko and Gerhard Kramer (TU Munich, Munich, Germany).

    ABSTRACT. A key agreement problem is considered that has a biometric or physical identifier, a terminal for key enrollment, and a terminal for reconstruction. A nested convolutional code design is proposed that performs vector quantization during enrollment and error control during reconstruction. Physical identifiers with small bit error probability illustrate the gains of the design. One variant of the nested convolutional codes improves on the best known key vs. storage rate ratio but it has high complexity. A second variant with lower complexity performs similar to nested polar codes. The results suggest that the choice of code for key agreement with identifiers depends primarily on the complexity constraint.

    [...] them as physical "one-way functions" that are easy to compute and difficult to invert [33]. There are several security, privacy, storage, and complexity constraints that a PUF-based key agreement method should fulfill. First, the method should not leak information about the secret key (negligible secrecy leakage). Second, the method should leak as little information about the identifier (minimum privacy leakage). The privacy leakage constraint can be considered as an upper bound on the secrecy leakage via the public information of the first enrollment of a PUF about the secret key generated by the second enrollment of the same PUF [12]. Third, one should limit the stor...
  • MASTER of ADVANCED STUDY New Professional Degrees for Engineers University of California, San Diego of California, University
    Jacobs School of Engineering News: Pulse, Summer 2011. University of California, San Diego. Entrepreneurism Center • Research Expo 2011 • In Memory of Jack Wolf. MASTER OF ADVANCED STUDY: New Professional Degrees for Engineers.

    Dean's Column: New Interdisciplinary Degree Programs for Engineering Professionals. The most exciting and innovative engineering often occurs on the interface between traditional disciplines. We are extending our interdisciplinary collaborations — which have always been at the core of the Jacobs School culture — to new graduate education programs for engineering professionals. Beginning this fall, the Jacobs School will offer four new interdisciplinary Master of Advanced Study (MAS) programs for working engineers: Wireless Embedded Systems, Medical Device Engineering, Structural Health Monitoring, and Simulation-Based Engineering. These master degree programs are engineering equivalents of MBA programs at business management schools. Geared to early- to mid-career engineers with practical work experience, our new MAS programs align faculty research strengths with industry workforce needs. The curricula are always jointly offered by two academic departments, so that the training focuses in a practical way on industry-specific application areas that are not available through traditional master degree programs.

    Jacobs School of Engineering Leadership. Dean: Frieder Seible; Associate Dean: Jeanne Ferrante; Associate Dean: Charles Tu; Associate Dean for Administration and Finance: Steve Ross; Executive Director of External Relations: Denine Hagen. Academic Departments. Bioengineering: Shankar Subramanian, Chair; Computer Science and Engineering: Rajesh Gupta, Chair; Electrical and Computer Engineering: Yeshaiahu Fainman, Chair; Mechanical and Aerospace Engineering: Sutanu Sarkar, Chair; NanoEngineering: Kenneth Vecchio, Chair.
  • Information Theory and Statistics: a Tutorial
    Foundations and Trends™ in Communications and Information Theory, Volume 1, Issue 4, 2004.

    Editorial Board. Editor-in-Chief: Sergio Verdú, Department of Electrical Engineering, Princeton University, Princeton, New Jersey 08544, USA. Editors: Venkat Anantharam (Berkeley), Amos Lapidoth (ETH Zurich), Ezio Biglieri (Torino), Bob McEliece (Caltech), Giuseppe Caire (Eurecom), Neri Merhav (Technion), Roger Cheng (Hong Kong), David Neuhoff (Michigan), K.C. Chen (Taipei), Alon Orlitsky (San Diego), Daniel Costello (Notre Dame), Vincent Poor (Princeton), Thomas Cover (Stanford), Kannan Ramchandran (Berkeley), Anthony Ephremides (Maryland), Bixio Rimoldi (EPFL), Andrea Goldsmith (Stanford), Shlomo Shamai (Technion), Dave Forney (MIT), Amin Shokrollahi (EPFL), Georgios Giannakis (Minnesota), Gadiel Seroussi (HP-Palo Alto), Joachim Hagenauer (Munich), Wojciech Szpankowski (Purdue), Te Sun Han (Tokyo), Vahid Tarokh (Harvard), Babak Hassibi (Caltech), David Tse (Berkeley), Michael Honig (Northwestern), Ruediger Urbanke (EPFL), Johannes Huber (Erlangen), Steve Wicker (Georgia Tech), Hideki Imai (Tokyo), Raymond Yeung (Hong Kong), Rodney Kennedy (Canberra), Bin Yu (Berkeley), Sanjeev Kulkarni (Princeton).

    Editorial Scope. Foundations and Trends™ in Communications and Information Theory will publish survey and tutorial articles in the following topics: • Coded modulation • Multiuser detection • Coding theory and practice • Multiuser information theory • Communication complexity • Optical communication channels • Communication system design • Pattern recognition and learning • Cryptology
  • IEEE Information Theory Society Newsletter
    IEEE Information Theory Society Newsletter, Vol. 63, No. 3, September 2013. Editor: Tara Javidi. ISSN 1059-2362. Editorial committee: Ioannis Kontoyiannis, Giuseppe Caire, Meir Feder, Tracey Ho, Joerg Kliewer, Anand Sarwate, Andy Singer, and Sergio Verdú.

    Annual Awards Announced. The main annual awards of the IEEE Information Theory Society were announced at the 2013 ISIT in Istanbul this summer. (Pictured: János Körner, Daniel Costello.)

    • The 2014 Claude E. Shannon Award goes to János Körner. He will give the Shannon Lecture at the 2014 ISIT in Hawaii.

    • The 2013 Claude E. Shannon Award was given to Katalin Marton in Istanbul. Katalin presented her Shannon Lecture on the Wednesday of the Symposium. If you wish to see her slides again or were unable to attend, a copy of the slides has been posted on our Society website.

    • The 2013 Aaron D. Wyner Distinguished Service Award goes to Daniel J. Costello.

    • The 2013 IT Society Paper Award was given to Shrinivas Kudekar, Tom Richardson, and Rüdiger Urbanke for their paper "Threshold Saturation via Spatial Coupling: ..."

    • The 2013 IEEE Jack Keil Wolf ISIT Student Paper Awards were selected and announced at the banquet of the Istanbul Symposium. The winners were the following: 1) Mohammad H. Yassaee, for the paper "A Technique for Deriving One-Shot Achievability Results in Network Information Theory", co-authored with Mohammad R. Aref and Amin A. Gohari; 2) Mansoor I. Yousefi, for the paper "Integrable Communication Channels and the Nonlinear Fourier Transform", co-authored with Frank R. Kschischang.

    • Several members of our community became IEEE Fellows or received IEEE Medals; please see our website for more information: www.itsoc.org/honors
  • Network Information Theory
    Network Information Theory. This comprehensive treatment of network information theory and its applications provides the first unified coverage of both classical and recent results. With an approach that balances the introduction of new models and new coding techniques, readers are guided through Shannon's point-to-point information theory, single-hop networks, multihop networks, and extensions to distributed computing, secrecy, wireless communication, and networking. Elementary mathematical tools and techniques are used throughout, requiring only basic knowledge of probability, whilst unified proofs of coding theorems are based on a few simple lemmas, making the text accessible to newcomers. Key topics covered include successive cancellation and superposition coding, MIMO wireless communication, network coding, and cooperative relaying. Also covered are feedback and interactive communication, capacity approximations and scaling laws, and asynchronous and random access channels. This book is ideal for use in the classroom, for self-study, and as a reference for researchers and engineers in industry and academia.

    Abbas El Gamal is the Hitachi America Chaired Professor in the School of Engineering and the Director of the Information Systems Laboratory in the Department of Electrical Engineering at Stanford University. In the field of network information theory, he is best known for his seminal contributions to the relay, broadcast, and interference channels; multiple description coding; coding for noisy networks; and energy-efficient packet scheduling and throughput-delay tradeoffs in wireless networks. He is a Fellow of IEEE and the winner of the 2012 Claude E. Shannon Award, the highest honor in the field of information theory.

    Young-Han Kim is an Assistant Professor in the Department of Electrical and Computer Engineering at the University of California, San Diego.
  • President's Column
    IEEE Information Theory Society Newsletter, Vol. 53, No. 3, September 2003. Editor: Lance C. Pérez. ISSN 1059-2362.

    President's Column. Han Vinck. This message is written after returning from an exciting Information Theory Symposium in Yokohama, Japan. Our Japanese colleagues prepared an excellent technical program in the beautiful and modern setting of Yokohama city. The largest number of contributions was in the field of LDPC coding, followed by cryptography. It is interesting to observe that quantum information theory gets more and more attention. The Japanese hospitality contributed to the general success of this year's event. In spite of all the problems with SARS, the Symposium attracted approximately 650 participants. At the symposium we announced Bob McEliece to be the winner of the 2004 Shannon Award. The Claude E. ... (Photo: IT Society President Han Vinck playing traditional Japanese drums.)

    [...]tion of the electronic library, many universities and companies make this product available to their students and staff members. For these people there is no direct need to be an IEEE member. IEEE membership reduced in general by about 10% and the Society must take actions to make the value of membership visible to you and potential new members. This is an important task for the Board and the Membership Development committee. The Educational Committee, chaired by Ivan Fair, will contribute to the value of membership by introducing new activities. The first action currently being undertaken by this committee is the creation of a web site whose purpose is to make readily ...