
Streaming Source Coding with Delay
Cheng Chang
Electrical Engineering and Computer Sciences
University of California at Berkeley
Technical Report No. UCB/EECS-2007-164
http://www.eecs.berkeley.edu/Pubs/TechRpts/2007/EECS-2007-164.html
December 19, 2007

Copyright © 2007, by the author(s). All rights reserved.
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission.

Streaming Source Coding with Delay
by Cheng Chang
M.S. (University of California, Berkeley) 2004
B.E. (Tsinghua University, Beijing) 2000

A dissertation submitted in partial satisfaction of the requirements for the degree of Doctor of Philosophy in Engineering - Electrical Engineering and Computer Sciences in the GRADUATE DIVISION of the UNIVERSITY OF CALIFORNIA, BERKELEY.

Committee in charge:
Professor Anant Sahai, Chair
Professor Kannan Ramchandran
Professor David Aldous

Fall 2007

The dissertation of Cheng Chang is approved.

Chair    Date
         Date
         Date

University of California, Berkeley
Fall 2007

Streaming Source Coding with Delay
Copyright © 2007 by Cheng Chang

Abstract

Streaming Source Coding with Delay
by Cheng Chang
Doctor of Philosophy in Engineering - Electrical Engineering and Computer Sciences
University of California, Berkeley
Professor Anant Sahai, Chair

Traditionally, information theory is studied in the block coding context: all the source symbols are known in advance by the encoder(s). Under this setup, the minimum number of channel uses per source symbol needed for reliable information transmission is understood for many cases. This is known as the capacity region for channel coding and the rate region for source coding. The block coding error exponent (the convergence rate of the error probability as a function of block length) has also been studied in the literature.

In this thesis, we consider the problem in which source symbols stream into the encoder in real time and the decoder has to make a decision within a finite delay on a symbol-by-symbol basis. Under a finite delay constraint, a fundamental question is how many channel uses per source symbol are needed to achieve a certain symbol error probability. This question, to our knowledge, has never been systematically studied. We answer the source coding side of the question by studying the asymptotic convergence rate of the symbol error probability as a function of delay: the delay constrained error exponent.

The technical contributions of this thesis include the following. We derive an upper bound on the delay constrained error exponent for lossless source coding and show the achievability of this bound using a fixed-to-variable-length coding scheme. We then extend the same treatment to lossy source coding with delay, where a tight bound on the error exponent is derived. Both delay constrained error exponents are connected to their block coding counterparts by a "focusing" operator. An achievability result for lossless multiple-terminal source coding is then derived in the delay constrained context. Finally, we borrow the genie-aided feed-forward decoder argument from the channel coding literature to derive an upper bound on the delay constrained error exponent for source coding with decoder side-information. This upper bound is strictly smaller than the error exponent for source coding with both encoder and decoder side-information. This "price of ignorance" phenomenon appears only in the streaming-with-delay setup, not in the traditional block coding setup. These delay constrained error exponents for streaming source coding are generally different from their block coding counterparts. This difference has also been recently observed in the channel coding with feedback literature.

Professor Anant Sahai, Dissertation Committee Chair
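As a reader's note on the central quantity in the abstract, the delay constrained error exponent can be sketched as follows. The notation here (E_s for the delay exponent, E_b for the block exponent, Δ for the delay, and an i.i.d. source with distribution p_x) anticipates Chapter 2 and is assumed only for illustration; the precise definitions and the exact statement of the "focusing" relation are given there. Writing \hat{x}_i(i+\Delta) for the decoder's estimate of symbol x_i at time i+\Delta, the exponent measures the exponential decay of the symbol error probability in the delay, and the focusing operator relates it to the block coding exponent through a relation of roughly the following form:

\[
E_s(R) \;=\; \lim_{\Delta \to \infty} \, -\frac{1}{\Delta} \log \Pr\!\big[\hat{x}_i(i+\Delta) \neq x_i\big],
\qquad
E_s(R) \;=\; \inf_{\alpha > 0} \, \frac{1}{\alpha}\, E_b\big((1+\alpha)R\big),
\]

where E_b(R) = \min_{q : H(q) \geq R} D(q \,\|\, p_x) is the usual block lossless source coding exponent. The optimization over α reflects the dominant error event: a stretch of atypically high-entropy source symbols ending at the symbol being decoded, whose length relative to the allowed delay is the free parameter.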
Contents

List of Figures
List of Tables
Acknowledgements

1 Introduction
1.1 Sources, channels, feedback and side-information
1.2 Information, delay and random arrivals
1.3 Error exponents, focusing operator and bandwidth
1.4 Overview of the thesis
1.4.1 Lossless source coding with delay
1.4.2 Lossy source coding with delay
1.4.3 Distributed lossless source coding with delay
1.4.4 Lossless source coding with decoder side-information with delay

2 Lossless Source Coding
2.1 Problem Setup and Main Results
2.1.1 Source Coding with Delay Constraints
2.1.2 Main result of Chapter 2: Lossless Source Coding Error Exponent with Delay
2.2 Numerical Results
2.2.1 Comparison of error exponents
2.2.2 Non-asymptotic results: prefix-free coding of length 2 and queueing delay
2.3 First attempt: some suboptimal coding schemes
2.3.1 Block Source Coding's Delay Performance
2.3.2 Error-free Coding with Queueing Delay
2.3.3 Sequential Random Binning
2.4 Proof of the Main Results
2.4.1 Achievability
2.4.2 Converse
2.5 Discussions
2.5.1 Properties of the delay constrained error exponent E_s(R)
2.5.2 Conclusions and future work

3 Lossy Source Coding
3.1 Lossy source coding
3.1.1 Lossy source coding with delay
3.1.2 Delay constrained lossy source coding error exponent
3.2 A brief detour to peak distortion
3.2.1 Peak distortion
3.2.2 Rate distortion and error exponent for the peak distortion measure
3.3 Numerical Results
3.4 Proof of Theorem 2
3.4.1 Converse
3.4.2 Achievability
3.5 Discussions: why peak distortion measure?

4 Distributed Lossless Source Coding
4.1 Distributed source coding with delay
4.1.1 Lossless Distributed Source Coding with Delay Constraints
4.1.2 Main result of Chapter 4: Achievable error exponents
4.2 Numerical Results
4.2.1 Example 1: symmetric source with uniform marginals
4.2.2 Example 2: non-symmetric source
4.3 Proofs
4.3.1 ML decoding
4.3.2 Universal Decoding
4.4 Discussions

5 Lossless Source Coding with Decoder Side-Information
5.1 Delay constrained source coding with decoder side-information
5.1.1 Delay Constrained Source Coding with Side-Information
5.1.2 Main results of Chapter 5: lower and upper bounds on the error exponents
5.2 Two special cases
5.2.1 Independent side-information
5.2.2 Delay constrained encryption of compressed data
5.3 Delay constrained Source Coding with Encoder Side-Information
5.3.1 Delay constrained error exponent
5.3.2 Price of ignorance
5.4 Numerical results
5.4.1 Special case 1: independent side-information
5.4.2 Special case 2: compression of encrypted data
5.4.3 General cases
5.5 Proofs
5.5.1 Proof of Theorem 6: random binning
5.5.2 Proof of Theorem 7: Feed-forward decoding
5.6 Discussions
6 Future Work
6.1 Past, present and future
6.1.1 Dominant error event
6.1.2 How to deal with both past and future?
6.2 Source model and common randomness
6.3 Small but interesting problems

Bibliography

A Review of fixed-length block source coding
A.1 Lossless Source Coding and Error Exponent
A.2 Lossy source coding
A.2.1 Rate distortion function and error exponent for block coding under average distortion
A.3 Block distributed source coding and error exponents
A.4 Review of block source coding with side-information

B Tilted Entropy Coding and its delay constrained performance
B.1 Tilted entropy coding
B.2 Error Events
B.3 Achievability of delay constrained error exponent E_s(R)

C Proof of the concavity of the rate distortion function under the peak distortion measure

D Derivation of the upper bound on the delay constrained lossy source coding error exponent

E Bounding source atypicality under a distortion measure

F Bounding individual error events for distributed source coding
F.1 ML decoding: Proof of Lemma 9
F.2 Universal decoding: Proof of Lemma 11

G Equivalence of ML and universal error exponents and tilted distributions
G.1 Case 1: γH(p_{x|y}) + (1 − γ)H(p^1_{xy}) < R < γH(p̄^{(γ)}_{x|y}) + (1 − γ)H(p^1_{xy})
G.2 Case 2: R ≥ γH(p̄^{(γ)}_{x|y}) + (1 − γ)H(p^1_{xy})
G.3 Technical Lemmas on tilted distributions

H Bounding individual error events for source coding with side-information
H.1 ML decoding: Proof of Lemma 13
H.2 Universal decoding: Proof of Lemma 14

I Proof of Corollary 1
I.1 Proof of the lower bound
I.2 Proof of the upper bound

J Proof of Theorem 8
J.1 Achievability of E_ei(R)
J.2 Converse

List of Figures

1.1 Information theory in 50 years
1.2 Shannon's original problem
1.3 Separation, feedback and side-information
1.4 Noiseless channel
1.5 Delay constrained source coding for streaming data
1.6 Random arrivals of source symbols
1.7 Block length vs decoding error
1.8 Delay vs decoding error