Information Theory and Network Coding

Information Technology: Transmission, Processing, and Storage

Series Editors:
Robert Gallager, Massachusetts Institute of Technology, Cambridge, Massachusetts

Jack Keil Wolf, University of California at San Diego, La Jolla, California

Information Theory and Network Coding
Raymond W. Yeung

Networks and Grids: Technology and Theory
Thomas G. Robertazzi

CDMA Radio with Repeaters
Joseph Shapira and Shmuel Miller

Digital Satellite Communications
Giovanni Corazza, ed.

Immersive Audio Signal Processing
Sunil Bharitkar and Chris Kyriakakis

Digital Signal Processing for Measurement Systems: Theory and Applications
Gabriele D’Antona and Alessandro Ferrero

Coding for Wireless Channels
Ezio Biglieri

Wireless Networks: Multiuser Detection in Cross-Layer Design
Christina Comaniciu, Narayan B. Mandayam and H. Vincent Poor

The Multimedia Internet
Stephen Weinstein

MIMO Signals and Systems
Horst J. Bessai

Multi-Carrier Digital Communications: Theory and Applications of OFDM, 2nd Ed
Ahmad R.S. Bahai, Burton R. Saltzberg and Mustafa Ergen

Performance Analysis and Modeling of Digital Transmission Systems
William Turin

Wireless Communications Systems and Networks
Mohsen Guizani

Interference Avoidance Methods for Wireless Systems
Dimitrie C. Popescu and Christopher Rose

Stochastic Image Processing
Chee Sun Won and Robert M. Gray

Coded Modulation Systems
John B. Anderson and Arne Svensson

Communication System Design Using DSP Algorithms: With Laboratory Experiments for the TMS320C6701 and TMS320C6711
Steven A. Tretter

(continued after index)

Raymond W. Yeung

Information Theory and Network Coding

Raymond W. Yeung
The Chinese University of Hong Kong
Department of Information Engineering
Hong Kong
People’s Republic of China
[email protected]

Series Editors

Robert Gallager
Massachusetts Institute of Technology
Department of Electrical Engineering and Computer Science
Cambridge, MA
USA

Jack Keil Wolf
University of California, San Diego
Department of Electrical and Computer Engineering
La Jolla, CA
USA

ISBN: 978-0-387-79233-0 e-ISBN: 978-0-387-79234-7

Library of Congress Control Number: 2008924472

© 2008 Springer Science+Business Media, LLC

All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed on acid-free paper

To my parents and my family

Preface

This book is an evolution of my book A First Course in Information Theory, published in 2002 when network coding was still in its infancy. The last few years have witnessed the rapid development of network coding into a research field of its own in information science. With its roots in information theory, network coding has not only brought about a paradigm shift in network communications at large, but has also had significant influence on such specific research fields as networking, switching, wireless communications, distributed data storage, cryptography, and optimization theory. While new applications of network coding keep emerging, the fundamental results that lay the foundation of the subject are more or less mature. One of the main goals of this book therefore is to present these results in a unifying and coherent manner.

While the previous book focused only on information theory for discrete random variables, the current book contains two new chapters on information theory for continuous random variables, namely the chapter on differential entropy and the chapter on continuous-valued channels. With these topics included, the book becomes more comprehensive and is more suitable to be used as a textbook for a course in an electrical engineering department.

What Is in This Book

Out of the 21 chapters in this book, the first 16 chapters belong to Part I, Components of Information Theory, and the last 5 chapters belong to Part II, Fundamentals of Network Coding. Part I covers the basic topics in information theory and prepares the reader for the discussions in Part II. A brief rundown of the chapters will give a better idea of what is in this book.

Chapter 1 contains a high-level introduction to the contents of this book. First, there is a discussion of the nature of information theory and the main results in Shannon's original paper in 1948, which founded the field. There are also pointers to Shannon's biographies and his works.

Chapter 2 introduces Shannon's information measures for discrete random variables and their basic properties. Useful identities and inequalities in information theory are derived and explained. Extra care is taken in handling joint distributions with zero probability masses. There is a section devoted to the discussion of maximum entropy distributions. The chapter ends with a section on the entropy rate of a stationary information source.

Chapter 3 is an introduction to the theory of the I-Measure, which establishes a one-to-one correspondence between Shannon's information measures and set theory. A number of examples are given to show how the use of information diagrams can simplify the proofs of many results in information theory. Such diagrams are becoming standard tools for solving information theory problems.

Chapter 4 is a discussion of zero-error data compression by uniquely decodable codes, with prefix codes as a special case. A proof of the entropy bound for prefix codes which involves neither the Kraft inequality nor the fundamental inequality is given. This proof facilitates the discussion of the redundancy of prefix codes.

Chapter 5 is a thorough treatment of weak typicality. The weak asymptotic equipartition property and the source coding theorem are discussed. An explanation of the fact that a good data compression scheme produces almost i.i.d. bits is given. There is also an introductory discussion of the Shannon–McMillan–Breiman theorem. The concept of weak typicality will be further developed in Chapter 10 for continuous random variables.
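For orientation, the two most basic quantities introduced in Chapter 2 are the entropy of a discrete random variable X and the mutual information between two random variables X and Y. In the standard notation (which may differ in minor details from the text), with logarithms taken to base 2,

\[
H(X) = -\sum_{x} p(x)\log p(x), \qquad
I(X;Y) = \sum_{x,y} p(x,y)\log\frac{p(x,y)}{p(x)\,p(y)} .
\]

Roughly speaking, the weak AEP of Chapter 5 says that for an i.i.d. source, a typical sequence of length n has probability close to 2^{-nH(X)}, which is why such a source can be compressed to about H(X) bits per symbol.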
Chapter 6 contains a detailed discussion of strong typicality, which applies to random variables with finite alphabets. The results developed in this chapter will be used for proving the channel coding theorem and the rate-distortion theorem in the next two chapters.

The discussion in Chapter 7 of the discrete memoryless channel is an enhancement of the discussion in the previous book. In particular, the new definition of the discrete memoryless channel enables rigorous formulation and analysis of coding schemes for such channels with or without feedback. The proof of the channel coding theorem uses a graphical model approach that helps explain the conditional independence of the random variables.

Chapter 8 is an introduction to rate-distortion theory. The version of the rate-distortion theorem here, proved by using strong typicality, is a stronger version of the original theorem obtained by Shannon.

In Chapter 9, the Blahut–Arimoto algorithms for computing the channel capacity and the rate-distortion function are discussed, and a simplified proof of convergence is given. Great care is taken in handling distributions with zero probability masses.

Chapters 10 and 11 are devoted to the discussion of information theory for continuous random variables. Chapter 10 introduces differential entropy and related information measures, and their basic properties are discussed. The asymptotic equipartition property for continuous random variables is proved. The last section, on maximum differential entropy distributions, echoes the section in Chapter 2 on maximum entropy distributions.
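To fix ideas, the central quantities of Chapters 7, 8, and 10 can be stated compactly (again in standard notation, as a rough guide rather than the precise formulations in the text). For a discrete memoryless channel p(y|x), a distortion measure d, and a continuous random variable X with density f,

\[
C = \max_{p(x)} I(X;Y), \qquad
R(D) = \min_{p(\hat{x}\mid x):\ \mathrm{E}[d(X,\hat{X})] \le D} I(X;\hat{X}), \qquad
h(X) = -\int f(x)\log f(x)\,dx
\]

are, respectively, the channel capacity, the rate-distortion function, and the differential entropy.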

Chapter 11 discusses a variety of continuous-valued channels, with the continuous memoryless channel being the basic building block. In proving the capacity of the memoryless Gaussian channel, a careful justification is given for the existence of the differential entropy of the output random variable. Based on this result, the capacity of a system of parallel/correlated Gaussian channels is obtained. Heuristic arguments leading to the formula for the capacity of the bandlimited white/colored Gaussian channel are given. The chapter ends with a proof of the fact that zero-mean Gaussian noise is the worst additive noise.

Chapter 12 explores the structure of the I-Measure for Markov structures. Set-theoretic characterizations of full conditional independence and of Markov random fields are discussed. The treatment of Markov random fields here may be too specialized for the average reader, but the structure of the I-Measure and the simplicity of the information diagram for a Markov chain are best explained as a special case of a Markov random field.

Information inequalities are sometimes called the laws of information theory because they govern the impossibilities in information theory. In Chapter 13, the geometrical meaning of information inequalities and the relation between information inequalities and conditional independence are explained in depth. The framework for information inequalities discussed here is the basis of the next two chapters.

Chapter 14 explains how the problem of proving information inequalities can be formulated as a linear programming problem. This leads to a complete characterization of all information inequalities provable by conventional techniques. These inequalities, called Shannon-type inequalities, can be proved by the software package ITIP, which is available on the World Wide Web. It is also shown how Shannon-type inequalities can be used to tackle the implication problem of conditional independence in probability theory.

Shannon-type inequalities were all the information inequalities known during the first half century of information theory. In the late 1990s, a few new inequalities, called non-Shannon-type inequalities, were discovered. These inequalities imply the existence of laws in information theory beyond those laid down by Shannon. In Chapter 15, we discuss these inequalities and their applications.

Chapter 16 explains an intriguing relation between information theory and group theory. Specifically, for every information inequality satisfied by any joint probability distribution, there is a corresponding group inequality satisfied by any finite group and its subgroups, and vice versa. Inequalities of the latter type govern the orders of any finite group and its subgroups. Group-theoretic proofs of Shannon-type information inequalities are given. At the end of the chapter, a group inequality is obtained from a non-Shannon-type inequality discussed in Chapter 15. The meaning and the implications of this inequality are yet to be understood.

Chapter 17 starts Part II of the book with a discussion of the butterfly network, the primary example in network coding. Variations of the butterfly network are analyzed in detail. The advantage of network coding over store-and-forward in wireless and satellite communications is explained through a simple example. We also explain why network coding with multiple information sources is substantially different from network coding with a single information source.
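Two landmarks from the chapters just surveyed may serve as points of reference (standard statements, quoted here only for orientation and not in the exact notation of the text; logarithms to base 2). Chapter 11 culminates in the capacity of the memoryless Gaussian channel with signal power P and noise power N, and of the bandlimited white Gaussian channel with bandwidth W and noise power spectral density N_0/2:

\[
C = \frac{1}{2}\log\!\left(1+\frac{P}{N}\right) \text{ bits per channel use},
\qquad
C = W\log\!\left(1+\frac{P}{N_0 W}\right) \text{ bits per second}.
\]

As a taste of the correspondence in Chapter 16, the basic inequality I(X_1;X_2) \ge 0 translates into the group inequality |G_1||G_2| \le |G|\,|G_1 \cap G_2|, valid for any finite group G and subgroups G_1 and G_2.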
In Chapter 18, the fundamental bound for single-source network coding, called the max-flow bound, is explained in detail. The bound is established for a general class of network codes.

In Chapter 19, we discuss various classes of linear network codes on acyclic networks that achieve the max-flow bound to different extents. Static network codes, a special class of linear network codes that achieve the max-flow bound in the presence of channel failure, are also discussed. Polynomial-time algorithms for constructing these codes are presented.

In Chapter 20, we formulate and analyze convolutional network codes on cyclic networks. The existence of such codes that achieve the max-flow bound is proved.

Network coding theory is further developed in Chapter 21. The scenario in which more than one information source is multicast in a point-to-point acyclic network is discussed. An implicit characterization of the achievable information rate region, which involves the framework for information inequalities developed in Part I, is proved.
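In symbols, the max-flow bound of Chapter 18 says, roughly (and in notation that may differ from the text), that if an information source of rate ω generated at a source node s is multicast to a set T of sink nodes, then

\[
\omega \le \min_{t \in T} \operatorname{maxflow}(s,t),
\]

where maxflow(s,t) denotes the value of a maximum flow from s to t in the network. Chapters 19 and 20 show that this bound can be achieved by linear network codes on acyclic networks and by convolutional network codes on cyclic networks, respectively.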

How to Use This Book

Part I of this book by itself may be regarded as a comprehensive textbook in information theory. The main reason why the book is in the present form is that, in my opinion, the discussion of network coding in Part II is incomplete without Part I. Nevertheless, except for Chapter 21 on multi-source network coding, Part II by itself may be used satisfactorily as a textbook on single-source network coding.

An elementary course on probability theory and an elementary course on linear algebra are prerequisites to Part I and Part II, respectively. For Chapter 11, some background knowledge on digital communication systems would be helpful, and for Chapter 20, some prior exposure to discrete-time linear systems is necessary.

The reader is recommended to read the chapters according to the above chart. However, one will not have too much difficulty jumping around in the book because there should be sufficient references to the previous relevant sections.

This book inherits the writing style of the previous book, namely that all the derivations are from first principles. The book contains a large number of examples, where important points are very often made. To facilitate the use of the book, there is a summary at the end of each chapter.

This book can be used as a textbook or a reference book. As a textbook, it is ideal for a two-semester course, with the first and second semesters covering selected topics from Part I and Part II, respectively. A comprehensive instructor's manual is available upon request. Please contact the author at [email protected] for information and access.

Just like any other lengthy document, this book surely contains errors and omissions. To alleviate the problem, an errata list will be maintained at the book homepage http://www.ie.cuhk.edu.hk/IT book2/.

Hong Kong, China
December 2007
Raymond W. Yeung

Acknowledgments

The current book, an expansion of my previous book A First Course in Information Theory, was written within the year 2007. Thanks to the generous support of the Friedrich Wilhelm Bessel Research Award from the Alexander von Humboldt Foundation of Germany, I had the luxury of working on the project full-time from January to April, when I visited Munich University of Technology. I would like to thank Joachim Hagenauer and Ralf Koetter for nominating me for the award and for hosting my visit. I would also like to thank the Department of Information Engineering, The Chinese University of Hong Kong, for making this arrangement possible.

There are many individuals who have directly or indirectly contributed to this book. First, I am indebted to who taught me information theory and writing. I am most thankful to Zhen Zhang, Ning Cai, and Bob Li for their friendship and inspiration. Without the results obtained through our collaboration, the book could not possibly be in its current form. I would also like to thank Venkat Anantharam, Vijay Bhargava, Dick Blahut, Agnes and Vincent Chan, Tom Cover, Imre Csiszár, Tony Ephremides, Bob Gallager, Bruce Hajek, , Jim Massey, Prakash Narayan, , , Sergio Verdú, Victor Wei, Frans Willems, and for their support and encouragement throughout the years. I would also like to thank all the collaborators of my work for their contribution and all the anonymous reviewers for their useful comments.

I would like to thank a number of individuals who helped in the project. I benefited tremendously from the discussions with who gave a lot of suggestions for writing the chapters on differential entropy and continuous-valued channels. Terence Chan, Ka Wo Cheung, Bruce Hajek, Siu-Wai Ho, Siu Ting Ho, Tat Ming Lok, Prakash Narayan, Will Ng, Sagar Shenvi, Xiang-Gen Xia, Shaohua Yang, Ken Zeger, and Zhixue Zhang gave many valuable comments at different stages of the writing. My graduate students Silas Fong, Min Tan, and Shenghao Yang proofread the chapters on network coding in great detail. Silas Fong also helped compose the figures throughout the book.

On the domestic side, I am most grateful to my wife Rebecca for her love. During our stay in Munich, she took good care of the whole family so that I was able to concentrate on my writing. We are most thankful to our family friend Ms. Pui Yee Wong for taking care of Rebecca when she was ill during the final stage of the project, and to my sister Georgiana for her moral support. In this regard, we are indebted to Dr. Yu Lap Yip for his timely diagnosis. I would also like to thank my sister-in-law Ophelia Tsang, who comes over during the weekends to help take care of our daughter Shannon, who continues to be the sweetheart of the family and was most supportive during the time her mom was ill.

Contents

1 The Science of Information ...... 1

Part I Components of Information Theory

2 Information Measures ...... 7
2.1 Independence and Markov Chains ...... 7
2.2 Shannon’s Information Measures ...... 12
2.3 Continuity of Shannon’s Information Measures for Fixed Finite Alphabets ...... 18
2.4 Chain Rules ...... 21
2.5 Informational Divergence ...... 23
2.6 The Basic Inequalities ...... 26
2.7 Some Useful Information Inequalities ...... 28
2.8 Fano’s Inequality ...... 32
2.9 Maximum Entropy Distributions ...... 36
2.10 Entropy Rate of a Stationary Source ...... 38
Appendix 2.A: Approximation of Random Variables with Countably Infinite Alphabets by Truncation ...... 41
Chapter Summary ...... 43
Problems ...... 45
Historical Notes ...... 50

3 The I-Measure ...... 51
3.1 Preliminaries ...... 52
3.2 The I-Measure for Two Random Variables ...... 53
3.3 Construction of the I-Measure μ* ...... 55
3.4 μ* Can Be Negative ...... 59
3.5 Information Diagrams ...... 61
3.6 Examples of Applications ...... 67
Appendix 3.A: A Variation of the Inclusion–Exclusion Formula ...... 74
Chapter Summary ...... 76
Problems ...... 78
Historical Notes ...... 80

4 Zero-Error Data Compression ...... 81
4.1 The Entropy Bound ...... 81
4.2 Prefix Codes ...... 86
4.2.1 Definition and Existence ...... 86
4.2.2 Huffman Codes ...... 88
4.3 Redundancy of Prefix Codes ...... 93
Chapter Summary ...... 97
Problems ...... 98
Historical Notes ...... 99

5 Weak Typicality ...... 101
5.1 The Weak AEP ...... 101
5.2 The Source Coding Theorem ...... 104
5.3 Efficient Source Coding ...... 106
5.4 The Shannon–McMillan–Breiman Theorem ...... 107
Chapter Summary ...... 109
Problems ...... 110
Historical Notes ...... 112

6 Strong Typicality ...... 113
6.1 Strong AEP ...... 113
6.2 Strong Typicality Versus Weak Typicality ...... 121
6.3 Joint Typicality ...... 122
6.4 An Interpretation of the Basic Inequalities ...... 131
Chapter Summary ...... 131
Problems ...... 132
Historical Notes ...... 135

7 Discrete Memoryless Channels ...... 137
7.1 Definition and Capacity ...... 141
7.2 The Channel Coding Theorem ...... 149
7.3 The Converse ...... 152
7.4 Achievability ...... 157
7.5 A Discussion ...... 164
7.6 Feedback Capacity ...... 167
7.7 Separation of Source and Channel Coding ...... 172
Chapter Summary ...... 175
Problems ...... 176
Historical Notes ...... 181

8 Rate-Distortion Theory ...... 183
8.1 Single-Letter Distortion Measures ...... 184
8.2 The Rate-Distortion Function R(D) ...... 187
8.3 The Rate-Distortion Theorem ...... 192
8.4 The Converse ...... 200
8.5 Achievability of R_I(D) ...... 202
Chapter Summary ...... 207
Problems ...... 208
Historical Notes ...... 209

9 The Blahut–Arimoto Algorithms ...... 211
9.1 Alternating Optimization ...... 212
9.2 The Algorithms ...... 214
9.2.1 Channel Capacity ...... 214
9.2.2 The Rate-Distortion Function ...... 219
9.3 Convergence ...... 222
9.3.1 A Sufficient Condition ...... 222
9.3.2 Convergence to the Channel Capacity ...... 225
Chapter Summary ...... 226
Problems ...... 227
Historical Notes ...... 228

10 Differential Entropy ...... 229
10.1 Preliminaries ...... 231
10.2 Definition ...... 235
10.3 Joint Differential Entropy, Conditional (Differential) Entropy, and Mutual Information ...... 238
10.4 The AEP for Continuous Random Variables ...... 245
10.5 Informational Divergence ...... 248
10.6 Maximum Differential Entropy Distributions ...... 249
Chapter Summary ...... 252
Problems ...... 255
Historical Notes ...... 256

11 Continuous-Valued Channels ...... 257
11.1 Discrete-Time Channels ...... 257
11.2 The Channel Coding Theorem ...... 260
11.3 Proof of the Channel Coding Theorem ...... 262
11.3.1 The Converse ...... 262
11.3.2 Achievability ...... 265
11.4 Memoryless Gaussian Channels ...... 270
11.5 Parallel Gaussian Channels ...... 272
11.6 Correlated Gaussian Channels ...... 278
11.7 The Bandlimited White Gaussian Channel ...... 280
11.8 The Bandlimited Colored Gaussian Channel ...... 287
11.9 Zero-Mean Gaussian Noise Is the Worst Additive Noise ...... 290
Chapter Summary ...... 294
Problems ...... 296
Historical Notes ...... 297

12 Markov Structures ...... 299
12.1 Conditional Mutual Independence ...... 300
12.2 Full Conditional Mutual Independence ...... 309
12.3 Markov Random Field ...... 314
12.4 Markov Chain ...... 317
Chapter Summary ...... 319
Problems ...... 320
Historical Notes ...... 321

13 Information Inequalities ...... 323
13.1 The Region Γ*_n ...... 325
13.2 Information Expressions in Canonical Form ...... 326
13.3 A Geometrical Framework ...... 329
13.3.1 Unconstrained Inequalities ...... 329
13.3.2 Constrained Inequalities ...... 330
13.3.3 Constrained Identities ...... 332
13.4 Equivalence of Constrained Inequalities ...... 333
13.5 The Implication Problem of Conditional Independence ...... 336
Chapter Summary ...... 337
Problems ...... 338
Historical Notes ...... 338

14 Shannon-Type Inequalities ...... 339
14.1 The Elemental Inequalities ...... 339
14.2 A Linear Programming Approach ...... 341
14.2.1 Unconstrained Inequalities ...... 343
14.2.2 Constrained Inequalities and Identities ...... 344
14.3 A Duality ...... 345
14.4 Machine Proving – ITIP ...... 347
14.5 Tackling the Implication Problem ...... 351
14.6 Minimality of the Elemental Inequalities ...... 353
Appendix 14.A: The Basic Inequalities and the Polymatroidal Axioms ...... 356
Chapter Summary ...... 357
Problems ...... 358
Historical Notes ...... 360

15 Beyond Shannon-Type Inequalities ...... 361
15.1 Characterizations of Γ*_2, Γ*_3, and Γ*_n ...... 361
15.2 A Non-Shannon-Type Unconstrained Inequality ...... 369
15.3 A Non-Shannon-Type Constrained Inequality ...... 374
15.4 Applications ...... 380
Chapter Summary ...... 383
Problems ...... 383
Historical Notes ...... 385

16 Entropy and Groups ...... 387
16.1 Group Preliminaries ...... 388
16.2 Group-Characterizable Entropy Functions ...... 393
16.3 A Group Characterization of Γ*_n ...... 398
16.4 Information Inequalities and Group Inequalities ...... 401
Chapter Summary ...... 405
Problems ...... 406
Historical Notes ...... 408

Part II Fundamentals of Network Coding

17 Introduction ...... 411
17.1 The Butterfly Network ...... 412
17.2 Wireless and Satellite Communications ...... 415
17.3 Source Separation ...... 417
Chapter Summary ...... 418
Problems ...... 418
Historical Notes ...... 419

18 The Max-Flow Bound ...... 421
18.1 Point-to-Point Communication Networks ...... 421
18.2 Examples Achieving the Max-Flow Bound ...... 424
18.3 A Class of Network Codes ...... 427
18.4 Proof of the Max-Flow Bound ...... 429
Chapter Summary ...... 431
Problems ...... 431
Historical Notes ...... 434

19 Single-Source Linear Network Coding: Acyclic Networks ...... 435
19.1 Acyclic Networks ...... 436
19.2 Linear Network Codes ...... 437
19.3 Desirable Properties of a Linear Network Code ...... 443
19.3.1 Transformation of a Linear Network Code ...... 447
19.3.2 Implementation of a Linear Network Code ...... 448
19.4 Existence and Construction ...... 449
19.5 Generic Network Codes ...... 460
19.6 Static Network Codes ...... 468
19.7 Random Network Coding: A Case Study ...... 473
19.7.1 How the System Works ...... 474
19.7.2 Model and Analysis ...... 475
Chapter Summary ...... 478
Problems ...... 479
Historical Notes ...... 482

20 Single-Source Linear Network Coding: Cyclic Networks ...... 485
20.1 Delay-Free Cyclic Networks ...... 485
20.2 Convolutional Network Codes ...... 488
20.3 Decoding of Convolutional Network Codes ...... 498
Chapter Summary ...... 503
Problems ...... 503
Historical Notes ...... 504

21 Multi-source Network Coding ...... 505
21.1 The Max-Flow Bounds ...... 505
21.2 Examples of Application ...... 508
21.2.1 Multilevel Diversity Coding ...... 508
21.2.2 Satellite Communication Network ...... 510
21.3 A Network Code for Acyclic Networks ...... 511
21.4 The Achievable Information Rate Region ...... 512
21.5 Explicit Inner and Outer Bounds ...... 515
21.6 The Converse ...... 516
21.7 Achievability ...... 521
21.7.1 Random Code Construction ...... 524
21.7.2 Performance Analysis ...... 527
Chapter Summary ...... 536
Problems ...... 537
Historical Notes ...... 539

Bibliography ...... 541

Index ...... 561