An Introduction to Low-Density Parity-Check Codes
Paul H. Siegel
Electrical and Computer Engineering
University of California, San Diego
5/31/07
© Copyright 2007 by Paul H. Siegel

Outline
• Shannon's Channel Coding Theorem
• Error-Correcting Codes – State-of-the-Art
• LDPC Code Basics
  • Encoding
  • Decoding
• LDPC Code Design
  • Asymptotic performance analysis
  • Design optimization
• EXIT Chart Analysis
• Applications
  • Binary Erasure Channel
  • Binary Symmetric Channel
  • AWGN Channel
  • Rayleigh Fading Channel
  • Partial-Response Channel
• Basic References

A Noisy Communication System
[Shannon's block diagram: an information source emits a message; the transmitter maps it to a signal; the channel corrupts the signal with noise from a noise source; the receiver recovers a received message for the destination.]

Channels
• Binary erasure channel BEC(ε): a transmitted bit is received correctly with probability 1 − ε and is erased (received as "?") with probability ε.
• Binary symmetric channel BSC(p): a transmitted bit is received correctly with probability 1 − p and is flipped with probability p.

More Channels
• Additive white Gaussian noise channel AWGN: the conditional output densities f(y | −1) and f(y | 1) are Gaussians centered at the two signal points −√P and +√P.

Shannon Capacity
Every communication channel is characterized by a single number C, called the channel capacity. It is possible to transmit information over this channel reliably (with probability of error → 0) if and only if

    R  :=  (# information bits) / (channel use)  <  C
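The erasure and bit-flip channels just described are straightforward to simulate. A minimal sketch (the function names `bsc` and `bec`, and the use of Python's `None` as the erasure symbol "?", are my own choices):

```python
import random

def bsc(bits, p, rng=None):
    """Binary symmetric channel: flip each input bit with probability p."""
    rng = rng or random.Random(0)
    return [b ^ (rng.random() < p) for b in bits]

def bec(bits, eps, rng=None):
    """Binary erasure channel: with probability eps replace each input
    bit by the erasure symbol (modeled here as None)."""
    rng = rng or random.Random(0)
    return [None if rng.random() < eps else b for b in bits]
```

For example, `bsc(word, 0.1)` flips roughly 10% of the bits, while `bec(word, 0.1)` erases roughly 10% of them; a BEC decoder knows *where* the unreliable positions are, which is one reason the BEC is the easier channel to analyze.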
Channels and Capacities
• Binary erasure channel BEC(ε): C = 1 − ε
  [plot: capacity falls linearly from 1 at ε = 0 to 0 at ε = 1]
• Binary symmetric channel BSC(p): C = 1 − H2(p), where
  H2(p) = −p log2 p − (1 − p) log2(1 − p)
  [plot: capacity is 1 at p = 0, falling to 0 at p = 1/2]

More Channels and Capacities
• AWGN channel: C = (1/2) log2(1 + P/σ²)
  [plot: conditional densities f(y | 0) and f(y | 1), Gaussians centered at −√P and +√P]

Coding
We use a code to communicate over the noisy channel:

    x = x1, x2, …, xk → Encoder → c = c1, c2, …, cn → Channel
    y = y1, y2, …, yn → Decoder → x̂ = x̂1, x̂2, …, x̂k → Sink

Code rate: R = k/n

Shannon's Coding Theorems
• (Converse) If C is a code with rate R > C, then the probability of error in decoding this code is bounded away from 0. (In other words, at any rate R > C, reliable communication is not possible.)
• (Achievability) For any information rate R < C and any δ > 0, there exists a code C of length n_δ and rate R such that the probability of error in maximum-likelihood decoding of this code is at most δ.
• In Shannon's notation, C = max (H(x) − H_y(x)): the maximum, over input distributions, of the source uncertainty minus the uncertainty remaining after the channel output is observed.
• Proof: non-constructive!
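The capacity formulas for the three channels are simple enough to evaluate numerically. A small sketch (function names are illustrative; `snr` denotes P/σ²):

```python
import math

def capacity_bec(eps):
    """BEC capacity: C = 1 - eps."""
    return 1.0 - eps

def h2(p):
    """Binary entropy function H2(p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1.0 - p) * math.log2(1.0 - p)

def capacity_bsc(p):
    """BSC capacity: C = 1 - H2(p)."""
    return 1.0 - h2(p)

def capacity_awgn(snr):
    """AWGN capacity: C = (1/2) log2(1 + P/sigma^2) bits per channel use."""
    return 0.5 * math.log2(1.0 + snr)
```

For example, `capacity_bsc(0.11)` is approximately 1/2, matching the familiar fact that a BSC with about 11% crossover probability supports rate-1/2 coding.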
Review of Shannon's Paper
• A pioneering paper: Shannon, C. E., "A mathematical theory of communication," Bell System Tech. J., vol. 27, pp. 379–423, 623–656, 1948.
• A regrettable review: Doob, J. L., Mathematical Reviews, MR0026286 (10,133e): "The discussion is suggestive throughout, rather than mathematical, and it is not always clear that the author's mathematical intentions are honorable."
• Cover, T., "Shannon's Contributions to Shannon Theory," AMS Notices, vol. 49, no. 1, p. 11, January 2002: "Doob has recanted this remark many times, saying that it and his naming of supermartingales (processes that go down instead of up) are his two big regrets."

Finding Good Codes
• Ingredients of Shannon's proof:
  • Random code
  • Large block length
  • Optimal decoding
• Problem: randomness + large block length + optimal decoding = COMPLEXITY!

State-of-the-Art
• Solution:
  • Long, structured, "pseudorandom" codes
  • Practical, near-optimal decoding algorithms
• Examples:
  • Turbo codes (1993)
  • Low-density parity-check (LDPC) codes (1960, 1999)
• State-of-the-art: turbo codes and LDPC codes have brought the Shannon limits within reach on a wide range of channels.

Evolution of Coding Technology
[Chart of LDPC codes among other coding technologies, from Trellis and Turbo Coding, Schlegel and Perez, IEEE Press, 2004.]

Linear Block Codes – Basics
• Parameters of a binary linear block code C:
  • k = number of information bits
  • n = number of code bits
  • R = k/n (code rate)
  • dmin = minimum distance
• There are many ways to describe C:
  • Codebook (list of codewords)
  • Parity-check matrix / generator matrix
  • Graphical representation ("Tanner graph")

Example: (7,4) Hamming Code
• (n, k) = (7, 4), R = 4/7
• dmin = 3
• Single-error correcting; double-erasure correcting
• Encoding rule (using the Venn diagram of three overlapping circles, with bit positions 1–4 in the intersections and positions 5, 6, 7 one per circle):
  1. Insert data bits in positions 1, 2, 3, 4.
  2. Insert "parity" bits in positions 5, 6, 7 to ensure an even number of 1's in each circle.
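The claim dmin = 3 can be verified by exhaustive search: for a linear code, the minimum distance equals the minimum Hamming weight over the 2^k − 1 nonzero codewords. A sketch using the code's systematic generator matrix (the helper names `encode` and `min_distance` are my own):

```python
from itertools import product

# Systematic generator matrix of the (7,4) Hamming code:
# data bits occupy positions 1-4, parity bits positions 5-7.
G = [
    [1, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 0, 1],
    [0, 0, 1, 0, 1, 1, 0],
    [0, 0, 0, 1, 0, 1, 1],
]

def encode(u, G):
    """Mod-2 product u*G: the codeword for message u."""
    return [sum(ui * gi for ui, gi in zip(u, col)) % 2 for col in zip(*G)]

def min_distance(G):
    """For a linear code, d_min = minimum weight of a nonzero codeword."""
    k = len(G)
    return min(sum(encode(u, G)) for u in product([0, 1], repeat=k) if any(u))
```

Running `min_distance(G)` returns 3, consistent with single-error correction (t = ⌊(dmin − 1)/2⌋ = 1).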
Example: (7,4) Hamming Code
• 2^k = 16 codewords
• Systematic encoder places input bits in positions 1, 2, 3, 4; parity bits are in positions 5, 6, 7.

    0000000   0100101   1000111   1100010
    0001011   0101110   1001100   1101001
    0010110   0110011   1010001   1110100
    0011101   0111000   1011010   1111111

Hamming Code – Parity Checks
Each parity check (circle) involves the following bit positions:

    position:  1 2 3 4 5 6 7
    check 1:   1 1 1 0 1 0 0
    check 2:   1 0 1 1 0 1 0
    check 3:   1 1 0 1 0 0 1

Hamming Code: Matrix Perspective
• Parity-check matrix H: with c = [c1, c2, c3, c4, c5, c6, c7],

        ⎡1 1 1 0 1 0 0⎤
    H = ⎢1 0 1 1 0 1 0⎥ ,    H cᵀ = 0
        ⎣1 1 0 1 0 0 1⎦

• Generator matrix G: with u = [u1, u2, u3, u4],

        ⎡1 0 0 0 1 1 1⎤
    G = ⎢0 1 0 0 1 0 1⎥ ,    u · G = c
        ⎢0 0 1 0 1 1 0⎥
        ⎣0 0 0 1 0 1 1⎦

Parity-Check Equations
• The parity-check matrix specifies a system of linear equations:

    c1 + c2 + c3 + c5 = 0
    c1 + c3 + c4 + c6 = 0
    c1 + c2 + c4 + c7 = 0

• The parity-check matrix is not unique: any set of vectors that spans the row space generated by H can serve as the rows of a parity-check matrix (including sets with more than 3 vectors).

Hamming Code: Tanner Graph
• Bipartite graph representing the parity-check equations: one variable node per code bit c1, …, c7 and one check node per equation; bit ci is connected to a check node exactly when ci appears in that check's equation.

Tanner Graph Terminology
• Variable nodes (bit nodes, left); check nodes (constraint nodes, right)
• The degree of a node is the number of edges connected to it.
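The node degrees can be read off directly from H: the degree of variable node ci is the weight of column i, and the degree of a check node is the weight of its row. A sketch (function names are my own) that also verifies codewords via the syndrome H·cᵀ:

```python
# Parity-check matrix of the (7,4) Hamming code.
H = [
    [1, 1, 1, 0, 1, 0, 0],
    [1, 0, 1, 1, 0, 1, 0],
    [1, 1, 0, 1, 0, 0, 1],
]

def tanner_edges(H):
    """Edges (check j, variable i) of the Tanner graph: one per 1 in H."""
    return [(j, i) for j, row in enumerate(H) for i, h in enumerate(row) if h]

def degrees(H):
    """Variable-node and check-node degrees (column and row weights of H)."""
    var_deg = [sum(row[i] for row in H) for i in range(len(H[0]))]
    chk_deg = [sum(row) for row in H]
    return var_deg, chk_deg

def syndrome(H, c):
    """H * c^T mod 2; the all-zero vector exactly when c is a codeword."""
    return [sum(h * ci for h, ci in zip(row, c)) % 2 for row in H]
```

Here `degrees(H)` gives variable degrees [3, 2, 2, 2, 1, 1, 1] and check degrees [4, 4, 4] — this H is not regular, unlike the LDPC constructions discussed next.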
Low-Density Parity-Check Codes
• Proposed by Gallager (1960)
• "Sparseness" of matrix and graph descriptions:
  • Number of 1's in H grows linearly with block length
  • Number of edges in the Tanner graph grows linearly with block length
• "Randomness" of construction in:
  • Placement of 1's in H
  • Connectivity of variable and check nodes
• Iterative, message-passing decoder:
  • Simple "local" decoding at the nodes
  • Iterative exchange of information (message passing)

Review of Gallager's Paper
• Another pioneering work: Gallager, R. G., Low-Density Parity-Check Codes, M.I.T. Press, Cambridge, Mass.: 1963.
• A more enlightened review: Horstein, M., IEEE Trans. Inform. Theory, vol. 10, no. 2, p. 172, April 1964: "This book is an extremely lucid and circumspect exposition of an important piece of research. A comparison with other coding and decoding procedures designed for high-reliability transmission ... is difficult... Furthermore, many hours of computer simulation are needed to evaluate a probabilistic decoding scheme... It appears, however, that LDPC codes have a sufficient number of desirable features to make them highly competitive with ... other schemes ...."

Gallager's LDPC Codes
• Now called "regular" LDPC codes
• Parameters (n, j, k):
  • n = codeword length
  • j = number of parity-check equations involving each code bit = degree of each variable node
  • k = number of code bits involved in each parity-check equation = degree of each check node
• Locations of 1's can be chosen randomly, subject to the (j, k) degree constraints.
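One simple way to realize the random placement of 1's subject to the (j, k) constraints is "socket matching": give each of the n variable nodes j sockets and each of the m = nj/k check nodes k sockets, and pair them with a random permutation. The sketch below is a simplified illustration, not Gallager's exact construction; it rejects and reshuffles whenever the pairing would repeat an edge:

```python
import random

def regular_ldpc_H(n, j, k, seed=0):
    """Random (n, j, k)-regular parity-check matrix: every column has
    weight j and every row weight k. Built by shuffling the n*j variable
    sockets against the m*k check sockets, rejecting repeated edges."""
    assert (n * j) % k == 0, "need n*j divisible by k"
    m = n * j // k
    rng = random.Random(seed)
    var_sockets = [v for v in range(n) for _ in range(j)]
    while True:
        rng.shuffle(var_sockets)
        edges = set()
        for e, v in enumerate(var_sockets):
            c = e // k              # check sockets are filled in blocks of k
            if (c, v) in edges:     # duplicate edge: reject and reshuffle
                break
            edges.add((c, v))
        else:
            H = [[0] * n for _ in range(m)]
            for c, v in edges:
                H[c][v] = 1
            return H
```

For example, `regular_ldpc_H(20, 3, 4)` returns a 15 × 20 matrix in which every column has weight 3 and every row weight 4; rejection sampling like this is practical only at small block lengths, which is all a sketch needs.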