INVITED PAPER

Channel Coding: The Road to Channel Capacity

Fifty years of effort and invention have finally produced coding schemes that closely approach Shannon's channel capacity limit on memoryless communication channels.

By Daniel J. Costello, Jr., Fellow IEEE, and G. David Forney, Jr., Life Fellow IEEE

ABSTRACT | Starting from Shannon's celebrated 1948 channel coding theorem, we trace the evolution of channel coding from Hamming codes to capacity-approaching codes. We focus on the contributions that have led to the most significant improvements in performance versus complexity for practical applications, particularly on the additive white Gaussian noise channel. We discuss algebraic block codes, and why they did not prove to be the way to get to the Shannon limit. We trace the antecedents of today's capacity-approaching codes: convolutional codes, concatenated codes, and other probabilistic coding schemes. Finally, we sketch some of the practical applications of these codes.

KEYWORDS | Algebraic block codes; channel coding; codes on graphs; concatenated codes; convolutional codes; low-density parity-check codes; turbo codes

Manuscript received May 31, 2006; revised January 22, 2007. This work was supported in part by the National Science Foundation under Grant CCR02-05310 and by the National Aeronautics and Space Administration under Grant NNGO5GH73G.
D. J. Costello, Jr., is with the Department of Electrical Engineering, University of Notre Dame, Notre Dame, IN 46556 USA (e-mail: [email protected]).
G. D. Forney, Jr., is with the Department of Electrical Engineering and Computer Science, Massachusetts Institute of Technology, Cambridge, MA 02139 USA (e-mail: [email protected]).
Digital Object Identifier: 10.1109/JPROC.2007.895188

I. INTRODUCTION

The field of channel coding started with Claude Shannon's 1948 landmark paper [1]. For the next half century, its central objective was to find practical coding schemes that could approach channel capacity (hereafter called "the Shannon limit") on well-understood channels such as the additive white Gaussian noise (AWGN) channel. This goal proved to be challenging, but not impossible. In the past decade, with the advent of turbo codes and the rebirth of low-density parity-check (LDPC) codes, it has finally been achieved, at least in many cases of practical interest.

As Bob McEliece observed in his 2004 Shannon Lecture [2], the extraordinary efforts that were required to achieve this objective may not be fully appreciated by future historians. McEliece imagined a biographical note in the 166th edition of the Encyclopedia Galactica along the following lines.

Claude Shannon: Born on the planet Earth (Sol III) in the year 1916 A.D. Generally regarded as the father of the Information Age, he formulated the notion of channel capacity in 1948 A.D. Within several decades, mathematicians and engineers had devised practical ways to communicate reliably at data rates within 1% of the Shannon limit ...

The purpose of this paper is to tell the story of how Shannon's challenge was met, at least as it appeared to us, before the details of this story are lost to memory.

We focus on the AWGN channel, which was the target for many of these efforts. In Section II, we review various definitions of the Shannon limit for this channel.

In Section III, we discuss the subfield of algebraic coding, which dominated the field of channel coding for its first couple of decades. We will discuss both the achievements of algebraic coding, and also the reasons why it did not prove to be the way to approach the Shannon limit.

In Section IV, we discuss the alternative line of development that was inspired more directly by Shannon's random coding approach, which is sometimes called "probabilistic coding." This line of development includes convolutional codes, product codes, concatenated codes, trellis decoding of block codes, and ultimately modern capacity-approaching codes.

In Section V, we discuss codes for bandwidth-limited channels, namely lattice codes and trellis-coded modulation.

Finally, in Section VI, we discuss the development of capacity-approaching codes, principally turbo codes and LDPC codes.

II. CODING FOR THE AWGN CHANNEL

A coding scheme for the AWGN channel may be characterized by two simple parameters: its signal-to-noise ratio (SNR) and its spectral efficiency $\eta$ in bits per second per Hertz (b/s/Hz). The SNR is the ratio of average signal power to average noise power, a dimensionless quantity. The spectral efficiency of a coding scheme that transmits R bits per second (b/s) over an AWGN channel of bandwidth W Hz is simply $\eta = R/W$ b/s/Hz.

Coding schemes for the AWGN channel typically map a sequence of bits at a rate R b/s to a sequence of real symbols at a rate of 2B symbols per second; the discrete-time code rate is then $r = R/2B$ bits per symbol. The sequence of real symbols is then modulated via pulse amplitude modulation (PAM) or quadrature amplitude modulation (QAM) for transmission over an AWGN channel of bandwidth W. By Nyquist theory, B (sometimes called the "Shannon bandwidth" [3]) cannot exceed the actual bandwidth W. If $B \approx W$, then the spectral efficiency is $\eta = R/W \approx R/B = 2r$. We therefore say that the nominal spectral efficiency of a discrete-time coding scheme is 2r, the discrete-time code rate in bits per two symbols. The actual spectral efficiency $\eta = R/W$ of the corresponding continuous-time scheme is upperbounded by the nominal spectral efficiency 2r and approaches 2r as $B \to W$. Thus, for discrete-time codes, we will often denote 2r by $\eta$, implicitly assuming $B \approx W$.

Shannon showed that on an AWGN channel with a given SNR and bandwidth W Hz, the rate of reliable transmission is upperbounded by

$$R < W \log_2(1 + \mathrm{SNR}).$$

Moreover, if a long code with rate $R < W \log_2(1 + \mathrm{SNR})$ is chosen at random, then there exists a decoding scheme such that with high probability the code and decoder will achieve highly reliable transmission (i.e., low probability of decoding error).

Equivalently, Shannon's result shows that the spectral efficiency is upperbounded by

$$\eta < \log_2(1 + \mathrm{SNR})$$

or, given a spectral efficiency $\eta$, that the SNR needed for reliable transmission is lowerbounded by

$$\mathrm{SNR} > 2^\eta - 1.$$

So, we may say that the Shannon limit on rate (i.e., the channel capacity) is $W \log_2(1 + \mathrm{SNR})$ b/s, or equivalently that the Shannon limit on spectral efficiency is $\log_2(1 + \mathrm{SNR})$ b/s/Hz, or equivalently that the Shannon limit on SNR for a given spectral efficiency $\eta$ is $2^\eta - 1$. Note that the Shannon limit on SNR is a lower bound rather than an upper bound.
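To make these bounds concrete, the short Python sketch below (ours, not part of the original paper) evaluates the capacity formula and the corresponding Shannon limit on SNR. The bandwidth, SNR, and target spectral efficiency are arbitrary example values, and the function names are our own.

```python
import math

def capacity_bps(bandwidth_hz: float, snr: float) -> float:
    """Shannon limit on rate, W * log2(1 + SNR), in b/s."""
    return bandwidth_hz * math.log2(1.0 + snr)

def snr_limit(eta: float) -> float:
    """Shannon limit (lower bound) on SNR for spectral efficiency eta: 2^eta - 1."""
    return 2.0 ** eta - 1.0

if __name__ == "__main__":
    W = 1.0e6                      # example bandwidth: 1 MHz (arbitrary)
    snr_db = 10.0                  # example SNR: 10 dB (arbitrary)
    snr = 10.0 ** (snr_db / 10.0)  # convert dB to a linear power ratio

    # Capacity and spectral-efficiency limit at this operating point.
    print(f"capacity  = {capacity_bps(W, snr):.3e} b/s")
    print(f"eta limit = {math.log2(1.0 + snr):.3f} b/s/Hz")

    # Minimum SNR (linear and in dB) for a target spectral efficiency.
    eta = 2.0                      # target spectral efficiency, b/s/Hz (arbitrary)
    s = snr_limit(eta)
    print(f"SNR needed for eta = {eta}: > {s:.2f} ({10.0 * math.log10(s):.2f} dB)")
```

With these example numbers, the sketch reports that achieving 2 b/s/Hz requires an SNR above 3 (about 4.8 dB), which is simply the bound $2^\eta - 1$ evaluated at $\eta = 2$.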
These bounds suggest that we define a normalized SNR parameter $\mathrm{SNR_{norm}}$ as follows:

$$\mathrm{SNR_{norm}} = \frac{\mathrm{SNR}}{2^\eta - 1}.$$

Then, for any reliable coding scheme, $\mathrm{SNR_{norm}} > 1$; i.e., the Shannon limit (lower bound) on $\mathrm{SNR_{norm}}$ is 1 (0 dB), independent of $\eta$. Moreover, $\mathrm{SNR_{norm}}$ measures the "gap to capacity," i.e., $10 \log_{10} \mathrm{SNR_{norm}}$ is the difference in decibels (dB)¹ between the SNR actually used and the Shannon limit on SNR given $\eta$, namely $2^\eta - 1$.

If the desired spectral efficiency is less than 1 b/s/Hz (the so-called power-limited regime), then it can be shown that binary codes can be used on the AWGN channel with a cost in Shannon limit on SNR of less than 0.2 dB. On the other hand, since for a binary coding scheme the discrete-time code rate is bounded by $r \le 1$ bit per symbol, the spectral efficiency of a binary coding scheme is limited to $\eta \le 2r \le 2$ b/s/Hz, so multilevel coding schemes must be used if the desired spectral efficiency is greater than 2 b/s/Hz (the so-called bandwidth-limited regime). In practice, coding schemes for the power-limited and bandwidth-limited regimes differ considerably.

A closely related normalized SNR parameter that has been traditionally used in the power-limited regime is $E_b/N_0$, which may be defined as

$$E_b/N_0 = \frac{\mathrm{SNR}}{\eta} = \frac{2^\eta - 1}{\eta}\,\mathrm{SNR_{norm}}.$$

For a given spectral efficiency $\eta$, $E_b/N_0$ is thus lowerbounded by

$$E_b/N_0 > \frac{2^\eta - 1}{\eta}$$

so we may say that the Shannon limit (lower bound) on $E_b/N_0$ as a function of $\eta$ is $(2^\eta - 1)/\eta$. This function decreases monotonically with $\eta$ and approaches $\ln 2$ as $\eta \to 0$, so we may say that the ultimate Shannon limit (lower bound) on $E_b/N_0$ for any $\eta$ is $\ln 2$ ($-1.59$ dB).

We see that as $\eta \to 0$, $E_b/N_0 \to \mathrm{SNR_{norm}} \ln 2$, so $E_b/N_0$ and $\mathrm{SNR_{norm}}$ become equivalent parameters in the severely power-limited regime. In the power-limited regime, we will therefore use the traditional parameter $E_b/N_0$. However, in the bandwidth-limited regime, we will use $\mathrm{SNR_{norm}}$, which is more informative in this regime.
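As a small numerical illustration (again ours, with arbitrary example numbers and our own function names), the sketch below computes the gap to capacity $10 \log_{10} \mathrm{SNR_{norm}}$ for a given operating point and tabulates the Shannon limit on $E_b/N_0$ versus $\eta$, confirming that it approaches $\ln 2 \approx -1.59$ dB as $\eta \to 0$.

```python
import math

def snr_norm(snr: float, eta: float) -> float:
    """Normalized SNR: SNR / (2^eta - 1); reliable schemes require SNR_norm > 1."""
    return snr / (2.0 ** eta - 1.0)

def gap_to_capacity_db(snr: float, eta: float) -> float:
    """Gap to capacity in dB: 10 * log10(SNR_norm)."""
    return 10.0 * math.log10(snr_norm(snr, eta))

def ebn0_limit(eta: float) -> float:
    """Shannon limit (lower bound) on Eb/N0 at spectral efficiency eta: (2^eta - 1) / eta."""
    return (2.0 ** eta - 1.0) / eta

if __name__ == "__main__":
    # Example operating point (arbitrary): eta = 1 b/s/Hz at an SNR of 3 dB.
    eta, snr_db = 1.0, 3.0
    snr = 10.0 ** (snr_db / 10.0)
    print(f"gap to capacity at eta={eta}, SNR={snr_db} dB: "
          f"{gap_to_capacity_db(snr, eta):.2f} dB")

    # Eb/N0 Shannon limit for a range of spectral efficiencies; as eta -> 0
    # the limit tends to ln 2, i.e., about -1.59 dB.
    for e in (2.0, 1.0, 0.5, 0.1, 0.01):
        print(f"eta = {e:5.2f} b/s/Hz: Eb/N0 > {10.0 * math.log10(ebn0_limit(e)):6.2f} dB")
    print(f"ln 2 in dB: {10.0 * math.log10(math.log(2.0)):.2f}")
```

With these example numbers, operating at $\eta = 1$ b/s/Hz and an SNR of 3 dB corresponds to a 3 dB gap to capacity, and the tabulated $E_b/N_0$ limits decrease toward $-1.59$ dB as $\eta$ shrinks.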
¹ In decibels, a multiplicative factor of $\alpha$ is expressed as $10 \log_{10} \alpha$ dB.

to the Nyquist limit of 2W binary symbols per second, using standard pulse amplitude modulation (PAM) for baseband channels or quadrature amplitude modulation (QAM) for passband channels. At the receiver, an optimum PAM or QAM detector