Information Theory and Coding
Topic 04/10: Maximum-Likelihood Decoding
M.Sc. Marko Hennhöfer, Ilmenau University of Technology, Communications Research Laboratory
Winter Semester 2019

Contents
4 Channel coding
  4.1 Block codes, asymptotic coding gains
  4.2 Convolutional codes, trellis diagram, hard-/soft-decision decoding
  4.3 Turbo codes
  4.4 LDPC codes
  4.5 Polar codes (additional slide set)
5 Appendix
  5.1 Polynomial representation of convolutional codes
  5.2 Cyclic redundancy check (CRC)
  5.3 Matlab/Octave example: BER of the Hamming code

4 Channel Coding

Overview of the different coding schemes:
- Block codes
  - non-cyclic: Hamming, LDPC, and Polar codes (Polar codes are used in 5G; LDPC codes are used, e.g., in 5G, DVB-T2/S2, WiFi, and WiMAX)
  - cyclic: RS, BCH, and Golay codes (RS codes have been used, e.g., on the 1977 "Voyager" missions, in QR codes, and on the audio CD)
- Convolutional codes: non-recursive or recursive, non-linear or linear; used, e.g., in GSM and LTE
- Concatenated codes: serial or parallel concatenation, e.g., Turbo convolutional codes (TCC) used in 4G LTE
Some of these schemes are treated in detail, some only as basics, and some are not treated in this course.

Channel coding:
The channel coder maps each info word of length k (e.g., 11, 01, ...) to a code word of length n (e.g., 110, 011, ...), adding useful redundancy, e.g., for forward error correction (FEC). This defines an (n,k) block code with code rate

  R = k/n < 1.

Example: (3,1) repetition code. The coded bit rate equals the info bit rate divided by R, i.e., the bandwidth expansion factor n/k results in an increased data rate.

Code properties:
Systematic codes: the info words occur as a part of the code words.
Code space: the 2^k code words form a subset of the 2^n binary words of length n.
Linear codes: the sum of two code words (bit-by-bit modulo-2 addition, without carry) is again a code word.

Minimum Hamming distance: a measure of how different the most closely located code words are,

  d_min = \min_{i \neq j} d_H(x_i, x_j),

obtained by comparing all combinations of code words. For linear codes the comparison simplifies to finding the nonzero code word with the lowest Hamming weight:

  d_min = \min_{x \neq 0} w_H(x).

Maximum likelihood decoding (MLD):
Goal: minimum word error probability. The info words (e.g., 11, 01, ...) are mapped by the channel encoder to code words (e.g., 110, 011, ...), which pass through a discrete channel; from the possibly corrupted received words (e.g., 100, 011, ...) the code word estimator selects valid code words, and the encoder inverse recovers the info word estimates.

The code word estimator is the mapping from all 2^n possible received words to the 2^k valid code words.
Example: (7,4) Hamming code: 2^7 = 128 possible received words, 2^4 = 16 valid code words.

Decoding rule:
Assumption: equal a priori probabilities, i.e., all 2^k code words appear with probability 1/2^k.
Probability for a wrong detection if a certain code word x_i was transmitted:

  P(\hat{x} \neq x_i \mid x_i) = \sum_{y:\, \hat{x}(y) \neq x_i} P(y \mid x_i),

i.e., the probability to receive a word y that yields an estimate \hat{x}(y) different from x_i. Furthermore, since \sum_{y} P(y \mid x_i) = 1 over all 2^n received words, this error probability can equivalently be written as one minus the probability of a correct decision.
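As a complement to the decoding rule above, the following minimal sketch implements the code word estimator as a brute-force table lookup: the received word is compared against all 2^k valid code words and the one with minimum Hamming distance is returned. Python/NumPy is used here as an assumption (the course's own examples in the appendix are in Matlab/Octave), and the generator matrix G is chosen so that it reproduces the (7,4) Hamming code word table listed later in these slides.

```python
# Minimal sketch of brute-force ML (minimum Hamming distance) decoding
# for the (7,4) Hamming code. The generator matrix is an assumption chosen
# to match the code word table shown later in this slide set.
import itertools
import numpy as np

G = np.array([[1, 0, 0, 0, 1, 0, 1],
              [0, 1, 0, 0, 1, 1, 1],
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 0, 1, 1]])   # systematic generator, info bits first
k, n = G.shape

# All 2^k info words and the corresponding 2^k valid code words
info_words = np.array(list(itertools.product([0, 1], repeat=k)))
codebook = info_words @ G % 2            # modulo-2 arithmetic, no carry

def mld_hard(y):
    """Map a received word y (length n) to the closest valid code word."""
    dists = np.sum(codebook != y, axis=1)   # Hamming distances to all code words
    i_hat = np.argmin(dists)                # ML decision for a BSC
    return codebook[i_hat], info_words[i_hat]

# Example: code word 1000101 transmitted, one bit flipped by the channel
y = np.array([1, 0, 0, 0, 0, 0, 1])
cw_hat, u_hat = mld_hard(y)
print(cw_hat, u_hat)   # -> [1 0 0 0 1 0 1] [1 0 0 0], the single error is corrected
```

For small codes this exhaustive search is exactly the ML decoder; for long codes the table of 2^k code words becomes infeasible, which motivates the structured decoders treated in the later chapters.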
Example: (n=3, k=1) repetition code:
Assumption: equal a priori probabilities, i.e., each of the 2^k = 2^1 = 2 code words (111, 000) appears with probability 1/2^k = 1/2.
Probability for a wrong detection if a certain code word was transmitted: e.g., assume 111 was transmitted over a BSC with crossover probability p_e, and consider all received words y that yield a wrong estimate:

  Received y | Decoded | Probability P(y | 111)
  000        | 000     | p_e * p_e * p_e
  001        | 000     | p_e * p_e * (1 - p_e)
  010        | 000     | p_e * (1 - p_e) * p_e
  011        | 111     |
  100        | 000     | (1 - p_e) * p_e * p_e
  101        | 111     |
  110        | 111     |
  111        | 111     |

Probability for a wrong detection, now considering all possibly transmitted code words (mean over all transmitted code words):

  P_w = \frac{1}{2^k} \sum_{i} \sum_{y:\, \hat{x}(y) \neq x_i} P(y \mid x_i).

Combining the sums (wrong detection = any detection - correct detection):

  P_w = 1 - \frac{1}{2^k} \sum_{i} \sum_{y:\, \hat{x}(y) = x_i} P(y \mid x_i).

To minimize P_w, choose for each received word y the estimate \hat{x}(y) such that P(y \mid \hat{x}(y)) gets maximized. P(y \mid x) is maximized if we choose a code word with the minimum distance to the received word y.

MLD for the hard-decision DMC: find the code word with minimum Hamming distance to y.
MLD for the soft-decision AWGN channel: find the code word with minimum Euclidean distance to y.

Coding gain:
Suitable measures: the bit error probability P_b (for the BER after decoding we consider the errors in the k decoded info bits) and the code word error probability P_w.
Example: transmit 10 code words (k = 4 info bits each) and let exactly 1 bit error occur after decoding. One wrong bit yields one wrong code word, so P_w = 1/10, while 40 info bits have been transmitted, so P_b = 1/40. As in general more than one bit error can occur within a wrong code word, we can only approximate P_b \approx P_w / k.
If we additionally consider that a decoding error occurs only if more than t bits of a code word are wrong, this approximation can be refined accordingly.

Comparison of codes considering the AWGN channel: energy per info bit vs. energy per coded bit (for constant transmit power):

  E_c = R E_b = \frac{k}{n} E_b.

Example: (3,1) repetition code: the energy of one info bit is spread over three coded bits, so E_c = E_b / 3.

[Figure: BER vs. E_c/N_0 in dB for the (7,4) Hamming code, showing the uncoded reference, P_b hard decision (approx.), and P_b soft decision (approx.). In the low-SNR regime the code suffers from the reduced energy per coded bit; at high SNR the asymptotic coding gain and the hard- vs. soft-decision gain are visible.]

Analytical calculation of the error probabilities, hard decision:
There are \binom{n}{r} combinations for r errors in a sequence of length n.
Example: (3,1) repetition code (info word, code word, received word): 1 combination for 3 errors, 3 combinations for 2 errors, 3 combinations for 1 error; a single error will be corrected, whereas 2 or 3 errors cause a code word error.
In general, code word errors occur if more than t bits in a sequence of length n are wrong:

  P_w = \sum_{r=t+1}^{n} \binom{n}{r} p_e^{r} (1 - p_e)^{n-r},

where \binom{n}{r} is the number of combinations for r errors, p_e^{r} the probability for r errors, and (1 - p_e)^{n-r} the probability for the n - r correct bits.
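The binomial sum just stated can be evaluated directly. The short sketch below (again Python, as an illustrative assumption; p_e = 0.01 is an arbitrary example value) computes P_w for the (3,1) repetition code and the (7,4) Hamming code, both with t = 1, and checks the repetition-code result against the closed form 3 p_e^2 (1 - p_e) + p_e^3 obtained from the table above.

```python
# Minimal sketch: word error probability for hard-decision decoding,
#   P_w = sum_{r=t+1}^{n} C(n, r) * p_e^r * (1 - p_e)^(n - r)
# evaluated for the (3,1) repetition code and the (7,4) Hamming code.
from math import comb

def p_word_hard(n, t, pe):
    """Probability that more than t of the n code bits are received in error."""
    return sum(comb(n, r) * pe**r * (1 - pe)**(n - r) for r in range(t + 1, n + 1))

pe = 0.01   # BSC crossover probability (example value, an assumption)

# (3,1) repetition code: d_min = 3, corrects t = 1 error
p_rep = p_word_hard(n=3, t=1, pe=pe)
print(p_rep)                              # approx. 2.98e-04
print(3 * pe**2 * (1 - pe) + pe**3)       # closed form from the table above: same value

# (7,4) Hamming code: d_min = 3, t = 1
print(p_word_hard(n=7, t=1, pe=pe))       # approx. 21 * pe^2 for small pe
```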
Approximation for small values of p_e: only take the lowest power of p_e into account. In general,

  P_w \approx \binom{n}{t+1} p_e^{t+1}.

Example: (7,4) Hamming code (t = 1):

  P_w \approx \binom{7}{2} p_e^{2} = 21\, p_e^{2},

where, for a binary modulation scheme and an AWGN channel, p_e = Q\!\left(\sqrt{2 E_c / N_0}\right).

Set of code words for the (7,4) Hamming code:

  0 0 0 0 0 0 0    1 0 0 0 1 0 1
  0 0 0 1 0 1 1    1 0 0 1 1 1 0
  0 0 1 0 1 1 0    1 0 1 0 0 1 1
  0 0 1 1 1 0 1    1 0 1 1 0 0 0
  0 1 0 0 1 1 1    1 1 0 0 0 1 0
  0 1 0 1 1 0 0    1 1 0 1 0 0 1
  0 1 1 0 0 0 1    1 1 1 0 1 0 0
  0 1 1 1 0 1 0    1 1 1 1 1 1 1

[Figure: BER vs. E_b/N_0 in dB for the (7,4) Hamming code with hard decision: simulated P_b and P_w compared with the calculated approximations derived before; more bits should have been simulated to get reliable results at the lowest error rates.]

Asymptotic coding gain for hard-decision decoding:
Uncoded transmission: P_b = Q(\sqrt{2 E_b / N_0}), for which the high-SNR approximation of the Q-function is good. For the coded transmission, insert p_e = Q(\sqrt{2 R E_b / N_0}) into the approximation of P_w above. Assuming a constant BER and comparing the required signal-to-noise ratios in dB yields the asymptotic coding gain; for hard decision it amounts to 10 \log_{10}\!\left(R (t+1)\right) dB.

[Figure: BER vs. E_c/N_0 in dB for the (7,4) Hamming code, uncoded reference vs. P_b hard decision (approx.) and P_b soft decision (approx.), illustrating the asymptotic coding gain.]

Analytical calculation of the error probabilities, soft decision:
The code word x is transmitted over an AWGN channel; the received word is y = x + n with an i.i.d. noise vector n.
Example: (3,2) parity check code.

Example continued, using the ML decoding rule derived before.
Pairwise error probability: assume x_i has been transmitted. What is the probability that the decoder decides for a different code word x_j? The decoder will decide for x_j if the received word has a smaller Euclidean distance to x_j than to x_i:

  P(x_i \to x_j) = P\!\left( \| y - x_j \|^2 < \| y - x_i \|^2 \;\middle|\; x_i \text{ transmitted} \right).

Next, the norm is evaluated by summing the squared components. For the whole code word, x_i and x_j differ in d_H(x_i, x_j) bit positions, and only these positions contribute to the difference of the two distances.
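To connect the analytical curves with the simulated ones shown in the figures, the following sketch is a minimal Monte-Carlo stand-in for the Matlab/Octave BER example mentioned in the appendix (the chosen E_b/N_0 point, the number of simulated code words, and the Python/NumPy implementation are assumptions). It transmits (7,4) Hamming code words with BPSK over an AWGN channel and compares hard-decision MLD (minimum Hamming distance after thresholding) with soft-decision MLD (minimum Euclidean distance).

```python
# Minimal Monte-Carlo sketch: hard vs. soft decision MLD for the (7,4) Hamming
# code with BPSK over AWGN. Illustrative stand-in for the Matlab/Octave BER
# example referenced in the appendix; all parameter values are assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(0)

G = np.array([[1, 0, 0, 0, 1, 0, 1],      # generator consistent with the
              [0, 1, 0, 0, 1, 1, 1],      # code word table above
              [0, 0, 1, 0, 1, 1, 0],
              [0, 0, 0, 1, 0, 1, 1]])
k, n = G.shape
R = k / n

info_words = np.array(list(itertools.product([0, 1], repeat=k)))
codebook = info_words @ G % 2
codebook_bpsk = 1.0 - 2.0 * codebook       # BPSK mapping: 0 -> +1, 1 -> -1

EbN0_dB = 6.0                              # assumed operating point
EcN0 = R * 10.0 ** (EbN0_dB / 10.0)        # energy per coded bit
sigma = np.sqrt(1.0 / (2.0 * EcN0))        # noise std for unit-energy symbols

num_cw = 100_000
err_hard = err_soft = 0
for _ in range(num_cw):
    u = rng.integers(0, 2, k)              # random info word
    x = u @ G % 2                          # encode
    y = (1.0 - 2.0 * x) + sigma * rng.standard_normal(n)   # BPSK over AWGN

    # Hard decision: threshold first, then minimum Hamming distance
    y_hard = (y < 0).astype(int)
    i_hard = np.argmin(np.sum(codebook != y_hard, axis=1))

    # Soft decision: minimum Euclidean distance to the BPSK code words
    i_soft = np.argmin(np.sum((codebook_bpsk - y) ** 2, axis=1))

    err_hard += np.count_nonzero(info_words[i_hard] != u)
    err_soft += np.count_nonzero(info_words[i_soft] != u)

print("BER hard:", err_hard / (num_cw * k))
print("BER soft:", err_soft / (num_cw * k))   # soft decision yields the lower BER
```

Sweeping E_b/N_0 over a range of values reproduces the qualitative behaviour of the BER curves above: a loss at low SNR due to the reduced energy per coded bit, a coding gain at high SNR, and an additional gain of soft over hard decision.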
