4 Classical Error Correcting Codes


“Machines should work. People should think.” Richard Hamming.

Coding theory is an application of information theory critical for reliable communication and fault-tolerant information storage and processing; indeed, Shannon's channel coding theorem tells us that we can transmit information over a noisy channel with an arbitrarily low probability of error. A code is designed based on a well-defined set of specifications and protects the information only against the type and number of errors prescribed in its design. A good code:

(i) Adds a minimum amount of redundancy to the original message.

(ii) Admits efficient encoding and decoding schemes; this means that information is easily mapped to and extracted from a codeword.

Reliable communication and fault-tolerant computing are intimately related to each other; the concept of a communication channel, an abstraction for a physical system used to transmit information from one place to another and/or from one time to another, is at the core of communication as well as of information storage and processing.

Figure 96: The organization of Chapter 4 at a glance.

It should however be clear that the existence of error-correcting codes does not guarantee that logic operations can be implemented using noisy gates and circuits. The strategies to build reliable computing systems using unreliable components are based on John von Neumann's studies of error detection and error correction techniques for information storage and processing.

This chapter covers topics from the theory of classical error detection and error correction and introduces concepts useful for understanding quantum error correction, the subject of Chapter 5. For simplicity we often restrict our discussion to the binary case. The organization of Chapter 4 is summarized in Figure 96.
After introducing the basic concepts we discuss linear, polynomial, and several other classes of codes.

4.1 Informal Introduction to Error Detection and Error Correction

Error detection is the process of determining whether a message is in error; error correction is the process of restoring a message in error to its original content. Information is packaged into codewords, and the intuition behind error detection and error correction is to artificially increase some “distance” between codewords so that errors cannot transform one valid codeword into another valid codeword.

Error detection and error correction are based on schemes to increase the redundancy of a message. A crude analogy is to bubble-wrap a fragile item and place it into a box to reduce the chance that the item will be damaged during transport. Redundant information plays the role of the packing materials; it increases the amount of data transmitted, but it also increases the chance that we will be able to restore the original contents of a message distorted during communication. Coding corresponds to the selection of both the packing materials and the strategy to optimally pack the fragile item, subject to the obvious constraints: use the least amount of packing materials and the least amount of effort to pack and unpack.

Example. A trivial example of an error detection scheme is the addition of a parity check bit to a word of a given length. This simple but powerful scheme allows us to detect an odd number of errors, but fails if an even number of errors occur. For example, consider a system that enforces even parity for an eight-bit word. Given the string (10111011), we add one more bit to ensure that the total number of 1s is even, in this case a 0, and we transmit the nine-bit string (101110110). The error detection procedure is to count the number of 1s; we decide that the string is in error if this number is odd.
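The even-parity scheme is easy to make concrete. The following is a minimal sketch in Python; the function names are illustrative, not from the text:

```python
def add_even_parity(bits):
    """Append a parity bit so that the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def has_even_parity(bits):
    """Error check: the received word passes if its number of 1s is even."""
    return sum(bits) % 2 == 0

word = [1, 0, 1, 1, 1, 0, 1, 1]       # (10111011) from the example
sent = add_even_parity(word)          # six 1s already, so a 0 is appended
assert sent == [1, 0, 1, 1, 1, 0, 1, 1, 0]

# A single error is detected ...
one_error = sent.copy()
one_error[3] ^= 1
assert not has_even_parity(one_error)

# ... but an even number of errors goes undetected.
two_errors = sent.copy()
two_errors[3] ^= 1
two_errors[6] ^= 1
assert has_even_parity(two_errors)
```

The second assertion pair is exactly the failure mode discussed next: two bit flips preserve the parity of the word.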
When the information symbols are embedded into the codeword, as in this example, the code is called a systematic code; the information symbols are scattered throughout the codeword of a non-systematic code. This example also hints at the limitations of error detection mechanisms. A code is designed with certain error detection or error correction capabilities and fails to detect, or to correct, error patterns not covered by its original design. In the previous example we transmit (101110110); when two errors occur, in the 4th and the 7th bits, we receive (101010010). This tuple has even parity (an even number of 1s) and our scheme for error detection fails.

Error detection and error correction have practical applications in our daily life. Error detection is used when an item needs to be uniquely identified by a string of digits and there is the possibility of misidentification. Bar codes, also called Universal Product Codes (UPC), and the codes used by the postal service, the American banking system, or the airlines for paper tickets all have some error detection capability. For example, in the case of ordinary bank checks and travellers checks there are means to detect whether the check numbers are entered correctly during various financial transactions.

Example. The International Standard Book Number (ISBN) coding scheme. The ISBN code is designed to detect any single digit in error and any transposition of adjacent digits. The ISBN number consists of 10 digits:

ISBN → d1 d2 d3 d4 d5 d6 d7 d8 d9 d10

Here d10 is computed as follows:

A = (w1 d1 + w2 d2 + w3 d3 + w4 d4 + w5 d5 + w6 d6 + w7 d7 + w8 d8 + w9 d9) mod 11

with w1 = 10, w2 = 9, w3 = 8, w4 = 7, w5 = 6, w6 = 5, w7 = 4, w8 = 3, w9 = 2. Then:

d10 = 11 − A if 2 ≤ A ≤ 10, and d10 = X if A = 1.

For example, consider the book with ISBN 0-471-43962-2. In this case:

A = (10 × 0 + 9 × 4 + 8 × 7 + 7 × 1 + 6 × 4 + 5 × 3 + 4 × 9 + 3 × 6 + 2 × 2) mod 11
A = (36 + 56 + 7 + 24 + 15 + 36 + 18 + 4) mod 11 = 196 mod 11 = 9.
d10 = 11 − A = 2.
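The check-digit rule above can be sketched in a few lines of Python. The function names are illustrative; the validity test uses the standard equivalent restatement of the rule, namely that the weighted sum over all ten digits, with weight 1 on d10 and X standing for 10, must be 0 mod 11. The A = 0 case, which is not listed in the text, yields check digit 0:

```python
def isbn10_check_digit(digits):
    """Compute d10 from the first nine ISBN digits via the weighted sum."""
    weights = range(10, 1, -1)                  # w1..w9 = 10, 9, ..., 2
    A = sum(w * d for w, d in zip(weights, digits)) % 11
    d10 = (11 - A) % 11                         # A = 0 (not in the text) gives 0
    return "X" if d10 == 10 else str(d10)

def isbn10_valid(code):
    """Check a full 10-character ISBN; 'X' stands for 10 in the last place."""
    vals = [10 if c == "X" else int(c) for c in code]
    return sum(w * v for w, v in zip(range(10, 0, -1), vals)) % 11 == 0

# The book from the example, ISBN 0-471-43962-2:
assert isbn10_check_digit([0, 4, 7, 1, 4, 3, 9, 6, 2]) == "2"
assert isbn10_valid("0471439622")

# A transposition of the adjacent digits 6 and 2 is detected:
assert not isbn10_valid("0471439262")
```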
The code presented above detects any single digit in error because the weights are relatively prime to 11. Indeed, if digit di is affected by an error and becomes di + ei with ei ≠ 0, then

wi (di + ei) mod 11 ≠ wi di mod 11

because wi ei ≠ 0 mod 11; wi is relatively prime to 11 and 0 < |ei| < 11. The code also detects any transposition of adjacent digits. Indeed, transposing di and di+1 changes A by (wi − wi+1)(di+1 − di) = di+1 − di, since the difference of two adjacent weights is wi − wi+1 = 1; as 0 < |di+1 − di| < 11 (if di = di+1 the transposition leaves the number unchanged), the change cannot be a multiple of 11. We conclude that the ISBN code detects any single digit in error and any transposition of adjacent digits.

Assume that we wish to transmit through a binary channel a text from a source A with an alphabet of 32 letters. We can encode each of the 32 = 2^5 letters of the alphabet as a 5-bit binary string and rewrite the entire text in the binary alphabet accepted by the channel. This strategy supports neither detection nor correction of transmission errors because all possible 5-bit combinations are exhausted to encode the 32 letters of the alphabet; any single-bit error will transform a valid letter into another valid letter. However, if we use more than five bits to represent each letter of the alphabet, we have a chance to correct a number of single-bit errors. Adding redundancy to a message increases the amount of information transmitted through a communication channel. The ratio of useful information to total information packed in each codeword defines the rate of a code; see Section 4.2.

Example. We now consider a very simple error correcting code, a “repetitive” code. Instead of transmitting a single binary symbol, 0 or 1, we map each bit into a string of n binary symbols: 0 → (00 ... 0) and 1 → (11 ... 1). If n = 2k + 1 we use a majority voting scheme to decode a codeword of n bits. We count the number of 0s; if this number is larger than k we decode the codeword as 0, else we decode it as 1.
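The repetition encoder and majority-vote decoder just described can be sketched as follows (the function names are illustrative):

```python
def encode(bit, n):
    """Repetition code: map one bit to n identical symbols, n = 2k + 1 odd."""
    return [bit] * n

def decode(word):
    """Majority vote: decode to 0 if the 0s outnumber k, else to 1."""
    k = len(word) // 2
    return 0 if word.count(0) > k else 1

assert encode(1, 3) == [1, 1, 1]

# Up to k errors per codeword are corrected:
assert decode([0, 1, 0]) == 0          # n = 3 (k = 1), one bit flipped
assert decode([1, 0, 1, 0, 1]) == 1    # n = 5 (k = 2), two bits flipped
```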
Consider a binary symmetric channel with random errors occurring with probability p, and call p^n_err the probability that an n-bit codeword of this repetitive code is in error; then

p^n_err = 1 − (1 − p)^{k+1} [ (1 − p)^k + C(2k+1, 1) p (1 − p)^{k−1} + ··· + C(2k+1, k) p^k (1 − p)^0 ],

where C(n, i) denotes the binomial coefficient “n choose i”. Indeed, the probability of error is the probability that (k + 1) or more bits are in error, which in turn is equal to one minus the probability that at most k bits are in error. As we can see, the larger k is, the smaller the probability of error; this encoding scheme is better than sending a single bit as long as p^n_err < p. For example, when n = 3 and k = 1 we encode a single bit as follows: 0 → (000) and 1 → (111). Then the probability of error is

p^3_err = 1 − (1 − p)^2 [ (1 − p) + C(3, 1) p ] = 3p^2 − 2p^3.

The encoding scheme increases the reliability of the transmission when p < 1/2.

4.2 Block Codes. Decoding Policies

In this section we discuss information transmission through a classical channel using codewords of fixed length. The channel alphabet, A, is a set of q symbols that can be transmitted through the communication channel; when q = 2 we have the binary alphabet A = {0, 1}. The source encoder maps a message into blocks of k symbols from the code alphabet; then the channel encoder adds r symbols to increase redundancy and transmits n-tuples, or codewords, of length n = k + r, as shown in Figure 74.
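Returning to the repetition-code example, the error probability admits a quick numerical check. This sketch (an illustration, not from the text) sums the binomial probabilities of k + 1 or more bit flips directly and compares the result with the closed form 3p^2 − 2p^3 for n = 3:

```python
from math import comb

def p_err(n, p):
    """P(k+1 or more of the n bits flip) on a BSC with flip probability p."""
    k = n // 2                      # n = 2k + 1
    at_most_k = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))
    return 1 - at_most_k

p = 0.1
assert abs(p_err(3, p) - (3 * p**2 - 2 * p**3)) < 1e-9

# Longer codewords help only on a good enough channel (p < 1/2):
assert p_err(5, 0.1) < p_err(3, 0.1) < 0.1
assert p_err(5, 0.6) > 0.6
```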