5.2 Binary Convolutional Codes


Digital Communication Systems ECS 452
Asst. Prof. Dr. Prapun Suksompong, [email protected]
Office hours: check the Google Calendar on the course website. Dr. Prapun's office: 6th floor of the Sirindhralai building, BKD.

Introduction to Binary Convolutional Codes

Binary convolutional codes were introduced by Elias in 1955, where they are referred to as "convolutional parity-check symbol codes." Peter Elias received the Claude E. Shannon Award in 1977 and the IEEE Richard W. Hamming Medal in 2002 "for fundamental and pioneering contributions to information theory and its applications."

The encoder has memory. In other words, the encoder is a sequential circuit or a finite-state machine, easily implemented by shift register(s). The state of the encoder is defined as the contents of its memory.

The encoding is done on a continuous, running basis rather than by blocks of k data digits. So we use the terms bit streams or sequences for the input and output of the encoder. In theory, these sequences have infinite duration. In practice, the state of the convolutional code is periodically forced to a known state, and therefore code sequences are produced in a block-wise manner.

In general, a rate-k/n convolutional encoder has k shift registers, one per input information bit, and n output coded bits that are given by linear combinations (over the binary field) of the contents of the registers and the input information bits. Both k and n are usually small. For simplicity of exposition, and for practical purposes, only rate-1/n binary convolutional codes (k = 1) are considered here. These are the most widely used binary codes.

(Serial-in/Serial-out) Shift Register

A serial-in/serial-out shift register accepts data serially: one bit at a time on a single line.
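As a minimal sketch of the shift-register encoder just described, the following rate-1/2 encoder (k = 1, n = 2) keeps a two-stage register and emits two mod-2 linear combinations per input bit. The function name and the generator-tap representation `g1`, `g2` are my own; the default taps correspond to the encoder used in Example 1 below, where x(1) = b + s1 + s2 and x(2) = b + s2 (mod 2).

```python
def conv_encode(bits, g1=(1, 1, 1), g2=(1, 0, 1)):
    """Rate-1/2 convolutional encoder: two output bits per input bit.

    The encoder state is the two-stage shift register (s1, s2); each
    output bit is a mod-2 linear combination of (b, s1, s2) selected
    by the generator taps g1 and g2.
    """
    s1 = s2 = 0  # register contents are initially assumed to be 0
    out = []
    for b in bits:
        window = (b, s1, s2)
        x1 = sum(w * g for w, g in zip(window, g1)) % 2
        x2 = sum(w * g for w, g in zip(window, g2)) % 2
        out.extend([x1, x2])
        s1, s2 = b, s1  # one clock pulse: the input bit shifts in
    return out
```

Because the encoder is a finite-state machine, the same input always produces the same output from a given starting state; the stream is encoded continuously, not block by block.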
Each clock pulse moves an input bit to the next flip-flop. [Figure: a five-bit serial-in/serial-out register built from flip-flops FF0-FF4 sharing a common clock; serial data enters at D0 and leaves at Q4, and a 1 is shown moving across the register.]

Example 1: n = 2, k = 1. [Figure: a rate-1/2 encoder with a two-stage shift register (contents s1, s2) and two modulo-2 adders.]

Three Graphical Representations

Three different but related graphical representations have been devised for the study of convolutional encoding:
1. the state diagram
2. the code tree
3. the trellis diagram

Ex. 1: State (Transition) Diagram

The encoder behavior can be seen from the perspective of a finite-state machine with its state (transition) diagram: a four-state directed graph that uniquely represents the input-output relation of the encoder. Each branch is labeled "input/output". For Example 1 the branches are: 00 -(0/00)-> 00, 00 -(1/11)-> 10, 10 -(0/10)-> 01, 10 -(1/01)-> 11, 01 -(0/11)-> 00, 01 -(1/00)-> 10, 11 -(0/01)-> 01, 11 -(1/10)-> 11.

Drawing the State Diagram

The diagram can be drawn directly from the encoder's truth table, which lists the output bits x(1) x(2) for each combination of input bit b and register contents s1 s2:

b s1 s2 | x(1) x(2)
0  0  0 |  0    0
1  0  0 |  1    1
0  0  1 |  1    1
1  0  1 |  0    0
0  1  0 |  1    0
1  1  0 |  0    1
0  1  1 |  0    1
1  1  1 |  1    0

Tracing the State Diagram to Find the Outputs

Input:  1  1  0  1  1  1
Output: 11 01 01 00 01 10

The output can also be found directly from the table, without drawing the diagram: shift the input into the register one bit at a time and read off x(1) x(2) at each step.

Code Tree and Trellis Diagram

The same three representations apply; we now turn to the code tree and the trellis diagram.
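The state-diagram construction above, where each (current state, input bit) pair determines a next state and an output pair, can be tabulated programmatically. A sketch for the Example 1 encoder (x(1) = b + s1 + s2, x(2) = b + s2); the function name `step` is my own:

```python
def step(state, b):
    """One transition of the Example 1 encoder's finite-state machine.

    Returns (next_state, output_bits) for current register contents
    `state` = (s1, s2) and input bit `b`.
    """
    s1, s2 = state
    x1 = (b + s1 + s2) % 2  # first output bit
    x2 = (b + s2) % 2       # second output bit
    return (b, s1), (x1, x2)

# Enumerate every branch of the four-state diagram: 4 states x 2 inputs.
transitions = {
    (state, b): step(state, b)
    for state in [(0, 0), (1, 0), (0, 1), (1, 1)]
    for b in (0, 1)
}
```

Each entry of `transitions` is one labeled edge of the state diagram, e.g. state 00 with input 1 goes to state 10 with output 11.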
Code Tree

The code tree shows the coded output for any possible sequence of data digits. Two branches initiate from each node: the upper one for input 0 and the lower one for input 1, each labeled with its input/output bits. Initially, we always assume that all the contents of the register are 0, so every path starts from state 00. [Figure: the code tree for Example 1, expanded four levels from the start node.]

Example: input 1 1 0 1 traces the path with output 11 01 01 00.

Code Trellis [Carlson & Crilly, 2009, p. 620]

The code trellis is a more compact labeling of the same transitions: a column of current states (00, 10, 01, 11) is connected to a column of next states, with each branch labeled input/output.

Towards the Trellis Diagram

The trellis diagram is another useful way of representing the code tree: the trellis section is simply repeated once per input bit.

Trellis Diagram

Initially, we always assume that all the contents of the register are 0. Each path that traverses through the trellis represents a valid codeword.
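Since every path through the trellis is a valid codeword, enumerating all paths of a fixed length reproduces the codebook discussed next. A self-contained sketch for three-bit message blocks (the encoder here is the Example 1 encoder; names are my own):

```python
from itertools import product

def encode(bits):
    """Rate-1/2 encoder of Example 1 (register starts at 00)."""
    s1 = s2 = 0
    out = []
    for b in bits:
        out.extend([(b + s1 + s2) % 2, (b + s2) % 2])
        s1, s2 = b, s1
    return tuple(out)

# Every 3-bit message block maps to one trellis path, i.e. one
# 6-bit codeword.
codebook = {msg: encode(msg) for msg in product((0, 1), repeat=3)}
```

The eight codewords are distinct, as expected: distinct inputs trace distinct paths through the trellis.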
Example (read from the trellis):

Input:  1  1  0  1  1  1
Output: 11 01 01 00 01 10

(Thai gloss of "trellis": a latticework; an openwork screen structure; a frame used to support climbing plants.)

Convolutional Encoder: Graphical Representations

[Figure: the Example 1 encoder shown alongside its state diagram, code tree, and trellis diagram.]

The Codebook

Restricting attention to message blocks of three bits (with the register starting at 00), the trellis paths give the codebook:

b1 b2 b3 | x1 x2 x3 x4 x5 x6
0  0  0  |  0  0  0  0  0  0
0  0  1  |  0  0  0  0  1  1
0  1  0  |  0  0  1  1  1  0
0  1  1  |  0  0  1  1  0  1
1  0  0  |  1  1  1  0  1  1
1  0  1  |  1  1  1  0  0  0
1  1  0  |  1  1  0  1  0  1
1  1  1  |  1  1  0  1  1  0

Introduction to Viterbi Decoding

Convolutional Encoder and Decoder

[System diagram:] The message (a stream of data bits, d or b) enters the convolutional encoder; the encoded stream (x or c) is passed to the digital modulator and transmitted. The channel is modeled as a binary symmetric channel with crossover probability p < 0.5 (noise and interference), so the received sequence is y = x + e, where e is the error pattern. The channel decoder, a minimum distance decoder (here, the Viterbi decoder), together with the digital demodulator produces the recovered message. [Figure: a trellis annotated with branch metrics, previewing Viterbi decoding.]

Direct Minimum Distance Decoding

Suppose y = [11 01 11].
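Direct minimum distance decoding, as set up above, simply compares the received sequence y against all codewords in the codebook and picks a nearest one in Hamming distance. A self-contained sketch for three-bit messages with the Example 1 encoder (function names are my own; the Viterbi algorithm, covered next, finds the same nearest path without enumerating the whole codebook):

```python
from itertools import product

def encode(bits):
    """Rate-1/2 encoder of Example 1 (register starts at 00)."""
    s1 = s2 = 0
    out = []
    for b in bits:
        out.extend([(b + s1 + s2) % 2, (b + s2) % 2])
        s1, s2 = b, s1
    return tuple(out)

def min_distance_decode(y):
    """Return (message, distance) of a codeword nearest to y."""
    def dist(msg):
        # Hamming distance between y and the codeword for msg
        return sum(a != b for a, b in zip(encode(msg), y))
    best = min(product((0, 1), repeat=3), key=dist)
    return best, dist(best)

msg, d = min_distance_decode((1, 1, 0, 1, 1, 1))  # y = [11 01 11]
```

Note that for this particular y, two codewords (for messages 110 and 111) are both at distance 1, so a minimum distance decoder must break a tie; `min` above simply keeps the first minimizer it encounters.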