Introduction to Theory of Computation

Total pages: 16

File type: PDF, size: 1020 KB

Introduction to Theory of Computation

Anil Maheshwari and Michiel Smid
School of Computer Science, Carleton University, Ottawa, Canada
{anil,michiel}@scs.carleton.ca

April 17, 2019

Contents

Preface

1 Introduction
  1.1 Purpose and motivation
    1.1.1 Complexity theory
    1.1.2 Computability theory
    1.1.3 Automata theory
    1.1.4 This course
  1.2 Mathematical preliminaries
  1.3 Proof techniques
    1.3.1 Direct proofs
    1.3.2 Constructive proofs
    1.3.3 Nonconstructive proofs
    1.3.4 Proofs by contradiction
    1.3.5 The pigeon hole principle
    1.3.6 Proofs by induction
    1.3.7 More examples of proofs
  Exercises

2 Finite Automata and Regular Languages
  2.1 An example: Controlling a toll gate
  2.2 Deterministic finite automata
    2.2.1 A first example of a finite automaton
    2.2.2 A second example of a finite automaton
    2.2.3 A third example of a finite automaton
  2.3 Regular operations
  2.4 Nondeterministic finite automata
    2.4.1 A first example
    2.4.2 A second example
    2.4.3 A third example
    2.4.4 Definition of nondeterministic finite automaton
  2.5 Equivalence of DFAs and NFAs
    2.5.1 An example
  2.6 Closure under the regular operations
  2.7 Regular expressions
  2.8 Equivalence of regular expressions and regular languages
    2.8.1 Every regular expression describes a regular language
    2.8.2 Converting a DFA to a regular expression
  2.9 The pumping lemma and nonregular languages
    2.9.1 Applications of the pumping lemma
  2.10 Higman's Theorem
    2.10.1 Dickson's Theorem
    2.10.2 Proof of Higman's Theorem
  Exercises

3 Context-Free Languages
  3.1 Context-free grammars
  3.2 Examples of context-free grammars
    3.2.1 Properly nested parentheses
    3.2.2 A context-free grammar for a nonregular language
    3.2.3 A context-free grammar for the complement of a nonregular language
    3.2.4 A context-free grammar that verifies addition
  3.3 Regular languages are context-free
    3.3.1 An example
  3.4 Chomsky normal form
    3.4.1 An example
  3.5 Pushdown automata
  3.6 Examples of pushdown automata
    3.6.1 Properly nested parentheses
    3.6.2 Strings of the form 0^n 1^n
    3.6.3 Strings with b in the middle
  3.7 Equivalence of pushdown automata and context-free grammars
  3.8 The pumping lemma for context-free languages
    3.8.1 Proof of the pumping lemma
    3.8.2 Applications of the pumping lemma
  Exercises

4 Turing Machines and the Church-Turing Thesis
  4.1 Definition of a Turing machine
  4.2 Examples of Turing machines
    4.2.1 Accepting palindromes using one tape
    4.2.2 Accepting palindromes using two tapes
    4.2.3 Accepting a^n b^n c^n using one tape
    4.2.4 Accepting a^n b^n c^n using tape alphabet {a, b, c, ✷}
    4.2.5 Accepting a^m b^n c^{mn} using one tape
  4.3 Multi-tape Turing machines
  4.4 The Church-Turing Thesis
  Exercises

5 Decidable and Undecidable Languages
  5.1 Decidability
    5.1.1 The language A_DFA
    5.1.2 The language A_NFA
    5.1.3 The language A_CFG
    5.1.4 The language A_TM
    5.1.5 The Halting Problem
  5.2 Countable sets
    5.2.1 The Halting Problem revisited
  5.3 Rice's Theorem
    5.3.1 Proof of Rice's Theorem
  5.4 Enumerability
    5.4.1 Hilbert's problem
    5.4.2 The language A_TM
  5.5 Where does the term "enumerable" come from?
  5.6 Most languages are not enumerable
    5.6.1 The set of enumerable languages is countable
    5.6.2 The set of all languages is not countable
    5.6.3 There are languages that are not enumerable
  5.7 The relationship between decidable and enumerable languages
  5.8 A language A such that both A and its complement are not enumerable
    5.8.1 EQ_TM is not enumerable
    5.8.2 The complement of EQ_TM is not enumerable
  Exercises

6 Complexity Theory
  6.1 The running time of algorithms
  6.2 The complexity class P
    6.2.1 Some examples
  6.3 The complexity class NP
    6.3.1 P is contained in NP
    6.3.2 Deciding NP-languages in exponential time
    6.3.3 Summary
  6.4 Non-deterministic algorithms
  6.5 NP-complete languages
    6.5.1 Two examples of reductions
    6.5.2 Definition of NP-completeness
    6.5.3 An NP-complete domino game
    6.5.4 Examples of NP-complete languages
  Exercises

7 Summary

Preface

This is a free textbook for an undergraduate course on the Theory of Computation, which we have been teaching at Carleton University since 2002. Until the 2011/2012 academic year, this course was offered as a second-year course (COMP 2805) and was compulsory for all Computer Science students. Starting with the 2012/2013 academic year, the course has been downgraded to a third-year optional course (COMP 3803).

We have been developing this book since we started teaching this course. Currently, we cover most of the material from Chapters 2–5 during a 12-week term with three hours of classes per week. The material from Chapter 6, on Complexity Theory, is taught in the third-year course COMP 3804 (Design and Analysis of Algorithms). In the early years of COMP 2805, we gave a two-lecture overview of Complexity Theory at the end of the term. Even though this overview has disappeared from the course, we decided to keep Chapter 6. This chapter has not been revised or modified for a long time.

The course as we teach it today has been influenced by the following two textbooks:

• Introduction to the Theory of Computation (second edition), by Michael Sipser, Thomson Course Technology, Boston, 2006.
• Einführung in die Theoretische Informatik, by Klaus Wagner, Springer-Verlag, Berlin, 1994.

Besides reading this text, we recommend that you also take a look at these excellent textbooks, as well as one or more of the following ones:

• Elements of the Theory of Computation (second edition), by Harry Lewis and Christos Papadimitriou, Prentice-Hall, 1998.
• Introduction to Languages and the Theory of Computation (third edition), by John Martin, McGraw-Hill, 2003.
• Introduction to Automata Theory, Languages, and Computation (third edition), by John Hopcroft, Rajeev Motwani, and Jeffrey Ullman, Addison Wesley, 2007.
Please let us know if you find errors, typos, simpler proofs, comments, omissions, or if you think that some parts of the book "need improvement".

Chapter 1: Introduction

1.1 Purpose and motivation

This course is on the Theory of Computation, which tries to answer the following questions:

• What are the mathematical properties of computer hardware and software?
• What is a computation and what is an algorithm? Can we give rigorous mathematical definitions of these notions?
• What are the limitations of computers? Can "everything" be computed? (As we will see, the answer to this question is "no".)

Purpose of the Theory of Computation: Develop formal mathematical models of computation that reflect real-world computers.

This field of research was started by mathematicians and logicians in the 1930s, when they were trying to understand the meaning of a "computation". A central question asked was whether all mathematical problems can be solved in a systematic way. The research that started in those days led to computers as we know them today. Nowadays, the Theory of Computation can be divided into the following three areas: Complexity Theory, Computability Theory, and Automata Theory.

1.1.1 Complexity theory

The main question asked in this area is "What makes some problems computationally hard and other problems easy?"

Informally, a problem is called "easy" if it is efficiently solvable. Examples of "easy" problems are (i) sorting a sequence of, say, 1,000,000 numbers, (ii) searching for a name in a telephone directory, and (iii) computing the fastest way to drive from Ottawa to Miami. On the other hand, a problem is called "hard" if it cannot be solved efficiently, or if we don't know whether it can be solved efficiently. Examples of "hard" problems are (i) timetable scheduling for all courses at Carleton, (ii) factoring a 300-digit integer into its prime factors, and (iii) computing a layout for chips in VLSI.

Central Question in Complexity Theory: Classify problems according to their degree of "difficulty". Give a rigorous proof that problems that seem to be "hard" are really "hard".

1.1.2 Computability theory

In the 1930s, Gödel, Turing, and Church discovered that some of the fundamental mathematical problems cannot be solved by a "computer". (This may sound strange, because computers were invented only in the 1940s.) An example of such a problem is "Is an arbitrary mathematical statement true or false?" To attack such a problem, we need formal definitions of the notions of

• computer,
• algorithm, and
• computation.

The theoretical models that were proposed in order to understand solvable and unsolvable problems led to the development of real computers.

Central Question in Computability Theory: Classify problems as being solvable or unsolvable.

1.1.3 Automata theory

Automata Theory deals with definitions and properties of different types of "computation models". Examples of such models are:

• Finite Automata. These are used in text processing, compilers, and hardware design (see the small code sketch after this list).
• Context-Free Grammars. These are used to define programming languages and in Artificial Intelligence.
• Turing Machines. These form a simple abstract model of a "real" computer, such as your PC at home.
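To make the finite-automaton model concrete before Chapter 2, here is a minimal sketch in Python of a deterministic finite automaton that accepts exactly the binary strings containing an even number of 1s. It is not taken from the book; the state names, the transition table, and the choice of language are illustrative only.

```python
# A minimal DFA sketch (illustrative, not from the book): it accepts exactly
# the binary strings that contain an even number of 1s.

START_STATE = "even"          # start having seen zero 1s, which is an even count
ACCEPT_STATES = {"even"}      # accept when the count of 1s seen is even
TRANSITIONS = {               # transition function: (state, symbol) -> next state
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd", "0"): "odd",
    ("odd", "1"): "even",
}

def dfa_accepts(word: str) -> bool:
    """Run the DFA on `word` (a string over {0, 1}) and report acceptance."""
    state = START_STATE
    for symbol in word:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPT_STATES

if __name__ == "__main__":
    for w in ["", "0", "1", "1010", "111"]:
        print(repr(w), "accepted" if dfa_accepts(w) else "rejected")
```

Chapter 2 gives the formal definition of such a machine as a finite set of states, an input alphabet, a transition function, a start state, and a set of accepting states; in the sketch above the dictionary plays the role of the transition function and the set plays the role of the accepting states.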