6.006 Introduction to Algorithms Spring 2008
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
Radix Sort Comparison Sort Runtime of O(N*Log(N)) Is Optimal
Radix Sort: comparison sort runtime of O(n*log(n)) is optimal
• The problem of sorting cannot be solved using comparisons with less than n*log(n) time complexity
• See Proposition I in Chapter 2.2 of the text

How can we sort without comparison?
• Consider the following approach:
• Look at the least-significant digit
• Group numbers with the same digit, maintaining relative order
• Place groups back in the array together (i.e., all the 0's, all the 1's, all the 2's, etc.)
• Repeat for increasingly significant digits

The characteristics of Radix sort
• Least significant digit (LSD) Radix sort: a fast, stable sorting algorithm
• Begins at the least significant digit (e.g. the rightmost digit) and proceeds to the most significant digit (e.g. the leftmost digit)
• Produces lexicographic orderings

Generally speaking
1. Take the least significant digit (or group of bits) of each key
2. Group the keys based on that digit, but otherwise keep the original order of keys. This is what makes the LSD radix sort a stable sort.
3. Repeat the grouping process with each more significant digit

public static void sort(String[] a, int W) {
    int N = a.length;
    int R = 256;
    String[] aux = new String[N];
    for (int d = W - 1; d >= 0; --d) {
        aux = sorted array a by the dth character
        a = aux
    }
}

A Radix sort example
• Problem: how to group the keys based on that digit, but otherwise keep the original order of keys?

Key-indexed counting
1. Take the least significant digit (or group of bits) of each key
2. …
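The placeholder line "aux = sorted array a by the dth character" above is exactly where key-indexed counting goes. The following is a minimal sketch of the completed LSD radix sort for fixed-length strings, assuming an extended-ASCII alphabet (R = 256); the class name, the counting loop, and the test strings are my own illustration, not code from the cited slides.

// LSD radix sort for strings that all have exactly W characters.
// Stable: ties keep their previous relative order, which is what makes
// sorting digit-by-digit from the right correct.
public class LSDRadixSort {
    public static void sort(String[] a, int W) {
        int N = a.length;
        int R = 256;                       // extended-ASCII alphabet size (assumption)
        String[] aux = new String[N];
        for (int d = W - 1; d >= 0; d--) { // least significant character first
            int[] count = new int[R + 1];
            for (int i = 0; i < N; i++)    // count frequency of each character
                count[a[i].charAt(d) + 1]++;
            for (int r = 0; r < R; r++)    // turn counts into starting indices
                count[r + 1] += count[r];
            for (int i = 0; i < N; i++)    // distribute, preserving order (stability)
                aux[count[a[i].charAt(d)]++] = a[i];
            for (int i = 0; i < N; i++)    // copy back for the next pass
                a[i] = aux[i];
        }
    }

    public static void main(String[] args) {
        String[] a = {"dab", "cab", "fad", "bad", "dad", "ebb", "ace"};
        sort(a, 3);
        System.out.println(java.util.Arrays.toString(a)); // [ace, bad, cab, dab, dad, ebb, fad]
    }
}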
Vertex Ordering Problems in Directed Graph Streams∗
Vertex Ordering Problems in Directed Graph Streams
Amit Chakrabarti, Prantar Ghosh, Andrew McGregor, Sofya Vorotnikova

Abstract. We consider directed graph algorithms in a streaming setting, focusing on problems concerning orderings of the vertices. This includes such fundamental problems as topological sorting and acyclicity testing. We also study the related problems of finding a minimum feedback arc set (edges whose removal yields an acyclic graph), and finding a sink vertex. We are interested in both adversarially-ordered and randomly-ordered streams. For arbitrary input graphs with edges ordered adversarially, we show that most of these problems have high space complexity, precluding sublinear-space solutions. Some lower bounds also apply when the stream is randomly ordered: e.g., in our most technical result we show that testing acyclicity in the p-pass random-order model requires roughly n^(1+1/p) space.

… constant-pass algorithms for directed reachability [12]. This is rather unfortunate given that many of the massive graphs often mentioned in the context of motivating work on graph streaming are directed, e.g., hyperlinks, citations, and Twitter "follows" all correspond to directed edges. In this paper we consider the complexity of a variety of fundamental problems related to vertex ordering in directed graphs. For example, one basic problem that motivated much of this work is as follows: given a stream consisting of edges of an acyclic graph in an arbitrary order, how much memory is required to return a topological ordering of the graph? In the offline setting, this can be computed in O(m + n) time using …
Algorithms (Decrease-And-Conquer)
Algorithms (Decrease-and-Conquer)
Pramod Ganapathi, Department of Computer Science, State University of New York at Stony Brook
August 22, 2021

Contents
Concepts:
1. Decrease by constant: topological sorting
2. Decrease by constant factor: lighter ball, Josephus problem
3. Variable size decrease: selection problem
Problems: Stooge sort
Contributors: Ayush Sharma

Decrease-and-conquer
Problem(n) → [Step 1. Decrease] → Subproblem(n′) → [Step 2. Conquer] → Subsolution → [Step 3. Combine] → Solution

Types of decrease-and-conquer
• Decrease by constant: n′ = n − c for some constant c
• Decrease by constant factor: n′ = n/c for some constant c
• Variable size decrease: n′ = n − c for some variable c

Decrease by constant
The size of the instance is reduced by the same constant in each iteration of the algorithm. Decrease by 1 is common. Examples: array sum, array search, finding the maximum/minimum element, integer product, exponentiation, topological sorting.

Decrease by constant factor
The size of the instance is reduced by the same constant factor in each iteration of the algorithm. Decrease by factor 2 is common (see the binary search sketch after this excerpt). Examples: binary search, search/insert/delete in a balanced search tree, the fake coin problem, the Josephus problem.

Variable size decrease
The size of the instance is reduced by a variable amount in each iteration of the algorithm. Examples: selection problem, quicksort, search/insert/delete in a binary search tree, interpolation search.

Topological sorting
Problem: A topological sorting of the vertices of a directed acyclic graph is an ordering of the vertices v1, v2, ..., vn in such a way that if there is an edge directed towards vertex vj from vertex vi, then vi comes before vj. …
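As a concrete instance of the decrease-by-constant-factor pattern named above (each step discards half of the remaining range), here is a small binary search sketch; the class and method names are illustrative and not taken from the cited slides.

// Decrease by constant factor: each iteration halves the remaining range,
// so the running time is O(log n).
public class BinarySearch {
    // Returns an index of key in the sorted array a, or -1 if absent.
    public static int search(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // avoids overflow for large lo + hi
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1; // key can only be in the right half
            else hi = mid - 1;              // key can only be in the left half
        }
        return -1;
    }

    public static void main(String[] args) {
        int[] a = {2, 3, 5, 7, 11, 13};
        System.out.println(search(a, 11)); // 4
        System.out.println(search(a, 4));  // -1
    }
}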
Topological Sort (An Application of DFS)
Topological Sort (an application of DFS)
CSC263 Tutorial 9

Topological sort
• We have a set of tasks and a set of dependencies (precedence constraints) of the form "task A must be done before task B"
• Topological sort: an ordering of the tasks that conforms with the given dependencies
• Goal: find a topological sort of the tasks or decide that there is no such ordering

Examples
• Scheduling: when scheduling task graphs in distributed systems, usually we first need to sort the tasks topologically and then assign them to resources (the most efficient scheduling is an NP-complete problem); or during compilation, to order modules/libraries
• Resolving dependencies: apt-get uses topological sorting to obtain the admissible sequence in which a set of Debian packages can be installed/removed

Topological sort more formally
• Suppose that in a directed graph G = (V, E) the vertices V represent tasks, and each edge (u, v) ∈ E means that task u must be done before task v
• What is an ordering of vertices 1, ..., |V| such that for every edge (u, v), u appears before v in the ordering?
• Such an ordering is called a topological sort of G
• Note: there can be multiple topological sorts of G
• Is it possible to execute all the tasks in G in an order that respects all the precedence requirements given by the graph edges?
• The answer is "yes" if and only if the directed graph G has no cycle (otherwise we have a deadlock)
• Such a G is called a Directed Acyclic Graph, or just a DAG

Algorithm for …
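The excerpt breaks off just before the algorithm itself; the following is a sketch of the standard DFS-based topological sort it leads up to, assuming the input is already known to be a DAG (no cycle detection). The adjacency-list representation and names are my own, not the tutorial's.

import java.util.*;

// DFS-based topological sort of a DAG given as adjacency lists.
// A vertex is prepended to the order once all vertices reachable from it
// have finished, so every edge (u, v) ends up with u before v.
public class TopoSort {
    public static List<Integer> topoSort(List<List<Integer>> adj) {
        int n = adj.size();
        boolean[] visited = new boolean[n];
        Deque<Integer> order = new ArrayDeque<>(); // most recently finished vertex first
        for (int v = 0; v < n; v++)
            if (!visited[v]) dfs(v, adj, visited, order);
        return new ArrayList<>(order);
    }

    private static void dfs(int u, List<List<Integer>> adj,
                            boolean[] visited, Deque<Integer> order) {
        visited[u] = true;
        for (int v : adj.get(u))
            if (!visited[v]) dfs(v, adj, visited, order);
        order.addFirst(u); // u finishes after all of its successors
    }

    public static void main(String[] args) {
        // Edges: 0 -> 1, 0 -> 2, 1 -> 3, 2 -> 3
        List<List<Integer>> adj = List.of(List.of(1, 2), List.of(3), List.of(3), List.of());
        System.out.println(topoSort(adj)); // e.g. [0, 2, 1, 3]
    }
}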
6.006 Course Notes
6.006 Course Notes
Wanlin Li, Fall 2018

1 September 6
• Computation
• Definition. Problem: matching inputs to correct outputs
• Definition. Algorithm: a procedure/function matching each input to a single output; correct if it matches the problem
• Want the program to be able to handle an infinite space of inputs, with arbitrarily large input size
• Want a finite and efficient program; need recursive code
• Efficiency determined by asymptotic growth Θ(·)
• O(f(n)) is all functions below the order of growth of f(n)
• Ω(f(n)) is all functions above the order of growth of f(n)
• Θ(f(n)) is a tight bound
• Definition. Log-linear: Θ(n log n); polynomial: Θ(n^c)
• Definition. Exponential: Θ(b^n)
• An algorithm is generally considered efficient if it runs in polynomial time
• Model of computation: the computer has memory (RAM - random access memory) and a CPU for performing computation; the model specifies what operations are allowed in constant time
• The CPU has registers to access and process memory addresses; it needs log n bits to handle an input of size n
• Memory is chunked into "words" of size at least log n
• E.g., a 32-bit machine can handle 2^32 ~ 4 GB; a 64-bit machine can handle ~10^10 GB
• Model of computation is word-RAM
• Ways to solve an algorithms problem: design a new algorithm or reduce to an already known algorithm (particularly search in a data structure, sort, shortest path)
• Classification of algorithms based on the graph of function calls
• Class examples and design paradigms: brute force (completely disconnected), decrease and conquer (path), divide and conquer (tree with branches), dynamic programming (directed acyclic graph), greedy (choose which paths on the directed acyclic graph to take)
• Computation doesn't have to be a tree, just has to be a directed acyclic graph

2 September 11
• Example. …
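For reference, the informal O/Ω/Θ readings above correspond to the usual formal definitions; one standard formulation (my restatement, not text from the notes) is:

\begin{aligned}
O(f(n)) &= \{\, g \mid \exists\, c > 0,\ n_0 : 0 \le g(n) \le c\,f(n) \text{ for all } n \ge n_0 \,\} \\
\Omega(f(n)) &= \{\, g \mid \exists\, c > 0,\ n_0 : g(n) \ge c\,f(n) \ge 0 \text{ for all } n \ge n_0 \,\} \\
\Theta(f(n)) &= O(f(n)) \cap \Omega(f(n))
\end{aligned}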
Lecture 8.Key
CSC 391/691: GPU Programming, Fall 2015
Parallel Sorting Algorithms
Copyright © 2015 Samuel S. Cho

Sorting Algorithms Review
• Bubble Sort: O(n^2)
• Insertion Sort: O(n^2)
• Quick Sort: O(n log n)
• Heap Sort: O(n log n)
• Merge Sort: O(n log n)
• The best we can expect from parallelizing a sequential sorting algorithm across p processors (if the work is distributed evenly among the n elements to be sorted) is O(n log n) / p ~ O(log n).

Compare and Exchange Sorting Algorithms
• Form the basis of several, if not most, classical sequential sorting algorithms.
• Two numbers, say A and B, are compared between processors P0 and P1; after the exchange, P0 holds MIN(A, B) and P1 holds MAX(A, B).

Bubble Sort
• Generic example of a "bad" sorting algorithm.
• Algorithm: compare neighboring elements; swap if the neighbor is out of order; two nested loops; stop when a whole pass completes without any swaps.
• Example on [1, 3, 8, 0, 6, 5]: after pass 1: [1, 3, 0, 6, 5, 8]; after pass 2: [1, 0, 3, 5, 6, 8]; after pass 3: [0, 1, 3, 5, 6, 8]; after pass 4: [0, 1, 3, 5, 6, 8] (done).
• Performance: worst O(n^2), average O(n^2), best O(n).
• "The bubble sort seems to have nothing to recommend it, except a catchy name and the fact that it leads to some interesting theoretical problems." - Donald Knuth, The Art of Computer Programming

Odd-Even Transposition Sort (also Brick Sort)
• Simple sorting algorithm that was introduced in 1972 by Nico Habermann, who originally developed it for parallel architectures ("Parallel Neighbor-Sort"). …
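The excerpt is cut off before describing odd-even transposition sort itself; here is a minimal sequential sketch of that compare-and-exchange scheme (on parallel hardware, the disjoint comparisons within each phase would run concurrently). The code is my own illustration, not from the cited lecture.

import java.util.Arrays;

// Odd-even transposition (brick) sort: n phases that alternately
// compare-and-exchange the pairs (0,1),(2,3),... and (1,2),(3,4),...
// Each phase's comparisons are independent, which is what makes the
// algorithm attractive for parallel hardware; run serially it is O(n^2).
public class OddEvenTranspositionSort {
    public static void sort(int[] a) {
        int n = a.length;
        for (int phase = 0; phase < n; phase++) {
            int start = phase % 2;              // even phase: (0,1),(2,3),...; odd phase: (1,2),(3,4),...
            for (int i = start; i + 1 < n; i += 2) {
                if (a[i] > a[i + 1]) {          // compare-and-exchange: MIN stays left, MAX goes right
                    int tmp = a[i]; a[i] = a[i + 1]; a[i + 1] = tmp;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 3, 8, 0, 6, 5};
        sort(a);
        System.out.println(Arrays.toString(a)); // [0, 1, 3, 5, 6, 8]
    }
}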
Tutorial 2: Heapsort, Quicksort, Counting Sort, Radix Sort
Tutorial 2: Heapsort, Quicksort, Counting Sort, Radix Sort
Mayank Saksena
September 20, 2006

1 Heapsort
We review Heapsort and prove some loop invariants for it. For further information, see Chapter 6 of Introduction to Algorithms.

HEAPSORT(A)
1  BUILD-MAX-HEAP(A)
2  for i = length(A) downto 2
3      swap A[1] and A[i]
4      heap-size(A) = heap-size(A) - 1
5      MAX-HEAPIFY(A, 1)

BUILD-MAX-HEAP(A)
1  heap-size(A) = length(A)
2  for i = length(A)/2 downto 1
3      MAX-HEAPIFY(A, i)

MAX-HEAPIFY(A, i)
1  l = LEFT(i)
2  r = RIGHT(i)
3  if l ≤ heap-size(A) and A[l] > A[i] then largest = l
4  else largest = i
5  if r ≤ heap-size(A) and A[r] > A[largest] then largest = r
6  if largest ≠ i
7      swap A[i] and A[largest]
8      MAX-HEAPIFY(A, largest)

Loop invariants
First, assume that MAX-HEAPIFY(A, i) is correct, i.e., that it makes the subtree with root i a max-heap. Under this assumption, we prove that BUILD-MAX-HEAP(A) is correct, i.e., that it makes A a max-heap.
We show: at the start of iteration i of the for-loop of BUILD-MAX-HEAP(A) (line 2), each of the nodes i+1, i+2, ..., n is the root of a max-heap. …
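For readers who want something runnable, here is a 0-indexed Java transcription of the pseudocode above; the index shift from the 1-indexed convention and the explicit heapSize parameter are my adaptation, not part of the tutorial.

import java.util.Arrays;

// Heapsort, following the HEAPSORT / BUILD-MAX-HEAP / MAX-HEAPIFY structure
// above, but with 0-indexed arrays: the children of i are 2*i + 1 and 2*i + 2.
public class HeapSort {
    public static void sort(int[] a) {
        int n = a.length;
        for (int i = n / 2 - 1; i >= 0; i--)   // BUILD-MAX-HEAP
            maxHeapify(a, n, i);
        for (int i = n - 1; i >= 1; i--) {     // repeatedly move the maximum to the end
            swap(a, 0, i);
            maxHeapify(a, i, 0);               // the heap shrinks by one each iteration
        }
    }

    private static void maxHeapify(int[] a, int heapSize, int i) {
        int l = 2 * i + 1, r = 2 * i + 2, largest = i;
        if (l < heapSize && a[l] > a[largest]) largest = l;
        if (r < heapSize && a[r] > a[largest]) largest = r;
        if (largest != i) {
            swap(a, i, largest);
            maxHeapify(a, heapSize, largest);  // sift the old root further down
        }
    }

    private static void swap(int[] a, int i, int j) {
        int tmp = a[i]; a[i] = a[j]; a[j] = tmp;
    }

    public static void main(String[] args) {
        int[] a = {4, 1, 3, 2, 16, 9, 10};
        sort(a);
        System.out.println(Arrays.toString(a)); // [1, 2, 3, 4, 9, 10, 16]
    }
}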
Sorting Algorithm
Sorting algorithm
In computer science, a sorting algorithm is an algorithm that puts elements of a list in a certain order. The most-used orders are numerical order and lexicographical order. Efficient sorting is important for optimizing the use of other algorithms (such as search and merge algorithms) that require sorted lists to work correctly; it is also often useful for canonicalizing data and for producing human-readable output. More formally, the output must satisfy two conditions:
1. The output is in nondecreasing order (each element is no smaller than the previous element according to the desired total order);
2. The output is a permutation, or reordering, of the input.
Since the dawn of computing, the sorting problem has attracted a great deal of research, perhaps due to the complexity of solving it efficiently despite its simple, familiar statement. For example, bubble sort was analyzed as early as 1956.[1] Although many consider it a solved problem, useful new sorting algorithms are still being invented (for example, library sort was first published in 2004). Sorting algorithms are prevalent in introductory computer science classes, where the abundance of algorithms for the problem provides a gentle introduction to a variety of core algorithm concepts, such as big O notation, divide and conquer algorithms, data structures, randomized algorithms, best, worst and average case analysis, time-space tradeoffs, and lower bounds.

Classification
Sorting algorithms used in computer science are often classified by:
• Computational complexity (worst, average and best behaviour) of element comparisons in terms of the size of the list. For typical sorting algorithms, good behavior is O(n log n) and bad behavior is O(n^2). …
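The two output conditions above are easy to check mechanically; the following small sketch (my own illustration, not from the article) verifies that a proposed output is nondecreasing and is a permutation of the input, using a multiset comparison.

import java.util.*;

// Checks the two defining conditions of a correct sort:
// (1) the output is in nondecreasing order, and
// (2) the output is a permutation (reordering) of the input.
public class SortChecker {
    public static boolean isValidSort(int[] input, int[] output) {
        // Condition 1: nondecreasing order.
        for (int i = 1; i < output.length; i++)
            if (output[i - 1] > output[i]) return false;
        // Condition 2: same multiset of elements.
        Map<Integer, Integer> counts = new HashMap<>();
        for (int x : input) counts.merge(x, 1, Integer::sum);
        for (int y : output) counts.merge(y, -1, Integer::sum);
        return input.length == output.length
                && counts.values().stream().allMatch(c -> c == 0);
    }

    public static void main(String[] args) {
        System.out.println(isValidSort(new int[]{3, 1, 2}, new int[]{1, 2, 3})); // true
        System.out.println(isValidSort(new int[]{3, 1, 2}, new int[]{1, 2, 2})); // false
    }
}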
Verified Textbook Algorithms
Verified Textbook Algorithms: A Biased Survey
Tobias Nipkow, Manuel Eberl, and Maximilian P. L. Haslbeck
Technische Universität München

Abstract. This article surveys the state of the art of verifying standard textbook algorithms. We focus largely on the classic text by Cormen et al. Both correctness and running time complexity are considered.

1 Introduction
Correctness proofs of algorithms are one of the main motivations for computer-based theorem proving. This survey focuses on the verification (which for us always means machine-checked) of textbook algorithms. Their often tricky nature means that for the most part they are verified with interactive theorem provers (ITPs). We explicitly cover running time analyses of algorithms, but for reasons of space only those analyses employing ITPs. The rich area of automatic resource analysis (e.g. the work by Jan Hoffmann et al. [103,104]) is out of scope. The following theorem provers appear in our survey and are listed here in alphabetic order: ACL2 [111], Agda [31], Coq [25], HOL4 [181], Isabelle/HOL [151,150], KeY [7], KIV [63], Minlog [21], Mizar [17], Nqthm [34], PVS [157], Why3 [75] (which is primarily automatic). We always indicate which ITP was used in a particular verification, unless it was Isabelle/HOL (which we abbreviate to Isabelle from now on), which remains implicit. Some references to Isabelle formalizations lead into the Archive of Formal Proofs (AFP) [1], an online library of Isabelle proofs. There are a number of algorithm verification frameworks built on top of individual theorem provers. …
Visvesvaraya Technological University a Project Report
VISVESVARAYA TECHNOLOGICAL UNIVERSITY
"Jnana Sangama", Belagavi – 590 018

A PROJECT REPORT ON "PREDICTIVE SCHEDULING OF SORTING ALGORITHMS"
Submitted in partial fulfillment for the award of the degree of Bachelor of Engineering in Computer Science and Engineering by
Ranjit Kumar Sha (1NH13CS092), Sandip Shah (1NH13CS101), Saurabh Rai (1NH13CS104), Gaurav Kumar (1NH13CS718)
Under the guidance of Ms. Sridevi (Senior Assistant Professor, Dept. of CSE, NHCE)
Department of Computer Science and Engineering, New Horizon College of Engineering (ISO-9001:2000 certified, accredited by NAAC 'A', permanently affiliated to VTU), Outer Ring Road, Panathur Post, Near Marathalli, Bangalore – 560103

CERTIFICATE
Certified that the project work entitled "PREDICTIVE SCHEDULING OF SORTING ALGORITHMS", carried out by Ranjit Kumar Sha (1NH13CS092), Sandip Shah (1NH13CS101), Saurabh Rai (1NH13CS104) and Gaurav Kumar (1NH13CS718), bonafide students of New Horizon College of Engineering, is in partial fulfillment for the award of Bachelor of Engineering in Computer Science and Engineering of the Visvesvaraya Technological University, Belagavi, during the year 2016-2017. It is certified that all corrections/suggestions indicated for Internal Assessment have been incorporated in the report deposited in the department library. The project report has been approved as it satisfies the academic requirements in respect of project work prescribed for the Degree.
Name & Signature of Guide (Ms. Sridevi); Signature of HOD (Dr. Prashanth C.S.R.); Signature of Principal (Dr. Manjunatha)
External Viva: Name of Examiner / Signature with date: 1. …
CS 61B Final Exam Guerrilla Section Spring 2018 May 5, 2018
CS 61B Final Exam Guerrilla Section, Spring 2018 (May 5, 2018)

Instructions
Form a small group. Start on the first problem. Check off with a helper or discuss your solution process with another group once everyone understands how to solve the first problem, and then repeat for the second problem. You may not move to the next problem until you check off or discuss with another group and everyone understands why the solution is what it is. You may use any course resources at your disposal: the purpose of this review session is to have everyone learning together as a group.

1 The Tortoise or the Hare
1.1 Given an undirected graph G = (V, E) and an edge e = (s, t) in G, describe an O(|V| + |E|) time algorithm to determine whether G has a cycle containing e.
Solution: Remove e from the graph. Then run DFS or BFS starting at s to find t. If a path still exists from s to t, then adding e back would create a cycle. (A short sketch of this check follows this excerpt.)
1.2 Given a connected, undirected, and weighted graph, describe an algorithm to construct a set with as few edges as possible such that if those edges were removed, there would be no cycles in the remaining graph. Additionally, choose edges such that the sum of the weights of the edges you remove is minimized. This algorithm must be as fast as possible.
Solution:
1. Negate all edge weights.
2. Form an MST via Kruskal's or Prim's algorithm.
3. Return the set of all edges not in the MST (undoing the negation). …
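A sketch of the solution to problem 1.1 above (delete e, then BFS from s and test whether t is still reachable); the adjacency-list representation and names are assumptions of this illustration, not part of the worksheet.

import java.util.*;

// Problem 1.1: does undirected graph G have a cycle containing edge e = (s, t)?
// Remove e, then BFS from s; if t is still reachable, that path plus e is a cycle.
// Runs in O(|V| + |E|) time.
public class CycleThroughEdge {
    public static boolean hasCycleThrough(List<List<Integer>> adj, int s, int t) {
        int n = adj.size();
        boolean[] seen = new boolean[n];
        Deque<Integer> queue = new ArrayDeque<>();
        seen[s] = true;
        queue.add(s);
        while (!queue.isEmpty()) {
            int u = queue.poll();
            for (int v : adj.get(u)) {
                if (u == s && v == t) continue;   // skip the edge e itself
                if (u == t && v == s) continue;   // ...in both directions
                if (!seen[v]) {
                    seen[v] = true;
                    queue.add(v);
                }
            }
        }
        return seen[t];                            // t reachable without e => cycle containing e
    }

    public static void main(String[] args) {
        // Triangle 0-1-2 plus a pendant vertex 3: edge (0,1) lies on a cycle, edge (2,3) does not.
        List<List<Integer>> adj = List.of(
                List.of(1, 2), List.of(0, 2), List.of(0, 1, 3), List.of(2));
        System.out.println(hasCycleThrough(adj, 0, 1)); // true
        System.out.println(hasCycleThrough(adj, 2, 3)); // false
    }
}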
arXiv:1105.2397v1 [cs.DS] 12 May 2011
Incremental Cycle Detection, Topological Ordering, and Strong Component Maintenance
Bernhard Haeupler, Massachusetts Institute of Technology
Telikepalli Kavitha, Tata Institute of Fundamental Research
Rogers Mathew, Indian Institute of Science
Siddhartha Sen, Princeton University
Robert E. Tarjan, Princeton University & HP Laboratories

We present two on-line algorithms for maintaining a topological order of a directed n-vertex acyclic graph as arcs are added, and detecting a cycle when one is created. Our first algorithm handles m arc additions in O(m^(3/2)) time. For sparse graphs (m/n = O(1)), this bound improves the best previous bound by a logarithmic factor, and is tight to within a constant factor among algorithms satisfying a natural locality property. Our second algorithm handles an arbitrary sequence of arc additions in O(n^(5/2)) time. For sufficiently dense graphs, this bound improves the best previous bound by a polynomial factor. Our bound may be far from tight: we show that the algorithm can take Ω(n^2 · 2^(√(2 lg n))) time by relating its performance to a generalization of the k-levels problem of combinatorial geometry. A completely different algorithm running in Θ(n^2 log n) time was given recently by Bender, Fineman, and Gilbert. We extend both of our algorithms to the maintenance of strong components, without affecting the asymptotic time bounds.

Categories and Subject Descriptors: F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems—Computations on discrete structures; G.2.2 [Discrete Mathematics]: Graph Theory—Graph algorithms; E.1 [Data]: Data Structures—Graphs and networks
General Terms: Algorithms, Theory
Additional Key Words and Phrases: Dynamic algorithms, directed graphs, topological order, cycle detection, strong components, halving intersection, arrangement
ACM Reference Format: Haeupler, B., Kavitha, T., Mathew, R., Sen, S., and Tarjan, R. …
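For contrast with the bounds quoted above, a naive baseline for the same problem looks as follows (my own illustration, emphatically not the paper's algorithm): on each arc insertion (u, v), run a DFS from v to test whether u is reachable, which detects a newly created cycle at O(n + m) cost per insertion.

import java.util.*;

// Naive incremental cycle detection: maintain the digraph as adjacency lists
// and, on each arc insertion (u, v), check with a DFS whether u is reachable
// from v. If so, the new arc would close a cycle, so it is rejected.
// This costs O(n + m) per insertion; the quoted algorithms improve on this baseline.
public class NaiveIncrementalCycleDetector {
    private final List<List<Integer>> adj;

    public NaiveIncrementalCycleDetector(int n) {
        adj = new ArrayList<>();
        for (int i = 0; i < n; i++) adj.add(new ArrayList<>());
    }

    // Tries to add arc (u, v); returns false (and rejects the arc) if it would create a cycle.
    public boolean addArc(int u, int v) {
        if (reaches(v, u)) return false;  // u reachable from v => (u, v) closes a cycle
        adj.get(u).add(v);
        return true;
    }

    private boolean reaches(int from, int target) {
        boolean[] seen = new boolean[adj.size()];
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(from);
        seen[from] = true;
        while (!stack.isEmpty()) {
            int x = stack.pop();
            if (x == target) return true;
            for (int y : adj.get(x))
                if (!seen[y]) { seen[y] = true; stack.push(y); }
        }
        return false;
    }

    public static void main(String[] args) {
        NaiveIncrementalCycleDetector d = new NaiveIncrementalCycleDetector(3);
        System.out.println(d.addArc(0, 1)); // true  (still acyclic)
        System.out.println(d.addArc(1, 2)); // true
        System.out.println(d.addArc(2, 0)); // false (2 -> 0 would close the cycle 0 -> 1 -> 2 -> 0)
    }
}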