641 Abstract Data Structures, 335, 336, 393 Abstractions, 158, 367 Access Attributes, 356, 429 Add Two Function, 216, 217 Algori

Total Page:16

File Type:pdf, Size:1020Kb

641 Abstract Data Structures, 335, 336, 393 Abstractions, 158, 367 Access Attributes, 356, 429 Add Two Function, 216, 217 Algori Index A input data, 96 input size, 95 Abstract data structures, 335, 336, 393 measurement uncertainty, 94, 95 Abstractions, 158, 367 plotting, 93, 95 Access attributes, 356, 429 range, 95 add_two function, 216, 217 sleep() function, 94, 95 Algorithmic efficiency theoretical, 94, 95 amortized complexity validation, 93 add elements, 77 variable-size data, 93 analysis, 78, 80 worst-case scenario, 96 append operation, 78 sequence of numbers, 73 banker’s method, 78, 79 types cheap operation, 78 average-case complexity, 75 data structures, 76 best-case complexity, 75 expensive operation, 77, 79 worst-case complexity, 75 individual operation, 79 Algorithms, 41 memory layout, 78 abstraction level, 7 operations, 79 checking, 42 potential method, 78–80 combining, 6 Python’s lists, 77 comparing, 66 rule, 77 comparison-based, 98, 103 space, 77 complexity, 44 statistical properties, 76 computational problem, 42, 43 analysis, 74 cryptographic protocols, 43 guessing game, 76 designing, 7, 43 if statement, 74 abstractions, 45 operations, 74 assertions, 51 running time bits, 56 best-case scenario, 96 cities, 45 data size, 96 computational problem, 44 empirical, 93, 94 computations, 54 641 © Thomas Mailund 2021 T. Mailund, Introduction to Computational Thinking, https://doi.org/10.1007/978-1-4842-7077-6 INDEX Algorithms (cont.) problems complexity, 43 data, 45 programmer time, 65 distances, 45 properties, 42 for loops, 54 sequential, 57–59 implementation, 45 stable/unstable, 102 index, 46 tasks, 7 invariants, 52, 53 techniques, 42 lists, 46 terminates, 41 loop condition, 56 Timsort, 105 map, 44, 46 undecidable, 43 model of system, 44 append() method, 394, 396, 397, 552 operators, 54 apply() function, 159, 188, 382 output, 46 Arithmetic operations, 336, 563, 571 pairwise distance, 51 Assertions, 51 post-conditions, 51, 52 Asymptotic running time, see Big-Oh preconditions, 51, 52 notation progress, 54 Attributes, 320, 344, 348, 358 reductionist approach (see Reductionist approach) roads, 45 B termination function, 54, 55 Bayesian statistics, 301 while loops, 54 Big-Oh notation efficiency, 43 add numbers, 82 halting problem, 42 arithmetic implementation, 6, 65 addition rule, 87 intractable, 43 binary search, 88 lists complexity, 89 changing numerical base, 60, 61 functions, 86 longest subsequence, 63 inner loop, 88 longest substring, 63 insertion sort, 88 merging, 64 interval, 89 powerset, 63 low/high range, 88 Sieve of Eratosthenes, 62 multiplication, 86, 87 measuring, 66 nonzero constant, 87 operations, 80 outer loop, 88 optimal, 44 quadratic time, 88 P vs. NP problem, 43 running time, 86, 88 primitive operations, 66 summation, 87 642 INDEX transitivity rules, 86 binary numbers, 596, 601 asymptotic behavior, 81 binomial trees, 595, 596, 598 big-Theta, 82 build tree, 597 binary search, 92 carry tree, 602 classes, 82, 83 constant time, 597 complexity classes constant-time operations, 605, 606 algorithm, 91 create nodes, 597 constant time, 89 data structure, 595 cubic time, 90 deletion, 602 exponential time, 90 delete_min(), 606 growth, 90, 91 designing, 599 linear time, 90 ephemeral, 535, 605 logarithmic time, 89 get_min(), 604 long-linear time, 90 heap class, 598 properties, 83, 84 implementation, 605 quadratic time, 90 insertions/deletions, 599, 604 function growth, 91 insert_tree(), 599 functions, 81 vs. leftist heap, 604 insertion sort, 92 linking, 597 longest substring, 93 merge(), 600–602 merging, 93 operations, 533, 603, 604 operations, 81 Python lists, 596 order class, 80 ranking, 596 reasoning, 84, 85 running time, 601 Sieve of Eratosthenes, 92 sibling/children attributes, 596 symmetric, 81 sorting, 605 upper bound, 81 value, 600 Binary heaps, 607, 610–612, 615, 622, 630 Breadth-first tree traversal Binary search algorithm, 98, 99, 247 definition, 581 bind1st()function, 342 implementation, 582 Binning, 475, 478 in-order and breadth-first, 582 Binomial coefficient, 302, 595 Built-in mutable data structures, 476 Binomial distribution, 303 Binomial heaps, 595, 629 add numbers, 601 C add tree, 597 __call__() magic method, 364 amortization analysis, 604 __call__() method, 338, 365 binary addition, 597 Call by reference, 148 643 INDEX Call by value, 148 implementation, 107 Classes, 173, 318, 319, 326 iteration, 106, 107 Class hierarchies, 369 not stable, 108 Class variables, 339, 341, 343 running time, 108 compare() function, 182, 184 strategy, 105 Comparison sorts swapping, 106 bubble sort Complexity theory, 43 asymptotic complexity, 114 Computational thinking, 2, 4, 8 cocktail shaker algorithm, 115 feed data, 10 comparisons, 114, 115, 117 language choice, 10 for loop, 112 natural sciences, 8 forms, 115, 116 problem solving, 9 inner loop, 113, 114 Computer vs. insertion sort, 111, 114 Caches, 67 invariants, 112, 113 computations, 66 positions, 115 core, 66 swapping, 111, 113, 115, 118 memory hierarchy, 67 termination, 113 registers, 67 variables, 112 running time, 67 variants, 115 Concrete data structures, 335 while loop, 112 Conditional independence, 295, 296, 306 worst-case complexity, 115 Conditional probabilities, 294, 296–298 insertion sort Constant-time operations, 402, 452, 474, asymptotic complexity, 114 481, 550 best-case, 110 Constructor, 320, 604 contiguous regions, 109 contains() function, 447, 450, 463 for loop, 109 counter() function, 186, 196 implementation, 110 inner loop, 109 invariant guarantees, 108, 109 D iteration, 108 Data objects, 98, 360 reverse order, 116 Data structures, 637 running time, 110 Decoding problem, 313 swapping, 109, 110 Deep learning neural networks, 639 worst-case, 110 def function, 202 selection sort Default methods, 360 for loop, 107 Default parameters, 162, 164, 165 guarantees, 106 Designing algorithms, 5, 11, 43, 44, 47, 53 644 INDEX Distribution sorting algorithms, 119 first element, 412–414, 419 Divide and conquer last element, 414, 415, 419 definition, 247 __repr__(), 410 floating-point numbers, 284–287 set_at_index(), 411, 412, 421 merge sort, 248 special case, 413, 414 quick sort, 249 Dynamic programming recursive, 247 edit distance Domain-specific languages (DSLs), 493 backtracking, 279, 280 Double underscore methods, 336 implementation, 278 Doubly linked lists memoization, 276, 277 adding dummy recursion, 275, 276 create list, 416 fibonacci number, 271, 272 dummy link, 417 linear algorithm, 268, 269 first/last, 416 memoization, 273, 274 get_at_index(), 417 partitioning new link, 418 definition, 281, 282 __repr__() method, 416, 417 memoization, 283, 284 set_at_index(), 417 recursion, 282, 283 add reference, 410 recursion, 268 append(), 410, 418 Dynamic programming algorithm, 273, circular, 424 274, 307, 313 constant-time operation, 422 create dummy, 410 dummy points, 410 E dummy.next.next, 412 Emission probabilities, 306 elegant solutions, 420 enqueue() operations, 551 extend(), 415, 416, 420 Ephemeral data structure, 461 general case, 412 Error handling, 171, 175 get_at_index(), 411, 419, 421 exception, 176 insert_after(), 418, 420 low-level, 176 insert_sequence_at(), 422 evaluate(), 373, 375 list implementation, 422–424 Exception() expression, 173 ListSequence class, 421 Exceptions, 170, 172, 173, 326 new link, 411 *expressions argument, 562 prepend(), 422 Expression stacks and stack machines previous pointer, 415, 416 call operator, 568 Python lists, 421 CPython, 561 removal EmptyStack exception, 563 645 INDEX Expression stacks and stack issue, 424 machines (cont.) objects, 426 inner-inner function, 567 open() function, 425 LOAD_CONST operation, 569 previous attribute, 428 operator module, 565 property class, 428 operator, 561 Python, 425, 427 Python features, 568 references RAM model, 560 adding, 426 registers, 560 circular, 427 reverse Polish notation, 562, 564 count, 426, 429 extend method, 230, 402 previous/next, 427 weak, 427 resources, 424, 425 F triggering, 427 fib() function, 233, 239 with syntax, 425 First-class objects, 158 Garbage collector, 425 First-in first-out (FIFO), 551 Generators Floating-point numbers, 293 animals(), 438 Floyd-Warshall’s algorithm, 50, 51, 53 bugs, 441 format() method, 378 counter, 439 Forward algorithm, 307–310 definition, 438 rescale, 310 for loop, 441, 442 scaling factor, 311 infinite sequences, 440 Forward recursion, 311 instance, 438 Functional programming, 2, 195, 231 __iter__(), 442 Function vs. function instance, 160–162 iterator, 438 Function instances, 160, 161, 166 linked list, 442 Function parameters vs. local variables, 146 lists, 441 modification, 440 modifiers, 441 G output, 438 Garbage collection square_counts, 441 context manager, 425 syntax, 440 __del__() method, 427 take(), 441 doubly linked lists, 426 testing, 438 Garbage collectors, 426, 429 yield, 438, 439 gc module, 426 yield from command, 439 gc.collect(), 427 __getattribute__() method, 350, 352, 353 646 INDEX get_node() function, 466 implementation, 588, 589 Global scope, 143, 146, 147, 178 insertion, 586 Global variable, 145, 147–149, 340 is_empty(), 586 merge(), 586, 587 node’s value, 586 H Hidden Markov models (HMMs), 304 Halting problem, 42, 43 dependency graph, 305 __hash__() method, 477 emission probabilities, 306 Hash dictionary, 485 observable states, 305 Hash function, 474, 475, 478 probabilities, 289–292 Hash table data structure, 474 Hierarchies analysis, 481 class, 375 collision, 479, 480 conceptual, 372, 374, 375 hash function, 475, 478 design, 371 resize, 481, 482, 484 expression, 375 Heaps Higher-level abstractions, 8 binary heaps Higher-order functions, 156–158 addition, 609, 610 High-level languages, 9 balancing, 607 Huffman encoding delete_max() operation,
Recommended publications
  • Vertex Ordering Problems in Directed Graph Streams∗
    Vertex Ordering Problems in Directed Graph Streams∗ Amit Chakrabartiy Prantar Ghoshz Andrew McGregorx Sofya Vorotnikova{ Abstract constant-pass algorithms for directed reachability [12]. We consider directed graph algorithms in a streaming setting, This is rather unfortunate given that many of the mas- focusing on problems concerning orderings of the vertices. sive graphs often mentioned in the context of motivating This includes such fundamental problems as topological work on graph streaming are directed, e.g., hyperlinks, sorting and acyclicity testing. We also study the related citations, and Twitter \follows" all correspond to di- problems of finding a minimum feedback arc set (edges rected edges. whose removal yields an acyclic graph), and finding a sink In this paper we consider the complexity of a variety vertex. We are interested in both adversarially-ordered and of fundamental problems related to vertex ordering in randomly-ordered streams. For arbitrary input graphs with directed graphs. For example, one basic problem that 1 edges ordered adversarially, we show that most of these motivated much of this work is as follows: given a problems have high space complexity, precluding sublinear- stream consisting of edges of an acyclic graph in an space solutions. Some lower bounds also apply when the arbitrary order, how much memory is required to return stream is randomly ordered: e.g., in our most technical result a topological ordering of the graph? In the offline we show that testing acyclicity in the p-pass random-order setting, this can be computed in O(m + n) time using model requires roughly n1+1=p space.
    [Show full text]
  • Algorithms (Decrease-And-Conquer)
    Algorithms (Decrease-and-Conquer) Pramod Ganapathi Department of Computer Science State University of New York at Stony Brook August 22, 2021 1 Contents Concepts 1. Decrease by constant Topological sorting 2. Decrease by constant factor Lighter ball Josephus problem 3. Variable size decrease Selection problem ............................................................... Problems Stooge sort 2 Contributors Ayush Sharma 3 Decrease-and-conquer Problem(n) [Step 1. Decrease] Subproblem(n0) [Step 2. Conquer] Subsolution [Step 3. Combine] Solution 4 Types of decrease-and-conquer Decrease by constant. n0 = n − c for some constant c Decrease by constant factor. n0 = n/c for some constant c Variable size decrease. n0 = n − c for some variable c 5 Decrease by constant Size of instance is reduced by the same constant in each iteration of the algorithm Decrease by 1 is common Examples: Array sum Array search Find maximum/minimum element Integer product Exponentiation Topological sorting 6 Decrease by constant factor Size of instance is reduced by the same constant factor in each iteration of the algorithm Decrease by factor 2 is common Examples: Binary search Search/insert/delete in balanced search tree Fake coin problem Josephus problem 7 Variable size decrease Size of instance is reduced by a variable in each iteration of the algorithm Examples: Selection problem Quicksort Search/insert/delete in binary search tree Interpolation search 8 Decrease by Constant HOME 9 Topological sorting Problem Topological sorting of vertices of a directed acyclic graph is an ordering of the vertices v1, v2, . , vn in such a way that there is an edge directed towards vertex vj from vertex vi, then vi comes before vj.
    [Show full text]
  • Topological Sort (An Application of DFS)
    Topological Sort (an application of DFS) CSC263 Tutorial 9 Topological sort • We have a set of tasks and a set of dependencies (precedence constraints) of form “task A must be done before task B” • Topological sort : An ordering of the tasks that conforms with the given dependencies • Goal : Find a topological sort of the tasks or decide that there is no such ordering Examples • Scheduling : When scheduling task graphs in distributed systems, usually we first need to sort the tasks topologically ...and then assign them to resources (the most efficient scheduling is an NP-complete problem) • Or during compilation to order modules/libraries d c a g f b e Examples • Resolving dependencies : apt-get uses topological sorting to obtain the admissible sequence in which a set of Debian packages can be installed/removed Topological sort more formally • Suppose that in a directed graph G = (V, E) vertices V represent tasks, and each edge ( u, v)∊E means that task u must be done before task v • What is an ordering of vertices 1, ..., | V| such that for every edge ( u, v), u appears before v in the ordering? • Such an ordering is called a topological sort of G • Note: there can be multiple topological sorts of G Topological sort more formally • Is it possible to execute all the tasks in G in an order that respects all the precedence requirements given by the graph edges? • The answer is " yes " if and only if the directed graph G has no cycle ! (otherwise we have a deadlock ) • Such a G is called a Directed Acyclic Graph, or just a DAG Algorithm for
    [Show full text]
  • 6.006 Course Notes
    6.006 Course Notes Wanlin Li Fall 2018 1 September 6 • Computation • Definition. Problem: matching inputs to correct outputs • Definition. Algorithm: procedure/function matching each input to single out- put, correct if matches problem • Want program to be able to handle infinite space of inputs, arbitrarily large size of input • Want finite and efficient program, need recursive code • Efficiency determined by asymptotic growth Θ() • O(f(n)) is all functions below order of growth of f(n) • Ω(f(n)) is all functions above order of growth of f(n) • Θ(f(n)) is tight bound • Definition. Log-linear: Θ(n log n) Θ(nc) • Definition. Exponential: b • Algorithm generally considered efficient if runs in polynomial time • Model of computation: computer has memory (RAM - random access memory) and CPU for performing computation; what operations are allowed in constant time • CPU has registers to access and process memory address, needs log n bits to handle size n input • Memory chunked into “words” of size at least log n 1 6.006 Wanlin Li 32 10 • E.g. 32 bit machine can handle 2 ∼ 4 GB, 64 bit machine can handle 10 GB • Model of computaiton in word-RAM • Ways to solve algorithms problem: design new algorithm or reduce to already known algorithm (particularly search in data structure, sort, shortest path) • Classification of algorithms based on graph on function calls • Class examples and design paradigms: brute force (completely disconnected), decrease and conquer (path), divide and conquer (tree with branches), dynamic programming (directed acyclic graph), greedy (choose which paths on directed acyclic graph to take) • Computation doesn’t have to be a tree, just has to be directed acyclic graph 2 September 11 • Example.
    [Show full text]
  • Verified Textbook Algorithms
    Verified Textbook Algorithms A Biased Survey Tobias Nipkow[0000−0003−0730−515X] and Manuel Eberl[0000−0002−4263−6571] and Maximilian P. L. Haslbeck[0000−0003−4306−869X] Technische Universität München Abstract. This article surveys the state of the art of verifying standard textbook algorithms. We focus largely on the classic text by Cormen et al. Both correctness and running time complexity are considered. 1 Introduction Correctness proofs of algorithms are one of the main motivations for computer- based theorem proving. This survey focuses on the verification (which for us always means machine-checked) of textbook algorithms. Their often tricky na- ture means that for the most part they are verified with interactive theorem provers (ITPs). We explicitly cover running time analyses of algorithms, but for reasons of space only those analyses employing ITPs. The rich area of automatic resource analysis (e.g. the work by Jan Hoffmann et al. [103,104]) is out of scope. The following theorem provers appear in our survey and are listed here in alphabetic order: ACL2 [111], Agda [31], Coq [25], HOL4 [181], Isabelle/HOL [151,150], KeY [7], KIV [63], Minlog [21], Mizar [17], Nqthm [34], PVS [157], Why3 [75] (which is primarily automatic). We always indicate which ITP was used in a particular verification, unless it was Isabelle/HOL (which we abbreviate to Isabelle from now on), which remains implicit. Some references to Isabelle formalizations lead into the Archive of Formal Proofs (AFP) [1], an online library of Isabelle proofs. There are a number of algorithm verification frameworks built on top of individual theorem provers.
    [Show full text]
  • Visvesvaraya Technological University a Project Report
    ` VISVESVARAYA TECHNOLOGICAL UNIVERSITY “Jnana Sangama”, Belagavi – 590 018 A PROJECT REPORT ON “PREDICTIVE SCHEDULING OF SORTING ALGORITHMS” Submitted in partial fulfillment for the award of the degree of BACHELOR OF ENGINEERING IN COMPUTER SCIENCE AND ENGINEERING BY RANJIT KUMAR SHA (1NH13CS092) SANDIP SHAH (1NH13CS101) SAURABH RAI (1NH13CS104) GAURAV KUMAR (1NH13CS718) Under the guidance of Ms. Sridevi (Senior Assistant Professor, Dept. of CSE, NHCE) DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING NEW HORIZON COLLEGE OF ENGINEERING (ISO-9001:2000 certified, Accredited by NAAC ‘A’, Permanently affiliated to VTU) Outer Ring Road, Panathur Post, Near Marathalli, Bangalore – 560103 ` NEW HORIZON COLLEGE OF ENGINEERING (ISO-9001:2000 certified, Accredited by NAAC ‘A’ Permanently affiliated to VTU) Outer Ring Road, Panathur Post, Near Marathalli, Bangalore-560 103 DEPARTMENT OF COMPUTER SCIENCE AND ENGINEERING CERTIFICATE Certified that the project work entitled “PREDICTIVE SCHEDULING OF SORTING ALGORITHMS” carried out by RANJIT KUMAR SHA (1NH13CS092), SANDIP SHAH (1NH13CS101), SAURABH RAI (1NH13CS104) and GAURAV KUMAR (1NH13CS718) bonafide students of NEW HORIZON COLLEGE OF ENGINEERING in partial fulfillment for the award of Bachelor of Engineering in Computer Science and Engineering of the Visvesvaraya Technological University, Belagavi during the year 2016-2017. It is certified that all corrections/suggestions indicated for Internal Assessment have been incorporated in the report deposited in the department library. The project report has been approved as it satisfies the academic requirements in respect of Project work prescribed for the Degree. Name & Signature of Guide Name Signature of HOD Signature of Principal (Ms. Sridevi) (Dr. Prashanth C.S.R.) (Dr. Manjunatha) External Viva Name of Examiner Signature with date 1.
    [Show full text]
  • CS 61B Final Exam Guerrilla Section Spring 2018 May 5, 2018
    CS 61B Final Exam Guerrilla Section Spring 2018 May 5, 2018 Instructions Form a small group. Start on the first problem. Check off with a helper or discuss your solution process with another group once everyone understands how to solve the first problem and then repeat for the second problem . You may not move to the next problem until you check off or discuss with another group and everyone understands why the solution is what it is. You may use any course resources at your disposal: the purpose of this review session is to have everyone learning together as a group. 1 The Tortoise or the Hare 1.1 Given an undirected graph G = (V; E) and an edge e = (s; t) in G, describe an O(jV j + jEj) time algorithm to determine whether G has a cycle containing e. Remove e from the graph. Then, run DFS or BFS starting at s to find t. If a path already exists from s to t, then adding e would create a cycle. 1.2 Given a connected, undirected, and weighted graph, describe an algorithm to con- struct a set with as few edges as possible such that if those edges were removed, there would be no cycles in the remaining graph. Additionally, choose edges such that the sum of the weights of the edges you remove is minimized. This algorithm must be as fast as possible. 1. Negate all edges. 2. Form an MST via Kruskal's or Prim's algorithm. 3. Return the set of all edges not in the MST (undo negation).
    [Show full text]
  • Arxiv:1105.2397V1 [Cs.DS] 12 May 2011 [email protected]
    A Incremental Cycle Detection, Topological Ordering, and Strong Component Maintenance BERNHARD HAEUPLER, Massachusetts Institute of Technology TELIKEPALLI KAVITHA, Tata Institute of Fundamental Research ROGERS MATHEW, Indian Institute of Science SIDDHARTHA SEN, Princeton University ROBERT E. TARJAN, Princeton University & HP Laboratories We present two on-line algorithms for maintaining a topological order of a directed n-vertex acyclic graph as arcs are added, and detecting a cycle when one is created. Our first algorithm handles m arc additions in O(m3=2) time. For sparse graphs (m=n = O(1)), this bound improves the best previous bound by a logarithmic factor, and is tight to within a constant factor among algorithms satisfying a natural locality property. Our second algorithm handles an arbitrary sequence of arc additions in O(n5=2) time. For sufficiently dense graphs, this bound improves the best previous boundp by a polynomial factor. Our bound may be far from tight: we show that the algorithm can take Ω(n22 2 lg n) time by relating its performance to a generalization of the k-levels problem of combinatorial geometry. A completely different algorithm running in Θ(n2 log n) time was given recently by Bender, Fineman, and Gilbert. We extend both of our algorithms to the maintenance of strong components, without affecting the asymptotic time bounds. Categories and Subject Descriptors: F.2.2 [Analysis of Algorithms and Problem Complexity]: Nonnumerical Algorithms and Problems—Computations on discrete structures; G.2.2 [Discrete Mathematics]: Graph Theory—Graph algorithms; E.1 [Data]: Data Structures—Graphs and networks General Terms: Algorithms, Theory Additional Key Words and Phrases: Dynamic algorithms, directed graphs, topological order, cycle detection, strong compo- nents, halving intersection, arrangement ACM Reference Format: Haeupler, B., Kavitha, T., Mathew, R., Sen, S., and Tarjan, R.
    [Show full text]
  • A Low Complexity Topological Sorting Algorithm for Directed Acyclic Graph
    International Journal of Machine Learning and Computing, Vol. 4, No. 2, April 2014 A Low Complexity Topological Sorting Algorithm for Directed Acyclic Graph Renkun Liu then return error (graph has at Abstract—In a Directed Acyclic Graph (DAG), vertex A and least one cycle) vertex B are connected by a directed edge AB which shows that else return L (a topologically A comes before B in the ordering. In this way, we can find a sorted order) sorting algorithm totally different from Kahn or DFS algorithms, the directed edge already tells us the order of the Because we need to check every vertex and every edge for nodes, so that it can simply be sorted by re-ordering the nodes the “start nodes”, then sorting will check everything over according to the edges from a Direction Matrix. No vertex is again, so the complexity is O(E+V). specifically chosen, which makes the complexity to be O*E. An alternative algorithm for topological sorting is based on Then we can get an algorithm that has much lower complexity than Kahn and DFS. At last, the implement of the algorithm by Depth-First Search (DFS) [2]. For this algorithm, edges point matlab script will be shown in the appendix part. in the opposite direction as the previous algorithm. The algorithm loops through each node of the graph, in an Index Terms—DAG, algorithm, complexity, matlab. arbitrary order, initiating a depth-first search that terminates when it hits any node that has already been visited since the beginning of the topological sort: I.
    [Show full text]
  • Introduction to Graphs ・V = Nodes
    Undirected graphs Notation. G = (V, E) Introduction to Graphs ・V = nodes. ・E = edges between pairs of nodes. ・Captures pairwise relationship between objects. Tyler Moore ・Graph size parameters: n = | V |, m = | E |. CS 2123, The University of Tulsa V = { 1, 2, 3, 4, 5, 6, 7, 8 } Some slides created by or adapted from Dr. Kevin Wayne. For more information see E = { 1-2, 1-3, 2-3, 2-4, 2-5, 3-5, 3-7, 3-8, 4-5, 5-6, 7-8 } http://www.cs.princeton.edu/~wayne/kleinberg-tardos m = 11, n = 8 3 2 / 36 One week of Enron emails The evolution of FCC lobbying coalitions 4 “The Evolution of FCC Lobbying Coalitions” by Pierre de Vries in JoSS Visualization Symposium 2010 5 3 / 36 4 / 36 The Spread of Obesity in a Large Social Network Over 32 Years educational level; the ego’s obesity status at the ing both their weights. We estimated these mod- previous time point (t); and most pertinent, the els in varied ego–alter pair types. alter’s obesity status at times t and t + 1.25 We To evaluate the possibility that omitted vari- used generalized estimating equations to account ables or unobserved events might explain the as- for multiple observations of the same ego across sociations, we examined how the type or direc- examinations and across ego–alter pairs.26 We tion of the social relationship between the ego assumed an independent working correlation and the alter affected the association between the structure for the clusters.26,27 ego’s obesity and the alter’s obesity.
    [Show full text]
  • CS 270 Algorithms
    CS 270 Algorithms Oliver Week 11 Kullmann Introduction Algorithms and their Revision analysis Graphs Data 1 Introduction structures Conclusion 2 Algorithms and their analysis 3 Graphs 4 Data structures 5 Conclusion CS 270 General remarks Algorithms Oliver Kullmann You need to be able to perform the following: Introduction Algorithms and their Reproducing all definitions. analysis Performing all computations on (arbitrary) examples. Graphs Data Explaining all algorithms in (written) words (possibly using structures some mathematical formalism). Conclusion Stating the time complexity in term of Θ. Check all this by writing it all down! Actually understanding the definitions and algorithms helps a lot (but sometimes “understanding” can become a trap). CS 270 How to prepare Algorithms Oliver Kullmann All information is on the course homepage Introduction Algorithms and their analysis Graphs http://cs.swan.ac.uk/~csoliver/Algorithms201314/index.html Data structures Conclusion Go through all lectures (perhaps you download the slides again — certain additions have been made). Go through the courseworks and their solutions. Go through the lab sessions (run it!). Use the learning goals! CS 270 On the exam Algorithms Oliver Kullmann Introduction Algorithms and their analysis We use the “two-out-of-three” format. Graphs Data Give us a chance to give you marks — write down structures something! Conclusion Actually, better write down a lot — in the second or third round, after answering all questions. Examples are typically helpful. CS 270 The general structure of the module Algorithms Oliver Kullmann Introduction Algorithms and their analysis Graphs Data 1 Weeks 1-3: Algorithms and their analysis structures 2 Weeks 4-5: Graphs — BFS and DFS.
    [Show full text]
  • Topologically Sorting a Directed Acyclic Graph (CLRS 22.4)
    Topologically Sorting a Directed Acyclic Graph (CLRS 22.4) 1 The problem • A topological sorting of a directed acyclic graph G = (V; E) is a linear ordering of vertices V such that (u; v) 2 E ) u appear before v in ordering. Intuitively, we think of an edge (a; b) as meaning that a has to come before b|thus an edge defines a precedence relation. A topological order is an order of the vertices that satisfies all the edges. • Example: Dressing (arrow implies \must come before") Socks Watch Underwear Shoes Shirt Pants Tie Belt Jacket We want to compute order in which to get dressed. One possibility: Socks Underwear Pants Shoes WatchShirt Belt Tie Jacket 1 2 3 4 5 6 7 8 9 The given order is one possible topological order. 1 2 Does a tpopological order always exist? • If the graph has a cycle, a topological order cannot exist. Imagine the simplest cycle, consist- ing of two edges: (a; b) and (b; a). A topological ordering , if it existed, would have to satisfy that a must come before b and b must come before a. This is not possible. • Now consider we have a graph without cycles; this is usually referred to as a DAG (directed acyclic graph). Does any DAG have a topological order? The answer is YES. We'll prove this below indirectly by showing that the toposort algorithm always gives a valid ordering when run on any DAG. 3 Topological sort via DFS It turns out that we can get topological order based on DFS.
    [Show full text]