Graphs: Structures and Algorithms Vocabulary


Graphs: Structures and Algorithms

- How do packets of bits/information get routed on the internet?
  - Message is divided into packets on the client (your) machine
  - Packets are sent out using routing tables toward the destination
    - Packets may take different routes to the destination
    - What happens if packets are lost or arrive out of order?
  - Routing tables store local information, not global (why?)
- What about The Oracle of Bacon, Six Degrees of Separation, Erdos Numbers, and Word Ladders?
  - All can be modeled using graphs
  - What kind of connectivity does each concept model?
- Graphs are everywhere in the world of algorithms (world?)

Vocabulary

- Graphs are collections of vertices and edges
  - A vertex is sometimes called a node
  - An edge connects two vertices
    - Direction is sometimes important, other times not
    - Sometimes an edge has a weight/cost associated with it
- A sequence of vertices v0, v1, v2, ..., vn-1 is a path where vk and vk+1 are connected by an edge
  - If some vertex is repeated, the path is a cycle
  - Trees are cycle-free (acyclic) graphs with a root
  - A graph is connected if there is a path between any pair of vertices
    - Non-connected graphs have connected components

Graph Traversals

[Figure: a directed graph on vertices 1-7 used as the running example]

- Connected?
  - Why?
  - Indegrees? Outdegrees?
- Starting at 7, where can we get?
  - Depth-first search: envision each vertex as a room, with doors leading out
    - Go into a room, choose a door, mark the door, and go out
    - Don't go into a room you've already been in
    - Backtrack when all doors are marked, and open the next unopened door
  - Rooms are stacked up; backtracking is really recursion
  - One alternative uses a queue: breadth-first search

Pseudo-code for depth-first search

    void depthfirst(const string& vertex)
    // post: depth-first search from vertex complete
    {
        if (! alreadySeen(vertex))
        {
            markAsSeen(vertex);
            cout << vertex << endl;
            for(each v adjacent to vertex)
            {
                depthfirst(v);
            }
        }
    }

- Clones are stacked up, problem? When are all doors out of a vertex opened and visited? Can we make use of the stack explicit?

Graph implementations

- Typical operations on a graph:
  - Add vertex
  - Add edge (parameters?)
  - AdjacentVerts(vertex)
  - AllVerts(..)
  - String->int (and vice versa)
- Different kinds of graphs
  - Lots of vertices, few edges: sparse graph
    - Use an adjacency list
  - Lots of edges (max # ?): dense graph
    - Use an adjacency matrix

Graph implementations (continued)

[Figure: an adjacency matrix of true/false entries and the corresponding adjacency list]

- Adjacency matrix
  - Every possible edge is represented, how many?
- Adjacency list uses O(V+E) space
  - What about the matrix?
  - Which is better?
- What do we do to get the adjacent vertices for a given vertex?
  - What is the complexity?
  - Compared to the adjacency list?
- What about weighted edges?
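The depth-first pseudocode and the adjacency-list discussion above can be combined into a small runnable example. This is a minimal sketch, not the course's actual code: it assumes the graph is stored as a map from each vertex name to a vector of adjacent vertex names, and a set plays the role of alreadySeen/markAsSeen.

    // Hypothetical runnable version of the depth-first pseudocode above.
    #include <iostream>
    #include <map>
    #include <set>
    #include <string>
    #include <vector>
    using namespace std;

    typedef map<string, vector<string>> Graph;   // vertex -> adjacent vertices

    void depthFirst(const Graph& g, const string& vertex, set<string>& seen)
    // post: every vertex reachable from vertex has been visited and printed once
    {
        if (seen.count(vertex) > 0) return;       // room already visited
        seen.insert(vertex);                      // mark this room as visited
        cout << vertex << endl;
        auto it = g.find(vertex);
        if (it == g.end()) return;                // vertex has no outgoing edges
        for (const string& v : it->second)        // try every door out of the room
            depthFirst(g, v, seen);
    }

    int main()
    {
        Graph g;
        g["A"] = {"B", "C"};
        g["B"] = {"D"};
        g["C"] = {"D"};
        g["D"] = {};
        set<string> seen;
        depthFirst(g, "A", seen);                 // visits A, B, D, C in that order
    }

The recursion here is exactly the "rooms are stacked up" idea: the call stack remembers which doors remain untried, so making the stack explicit (an iterative DFS) is a mechanical transformation.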
Other graph questions/operations

- What vertices are reachable from a given vertex?
  - Can depth-first search help here?
- What vertex has the highest in-degree (out-degree)?
  - How can we use a map to answer this question?
- Shortest path between any two vertices
  - Dijkstra's algorithm will offer an alternative; it uses a priority queue too!
- Longest path in a graph
  - No known efficient algorithm

Breadth first search

- In an unweighted graph this finds the shortest path between a start vertex and every vertex
  - Visit every node one away from the start
  - Visit every node two away from the start
    - This is every node one away from a node one away
  - Visit every node three away from the start
  - Breadth first search is storage expensive
- Like depth first search, but use a queue instead of a stack
  - What features of a queue ensure the shortest path?
  - A stack can be simulated with recursion; advantages?
  - How many vertices are on the stack/queue?

Pseudocode for breadth first

    void breadthfirst(const string& vertex)
    // post: breadth-first search from vertex complete
    {
        tqueue<string> q;
        q.enqueue(vertex);
        while (q.size() > 0)
        {
            q.dequeue(current);
            for(each v adjacent to current)
            {
                if (distance[v] == INFINITY) // not seen
                {
                    distance[v] = distance[current] + 1;
                    q.enqueue(v);
                }
            }
        }
    }

What about word ladders

- Find a path from white->house changing one letter at a time
  - Real world? Computer vs. human?
    - white write writs waits warts parts ports forts forte ...
    - ... rouse house
  - See the ladder.cpp program
- How is this a graph problem? What are the vertices/edges?
- What about spell-checking, how is it similar?
  - Edge from accomodate to accommodate
  - Can also use tries with wild-cards, e.g., acc*date

What about connected components?

- What computers are reachable from this one? What people are reachable from me via acquaintanceship?
  - Start at some vertex, depth-first search (why not breadth?)
    - Mark nodes visited
  - Repeat, starting from an unvisited vertex (until all are visited)
- What is the minimal size of a component? Maximal size?
  - What is the complexity of the algorithm in terms of V and E?
- What algorithms does this lead to in graphs?

Shortest path in weighted graph

[Figure: a weighted graph on vertices A-F used as the example]

- We need to modify the approach slightly for a weighted graph
  - Edges have weights; breadth first by itself doesn't work
  - What's the shortest path from A to F in the graph?
- Use the same idea as breadth first search
  - Don't add 1 to the current distance, add ???
  - Might adjust distances more than once
  - What vertex do we visit next?
- Which vertex is next is key
  - Use a greedy algorithm: closest
  - Huffman is greedy, ...

Greedy Algorithms

- A greedy algorithm makes a locally optimal decision that leads to a globally optimal solution
  - Huffman: choose the two nodes with minimal weight, combine
    - Leads to optimal coding, an optimal Huffman tree
  - Making change with American coins: choose the largest coin possible as many times as possible
    - Change for $0.63, change for $0.32
    - What if we're out of nickels, change for $0.32?
- Greedy doesn't always work, but it does sometimes
- The weighted shortest path algorithm is Dijkstra's algorithm: greedy, and it uses a priority queue

Edsger Dijkstra

- Turing Award, 1972
- Operating systems and concurrency
- Algol-60 programming language
- Goto considered harmful
- Shortest path algorithm
- Structured programming
  - "Program testing can show the presence of bugs, but never their absence"
- A Discipline of Programming
  - "For the absence of a bibliography I offer neither explanation nor apology"
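Before moving from breadth-first search to Dijkstra's weighted version, here is a minimal runnable sketch of the breadth-first pseudocode shown above. It assumes the same hypothetical map-of-vectors adjacency list as the depth-first sketch, replaces the slide's tqueue with std::queue, and treats "missing from the distance map" as INFINITY; it is an illustration, not the course's ladder.cpp.

    // Hedged sketch: BFS distances (number of edges) from a start vertex.
    #include <iostream>
    #include <map>
    #include <queue>
    #include <string>
    #include <vector>
    using namespace std;

    typedef map<string, vector<string>> Graph;

    map<string, int> breadthFirst(const Graph& g, const string& start)
    // post: returns shortest (fewest-edge) distance to every vertex reachable from start
    {
        map<string, int> distance;
        queue<string> q;
        distance[start] = 0;
        q.push(start);
        while (!q.empty())
        {
            string current = q.front();
            q.pop();
            auto it = g.find(current);
            if (it == g.end()) continue;              // no outgoing edges
            for (const string& v : it->second)
                if (distance.count(v) == 0)           // not seen: first path found is shortest
                {
                    distance[v] = distance[current] + 1;
                    q.push(v);
                }
        }
        return distance;
    }

    int main()
    {
        // Tiny word-ladder-style graph: edges connect words differing by one letter.
        Graph g;
        g["white"] = {"write"};
        g["write"] = {"writs", "white"};
        g["writs"] = {"waits", "write"};
        map<string, int> d = breadthFirst(g, "white");
        cout << "writs is " << d["writs"] << " edges from white" << endl;   // prints 2
    }

The queue is what guarantees shortness: vertices are dequeued in non-decreasing distance order, so each vertex's distance is final the first time it is assigned.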
Dijkstra's Shortest Path Algorithm

- Similar to breadth first search, but uses a priority queue instead of a queue. The code below is for breadth first search:

    q.dequeue(vertex w)
    foreach (vertex v adjacent to w)
        if (distance[v] == INT_MAX) // not visited
        {
            distance[v] = distance[w] + 1;
            q.enqueue(v);
        }

- Dijkstra: find the minimal unvisited node and recalculate costs through that node:

    q.deletemin(vertex w)
    foreach (vertex v adjacent to w)
        if (distance[w] + weight(w,v) < distance[v])
        {
            distance[v] = distance[w] + weight(w,v);
            q.enqueue(vertex(v, distance[v]));
        }

Shortest paths, more details

[Figure: a weighted graph on vertices S, A, B, C, E, with a table of distances from S updated after processing B and then C]

- Single-source shortest path
  - Start at some vertex S
  - Find the shortest path to every vertex reachable from S
- A set of vertices is processed
  - Initially just S is processed
  - Each pass processes a vertex
- After each pass, the shortest path from S to any vertex using only vertices from the processed set (except for the last vertex) is always known
- The next processed vertex is the one closest to S still needing processing

Dijkstra's algorithm works (greedily)

- Choosing the minimal unseen vertex to process leads to shortest paths

    q.deletemin(vertex w)
    foreach (vertex v adjacent to w)
        if (distance[w] + weight(w,v) < distance[v])
        {
            distance[v] = distance[w] + weight(w,v);
            q.enqueue(vertex(v, distance[v]));
        }

- We always know the shortest path through processed vertices
  - When we choose w, there can't be a shorter path to w than distance[w]; such a path would go through some processed vertex u, and then we would have chosen u instead of w

Topological sort

[Figure: a getting-dressed DAG with vertices 0-6 labeled underwear, shirt, socks, pants, shoes, jacket, belt]

- Given a directed acyclic graph (DAG)
  - Order the vertices so that if there is an edge (v,w), then v appears before w in the order
- Prerequisites for a major: take CPS 100 before CPS 130
  - Edge(cps100, cps130)
  - Topological sort gives an ordering for taking courses
- Where does the ordering start?
  - The first vertex has no prerequisites
  - "Remove" this vertex, continue
  - Depends on in-degree

Pseudocode for topological sort

    void topologicalsort(Graph G)
    {
        ArbitrarySet fringe;
        for (v = 0; v < G.vertexSize(); v++)
            if ((count[v] = G.inDegree(v)) == 0)
                fringe.add(v);
        // topological sort traversal
        while (!fringe.isEmpty())
        {
            Vertex v1 = fringe.removeAny();
            result.pushback(v1);
            foreach edge (v1,w)
            {
                count[w]--;
                if (count[w] == 0)
                    fringe.add(w);
            }
        }
    }

Minimum Spanning Trees

- Minimum weight spanning tree (MST) properties
  - A subgraph of a given undirected graph with edge weights
  - Contains all vertices of the given graph
  - Minimizes the sum of its edge weights
- How do we find the MST of a graph?
- Key insight:
  - If the vertices of a connected graph G are divided into two disjoint non-empty sets G0 and G1, then any MST for G contains a minimum-weight edge among the edges running between a vertex in G0 and a vertex in G1
- Solution
  - Prim's algorithm
  - Kruskal's algorithm
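The slides name Prim's and Kruskal's algorithms as the solution but do not show code for either. Below is a hedged sketch of a "lazy" version of Prim's algorithm, which is structurally very close to Dijkstra's loop above: repeatedly pull the smallest-weight edge leaving the processed set, exactly the crossing-edge insight stated in the key insight. The weighted adjacency-list type and function names are illustrative assumptions, not the course's implementation.

    // Lazy Prim's algorithm sketch: weighted adjacency list plus a min-heap of
    // (edge weight, vertex the edge leads to) pairs.
    #include <functional>
    #include <iostream>
    #include <map>
    #include <queue>
    #include <set>
    #include <string>
    #include <utility>
    #include <vector>
    using namespace std;

    typedef map<string, vector<pair<string,int>>> WeightedGraph; // vertex -> (neighbor, weight)

    int primMSTWeight(const WeightedGraph& g, const string& start)
    // post: returns the total weight of a minimum spanning tree of start's component
    {
        set<string> inTree;
        priority_queue<pair<int,string>,
                       vector<pair<int,string>>,
                       greater<pair<int,string>>> pq;    // smallest weight on top
        pq.push(make_pair(0, start));
        int total = 0;
        while (!pq.empty())
        {
            pair<int,string> top = pq.top();
            pq.pop();
            if (inTree.count(top.second)) continue;      // stale entry, vertex already in tree
            inTree.insert(top.second);
            total += top.first;                          // minimal crossing edge (0 for the start vertex)
            auto it = g.find(top.second);
            if (it == g.end()) continue;
            for (const auto& edge : it->second)
                if (!inTree.count(edge.first))
                    pq.push(make_pair(edge.second, edge.first));
        }
        return total;
    }

    int main()
    {
        WeightedGraph g;
        g["A"] = {{"B", 2}, {"C", 3}};
        g["B"] = {{"A", 2}, {"C", 1}};
        g["C"] = {{"A", 3}, {"B", 1}};
        cout << primMSTWeight(g, "A") << endl;           // prints 3 (edges A-B and B-C)
    }

The only real difference from Dijkstra's loop is the priority: Dijkstra orders the queue by total distance from the source, while Prim orders it by the weight of the single edge crossing into the tree.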