Asymptotic Running Time of Algorithms

Asymptotic Complexity: leading term analysis

• Comparing the searching and sorting algorithms seen so far:
  – Count the worst-case number of comparisons as a function of array size.
  – Drop lower-order terms, floors/ceilings, and constants to arrive at the asymptotic running time of the algorithm.
• We will now generalize this approach to other programs:
  – Count the worst-case number of operations executed by the program as a function of input size.
  – Formalize the definition of big-O complexity to derive the asymptotic running time of an algorithm.

Formal Definition of big-O Notation

• Let f(n) and g(n) be functions. We say f(n) is of order g(n), written O(g(n)), if there is a constant c > 0 such that for all but a finite number of positive values of n:

    f(n) ≤ c * g(n)

  In other words, sooner or later g(n) overtakes f(n) as n gets large.
• Example: f(n) = n + 5; g(n) = n. Show that f(n) = O(g(n)).
  Choose c = 6: f(n) = n + 5 ≤ 6*n for all n > 0.
• Example: f(n) = 17n; g(n) = 3n². Show that f(n) = O(g(n)).
  Choose c = 6: f(n) = 17n ≤ 6 * 3n² for all n > 0.
• To prove that f(n) = O(g(n)), find an n0 and a c such that f(n) ≤ c * g(n) for all n > n0.
• We call the pair (c, n0) a witness pair for proving that f(n) = O(g(n)).

For asymptotic complexity, the base of logarithms does not matter

• Let us show that log2(n) = O(logb(n)) for any b > 1.
• We need to find a witness pair (c, n0) such that log2(n) ≤ c * logb(n) for all n > n0.
• Choose (c = log2(b), n0 = 0). This works because

    c * logb(n) = log2(b) * logb(n)
                = log2(b^(logb(n)))
                = log2(n)

  for all positive n.

Why is Asymptotic Complexity So Important?

• Asymptotic complexity gives an idea of how rapidly the space/time requirements grow as the problem size increases.
• Suppose we have a computing device that can execute 1000 complex operations per second. Here is the largest problem size that can be solved in a second, a minute, and an hour by algorithms of different asymptotic complexity:

    Complexity   1 second   1 minute   1 hour
    n            1000       60,000     3,600,000
    n log n      140        4893       200,000
    n²           31         244        1897
    3n²          18         144        1096
    n³           10         39         153
    2^n          9          15         21

What Operations Should We Count?

• Must do a detailed counting of all executed operations.
• Estimate the number of primitive operations that the RAM model (a basic microprocessor) must do to run the program:
  – Basic operation: each arithmetic/logical operation counts as 1 operation, even though adds, multiplies, and divides don't (usually) cost the same.
  – Assignment: counts as 1 operation; the operation count of the right-hand side of the expression is determined separately.
  – Loop: number of operations per iteration * number of loop iterations.
  – Method invocation: number of operations executed in the invoked method; ignore the cost of building stack frames, etc.
  – Recursion: same as method invocation, but harder to reason about.
  – We ignore garbage collection, the memory hierarchy (caches), etc.

What are the basic OPERATIONS?

• For searching and sorting algorithms, you can usually determine big-O complexity by counting comparisons.
• Why? You usually end up doing some fixed number of arithmetic and/or logical operations per comparison.
• Why don't we count swaps for sorting? Think about selection sort (see the sketch below):
  – it does at most n swaps,
  – but it must do O(n²) comparisons,
  – and swaps and comparisons have similar cost.
• Is counting the number of comparisons always right? No: consider, e.g., selection sort on linked lists instead of arrays.
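To make the comparison/swap distinction concrete, here is a minimal instrumented selection sort (a sketch in Java, not from the original slides; the class name and counter fields are illustrative). Run on a random array of size n = 1000, it reports exactly n*(n-1)/2 = 499,500 comparisons but only n = 1000 swaps, which is why counting comparisons alone already captures the O(n²) behavior.

// Sketch: selection sort instrumented to count comparisons and swaps separately.
public class SelectionSortCounts {
    static long comparisons = 0;
    static long swaps = 0;

    static void selectionSort(int[] a) {
        for (int i = 0; i < a.length; i++) {
            int minPos = i;
            for (int j = i + 1; j < a.length; j++) {
                comparisons++;                       // one comparison per inner-loop iteration
                if (a[j] < a[minPos]) minPos = j;
            }
            int temp = a[i]; a[i] = a[minPos]; a[minPos] = temp;
            swaps++;                                 // at most one swap per outer-loop iteration
        }
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] a = new int[n];
        java.util.Random rnd = new java.util.Random(42);
        for (int i = 0; i < n; i++) a[i] = rnd.nextInt();
        selectionSort(a);
        // Expected output: comparisons = 499500, swaps = 1000
        System.out.println("comparisons = " + comparisons + ", swaps = " + swaps);
    }
}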
Example: Selection Sort

public static void selectionSort(Comparable[] a) {  // array of size n
  for (int i = 0; i < a.length; i++) {               <-- cost = c1, n times
    int MinPos = i;                                  <-- cost = c2, n times
    for (int j = i+1; j < a.length; j++) {           <-- cost = c3, n*(n-1)/2 times
      if (a[j].compareTo(a[MinPos]) < 0)             <-- cost = c4, n*(n-1)/2 times
        MinPos = j; }                                <-- cost = c5, n*(n-1)/2 times
    Comparable temp = a[i];                          <-- cost = c6, n times
    a[i] = a[MinPos];                                <-- cost = c7, n times
    a[MinPos] = temp; } }                            <-- cost = c8, n times

Total number of operations:
  = (c1+c2+c6+c7+c8)*n + (c3+c4+c5)*n*(n-1)/2
  = (c1+c2+c6+c7+c8 - (c3+c4+c5)/2)*n + ((c3+c4+c5)/2)*n²
  = O(n²)

Example: Matrix Multiplication

int n = A.length;                                    <-- cost = c0, 1 time
for (int i = 0; i < n; i++) {                        <-- cost = c1, n times
  for (int j = 0; j < n; j++) {                      <-- cost = c2, n*n times
    sum = 0;                                         <-- cost = c3, n*n times
    for (int k = 0; k < n; k++)                      <-- cost = c4, n*n*n times
      sum = sum + A[i][k]*B[k][j];                   <-- cost = c5, n*n*n times
    C[i][j] = sum;                                   <-- cost = c6, n*n times
  }
}

Total number of operations:
  = c0 + c1*n + (c2+c3+c6)*n*n + (c4+c5)*n*n*n
  = O(n³)

Remarks

• For asymptotic running time, we do not need to count the precise number of operations executed by each statement, provided that the number of operations is independent of the input size. Just use symbolic constants like c1, c2, … instead.
• Our estimate used a precise count for the number of times the j loop is executed in selection sort (n*(n-1)/2). We could have said it is executed n² times and still have obtained the same big-O complexity.
• Once you get the hang of this, you can quickly zero in on what is relevant for determining asymptotic complexity. For example, you can usually ignore everything that is not in the innermost loop. Why?
• Main difficulty: estimating the running time of recursive programs.

Analysis of Merge-Sort

public static Comparable[] mergeSort(Comparable[] A, int low, int high) {
  if (low < high - 1) {                              // at least three elements  <-- cost = c0, 1 time
    int mid = (low + high)/2;                        <-- cost = c1, 1 time
    Comparable[] A1 = mergeSort(A, low, mid);        <-- cost = ??, 1 time
    Comparable[] A2 = mergeSort(A, mid+1, high);     <-- cost = ??, 1 time
    return merge(A1, A2); }                          <-- cost = c2*n + c3
  ....

Recurrence equation:
  T(n) = (c0+c1) + 2T(n/2) + (c2*n + c3)             <-- recurrence
  T(1) = c4                                          <-- base case

How do we solve this recurrence equation? First, simplify by dropping lower-order terms. Simplified recurrence equation:
  T(n) = 2T(n/2) + n
  T(1) = 1

It can be shown that T(n) = O(n log(n)) is a solution to this recurrence.

What do we mean by "Solution"?

• Recurrence: T(n) = 2T(n/2) + n
• Solution: T(n) = n log2(n)
• To prove it, substitute n log2(n) for T(n) in the recurrence:

    T(n)      = 2T(n/2) + n
    n log2(n) = 2(n/2) log2(n/2) + n
    n log2(n) = n log2(n/2) + n
    n log2(n) = n (log2(n) - log2(2)) + n
    n log2(n) = n (log2(n) - 1) + n
    n log2(n) = n log2(n) - n + n
    n log2(n) = n log2(n)
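Another way to convince yourself (a small sketch, not part of the original slides; the class name is illustrative) is to evaluate the simplified recurrence directly and compare it with the guessed solution n log2(n). For powers of two, the recurrence yields exactly n log2(n) + n, so the ratio to the guess approaches 1, consistent with T(n) = O(n log n).

// Sketch: evaluate T(n) = 2*T(n/2) + n, T(1) = 1 directly and compare
// against the guess n*log2(n) for powers of two.
public class MergeSortRecurrence {
    static long t(long n) {
        if (n <= 1) return 1;        // base case T(1) = 1
        return 2 * t(n / 2) + n;     // recurrence T(n) = 2T(n/2) + n
    }

    public static void main(String[] args) {
        for (long n = 2; n <= (1L << 20); n *= 2) {
            double guess = n * (Math.log(n) / Math.log(2));   // n*log2(n)
            // For powers of two, T(n) = n*log2(n) + n, so the ratio tends to 1.
            System.out.printf("n=%-8d T(n)=%-10d n*log2(n)=%-12.0f ratio=%.3f%n",
                              n, t(n), guess, t(n) / guess);
        }
    }
}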
Solving recurrences

• Solving recurrences is like integration: there are no general techniques known for solving recurrences.
• For CS 211, we just expect you to remember a few common patterns.
• In CS 280, you learn a bag of tricks for solving recurrences that arise in practice.

Cheat Sheet for Common Recurrences

  Recurrence Relation                               Closed Form          Example
  c(1) = a;  c(n) = b + c(n-1)                      c(n) = O(n)          Linear search
  c(1) = a;  c(n) = b*n + c(n-1)                    c(n) = O(n²)         Quicksort
  c(1) = a;  c(n) = b + c(n/2)                      c(n) = O(log(n))     Binary search
  c(1) = a;  c(n) = b*n + c(n/2)                    c(n) = O(n)
  c(1) = a;  c(n) = b + k*c(n/k)                    c(n) = O(n)

Cheat Sheet for Common Recurrences cont.

  Recurrence Relation                               Closed Form          Example
  c(1) = a;  c(n) = b*n + 2c(n/2)                   c(n) = O(n log(n))   Mergesort
  c(1) = a;  c(n) = b*n + k*c(n/k)                  c(n) = O(n log(n))
  c(1) = a;  c(2) = b;  c(n) = c(n-1) + c(n-2) + d  c(n) = O(2^n)        Fibonacci

• Don't just memorize these. Try to understand each one.
• When in doubt, guess a solution and see if it works (just like with integration).

Analysis of Quicksort: Tricky!

public static void quickSort(Comparable[] A, int l, int h) {
  if (l < h) {
    int p = partition(A, l+1, h, A[l]);
    // move pivot into its final resting place
    Comparable temp = A[p-1];
    A[p-1] = A[l];
    A[l] = temp;
    // make recursive calls
    quickSort(A, l, p-1);
    quickSort(A, p, h); } }

Incorrect attempt:
  c(1) = 1
  c(n) = n + 2c(n/2)
  where the n term accounts for the partition step and 2c(n/2) for sorting the two partitioned arrays.

What is wrong with this analysis?

Not all algorithms are created equal

• Remember: big-O is worst-case complexity.
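The balanced-split recurrence above silently assumes that partition always divides the array in half. In the worst case it does not: with a first-element pivot and an already-sorted input, one side of every partition is empty, so the cost obeys c(n) = (n-1) + c(n-1), which is O(n²). The following sketch (a simplified first-element-pivot quicksort, not the partition routine from the slides) counts comparisons on a sorted array to show this concretely.

// Sketch: simplified quicksort with a first-element pivot (Lomuto-style partition),
// instrumented to count comparisons. On an already-sorted array every partition is
// maximally unbalanced, so we expect about n*(n-1)/2 comparisons rather than n*log2(n).
public class QuickSortWorstCase {
    static long comparisons = 0;

    static void quickSort(int[] a, int l, int h) {    // sorts a[l..h] inclusive
        if (l >= h) return;
        int pivot = a[l];
        int p = l;                                     // eventual pivot position
        for (int j = l + 1; j <= h; j++) {
            comparisons++;
            if (a[j] < pivot) {
                p++;
                int tmp = a[p]; a[p] = a[j]; a[j] = tmp;
            }
        }
        int tmp = a[l]; a[l] = a[p]; a[p] = tmp;       // move pivot into place
        quickSort(a, l, p - 1);
        quickSort(a, p + 1, h);
    }

    public static void main(String[] args) {
        int n = 2000;                                  // modest n: recursion depth reaches n here
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = i;          // already sorted: worst case for this pivot choice
        quickSort(a, 0, n - 1);
        // Expected: n*(n-1)/2 = 1,999,000 comparisons (vs. n*log2(n) ≈ 22,000 for balanced splits).
        System.out.println("comparisons = " + comparisons);
    }
}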
