
Sorting Algorithms

We have already seen:
- Selection-sort
- Insertion-sort
- Heap-sort

We will see:
- Bubble-sort
- Merge-sort
- Quick-sort

We will show that:
- O(n · log n) is optimal for comparison-based sorting.


Bubble-Sort

The basic idea of bubble-sort is as follows:
- exchange neighboring elements that are in the wrong order
- stop when no elements were exchanged

  bubbleSort(A):
    n = length(A)
    swapped = true
    while swapped == true do
      swapped = false
      for i = 0 to n - 2 do
        if A[i] > A[i + 1] then
          swap(A[i], A[i + 1])
          swapped = true
      done
      n = n - 1
    done


Bubble-Sort: Example

Sorting 5 4 3 7 1; after each pass the largest element of the unsorted
part has bubbled into the sorted part at the right end:

  5 4 3 7 1    initial
  4 5 3 7 1    pass 1: swap 5,4
  4 3 5 7 1            swap 5,3
  4 3 5 1 7            swap 7,1
  3 4 1 5 7    after pass 2
  3 1 4 5 7    after pass 3
  1 3 4 5 7    after pass 4 (sorted)


Bubble-Sort: Properties

Time complexity:
- worst-case: (n - 1) + (n - 2) + ... + 1 = ((n - 1)² + (n - 1)) / 2 ∈ O(n²)
  (caused by sorting an inversely sorted list)
- best-case: O(n)

Bubble-sort is:
- slow
- in-place


Divide-and-Conquer

Divide-and-conquer is a general algorithm design paradigm:
- Divide: divide the input S into two disjoint subsets S1, S2
- Recur: recursively solve the subproblems S1, S2
- Conquer: combine the solutions for S1, S2 into a solution for S

(the base cases of the recursion are problems of size 0 or 1)

Example: merge-sort

  7 2 | 9 4  →  2 4 7 9
  7 | 2 → 2 7        9 | 4 → 4 9
  7 → 7    2 → 2     9 → 9    4 → 4

- | indicates the splitting point
- → indicates merging of the sub-solutions
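The bubble-sort pseudocode above translates directly into a runnable sketch. Python is used here for illustration; the function name `bubble_sort` is our own choice, not from the slides:

```python
def bubble_sort(a):
    """Sort the list `a` in place by swapping out-of-order neighbours.

    Stops as soon as a full pass performs no swap; after each pass the
    largest remaining element has bubbled into its final position, so
    the scanned range shrinks by one (n = n - 1 in the pseudocode).
    """
    n = len(a)
    swapped = True
    while swapped:
        swapped = False
        for i in range(n - 1):          # i = 0 .. n - 2
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        n -= 1                          # last position is now sorted
    return a
```

On an already sorted input the first pass performs no swap, giving the O(n) best case; an inversely sorted input forces a swap at every comparison, giving the O(n²) worst case.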
Merge-Sort

Merge-sort of a list S with n elements works as follows:
- Divide: divide S into two lists S1, S2 of ≈ n/2 elements
- Recur: recursively sort S1, S2
- Conquer: merge S1 and S2 into a sorted list S

  Algorithm mergeSort(S, C):
    Input: a list S of n elements and a comparator C
    Output: the list S sorted according to C
    if size(S) > 1 then
      (S1, S2) = partition S into sizes ⌊n/2⌋ and ⌈n/2⌉
      mergeSort(S1, C)
      mergeSort(S2, C)
      S = merge(S1, S2, C)


Merging two Sorted Sequences

  Algorithm merge(A, B, C):
    Input: sorted lists A, B
    Output: sorted list containing the elements of A and B
    S = empty list
    while ¬A.isEmpty() and ¬B.isEmpty() do
      if A.first().element < B.first().element then
        S.insertLast(A.remove(A.first()))
      else
        S.insertLast(B.remove(B.first()))
    done
    while ¬A.isEmpty() do S.insertLast(A.remove(A.first()))
    while ¬B.isEmpty() do S.insertLast(B.remove(B.first()))

Performance:
- Merging two sorted lists of total length n takes O(n) time.
  (for singly linked lists, doubly linked lists, and arrays)


Merge-Sort: Example

Sorting the letters of "anexample":

  Divide (split):
    a n e x | a m p l e
    a n | e x        a m | p l e
    a | n   e | x    a | m   p | l e
                             l | e

  Conquer (merge):
    a, n → a n    e, x → e x    a, m → a m    l, e → e l
    a n, e x → a e n x          p, e l → e l p
                                a m, e l p → a e l m p
    a e n x, a e l m p → a a e e l m n p x


Merge-Sort Tree

An execution of merge-sort can be displayed in a binary tree:
- each node represents a recursive call and stores:
  - the unsorted sequence before execution, and its partition
  - the sorted sequence after execution
- leaves are calls on subsequences of size 0 or 1

For "a n e x a m p l e" the root stores
  a n e x | a m p l e  →  a a e e l m n p x
its children store
  a n | e x → a e n x    and    a m | p l e → a e l m p
and so on, down to the single-letter leaves (e.g. a → a, n → n).


Merge-Sort: Example Execution

  7 1 2 9 | 6 5 3 8  →  1 2 3 5 6 7 8 9
  7 1 | 2 9 → 1 2 7 9          6 5 | 3 8 → 3 5 6 8
  7 | 1 → 1 7    2 | 9 → 2 9   6 | 5 → 5 6    3 | 8 → 3 8
  7 → 7   1 → 1   2 → 2   9 → 9   6 → 6   5 → 5   3 → 3   8 → 8

Finished merge-sort tree.
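As a runnable sketch of the two algorithms above, here is merge-sort in Python. For brevity it returns a new list and compares with `<` instead of taking an explicit comparator C; the names `merge` and `merge_sort` mirror the pseudocode:

```python
def merge(a, b):
    """Merge two sorted lists into one sorted list in O(len(a) + len(b)) time."""
    out = []
    i = j = 0
    while i < len(a) and j < len(b):       # both lists non-empty
        if a[i] < b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    out.extend(a[i:])                      # at most one of the two
    out.extend(b[j:])                      # remaining tails is non-empty
    return out

def merge_sort(s):
    """Divide s in half, recursively sort both halves, then merge."""
    if len(s) <= 1:                        # base case: size 0 or 1
        return s
    mid = len(s) // 2                      # split into sizes floor/ceil of n/2
    return merge(merge_sort(s[:mid]), merge_sort(s[mid:]))
```

For example, `merge_sort(list("anexample"))` yields the sorted letters `a a e e l m n p x` from the slide.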
Merge-Sort: Running Time

The height h of the merge-sort tree is O(log₂ n):
- each recursive call splits the sequence in half

The work of all nodes together at depth i is O(n):
- partitioning and merging of 2^i sequences of size n/2^i
- 2^(i+1) ≤ n recursive calls

  depth   nodes   size
  0       1       n
  1       2       n/2
  i       2^i     n/2^i
  ...     ...     ...

Thus the worst-case running time is O(n · log₂ n).


Quick-Sort

Quick-sort of a list S with n elements works as follows:
- Divide: pick a random element x (the pivot) from S and split S into:
  - L: elements less than x
  - E: elements equal to x
  - G: elements greater than x
- Recur: recursively sort L and G
- Conquer: join L, E, and G

Example (pivot x = 6):

  7 2 1 9 6 5 3 8
  L = 2 1 5 3    E = 6    G = 7 9 8
  1 2 3 5 6 7 8 9


Quick-Sort: The Partitioning

The partitioning runs in O(n) time:
- we traverse S and compare every element y with x
- depending on the comparison, insert y into L, E, or G

  Algorithm partition(S, p):
    Input: a list S of n elements, and the position p of the pivot
    Output: lists L, E, G of the elements less than, equal to,
            or greater than the pivot
    L, E, G = empty lists
    x = S.elementAtRank(p)
    while ¬S.isEmpty() do
      y = S.remove(S.first())
      if y < x then L.insertLast(y)
      if y == x then E.insertLast(y)
      if y > x then G.insertLast(y)
    done
    return L, E, G


Quick-Sort Tree

An execution of quick-sort can be displayed in a binary tree:
- each node represents a recursive call and stores:
  - the unsorted sequence before execution, and its pivot
  - the sorted sequence after execution
- leaves are calls on subsequences of size 0 or 1

  1 6 2 9 4 0  →  0 1 2 4 6 9    (pivot 4)
  1 2 0 → 0 1 2  (pivot 1)       6 9 → 6 9  (pivot 6)
  0 → 0    2 → 2                 9 → 9


Quick-Sort: Example

  8 2 9 3 1 5 7 6 4  →  1 2 3 4 5 6 7 8 9    (pivot 6)
  2 3 1 5 4 → 1 2 3 4 5  (pivot 1)           8 9 7 → 7 8 9  (pivot 8)
  2 3 5 4 → 2 3 4 5  (pivot 3)               7 → 7    9 → 9
  2 → 2    5 4 → 4 5  (pivot 5)
           4 → 4


Quick-Sort: Worst-Case Running Time

The worst-case running time occurs when:
- the pivot is always the minimal or maximal element
- then one of L and G has size n - 1, the other size 0

Then the running time is O(n²):

  n + (n - 1) + (n - 2) + ... + 1 ∈ O(n²)

The quick-sort tree degenerates into a chain of calls with input sizes
n, n - 1, n - 2, ..., 1, each paired with an empty call of size 0.
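The divide/recur/conquer steps of quick-sort can be sketched directly in Python. This is the list-based variant from the slides (partition into L, E, G with a random pivot), not the in-place version; the name `quick_sort` is our own:

```python
import random

def quick_sort(s):
    """Pick a random pivot x, partition s into L/E/G, recurse on L and G."""
    if len(s) <= 1:                      # base case: size 0 or 1
        return s
    x = random.choice(s)                 # random pivot
    L = [y for y in s if y < x]          # elements less than x
    E = [y for y in s if y == x]         # elements equal to x
    G = [y for y in s if y > x]          # elements greater than x
    return quick_sort(L) + E + quick_sort(G)   # conquer: join L, E, G
```

The three list comprehensions together are the O(n) partitioning pass; since E is never empty, both L and G are strictly smaller than s and the recursion terminates.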
Quick-Sort: Average Running Time

Consider a recursive call on a list of size s:
- Good call: both L and G have size less than (3/4) · s
- Bad call: one of L and G has size at least (3/4) · s

For example, for a list of size 16, pivots at positions 5 to 12 (the
middle half) lead to good calls; pivots in the outer quarters lead to
bad calls.

A good call has probability 1/2:
- half of the possible pivots give rise to good calls

For a node at depth i, we expect (on average):
- i/2 of its ancestors are good calls
- the size of its input sequence is ≤ (3/4)^(i/2) · n

As a consequence:
- for a node at depth 2 · log_{4/3} n the expected input size is 1
- the expected height of the quick-sort tree is O(log n)

The total amount of work at each depth is O(n), summed over all nodes
of that depth. Thus the expected (average) running time is O(n · log n).


In-Place Quick-Sort

Quick-sort can be implemented in-place (but is then no longer stable):

  Algorithm inPlaceQuickSort(A, l, r):
    Input: list A, indices l and r
    Output: list A where the elements from index l to r are sorted
    if l ≥ r then return
    p = A[r]                                   (take rightmost element as pivot)
    l' = l and r' = r - 1
    while l' ≤ r' do
      while l' ≤ r' and A[l'] ≤ p do l' = l' + 1    (find > p)
      while l' ≤ r' and A[r'] ≥ p do r' = r' - 1    (find < p)
      if l' < r' then swap(A[l'], A[r'])            (swap < p with > p)
    done
    swap(A[r], A[l'])                          (put pivot into the right place)
    inPlaceQuickSort(A, l, l' - 1)             (sort left part)
    inPlaceQuickSort(A, l' + 1, r)             (sort right part)

Quick-sort is considered in-place although the recursion needs
O(log n) space.
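A minimal Python rendering of the in-place pseudocode above, assuming the rightmost-element pivot rule from the slides; `lo`/`hi` play the roles of l' and r':

```python
def in_place_quick_sort(a, l=0, r=None):
    """Sort a[l..r] in place; the rightmost element is the pivot."""
    if r is None:
        r = len(a) - 1
    if l >= r:                               # size 0 or 1: nothing to do
        return
    p = a[r]                                 # rightmost element as pivot
    lo, hi = l, r - 1                        # l' and r' of the pseudocode
    while lo <= hi:
        while lo <= hi and a[lo] <= p:       # find an element > p
            lo += 1
        while lo <= hi and a[hi] >= p:       # find an element < p
            hi -= 1
        if lo < hi:                          # swap misplaced pair
            a[lo], a[hi] = a[hi], a[lo]
    a[r], a[lo] = a[lo], a[r]                # pivot into its final place
    in_place_quick_sort(a, l, lo - 1)        # sort left part
    in_place_quick_sort(a, lo + 1, r)        # sort right part
```

Unlike the L/E/G variant, this version allocates no extra lists; only the recursion stack consumes additional space.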