
Unit 7: Sorting Algorithms

- Simple sorting algorithms
- Quicksort
- Improving Quicksort

Overview of Sorting Algorithms

Given a collection of items, we want to arrange them in increasing or decreasing order.

You have probably already seen a number of sorting algorithms, including:
- selection sort
- insertion sort
- bubble sort
- quicksort
- tree sort using BSTs

In terms of efficiency:
- the average complexity of the first three is O(n^2)
- the average complexity of quicksort and tree sort is O(n lg n)
- but their worst case is still O(n^2), which is not acceptable

In this section, we
- review insertion, selection and bubble sort
- discuss quicksort and its average/worst-case analysis
- show how to eliminate tail recursion
- present another sorting algorithm called heapsort

Selection Sort

Assume that the data
- are integers
- are stored in an array, in locations 0 to size-1
- are to be sorted in ascending order

Algorithm

    for i = 0 to size-1 do
        x = location with the smallest value in locations i to size-1
        swap data[i] and data[x]
    end

Complexity
- If the array has n items, the i-th step performs n-i operations: the first step performs n operations, the second step n-1 operations, ..., the last step performs 1 operation.
- Total cost: n + (n-1) + (n-2) + ... + 2 + 1 = n(n+1)/2.
- The algorithm is O(n^2).

Insertion Sort

Algorithm

    for i = 0 to size-1 do
        temp = data[i]
        x = first location from 0 to i with a value greater than or equal to temp
        shift all values from x to i-1 one location forward
        data[x] = temp
    end

Complexity
- The interesting operations are comparisons and shifts.
- The i-th step performs i comparison and shift operations.
- Total cost: 1 + 2 + ... + (n-1) + n = n(n+1)/2.
- The algorithm is O(n^2).

Bubble Sort

- n passes; each pass swaps out-of-order adjacent elements.

Algorithm

    for end = size-1 to 0 do
        for i = 0 to end-1 do
            if data[i] > data[i+1]
                swap data[i] and data[i+1]
        end
    end

Complexity
- Each pass of the inner for-loop performs end operations, so the total cost is n + (n-1) + ... + 1 = n(n+1)/2.
- The algorithm is O(n^2).
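The pseudocode for the three simple sorts above maps directly onto C. The sketch below is only an illustration and is not part of the original notes: it uses a plain int array and hypothetical function names, and the insertion sort shifts while scanning from the right, which has the same effect as finding the insertion point from the left and then shifting.

    #include <stdio.h>

    static void swap(int a[], int i, int j) {
        int tmp = a[i];  a[i] = a[j];  a[j] = tmp;
    }

    /* Selection sort: move the smallest remaining item to position i. */
    void selection_sort(int data[], int size) {
        for (int i = 0; i < size; i++) {
            int x = i;                        /* location of smallest value in i..size-1 */
            for (int j = i + 1; j < size; j++)
                if (data[j] < data[x]) x = j;
            swap(data, i, x);
        }
    }

    /* Insertion sort: insert data[i] into the sorted prefix data[0..i-1].
       The loop starts at 1 because a one-element prefix is already sorted. */
    void insertion_sort(int data[], int size) {
        for (int i = 1; i < size; i++) {
            int temp = data[i];
            int x = i;
            while (x > 0 && data[x - 1] > temp) {   /* shift larger values one slot forward */
                data[x] = data[x - 1];
                x--;
            }
            data[x] = temp;
        }
    }

    /* Bubble sort: each pass bubbles the largest remaining item to position end. */
    void bubble_sort(int data[], int size) {
        for (int end = size - 1; end > 0; end--)
            for (int i = 0; i < end; i++)
                if (data[i] > data[i + 1]) swap(data, i, i + 1);
    }

    int main(void) {
        int data[] = {17, 32, 68, 16, 14, 15, 44, 22};
        insertion_sort(data, 8);              /* any of the three sorts works here */
        for (int i = 0; i < 8; i++)
            printf("%d ", data[i]);           /* prints: 14 15 16 17 22 32 44 68 */
        printf("\n");
        return 0;
    }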
Tree Sort

- Insert each element into a BST or AVL tree.
- Traverse the tree inorder and place the elements back into the array.

Algorithm

    tree = an empty BST or AVL tree
    for i = 0 to size-1 do
        insert data[i] in tree
    end
    for i = 0 to size-1 do
        get the next inorder item in tree
        store the item in data[i]
    end

Complexity
- Inserting n items into the BST or AVL tree takes O(n log n) time on average.
- Traversing the tree and placing the items back into the array takes O(n) time.
- Average cost: O(n log n + n) = O(n log n).
- If plain BSTs are used, the worst-case cost is O(n^2).

Problem
- The algorithm needs an additional O(n) space for the tree.

Quicksort

Quicksort is a divide-and-conquer (recursive) method.

Algorithm
- pick one element of the array as the pivot
- partition the array into two regions:
  - the left region, which holds the items less than the pivot
  - the right region, which holds the items greater than or equal to the pivot
- apply quicksort to the left region
- apply quicksort to the right region

Quicksort Code

    typedef long int Item;

    int partition( Item a[], int first, int last );   // returns the final position of the pivot

    void quicksort( Item a[], int first, int last )
    {
        int pos;                            // position of pivot
        if (first < last)
        {                                   // array has more than 1 item
            pos = partition( a, first, last );
            quicksort( a, first, pos-1 );
            quicksort( a, pos+1, last );
        }
    }

    // Partitions a[first..last] around the pivot a[first] and
    // returns the final position of the pivot.
    int partition( Item a[], int first, int last )
    {
        Item pivot = a[first];              // pivot
        int left = first, right = last;

        while (left < right) {
            // search from the right for an item <= pivot
            while ( a[right] > pivot )
                right--;
            // search from the left for an item > pivot
            while ( left < right && a[left] <= pivot )
                left++;
            // swap the items if left and right have not crossed
            if ( left < right ) {
                Item tmp = a[left];
                a[left] = a[right];
                a[right] = tmp;
            }
        }
        // place the pivot in its correct position
        a[first] = a[right];
        a[right] = pivot;
        return right;
    }
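To see the code in action, a short test driver such as the one below can be appended after the two functions above in the same file. The driver itself is not part of the original notes; it simply sorts the array used in the example that follows.

    #include <stdio.h>

    int main(void) {
        Item data[] = {17, 32, 68, 16, 14, 15, 44, 22};   /* array from the example below */
        int size = sizeof data / sizeof data[0];

        quicksort(data, 0, size - 1);       /* sort the whole array */

        for (int i = 0; i < size; i++)
            printf("%ld ", data[i]);        /* prints: 14 15 16 17 22 32 44 68 */
        printf("\n");
        return 0;
    }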
Quicksort – Partition Algorithm

The partition algorithm does the following:
- picks the first item as the pivot
- divides the array into two regions:
  - the left region, which contains all items <= pivot
  - the right region, which contains all items > pivot
- places the pivot in its correct slot (it never needs to be moved again)
- returns the pivot's final position
- after that, quicksort recursively sorts the left region (the part before the pivot) and the right region (the part after the pivot)

Partition searches from the two ends of the array and swaps items that are in the wrong region.
- When the two searches meet, the array is partitioned.
- Then the pivot is placed in its correct position.

Quicksort – Example

Sorting 17 32 68 16 14 15 44 22 (the pivot of each partition is shown in brackets; once placed, it never moves again):

    1st partition (pivot 17):  17 32 68 16 14 15 44 22
                               17 15 68 16 14 32 44 22    (swap 32 and 15)
                               17 15 14 16 68 32 44 22    (swap 68 and 14)
                               16 15 14 [17] 68 32 44 22  (place the pivot)
    2nd partition (pivot 16):  14 15 [16]
    3rd partition (pivot 14):  [14] 15
    4th partition (pivot 68):  22 32 44 [68]
    5th partition (pivot 22):  [22] 32 44
    6th partition (pivot 32):  [32] 44

    sorted array:              14 15 16 17 22 32 44 68

Quicksort – Time Complexity

Average time
- Partition is O(n): it is just a loop through the array.
- Assuming we split the array in half each time, we need 2 recursive calls on half the array, 4 on a quarter of the array, and so on.
- So the algorithm has at most log n levels.
- Therefore the average time complexity of the algorithm is O(n log n).

Average space
- It can be shown that the maximum number of frames on the stack is log n.
- So the average space is O(log n).

Worst time
- The array is already sorted: partition splits the array into an empty region and a region with n-1 items.
- Then time = n + (n-1) + ... + 2 + 1, which is O(n^2).

Worst space
- The recursion creates n stack frames, so the space is O(n).

Improvements of Quicksort

How can we improve the worst-case time?
- Select as pivot the median of the leftmost, rightmost and middle items.
- This significantly reduces the probability of the worst case.

How can we improve the worst-case space?
- Remove the tail recursion (a recursive call done as the last operation) and replace it with iteration.

With these improvements, the time/space complexity of quicksort is:

              Average      Worst (for time)
    time      O(n lg n)    O(n^2)
    space     O(lg n)      O(1)

Further improvements:
- Quicksort is inefficient on small arrays.
- Stop using quicksort when the partition size is small (e.g. < 50) and use insertion sort for that part of the array.
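One possible way to put these improvements together is sketched below; this code is not from the original notes. It reuses the partition() function shown earlier, which expects the pivot in a[first]: it first moves the median of the first, middle and last items into a[first], and it replaces the second recursive call with a loop, always recursing on the smaller region so that the stack depth stays at most O(lg n). The helper names median_of_three and quicksort2 are illustrative.

    /* Assumes the Item typedef and the partition() function given earlier. */

    static void swap_items( Item a[], int i, int j ) {
        Item tmp = a[i];  a[i] = a[j];  a[j] = tmp;
    }

    /* Order a[first], a[mid], a[last] and move their median into a[first],
       where partition() expects to find the pivot.                          */
    static void median_of_three( Item a[], int first, int last ) {
        int mid = first + (last - first) / 2;
        if ( a[mid]  < a[first] ) swap_items( a, mid,  first );
        if ( a[last] < a[first] ) swap_items( a, last, first );
        if ( a[last] < a[mid]  )  swap_items( a, last, mid );
        swap_items( a, first, mid );          /* a[mid] now holds the median */
    }

    /* Quicksort with median-of-three pivots and the tail recursion removed:
       the larger region is handled by the loop, the smaller one by the single
       remaining recursive call, so at most O(lg n) stack frames are used.    */
    void quicksort2( Item a[], int first, int last ) {
        while (first < last) {
            /* further improvement (not shown): when last - first is small,
               e.g. < 50, sort this region with insertion sort and return   */
            median_of_three( a, first, last );
            int pos = partition( a, first, last );
            if ( pos - first < last - pos ) {
                quicksort2( a, first, pos - 1 );   /* smaller region: recurse */
                first = pos + 1;                   /* larger region: iterate  */
            } else {
                quicksort2( a, pos + 1, last );    /* smaller region: recurse */
                last = pos - 1;                    /* larger region: iterate  */
            }
        }
    }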