Sorting: From Theory to Practice

- Why do we study sorting?
  - Because we have to
  - Because sorting is beautiful
  - Example of algorithm analysis in a simple, useful setting
- There are n sorting algorithms, how many should we study?
  - O(n), O(log n), …
  - Why do we study more than one algorithm?
    - Some are good, some are bad, some are very, very sad
    - Paradigms of trade-offs and algorithmic design
  - Which is best?
  - Which sort should you call from code you write?

Sorting out sorts

- Simple, O(n²) sorts --- for sorting n elements
  - Selection sort --- n² comparisons, n swaps, easy to code
  - Insertion sort --- n² comparisons, n² moves, stable, fast
  - Bubble sort --- n² everything, slow, slower, and ugly
- Divide and conquer faster sorts: O(n log n) for n elements
  - Quick sort: fast in practice, O(n²) worst case
  - Merge sort: good worst case, great for linked lists, uses extra storage for vectors/arrays
- Other sorts:
  - Heap sort, basically priority queue sorting
  - Radix sort: doesn't compare keys, uses digits/characters
  - Shell sort: quasi-insertion, fast in practice, non-recursive


Selection sort: summary

- Simple to code n² sort: n² comparisons, n swaps

    void selectSort(String[] a)
    {
        for(int k=0; k < a.length; k++) {
            int minIndex = findMin(a, k);   // see helper sketch below
            swap(a, k, minIndex);
        }
    }

- # comparisons: Σ k for k = 1..n = 1 + 2 + … + n = n(n+1)/2 = O(n²)
  - Swaps?
  - Invariant: ?????
    [diagram: a[0..k-1] sorted, won't move, in final position; rest of array ?????]

Insertion Sort: summary

- Stable sort, O(n²), good on nearly sorted vectors
  - Stable sorts maintain order of equal keys
  - Good for sorting on two criteria: name, then age

    void insertSort(String[] a)
    {
        int k, loc;
        String elt;
        for(k=1; k < a.length; k++) {
            elt = a[k];
            loc = k;
            // shift until spot for elt is found
            while (0 < loc && elt.compareTo(a[loc-1]) < 0) {
                a[loc] = a[loc-1];   // shift right
                loc = loc - 1;
            }
            a[loc] = elt;
        }
    }

  - Invariant: ?????
    [diagram: a[0..k] sorted relative to each other; rest of array ?????]
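The selectSort code above calls findMin and swap, which the slides do not show. A minimal sketch of what they might look like, with signatures assumed from the calls above:

    // Hypothetical helpers assumed by selectSort above (not shown on the slides).
    // findMin: index of the smallest element in a[first..a.length-1]
    int findMin(String[] a, int first)
    {
        int minIndex = first;
        for(int k = first + 1; k < a.length; k++) {
            if (a[k].compareTo(a[minIndex]) < 0) {
                minIndex = k;
            }
        }
        return minIndex;
    }

    // swap: exchange a[i] and a[j]
    void swap(String[] a, int i, int j)
    {
        String temp = a[i];
        a[i] = a[j];
        a[j] = temp;
    }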

Bubble sort: summary of a dog

- For completeness you should know about this sort
  - Few, if any, redeeming features. Really slow, really, really
  - Can code to recognize already sorted vector (see insertion), as sketched below
    - Not worth it for bubble sort, much slower than insertion

    void bubbleSort(String[] a)
    {
        for(int j=a.length-1; j >= 0; j--) {
            for(int k=0; k < j; k++) {
                if (a[k].compareTo(a[k+1]) > 0) {
                    swap(a, k, k+1);
                }
            }
        }
    }

- "bubble" elements down the vector/array
    [diagram: a[j+1..] sorted, in final position; ?????]

Summary of simple sorts

- Selection sort has n swaps, good for "heavy" data
  - moving objects with lots of state, e.g., …
    - In C or C++ this is an issue
    - In Java everything is a pointer/reference, so swapping is fast since it's pointer assignment
- Insertion sort is good on nearly sorted data, it's stable, it's fast
  - Also foundation for Shell sort, very fast non-recursive sort
  - More complicated to code, but relatively simple, and fast
- Bubble sort is a travesty? But it's fast to code if you know it!
  - Can be parallelized, but on one machine don't go near it (see quotes at end of slides)
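A minimal sketch of the "recognize an already sorted vector" idea mentioned above, assuming the same swap helper; if a full pass makes no swaps the array is already sorted and the loops can stop early:

    // Hedged sketch: bubble sort with an early exit when a pass makes no swaps.
    void bubbleSortEarlyExit(String[] a)
    {
        for(int j = a.length - 1; j >= 0; j--) {
            boolean swapped = false;
            for(int k = 0; k < j; k++) {
                if (a[k].compareTo(a[k+1]) > 0) {
                    swap(a, k, k+1);
                    swapped = true;
                }
            }
            if (!swapped) {
                return;   // no swaps on this pass: a[0..j] already sorted
            }
        }
    }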


Quicksort: fast in practice

- Invented in 1962 by C.A.R. Hoare, didn't understand recursion
  - Worst case is O(n²), but avoidable in nearly all cases
  - In 1997 introsort was published (Musser, introspective sort)
    - Like quicksort in practice, but recognizes when it will be bad and changes to heapsort

    void quick(String[] a, int left, int right)
    {
        if (left < right) {
            int pivot = partition(a, left, right);
            quick(a, left, pivot-1);
            quick(a, pivot+1, right);
        }
    }

- Recurrence?

Partition code for quicksort

- Easy to develop partition
    [diagrams: what we want --- a[left..pIndex] <= pivot, a[pIndex+1..right] > pivot;
     what we have --- a[left..right] still ??????;
     invariant --- pivot X at a[left], a[left+1..pIndex] <= X, a[pIndex+1..k-1] > X, a[k..right] still ???]

    int partition(String[] a, int left, int right)
    {
        String pivot = a[left];
        int k, pIndex = left;
        for(k = left+1; k <= right; k++) {
            if (a[k].compareTo(pivot) <= 0) {
                pIndex++;
                swap(a, k, pIndex);
            }
        }
        swap(a, left, pIndex);
        return pIndex;   // pivot's final position
    }

- loop invariant:
  - statement true each time loop test is evaluated, used to verify correctness of loop
- Can swap into a[left] before loop (see the pivot-selection sketch below)
  - Nearly sorted data still ok
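The "can swap into a[left] before loop" bullet refers to choosing a pivot elsewhere and moving it to a[left] before partitioning. A hedged sketch of the median-of-three idea ("find middle of left, middle, right, swap, go" on the analysis slide), with the helper name medianOfThree assumed:

    // Hedged sketch: median-of-three pivot selection (helper name assumed).
    // Puts the median of a[left], a[mid], a[right] into a[left], so the
    // partition code above can run unchanged.
    void medianOfThree(String[] a, int left, int right)
    {
        int mid = (left + right) / 2;
        if (a[mid].compareTo(a[left]) < 0)   swap(a, mid, left);
        if (a[right].compareTo(a[left]) < 0) swap(a, right, left);
        if (a[right].compareTo(a[mid]) < 0)  swap(a, right, mid);
        // now a[left] <= a[mid] <= a[right]; the median is at mid
        swap(a, left, mid);   // move median to a[left] so partition uses it as pivot
    }

With this, quick could call medianOfThree(a, left, right) immediately before partition(a, left, right), which helps avoid the bad behavior on nearly sorted data.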

Analysis of Quicksort

- Average case and worst case analysis
  - Recurrence for worst case: T(n) = T(n-1) + T(1) + O(n)
  - What about average? T(n) = 2T(n/2) + O(n)
- Reason informally:
  - Two calls, vectors of size n/2
  - Four calls, vectors of size n/4
  - … How many calls? Work done on each call?
- Partition: typically find middle of left, middle, right, swap, go
  - Avoid bad performance on nearly sorted data
- In practice: remove some (all?) recursion, avoid lots of "clones" (see the sketch below)

Tail recursion elimination

- If the last statement is a recursive call, recursion can be replaced with iteration
  - Call cannot be part of an expression
  - Some compilers do this automatically

    void foo(int n)
    {
        if (0 < n) {
            System.out.println(n);
            foo(n-1);
        }
    }

    void foo2(int n)
    {
        while (0 < n) {
            System.out.println(n);
            n = n - 1;
        }
    }

- What if print and recursive call switched?
- What about recursive factorial? return n*factorial(n-1);
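Applying the same idea to quick: one of its two recursive calls is a tail call and can become a loop. A hedged sketch, with the added convention (an assumption beyond the slides) of recursing on the smaller half so the stack depth stays O(log n):

    // Hedged sketch: quicksort with the tail call turned into a loop.
    // Recurse on the smaller side, loop on the larger side.
    void quickNoTail(String[] a, int left, int right)
    {
        while (left < right) {
            int pivot = partition(a, left, right);
            if (pivot - left < right - pivot) {
                quickNoTail(a, left, pivot - 1);    // smaller side: recurse
                left = pivot + 1;                   // larger side: loop
            } else {
                quickNoTail(a, pivot + 1, right);   // smaller side: recurse
                right = pivot - 1;                  // larger side: loop
            }
        }
    }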


Merge sort: worst case O(n log n)

- Divide and conquer --- recursive sort
  - Divide list/vector into two halves
    - Sort each half
    - Merge sorted halves together
  - What is complexity of merging two sorted lists?
  - What is recurrence relation for merge sort as described?
    T(n) = 2T(n/2) + O(n)
- What is advantage of array over linked-list for merge sort?
  - What about merging, advantage of linked list? (see the sketch below)
  - Array requires auxiliary storage (or very fancy coding)

Merge sort: lists or vectors

- Mergesort for vectors

    void mergesort(String[] a, int left, int right)
    {
        if (left < right) {
            int mid = (right + left) / 2;
            mergesort(a, left, mid);
            mergesort(a, mid+1, right);
            merge(a, left, mid, right);
        }
    }

- What's different when linked lists used?
  - Do differences affect complexity? Why?
- How does merge work?
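To make the linked-list questions above concrete, here is a hedged sketch of merging two sorted singly linked lists; the Node class is hypothetical (not part of the slides), and no auxiliary array is needed because nodes are simply relinked:

    // Hypothetical node type, for illustration only.
    class Node {
        String info;
        Node next;
        Node(String info, Node next) { this.info = info; this.next = next; }
    }

    // Merge two sorted lists by relinking nodes: O(n) time, no auxiliary array,
    // which is the linked-list advantage asked about above.
    Node merge(Node a, Node b)
    {
        if (a == null) return b;
        if (b == null) return a;
        if (a.info.compareTo(b.info) <= 0) {   // <= keeps the merge stable
            a.next = merge(a.next, b);
            return a;
        } else {
            b.next = merge(a, b.next);
            return b;
        }
    }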

Mergesort continued

- Array code for merge isn't pretty, but it's not hard (one possible implementation is sketched below)
  - Mergesort itself is elegant

    void merge(String[] a,
               int left, int middle, int right)
    // pre:  left <= middle <= right,
    //       a[left] <= … <= a[middle],
    //       a[middle+1] <= … <= a[right]
    // post: a[left] <= … <= a[right]

- Why is this prototype potentially simpler for linked lists?
  - What will prototype be? What is complexity?

Summary of O(n log n) sorts

- Quicksort is relatively straight-forward to code, very fast
  - Worst case is very unlikely, but possible, therefore …
  - But, if lots of elements are equal, performance will be bad
    - One million integers from range 0 to 10,000
    - How can we change partition to handle this?
- Merge sort is stable, it's fast, good for linked lists, harder to code?
  - Worst case performance is O(n log n), compare quicksort
  - Extra storage for array/vector
- Heapsort, more complex to code, good worst case, not stable
  - Basically heap-based priority queue in a vector
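A hedged sketch of one way to fill in the merge prototype above, using a temporary array for the auxiliary storage the summary slide mentions; this is one possible implementation, not necessarily the one used in the course:

    // Merge sorted ranges a[left..middle] and a[middle+1..right] back into a[left..right].
    // Uses O(right-left+1) auxiliary storage.
    void merge(String[] a, int left, int middle, int right)
    {
        String[] temp = new String[right - left + 1];
        int i = left, j = middle + 1, t = 0;
        while (i <= middle && j <= right) {
            if (a[i].compareTo(a[j]) <= 0) {      // <= keeps the sort stable
                temp[t++] = a[i++];
            } else {
                temp[t++] = a[j++];
            }
        }
        while (i <= middle) temp[t++] = a[i++];   // copy leftovers from the left half
        while (j <= right)  temp[t++] = a[j++];   // or from the right half
        for (t = 0; t < temp.length; t++) {
            a[left + t] = temp[t];                // copy merged result back
        }
    }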


Sorting in practice

- Rarely will you need to roll your own sort, but when you do
  - What are key issues?
- If you use a library sort, you need to understand the interface
  - In C++ we have STL
    - STL has sort, and stable_sort
  - In C the generic sort is complex to use because arrays are ugly
  - In Java guarantees and worst-case are important
    - Why won't quicksort be used?
- Comparators permit sorting criteria to change simply

Non-comparison-based sorts

- Lower bound: Ω(n log n) for comparison based sorts (like searching lower bound)
- Bucket sort/radix sort are not comparison based, faster asymptotically and in practice
- Sort a vector of ints, all ints in the range 1..100, how? (see the sketch below)
  - (use extra storage)
- Radix: examine each digit of numbers being sorted
  - One pass per digit
  - Sort based on digit
    [radix sort example: 23 34 56 25 44 73 42 26 10 16 distributed into buckets 0..9 by
     ones digit, then by tens digit, giving 10 16 23 25 26 34 42 44 56 73]
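For the "ints in the range 1..100" question above, a hedged sketch of the counting/bucket idea using extra storage, as the slide suggests; the method name and exact range handling are assumptions:

    // Hedged sketch: counting sort for ints known to lie in 1..100.
    // count[v] records how many times value v appears: one pass to count,
    // then one pass over the small fixed range to write values back in order.
    void countingSort(int[] a)
    {
        int[] count = new int[101];        // extra storage, indices 1..100 used
        for (int v : a) {
            count[v]++;
        }
        int i = 0;
        for (int v = 1; v <= 100; v++) {
            for (int c = 0; c < count[v]; c++) {
                a[i++] = v;
            }
        }
    }

This runs in O(n + 100) time, beating the Ω(n log n) bound because it never compares keys.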

Jim Gray (Turing 1998)

    Bubble sort is a good argument for analyzing algorithm performance.
    It is a perfectly correct algorithm. But it's performance is among
    the worst imaginable. So, it crisply shows the difference between
    correct algorithms and good algorithms. (italics ola's)

Brian Reid (Hopper Award 1982)

    Feah. I love bubble sort, and I grow weary of people who have
    nothing better to do than to preach about it. Universities are good
    places to keep such people, so that they don't scare the general
    public.

    (continued)


Brian Reid (Hopper 1982) Niklaus Wirth (Turing award 1984) I am quite capable of squaring N with or without a calculator, and I know how long my sorts will bubble. I can type every I have read your article and share your view that Bubble Sort has hardly any form of bubble sort into a text editor from memory. If I am merits. I think that it is so often writing some quick code and I need a sort quick, as opposed to mentioned, because it illustrates quite a quick sort, I just type in the bubble sort as if it were a well the principle of sorting by statement. I'm done with it before I could look up the data exchanging. type of the third argument to the quicksort library. I think BS is popular, because it fits I have a dual-processor 1.2 GHz Powermac and it sneers at well into a systematic development of your N squared for most interesting values of N. And my sorting algorithms. But it plays no source code is smaller than yours. role in actual applications. Quite in contrast to C, also without merit (and Brian Reid its derivative Java), among who keeps all of his bubbles sorted anyhow. programming codes.

Good paper, by the way. Well written and actually has something to say. CPS 100 13.19 CPS 100 13.20 Guy L. Steele, Jr. (Hopper ’88) Guy L. Steele, Jr.

(Thank you for your fascinating paper and inquiry. Here are some off-the-cuff As for its status today, it may be an example thoughts on the subject. ) of that phenomenon whereby the first widely I think that one reason for popular version of something becomes frozen as a the popularity of Bubble common term or cultural icon. Even in the 1990s, Sort is that it is easy to see a comic-strip bathtub very likely sits off the floor why it works, and the idea is simple enough that one on claw feet. can carry it around in one's … it is the first thing that leaps to mind, the head … thing that is easy to recognize, the thing that is easy to doodle on a napkin, when one thinks continued generically or popularly about sort routines.
