Efficient Sorts: Two Great Sorting Algorithms

Total pages: 16

File type: PDF, size: 1020 KB

Lecture 3: Efficient Sorts. Two great sorting algorithms, mergesort and quicksort:
– Full scientific understanding of their properties has enabled us to hammer them into practical system sorts.
– They occupy a prominent place in the world's computational infrastructure.
– Quicksort was honored as one of the top 10 algorithms for science and engineering of the 20th century.

Mergesort: Java Arrays.sort for type Object; Java Collections.sort; Perl (stable); Python (stable).
Quicksort: Java Arrays.sort for primitive types; C qsort; Unix; g++; Visual C++; Perl; Python.

Princeton University • COS 226 • Algorithms and Data Structures • Spring 2004 • Kevin Wayne • http://www.Princeton.EDU/~cos226

Sorting Applications

Obvious applications:
– Sort a list of names.
– Organize an MP3 library.
– Display Google PageRank results.

Problems that become easy once items are in sorted order:
– Find the median.
– Find the closest pair.
– Binary search in a database.
– Identify statistical outliers.
– Find duplicates in a mailing list.

Non-obvious applications:
– Data compression.
– Computer graphics.
– Computational biology.
– Supply chain management.
– Simulate a system of particles.
– Book recommendations on Amazon.
– Load balancing on a parallel computer.

(Donald Knuth)

Estimating the Running Time

Total running time is the sum of cost × frequency over all of the basic ops:
– Cost depends on machine and compiler.
– Frequency depends on algorithm and input.

Cost for sorting, with A = # function calls, B = # exchanges, C = # comparisons:
– Cost on a typical machine = 35A + 11B + 4C.
Frequency of sorting ops, with N = # elements to sort:
– Selection sort: A = 1, B = N-1, C = N(N-1)/2.

An easier alternative:
(i) Analyze asymptotic growth as a function of input size N.
(ii) For medium N, run and measure time.
(iii) For large N, use (i) and (ii) to predict time.

Asymptotic growth rates:
– Estimate as a function of input size N: N, N log N, N^2, N^3, 2^N, N!.
– Ignore lower-order terms and leading coefficients.
– Ex. 6N^3 + 17N^2 + 56 is asymptotically proportional to N^3.

Big Oh Notation

Big Theta, Oh, and Omega notation:
– Θ(N^2) means { N^2, 17N^2, N^2 + 17N^1.5 + 3N, ... }: ignore lower-order terms and leading coefficients.
– O(N^2) means { N^2, 17N^2, N^2 + 17N^1.5 + 3N, N^1.5, 100N, ... }: Θ(N^2) and smaller; use for upper bounds.
– Ω(N^2) means { N^2, 17N^2, N^2 + 17N^1.5 + 3N, N^3, 100N^5, ... }: Θ(N^2) and larger; use for lower bounds.
Never say: insertion sort makes at least O(N^2) comparisons.

Why It Matters

Insertion sort is quadratic: N^2/4 - N/4 comparisons on average, i.e. Θ(N^2). On arizona: 1 second for N = 10,000.
– How long for N = 100,000? 100 seconds (100 times as long).
– N = 1 million? 2.78 hours (another factor of 100).
– N = 1 billion? 317 years (another factor of 10^6).
– N = 1 trillion?

Run time, with per-op cost in nanoseconds:

    Problem size | 1.3 N^3      | 10 N^2      | 47 N log N   | 48 N
    1000         | 1.3 seconds  | 10 msec     | 0.4 msec     | 0.048 msec
    10,000       | 22 minutes   | 1 second    | 6 msec       | 0.48 msec
    100,000      | 15 days      | 1.7 minutes | 78 msec      | 4.8 msec
    million      | 41 years     | 2.8 hours   | 0.94 seconds | 48 msec
    10 million   | 41 millennia | 1.7 weeks   | 11 seconds   | 0.48 seconds

Max size problem solved in one:

    second | 920    | 10,000      | 1 million   | 21 million
    minute | 3,600  | 77,000      | 49 million  | 1.3 billion
    hour   | 14,000 | 600,000     | 2.4 billion | 76 billion
    day    | 41,000 | 2.9 million | 50 billion  | 1,800 billion

N multiplied by 10, time multiplied by: 1,000 | 100 | 10+ | 10.

Reference: More Programming Pearls by Jon Bentley.

Orders of Magnitude

    Seconds | Equivalent
    1       | 1 second
    10      | 10 seconds
    10^2    | 1.7 minutes

    Meters per second | Imperial units  | Example
    10^-10            | 1.2 in / decade | Continental drift
    10^-8             | 1 ft / year     | Hair growing

Mergesort (divide-and-conquer):
– Divide the array into two halves.
– Recursively sort each half.
– Merge the two halves to make a sorted whole.
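The "insertion sort is quadratic" claim above lends itself to a quick doubling experiment: if the comparison count grows like N^2/4, then doubling N should roughly quadruple it. A minimal sketch (class and method names are ours, not from the lecture):

```java
import java.util.Random;

// Doubling test for insertion sort: count comparisons and watch the
// ratio approach 4 as N doubles (hypothetical demo code).
public class DoublingTest {
    // Insertion sort that returns the number of comparisons performed.
    static long insertionSortComparisons(int[] a) {
        long comps = 0;
        for (int i = 1; i < a.length; i++) {
            int temp = a[i];
            int k = i;
            while (k > 0) {
                comps++;                              // one comparison per probe
                if (temp < a[k - 1]) { a[k] = a[k - 1]; k--; } else break;
            }
            a[k] = temp;
        }
        return comps;
    }

    static int[] randomArray(int n, long seed) {
        Random r = new Random(seed);
        int[] a = new int[n];
        for (int i = 0; i < n; i++) a[i] = r.nextInt();
        return a;
    }

    public static void main(String[] args) {
        long prev = 0;
        for (int n = 1000; n <= 8000; n *= 2) {
            long c = insertionSortComparisons(randomArray(n, 42));
            if (prev > 0)
                System.out.printf("N=%d comparisons=%d ratio=%.2f%n", n, c, (double) c / prev);
            prev = c;
        }
    }
}
```

On random input the printed ratios hover near 4, matching the ~N^2/4 average-case count.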
Orders of Magnitude (continued)

    Seconds | Equivalent
    10^3    | 17 minutes
    10^4    | 2.8 hours
    10^5    | 1.1 days
    10^6    | 1.6 weeks
    10^7    | 3.8 months
    10^8    | 3.1 years
    10^9    | 3.1 decades
    10^10   | 3.1 centuries
    10^17   | age of universe
    ...     | forever

    Meters per second | Imperial units  | Example
    10^-6             | 3.4 in / day    | Glacier
    10^-4             | 1.2 ft / hour   | Gastro-intestinal tract
    10^-2             | 2 ft / minute   | Ant
    1                 | 2.2 mi / hour   | Human walk
    10^2              | 220 mi / hour   | Propeller airplane
    10^4              | 370 mi / min    | Space shuttle
    10^6              | 620 mi / sec    | Earth in galactic orbit
    10^8              | 62,000 mi / sec | 1/3 speed of light

Powers of 2: 2^10 ≈ thousand, 2^20 ≈ million, 2^30 ≈ billion.

Mergesort example:

    A L G O R I T H M S
    A L G O R | I T H M S    divide
    A G L O R | H I M S T    sort each half
    A G H I L M O R S T      merge

Mergesort Implementation in Java

    public static void mergesort(Comparable[] a, int low, int high) {
        Comparable[] temp = new Comparable[a.length];
        for (int i = 0; i < a.length; i++) temp[i] = a[i];
        mergesort(temp, a, low, high);
    }

    private static void mergesort(Comparable[] from, Comparable[] to, int low, int high) {
        if (high <= low) return;
        int mid = (low + high) / 2;
        mergesort(to, from, low, mid);
        mergesort(to, from, mid+1, high);
        int p = low, q = mid + 1;
        for (int i = low; i <= high; i++) {
            if      (q > high)               to[i] = from[p++];
            else if (p > mid)                to[i] = from[q++];
            else if (less(from[q], from[p])) to[i] = from[q++];
            else                             to[i] = from[p++];
        }
    }

Mergesort Analysis

Stability? Yes, if the underlying merge is stable.

How much memory does the array implementation of mergesort require?
– Original input = N.
– Auxiliary array for merging = N.
– Local variables: constant.
– Function call stack: log2 N.
– Total = 2N + O(log N).

How much memory do other sorting algorithms require?
– N + O(1) for insertion sort, selection sort, bubble sort.
– In-place = N + O(log N).

How long does mergesort take?
– Bottleneck = merging (and copying): merging two files of size N/2 requires ≤ N comparisons.
– Let T(N) = comparisons to mergesort N elements:

    T(N) = 0             if N = 1
    T(N) = 2T(N/2) + N   otherwise   (sorting both halves + merging)
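The recurrence above can be checked numerically by instrumenting the recursion with a counter and charging exactly N per merge, as the analysis assumes (a sketch; the class and counter are ours, not the lecture's code):

```java
// Empirical check that T(N) = N log2 N when N is a power of 2 and
// merging N items is charged exactly N comparisons (cost model only;
// no actual sorting is performed).
public class MergesortCount {
    static long comparisons = 0;

    static void mergesort(int[] a, int lo, int hi) {
        if (hi <= lo) return;
        int mid = (lo + hi) / 2;
        mergesort(a, lo, mid);
        mergesort(a, mid + 1, hi);
        comparisons += hi - lo + 1;   // model: merging N items costs exactly N
    }

    static long count(int n) {
        comparisons = 0;
        mergesort(new int[n], 0, n - 1);
        return comparisons;
    }

    public static void main(String[] args) {
        for (int n = 2; n <= 1024; n *= 2) {
            int log2n = 31 - Integer.numberOfLeadingZeros(n);
            System.out.println(n + " -> " + count(n) + " (N log2 N = " + n * log2n + ")");
        }
    }
}
```

For every power of 2 the two printed numbers agree, matching the claim T(N) = N log2 N.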
Proof by Picture of Recursion Tree

Assume N is a power of 2 and that merging requires exactly N comparisons:

    T(N) = 0             if N = 1
    T(N) = 2T(N/2) + N   otherwise   (sorting both halves + merging)

The recursion tree has log2 N levels, and the merging cost on each level sums to N:

    T(N)                            N
    T(N/2)  T(N/2)                  2 (N/2)
    T(N/4)  T(N/4)  T(N/4)  T(N/4)  4 (N/4)
    ...                             ...
    T(N/2^k), 2^k times             2^k (N/2^k)
    ...                             ...
    T(2), N/2 times                 (N/2) (2)
                             total: N log2 N

Claim. T(N) = N log2 N. Note: mergesort makes the same number of comparisons for ANY file, including one that is already sorted. We'll give several proofs to illustrate standard techniques.

Proof by Telescoping

Claim. If T(N) satisfies this recurrence (assuming N is a power of 2), then T(N) = N log2 N.

Proof. For N > 1, divide both sides by N and telescope:

    T(N)/N = T(N/2)/(N/2) + 1
           = T(N/4)/(N/4) + 1 + 1
           = ...
           = T(N/N)/(N/N) + 1 + 1 + ... + 1    (log2 N ones)
           = log2 N

Mathematical Induction

Powerful and general proof technique in discrete mathematics. To prove a theorem true for all integers k ≥ 0:
– Base case: prove it to be true for N = 0.
– Induction hypothesis: assume it is true for arbitrary N.
– Induction step: show it is true for N + 1.

Claim: 0 + 1 + 2 + 3 + ... + N = N(N+1)/2 for all N ≥ 0.
Proof (by mathematical induction):
– Base case (N = 0): 0 = 0(0+1)/2.
– Induction hypothesis: assume 0 + 1 + 2 + ... + N = N(N+1)/2.
– Induction step: 0 + 1 + ... + N + (N+1) = (0 + 1 + ... + N) + (N+1) = N(N+1)/2 + (N+1) = (N+2)(N+1)/2.

Proof by Induction

Claim. If T(N) satisfies the recurrence above (assuming N is a power of 2), then T(N) = N log2 N.
Proof (by induction on N):
– Base case: N = 1.
– Inductive hypothesis: T(N) = N log2 N.
– Goal: show that T(2N) = 2N log2(2N).

Q. What if N is not a power of 2? What if merging takes at most N comparisons instead of exactly N?
A. T(N) then satisfies the following recurrence:

    T(N) = 0                              if N = 1
    T(N) ≤ T(⌈N/2⌉) + T(⌊N/2⌋) + N        otherwise
           (solve left half + solve right half + merging)

Claim. T(N) ≤ N ⌈log2 N⌉.
Proof. Challenge for the bored.
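The merge loop at the heart of the Java implementation above can also be exercised on its own. A minimal standalone version for int arrays (names are ours; the lecture's version works on Comparable):

```java
import java.util.Arrays;

// Standalone merge of two sorted halves a[lo..mid] and a[mid+1..hi],
// mirroring the slide's inner loop (a sketch, not the course code).
public class MergeDemo {
    static void merge(int[] a, int lo, int mid, int hi) {
        int[] aux = Arrays.copyOfRange(a, lo, hi + 1);
        int p = lo, q = mid + 1;
        for (int i = lo; i <= hi; i++) {
            if (q > hi)                         a[i] = aux[p++ - lo];
            else if (p > mid)                   a[i] = aux[q++ - lo];
            else if (aux[q - lo] < aux[p - lo]) a[i] = aux[q++ - lo];
            else                                a[i] = aux[p++ - lo]; // ties taken from left half: stable
        }
    }

    public static void main(String[] args) {
        int[] a = {1, 4, 7, 2, 3, 9};   // two sorted halves
        merge(a, 0, 2, 5);
        System.out.println(Arrays.toString(a)); // sorted whole
    }
}
```

Because ties are taken from the left half, equal keys keep their relative order, which is exactly the stability property the analysis relies on.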
Induction step, showing T(2N) = 2N log2(2N):

    T(2N) = 2T(N) + 2N
          = 2N log2 N + 2N
          = 2N (log2(2N) - 1) + 2N
          = 2N log2(2N)

Mergesort: Practical Improvements

– Eliminate recursion: bottom-up mergesort (Sedgewick Program 8.5).
– Stop if already sorted: is the biggest element in the first half ≤ the smallest element in the second half? Helps for nearly ordered lists.
– Insertion sort small files.

Sorting By Different Fields

Design challenge: enable sorting students by email or section.

    // sort by email
    Student.setSortKey(Student.EMAIL);
    ArraySort.mergesort(students, 0, N-1);
    // then by precept

    1  Anand Dharan   adharan
    1  Ashley Evans   amevans
    1  Alicia Myers   amyers
    1  Arthur Shum    ashum
    1  Amy Trangsrud  atrangsr
    1  Bryant Chen    bryantc
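The first two improvements can be combined in a short bottom-up sketch (our code, not Sedgewick's Program 8.5; the insertion-sort cutoff for small files is omitted for brevity):

```java
import java.util.Arrays;

// Bottom-up mergesort with the "stop if already sorted" test:
// merge runs of width 1, 2, 4, ..., skipping any merge whose halves
// are already in order (a sketch of the improvements above).
public class BottomUpMergesort {
    public static void sort(int[] a) {
        int n = a.length;
        int[] aux = new int[n];
        for (int width = 1; width < n; width *= 2) {
            for (int lo = 0; lo < n - width; lo += 2 * width) {
                int mid = lo + width - 1;
                int hi = Math.min(lo + 2 * width - 1, n - 1);
                if (a[mid] <= a[mid + 1]) continue;  // halves already in order: skip merge
                merge(a, aux, lo, mid, hi);
            }
        }
    }

    private static void merge(int[] a, int[] aux, int lo, int mid, int hi) {
        for (int k = lo; k <= hi; k++) aux[k] = a[k];
        int p = lo, q = mid + 1;
        for (int i = lo; i <= hi; i++) {
            if (q > hi)               a[i] = aux[p++];
            else if (p > mid)         a[i] = aux[q++];
            else if (aux[q] < aux[p]) a[i] = aux[q++];
            else                      a[i] = aux[p++];
        }
    }

    public static void main(String[] args) {
        int[] a = {5, 3, 8, 1, 9, 2};
        sort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

On an already-sorted array the skip test fires on every pass, so no merging work is done at all.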
Recommended publications
  • Sort Algorithms 15-110 - Friday 2/28 Learning Objectives
    Sort Algorithms, 15-110 - Friday 2/28

    Learning Objectives:
    • Recognize how different sorting algorithms implement the same process with different algorithms
    • Recognize the general algorithm and trace code for three algorithms: selection sort, insertion sort, and merge sort
    • Compute the Big-O runtimes of selection sort, insertion sort, and merge sort

    Search Algorithms Benefit from Sorting. We use search algorithms a lot in computer science. Just think of how many times a day you use Google, or search for a file on your computer. We've determined that search algorithms work better when the items they search over are sorted. Can we write an algorithm to sort items efficiently? Note: Python already has built-in sorting functions (sorted(lst) is non-destructive, lst.sort() is destructive). This lecture is about a few different algorithmic approaches for sorting.

    Many Ways of Sorting. There are a ton of algorithms that we can use to sort a list. We'll use https://visualgo.net/bn/sorting to visualize some of these algorithms. Today, we'll specifically discuss three different sorting algorithms: selection sort, insertion sort, and merge sort. All three do the same action (sorting), but use different algorithms to accomplish it.

    Selection Sort Sorts From Smallest to Largest. The core idea of selection sort is that you sort from smallest to largest:
    1. Start with none of the list sorted.
    2. Repeat the following steps until the whole list is sorted:
       a) Search the unsorted part of the list to find the smallest element.
       b) Swap the found element with the first unsorted element.
       c) Increment the size of the 'sorted' part of the list by one.
    Note: for selection sort, swapping the element currently in the front position with the smallest element is faster than sliding all of the numbers down in the list.
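The steps above translate almost line-for-line into code. A minimal sketch (ours, not the course's; this course uses Python, but we keep the deck's Java here):

```java
import java.util.Arrays;

// Selection sort following the numbered steps above (a minimal sketch).
public class SelectionSort {
    public static void sort(int[] a) {
        for (int i = 0; i < a.length - 1; i++) {
            int min = i;                              // first unsorted position
            for (int j = i + 1; j < a.length; j++)    // search unsorted part for smallest
                if (a[j] < a[min]) min = j;
            int tmp = a[i]; a[i] = a[min]; a[min] = tmp; // one swap, not a slide
        }
    }

    public static void main(String[] args) {
        int[] a = {29, 10, 14, 37, 13};
        sort(a);
        System.out.println(Arrays.toString(a)); // [10, 13, 14, 29, 37]
    }
}
```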
  • Quick Sort Algorithm Song Qin Dept
    Quick Sort Algorithm
    Song Qin, Dept. of Computer Sciences, Florida Institute of Technology, Melbourne, FL 32901

    ABSTRACT. Given an array with n elements, we want to rearrange them in ascending order. In this paper, we introduce Quick Sort, a divide-and-conquer algorithm to sort an N element array. We evaluate the O(NlogN) time complexity in the best case and O(N^2) in the worst case theoretically. We also introduce a way to approach the best case.

    1. INTRODUCTION. Search engines rely on sorting algorithms very much. When you search some key word online, the feedback information is brought to you sorted by the importance of the web page. Bubble, Selection and Insertion Sort all have an O(N^2) time complexity that limits their usefulness to a small number of elements, no more than a few thousand data points; the O(N^2) time complexity limits the performance when N gets very big. Selection sort moves the smallest element of the unsorted region to its front in each iteration, then repeats this on the rest of the unsorted region without the first element. Bubble sort works as follows: keep passing through the list, exchanging adjacent elements if the list is out of order; when no exchanges are required on some pass, the list is sorted. Merge sort [4] has an O(NlogN) time complexity. It divides the array into two subarrays, each with N/2 items, and conquers each subarray by sorting it; unless the array is sufficiently small (one element left), it uses recursion to do this. It then combines the solutions to the subarrays by merging them into a single sorted array.
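The bubble sort description above ("when no exchanges are required on some pass, the list is sorted") corresponds to an early-exit flag; a generic sketch (our code, not the paper's):

```java
import java.util.Arrays;

// Bubble sort with the early-exit test described above: keep passing,
// swapping adjacent out-of-order pairs, and stop after a swap-free pass.
public class BubbleSort {
    public static void sort(int[] a) {
        boolean swapped = true;
        for (int pass = a.length - 1; swapped && pass > 0; pass--) {
            swapped = false;
            for (int j = 0; j < pass; j++) {
                if (a[j] > a[j + 1]) {
                    int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    swapped = true;
                }
            }
        }
    }

    public static void main(String[] args) {
        int[] a = {4, 2, 5, 1, 3};
        sort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

On already-sorted input the flag stays false after one pass, giving the O(N) best case; random input still costs O(N^2).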
  • Radix Sort Comparison Sort Runtime of O(N*Log(N)) Is Optimal
    Radix Sort

    Comparison sort runtime of O(n log n) is optimal:
    • The problem of sorting cannot be solved using comparisons with less than n log n time complexity.
    • See Proposition I in Chapter 2.2 of the text.

    How can we sort without comparison? Consider the following approach:
    • Look at the least-significant digit.
    • Group numbers with the same digit, maintaining relative order.
    • Place groups back in the array together, i.e., all the 0's, all the 1's, all the 2's, etc.
    • Repeat for increasingly significant digits.

    The characteristics of Radix sort: least significant digit (LSD) Radix sort is a fast stable sorting algorithm that begins at the least significant digit (e.g. the rightmost digit), proceeds to the most significant digit (e.g. the leftmost digit), and yields lexicographic orderings.

    Generally speaking:
    1. Take the least significant digit (or group of bits) of each key.
    2. Group the keys based on that digit, but otherwise keep the original order of keys. This is what makes the LSD radix sort a stable sort.
    3. Repeat the grouping process with each more significant digit.

    public static void sort(String[] a, int W) {
        int N = a.length;
        int R = 256;
        String[] aux = new String[N];
        for (int d = W - 1; d >= 0; --d) {
            // aux = array a sorted by the dth character
            a = aux;
        }
    }

    Problem: how do we group the keys based on a digit while otherwise keeping the original order of keys? Key-indexed counting:
    1. Take the least significant digit (or group of bits) of each key.
    2. ...
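The step left abstract in the loop above ("aux = array a sorted by the dth character") is exactly key-indexed counting. A runnable sketch for fixed-width strings (class name is ours):

```java
import java.util.Arrays;

// LSD radix sort for strings of fixed width w over an R = 256 alphabet,
// filling in the "sort by the dth character" step via key-indexed counting.
public class LSDRadix {
    public static void sort(String[] a, int w) {
        int n = a.length, R = 256;
        String[] aux = new String[n];
        for (int d = w - 1; d >= 0; d--) {
            int[] count = new int[R + 1];
            for (String s : a) count[s.charAt(d) + 1]++;          // 1. frequency counts
            for (int r = 0; r < R; r++) count[r + 1] += count[r]; // 2. cumulate -> start indices
            for (String s : a) aux[count[s.charAt(d)]++] = s;     // 3. distribute, keeping order (stable)
            System.arraycopy(aux, 0, a, 0, n);                    // 4. copy back
        }
    }

    public static void main(String[] args) {
        String[] a = {"dab", "cab", "fad", "bad", "ace"};
        sort(a, 3);
        System.out.println(Arrays.toString(a)); // [ace, bad, cab, dab, fad]
    }
}
```

Step 3 scans keys in their current order, which is what makes each pass, and hence the whole sort, stable.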
  • Secure Multi-Party Sorting and Applications
    Secure Multi-Party Sorting and Applications
    Kristján Valur Jónsson (Reykjavik University), Gunnar Kreitz (KTH - Royal Institute of Technology), and Misbah Uddin (KTH)

    Abstract. Sorting is among the most fundamental and well-studied problems within computer science and a core step of many algorithms. In this article, we consider the problem of constructing a secure multi-party computing (MPC) protocol for sorting, building on previous results in the field of sorting networks. Apart from the immediate uses for sorting, our protocol can be used as a building-block in more complex algorithms. We present a weighted set intersection algorithm, where each party inputs a set of weighted elements and the output consists of the input elements with their weights summed. As a practical example, we apply our protocols in a network security setting for aggregation of security incident reports from multiple reporters, specifically to detect stealthy port scans in a distributed but privacy-preserving manner. Both sorting and weighted set intersection use O(n log^2 n) comparisons in O(log^2 n) rounds with practical constants. Our protocols can be built upon any secret sharing scheme supporting multiplication and addition. We have implemented and evaluated the performance of sorting on the Sharemind secure multi-party computation platform, demonstrating the real-world performance of our proposed protocols.

    Keywords. Secure multi-party computation; Sorting; Aggregation; Cooperative anomaly detection.

    1. Introduction. Intrusion Detection Systems (IDS) [16] are commonly used to detect anomalous and possibly malicious network traffic. Incidence reports and alerts from such systems are generally kept private, although much could be gained by cooperative sharing [30].
  • Batcher's Algorithm
    18.310 lecture notes, Fall 2010
    Batcher's Algorithm
    Prof. Michel Goemans

    Perhaps the most restrictive version of the sorting problem requires not only no motion of the keys beyond compare-and-switches, but also that the plan of comparison-and-switches be fixed in advance. In each of the methods mentioned so far, the comparison to be made at any time often depends upon the result of previous comparisons. For example, in HeapSort, it appears at first glance that we are making only compare-and-switches between pairs of keys, but the comparisons we perform are not fixed in advance. Indeed, when fixing a headless heap, we move either to the left child or to the right child depending on which child had the largest element; this is not fixed in advance.

    A sorting network is a fixed collection of comparison-switches, so that all comparisons and switches are between keys at locations that have been specified from the beginning. These comparisons are not dependent on what has happened before. The corresponding sorting algorithm is said to be non-adaptive. We will describe a simple recursive non-adaptive sorting procedure, named Batcher's Algorithm after its discoverer. It is simple and elegant but has the disadvantage that it requires on the order of n(log n)^2 comparisons, which is larger by a factor of the order of log n than the theoretical lower bound for comparison sorting. For a long time (ten years is a long time in this subject!) nobody knew if one could find a sorting network better than this one.
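A sorting network can be written down as a fixed sequence of compare-and-switches. The 5-comparator network for 4 keys below is a standard small example (not Batcher's general construction), and the 0/1 principle lets us certify it by checking all bit patterns:

```java
import java.util.Arrays;

// A fixed (non-adaptive) sorting network for 4 keys: the comparator
// positions below are chosen in advance and never depend on the data.
public class Network4 {
    static void compareSwitch(int[] a, int i, int j) {
        if (a[i] > a[j]) { int t = a[i]; a[i] = a[j]; a[j] = t; }
    }

    public static void sort4(int[] a) {
        compareSwitch(a, 0, 1);
        compareSwitch(a, 2, 3);
        compareSwitch(a, 0, 2);  // min reaches position 0
        compareSwitch(a, 1, 3);  // max reaches position 3
        compareSwitch(a, 1, 2);  // fix the middle pair
    }

    public static void main(String[] args) {
        // 0/1 principle: a network that sorts all 2^4 zero-one inputs sorts everything.
        for (int m = 0; m < 16; m++) {
            int[] a = { m & 1, (m >> 1) & 1, (m >> 2) & 1, (m >> 3) & 1 };
            sort4(a);
            for (int i = 1; i < 4; i++)
                if (a[i - 1] > a[i]) throw new AssertionError("not sorted: " + Arrays.toString(a));
        }
        System.out.println("network sorts all 0/1 inputs");
    }
}
```

Because the comparator sequence is data-independent, the same circuit works in hardware or in an MPC setting, which is why sorting networks recur in the articles above.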
  • CS 758/858: Algorithms
    CS 758/858: Algorithms
    Prof. Wheeler Ruml; TA Sumanta Kashyapi. http://www.cs.unh.edu/~ruml/cs758
    4 handouts: course info, schedule, slides, asst 1. 2 online handouts: programming tips, formulas. 1 physical sign-up sheet/laptop (for grades, piazza).

    COVID: check your Wildcat Pass before coming to campus; if you have concerns, let me know.

    Algorithms today: web (search, caching, crypto); networking (routing, synchronization, failover); machine learning (data mining, recommendation, prediction); bioinformatics (alignment, matching, clustering); hardware (design, simulation, verification); business (allocation, planning, scheduling); AI (robotics, games).

    Definition: an algorithm is precisely defined, proceeds in mechanical steps, terminates, and has input and related output. What might we want to know about it?

    Why? Computer scientist ≠ programmer:
    ◆ understand program behavior
    ◆ have confidence in results, performance
    ◆ know when optimality is abandoned
    ◆ solve 'impossible' problems
    ◆ sets you apart (eg, Amazon.com)
    CPUs aren't getting faster; devices are getting smaller; software is the differentiator. 'Software is eating the world' — Marc Andreessen, 2011. Everything is computation.

    The Word: Abū 'Abdallāh Muḥammad ibn Mūsā al-Khwārizmī, 780-850 AD. Born in Uzbekistan, worked in Baghdad.
  • Sorting Algorithms Correctness, Complexity and Other Properties
    Sorting Algorithms: Correctness, Complexity and other Properties
    Joshua Knowles, School of Computer Science, The University of Manchester
    COMP26912 - Week 9. LF17, April 1 2011

    The Importance of Sorting. Important because:
    • Fundamental to organizing data.
    • Principles of good algorithm design (correctness and efficiency) can be appreciated in the methods developed for this simple (to state) task.

    Every algorithms book has a large section on Sorting...

    ...On the Other Hand:
    • Progress in computer speed and memory has reduced the practical importance of (further developments in) sorting.
    • quicksort() is often an adequate answer in many applications.
    However, you still need to know your way (a little) around the key sorting algorithms.

    Overview: what you should learn about sorting (what is examinable):
    • Definition of sorting. Correctness of sorting algorithms.
    • How the following work: Bubble sort, Insertion sort, Selection sort, Quicksort, Merge sort, Heap sort, Bucket sort, Radix sort.
    • Main properties of those algorithms.
    • How to reason about complexity — worst case and special cases.
    Covered in: the course book; labs; this lecture; wikipedia; wider reading.

    Relevant pages of the course book: Selection sort: 97 (very short description only). Insertion sort: 98 (very short). Merge sort: 219-224 (pages on multi-way merge not needed). Heap sort: 100-106 and 107-111. Quicksort: 234-238. Bucket sort: 241-242. Radix sort: 242-243. Lower bound on sorting: 239-240. Practical issues: 244. Some of the exercises on pp.
  • 2.3 Quicksort
    2.3 Quicksort ‣ quicksort ‣ selection ‣ duplicate keys ‣ system sorts
    Algorithms, 4th Edition · Robert Sedgewick and Kevin Wayne · Copyright © 2002-2010 · January 30, 2011 12:46:16 PM

    Two classic sorting algorithms. Critical components in the world's computational infrastructure:
    • Full scientific understanding of their properties has enabled us to develop them into practical system sorts.
    • Quicksort honored as one of top 10 algorithms of 20th century in science and engineering.
    Mergesort (last lecture): Java sort for objects; Perl, C++ stable sort, Python stable sort, Firefox JavaScript, ...
    Quicksort (this lecture): Java sort for primitive types; C qsort, Unix, Visual C++, Python, Matlab, Chrome JavaScript, ...

    Quicksort (Sir Charles Antony Richard Hoare, 1980 Turing Award). Basic plan:
    • Shuffle the array.
    • Partition so that, for some j: element a[j] is in place; no larger element to the left of j; no smaller element to the right of j.
    • Sort each piece recursively.

    input        Q U I C K S O R T E X A M P L E
    shuffle      K R A T E L E P U I M Q C X O S    (K is the partitioning element)
    partition    E C A I E K L P U T M Q R X O S    (left of K: not greater; right of K: not less)
    sort left    A C E E I K L P U T M Q R X O S
    sort right   A C E E I K L M O P Q R S T U X
    result       A C E E I K L M O P Q R S T U X

    Quicksort partitioning. Basic plan:
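The basic plan above (shuffle, partition around a[j], recur on each piece) can be sketched as follows (a minimal version in the spirit of the book's scheme, not its exact code; the fixed seed is ours, only to keep the demo deterministic):

```java
import java.util.Arrays;
import java.util.Random;

// Quicksort: shuffle, partition on a[lo], sort each piece recursively.
public class Quick {
    public static void sort(int[] a) {
        shuffle(a, new Random(12345)); // shuffle guards against worst-case input
        sort(a, 0, a.length - 1);
    }

    private static void sort(int[] a, int lo, int hi) {
        if (hi <= lo) return;
        int j = partition(a, lo, hi);
        sort(a, lo, j - 1);
        sort(a, j + 1, hi);
    }

    // After partitioning, a[j] is in place: nothing larger to its left,
    // nothing smaller to its right.
    private static int partition(int[] a, int lo, int hi) {
        int v = a[lo], i = lo, j = hi + 1;
        while (true) {
            while (a[++i] < v) if (i == hi) break;   // scan right for item >= v
            while (v < a[--j]) if (j == lo) break;   // scan left for item <= v
            if (i >= j) break;                       // pointers crossed
            swap(a, i, j);
        }
        swap(a, lo, j);                              // put partitioning element in place
        return j;
    }

    private static void shuffle(int[] a, Random r) {
        for (int i = a.length - 1; i > 0; i--) swap(a, i, r.nextInt(i + 1));
    }

    private static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    public static void main(String[] args) {
        int[] a = {81, 12, 90, 68, 27, 33, 50};
        sort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

Unlike mergesort, no auxiliary array is needed; the partitioning rearranges in place.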
  • An Evolutionary Approach for Sorting Algorithms
    ORIENTAL JOURNAL OF COMPUTER SCIENCE & TECHNOLOGY
    ISSN: 0974-6471. December 2014, Vol. 7, No. (3), pgs. 369-376.
    An International Open Free Access, Peer Reviewed Research Journal. Published By: Oriental Scientific Publishing Co., India. www.computerscijournal.org

    Root to Fruit (2): An Evolutionary Approach for Sorting Algorithms
    Pramod Kadam and Sachin Kadam, BVDU, IMED, Pune, India.
    (Received: November 10, 2014; Accepted: December 20, 2014)

    Abstract. This paper continues the earlier thought of evolutionary study of the sorting problem and sorting algorithms (Root to Fruit (1): An Evolutionary Study of Sorting Problem) [1] and concludes with the chronological list of early pioneers of the sorting problem and its algorithms. Later in the study, a graphical method is used to present the evolution of the sorting problem and sorting algorithms on the time line.

    Key words: Evolutionary study of sorting, History of sorting, Early sorting algorithms, List of inventors for sorting.

    Introduction. In spite of plentiful literature and research in the sorting algorithmic domain, there is a mess found in documentation as far as credentials are concerned [2]. Perhaps this problem arose due to lack of coordination and unavailability of a common platform or knowledge base in the same domain. Evolutionary study of sorting algorithms and the sorting problem is the foundation of a futuristic knowledge base for the sorting problem domain [1]. Some names and their contributions may be skipped from the study; therefore readers have all the rights to extend this study with valid proofs. Ultimately, our objective behind this research is very clear: to provide strength to the evolutionary study of sorting algorithms and to shift towards a good knowledge base that preserves the work of our forebears for the upcoming generation. Otherwise, the coming generation could receive hardly any information about sorting problems, and syllabi may restrict with some
  • Sorting and Asymptotic Complexity
    SORTING AND ASYMPTOTIC COMPLEXITY
    Lecture 14, CS2110 - Fall 2013

    Reading and Homework. Textbook, chapter 8 (general concepts) and 9 (MergeSort, QuickSort).

    Thought question: Cloud computing systems sometimes sort data sets with hundreds of billions of items - far too much to fit in any one computer. So they use multiple computers to sort the data. Suppose you had N computers and each has room for D items, and you have a data set with N*D/2 items to sort. How could you sort the data? Assume the data is initially in a big file, and you'll need to read the file, sort the data, then write a new file in sorted order.

    InsertionSort:

        //sort a[], an array of int
        for (int i = 1; i < a.length; i++) {
            // Push a[i] down to its sorted position in a[0..i]
            int temp = a[i];
            int k;
            for (k = i; 0 < k && temp < a[k-1]; k--)
                a[k] = a[k-1];
            a[k] = temp;
        }

    Worst-case: O(n^2) (reverse-sorted input). Best-case: O(n) (sorted input). Expected case: O(n^2); expected number of inversions: n(n-1)/4. Invariant of main loop: a[0..i-1] is sorted. Many people sort cards this way; works especially well when input is nearly sorted.

    SelectionSort:

        //sort a[], an array of int
        for (int i = 0; i < a.length; i++) {
            int m = index of minimum of a[i..];
            Swap a[i] and a[m];
        }

    Another common way for people to sort cards. Runtime: worst-case O(n^2), best-case O(n^2).
  • Improving the Performance of Bubble Sort Using a Modified Diminishing Increment Sorting
    Scientific Research and Essay Vol. 4 (8), pp. 740-744, August, 2009
    Available online at http://www.academicjournals.org/SRE
    ISSN 1992-2248 © 2009 Academic Journals

    Full Length Research Paper

    Improving the performance of bubble sort using a modified diminishing increment sorting
    Oyelami Olufemi Moses, Department of Computer and Information Sciences, Covenant University, P. M. B. 1023, Ota, Ogun State, Nigeria. E-mail: [email protected] or [email protected]. Tel.: +234-8055344658.
    Accepted 17 February, 2009

    Sorting involves rearranging information into either ascending or descending order. There are many sorting algorithms, among which is Bubble Sort. Bubble Sort is not known to be a very good sorting algorithm because it is beset with redundant comparisons. However, efforts have been made to improve the performance of the algorithm. With Bidirectional Bubble Sort, the average number of comparisons is slightly reduced, and Batcher's Sort, similar to Shellsort, also performs significantly better than Bidirectional Bubble Sort by carrying out comparisons in a novel way so that no propagation of exchanges is necessary. Bitonic Sort was also presented by Batcher, and the strong point of this sorting procedure is that it is very suitable for a hard-wired implementation using a sorting network. This paper presents a meta-algorithm called Oyelami's Sort that combines the technique of Bidirectional Bubble Sort with a modified diminishing increment sorting. The results from the implementation of the algorithm, compared with Batcher's Odd-Even Sort and Batcher's Bitonic Sort, showed that the algorithm performed better than the two in the worst case scenario. The implication is that the algorithm is faster.
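Bidirectional Bubble Sort, which the paper builds on, alternates forward and backward passes so that small elements also move quickly toward the front. A generic sketch (often called cocktail sort; our code, not Oyelami's Sort itself):

```java
import java.util.Arrays;

// Bidirectional (cocktail) bubble sort: forward passes bubble the max up,
// backward passes bubble the min down, shrinking the unsorted window.
public class CocktailSort {
    public static void sort(int[] a) {
        int lo = 0, hi = a.length - 1;
        boolean swapped = true;
        while (swapped && lo < hi) {
            swapped = false;
            for (int j = lo; j < hi; j++)       // forward pass
                if (a[j] > a[j + 1]) { swap(a, j, j + 1); swapped = true; }
            hi--;
            for (int j = hi; j > lo; j--)       // backward pass
                if (a[j - 1] > a[j]) { swap(a, j - 1, j); swapped = true; }
            lo++;
        }
    }

    private static void swap(int[] a, int i, int j) { int t = a[i]; a[i] = a[j]; a[j] = t; }

    public static void main(String[] args) {
        int[] a = {5, 1, 4, 2, 8, 0};
        sort(a);
        System.out.println(Arrays.toString(a));
    }
}
```

A "turtle" (small element near the end) reaches the front in one backward pass here, whereas plain bubble sort would need a pass per position.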
  • Bubble Sort: an Archaeological Algorithmic Analysis
    Bubble Sort: An Archaeological Algorithmic Analysis
    Owen Astrachan, Computer Science Department, Duke University, [email protected]

    Abstract. Textbooks, including books for general audiences, invariably mention bubble sort in discussions of elementary sorting algorithms. We trace the history of bubble sort, its popularity, and its endurance in the face of pedagogical assertions that code and algorithmic examples used in early courses should be of high quality and adhere to established best practices. This paper is more an historical analysis than a philosophical treatise for the exclusion of bubble sort from books and courses. However, sentiments for exclusion are supported by Knuth [17]: "In short, the bubble sort seems to have nothing to recommend it, except a catchy name and the fact that it leads to some interesting theoretical problems."

    1. Introduction. What do students remember from their first programming courses after one, five, and ten years? Most students will take only a few memories of what they have studied. As teachers of these students we should ensure that what they remember will serve them well. More specifically, if students take only a few memories about sorting from a first course, what do we want these memories to be? Should the phrase Bubble Sort be the first that springs to mind at the end of a course or several years later? There are compelling reasons for excluding discussion of bubble sort, but many texts continue to include discussion of the algorithm after years of warnings from scientists and educators.