Sorting Algorithms, Ch. 6 - 8

The sorting problem (slightly modified definition):

input: a collection of n data items <a1, a2, ..., an>, where each data item has a key drawn from a linearly ordered set (e.g., ints, chars)
output: a permutation (reordering) <a'1, a'2, ..., a'n> of the input sequence such that a'1 ≤ a'2 ≤ ... ≤ a'n

• In practice, one usually sorts "records" according to their key (the non-key data is called satellite data).
• If the records are large, we may sort an array of pointers instead.

Two properties of sorting algorithms:
• A sorting algorithm is in place if only a constant number of elements of the input array are ever stored outside the array.
• A sorting algorithm is comparison based if the only operation we can perform on keys is to compare two keys.

Comparison-Based Sorts: Running Time

                worst-case   average-case   best-case    in place
Insertion Sort  O(n^2)       O(n^2)         O(n)         yes
Merge Sort      O(n lg n)    O(n lg n)      O(n lg n)    no
Heap Sort       O(n lg n)    O(n lg n)      O(n lg n)    yes
Quick Sort      O(n^2)       O(n lg n)      O(n lg n)    yes

(Binary) Heaps

The heap concept:
• a complete binary tree, except that it may be missing some rightmost leaves on the bottom level
• each node contains a key
• the values in the nodes satisfy the heap property

Binary tree: an array implementation
• the root is A[1]
• for element A[i]:
  - the left child is in position A[2i]
  - the right child is in position A[2i + 1]
  - the parent is in A[⌊i/2⌋]

Heap as an array implementation
• store the heap as a binary tree in an array
• heapsize is the number of elements in the heap
• length is the number of elements in the array (note that the current length may be greater than the current heapsize)
• the height of the heap is the number of edges on the longest leaf-to-root path

Max-Heap

In the array representation of a max-heap, the root of the tree is in A[1], and given the index i of a node:

Parent(i): return ⌊i/2⌋     LeftChild(i): return 2i     RightChild(i): return 2i + 1

Max-heap property: A[Parent(i)] ≥ A[i]

Example max-heap (n = 11, height = 3):

index  1   2   3   4   5   6   7   8   9   10  11
key    20  18  15  11  10  9   6   2   4   5   3

Min-Heap

Min-heap property: A[Parent(i)] ≤ A[i]. Parent, LeftChild, and RightChild are computed exactly as for a max-heap. Min-heaps are commonly used for priority queues in event-driven simulators.

Example min-heap (n = 11, height = 3):

index  1   2   3   4   5   6   7   8   9   10  11
key    2   3   5   4   6   9   11  10  15  20  18
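To make the index arithmetic concrete, here is a minimal Python sketch of the array representation (the helper names are our own, not from the text; it keeps the 1-based convention used above by leaving slot 0 of the list unused):

    def parent(i):        # A[i]'s parent is A[i // 2]
        return i // 2

    def left_child(i):    # A[i]'s left child is A[2i]
        return 2 * i

    def right_child(i):   # A[i]'s right child is A[2i + 1]
        return 2 * i + 1

    def is_max_heap(A, heapsize):
        # Check the max-heap property A[parent(i)] >= A[i] for i = 2..heapsize.
        return all(A[parent(i)] >= A[i] for i in range(2, heapsize + 1))

    # The max-heap example above; A[0] is an unused placeholder.
    A = [None, 20, 18, 15, 11, 10, 9, 6, 2, 4, 5, 3]
    assert is_max_heap(A, 11)

The same check with the inequality reversed verifies the min-heap example.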
Heap Sort

Input: an n-element array A (unsorted).
Output: the n-element array A in sorted order, smallest to largest.

HeapSort(A)
1. Build-Max-Heap(A)              /* put all elements in heap */
2. for i ← length(A) downto 2 do
3.    swap A[1] ↔ A[i]            /* puts max in ith array position */
4.    heap-size[A] ← heap-size[A] - 1
5.    Max-Heapify(A, 1)           /* restore heap property */

Running time, line by line:
Line 1: the running time of Build-Max-Heap (determined below)
Line 2: c2 · length(A) = c2 · n
Line 3: c3(n - 1)
Line 4: c4(n - 1)
Line 5: (n - 1) times the running time of Max-Heapify (determined below)

We'll see that Build-Max-Heap(A) takes O(|A|) = O(n) time and that Max-Heapify(A, 1) takes O(lg |A|) = O(lg n) time. The running time of HeapSort is therefore:
• 1 call to Build-Max-Heap ⇒ O(n) time
• n - 1 calls to Max-Heapify, each taking O(lg n) time ⇒ O(n lg n) time

Heapify: Maintaining the Heap Property

• Assumption: the subtrees rooted at the left and right children of A[i] (i.e., the subtrees whose roots are A[2i] and A[2i + 1]) are heaps (i.e., obey the max-heap property)...
• ...but the subtree rooted at A[i] might not be a heap (that is, A[i] may be smaller than its left or right child).
• Max-Heapify(A, i) causes the value at A[i] to "float down" in the heap so that the subtree rooted at A[i] becomes a heap.

Max-Heapify(A, i)
1. left ← 2i; right ← 2i + 1      /* indices of left & right children of A[i] */
2. largest ← i
3. if left ≤ heapsize(A) and A[left] > A[i] then
4.    largest ← left
5. if right ≤ heapsize(A) and A[right] > A[largest] then
6.    largest ← right
7. if largest ≠ i then
8.    swap(A[i], A[largest])
9.    Max-Heapify(A, largest)

Max-Heapify: Running Time
• every line takes Θ(1) time except the recursive call
• in the worst case, the last row of the binary tree is half empty, so the children's subtrees have size at most 2n/3

So we get the recurrence T(n) ≤ T(2n/3) + Θ(1), which, by case 2 of the master theorem, has the solution T(n) = O(lg n). (Equivalently, Max-Heapify takes O(h) time when node A[i] has height h in the heap.)

Build-Max-Heap

Intuition: use Max-Heapify in a bottom-up manner to convert A into a heap.
• Leaves are already heaps, and elements A[⌊n/2⌋ + 1 .. n] are all leaves.
• Start at the parents of the leaves, then the grandparents of the leaves, etc.

Build-Max-Heap(A)
1. heapsize(A) ← length(A)
2. for i ← ⌊length(A)/2⌋ downto 1 do
3.    Max-Heapify(A, i)

Running Time of Build-Max-Heap
• about n/2 calls to Max-Heapify (O(n) calls)
• each call takes O(lg n) time
• ⇒ O(n lg n) time total (note: this bound is not tight)
• The book shows that Build-Max-Heap in fact runs in O(n) time.

Correctness of Build-Max-Heap

Loop invariant: at the start of each iteration of the for loop, each node i + 1, i + 2, ..., n is the root of a max-heap.
• Initialization: i = ⌊n/2⌋. Each node ⌊n/2⌋ + 1, ⌊n/2⌋ + 2, ..., n is a leaf, trivially satisfying the max-heap property.
• Maintenance: at each iteration, the children of node i are numbered higher than i, so by the loop invariant they are the roots of max-heaps. This is exactly the condition required for Max-Heapify(A, i) to make node i the root of a max-heap; decrementing i then reestablishes the invariant.
• Termination: i = 0. By the loop invariant, each of the nodes 1, 2, ..., n is the root of a max-heap; in particular, node 1 is.

Inserting Heap Elements

Inserting an element into a heap:
• increment heapsize and "add" the new element at the end of the array
• walk up the tree from the new leaf toward the root, shifting values down; write the input key into place as soon as a parent key at least as large as the input key is found

Max-Heap-Insert(A, key)
1. heapsize(A) ← heapsize(A) + 1
2. i ← heapsize(A)
3. while i > 1 and A[Parent(i)] < key do
4.    A[i] ← A[Parent(i)]
5.    i ← Parent(i)
6. A[i] ← key

Running time of Max-Heap-Insert: O(lg n), the time to traverse the leaf-to-root path (height = O(lg n)).
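The pseudocode above translates almost line for line into Python. The following is a sketch under the same conventions as before (1-based heap stored in a list with slot 0 unused; the function names are direct transliterations, not a library API):

    def max_heapify(A, i, heapsize):
        # Float A[i] down until the subtree rooted at i is a max-heap.
        # Assumes the subtrees rooted at 2i and 2i + 1 already are max-heaps.
        left, right = 2 * i, 2 * i + 1
        largest = i
        if left <= heapsize and A[left] > A[i]:
            largest = left
        if right <= heapsize and A[right] > A[largest]:
            largest = right
        if largest != i:
            A[i], A[largest] = A[largest], A[i]
            max_heapify(A, largest, heapsize)   # O(h) for a node of height h

    def build_max_heap(A, n):
        # A[n//2 + 1 .. n] are leaves, hence already heaps; fix the rest bottom-up.
        for i in range(n // 2, 0, -1):
            max_heapify(A, i, n)

    def heapsort(A):
        # Sort A[1..n] in place, smallest to largest (A[0] is unused).
        n = len(A) - 1
        build_max_heap(A, n)
        heapsize = n
        for i in range(n, 1, -1):
            A[1], A[i] = A[i], A[1]      # move the current max into position i
            heapsize -= 1
            max_heapify(A, 1, heapsize)  # restore the heap property

    def max_heap_insert(A, key):
        # Grow the heap by one slot, then walk up from the new leaf.
        A.append(None)
        i = len(A) - 1
        while i > 1 and A[i // 2] < key:
            A[i] = A[i // 2]             # shift the smaller parent down
            i //= 2
        A[i] = key

    A = [None, 4, 1, 3, 2, 16, 9, 10, 14, 8, 7]
    heapsort(A)
    assert A[1:] == [1, 2, 3, 4, 7, 8, 9, 10, 14, 16]

Note that max_heap_insert here assumes the heap occupies the whole list and so grows the list itself, whereas the pseudocode assumes length(A) leaves spare room beyond heapsize.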
Priority Queues

Definition: a priority queue is a data structure for maintaining a set S of elements, each with an associated key. A max-priority-queue gives priority to keys with larger values and supports the following operations:
1. insert(S, x): inserts the element x into the set S
2. max(S): returns the element of S with the largest key
3. extract-max(S): removes and returns the element of S with the largest key
4. increase-key(S, x, k): increases the value of element x's key to the new value k (assuming k is at least as large as the current key's value)

Priority Queues: An Application for Heaps

One application of max-priority queues is to schedule jobs on a shared processor. We need to be able to:
• check the current job's priority: Heap-Maximum(A)
• remove a job from the queue: Heap-Extract-Max(A)
• insert new jobs into the queue: Max-Heap-Insert(A, key)
• increase the priority of a job: Heap-Increase-Key(A, i, key)

Initialize the priority queue by running Build-Max-Heap on an array A; A[1] holds the maximum value after this step.
• Heap-Maximum(A) simply returns the value of A[1].
• Heap-Extract-Max(A) saves A[1] and then, as in Heap-Sort, moves the item in A[heapsize] to A[1], decrements heapsize, and uses Max-Heapify to restore the heap property.
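For comparison with a production implementation: Python's standard heapq module maintains a binary min-heap over a list, and a common way to get max-priority-queue behavior for a scheduling workload like the one above is to store negated keys. A small illustrative sketch (the job names and priorities are invented):

    import heapq

    # (negated priority, job name): heapq is a min-heap, so the most
    # negative entry, i.e., the highest-priority job, surfaces first.
    jobs = [(-3, "backup"), (-10, "page-fault handler"), (-5, "compile")]
    heapq.heapify(jobs)                       # analogue of Build-Max-Heap, O(n)

    heapq.heappush(jobs, (-8, "interrupt"))   # analogue of Max-Heap-Insert, O(lg n)

    neg, name = jobs[0]                       # analogue of Heap-Maximum: peek at the root
    print("next up:", name, "at priority", -neg)

    while jobs:                               # analogue of Heap-Extract-Max, O(lg n) each
        neg, name = heapq.heappop(jobs)
        print("running", name, "at priority", -neg)

heapq has no direct increase-key operation; the usual workaround is to push a fresh entry for the job and lazily discard stale entries as they are popped.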
QuickSort (Ch. 7)

Like Merge-Sort, QuickSort is a divide-and-conquer algorithm.
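As a preview of Ch. 7, here is a minimal sketch of that divide-and-conquer structure (for clarity this version allocates new lists; the version in Ch. 7 instead partitions the array in place around a pivot, which is what makes QuickSort in place in the table above):

    def quicksort(A):
        # Divide: pick a pivot and split into smaller / equal / larger keys.
        if len(A) <= 1:
            return A
        pivot = A[len(A) // 2]
        smaller = [x for x in A if x < pivot]
        equal = [x for x in A if x == pivot]
        larger = [x for x in A if x > pivot]
        # Conquer recursively; combine by concatenation.
        return quicksort(smaller) + equal + quicksort(larger)

    assert quicksort([5, 2, 9, 1, 5, 6]) == [1, 2, 5, 5, 6, 9]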