CS 758/858: Algorithms


CS 758/858: Algorithms
Prof. Wheeler Ruml, TA Sumanta Kashyapi
http://www.cs.unh.edu/~ruml/cs758
■ 4 handouts: course info, schedule, slides, asst 1
■ 2 online handouts: programming tips, formulas
■ 1 physical sign-up sheet/laptop (for grades, piazza)

COVID
■ check your Wildcat Pass before coming to campus
■ if you have concerns, let me know

Algorithms Today
■ web: search, caching, crypto
■ networking: routing, synchronization, failover
■ machine learning: data mining, recommendation, prediction
■ bioinformatics: alignment, matching, clustering
■ hardware: design, simulation, verification
■ business: allocation, planning, scheduling
■ AI: robotics, games

Definition
Algorithm:
■ precisely defined
■ mechanical steps
■ terminates
■ input and related output
What might we want to know about it?

Why?
■ Computer scientist ≠ programmer
  ◆ understand program behavior
  ◆ have confidence in results, performance
  ◆ know when optimality is abandoned
  ◆ solve 'impossible' problems
  ◆ sets you apart (eg, Amazon.com)
■ CPUs aren't getting faster
■ Devices are getting smaller
■ Software is the differentiator
■ 'Software is eating the world' — Marc Andreessen, 2011
■ Everything is computation

The Word: Abū 'Abdallāh Muḥammad ibn Mūsā al-Khwārizmī
■ 780–850 AD; born in Uzbekistan, worked in Baghdad
■ solution of linear and quadratic equations; founder of algebra
■ popularized arabic numerals, decimal positional numbers → algorism (manipulating digits) → algorithm
■ The Compendious Book on Calculation by Completion and Balancing, 830

The Founder: Donald E. Knuth
■ invented algorithm analysis
■ The Art of Computer Programming, vol. 1, 1968
■ developed TeX, literate programming
■ many famous students
■ published in MAD magazine

This Class

Relations
■ requires 531/659 (formal thinking) and 515 (data structures, C)
■ some intentional overlap!
■ central (required for BS CS and BS DS)
■ same content, assignments, grading both semesters
■ continuous improvement!

Topics: 'Greatest Hits'
1. data structures: trees, tries, hashing
2. algorithms: divide-and-conquer, dynamic programming, greedy, graphs
3. correctness: invariants
4. complexity: time and space
5. NP-completeness: reductions
Not including:
1. (much) computability
2. (many) randomized algorithms
3. parallel algorithms
4. distributed algorithms
5. numerical algorithms, eg: crypto, linear algebra
6. geometric algorithms
7. on-line or 'run forever' algorithms
8. fancy analysis

Course Mechanics
■ names → faces
■ sign-up sheet
■ General information
  ◆ contact, books, C, due dates, collaboration, piazza.com
■ Schedule
  ◆ wildcard slot
■ Expectations
  ◆ 50/4 = 12.5; 50/3 = 16.7
  ◆ 2018: median 12, mean 12.8, stddev 5.5
■ Feedback is always needed and appreciated.
  ◆ eg, EOLQs. Try coming to my office hours!

Complexity

Sorting
■ Bubble Sort
■ Selection Sort
■ Insertion Sort
■ Shell Sort
■ Merge Sort
■ Heap Sort
■ Quick Sort
How to sort one million records?
How to sort one billion 16-bit integers?
How to sort one trillion 4-bit integers?

Counting Sort
For n numbers in the range 0 to k:
1. for i from 0 to k
2.   count[i] ← 0
3. for each input number x
4.   increment count[x]
5. for i from 0 to k
6.   do count[i] times
7.     emit i
Correctness? Complexity?
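To make the pseudocode concrete, here is a minimal C sketch of this counting sort, assuming integer keys in the range 0 to k; the function name and the small demo in main are illustrative, not part of the slides.

    #include <stdio.h>
    #include <stdlib.h>

    /* Sort n integers in the range [0, k]: count occurrences (lines 1-4),
       then emit each value i exactly count[i] times (lines 5-7). */
    void counting_sort(int a[], int n, int k)
    {
        int *count = calloc(k + 1, sizeof *count);   /* lines 1-2: count[i] <- 0 */
        if (!count) return;

        for (int j = 0; j < n; j++)                  /* lines 3-4 */
            count[a[j]]++;

        int out = 0;
        for (int i = 0; i <= k; i++)                 /* lines 5-7 */
            while (count[i]-- > 0)
                a[out++] = i;                        /* "emit i" back into the array */

        free(count);
    }

    int main(void)
    {
        int a[] = {3, 1, 4, 1, 5, 9, 2, 6};
        counting_sort(a, 8, 9);
        for (int j = 0; j < 8; j++)
            printf("%d ", a[j]);
        printf("\n");
        return 0;
    }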
Correctness
property 1: output is in sorted order
proof sketch: the output loop increments i, never decrements
property 2: output contains the same numbers as the input
invariant: for each value, remaining input + sum of counts = total
proof sketch:
  initialized/established: before line 3
  maintained: through lines 3–4
  at termination: no remaining input
  each number printed count times
  therefore, output has same numbers as input

Counting Sort
Correctness? Yes. Complexity?

Complexity
■ RAM model: no cache
■ order of growth
■ worst-case
[ try with previous slide ]

Counting Sort
For n numbers in the range 0 to k:
1. for i from 0 to k            O(k)
2.   count[i] ← 0
3. for each input number x      O(n)
Recommended publications
  • Can We Overcome the n log n Barrier for Oblivious Sorting?
    Can We Overcome the n log n Barrier for Oblivious Sorting? Wei-Kai Lin, Elaine Shi, Tiancheng Xie. Abstract: It is well-known that non-comparison-based techniques can allow us to sort n elements in o(n log n) time on a Random-Access Machine (RAM). On the other hand, it is a long-standing open question whether (non-comparison-based) circuits can sort n elements from the domain [1..2^k] with o(kn log n) boolean gates. We consider weakened forms of this question: first, we consider a restricted class of sorting where the number of distinct keys is much smaller than the input length; and second, we explore Oblivious RAMs and probabilistic circuit families, i.e., computational models that are somewhat more powerful than circuits but much weaker than RAM. We show that Oblivious RAMs and probabilistic circuit families can sort o(log n)-bit keys in o(n log n) time or o(kn log n) circuit complexity. Our algorithms work in the indivisible model, i.e., not only can they sort an array of numerical keys: if each key additionally carries an opaque ball, our algorithms can also move the balls into the correct order. We further show that in such an indivisible model, it is impossible to sort Ω(log n)-bit keys in o(n log n) time, and thus the o(log n)-bit-key assumption is necessary for overcoming the n log n barrier. Finally, after optimizing the IO efficiency, we show that even the 1-bit special case can solve open questions: our oblivious algorithms solve tight compaction and selection with optimal IO efficiency for the first time.
  • Bubble Sort: an Archaeological Algorithmic Analysis
    Bubble Sort: An Archaeological Algorithmic Analysis. Owen Astrachan, Computer Science Department, Duke University, [email protected]. Abstract: Text books, including books for general audiences, invariably mention bubble sort in discussions of elementary sorting algorithms. We trace the history of bubble sort, its popularity, and its endurance in the face of pedagogical assertions that code and algorithmic examples used in early courses should be of high quality and adhere to established best practices. This paper is more an historical analysis than a philosophical treatise for the exclusion of bubble sort from books and courses. However, sentiments for exclusion are supported by Knuth [17], "In short, the bubble sort seems to have nothing to recommend it, except a catchy name and the fact that it leads to some interesting theoretical..." 1 Introduction: What do students remember from their first programming courses after one, five, and ten years? Most students will take only a few memories of what they have studied. As teachers of these students we should ensure that what they remember will serve them well. More specifically, if students take only a few memories about sorting from a first course what do we want these memories to be? Should the phrase Bubble Sort be the first that springs to mind at the end of a course or several years later? There are compelling reasons for excluding discussion of bubble sort, but many texts continue to include discussion of the algorithm after years of warnings from scientists and educators.
  • Data Structures & Algorithms
    DATA STRUCTURES & ALGORITHMS. Tutorial 6 Questions: SORTING ALGORITHMS. Required Questions. Question 1. Many operations can be performed faster on sorted than on unsorted data. For which of the following operations is this the case? a. checking whether one word is an anagram of another word, e.g., plum and lump b. finding the minimum value c. computing an average of values d. finding the middle value (the median) e. finding the value that appears most frequently in the data Question 2. In which case is the following sorting algorithm fastest/slowest, and what is the complexity in that case? Explain. a. insertion sort b. selection sort c. bubble sort d. quick sort Question 3. Consider the sequence of integers S = {5, 8, 2, 4, 3, 6, 1, 7}. For each of the following sorting algorithms, indicate the sequence S after executing each step of the algorithm as it sorts this sequence: a. insertion sort b. selection sort c. heap sort d. bubble sort e. merge sort Question 4. Consider the sequence of integers T = {1, 9, 2, 6, 4, 8, 0, 7}. Indicate the sequence T after executing each step of the Cocktail sort algorithm (see Appendix) as it sorts this sequence. Advanced Questions. Question 5. A variant of the bubble sorting algorithm is the so-called odd-even transposition sort. Like bubble sort, this algorithm makes a total of n-1 passes through the array. Each pass consists of two phases: the first phase compares array[i] with array[i+1] and swaps them if necessary for all the odd values of i.
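    The question only spells out the odd phase; a plausible C sketch of the full odd-even transposition sort, assuming the second phase handles the even values of i, is:

        #include <stdio.h>

        static void swap_ints(int *a, int *b) { int t = *a; *a = *b; *b = t; }

        /* Odd-even transposition sort: n-1 passes, each pass an odd phase
           (compare a[i] and a[i+1] for odd i) followed by an even phase (even i). */
        void odd_even_sort(int a[], int n)
        {
            for (int pass = 0; pass < n - 1; pass++) {
                for (int i = 1; i + 1 < n; i += 2)      /* odd phase */
                    if (a[i] > a[i + 1]) swap_ints(&a[i], &a[i + 1]);
                for (int i = 0; i + 1 < n; i += 2)      /* even phase */
                    if (a[i] > a[i + 1]) swap_ints(&a[i], &a[i + 1]);
            }
        }

        int main(void)
        {
            int t[] = {1, 9, 2, 6, 4, 8, 0, 7};          /* the sequence T from Question 4 */
            odd_even_sort(t, 8);
            for (int i = 0; i < 8; i++) printf("%d ", t[i]);
            printf("\n");
            return 0;
        }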
  • 13 Basic Sorting Algorithms
    Concise Notes on Data Structures and Algorithms Basic Sorting Algorithms 13 Basic Sorting Algorithms 13.1 Introduction Sorting is one of the most fundamental and important data processing tasks. Sorting algorithm: An algorithm that rearranges records in lists so that they follow some well-defined ordering relation on values of keys in each record. An internal sorting algorithm works on lists in main memory, while an external sorting algorithm works on lists stored in files. Some sorting algorithms work much better as internal sorts than external sorts, but some work well in both contexts. A sorting algorithm is stable if it preserves the original order of records with equal keys. Many sorting algorithms have been invented; in this chapter we will consider the simplest sorting algorithms. In our discussion in this chapter, all measures of input size are the length of the sorted lists (arrays in the sample code), and the basic operation counted is comparison of list elements (also called keys). 13.2 Bubble Sort One of the oldest sorting algorithms is bubble sort. The idea behind it is to make repeated passes through the list from beginning to end, comparing adjacent elements and swapping any that are out of order. After the first pass, the largest element will have been moved to the end of the list; after the second pass, the second largest will have been moved to the penultimate position; and so forth. The idea is that large values “bubble up” to the top of the list on each pass. A Ruby implementation of bubble sort appears in Figure 1.
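    The Ruby listing referred to as Figure 1 is not reproduced in this excerpt; as a stand-in, here is a minimal sketch of the same idea in C (the language used elsewhere in this document), with an early exit when a pass makes no swaps:

        /* Bubble sort: repeated passes swap adjacent out-of-order elements;
           after pass p, the p largest values have "bubbled up" to the end. */
        void bubble_sort(int a[], int n)
        {
            for (int end = n - 1; end > 0; end--) {
                int swapped = 0;
                for (int i = 0; i < end; i++) {
                    if (a[i] > a[i + 1]) {
                        int t = a[i]; a[i] = a[i + 1]; a[i + 1] = t;
                        swapped = 1;
                    }
                }
                if (!swapped) break;    /* already sorted: stop early */
            }
        }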
  • Investigating the Effect of Implementation Languages and Large Problem Sizes on the Tractability and Efficiency of Sorting Algorithms
    International Journal of Engineering Research and Technology. ISSN 0974-3154, Volume 12, Number 2 (2019), pp. 196-203 © International Research Publication House. http://www.irphouse.com Investigating the Effect of Implementation Languages and Large Problem Sizes on the Tractability and Efficiency of Sorting Algorithms Temitayo Matthew Fagbola and Surendra Colin Thakur Department of Information Technology, Durban University of Technology, Durban 4000, South Africa. ORCID: 0000-0001-6631-1002 (Temitayo Fagbola) Abstract [1],[2]. Theoretically, effective sorting of data allows for an efficient and simple searching process to take place. This is Sorting is a data structure operation involving a re-arrangement particularly important as most merge and search algorithms of an unordered set of elements with witnessed real life strictly depend on the correctness and efficiency of sorting applications for load balancing and energy conservation in algorithms [4]. It is important to note that, the rearrangement distributed, grid and cloud computing environments. However, procedure of each sorting algorithm differs and directly impacts the rearrangement procedure often used by sorting algorithms on the execution time and complexity of such algorithm for differs and significantly impacts on their computational varying problem sizes and type emanating from real-world efficiencies and tractability for varying problem sizes. situations [5]. For instance, the operational sorting procedures Currently, which combination of sorting algorithm and of Merge Sort, Quick Sort and Heap Sort follow a divide-and- implementation language is highly tractable and efficient for conquer approach characterized by key comparison, recursion solving large sized-problems remains an open challenge. In this and binary heap’ key reordering requirements, respectively [9].
  • CSE 326 Data Structures: Sorting (Brian Curless, Spring 2008)
    CSE 326: Data Structures. Sorting. Brian Curless, Spring 2008.
    Announcements (5/9/08)
    • Project 3 is now assigned.
    • Partnerships due by 3pm – We will not assume you are in a partnership unless you sign up!
    • Homework #5 will be ready after class, due in a week.
    • Reading for this lecture: Chapter 7.
    Sorting
    • Input
      – an array A of data records
      – a key value in each data record
      – a comparison function which imposes a consistent ordering on the keys
    • Output
      – reorganize the elements of A such that for any i and j, if i < j then A[i] ≤ A[j]
    Consistent Ordering
    • The comparison function must provide a consistent ordering on the set of possible keys
      – You can compare any two keys and get back an indication of a < b, a > b, or a = b (trichotomy)
      – The comparison functions must be consistent
        • If compare(a,b) says a<b, then compare(b,a) must say b>a
        • If compare(a,b) says a=b, then compare(b,a) must say b=a
        • If compare(a,b) says a=b, then equals(a,b) and equals(b,a) must say a=b
    Why Sort?
    • Allows binary search of an N-element array in O(log N) time
    • Allows O(1) time access to kth largest element in the array for any k
    • Sorting algorithms are among the most ...
    Space
    • How much space does the sorting algorithm require in order to sort the collection of items?
      – Is copying needed?
    • In-place sorting algorithms: no copying or at most O(1) additional temp space.
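    As an illustration of such a comparison function, here is one way a consistent three-way comparator could look in C, in the style accepted by the standard library's qsort; the record type and field names are invented for the example:

        #include <stdlib.h>

        struct record {
            int  key;            /* the key value in each data record */
            char payload[32];    /* the rest of the record */
        };

        /* Three-way comparison: negative if a < b, zero if a = b, positive if a > b.
           Computing it as a difference of comparisons keeps the ordering consistent:
           if cmp(a,b) < 0 then cmp(b,a) > 0, and cmp(a,b) == 0 exactly when keys are equal. */
        static int cmp_record(const void *pa, const void *pb)
        {
            const struct record *a = pa;
            const struct record *b = pb;
            return (a->key > b->key) - (a->key < b->key);
        }

        void sort_records(struct record *recs, size_t n)
        {
            qsort(recs, n, sizeof recs[0], cmp_record);
        }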
  • Counting Sort and Radix Sort
    Counting sort and radix sort. Nick Smallbone. December 11, 2018. Radix sort is the oldest sorting algorithm which is still in general use; it predates the computer. On the left is Herman Hollerith, the inventor of radix sort. In the late 1800s, he built enormous electromechanical machines which used radix sort to tabulate US census data; he set up a company to sell the machines, and the company later became IBM. On the right you can see a 1950s IBM sorting machine running the radix sort algorithm. Radix sort is unlike the sorting algorithms we have seen so far, in that it is not based on comparisons (≤). However, it only works on certain kinds of data – it is mainly used on numerical data, and is often faster than comparison-based sorting algorithms. Before seeing radix sort, we will look at a simpler algorithm, counting sort.
    1 Counting sort #1: sorting a list of digits. Suppose you want to sort the first 100 digits of π, in ascending order, by hand. How would you do it?
    31415926535897932384626433832795028841971693993751058209749445923078164062862089986280348253421170679
    Counting sort is one approach to sorting lists of digits. It can be carried out by hand or on a computer. The idea is like this. First we count how many times each digit occurs in the input data (in this case, the first 100 digits of π):
    Digit:                  0  1   2   3   4  5  6  7   8   9
    Number of occurrences:  8  8  12  12  10  8  9  8  12  14
    We can compute this table while looking through the input list only once, by keeping a running total for each digit.
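    A small C sketch of that counting step, assuming the digits arrive as a string (only a prefix of π is shown here, purely for illustration):

        #include <stdio.h>
        #include <string.h>

        int main(void)
        {
            const char *digits = "31415926535897932384626433832795";  /* stand-in for the 100 digits */
            int count[10] = {0};
            size_t len = strlen(digits);

            /* One pass over the input, keeping a running total for each digit. */
            for (size_t i = 0; i < len; i++)
                count[digits[i] - '0']++;

            /* Emitting each digit count[d] times yields the digits in ascending order. */
            for (int d = 0; d <= 9; d++)
                for (int c = 0; c < count[d]; c++)
                    putchar('0' + d);
            putchar('\n');
            return 0;
        }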
  • Tutorial 2: Heapsort, Quicksort, Counting Sort, Radix Sort
    Tutorial 2: Heapsort, Quicksort, Counting Sort, Radix Sort. Mayank Saksena. September 20, 2006.
    1 Heapsort. We review Heapsort, and prove some loop invariants for it. For further information, see Chapter 6 of Introduction to Algorithms.
    HEAPSORT(A)
    1  BUILD-MAX-HEAP(A)
    2  for i = length(A) downto 2
    3      swap A[1] and A[i]
    4      heap-size(A) = heap-size(A) − 1
    5      MAX-HEAPIFY(A, 1)
    BUILD-MAX-HEAP(A)
    1  heap-size(A) = length(A)
    2  for i = length(A)/2 downto 1
    3      MAX-HEAPIFY(A, i)
    MAX-HEAPIFY(A, i)
    1  l = LEFT(i)
    2  r = RIGHT(i)
    3  if l ≤ heap-size(A) and A[l] > A[i] then largest = l
    4  else largest = i
    5  if r ≤ heap-size(A) and A[r] > A[largest] then largest = r
    6  if largest ≠ i
    7      swap A[i] and A[largest]
    8      MAX-HEAPIFY(A, largest)
    Loop invariants. First, assume that MAX-HEAPIFY(A, i) is correct, i.e., that it makes the subtree with root i a max-heap. Under this assumption, we prove that BUILD-MAX-HEAP(A) is correct, i.e., that it makes A a max-heap. We show: at the start of iteration i of the for-loop of BUILD-MAX-HEAP(A) (line 2), each of the nodes i+1, i+2, ..., n is the root of a max-heap.
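    A compact C version of the same procedure, using 0-based indexing rather than the 1-based pseudocode above, might look like this (function names are mine, not from the tutorial):

        static void swap_keys(int *a, int *b) { int t = *a; *a = *b; *b = t; }

        /* MAX-HEAPIFY: sift a[i] down until the subtree rooted at i is a max-heap. */
        static void max_heapify(int a[], int heap_size, int i)
        {
            int l = 2 * i + 1, r = 2 * i + 2, largest = i;
            if (l < heap_size && a[l] > a[largest]) largest = l;
            if (r < heap_size && a[r] > a[largest]) largest = r;
            if (largest != i) {
                swap_keys(&a[i], &a[largest]);
                max_heapify(a, heap_size, largest);
            }
        }

        void heapsort_ints(int a[], int n)
        {
            for (int i = n / 2 - 1; i >= 0; i--)    /* BUILD-MAX-HEAP */
                max_heapify(a, n, i);
            for (int i = n - 1; i >= 1; i--) {      /* main HEAPSORT loop */
                swap_keys(&a[0], &a[i]);            /* move current maximum to the end */
                max_heapify(a, i, 0);               /* heap shrinks by one */
            }
        }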
  • Sorting Algorithm
    Sorting algorithm. In computer science, a sorting algorithm is an algorithm that puts elements of a list in a certain order. The most-used orders are numerical order and lexicographical order. Efficient sorting is important for optimizing the use of other algorithms (such as search and merge algorithms) that require sorted lists to work correctly; it is also often useful for canonicalizing data and for producing human-readable output. More formally, the output must satisfy two conditions: 1. The output is in nondecreasing order (each element is no smaller than the previous element according to the desired total order); 2. The output is a permutation, or reordering, of the input. Since the dawn of computing, the sorting problem has attracted a great deal of research, perhaps due to the complexity of solving it efficiently despite its simple, familiar statement. For example, bubble sort was analyzed as early as 1956.[1] Although many consider it a solved problem, useful new sorting algorithms are still being invented (for example, library sort was first published in 2004). Sorting algorithms are prevalent in introductory computer science classes, where the abundance of algorithms for the problem provides a gentle introduction to a variety of core algorithm concepts, such as big O notation, divide and conquer algorithms, data structures, randomized algorithms, best, worst and average case analysis, time-space tradeoffs, and lower bounds. Classification. Sorting algorithms used in computer science are often classified by: • Computational complexity (worst, average and best behaviour) of element comparisons in terms of the size of the list (n). For typical sorting algorithms good behavior is O(n log n) and bad behavior is O(n²).
  • Sorting Algorithms
    Sorting Algorithms. Next to storing and retrieving data, sorting of data is one of the more common algorithmic tasks, with many different ways to perform it. Whenever we perform a web search and/or view statistics at some website, the presented data has most likely been sorted in some way. In this lecture and in the following lectures we will examine several different ways of sorting. The following are some reasons for investigating several of the different algorithms (as opposed to one or two, or the "best" algorithm). • There exist very simply understood algorithms which, although for large data sets behave poorly, perform well for small amounts of data, or when the range of the data is sufficiently small. • There exist sorting algorithms which have shown to be more efficient in practice. • There are still yet other algorithms which work better in specific situations; for example, when the data is mostly sorted, or unsorted data needs to be merged into a sorted list (for example, adding names to a phonebook). 1 Counting Sort. Counting sort is primarily used on data that is sorted by integer values which fall into a relatively small range (compared to the amount of random access memory available on a computer). Without loss of generality, we can assume the range of integer values is [0 : m], for some m ≥ 0. Now given array a[0 : n − 1] the idea is to define an array of lists l[0 : m], scan a, and, for i = 0, 1, . . . , n − 1 store element a[i] in list l[v(a[i])], where v is the function that computes an array element's sorting value.
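    A short C sketch of this array-of-lists variant, with integer keys in [0, m] and v taken to be the identity (names are illustrative):

        #include <stdlib.h>

        struct node { int value; struct node *next; };

        /* Scan a, appending each element to list l[v(a[i])], then concatenate
           the lists in key order back into a.  Appending keeps equal keys in
           their original order, so this variant is stable. */
        void list_counting_sort(int a[], int n, int m)
        {
            struct node **head = calloc(m + 1, sizeof *head);
            struct node **tail = calloc(m + 1, sizeof *tail);
            struct node *nodes = malloc(n * sizeof *nodes);
            if (!head || !tail || !nodes) goto done;

            for (int i = 0; i < n; i++) {
                struct node *nd = &nodes[i];
                nd->value = a[i];
                nd->next = NULL;
                int key = a[i];                      /* v(a[i]) = a[i] here */
                if (tail[key]) tail[key]->next = nd; else head[key] = nd;
                tail[key] = nd;
            }
            int out = 0;
            for (int key = 0; key <= m; key++)
                for (struct node *nd = head[key]; nd; nd = nd->next)
                    a[out++] = nd->value;
        done:
            free(head); free(tail); free(nodes);
        }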
  • Evaluation of Sorting Algorithms, Mathematical and Empirical Analysis of Sorting Algorithms
    International Journal of Scientific & Engineering Research, Volume 8, Issue 5, May-2017, ISSN 2229-5518. Evaluation of Sorting Algorithms, Mathematical and Empirical Analysis of Sorting Algorithms. Sapram Choudaiah, P Chandu Chowdary, M Kavitha. ABSTRACT: Sorting is an important data structure in many real life applications. A number of sorting algorithms are in existence till date. This paper continues the earlier thought of evolutionary study of sorting problem and sorting algorithms, concluded with the chronological list of early pioneers of sorting problem or algorithms. Later in the study a graphical method has been used to present an evolution of sorting problem and sorting algorithm on the time line. An extensive analysis has been done compared with the traditional mathematical methods of Bubble Sort, Selection Sort, Insertion Sort, Merge Sort, Quick Sort. Observations have been obtained on comparing with the existing approaches of All Sorts. An "Empirical Analysis" consists of rigorous complexity analysis by various sorting algorithms, in which comparison and real swapping of all the variables are calculated. All algorithms were tested on random data of various ranges from small to large. It is an attempt to compare the performance of various sorting algorithms, with the aim of comparing their speed when sorting integer inputs. The empirical data obtained by using the program reveals that the Quick sort algorithm is fastest and Bubble sort is slowest. Keywords: Bubble Sort, Insertion Sort, Quick Sort, Merge Sort, Selection Sort, Heap Sort, CPU Time. Introduction: In spite of plentiful literature and research in sorting algorithmic domain there is mess found in documentation as far as credential concern [2]. ... more dimension to student for thinking [4]. Whereas, this thinking become a mark of respect to all our ...
  • 6.006 Introduction to Algorithms Spring 2008
    MIT OpenCourseWare, http://ocw.mit.edu. 6.006 Introduction to Algorithms, Spring 2008. For information about citing these materials or our Terms of Use, visit: http://ocw.mit.edu/terms.
    Lecture 11: Sorting IV: Stable Sorting, Radix Sort.
    Lecture Overview: Stable Sorting; Radix Sort; Quick Sort (not officially a part of 6.006); Sorting Races.
    Stable Sorting: preserves input order among equal elements. Counting sort is stable; merge sort is stable. [Figure 1: Stability]
    Selection Sort and Heap: find the maximum element and put it at the end of the array (swap with the element at the end of the array). NOT STABLE! [Figure 2: Selection Sort Instability]
    Radix Sort
    • Herman Hollerith card-sorting machine for the 1890 census.
    • Digit-by-digit sort by mechanical machine: 1. Examine a given column of each card in a deck. 2. Distribute the card into one of 10 bins. 3. Gather cards bin by bin, so cards with the first place punched are on top of cards with the second place punched, etc. [Figure 3: Punch Card (80 columns, 10 places)]
    MSB vs. LSB? Sort on the most significant digit first or the least significant digit first? MSB strategy: cards in 9 of 10 bins must be put aside, leading to a large number of intermediate piles. LSB strategy: can gather sorted cards in bins appropriately to create a deck! [Figure 4: Example of Radix Sort] The digit sort needs to be stable, else we will get the wrong result!
    Analysis: Assume counting sort is the auxiliary stable sort.
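    A compact C sketch of LSD radix sort built this way, using a stable counting sort for each decimal digit (non-negative keys assumed):

        #include <stdlib.h>
        #include <string.h>

        /* Sort non-negative ints by repeatedly applying a stable counting sort,
           one decimal digit per pass, least significant digit first. */
        void radix_sort(int a[], int n)
        {
            int *out = malloc(n * sizeof *out);
            if (!out) return;
            int max = 0;
            for (int i = 0; i < n; i++)
                if (a[i] > max) max = a[i];

            for (int exp = 1; max / exp > 0; exp *= 10) {
                int count[10] = {0};
                for (int i = 0; i < n; i++)              /* count occurrences of this digit */
                    count[(a[i] / exp) % 10]++;
                for (int d = 1; d < 10; d++)             /* prefix sums give final positions */
                    count[d] += count[d - 1];
                for (int i = n - 1; i >= 0; i--)         /* back-to-front keeps the sort stable */
                    out[--count[(a[i] / exp) % 10]] = a[i];
                memcpy(a, out, n * sizeof *a);
            }
            free(out);
        }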