<<

CSC 308

(ALGORITHM DESIGN AND ANALYSIS)

1 DISCLAIMER

The contents of this document are intended for learning purposes at the undergraduate level. The materials are drawn from different sources, including the internet, and the contributor does not in any way claim authorship or ownership of them.

2 Introduction to the Course

This course applies design and analysis techniques to numeric and non-numeric algorithms that act on data structures. Design is emphasized so that the student will be able to develop new algorithms. Analysis is concerned with the resources an algorithm must use to reach a solution. Only theoretical techniques of analysis are covered. Topics include: introduction to algorithms; basic algorithmic analysis (asymptotic complexity, asymptotic analysis of upper and average complexity bounds, sorting and searching); standard complexity classes; time and space tradeoffs in algorithm analysis; recursive algorithms; algorithmic strategies; and fundamental algorithms: numerical algorithms, sequential and binary search algorithms, sorting algorithms, binary search trees, hash tables, and graphs and their representation.

3 Objectives

By the end of the semester, the student should be able to:
(1) describe algorithms and the types of algorithms;
(2) describe and recognize the use of algorithm analysis techniques such as asymptotic analysis of upper and average complexity bounds;
(3) recognize the use of different types of algorithms, such as sorting, searching and recursive algorithms, and use these algorithms/methods in solving problems;
(4) determine asymptotic growth rates for algorithms;
(5) apply graphs, their theorems and their representations to algorithm analysis.

4 Course Overview

Basic algorithmic analysis: asymptotic analysis of upper and average complexity bounds; standard complexity classes; time and space tradeoffs in algorithm analysis; recursive algorithms. Algorithmic strategies. Fundamental computing algorithms: numerical algorithms, sequential and binary search algorithms, sorting algorithms, binary search trees, hash tables, and graphs and their representation.

5 Course Plan

Week 1: Introduction, basic algorithm analysis, algorithm complexity, comparing algorithms
Week 2: Types of algorithms
Week 3: Asymptotic complexity; asymptotic analysis of upper and average complexity bounds
Week 4: Standard complexity classes; time and space tradeoffs in algorithm analysis
Week 5: Revision and 1st test
Week 6: Searching (binary search) and sorting algorithms
Week 7: Recursive algorithms
Week 8: Other fundamental computing algorithms: numerical algorithms, sequential algorithms
Week 9: Binary search trees, hash tables
Week 10: Graphs and their representation
Week 11: Revision and 2nd test

6 Lecture One: Introduction

It is very common for beginning computer science students to compare their programs with one another. You may also have noticed that it is common for computer programs to look very similar, especially the simple ones. An interesting question often arises: when two programs solve the same problem but look different, is one program better than the other? In order to answer this question, we need to remember that there is an important difference between a program and the underlying algorithm that the program is representing. An algorithm is a generic, step-by-step list of instructions for solving a problem. It is a method for solving any instance of the problem such that, given a particular input, the algorithm produces the desired result. A program, on the other hand, is an algorithm that has been encoded into some programming language. There may be many programs for the same algorithm, depending on the programmer and the programming language being used.

7 Algorithms

What is an algorithm?

An algorithm is a finite set of precise instructions for performing a computation or for solving a problem. It can also be defined as step-by-step instructions to accomplish a task or solve a problem.

The format and level of detail depend on the user.

8 Algorithms

Properties of algorithms:

• Input from a specified set
• Output from a specified set (solution)
• Definiteness of every step in the computation
• Correctness of output for every possible input
• Finiteness of the number of calculation steps
• Effectiveness of each calculation step
• Generality for a class of problems

9 Algorithm Examples

We will use pseudocode to specify algorithms; it is loosely reminiscent of Basic and Pascal.

Example: an algorithm that finds the maximum element in a finite sequence.

procedure max(a1, a2, …, an: integers)
  max := a1
  for i := 2 to n
    if max < ai then max := ai
  {max is the largest element}
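For comparison, a rough Python rendering of the same idea is shown below; the function name find_max and the sample list are our own illustration, not part of the pseudocode.

def find_max(values):
    # Assumes the sequence contains at least one element.
    maximum = values[0]
    # Compare every remaining element against the current maximum.
    for value in values[1:]:
        if maximum < value:
            maximum = value
    return maximum

print(find_max([3, 8, 1, 9, 4]))   # prints 9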

10 Algorithm Examples

Another example: a linear search algorithm, that is, an algorithm that linearly searches a sequence for a particular element.

procedure linear_search(x: integer; a1, a2, …, an: integers)
  i := 1
  while (i ≤ n and x ≠ ai)
    i := i + 1
  if i ≤ n then location := i
  else location := 0
  {location is the subscript of the term that equals x, or is zero if x is not found}
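A similarly rough Python sketch follows; it mirrors the pseudocode by returning the 1-based position of x, or 0 when x is not present (the sample list is our own).

def linear_search(x, a):
    # Mirrors the pseudocode: i runs from 1 to n, comparing x with a_i.
    i = 1
    while i <= len(a) and x != a[i - 1]:
        i = i + 1
    if i <= len(a):
        return i          # 1-based location of x
    return 0              # x was not found

print(linear_search(31, [54, 26, 93, 17, 77, 31]))   # prints 6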

11 Analysis of Algorithms

In computer science, the analysis of algorithms is the determination of the computational complexity of algorithms, that is, the amount of time, storage and/or other resources necessary to execute them. Usually, this involves determining a function that relates the length of an algorithm's input to the number of steps it takes (its time complexity) or the number of storage locations it uses (its space complexity). An algorithm is said to be efficient when this function's values are small. Since different inputs of the same length may cause the algorithm to behave differently, the function describing its performance is usually an upper bound on the actual performance, determined from the worst-case inputs to the algorithm.

12 The term "analysis of algorithms" was coined by Donald Knuth. Algorithm analysis is an important part of the broader computational complexity theory, which provides theoretical estimates for the resources needed by any algorithm that solves a given computational problem. These estimates provide insight into reasonable directions of search for efficient algorithms.

13 Efficiency of Algorithms

Different algorithms for the same task may have different efficiency.

We must therefore be able to evaluate the efficiency (speed) of computer algorithms.

14 Complexity = Speed = Efficiency

Complexity is the number of basic operations required by an algorithm. Will two algorithms with the same complexity take the same actual amount of time to run?

15 Complexity Examples

Program A:
  Do 10 times
    Read x, y, z
    x = y + z
    Print x

Program B:
  Read x, y, z
  x = y + z
  Print x

16 Complexity Examples

Program A:
  Do 10 times
    Read x, y, z
    x = y + z
    Print x

Program B:
  Do n times
    Read x, y, z
    x = y + z
    Print x
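The same two patterns can be sketched in Python as follows (the function names and sample data are our own illustration): the first loop always does a fixed amount of work, while the second does work proportional to the input size n.

def fixed_work(triples):
    # Executes the body exactly 10 times, regardless of how much data there is.
    for x, y, z in triples[:10]:
        print(x + y + z)

def scaled_work(triples):
    # Executes the body once per triple, so the work grows with n = len(triples).
    for x, y, z in triples:
        print(x + y + z)

sample = [(1, 2, 3)] * 12
fixed_work(sample)    # 10 iterations
scaled_work(sample)   # 12 iterations, and more for larger inputs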

17 Complexity

In general, we are not so much interested in the time and space complexity for small inputs.

For example, while the difference in time complexity between linear and binary search is meaningless for a sequence with n = 10, it is gigantic for n = 2^30.
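To make the gap concrete, the sketch below compares rough worst-case comparison counts, assuming about n comparisons for linear search and about ⌊log2 n⌋ + 1 for binary search; the exact constants are our own illustration, not part of the text above.

import math

for n in (10, 2 ** 30):
    linear_steps = n
    binary_steps = math.floor(math.log2(n)) + 1
    print(n, linear_steps, binary_steps)
    # n = 10:   10 vs 4 comparisons
    # n = 2^30: 1,073,741,824 vs 31 comparisons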

18 Complexity

For example, let us assume two algorithms A and B that solve the same class of problems. The time complexity of A is 5,000n, while that of B is 1.1^n for an input with n elements. For n = 10, A requires 50,000 steps, but B only 3, so B seems to be superior to A. For n = 1,000, however, A requires 5,000,000 steps, while B requires 2.5 × 10^41 steps.

19 Complexity

Comparison: time complexity of algorithms A and B

Input size n    Algorithm A: 5,000n    Algorithm B: 1.1^n
10              50,000                 3
100             500,000                13,781
1,000           5,000,000              2.5 × 10^41
1,000,000       5 × 10^9               4.8 × 10^41392
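The figures in the table can be checked with a short script like the one below (our own illustration). Because 1.1**n overflows a float for large n, algorithm B's step count is computed through its base-10 logarithm.

import math

for n in (10, 100, 1000, 1000000):
    steps_a = 5000 * n
    # 1.1**n overflows for n = 1,000,000, so use log10(1.1**n) = n * log10(1.1).
    log10_b = n * math.log10(1.1)
    exponent = math.floor(log10_b)
    mantissa = 10 ** (log10_b - exponent)
    # The table rounds 1.1**10 ≈ 2.6 up to 3.
    print(f"n={n}: A={steps_a:,}  B≈{mantissa:.1f}e{exponent}")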

20 Complexity

This means that algorithm B cannot be used for large inputs, while algorithm A is still feasible.

So what is important is the growth of the complexity functions.

The growth of time and space complexity with increasing input size n is a suitable measure for the comparison of algorithms.

21 Comparing Algorithms

We want to choose the "best" algorithm for a task. Generally, best = fastest, but there are other considerations:
– Hardware
– Size of the data set
– Data structure
A standard measurement is therefore needed.

22 The Need for Analysis

Algorithms are often quite different from one another, even though their objective is the same. For example, we know that a set of numbers can be sorted using different algorithms. The number of comparisons performed by one algorithm may differ from that of another for the same input; hence, the time complexities of those algorithms may differ. At the same time, we need to calculate the memory space required by each algorithm.

23 Analysis of an algorithm is the process of analyzing the problem-solving capability of the algorithm in terms of the time and size required (the size of memory needed for storage while it runs). However, the main concern of analysis of algorithms is the required time, or performance. Generally, we perform the following types of analysis (a small illustration follows the list):

• Worst case − the maximum number of steps taken on any instance of size n.

• Best case − the minimum number of steps taken on any instance of size n.

• Average case − the average number of steps taken on any instance of size n.
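As a rough illustration of best-case versus worst-case behaviour (our own example, which counts one comparison per element examined during a linear search):

def count_linear_search_comparisons(x, a):
    # Counts how many elements are examined before x is found (or the list ends).
    count = 0
    for value in a:
        count += 1
        if value == x:
            break
    return count

data = [54, 26, 93, 17, 77, 31, 44, 55, 20]
print(count_linear_search_comparisons(54, data))   # best case: 1 comparison
print(count_linear_search_comparisons(99, data))   # worst case: 9 comparisons (x absent)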

24 To solve a problem, we need to consider space as well as time complexity, because the program may run on a system where memory is limited but adequate space is available, or vice versa. In this context, compare bubble sort and merge sort: bubble sort does not require additional memory, whereas merge sort requires additional space. Although the time complexity of bubble sort is higher than that of merge sort, we may need to apply bubble sort if the program has to run in an environment where memory is very limited.

25 Bubble Sort

def bubbleSort(alist):
    # Each pass bubbles the largest remaining element to the end of the list.
    for passnum in range(len(alist) - 1, 0, -1):
        for i in range(passnum):
            if alist[i] > alist[i + 1]:
                # Swap adjacent elements that are out of order.
                temp = alist[i]
                alist[i] = alist[i + 1]
                alist[i + 1] = temp

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
bubbleSort(alist)
print(alist)

26 Merge Sort

Conceptually, a merge sort works as follows:

• Divide the unsorted list into n sublists, each containing 1 element (a list of 1 element is considered sorted). • Repeatedly merge sublists to produce new sorted sublists until there is only 1 sublist remaining. This will be the sorted list.

Merge sort is a recursive algorithm that continually splits a list in half. If the list is empty or has one item, it is sorted by definition (the base case). If the list has more than one item, we split the list and recursively invoke a merge sort on both halves. Once the two halves are sorted, the fundamental operation, called a merge, is performed. Merging is the process of taking two smaller sorted lists and combining them together into a single, sorted, new list.

27

def mergeSort(alist):
    print("Splitting ", alist)
    if len(alist) > 1:
        # Split the list in half and sort each half recursively.
        mid = len(alist) // 2
        lefthalf = alist[:mid]
        righthalf = alist[mid:]

        mergeSort(lefthalf)
        mergeSort(righthalf)

        # Merge the two sorted halves back into alist.
        i = 0
        j = 0
        k = 0
        while i < len(lefthalf) and j < len(righthalf):
            if lefthalf[i] < righthalf[j]:
                alist[k] = lefthalf[i]
                i = i + 1
            else:
                alist[k] = righthalf[j]
                j = j + 1
            k = k + 1

        # Copy any elements left over in the left half.
        while i < len(lefthalf):
            alist[k] = lefthalf[i]
            i = i + 1
            k = k + 1

        # Copy any elements left over in the right half.
        while j < len(righthalf):
            alist[k] = righthalf[j]
            j = j + 1
            k = k + 1
    print("Merging ", alist)

alist = [54, 26, 93, 17, 77, 31, 44, 55, 20]
mergeSort(alist)
print(alist)

31 While comparing two sublists for merging, the first element of each list is considered. When sorting in ascending order, the element of lesser value becomes a new element of the sorted list. This procedure is repeated until both of the smaller sublists are empty and the new combined sublist comprises all the elements of both sublists.
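A minimal, standalone sketch of this merge step is given below; the function name merge and the sample lists are our own, but the logic follows the description above and the inner loops of the mergeSort code.

def merge(left, right):
    # Repeatedly take the smaller front element of the two sorted lists.
    result = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] < right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    # One sublist is now empty; append whatever remains of the other.
    result.extend(left[i:])
    result.extend(right[j:])
    return result

print(merge([17, 26, 54, 93], [20, 31, 44, 55, 77]))
# prints [17, 20, 26, 31, 44, 54, 55, 77, 93]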
