University of Hawaii ICS 141: Discrete Mathematics for Computer Science I

Dept. of Information & Computer Sci., University of Hawaii

Jan Stelovsky, based on slides by Dr. Baek and Dr. Still; originals by Dr. M. P. Frank and Dr. J. L. Gross, provided by McGraw-Hill

ICS 141: Discrete Mathematics I – Fall 2011, University of Hawaii

Lecture 16

Chapter 3. The Fundamentals: 3.3 Complexity of Algorithms, 3.4 The Integers and Division

3.3 Complexity of Algorithms

n An algorithm must always produce the correct answer, and should be efficient.

n How can the efficiency of an algorithm be analyzed?

n The algorithmic complexity of a computation is, most generally, a measure of how difficult it is to perform the computation.

n That is, it measures some aspect of the cost of computation (in a general sense of “cost”).

n Amount of resources required to do a computation.

n Some of the most common complexity measures:

n “Time” complexity: # of operations or steps required

n “Space” complexity: # of memory bits required

Complexity Depends on Input

n Most algorithms have different complexities for inputs of different sizes.

n E.g. searching a long list typically takes more time than searching a short one.

n Therefore, complexity is usually expressed as a function of the input size.

n This function usually gives the complexity for the worst-case input of any given length.

Worst-, Average-, and Best-Case Complexity

n A worst-case complexity measure estimates the time required for the most time-consuming input of each size.

n An average-case complexity measure estimates the average time required for input of each size.

n A best-case complexity measure estimates the time required for the least time-consuming input of each size.

Example 1: Max Algorithm

n Problem: Find the simplest form of the exact order of growth (Θ) of the worst-case time complexity of the max algorithm, assuming that each line of code takes some constant time every time it is executed (with possibly different times for different lines of code).

Complexity of max

procedure max(a1, a2, …, an: integers)
  v := a1                               {t1}
  for i := 2 to n                       {t2}
    if ai > v then v := ai              {t3}
  return v                              {t4}

n First, what is an expression for the exact total worst-case time? (Not its order of growth.)

n t1: once

n t2: n – 1 + 1 times

n t3 (comparison): n – 1 times

n t4: once

Complexity Analysis (cont.)

n Worst-case time complexity

t(n) = t1 + t2 + t3 + t4 = 1 + (n − 1 + 1) + (n − 1) + 1 = 2n + 1
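As a cross-check on this cost model, here is a Python sketch of the max procedure that counts how often each line runs. It is our illustration, not code from the slides: the name max_of and the counts dictionary are invented, and Python lists are 0-indexed where the pseudocode is 1-indexed.

```python
def max_of(a):
    """Textbook max procedure over a non-empty list; counts line executions t1..t4."""
    counts = {"t1": 0, "t2": 0, "t3": 0, "t4": 0}
    counts["t1"] += 1
    v = a[0]                      # v := a1
    for i in range(1, len(a)):    # for i := 2 to n
        counts["t2"] += 1         # loop test that enters the body
        counts["t3"] += 1         # comparison ai > v
        if a[i] > v:
            v = a[i]
    counts["t2"] += 1             # final loop test that exits
    counts["t4"] += 1             # return
    return v, counts
```

For a list of length n this yields t1 = 1, t2 = n, t3 = n − 1, t4 = 1, for a total of 2n + 1, matching the derivation above.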

n In terms of the number of comparisons made

t(n) = t2 + t3

= (n − 1 + 1) + (n − 1) = 2n − 1  (# of comparisons)

n Now, what is the simplest form of the exact (Θ) order of growth of t(n)? t(n) = Θ(n)

Example 2: Linear Search

n In terms of the number of comparisons

procedure linear_search (x: integer, a1, a2, …, an: distinct integers)
  i := 1
  while (i ≤ n ∧ x ≠ ai)                {t11 & t12}
    i := i + 1
  if i ≤ n then location := i           {t2}
  else location := 0
  return location

Linear Search Analysis

n Worst case time complexity:

t(n) = t11 + t12 + t2 = (n + 1) + n + 1 = 2n + 2 = Θ(n)  (# of comparisons)

n Best case: t(n) = t11 + t12 + t2 = 1 + 1 + 1 = Θ(1)
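The worst- and best-case counts can be verified with a runnable version of linear_search. This is a sketch under the slides' cost model, with invented names: the two halves of the loop test (t11 for i ≤ n, t12 for x ≠ ai) are counted separately because the second is skipped when the first fails.

```python
def linear_search(x, a):
    """Slides' linear_search in Python; returns (location, comparisons).
    Locations are 1-based; 0 means 'not found', as in the pseudocode."""
    n = len(a)
    comparisons = 0
    i = 1
    while True:
        comparisons += 1              # t11: i <= n ?
        if not (i <= n):
            break
        comparisons += 1              # t12: x != ai ?
        if not (x != a[i - 1]):
            break
        i += 1
    comparisons += 1                  # t2: i <= n ?
    location = i if i <= n else 0
    return location, comparisons
```

Searching for an absent item in a list of length n gives (n + 1) + n + 1 = 2n + 2 comparisons; searching for the first element gives 3 = Θ(1).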

n Average case, if item is present:

t(n) = (3 + 5 + 7 + … + (2n + 1)) / n = (2(1 + 2 + … + n) + n) / n = 2[n(n + 1)/2]/n + 1 = n + 2 = Θ(n)

Example 3: Binary Search

procedure binary_search (x: integer, a1, a2, …, an: distinct integers, sorted smallest to largest)
  i := 1
  j := n
  while i < j begin                     {t1}   (Key question: how many loop iterations?)
    m := ⎣(i + j)/2⎦
    if x > am then i := m + 1 else j := m     {t2}
  end
  if x = ai then location := i else location := 0     {t3}
  return location

Binary Search Analysis

n Suppose that n is a power of 2, i.e., ∃k: n = 2^k.

n Original range from i = 1 to j = n contains n items.

n Each iteration: Size j − i + 1 of range is cut in half.

n Size decreases as 2^k, 2^(k−1), 2^(k−2), …

n Loop terminates when size of range is 1 = 2^0 (i = j).

n Therefore, the number of iterations is: k = log2 n

t(n) = t1 + t2 + t3

= (k + 1) + k + 1 = 2k + 2 = 2 log2 n + 2 = Θ(log2 n)

n Even for n ≠ 2^k (not an integral power of 2), the time complexity is still the same.
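A Python rendering of binary_search that counts loop iterations makes the k = log2 n claim easy to test. The function name and the returned iteration count are our additions; indices are 0-based internally but the returned location is 1-based, as in the pseudocode.

```python
def binary_search(x, a):
    """Slides' binary_search over a non-empty ascending list.
    Returns (location, loop_iterations); location 0 means 'not found'."""
    n = len(a)
    i, j = 1, n
    iterations = 0
    while i < j:
        iterations += 1
        m = (i + j) // 2          # m := floor((i + j) / 2)
        if x > a[m - 1]:          # search upper half
            i = m + 1
        else:                     # search lower half (am may still equal x)
            j = m
    location = i if x == a[i - 1] else 0
    return location, iterations
```

For n = 8 = 2^3 the loop runs exactly 3 = log2 8 times, whether or not x is present.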

Analysis of Sorting Algorithms

n Check out Rosen 3.3 Example 5 and Example 6 for worst-case time complexity of bubble sort and insertion sort algorithms in terms of the number of comparisons made.

Bubble Sort Analysis

procedure bubble_sort (a1, a2, …, an: real numbers with n ≥ 2)
  for i := 1 to n − 1
    for j := 1 to n − i
      if aj > aj+1 then interchange aj and aj+1
  {a1, a2, …, an is in increasing order}

n Worst-case complexity in terms of the number of comparisons: Θ(n²)
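A direct Python transcription of the bubble sort pseudocode (our sketch; 0-based indices), counting comparisons, confirms the count (n − 1) + (n − 2) + … + 1 = n(n − 1)/2 = Θ(n²):

```python
def bubble_sort(a):
    """Slides' bubble_sort; sorts a in place, returns the comparison count."""
    n = len(a)
    comparisons = 0
    for i in range(1, n):              # for i := 1 to n - 1
        for j in range(0, n - i):      # for j := 1 to n - i (0-based here)
            comparisons += 1
            if a[j] > a[j + 1]:        # interchange aj and aj+1
                a[j], a[j + 1] = a[j + 1], a[j]
    return comparisons
```

Note the comparison count is the same for every input of size n; only the number of interchanges varies.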

Insertion Sort

procedure insertion_sort (a1, a2, …, an: real numbers; n ≥ 2)
  for j := 2 to n begin
    i := 1
    while aj > ai
      i := i + 1
    m := aj
    for k := 0 to j − i − 1
      aj−k := aj−k−1
    ai := m
  end
  {a1, a2, …, an are sorted in increasing order}
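The same pseudocode, rendered as runnable Python (our sketch; indices shifted to 0-based, so the shifting loop range(0, j − i) mirrors the pseudocode's for k := 0 to j − i − 1):

```python
def insertion_sort(a):
    """Slides' insertion_sort: linear search for the insertion point,
    then shift the tail right by one and insert. Sorts in place."""
    n = len(a)
    for j in range(1, n):              # for j := 2 to n
        i = 0                          # i := 1
        while a[j] > a[i]:             # find first i with a[j] <= a[i]
            i += 1
        m = a[j]                       # m := aj
        for k in range(0, j - i):      # shift a[i..j-1] right by one
            a[j - k] = a[j - k - 1]
        a[i] = m                       # ai := m
    return a
```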

n Worst-case complexity in terms of the number of comparisons: Θ(n²)

Common Terminology for the Complexity of Algorithms

Computer Time Examples

n Assume that time = 1 ns (10^−9 second) per operation, problem size = n bits, and #ops is a function of n.

#ops(n)     n = 10 (1.25 bytes)    n = 10^6 (125 kB)
log2 n      3.3 ns                 19.9 ns
n           10 ns                  1 ms
n log2 n    33 ns                  19.9 ms
n^2         100 ns                 16 m 40 s
2^n         1.024 µs               10^301,004.5
n!          3.63 ms                Ouch!

Review: Complexity

n Algorithmic complexity = cost of computation.

n Focus on time complexity for our course.

n Although space & energy are also important.

n Characterize complexity as a function of input size:

n Worst-case, best-case, or average-case.

n Use orders-of-growth notation to concisely summarize the growth properties of complexity functions.

n Need to know

n Names of specific orders of growth of complexity.

n How to analyze the order of growth of time complexity for simple algorithms.

Tractable vs. Intractable

n A problem that is solvable using an algorithm with at most polynomial time complexity is called tractable (or feasible). P is the set of all tractable problems.

n A problem that cannot be solved using an algorithm with worst-case polynomial time complexity is called intractable (or infeasible).

n Note that n^1,000,000 is technically tractable, but really very hard. n^(log log log n) is technically intractable, but easy. Such cases are rare though.

P vs. NP

n NP is the set of problems for which there exists a tractable algorithm for checking a proposed solution to tell if it is correct.

n We know that P ⊆ NP, but the most famous unproven conjecture in computer science is that this inclusion is proper.

n i.e., that P⊂NP rather than P=NP.

n Whoever first proves (or disproves!) this will be famous!

3.4 The Integers and Division

n Of course, you already know what the integers are, and what division is…

n But: There are some specific notations, terminology, and theorems associated with these concepts which you may not know.

n These form the basics of number theory.

n Number theory is vital in many important algorithms today (hash functions, digital signatures, …).

Divides, Factor, Multiple

n Let a,b ∈ Z with a ≠ 0.

n Definition: a|b ⇔ “a divides b” ⇔ (∃c∈Z: b = ac) “There is an integer c such that c times a equals b.”

n Example: 3|-12 (True), but 3|7 (False).

n If a divides b, then we say a is a factor or a divisor of b, and b is a multiple of a.

n E.g.: “b is even” ⇔ 2|b.
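The definition translates directly into code: for integers with a ≠ 0, testing b % a == 0 in Python is equivalent to asking whether ∃c ∈ Z: b = ac, since Python's % always yields a remainder of 0 exactly when a divides b (this helper and its name are ours):

```python
def divides(a, b):
    """a | b  <=>  there exists an integer c with b = a*c (requires a != 0)."""
    if a == 0:
        raise ValueError("a must be nonzero")
    return b % a == 0
```

This reproduces the slide's examples: 3 | −12 is true, 3 | 7 is false, and "b is even" is divides(2, b).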

The Divides Relation

n Theorem: ∀a,b,c ∈ Z:
1. a|0
2. (a|b ∧ a|c) → a|(b + c)
3. a|b → a|bc, ∀c ∈ Z
4. (a|b ∧ b|c) → a|c

n Corollary: ∀a,b,c ∈ Z: (a|b ∧ a|c) → a|(mb + nc) for any m,n ∈ Z

Proof of (2)

n Show ∀a,b,c ∈ Z: (a|b ∧ a|c) → a|(b + c).

n Let a, b, c be any integers such that a|b and a|c, and show that a|(b + c).

n By definition of |, we know ∃s∈Z: b = as, and ∃t∈Z: c = at. Let s, t be such integers.

n Then b + c = as + at = a(s + t), so ∃u∈Z: b + c = au, namely u = s + t. Thus a|(b + c).

The Division “Algorithm”

n It’s really just a theorem, not an algorithm…

n Only called an “algorithm” for historical reasons.

n Theorem: For any integer dividend a and divisor d ∈ Z+, there are unique integers quotient q and remainder r ∈ N such that a = dq + r and 0 ≤ r < d.

n Formally, the theorem is: ∀a∈Z, d∈Z+: ∃!q,r∈Z: 0 ≤ r < d ∧ a = dq + r

n We can find q and r by: q = ⎣a/d⎦, r = a - dq
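Python's floor-division operator // computes exactly q = ⎣a/d⎦ for a positive integer divisor d, so the theorem's q and r fall out in two lines (the helper name div_mod is ours). Note that for a negative dividend the remainder is still non-negative, unlike C-style truncating division:

```python
def div_mod(a, d):
    """Division 'algorithm': the unique q, r with a = d*q + r and 0 <= r < d,
    for integer a and positive integer d. Python's // floors, matching q = floor(a/d)."""
    q = a // d        # q := floor(a / d)
    r = a - d * q     # r := a - d*q, guaranteed to satisfy 0 <= r < d
    return q, r
```

For example, div_mod(-11, 3) gives q = -4, r = 1, since -11 = 3·(-4) + 1 with 0 ≤ 1 < 3.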
