ECE750-TXB Lecture 1: Asymptotics


Todd L. Veldhuizen
[email protected]
Electrical & Computer Engineering, University of Waterloo, Canada
February 26, 2007

Motivation

- We want to choose the best algorithm or data structure for the job.
- We need characterizations of resource use, e.g., time, space; for circuits: area, depth.
- Many, many approaches:
  - Worst Case Execution Time (WCET): for hard real-time applications
  - Exact measurements for a specific problem size, e.g., number of gates in a 64-bit addition circuit.
  - Performance models, e.g., R∞, n1/2 for latency-throughput, HINT curves for linear algebra (characterizing performance through different cache regimes), etc.
  - ...

Asymptotic analysis

- We will focus on asymptotic analysis: a good "first approximation" of performance that describes behaviour on big problems.
- Reasonably independent of:
  - Machine details (e.g., 2 cycles for add+mult vs. 1 cycle)
  - Clock speed, programming language, compiler, etc.

Asymptotics: Brief history

- Basic ideas originated in Paul du Bois-Reymond's Infinitärcalcül ("calculus of infinities"), developed in the 1870s.
- G. H. Hardy greatly expanded on du Bois-Reymond's ideas in his monograph Orders of Infinity (1910) [3].
- The "big-O" notation was first used by Bachmann (1894) and popularized by Landau (hence sometimes called "Landau notation").
- Adopted by computer scientists [4] to characterize resource consumption, independent of small machine differences, languages, compilers, etc.
Basic asymptotic notations

Asymptotic ≡ behaviour as n → ∞, where for our purposes n is the "problem size." Three basic notations:
- f ∼ g ("f and g are asymptotically equivalent")
- f ≼ g ("f is asymptotically dominated by g")
- f ≍ g ("f and g are asymptotically bounded by one another")

f ∼ g means lim_{n→∞} f(n)/g(n) = 1.
Example: 3x² + 2x + 1 ∼ 3x².
∼ is an equivalence relation:
- Transitive: (x ∼ y) ∧ (y ∼ z) ⇒ (x ∼ z)
- Reflexive: x ∼ x
- Symmetric: (x ∼ y) ⇒ (y ∼ x)
Basic idea: we only care about the "leading term," disregarding less quickly-growing terms.

f ≼ g means lim sup_{n→∞} f(n)/g(n) < ∞, i.e., f(n)/g(n) is eventually bounded by a finite value.
- Basic idea: f grows more slowly than g, or just as quickly as g.
- ≼ is a preorder (or quasiorder):
  - Transitive: (f ≼ g) ∧ (g ≼ h) ⇒ (f ≼ h)
  - Reflexive: f ≼ f
  - ≼ fails to be a partial order because it is not antisymmetric: there are functions f, g where f ≼ g and g ≼ f but f ≠ g.
- Variant: g ≽ f means f ≼ g.

Write f ≍ g when there are positive constants c₁, c₂ such that
  c₁ ≤ f(n)/g(n) ≤ c₂
for sufficiently large n.
- Examples:
  - n ≍ 2n
  - n ≍ (2 + sin πn)n
- ≍ is an equivalence relation.

Strict forms

Write f ≺ g when f ≼ g but not f ≍ g.
- Basic idea: f grows strictly less quickly than g.
- Equivalent: f ≺ g exactly when lim_{n→∞} f(n)/g(n) = 0.
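These limit-based definitions can be checked numerically. A minimal sketch (the helper `ratio` and the sample functions are my own, for illustration) showing that the ratio f(n)/g(n) tends to 1 for the example 3x² + 2x + 1 ∼ 3x²:

```python
def ratio(f, g, n):
    """Evaluate f(n)/g(n) at problem size n."""
    return f(n) / g(n)

# f ~ g: the ratio f(n)/g(n) approaches 1 as n grows.
f = lambda x: 3 * x**2 + 2 * x + 1
g = lambda x: 3 * x**2

for n in (10, 1_000, 100_000):
    print(n, ratio(f, g, n))  # ratio drifts toward 1
```

Of course a numerical check is only suggestive; the definition is about the limit, not any finite n.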
- Examples:
  - x² ≺ x³
  - log x ≺ x
- Variant: f ≻ g means g ≺ f.

Orders of growth

We can use ≺ as a "ruler" by which to judge the growth of functions. Some common "tick marks" on this ruler are:
  log log n ≺ log n ≺ log² n ≺ n ≺ n² ≺ nᵏ ≺ ··· ≺ 2ⁿ ≺ n! ≺ nⁿ ≺ 2^(2ⁿ)
We can always find in ≺ a dense total order without endpoints, i.e.,
- There is no slowest-growing function;
- There is no fastest-growing function;
- If f ≺ h we can always find a g such that f ≺ g ≺ h.
(The canonical example of a dense total order without endpoints is Q, the rationals.)
- This fact allows us to sketch graphs in which points on the axes are asymptotes.

Big-O Notation

"Big-O" is a convenient family of notations for asymptotics:
  O(g) ≡ {f : f ≼ g}
i.e., O(g) is the set of functions f such that f ≼ g.
- O(n²) contains n², 7n², n, log n, n^(3/2), 5, ...
- Note that f ∈ O(g) means exactly f ≼ g.
- A standard abuse of notation is to treat a big-O expression as if it were a term:
    x² + 2x^(1/2) + 1 = x² + O(x^(1/2))
  The above equation should be read as "there exists a function f ∈ O(x^(1/2)) such that x² + 2x^(1/2) + 1 = x² + f(x)."

Big-O for algorithm analysis

- Big-O notation is an excellent tool for expressing machine/compiler/language-independent complexity properties.
- On one machine a sorting algorithm might take ≈ 5.73 n log n seconds, on another it might take ≈ 9.42 n log n + 3.2 n seconds.
- We can wave these differences aside by saying the algorithm runs in O(n log n) seconds.
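The "ruler" of growth rates can be made concrete by evaluating a few of its tick marks at a single large n; a small sketch (the function list is my own selection of the slower ticks, since 2ⁿ and beyond are impractical to evaluate directly):

```python
import math

# A few successive "tick marks" on the growth ruler. Each function is
# asymptotically dominated by the next, so at a large enough n the
# evaluated values come out strictly increasing.
ticks = [
    lambda n: math.log(math.log(n)),  # log log n
    lambda n: math.log(n),            # log n
    lambda n: math.log(n) ** 2,       # log^2 n
    lambda n: float(n),               # n
    lambda n: float(n) ** 2,          # n^2
]

n = 10**6
values = [t(n) for t in ticks]
print(values)  # strictly increasing at this n
```

Note the ordering is only guaranteed for sufficiently large n: at small n, log² n actually exceeds n (e.g., at n = 10, log² 10 ≈ 5.3 but at n = 2, log² 2 ≈ 0.48 < 2).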
- O(f(n)) means something that behaves asymptotically like f(n):
  - Disregarding any initial transient behaviour;
  - Disregarding any multiplicative constants c · f(n);
  - Disregarding any additive terms that grow less quickly than f(n).

Basic properties of big-O notation

Given a choice between a sorting algorithm that runs in O(n²) time and one that runs in O(n log n) time, which should we choose?
1. Gut instinct: the O(n log n) one, of course!
2. But: note that the class of functions O(n²) also contains n log n. Just because we say an algorithm is O(n²) does not mean it takes n² time!
3. It could be that the O(n²) algorithm is faster than the O(n log n) one.

Additional notations

To distinguish between "at most this fast," "at least this fast," etc., there are additional big-O-like notations:
  f ∈ O(g)  ≡  f ≼ g   (upper bound)
  f ∈ o(g)  ≡  f ≺ g   (strict upper bound)
  f ∈ Θ(g)  ≡  f ≍ g   (tight bound)
  f ∈ Ω(g)  ≡  f ≽ g   (lower bound)
  f ∈ ω(g)  ≡  f ≻ g   (strict lower bound)

Tricks for a bad remembering day

- Lower case means strict:
  - o(n) is the strict version of O(n)
  - ω(n) is the strict version of Ω(n)
- ω, Ω (omega) is the last letter of the Greek alphabet: if f ∈ ω(g) then g comes after f in the asymptotic ordering.
- f ∈ Θ(g): the line through the middle of the theta suggests the asymptotes converge.

Notation: o(·)

f ∈ o(g) means f ≺ g.
- o(·) expresses a strict upper bound.
- If f(n) is o(g(n)), then f grows strictly slower than g.
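The point that an "O(n²)" algorithm can beat an "O(n log n)" one is easy to demonstrate with invented cost models. In this sketch both cost functions and their constants are hypothetical, chosen purely for illustration:

```python
import math

# Hypothetical cost models (invented for illustration). An algorithm
# correctly advertised as O(n^2) may actually run in 2*n*log(n) steps,
# since O(n^2) also contains n log n; meanwhile an O(n log n) algorithm
# may hide a large constant factor.
cost_a = lambda n: 2 * n * math.log(n)    # advertised O(n^2): bound is not tight
cost_b = lambda n: 100 * n * math.log(n)  # advertised O(n log n): bound is tight

n = 10**6
# The "O(n^2)" algorithm is 50x cheaper here: big-O upper bounds alone
# cannot rank two algorithms.
print(cost_a(n) < cost_b(n))  # True
```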
- Example: Σ_{k=0}^{n} 2⁻ᵏ = 2 − 2⁻ⁿ = 2 + o(1)
- o(1) denotes the class of functions g for which lim_{n→∞} g(n)/1 = 0, i.e., lim_{n→∞} g(n) = 0.
- 2 + o(1) means "2 plus something that vanishes as n → ∞."
- If f is o(g), it is also O(g).
- n! = o(nⁿ).

Notation: ω(·)

f ∈ ω(g) means f ≻ g.
- ω(·) expresses a strict lower bound.
- If f(n) is ω(g(n)), then f grows strictly faster than g.
- f ∈ ω(g) is equivalent to g ∈ o(f).
- Example: the harmonic series hₙ = Σ_{k=1}^{n} 1/k = ln n + γ + O(n⁻¹)
  - hₙ ∈ ω(1) (it is unbounded)
  - hₙ ∈ ω(ln ln n)
- n! = ω(2ⁿ) (grows faster than 2ⁿ)

Notation: Ω(·)

f ∈ Ω(g) means f ≽ g.
- Ω(·) expresses a lower bound, not necessarily strict.
- If f(n) is Ω(g(n)), then f grows at least as fast as g.
- f ∈ Ω(g) is equivalent to g ∈ O(f).
- Example: matrix multiplication requires Ω(n²) time (at least enough time to look at each of the n² entries in the matrices).

Notation: Θ(·)

f ∈ Θ(g) means f ≍ g.
- Θ(·) expresses a tight asymptotic bound.
- If f(n) is Θ(g(n)), then f(n)/g(n) is eventually contained in a finite positive interval [c₁, c₂].
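The harmonic-series expansion hₙ = ln n + γ + O(n⁻¹) can also be checked numerically; a minimal sketch (the helper `harmonic` is my own, and γ is the Euler-Mascheroni constant):

```python
import math

def harmonic(n):
    """h_n = sum_{k=1}^{n} 1/k."""
    return sum(1.0 / k for k in range(1, n + 1))

GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

n = 10**5
# h_n = ln n + gamma + O(1/n): the leftover error term is small and
# positive, shrinking roughly like 1/(2n).
error = harmonic(n) - (math.log(n) + GAMMA)
print(error)
```

That hₙ ∈ ω(1), i.e., is unbounded, is visible in the same sketch: ln n grows without bound, and the error term stays bounded.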
