Sampling and Reconstruction
CS 4620 Lecture 13
Sampled representations
• How to store and compute with continuous functions?
• Common scheme for representation: samples
  - write down the function's values at many points
[Figure: FvDFH fig. 14.14b / Wolberg]

Reconstruction
• Making samples back into a continuous function
  - for output (need a realizable method)
  - for analysis or processing (need a mathematical method)
  - amounts to "guessing" what the function did in between
[Figure: FvDFH fig. 14.14b / Wolberg]

Filtering
• Processing done on a function
  - can be executed in continuous form (e.g., an analog circuit)
  - but can also be executed using a sampled representation
• Simple example: smoothing by averaging

Roots of sampling
• Nyquist 1928; Shannon 1949
  - famous results in information theory
• 1940s: first practical uses in telecommunications
• 1960s: first digital audio systems
• 1970s: commercialization of digital audio
• 1982: introduction of the Compact Disc
  - the first high-profile consumer application
• This is why all the terminology has a communications or audio "flavor"
  - early applications are 1D; for us, 2D (images) is important

Sampling in digital audio
• Recording: sound to analog to samples to disc
• Playback: disc to samples to analog to sound again
  - how can we be sure we are filling in the gaps correctly?

Undersampling
• What if we "missed" things between the samples?
• Simple example: undersampling a sine wave
  - unsurprising result: information is lost
  - surprising result: indistinguishable from a lower frequency
  - also was always indistinguishable from higher frequencies
  - aliasing: signals "traveling in disguise" as other frequencies (see the numeric sketch below)
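A minimal numeric sketch of this aliasing effect (assuming NumPy; the specific frequencies are illustrative choices, not from the slides): a 7 Hz cosine sampled at 8 Hz yields exactly the same samples as a 1 Hz cosine, so after sampling the two signals cannot be told apart.

```python
import numpy as np

fs = 8.0                 # sampling rate in Hz: too low for a 7 Hz signal
t = np.arange(16) / fs   # 16 sample times

high = np.cos(2 * np.pi * 7 * t)  # 7 Hz cosine, undersampled
low = np.cos(2 * np.pi * 1 * t)   # 1 Hz cosine: its alias, since 7 = 8 - 1

# The sample sequences match exactly: once sampled, the 7 Hz signal
# "travels in disguise" as a 1 Hz signal.
assert np.allclose(high, low)
```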
Preventing aliasing
• Introduce lowpass filters:
  - remove high frequencies, leaving only safe, low frequencies
  - choose the lowest frequency in reconstruction (disambiguate)

Linear filtering: a key idea
• Transformations on signals, e.g.:
  - bass/treble controls on a stereo
  - blurring/sharpening operations in image editing
  - smoothing/noise reduction in tracking
• Key properties
  - linearity: filter(f + g) = filter(f) + filter(g)
  - shift invariance: behavior invariant to shifting the input (delaying an audio signal, sliding an image around)
• Can be modeled mathematically by convolution

Convolution warm-up
• Basic idea: define a new function by averaging over a sliding window
• A simple example to start off: smoothing

Discrete convolution
• Same moving average operation, expressed mathematically:
  smooth[i] = (1 / (2r + 1)) Σ_{j = i - r}^{i + r} a[j]
• Simple averaging: every sample gets the same weight
• Convolution: the same idea, but with a weighted average
  - each sample gets its own weight (normally zero far away)
• This is all convolution is: a moving weighted average

Filters
• The sequence of weights a[j] is called a filter
• A filter is nonzero over its region of support
  - usually centered on zero: support radius r
• A filter is normalized so that it sums to 1.0
  - this makes for a weighted average, not just any old weighted sum
• Most filters are symmetric about 0
  - since for images we usually want to treat left and right the same

Convolution and filtering
• Can express the sliding average as convolution with a box filter
• a_box = [..., 0, 1, 1, 1, 1, 1, 0, ...], normalized so the weights sum to 1

Example: box and step
[Figure: a box filter convolved with a step function; worked numerically in the sketch below]
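The pseudocode from the slides did not survive extraction, so here is a minimal stand-in for discrete convolution with a normalized box filter, applied to the box-and-step example above (plain Python; the names box_filter and convolve are mine, and the boundary handling, evaluating only where the filter fits, is one choice among several):

```python
def box_filter(r):
    # Box filter of support radius r: 2r + 1 equal weights summing to 1,
    # so convolving with it computes a moving average.
    width = 2 * r + 1
    return [1.0 / width] * width

def convolve(a, b):
    # Discrete convolution (a * b)[i] = sum over j of b[j] * a[i - j],
    # evaluated only where the filter fits entirely inside the signal.
    r = len(b) // 2                # filter support radius
    out = []
    for i in range(r, len(a) - r):
        s = 0.0
        for j in range(-r, r + 1):
            s += b[j + r] * a[i - j]
        out.append(s)
    return out

# Box and step: smoothing turns the sharp edge into a linear ramp.
step = [0.0] * 8 + [1.0] * 8
print(convolve(step, box_filter(2)))
# -> 0, 0, 0, 0, 0.2, 0.4, 0.6, 0.8, 1, 1, 1, 1 (up to rounding)
```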
Convolution and filtering
• Convolution applies with any sequence of weights
• Example: a bell curve (gaussian-like), [..., 1, 4, 6, 4, 1, ...] / 16

Discrete convolution
• Notation: (a * b)[i] = Σ_j a[j] b[i - j]
• Convolution is a multiplication-like operation
  - commutative and associative; distributes over addition; scalars factor out
  - identity: the unit impulse e = [..., 0, 0, 1, 0, 0, ...]
• Conceptually there is no distinction between filter and signal
• Usefulness of associativity
  - we often apply several filters one after another: (((a * b1) * b2) * b3)
  - this is equivalent to applying one filter: a * (b1 * b2 * b3)

Discrete filtering in 2D
• Same equation, one more index:
  (a * b)[i, j] = Σ_{i', j'} a[i', j'] b[i - i', j - j']
  - now the filter is a rectangle you slide around over a grid of numbers
• Commonly applied to images
  - blurring (using a box, using a gaussian, ...)
  - sharpening (impulse minus blur)
[Figure: original | box blur; sharpened | gaussian blur. Photo: Philip Greenspun]

Optimization: separable filters
• The basic algorithm is O(r²): large filters get expensive fast!
• Definition: a2(x, y) is separable if it can be written as a2(x, y) = a1(x) a1(y)
• This is a useful property for filters because it allows factoring: first convolve every row with a1, then convolve every column with a1 (see the code sketch at the end of this section)

Continuous convolution: warm-up
• Can apply a sliding-window average to a continuous function just as well
  - the output is continuous
  - integration replaces summation

Continuous convolution
• Sliding average, expressed mathematically:
  (f * g)(x) = ∫ f(t) g(x - t) dt
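Returning to separable filtering: below is a minimal sketch of the two-pass scheme (Python with NumPy; the function names are mine, and zero padding at the image border is an assumption, since the slides do not specify boundary handling). It uses the bell-curve filter [1, 4, 6, 4, 1] / 16 from above; the two 1D passes are equivalent to convolving with the 2D filter a2[i, j] = a1[i] a1[j], but cost O(r) per pixel per pass instead of O(r²).

```python
import numpy as np

def filter_rows(img, a1):
    # Convolve every row with the 1D filter a1 (zero padding at the edges).
    return np.array([np.convolve(row, a1, mode="same") for row in img])

def separable_filter(img, a1):
    # Separable 2D filtering: filter the rows, then (via transposes)
    # filter the columns with the same 1D filter.
    return filter_rows(filter_rows(img, a1).T, a1).T

a1 = np.array([1, 4, 6, 4, 1]) / 16.0   # the bell-curve filter from the slides

img = np.zeros((9, 9))
img[4, 4] = 1.0                         # a unit impulse...
blurred = separable_filter(img, a1)
# ...comes out as the 2D bell a1[i - 2] * a1[j - 2] on the 5x5
# neighborhood of the center, zero elsewhere: the separable product.
```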