CS559 – Lecture 2: Lights, Cameras, Eyes
These are course notes (not used as slides), written by Mike Gleicher, Sept. 2005.
Adjusted after class – stuff we didn't get to removed / mistakes fixed.
© 2005 Michael L. Gleicher

Light
• Electromagnetic radiation
  – Wavelengths between 380nm and 800nm
  – Wavelength vs. frequency
• Particle model
  – Travels from source to receiver

Path of Light
• From source to viewer
• Light leaves the source
• Light bounces off an object
• Light goes to a receiver
  – Eye
  – Camera
• The receiver is 2D, but the process is 3D
• Mathematics later
• Camera first
  – Flat receiver

Looking at things
• How this works was not known until around 1000
  – Euclid and Ptolemy thought sight went from the eye outward
• Ibn al-Haytham (Alhazen), around 985, PROVED otherwise
  – A triumph of the scientific method
• Proof by observation – not authority
  – Experiment – stare at the sun, it burns your eyes, …
  – Also figured out that light travels in straight lines

Getting light to the "imager"
• Light generally bounces off things in all directions
  – So you can see an object from any direction
  – Not all directions are the same! (mirrors)
  – We deal with this in detail later
• It generally doesn't matter whether something is an emitter (source) or a reflector
  – They look the same to the receiver

Depth and Distance
• Light travels in straight lines
  – Except in weird cases that only occur in theoretical physics
• It doesn't matter how far away the source is
  – You can't tell where a photon comes from
  – Photons leaving the source might not all make it to the eye
  – Photons might bounce around off stuff
• The longer the distance, the more chance of hitting something

Capturing Images
• Measure the light at all places on the "imaging plane"?
• Not quite…
• Potentially all paths between the world and the imager
• How to do this?
• Pinhole camera (sketched in code below)
  – An infinitesimal hole in a blocking surface – just a point
  – Only one path from a world point to the image
  – The focal point

"Ideal Imaging"
• Each point in the world maps to a single point in the image
  – The image appears sharp
  – The image is "in focus"
• Otherwise the image is "blurry"
  – The image is out of focus
  – Need to be picky about which rays we look at

Why is pinhole imaging not so ideal in practice?
• An infinitesimal aperture = infinite sharpness
• A finite aperture
  – There will always be some blurriness
• Too selective about light
  – Lets very little light through
• A smaller aperture
  – Less blurry
  – Less light
• We want a bigger aperture, but want to keep sharpness

A "virtual pinhole" – Lenses
• A lens bends light
• Convex lenses take bundles of light and make them converge (pass through a point)
• Parallel rays converge
• A virtual pinhole!
• Light rays from "far away" are (effectively) parallel
• What about non-parallel rays?

"Thin" Lenses
• All points at one distance get to another place
• Different distances map to different distances
• If we fix the distance to the image plane, then only objects at a single distance will be in focus
  – 1/D + 1/I = 1/F (a worked example follows below)
  – Farther objects image closer
  – (Note: the picture on the slide is wrong – I and D have an inverse relationship)

Focusing with a lens
• Objects at the "focused distance" are sharp (in focus)
• Objects at other distances are not sharp
• Some blurriness is OK
  – Circle of Confusion
• Depth of Field
  – The range of distances where things are "close enough" to being in focus
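A minimal sketch (Python, not from the original notes) of the pinhole mapping promised above: every world point reaches the image along the single ray through the focal point, so each 3D point lands at exactly one 2D image point. The focal distance and sample coordinates are made-up illustrative values.

```python
# Minimal pinhole-projection sketch: each 3D world point maps to exactly
# one 2D image point, because only one ray passes through the pinhole.
# The focal distance f and the sample points are illustrative, not from the notes.

def project_pinhole(point, f=1.0):
    """Project a world point (x, y, z) onto an image plane at distance f
    behind the pinhole (z is the distance in front of the pinhole, z > 0)."""
    x, y, z = point
    # Similar triangles: the ray through the pinhole scales x and y by f/z.
    # The image is inverted, hence the minus signs.
    return (-f * x / z, -f * y / z)

if __name__ == "__main__":
    for p in [(1.0, 0.5, 2.0), (1.0, 0.5, 4.0)]:
        print(p, "->", project_pinhole(p))
    # The same (x, y) at twice the depth lands at half the image offset:
    # depth is lost in the 3D -> 2D projection, as the notes point out.
```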
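The thin-lens relation 1/D + 1/I = 1/F from the slide can be solved for the image distance I to show the inverse relationship noted there: farther objects image closer. A small sketch, assuming an illustrative 50mm focal length (the specific numbers are not from the notes):

```python
# Thin-lens relation from the slide: 1/D + 1/I = 1/F, where D is the
# object distance, I the image distance, and F the focal length.

def image_distance(D, F):
    """Image distance I for an object at distance D with focal length F."""
    # 1/I = 1/F - 1/D  =>  I = D*F / (D - F); valid for D > F.
    return D * F / (D - F)

if __name__ == "__main__":
    F = 50.0  # focal length in mm (illustrative value)
    for D in [100.0, 500.0, 5000.0]:
        print(f"object at {D:7.1f} mm -> image at {image_distance(D, F):6.2f} mm")
    # Output shows I shrinking toward F as D grows: distant objects focus
    # essentially at the focal length (their rays are effectively parallel).
```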
Controlling the image
• A smaller aperture = less blurry = a larger depth of field
  – But less light
• The lens determines
  – What gets to the imaging surface
  – What is in focus

Measuring on the image plane
• We want to measure / record the light that hits the image plane
• At every position on the image plane (in the image) we can measure the amount of light
  – A continuous phenomenon (move a little bit, and it can be different)
  – Can think of an image as a function that, given a position (x,y), tells us the "amount" of light at that position: i = f(x,y)
  – For now, simplify "amount" to a single quantity, ignoring that light can be different colors

Measuring on the image plane (continued)
• i = f(x,y)
• Continuous quantities
  – Continuous in space
  – Continuous in value
• Computers (and measurement in general) have trouble with continuous things
• Major issues
  – Limits to how much we can gather
  – Reconstructing the continuous thing from a discrete set of observations (see the sampling sketch after the sensor slides below)
  – Manipulating discrete representations

Measuring on the image
• Water/rain analogy: put out a set of buckets to catch water
• Wait over a duration of time
  – Use a shutter to control the amount of time
• The measurement depends on
  – The amount of light
  – The size of the aperture (how much of the light we let through)
  – The duration

Types of "buckets"
• Film
  – Silver halide crystals change when exposed to light
• Electronic
  – The old analog way – vidicon tubes
    • Store the charge on a plate, scan the plate to read it
    • http://www.answers.com/topic/video-camera-tube
  – The new way: use an MOS transistor as a bucket
• Biological
  – Chemicals (photopigments) store the photon and release it as electricity
  – There isn't really a shutter

Similarities
• Low light levels are hard
  – Need to get enough photons to measure
  – Small counting errors (noise) are big relative to small measurements
• Tradeoffs on bucket sizes
  – Big buckets are good (lower noise in low light)
  – Lots of buckets are good (sense more places)
  – For a fixed area, there is a tradeoff
    • Especially in digital cameras / video cameras

MOS Transistors
• Metal Oxide Semiconductor
• The semiconductor acts as a "bucket" for electrons
• The metal on top is a "gate" – it creates an electric field that can connect/disconnect the two sides

CCD sensors
• CCD = Charge Coupled Device
  – A "bucket brigade" of MOS transistors (toy readout sketch below)
  – Use the gates to move the charge along
  – Read out "at the edge"
  – A shift register to transfer out images
• Advantages:
  – Cheap / easy to make large numbers of buckets
  – Uniform
• Blooming (charge overflowing into neighboring buckets)

CMOS sensors
• Disadvantages of CCDs
  – Have to shift things out (slow, lose information)
  – Different from computer chips
• CMOS (Complementary Metal Oxide Semiconductor)
  – Just like computer chips
  – Put more circuitry around each sensor transistor
    • Amplify / switch signals as needed
  – Use normal "wires" to carry information to where it needs to go
• Downside: space for circuitry means less space for sensors (smaller buckets = more noise), and the sensors are not uniform
• Upside: the same "technology curve" as computers, so they will get better, faster, cheaper, lower power, …

Digital Camera
• Megapixels = the number of buckets
  – 7 or 8 million buckets in a consumer camera
• But…
  – How big are the sensors? (see the pixel-pitch calculation below)
    • Same size / more megapixels = smaller buckets = more noise
    • (unless the sensor technology gets better)
  – How good is the lens?
    • Smaller buckets don't do you any good if the lens can't aim the light into the right bucket
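As promised after the measurement slides, here is a small sketch (Python; the light-distribution function f and the grid sizes are made-up stand-ins) of turning the continuous image function i = f(x,y) into a discrete grid of bucket measurements, by averaging the function over each bucket's area:

```python
# Sketch of the "buckets" idea: an image is a continuous function i = f(x, y),
# and a sensor measures it with a discrete grid of buckets, each integrating
# the light that falls inside its area. f() here is an arbitrary stand-in.

import math

def f(x, y):
    """Stand-in for the continuous light distribution on the image plane."""
    return 0.5 + 0.5 * math.sin(6.0 * x) * math.cos(4.0 * y)

def capture(n_buckets=8, samples_per_bucket=16):
    """Approximate each bucket's catch by averaging f over its area."""
    image = []
    for row in range(n_buckets):
        row_vals = []
        for col in range(n_buckets):
            total = 0.0
            for i in range(samples_per_bucket):
                for j in range(samples_per_bucket):
                    x = (col + (i + 0.5) / samples_per_bucket) / n_buckets
                    y = (row + (j + 0.5) / samples_per_bucket) / n_buckets
                    total += f(x, y)
            row_vals.append(total / samples_per_bucket**2)
        image.append(row_vals)
    return image  # the discrete representation we actually store and manipulate

if __name__ == "__main__":
    for row in capture():
        print(" ".join(f"{v:.2f}" for v in row))
```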
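A toy version of the CCD "bucket brigade" readout mentioned on the CCD slide (Python; a 1D row rather than a full 2D shift register, with arbitrary charge values): charge packets are shifted one cell at a time toward the edge, where a single output stage reads them.

```python
# Toy sketch of CCD "bucket brigade" readout: charge packets are shifted
# toward the edge, where one amplifier reads them out serially.
# Real CCDs clock whole rows in parallel; this 1D version just shows the idea.

def read_out_ccd(row):
    """Shift every charge packet toward index 0 and read each as it arrives."""
    row = list(row)
    readings = []
    while row:
        readings.append(row.pop(0))  # read the packet now at the edge
        # after the pop, every remaining packet has moved one cell closer,
        # which is what the gate clocking accomplishes in hardware
    return readings

if __name__ == "__main__":
    charges = [3, 1, 4, 1, 5]
    print(read_out_ccd(charges))  # packets arrive in spatial order
    # This serial shift is why CCD readout is slow compared to CMOS sensors,
    # where each pixel has its own circuitry and an addressable "wire".
```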
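A back-of-the-envelope calculation for the "same size / more megapixels = smaller buckets = more noise" point on the Digital Camera slide. The sensor dimensions are illustrative, and the noise model (Poisson counting error, relative error ~ 1/sqrt(photon count)) is an assumption that matches the "small counting errors are big relative to small measurements" remark, not something stated in the notes:

```python
# Pixel pitch for a fixed sensor size at growing megapixel counts, plus the
# counting-error point from the notes, modeled here as Poisson shot noise.
# Sensor dimensions and the brightness scale are illustrative assumptions.

import math

def pixel_pitch_um(sensor_w_mm, sensor_h_mm, megapixels):
    """Approximate pixel pitch in microns for a square-pixel sensor."""
    pixels = megapixels * 1e6
    area_um2 = (sensor_w_mm * 1000) * (sensor_h_mm * 1000)
    return math.sqrt(area_um2 / pixels)

if __name__ == "__main__":
    for mp in [4, 8, 16]:
        pitch = pixel_pitch_um(5.8, 4.3, mp)  # compact-camera-sized sensor
        # A bucket's photon catch scales with its area, so the relative
        # counting error grows as the bucket (and its photon count) shrinks.
        photons = 1000 * pitch**2  # arbitrary brightness scale
        rel_noise = 1.0 / math.sqrt(photons)
        print(f"{mp:2d} MP: pitch {pitch:4.2f} um, "
              f"relative shot noise ~{rel_noise:.1%}")
```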
Eye
• Pupil – the hole in the eye
• Lens
• Iris
  – A round muscle – controls the size of the pupil
• Cornea
  – A clear protective coating – acts as a lens
• Fluid-filled spaces
  – Aqueous humor
  – Vitreous humor
• Rectus muscles
  – Change the shape of the eyeball to focus
• Optic nerve
  – Carries information away
• Blind spot – where the optic nerve exits
• Central fovea

Retina – the "image plane" of the eye
• The only place on the body where blood vessels can be seen directly
• Has photoreceptors
  – Cells sensitive to light
• Photopigments
  – Chemicals that change when exposed to light
  – Different photoreceptors have different pigments
  – Different pigments behave differently
    • Sensitivity, color selectivity, regeneration speed
• Types of photoreceptors

Rods
• Photopigment: rhodopsin
  – Breaks into retinene + a protein
  – Must be reassembled before it can work again
• Very sensitive
  – Bright light means it breaks down faster than it is regenerated (toy model below)
  – Less useful in bright light
• Blinded by bright light at night
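A toy steady-state model (entirely an illustration, not from the notes) of the breakdown/regeneration balance just described: if breakdown is proportional to the light level and regeneration runs at a fixed rate, the fraction of intact pigment falls as the light gets brighter, which is why rods stop being useful in bright light.

```python
# Toy rate model for the rhodopsin balance described above: light breaks
# the pigment down, chemistry rebuilds it. At steady state, bright light
# leaves little intact pigment, so rods saturate. Rates are arbitrary.

def steady_state_pigment(light, regen_rate=1.0):
    """Fraction p of rhodopsin intact when breakdown (proportional to light)
    balances regeneration: solve regen*(1 - p) = light * p for p."""
    return regen_rate / (regen_rate + light)

if __name__ == "__main__":
    for light in [0.01, 1.0, 100.0]:
        p = steady_state_pigment(light)
        print(f"light level {light:6.2f}: {p:5.1%} of pigment intact")
    # In bright light almost all pigment is broken down faster than it is
    # rebuilt, matching the "less useful in bright light" point above.
```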