Light – Path of Light – Looking at Things – Getting Light to the “Imager” – Depth and Distance

PDF, 16 pages, 1020 KB

CS559 – Lecture 2: Lights, Cameras, Eyes
These are course notes (not used as slides), written by Mike Gleicher, Sept. 2005. Adjusted after class – stuff we didn’t get to removed / mistakes fixed. © 2005 Michael L. Gleicher

Light
• Electromagnetic radiation
  – Wavelengths between 380nm – 800nm
  – Wavelength vs. frequency
• Particle model
  – Travels from source to receiver

Path of Light
• From source to viewer
• Light leaves the source, bounces off an object, and goes to the receiver
• Not known until around 1000 – Euclid and Ptolemy “PROVED” otherwise
• Ibn al-Haytham (Alhazen), around 985
  – Triumph of the scientific method
  – Proof by observation – not authority
  – Experiment – stare at the sun, it burns your eyes, …
  – Also figured out that light travels in straight lines

Looking at Things
• Receiver:
  – Eye
  – Camera
• Receiver is 2D, the process is 3D
• Mathematics came later
• Camera first
  – Flat receiver

Getting Light to the “Imager”
• Light generally bounces off things in all directions
  – See from any direction
  – Not the same in every direction! (mirror)
  – Deal with this in detail later
• Generally doesn’t matter if emitter (source) or reflector
  – Same to the receiver

Depth and Distance
• Light travels in straight lines
  – Except in weird cases that only occur in theoretical physics
• Doesn’t matter how far away
  – Can’t tell where a photon comes from
  – Photons leaving the source might not all make it to the eye
  – Photons might bounce around on stuff
• Longer distance, more chance of hitting something

Capturing Images
• Measure light at all places on the “imaging plane”?
  – Not quite…
• Potentially all paths between world and imager
• How to do this?

“Ideal Imaging”
• Each point in the world maps to a single point in the image
  – Image appears sharp – image is “in focus”
• Otherwise the image is “blurry”
  – Image is out of focus
  – Need to be picky about which rays we look at

Pinhole Camera
• Infinitesimal hole in a blocking surface – just a point
  – Only 1 path from a world point to the image
  – Focal Point

Why is pinhole imaging not so ideal in practice?
• Finite aperture
  – There will always be some blurriness
• Too selective about light
  – Lets very little light through
• Smaller aperture
  – Less blurry (effectively)
  – Less light
• Want a bigger aperture, but keep the sharpness
• Infinitesimal aperture = infinite sharpness

A “virtual pinhole” – Lenses
• A lens bends light
• Convex lenses take bundles of light and make them converge (pass through a point)
  – Parallel rays converge
  – A virtual pinhole!
• Light rays from “far away” are (effectively) parallel
• What about non-parallel rays?

“Thin” Lenses
• All points at one distance get imaged to another place
• Different distances map to different distances
• If we fix the distance to the image plane, then only objects at a single distance will be in focus
  – 1/D + 1/I = 1/F
  – Farther objects image closer (inverse relationship between I and D)

Focusing with a Lens
• Objects at the “focused distance” are sharp (in focus)
• Objects at other distances are not sharp
• Some blurriness is OK
  – Circle of Confusion
• Depth of Field
  – Range of distances at which things are “close enough” to being in focus

Controlling the Image
• Smaller aperture = less blurry = larger depth of field
  – But less light
• The lens determines
  – What gets to the imaging surface
  – What is in focus

Measuring on the Image Plane
• Want to measure / record the light that hits the image plane
• At every position on the image plane (in the image) we can measure the amount of light
  – Continuous phenomenon (move a little bit, and it can be different)
  – Can think of an image as a function that, given a position (x, y), tells us the “amount” of light at that position: i = f(x, y)
  – For now, simplify “amount” to just a quantity, ignoring that light can be different colors
• Continuous quantities
  – Continuous in space
  – Continuous in value
• Computers (and measurement in general) have a hard time with continuous things
• Major issues
  – Limits to how much we can gather
  – Reconstruct the continuous thing from a discrete set of observations
  – Manipulate discrete representations

Measuring on the Image
• Water/rain analogy
  – Put out a set of buckets to catch water
  – Wait over a duration of time
  – Use a shutter to control the amount of time
• Measurement depends on
  – Amount of light
  – Size of the aperture (how much of the light we let through)
  – Duration

Types of “Buckets”
• Film
  – Silver halide crystals change when exposed to light
• Electronic
  – Old analog ways – vidicon tubes: store the charge on a plate, scan the plate to read it (http://www.answers.com/topic/video-camera-tube)
  – New ways: use an MOS transistor as a bucket
• Biological
  – Chemicals (photopigments) store the photon and release it as electricity
  – There isn’t really a shutter

Similarities
• Low light levels are hard
  – Need to get enough photons to measure
  – Small counting errors (noise) are big relative to small measurements
• Tradeoffs on bucket sizes
  – Big buckets are good (lower noise in low light)
  – Lots of buckets are good (sense more places)
  – For a fixed area, there is a tradeoff
  – Especially in digital cameras / video cameras

MOS Transistors
• Metal Oxide Semiconductors
• The semiconductor acts as a measurement “bucket” for electrons
• The metal at the top is a “gate” – it creates an electric field that can connect / disconnect the two sides

CCD Sensors
• CCD = Charge Coupled Device
  – A “bucket brigade” of MOS transistors
  – Use the gates to move charge along
  – Read out “at the edge” – a shift register to transfer images out
• Advantages:
  – Cheap / easy to make large numbers of buckets
  – Uniform
• Blooming

CMOS Sensors
• Disadvantages of CCDs
  – Have to shift things out (slow, lose information)
  – Different process than computer chips
• CMOS (Complementary Metal Oxide Semiconductor)
  – Just like computer chips
  – Put more circuitry around each sensor transistor to amplify / switch signals as needed
  – Use normal “wires” to carry information to where it needs to go
• Downside: space for the circuitry means less space for sensors (smaller buckets = more noise), and not uniform
• Upside: same “technology curve” as computers, so it will get better, faster, cheaper, lower power, …

Digital Camera
• Megapixels = number of buckets
  – 7 or 8 million buckets in a consumer camera
• But…
  – How big are the sensors? Same sensor size with more megapixels = smaller buckets = more noise (unless the sensor technology gets better)
  – How good is the lens? Smaller buckets don’t do you any good if the lens can’t aim the light into the right bucket

Eye
• Pupil – hole in the eye
• Lens
• Iris – round muscle that controls the size of the pupil
• Cornea – clear protective coating – acts as a lens
• Fluid-filled spaces
  – Aqueous humor
  – Vitreous humor
• Rectus Muscles – change the shape of the eyeball to focus
• Optic Nerve – carries information away
• Blind Spot – where the optic nerve is
• Central Fovea

Retina – the “image plane” of the eye
• Only place on the body where you can see blood vessels directly
• Has photoreceptors
  – Cells sensitive to light
• Photopigments
  – Chemicals that change when exposed to light
  – Different photoreceptors have different pigments
  – Different pigments behave differently: sensitivity, color selectivity, regeneration speed
• Types of photoreceptors

Rods
• Photopigment: rhodopsin
  – Breaks into retinene + a protein
  – Must be reassembled before it can work again
• Very sensitive
  – Bright light means it breaks down faster than it is regenerated
  – Less useful in bright light
• Blinded by bright light at night
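The thin-lens relation in the notes, 1/D + 1/I = 1/F, is easy to try numerically. A minimal sketch (the focal length and object distances below are illustrative numbers, not from the notes):

```python
def image_distance(D, F):
    """Solve the thin-lens equation 1/D + 1/I = 1/F for the image distance I.

    D: distance from lens to object, F: focal length (same units).
    """
    if D == F:
        raise ValueError("object at the focal distance images at infinity")
    return 1.0 / (1.0 / F - 1.0 / D)

# A 50 mm lens with objects at various distances (all in mm).
# Note the inverse relationship the notes warn about: farther
# objects image *closer* to the lens.
for D in (100.0, 200.0, 1000.0, 1e9):   # 1e9 mm is effectively "at infinity"
    print(f"D = {D:>12.0f} mm  ->  I = {image_distance(D, 50.0):.2f} mm")
```

Fixing the image-plane distance I then picks out the single object distance D that is exactly in focus; everything else falls within the depth of field only if its circle of confusion stays small enough.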
Recommended publications
  • Invention of Digital Photograph
Digital photography uses cameras containing arrays of electronic photodetectors to capture images focused by a lens, as opposed to an exposure on photographic film. The captured images are digitized and stored as a computer file ready for further digital processing, viewing, electronic publishing, or digital printing. Until the advent of such technology, photographs were made by exposing light sensitive photographic film and paper, which was processed in liquid chemical solutions to develop and stabilize the image. Digital photographs are typically created solely by computer-based photoelectric and mechanical techniques, without wet bath chemical processing. The first consumer digital cameras were marketed in the late 1990s.[1] Professionals gravitated to digital slowly, and were won over when their professional work required using digital files to fulfill the demands of employers and/or clients, for faster turnaround than conventional methods would allow.[2] Starting around 2000, digital cameras were incorporated in cell phones and in the following years, cell phone cameras became widespread, particularly due to their connectivity to social media websites and email. Since 2010, the digital point-and-shoot and DSLR formats have also seen competition from the mirrorless digital camera format, which typically provides better image quality than the point-and-shoot or cell phone formats but comes in a smaller size and shape than the typical DSLR. Many mirrorless cameras accept interchangeable lenses and have advanced features through an electronic viewfinder, which replaces the through-the-lens finder image of the SLR format. While digital photography has only relatively recently become mainstream, the late 20th century saw many small developments leading to its creation.
  • Light Conversion, S/N Characteristics of X-Ray Phosphor Screens
Light conversion, S/N characteristics of x-ray phosphor screens. Item Type: text; Thesis-Reproduction (electronic). Author: Byron Kwai Chinn Lum. Publisher: The University of Arizona. Rights: Copyright © is held by the author. Digital access to this material is made possible by the University Libraries, University of Arizona. Further transmission, reproduction or presentation (such as public display or performance) of protected items is prohibited except with permission of the author. Download date: 28/09/2021 05:29:31. Link to Item: http://hdl.handle.net/10150/557456. LIGHT CONVERSION, S/N CHARACTERISTICS OF X-RAY PHOSPHOR SCREENS, by Byron Kwai Chinn Lum. A Thesis Submitted to the COMMITTEE ON OPTICAL SCIENCES (GRADUATE) In Partial Fulfillment of the Requirements for the Degree of MASTER OF SCIENCE In the Graduate College, THE UNIVERSITY OF ARIZONA, 1980. STATEMENT BY AUTHOR: This thesis has been submitted in partial fulfillment of requirements for an advanced degree at The University of Arizona and is deposited in the University Library to be made available to borrowers under rules of the Library. Brief quotations from this thesis are allowable without special permission, provided that accurate acknowledgment of source is made. Requests for permission for extended quotation from or reproduction of this manuscript in whole or in part may be granted by the head of the major department or the Dean of the Graduate College when in his judgment the proposed use of the material is in the interests of scholarship. In all other instances, however, permission must be obtained from the author.
  • Overview of Camera Systems Used in Beam Instrumentation
Beata Walasek-Höhne, Overview of Video Cameras used in Beam Instrumentation. FAIR GmbH | GSI GmbH. Outline: Taking an Image . Source of light (more details: talk of E. Bravin, „Transverse Profile Measurements“) . Optics (more details: talk of S. Gibson, „Introduction to Optics“) . Image sensors . Analog i. Video Tube . Solid state sensors i. CCD ii. CMOS iii. CID . Radiation hardness . Digitizer (more details: talk of M. Gasior, „Analog Digital Conversion“) . Post processing. Source of light: . light is represented as both a particle (photon) and an electromagnetic wave . photons have a defined energy . energy correlates to wavelength: E = hc/λ . wavelength corresponds to color . the number of photons corresponds to intensity . visible light is a very narrow band in the electromagnetic spectrum. 600 mbar Kr © GSI, www.gsi.de; Ruby-Ceramics (Chromox) screen at LHC for injection and first turn, protons at 450 GeV © CERN, www.cern.ch; YAG:Ce at FLASH © DESY, www.desy.de. Analog Video Cameras: © Pete Simpkin, Marconi vidicon Camera, www.bbceng.info . early 1900s: first experiments in image transmission . in the 1930s, new electronic designs based on a cathode-ray video camera tube, including two versions: the dissector tube (Philo Farnsworth) and the iconoscope (Vladimir Zworykin). Dissector tube © Television News magazine, 1931 . the analog system became the standard in the television industry and remained in wide use until the 1980s. Iconoscope © Radio News magazine, 1945. Analog Video Cameras: Vidicon .
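The photon-energy relation quoted in the excerpt above, E = hc/λ, can be checked numerically. A small sketch in Python (SI constants; the 380–800 nm range is the one the course notes quote for visible light):

```python
H = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8      # speed of light, m/s
EV = 1.602176634e-19  # joules per electron-volt

def photon_energy_ev(wavelength_nm):
    """Energy of a photon of the given wavelength in electron-volts (E = hc/lambda)."""
    return H * C / (wavelength_nm * 1e-9) / EV

# Visible light at roughly 380-800 nm carries about 1.55-3.26 eV per photon.
print(f"{photon_energy_ev(380):.2f} eV")  # violet edge
print(f"{photon_energy_ev(800):.2f} eV")  # near-infrared edge
```

Shorter wavelength means higher photon energy, which is why the sensors surveyed in these excerpts respond differently across the spectrum.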
  • United States Patent [191 [11] 4,095,775 Hotham [45] Jun
United States Patent [19] – [11] 4,095,775 – Hotham – [45] Jun. 20, 1978
[54] PARTICLE EVALUATOR
[76] Inventor: Geoffrey A. Hotham, 1130 Channel Dr., Santa Barbara, Calif. 93108
[21] Appl. No.: 642,915
[22] Filed: Dec. 22, 1975
[51] Int. Cl.²: G01N 21/00; G01N 21/18; G02B 21/34
[52] U.S. Cl.: 356/102; 356/181; 250/574; 350/95
[58] Field of Search: 356/102, 103, 246, 181; 250/574; 350/63, 89, 95
[56] References Cited – U.S. PATENT DOCUMENTS: 3,609,043 9/1971 Simmons et al. 356/102; 3,614,231 10/1971 Sham 356/102; 3,646,352 2/1972 Bol et al. 356/102; 3,720,470 3/1973 Berkham
Primary Examiner: Conrad J. Clark. Attorney, Agent, or Firm: Reed C. Lawlor.
[57] ABSTRACT: The invention is employed for evaluating particles, such as droplets of aerosol sprays suspended in a gaseous medium. A stream of the particles to be analyzed is flowed into a narrow sample zone across the optical axis of a beam of light. Scattered radiation from the particles is focused on the image plane in which a photosensitive surface is located. Images of particles formed there are reproduced on a fluorescent or phosphorescent display screen of a cathode ray tube. A scale on the screen is employed for measuring the dimensions of the particles. The confinement of the stream of the mixture to about the depth of focus of the objective lens of the camera reduces effects of out-of-focus particles and permits evaluation of individual particles.
  • Camera Sensors
Welcome: Processing Digital Camera Images – Camera Sensors, Michael Thomas (TU Berlin, 2010; Processing Digital Camera Images, WS 2010/2011, Alexa/Eitz). Overview: Many image sensors: infrared, gamma ray, x-rays etc. Focus on sensors for visible light (slightly into infrared and UV light). The beginnings: First video camera tube sensors in the 1930s. Cathode Ray Tube (CRT) sensor. Vidicon and Plumbicon for TV broadcasting in the 1950s – 1980s. Vidicon sensors on the Galileo spacecraft to Jupiter in the 1980s. The Photoelectric Effect: How to convert light to electric charge? Inner photoelectric effect at a photodiode: a photon excites an electron, creating a free electron and a hole. The hole moves towards the anode, the electron towards the cathode. Now we have our charge! Charge-Coupled Device (CCD): Integrated circuit. Array of connected capacitors (shift register). The charge of a capacitor is transferred to the neighbouring capacitor. At the end of the chain, the charge is converted into voltage by a charge amplifier. Transfer stepped by a clock signal. Serial charge processing. CCD-Sensor: Each capacitor is coupled with a photodiode. All capacitors are charged in parallel. Charges are transferred serially.
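The serial "shift register" readout that this excerpt (and the CCD section of the course notes) describes can be mimicked in a few lines. A toy sketch of a single CCD row; real devices move analog charge packets under clocked gates, not Python numbers:

```python
def read_out_row(charges):
    """Toy model of one CCD row: a 'bucket brigade' shifting charge to the edge amplifier."""
    row = list(charges)
    readout = []
    for _ in range(len(row)):
        readout.append(row[-1])   # the edge cell dumps its packet into the amplifier
        row = [0] + row[:-1]      # every remaining packet shifts one cell toward the edge
    return readout

print(read_out_row([10, 20, 30]))  # the edge pixel arrives first: [30, 20, 10]
```

This is why CCD readout is serial and comparatively slow: every packet must march through its neighbours' cells to reach the single output amplifier, which is the disadvantage the course notes give for CCDs versus CMOS sensors.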
• Radio Wave Engineering
    Fall Semester, 2011 UEC Exchange Program Index 1. Communication Systems 2 2. Video and Image Technologies for TV 3 3. Quality and Reliability Engineering; the Japanese Way 5 4. GO -- Playing and Computing 6 5. VLSI Devices and Technology 7 6. Fundamental Concepts of Discrete-time Signal Processing 8 7. Computer Algorithms 10 8. Interactive Computer Graphics 12 9. Visual Media Design 13 10. Experimental Electronics Laboratory 14 11. Computational Complexity 15 12. Radio Wave Engineering 16 13. Communication Systems Laboratory 17 14. Terrestrial Electromagnetic Environment 18 1 1. Communication Systems Lecturer Professor Noboru TOYAMA Course Description This course must be taken concurrently with the course "Communication Systems Laboratory." First two classes will be review sessions that concentrate efforts on familiarizing students with the basic mathematical knowledge including the subjects listed in the prerequisites. Students who do not have confidence in those items are requested to make extra efforts to catch up with other students during the first two classes. This course together with “Communication System Laboratory” discusses in depth how digital and analog communication systems work. The basic tools used here are waveform analyses. Topics covered in this course are, signal analysis, the Fourier spectrum, the autocorrelation function, power spectrum, line coding, inter-symbol interference, roll-off filters, the discrete Fourier transform, the Hilbert transform, and various types of modulation. Some experiments in threshold effects in the presence of noise are included. From the first chapter up to chapter 7 of the textbook will be covered during the course hours. The remaining chapters will be covered in the course given in the spring semester.
  • Chapter 1 Introduction to Nondestructive Testing 1.1 I
HANDBOOK OF NONDESTRUCTIVE EVALUATION. Charles J. Hellier. McGraw-Hill: New York Chicago San Francisco Lisbon London Madrid Mexico City Milan New Delhi San Juan Seoul Singapore Sydney Toronto. Copyright © 2003 by The McGraw-Hill Companies, Inc. All rights reserved. Manufactured in the United States of America. Except as permitted under the United States Copyright Act of 1976, no part of this publication may be reproduced or distributed in any form or by any means, or stored in a database or retrieval system, without the prior written permission of the publisher. 0-07-139947-X. The material in this eBook also appears in the print version of this title: 0-07-028121-1. All trademarks are trademarks of their respective owners. Rather than put a trademark symbol after every occurrence of a trademarked name, we use names in an editorial fashion only, and to the benefit of the trademark owner, with no intention of infringement of the trademark. Where such designations appear in this book, they have been printed with initial caps. McGraw-Hill eBooks are available at special quantity discounts to use as premiums and sales promotions, or for use in corporate training programs. For more information, please contact George Hoare, Special Sales, at george_hoare@mcgraw-hill.com or (212) 904-4069. TERMS OF USE: This is a copyrighted work and The McGraw-Hill Companies, Inc.
  • Appendixes I Spatial Filtration for Video Line Removal
Appendixes I: Spatial Filtration for Video Line Removal. GORDON W. ELLIS. The growing popularity in microscopy of video recording and image-processing techniques presents users with a problem that is inherent in the method: horizontal scan lines. These lines on the monitor screen can be an obtrusive distraction in photographs of the final video image. Care and understanding in making the original photograph can minimize the contrast of these lines. Two simple, but essential, rules for photography of video images are: (1) Use exposures that are multiples of the video frame time (1/30 sec in the USA). An exposure time less than this value will not record a completely interlaced image.* (2) Adjust the V HOLD control on the monitor so that the two fields that make up the frame are evenly interlaced. Alternate scan lines should be centered with respect to their neighbors (a magnifier is helpful here). Following these rules will often result in pictures in which the scan lines are acceptably unobtrusive without recourse to further processing. If the subject matter is such that the remaining line contrast is disturbing, Inoue (1981b) has described a simple technique that can often yield satisfactory results using a Ronchi grating. However, on occasion, when important image details are near the dimensions of the scan lines, the slight loss in vertical resolution resulting from this diffraction-smoothing method may make it worth the effort to remove the lines by spatial filtration. The technique of spatial filtration, pioneered by Marechal, is described in many current optics texts. A good practical discussion of these techniques is found in Shulman (1970).
  • US4523226.Pdf
United States Patent [19] – [11] Patent Number: 4,523,226 – Lipton et al. – [45] Date of Patent: Jun. 11, 1985
[54] STEREOSCOPIC TELEVISION SYSTEM
[75] Inventors: Lenny Lipton, Pt. Richmond; Michael R. Starks, Fairfax; James D. Stewart, Oakland; Lawrence D. Meyer, Ross, all of Calif.
[73] Assignee: Stereographics Corporation, San Rafael, Calif.
[21] Appl. No.: 459,174
[22] Filed: Jan. 19, 1983
[63] Continuation-in-part of Ser. No. 343,124, Jan. 27, 1982, abandoned.
[51] Int. Cl.: H04N 9/54
[52] U.S. Cl.: 358/88; 358/91; 358/92
[58] Field of Search: 358/88, 91, 92, 97, 146, 237, 238, 239, 901; 352/60
[56] References Cited – U.S. PATENT DOCUMENTS: 2,099,694 11/1937 Land; 3,818,125 6/1974 Butterfield 358/92; 4,152,724 5/1979 Hunter 358/901
FOREIGN PATENT DOCUMENTS: 34046 7/1956 Switzerland 352/60
OTHER PUBLICATIONS: Masters, “Generating a 3D-TV Signal With One TV Camera”, IBM Tech. Disclosure Bulletin, vol. 8, #1, Jun. 1965, p. 134. Kennedy, “Weather Satellite Picture Processor”, Wireless World, May 1980, vol. 86, No. 1533, pp. 41-46. Norling, “The Stereoscopic Art – A Reprint”, Jour. of SMPTE, vol. 60, #53, pp. 268-307.
Primary Examiner: Joseph A. Orsino, Jr. Attorney, Agent, or Firm: J. William Wigert, Jr.
[57] ABSTRACT: An improved stereoscopic television system is disclosed, having a great deal of compatibility with the existing commercial television infrastructure. Flicker is eliminated while preserving the existing bandwidth, allowing the stereoscopic video signals to be handled with conventional apparatus such as video tape recorders.
  • Doctoral Thesis
LUMINESCENCE-BASED CHARACTERIZATION OF CRYSTALLINE SILICON SOLAR CELLS. Dissertation approved by the Faculty of Mathematics and Physics of the Gottfried Wilhelm Leibniz Universität Hannover for the degree of Doktor der Naturwissenschaften (Dr. rer. nat.), by Dipl.-Phys. David Hinken, born 27.11.1981 in Hannover, 2012. Referent: Prof. Dr. Rolf Brendel. Korreferent: Prof. Dr. Jörg Osten. Tag der Promotion: 03.05.2012. Abstract: Camera-based luminescence imaging for the characterization of silicon solar cells was introduced into photovoltaics in 2005 by a Japanese research group. For this characterization technique, excess charge carriers in the solar cell bulk are either injected using a current source or optically generated by illumination. Luminescence imaging is fast and the resulting images contain abundant information about the solar cell device because the detected luminescence signal depends on properties like the charge carrier recombination, the optical confinement as well as on the series- and shunt resistance. However, at that time, measured luminescence images were interpreted qualitatively only and it was not demonstrated how luminescence imaging can be used to extract quantitative images of a specific solar cell parameter. Therefore, this thesis presents a comprehensive description and understanding of luminescence images of wafer-based crystalline silicon solar cells. It aims at the determination of local solar cell parameters and their relation to the global solar cell energy conversion efficiency.
  • Dictionary of Video and Television Technology Newnes Is an Imprint of Elsevier Science
Dictionary of Video and Television Technology. Keith Jack, Vladimir Tsatsulin. Newnes is an imprint of Elsevier Science. Copyright © 2002, Elsevier Science (USA). All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photocopying, recording, or otherwise, without the prior written permission of the publisher. Recognizing the importance of preserving what has been written, Elsevier Science prints its books on acid-free paper whenever possible. Library of Congress Cataloging-in-Publication Data. ISBN: 1-878707-99-X. British Library Cataloguing-in-Publication Data: A catalogue record for this book is available from the British Library. The publisher offers special discounts on bulk orders of this book. For information, please contact: Manager of Special Sales, Elsevier Science, 225 Wildwood Avenue, Woburn, MA 01801-2041; Tel: 781-904-2500; Fax: 781-904-2620. For information on all Newnes publications available, contact our World Wide Web home page at: http://www.newnespress.com. Printed in the United States of America. An imprint of Elsevier Science: Amsterdam Boston London New York Oxford Paris San Diego San Francisco Singapore Sydney Tokyo. CONTENTS: Preface … vii. About the Authors …
  • Pi Camera! By: Juan Medina
Image Sensors: Pi Camera! By: Juan Medina. Outline: ● Digital Cameras ○ History (Mariner) and connectivity/social impact ● What is an image sensor? ○ Types ○ Color separation and performance metrics ● Pi Camera ○ purpose, price, datasheet, examples! ● Conclusions. Digital Cameras: History and Social Impact. On July 15 of 1965, the Mariner 4 spacecraft obtained the first flyby digital image (Figure 1: https://en.wikipedia.org/wiki/Mariner_4) of Mars with a camera system designed by the NASA Jet Propulsion Laboratory. That system utilized a video camera tube followed by a digitizer (somewhat distant from what current digital cameras are) to record 21 images on tape that were then transmitted to Earth. By the way, that transmission took about 20 days to complete. In 1975, Steven Sasson, an engineer at Kodak, built the first digital camera using a newly developed CCD (explained later) image sensor sold by Fairchild Semiconductor in 1973. That camera weighed 3.6 kg, had 0.01 megapixels, and took 23 seconds to capture its first image (Figure 2: http://giantsofscience.weebly.com/steven-sasson.html). Nowadays, digital cameras are a fundamental component of our lives. A tremendous amount of digital visual content is created every day (Figure 3: http://rocketpost.com/blog/instagram-direct-how-to-use-four-tips/). From our personal life up to academic and professional environments, digital cameras have changed our lives dramatically; image sensors are everywhere: phones, PCs, security, robots, scanners, printers, etc. What’s an image sensor? An image sensor transforms the attenuation of electromagnetic radiation, as it passes through or is reflected off objects, into small currents that convey the information.