Microoptical Multi Aperture Imaging Systems Dissertation


Microoptical Multi Aperture Imaging Systems

Dissertation submitted for the academic degree of doctor rerum naturalium (Dr. rer. nat.) to the Council of the Faculty of Physics and Astronomy of the Friedrich-Schiller-Universität Jena by Diplomphysiker Andreas Brückner, born on 9 August 1980 in Magdeburg.

Reviewers:
1. Prof. Dr. rer. nat. habil. Andreas Tünnermann, Friedrich-Schiller-Universität Jena
2. Prof. Dr. rer. nat. habil. Stefan Sinzinger, Technische Universität Ilmenau
3. Prof. Dr. Hans Peter Herzig, École Polytechnique Fédérale de Lausanne, Switzerland

Date of the defense: 22 December 2011

Contents

1 Introduction
2 Fundamentals
   2.1 Applications of Miniaturized Imaging Systems
   2.2 Introduction to Digital Vision Systems
      2.2.1 Objective Lens
      2.2.2 Image Sensor
      2.2.3 Image Processing
   2.3 Scaling Limits of Single Aperture Imaging Optics
3 Multi Aperture Imaging Systems
   3.1 Natural Archetypes and State of the Art Technical Solutions
      3.1.1 Apposition Compound Eyes
      3.1.2 Superposition Compound Eyes
      3.1.3 Compound Eyes of Strepsiptera
   3.2 Classification of Multi Aperture Imaging Systems
      3.2.1 Basic Parameters of Multi Aperture Imaging Optics (MAO)
      3.2.2 Concepts for Field of View Segmentation
      3.2.3 Multi Aperture Super-Resolution
      3.2.4 Reasons for a Thickness Reduction in MAO
   3.3 Methodology for the Optical Design and Simulation of MAO
      3.3.1 Paraxial Model
      3.3.2 Paraxial Matrix Formalism
      3.3.3 Analytical Modeling
      3.3.4 Numerical Simulation
4 Artificial Compound Eyes with Electronic Stitching of Segments
   4.1 Artificial Neural Superposition Eye (ANSE)
      4.1.1 Increased Information Capacity of APCOs
      4.1.2 Fabrication and Integration
      4.1.3 Demonstration of Increased Sensitivity and Color Imaging
      4.1.4 Contrast Enhancement by Image Deconvolution
   4.2 Electronic Cluster Eye (eCLEY)
      4.2.1 Optical Design for Thin Wafer-Level Camera Optics
      4.2.2 Fabrication of eCLEY with VGA Resolution
      4.2.3 Software Distortion Correction and Image Stitching
      4.2.4 Experimental Characterization of eCLEY VGA
      4.2.5 Improvements of the eCLEY
   4.3 Summary of Electronic Stitching of Segments
5 Artificial Compound Eyes with Optical Stitching of Segments
   5.1 Optical Cluster Eye (oCLEY)
      5.1.1 Design of an Array of Micro-Telescopes on Smallest Format
      5.1.2 Prototype Fabrication
      5.1.3 Characterization of oCLEY
   5.2 Ultra-Thin Array Microscope
      5.2.1 High Resolution Imaging of Large Object Fields
      5.2.2 Fabrication of a System with Adaptable Lateral Format
      5.2.3 Experimental Characterization
   5.3 Microoptical Gabor Superlens (μoGSL)
      5.3.1 Approach for an Ultra-Compact Objective with High Sensitivity
      5.3.2 Demonstration of a Prototype System
   5.4 Summary of Optical Stitching of Segments
6 Conclusions and Outlook
Bibliography
Appendix
Abbreviations and Symbols
Acknowledgments
Zusammenfassung (German Summary)
Ehrenwörtliche Erklärung (Declaration of Honor)
Lebenslauf (Curriculum Vitae)
Wissenschaftliche Veröffentlichungen (Scientific Publications)
Mankind does not exist for science's sake, but science should be used to improve the conditions of mankind.
Jozef Maximilián Petzval (1807-1891)

1 Introduction

Vision is by far the most important sense of human beings. It enables exact orientation in the 3D environment, reaches up to several kilometers in distance, and has always served as a medium for human memories. The majority of man-made optical imaging systems follows the design principles of single aperture optics, which is the basic architecture of all known mammalian eyes [1]. Single aperture refers to the fact that all light recorded in the image passes through a single clear aperture of the optical system. Even the ancestor of the photographic camera, the 'Camera obscura', which had no lens at all, belongs to this type [2, 3]. In the second half of the 19th century, the manufacturing of optical glasses of constant quality enabled the spread of matured optical systems such as telescopes, microscopes, and photographic objectives into scientific, medical, and consumer applications [4]. About one hundred years later, the fabrication of opto-electronic image sensors by photolithographic techniques known from micro-electronics engineering, together with the development of precise molding of plastic lenses from high grade optical polymers, paved the way for the miniaturization of imaging systems [5]. Shrinking pixel sizes increased the resolution in image space while reducing the image sensor size, and the fabrication of highly precise aspherical lenses led to compact and cheap optical imaging systems. Hence, digital cameras are now an integral part of various electronic products such as laptops, tablet PCs, and mobile phones, but also of automotive safety systems, video endoscopes, and point-of-care diagnostics, to name only a few.
At present, miniaturized digital imaging systems are rarely smaller than 4 mm² in footprint with a thickness of 3 mm at 1.75 μm pixel pitch. Further downscaling yields pixel sizes below the diffraction limit, and thus reduced spatial resolution as well as increased noise. Although this might be tolerable in consumer digital cameras, the effect on the performance of subsequent image processing and analysis is not acceptable in machine vision, industrial inspection, and medical imaging. Additionally, technological limits apply, such as tight mechanical tolerances which severely increase the costs of a hybrid integration of small lenses. Lower f-numbers are hard to achieve, so the diffraction limit cannot be shifted, leading to insufficient performance with standard fabrication techniques [6, 7]. For a further miniaturization of optical imaging systems, it is worth looking at some of the fascinating approaches which have been present in the tiniest creatures in nature for millions of years. The evolutionary solution of choice is found in the vision systems of invertebrates - the compound eyes. Here, a large number of extremely small vision systems (called ommatidia) on a curved base capture the visual information of a large field of view (FOV) in parallel, at the sacrifice of spatial resolution. Each ommatidium itself has a small diameter and a low information capacity compared to the single aperture eye. However, due to the large number of channels, a high information capacity of the overall multi aperture objective can be achieved. Since the dawn of digital image sensors, several technical derivatives of compound eyes have been realized in order to miniaturize man-made vision systems [8-14].
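The diffraction argument above can be checked with quick arithmetic. The following is a minimal sketch, not taken from the dissertation; the wavelength (green light, 550 nm) and f-number (f/2.8, typical for compact camera modules) are assumed values, and the diffraction-limited spot is taken as the Airy disk diameter 2.44·λ·N:

```python
# Illustrative check: is a 1.75 um pixel already below the diffraction limit?
# Assumptions (not from the text): 550 nm light, f/2.8 optics.

wavelength_um = 0.55   # green light, in micrometers
f_number = 2.8         # assumed f-number of the lens

# Airy disk diameter for a circular aperture: 2.44 * lambda * N
airy_diameter_um = 2.44 * wavelength_um * f_number
print(f"Airy disk diameter: {airy_diameter_um:.2f} um")  # ~3.76 um

pixel_pitch_um = 1.75  # pitch quoted in the text
print("Pixel smaller than Airy disk:", pixel_pitch_um < airy_diameter_um)
```

Under these assumptions the Airy disk spans roughly two pixels, which is why shrinking pixels further yields no additional resolved detail unless the f-number is lowered as well.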
However, the major challenge for a technical adoption of natural compound eyes is the required precision of fabrication and assembly, and none of these attempts led to a successful technology because macroscopic fabrication methods were exploited for the manufacturing of microoptical structures. First examples of a realization by well adapted microoptical technologies provided the thinnest known optical sensors, which were promising even though the achieved resolution was not yet appropriate for technical applications [15-20]. The aim of this thesis is the investigation of the prospects of multi aperture imaging optics (MAOs) in a combination of opto-electronics, optics, and image processing. To this end, promising design rules of natural compound eyes are studied and adopted for technical solutions. In contrast to prior work, where the size advantage is often obtained at the sacrifice of resolution, here the highest degree of miniaturization should be achieved while maintaining optical system parameters (such as resolution and field of view) which are relevant for industrial applications. General principles of insect vision, namely (1) simplification of the design, (2) spatial segmentation of the field of view, (3) spectral segmentation, and (4) overlap of the FOVs of adjacent optical channels, are applied to yield new approaches to miniaturized vision systems with high resolution which overcome the basic scaling limits of single aperture imaging optics. A classification is provided that distinguishes the different state-of-the-art and novel approaches with respect to their working principle and application potential. The scaling behavior as well as new and faster methods for the simulation and optimization of MAOs are examined. The discussion is focused on optical systems which can be fabricated in parallel by highly precise microoptics technology based on wafer-level techniques.
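The thickness advantage behind principle (2), spatial segmentation of the field of view combined with overlapping channels (4), can be illustrated with a rough paraxial estimate. This is a hypothetical sketch with assumed numbers, not the design procedure of the thesis: when the sampling grids of adjacent channels are offset against each other by a fraction of a pixel, N interleaved channels sample the scene N times finer than one channel alone, so a target angular sampling can be reached with a focal length (and hence a module thickness) roughly N times shorter:

```python
# Rough paraxial estimate (all values assumed for illustration):
# a pixel of pitch p behind a lens of focal length f subtends an
# instantaneous field of view IFOV = p / f. With N channels whose
# sampling grids are interleaved, the effective pitch becomes p / N.

pixel_pitch_um = 3.2      # assumed pixel pitch
target_ifov_mrad = 1.0    # assumed target angular sampling
n_interleaved = 4         # assumed number of interleaved channels

# Single aperture: f = p / IFOV (convert um and mrad to mm and rad)
f_single_mm = (pixel_pitch_um / 1000) / (target_ifov_mrad / 1000)

# Multi aperture with interleaved sampling: same IFOV at f / N
f_multi_mm = f_single_mm / n_interleaved

print(f"single aperture focal length: {f_single_mm:.2f} mm")
print(f"multi aperture focal length : {f_multi_mm:.2f} mm")
```

With these assumed values the stack height drops from about 3.2 mm to about 0.8 mm, which is the kind of thickness reduction the electronically stitched systems of Chapter 4 aim for; the price is the image processing needed to stitch and interleave the partial images.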
The analysis includes new methods of characterization in order to investigate the potential and differences of the various demonstrated approaches. Chapter 2 starts out with an overview of applications of small single aperture digital imaging systems, followed by an introduction to the basic properties of digital vision systems.