White Paper3 V5
Total Pages: 16
File Type: PDF; Size: 1020 KB
Recommended publications
Do Something Important.™
2.6X zoom in a cool, compact, capsule design with Advanced Photo System convenience. Do something important.™ www.minolta.com

It's All Within Your Grasp. The VECTIS 260 makes picture-taking more fun for the whole family. The Advanced Photo System means easier operation and higher quality pictures, and a powerful 2.6X zoom expands your photo possibilities. Various automatic functions make it simple for everyone to take great pictures. It's the compact camera that gives you more.

Get closer with 2.6X zoom! A high-quality zoom lens adds variety to your photos. Powerful 2.6X Zoom: a zoom lens this powerful gives you much greater versatility in framing your shots. You can take broad shots of the scenery, then zoom in for close-ups of the kids. The zoom range is a wide 25–65 mm (equivalent to 31–81 mm in 35mm format). For close-ups, you can move in as close as 1.64 ft. to your subject at any focal length, without having to set a special mode. Best of all, your photo will be sharp and clear, thanks to the Minolta high-quality 4-element, 4-group zoom lens. (Sample photos: 25mm WIDE, 65mm TELE.)

Greater ease and convenience: the Advanced Photo System makes photography more enjoyable, from start to finish! Index Prints for At-a-Glance Selection: with photos this good, you'll want to make reprints ... and now you can easily select the ones you want. With your pictures you'll receive an Index Print sheet, making it easy ... Drop-In Loading and Film Chamber Lock: the ultimate in film loading ease.
Invention of Digital Photograph
Digital photography uses cameras containing arrays of electronic photodetectors to capture images focused by a lens, as opposed to an exposure on photographic film. The captured images are digitized and stored as a computer file ready for further digital processing, viewing, electronic publishing, or digital printing. Until the advent of such technology, photographs were made by exposing light-sensitive photographic film and paper, which was processed in liquid chemical solutions to develop and stabilize the image. Digital photographs are typically created solely by computer-based photoelectric and mechanical techniques, without wet bath chemical processing.

The first consumer digital cameras were marketed in the late 1990s.[1] Professionals gravitated to digital slowly, and were won over when their professional work required using digital files to fulfill the demands of employers and/or clients, for faster turnaround than conventional methods would allow.[2] Starting around 2000, digital cameras were incorporated in cell phones, and in the following years cell phone cameras became widespread, particularly due to their connectivity to social media websites and email.

Since 2010, the digital point-and-shoot and DSLR formats have also seen competition from the mirrorless digital camera format, which typically provides better image quality than the point-and-shoot or cell phone formats but comes in a smaller size and shape than the typical DSLR. Many mirrorless cameras accept interchangeable lenses and have advanced features through an electronic viewfinder, which replaces the through-the-lens finder image of the SLR format. While digital photography has only relatively recently become mainstream, the late 20th century saw many small developments leading to its creation.
Image Sensors and Image Quality in Mobile Phones
Juha Alakarhu, Nokia, Technology Platforms, Camera Entity, P.O. Box 1000, FI-33721 Tampere, [email protected] (+358 50 4860226)

Abstract: This paper considers image sensors and image quality in camera phones. A method to estimate the image quality and performance of an image sensor using its key parameters is presented. Subjective image quality, and the mapping of camera technical parameters to subjective image quality, are discussed. The developed performance metrics are used to optimize sensor performance for the best possible image quality in camera phones. Finally, future technology and technology trends are discussed. The main development trend for image sensors is gradually changing from pixel size reduction to performance improvement within the same pixel size. About 30% performance improvement is observed between generations if the pixel size is kept the same. The image sensor is also the key component for offering new features to the user in the future.

2. Performance metric: A reliable sensor performance metric is needed in order to have a meaningful discussion of different sensor options, future sensor performance, and the effects of different technologies. Comparison of individual technical parameters does not provide a general view of image quality and can be misleading. A more comprehensive performance metric based on low-level technical parameters is developed as follows. First, a conversion from photometric units to radiometric units is needed. Irradiation E [W/m²] is calculated as follows using the illuminance E_v [lux], the energy spectrum of the illuminant s(λ) (any unit), and the standard luminosity function V(λ).

Equation 1 (radiometric unit conversion):
E = E_v \cdot \frac{\int s(\lambda)\, d\lambda}{683\,\mathrm{lm/W} \cdot \int s(\lambda)\, V(\lambda)\, d\lambda}
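As a quick numerical illustration of Equation 1, the sketch below converts an illuminance value to irradiance given sampled arrays for the illuminant spectrum s(λ) and the luminosity function V(λ). It is not taken from the paper; the flat spectrum and the Gaussian stand-in for V(λ) are placeholder data chosen only to make the example runnable.

```python
import numpy as np

def illuminance_to_irradiance(e_v_lux, wavelengths, s, v):
    """Equation 1: irradiance E [W/m^2] from illuminance E_v [lux].

    wavelengths : sample wavelengths on a uniform grid
    s           : energy spectrum of the illuminant (any unit)
    v           : standard luminosity function V(lambda) on the same grid
    """
    d_lambda = wavelengths[1] - wavelengths[0]           # uniform grid assumed
    numerator = np.sum(s) * d_lambda                     # integral of s(lambda)
    denominator = 683.0 * np.sum(s * v) * d_lambda       # 683 lm/W * integral of s(lambda) V(lambda)
    return e_v_lux * numerator / denominator

# Placeholder data: equal-energy spectrum over the visible band and a rough
# Gaussian approximation of the photopic luminosity function (peak at 555 nm).
wl = np.arange(380.0, 781.0, 5.0)
s = np.ones_like(wl)
v = np.exp(-0.5 * ((wl - 555.0) / 45.0) ** 2)

print(illuminance_to_irradiance(100.0, wl, s, v))        # E in W/m^2 for 100 lux
```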
A Digital Astrophotography Primer - OR - This Is NOT Your Daddy’S SLR!
Table of Contents:
- A Digital Astrophotography Primer
- Table of Contents
- Introduction
- What is an SLR, anyways?
- SLR, DSLR, What's the Difference?
- The Viewfinder
- The Focus Mechanism
- The Capture Medium
A High Full Well Capacity CMOS Image Sensor for Space Applications
Woo-Tae Kim 1, Cheonwi Park 1, Hyunkeun Lee 1, Ilseop Lee 2 and Byung-Geun Lee 1,*
1 School of Electrical Engineering and Computer Science, Gwangju Institute of Science and Technology, Gwangju 61005, Korea; [email protected] (W.-T.K.); [email protected] (C.P.); [email protected] (H.L.)
2 Korea Aerospace Research Institute, Daejeon 34133, Korea; [email protected]
* Correspondence: [email protected]; Tel.: +82-62-715-3231
Received: 24 January 2019; Accepted: 26 March 2019; Published: 28 March 2019

Abstract: This paper presents a high full well capacity (FWC) CMOS image sensor (CIS) for space applications. The proposed pixel design effectively increases the FWC without inducing overflow of photo-generated charge in a limited pixel area. An MOS capacitor is integrated in each pixel, and the charge accumulated in the photodiode is transferred to the in-pixel capacitor multiple times, depending on the maximum incident light intensity. In addition, the modulation transfer function (MTF) and the radiation damage effect on the pixel, which are especially important for space applications, are studied and analyzed through fabrication of the CIS. The CIS was fabricated using a 0.11 µm 1-poly 4-metal CIS process to demonstrate the proposed techniques and pixel design. A measured FWC of 103,448 electrons and an MTF improvement of 300% are achieved with a 6.5 µm pixel pitch.

Keywords: CMOS image sensors; wide dynamic range; multiple charge transfer; space applications; radiation damage effects

1. Introduction
Imaging devices are essential components in the space environment for a range of applications including earth observation, star trackers on satellites, and lander and rover cameras [1].
Lecture Notes 3: Charge-Coupled Devices (CCDs) – Part II
• CCD array architectures and pixel layout
  ◦ One-dimensional CCD array
  ◦ Two-dimensional CCD array
• Smear
• Readout circuits
• Anti-blooming, electronic shuttering, charge reset operation
• Window of interest, pixel binning
• Pinned photodiode

One-Dimensional (Linear) CCD Operation
[Figure: linear CCD block diagram, after A. Theuwissen, "Solid State Imaging with Charge-Coupled Devices," Kluwer (1995)]
• A line of photodiodes or photogates is used for photodetection.
• After integration, charge from the entire row is transferred in parallel to the horizontal CCD (HCCD) through transfer gates.
• A new integration period begins while the charge packets are transferred through the HCCD (serial transfer) to the output readout circuit (to be discussed later).
• The scene can be mechanically scanned at a speed commensurate with the pixel size in the vertical direction to obtain 2D imaging.
• Applications: scanners, scan-and-print photocopiers, fax machines, barcode readers, silver halide film digitization, DNA sequencing.
• Advantages: low cost (small chip size).

Two-Dimensional (Area) CCD
• Frame transfer CCD (FT-CCD)
  ◦ Full-frame CCD
• Interline transfer CCD (IL-CCD)
• Frame-interline transfer CCD (FIT-CCD)
• Time-delay-and-integration CCD (TDI-CCD)

Frame Transfer CCD
[Figure: frame transfer CCD layout (light-sensitive CCD array, frame-store CCD array, horizontal CCD, output amplifier) and operation timeline (integration, vertical shift, horizontal shift)]

Pixel Layout – FT-CCD
D. N. Nichols, W. Chang, B. C. Burkey, E. G. Stevens, E. A. Trabka, D.
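The linear-CCD bullets above describe a two-stage readout: a parallel transfer of the whole row into the horizontal CCD, followed by a serial shift of the charge packets to the output amplifier. The toy model below is only a conceptual sketch of that ordering (no noise, no transfer loss), not a device model from the lecture notes.

```python
def linear_ccd_readout(pixel_charges):
    """Conceptual model of one readout cycle of a one-dimensional CCD.

    pixel_charges: charge collected in each photodiode; index 0 is the
    pixel closest to the output amplifier.
    Returns the samples in the order they appear at the output.
    """
    # Parallel transfer: the whole row moves through the transfer gates
    # into the horizontal CCD (HCCD) in a single step.
    hccd = list(pixel_charges)

    # Serial transfer: packets are shifted stage by stage toward the
    # output amplifier while the photodiodes begin a new integration.
    output = []
    while hccd:
        output.append(hccd.pop(0))  # packet nearest the output node is read first
    return output

row = [120, 340, 80, 500]           # arbitrary charge packets (electrons)
print(linear_ccd_readout(row))      # [120, 340, 80, 500]
```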
Digital Vs 35-Mm Photography to Convert Or Not to Convert?
ORIGINAL ARTICLE
Mimi S. Kokoska, MD; John W. Currens, MD; Chris S. Hollenbeak, MA; J. Regan Thomas, MD; Brendan C. Stack, Jr, MD

Objectives: To compare the quality of images generated from a conventional 35-mm camera with those generated from various digital cameras, and to note the costs of the cameras and ease of use.

Design: A prospective, randomized, independent analysis of specific facial images taken with a 35-mm camera and 3 digital cameras by 3 facial plastic surgeons who were blinded to camera type.

Setting: An academic medical center.

Subjects: Thirteen volunteer subjects ranging from age 27 to 58 years.

Main Outcome Measures: The overall quality, focus, distortion, trueness of color, resolution, contrast, and presence of shadows were evaluated for each image.

Results: The null hypothesis that there is no difference between photographs taken with the various cameras was rejected (P<.001) for each of the image attributes. The images produced by the 35-mm camera (Nikon 6006) had the best overall quality, followed by the Olympus D600L, Kodak DCS 315, and Olympus D320L digital cameras. Differences in individual attributes between several of the cameras in each category were statistically significant (P<.05).

Conclusions: The 35-mm camera produced the best overall image quality and ranked first for each of the individual attributes analyzed in this study. The Olympus D600L digital camera placed second in overall quality, but there was no statistically significant difference in focus, distortion, and resolution compared with the images generated by the 35-mm camera. The Olympus D600L digital camera also ranked second in color, con-
CMOS Active Pixel Image Sensors for Highly Integrated Imaging Systems
IEEE Journal of Solid-State Circuits, Vol. 32, No. 2, February 1997, p. 187
Sunetra K. Mendis, Member, IEEE, Sabrina E. Kemeny, Member, IEEE, Russell C. Gee, Member, IEEE, Bedabrata Pain, Member, IEEE, Craig O. Staller, Quiesup Kim, Member, IEEE, and Eric R. Fossum, Senior Member, IEEE

Abstract: A family of CMOS-based active pixel image sensors (APS's) that are inherently compatible with the integration of on-chip signal processing circuitry is reported. The image sensors were fabricated using commercially available 2-µm CMOS processes and both p-well and n-well implementations were explored. The arrays feature random access, 5-V operation and transistor-transistor logic (TTL) compatible control signals. Methods of on-chip suppression of fixed pattern noise to less than 0.1% saturation are demonstrated. The baseline design achieved a pixel size of 40 µm × 40 µm with 26% fill-factor. Array sizes of 28 × 28 elements and 128 × 128 elements have been fabricated and characterized. Typical output conversion gain is 3.7 µV/e− for the

Charge-coupled devices (CCD's) are currently the dominant technology for image sensors. CCD arrays with high fill-factor, small pixel sizes, and large formats have been achieved and some signal processing operations have been demonstrated with charge-domain circuits [1]–[3]. However, CCD's cannot be easily integrated with CMOS circuits due to additional fabrication complexity and increased cost. Also, CCD's are high capacitance devices so that on-chip CMOS drive electronics would dissipate prohibitively high power levels for large area arrays (2–3 W).
Image Quality Assessment Through FSIM, SSIM, MSE and PSNR—A Comparative Study
Journal of Computer and Communications, 2019, 7, 8-18. http://www.scirp.org/journal/jcc ISSN Online: 2327-5227, ISSN Print: 2327-5219
Umme Sara 1, Morium Akter 2, Mohammad Shorif Uddin 2
1 National Institute of Textile Engineering and Research, Dhaka, Bangladesh
2 Department of Computer Science and Engineering, Jahangirnagar University, Dhaka, Bangladesh
How to cite this paper: Sara, U., Akter, M. and Uddin, M.S. (2019) Image Quality Assessment through FSIM, SSIM, MSE and PSNR—A Comparative Study. Journal of Computer and Communications, 7, 8-18. https://doi.org/10.4236/jcc.2019.73002
Received: January 30, 2019; Accepted: March 1, 2019; Published: March 4, 2019
Copyright © 2019 by author(s) and Scientific Research Publishing Inc. This work is licensed under the Creative Commons Attribution International License (CC BY 4.0).

Abstract: Quality is a very important parameter for all objects and their functionalities. In image-based object recognition, image quality is a prime criterion. For authentic image quality evaluation, ground truth is required. But in practice, it is very difficult to find the ground truth. Usually, image quality is assessed by full reference metrics, like MSE (Mean Square Error) and PSNR (Peak Signal to Noise Ratio). In contrast to MSE and PSNR, recently, two more full reference metrics, SSIM (Structured Similarity Indexing Method) and FSIM (Feature Similarity Indexing Method), have been developed with a view to comparing the structural and feature similarity measures between restored and original objects on the basis of perception. This paper is mainly stressed on
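For reference, the two classical full-reference metrics named in the abstract, MSE and PSNR, can be computed in a few lines. The sketch below is a generic implementation for 8-bit grayscale images, not code from the paper; the noisy test image is synthetic.

```python
import numpy as np

def mse(reference, distorted):
    """Mean Square Error between two same-size images."""
    ref = reference.astype(np.float64)
    dist = distorted.astype(np.float64)
    return np.mean((ref - dist) ** 2)

def psnr(reference, distorted, max_value=255.0):
    """Peak Signal-to-Noise Ratio in dB for images with the given peak value."""
    err = mse(reference, distorted)
    if err == 0:
        return float("inf")                 # identical images
    return 10.0 * np.log10(max_value ** 2 / err)

# Tiny example: a synthetic 8-bit "image" and a noisy copy of it.
rng = np.random.default_rng(0)
ref = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
noise = rng.integers(-10, 11, size=ref.shape)
noisy = np.clip(ref.astype(np.int16) + noise, 0, 255).astype(np.uint8)

print(f"MSE  = {mse(ref, noisy):.2f}")
print(f"PSNR = {psnr(ref, noisy):.2f} dB")
```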
A Guide to Smartphone Astrophotography National Aeronautics and Space Administration
National Aeronautics and Space Administration
A Guide to Smartphone Astrophotography
Dr. Sten Odenwald, NASA Space Science Education Consortium, Goddard Space Flight Center, Greenbelt, Maryland
Cover designs and editing by Abbey Interrante.
Cover illustrations. Front: Aurora (Elizabeth Macdonald), moon (Spencer Collins), star trails (Donald Noor), Orion nebula (Christian Harris), solar eclipse (Christopher Jones), Milky Way (Shun-Chia Yang), satellite streaks (Stanislav Kaniansky), sunspot (Michael Seeboerger-Weichselbaum), sun dogs (Billy Heather). Back: Milky Way (Gabriel Clark).
Two front cover designs are provided with this book. To conserve toner, begin document printing with the second cover.
This product is supported by NASA under cooperative agreement number NNH15ZDA004C.

Table of Contents:
- Introduction
- How to use this book
- 1.0 Light Pollution
- 2.0 Cameras
Video Quality Assessment: Subjective Testing of Entertainment Scenes Margaret H
Margaret H. Pinson, Lucjan Janowski, and Zdzisław Papir

This article describes how to perform a video quality subjective test. For companies, these tests can greatly facilitate video product development; for universities, removing perceived barriers to conducting such tests allows expanded research opportunities. This tutorial assumes no prior knowledge and focuses on proven techniques. (Certain commercial equipment, materials, and/or programs are identified in this article to adequately specify the experimental procedure. In no case does such identification imply recommendation or endorsement by the National Telecommunications and Information Administration, nor does it imply that the program or equipment identified is necessarily the best available for this application.)

Video is a booming industry: content is embedded on many Web sites, delivered over the Internet, and streamed to mobile devices. Cisco statistics indicate that video exceeded 50% of total mobile traffic for the first time in 2012 and predict that two-thirds of the world's mobile data traffic will be video by 2017 [1]. Each company must make a strategic decision on the correct balance between delivery cost and user experience. This decision can be made by the engineers designing the service or, for increased accuracy, by consulting users [2]. Video quality assessment requires a combined approach that includes objective metrics, subjective testing, and live video monitoring. Subjective testing is a critical step to ensuring success in operational management and new product development. Carefully conducted video quality subjective tests are extremely reliable and repeatable, as is shown in Section 8 of [3]. This article provides an approachable tutorial on how to conduct a subjective video quality experiment.
Revisiting Image Vignetting Correction by Constrained Minimization of Log-Intensity Entropy
Laura Lopez-Fuentes, Gabriel Oliver, and Sebastia Massanet
Dept. Mathematics and Computer Science, University of the Balearic Islands, Crta. Valldemossa km. 7,5, E-07122 Palma de Mallorca, Spain
[email protected], [email protected], [email protected]

Abstract: The correction of the vignetting effect in digital images is a key pre-processing step in several computer vision applications. In this paper, some corrections and improvements to the image vignetting correction algorithm based on the minimization of the log-intensity entropy of the image are proposed. In particular, the new algorithm is able to deal with images whose vignetting is not centered in the image, through a search for the optical center of the image. The experimental results show that this new version notably outperforms the original algorithm from both the qualitative and the quantitative point of view. The quantitative measures are obtained using an image database with images to which artificial vignetting has been added.

Keywords: Vignetting, entropy, optical center, gain function.

1 Introduction
Vignetting is an undesirable effect in digital images which needs to be corrected as a pre-processing step in computer vision applications. The effect is a radial fall-off of brightness away from the optical center of the image. Applications which rely on consistent intensity measurements of a scene are especially affected by vignetting. For example, image mosaicking, stereo matching and image segmentation, among many others, are applications whose results improve notably if a vignetting correction algorithm is first applied to the original image.
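The abstract above describes correcting vignetting by choosing a radial gain that minimizes the log-intensity entropy of the corrected image. The sketch below illustrates only that core idea under simplifying assumptions that are mine, not the authors': a one-parameter gain g(r) = 1 + a·r², an optical center fixed at the image middle, a grayscale image, and a plain grid search. The paper's algorithm additionally searches for the optical center and performs its own constrained minimization.

```python
import numpy as np

def log_intensity_entropy(image):
    """Shannon entropy (bits) of the log-intensity histogram of a grayscale image."""
    logs = np.log1p(image.astype(np.float64))               # log(1 + I) avoids log(0)
    # Fixed bin edges for 8-bit-range intensities so entropies are comparable
    # across candidate gains.
    hist, _ = np.histogram(logs, bins=256, range=(0.0, np.log1p(255.0)))
    p = hist / hist.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def radial_gain(shape, a, center=None):
    """Radial gain g(r) = 1 + a*r^2 with r normalized to [0, 1]."""
    h, w = shape
    cy, cx = center if center is not None else ((h - 1) / 2.0, (w - 1) / 2.0)
    y, x = np.mgrid[0:h, 0:w]
    r = np.hypot(y - cy, x - cx)
    return 1.0 + a * (r / r.max()) ** 2

def correct_vignetting(image, a_values=np.linspace(0.0, 2.0, 41)):
    """Grid-search the gain strength that minimizes log-intensity entropy."""
    best_a = min(a_values,
                 key=lambda a: log_intensity_entropy(image * radial_gain(image.shape, a)))
    return image * radial_gain(image.shape, best_a), best_a

# Synthetic demo: apply artificial vignetting to a flat image, then undo it.
flat = np.full((200, 300), 150.0)
vignetted = flat / radial_gain(flat.shape, 0.8)              # darken toward the corners
corrected, a_hat = correct_vignetting(vignetted)
print(f"estimated gain strength a = {a_hat:.2f}")            # expected to land near 0.8
```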