Color Image Processing Pipeline [A General Survey of Digital Still Camera Processing]


[Rajeev Ramanath, Wesley E. Snyder, Youngjun Yoo, and Mark S. Drew]

Digital still color cameras (DSCs) have gained significant popularity in recent years, with projected sales on the order of 44 million units by the year 2005. Such explosive demand calls for an understanding of the processing involved and of the implementation issues, bearing in mind the otherwise difficult problems these cameras solve. This article presents an overview of the image processing pipeline, first from a signal processing perspective and later from an implementation perspective, along with the tradeoffs involved.

IMAGE FORMATION

A good starting point to fully comprehend the signal processing performed in a DSC is to consider the steps by which images are formed and how each stage affects the final rendered image. There are two distinct aspects of image formation: one that has a colorimetric perspective and another that has a generic imaging perspective, and we treat these separately.

In a vector space model for color systems, a reflectance spectrum r(λ) sampled uniformly in a spectral range [λmin, λmax] interacts with the illuminant spectrum L(λ) to form a projection onto the color space of the camera, RGBc, as follows:

c = N(S^T L M r) + n,    (1)

where S is a matrix formed by stacking the spectral sensitivities of the K color filters used in the imaging system column-wise, L is a diagonal matrix with samples of the illuminant spectrum along its diagonal, M is another diagonal matrix with samples of the relative spectral sensitivity of the charge-coupled device (CCD) sensor, r is a vector corresponding to the relative surface spectral reflectance of the object, and n is an additive noise term (in fact, noise may even be multiplicative and signal dependent, making matters mathematically much more complex) [1], [2]. N corresponds to the nonlinearity introduced in the system. A similar algebraic setting exists for color formation in the human eye, producing tristimulus values, where A is used to denote the equivalent of matrix S and t denotes the tristimulus values in the CIEXYZ space [3].

[FIG1] Image processing involved in a DSC. Blocks: sensor, aperture, and lens (with exposure and focus control); preprocessing; white balance; demosaicking; color transforms; postprocessing; compression and storage; display on the preferred device.

Let us denote by f an image with M rows, N columns, and K spectral bands. DSCs typically use K = 3, although one may conceive of a sensor with more spectral bands. In some recent cameras, four sensors are used: red, green, blue, and emerald, or cyan, magenta, yellow, and green. (Depending on the choice of the spectral sensitivities, the colors captured form color gamuts of different sizes; for DSCs, it is typically important to capture skin tones with reasonable accuracy.)
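As an illustration of the formation model in (1), the following Python sketch builds small, assumed stand-ins for S, L, M, r, and n (Gaussian filter sensitivities, an equal-energy illuminant, a simple normalize-and-clip nonlinearity) and evaluates c. None of these spectra come from the article; real camera characterization data would replace them.

```python
import numpy as np

# Illustrative sketch of the color formation model in (1):
# c = N(S^T L M r) + n, with K = 3 filters sampled over an assumed
# visible range of 400-700 nm at 10 nm steps.
wavelengths = np.arange(400, 701, 10)          # spectral samples (nm)
num_samples = wavelengths.size

def gaussian(center, width):
    """Smooth, purely illustrative spectral curve."""
    return np.exp(-0.5 * ((wavelengths - center) / width) ** 2)

# S: spectral sensitivities of the K filters stacked column-wise.
S = np.stack([gaussian(610, 30), gaussian(540, 30), gaussian(465, 30)], axis=1)

# L: diagonal matrix of illuminant samples (flat, equal-energy illuminant here).
L = np.diag(np.ones(num_samples))

# M: diagonal matrix of relative sensor (CCD) spectral sensitivity (assumed ramp).
M = np.diag(np.linspace(0.6, 1.0, num_samples))

# r: relative surface spectral reflectance of the object (assumed reddish surface).
r = 0.2 + 0.6 * gaussian(650, 60)

# n: additive noise term.
n = np.random.normal(scale=0.01, size=3)

linear_rgb = S.T @ L @ M @ r                   # S^T L M r
c = np.clip(linear_rgb / linear_rgb.max(), 0, 1) + n   # N(.) modeled as normalize + clip
print("camera RGB c =", c)
```

In practice S, L, and M come from measured spectral data, and the same machinery yields CIEXYZ tristimulus values when the eye's sensitivity matrix A replaces S.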
The image may also be considered as a two-dimensional (2-D) array with vector-valued pixels. Each vector-valued pixel is formed according to the model in (1), with values determined by the reflectance and illumination at the three-dimensional (3-D) world point indexed by the 2-D camera pixel position. The image formed is then further modeled as follows:

g = B{Hf},    (2)

where B is a color filter array (CFA) sampling operator, H is the point spread function (a blur) corresponding to the optical system, and f is a lexicographic representation of the full-color image in which each pixel is formed according to (1).

Although there is an overlap between color processing problems for other devices, such as scanners and printers, working with DSCs is complicated by problems stemming from the manner in which the input image is captured: the spatial variation in the scene lighting, the nonfixed scene geometry (location and orientation of light source, camera, and surfaces in the scene), varying scene illuminants (including combinations of different light sources in the same scene), or the use of a CFA to obtain one color sample per sensor location, to name a few.

Further resources on DSC issues include an illustrative overview of some of the processing steps involved in a DSC by Adams et al. [4]. A recent book chapter by Parulski and Spaulding details some of these steps [5]. An excellent resource for DSC processing is a book chapter by Holm et al. [6]. The multitude of problems that still remain unsolved is a fascinating source of research in the community.

PIPELINE

The signal flowchart shown in Figure 1 briefly summarizes DSC processing. It should be noted that the sequence of operations differs from manufacturer to manufacturer. Each of these blocks may be fine-tuned to achieve better systemic performance [7]; e.g., introducing a small amount of blur using the lens system increases the correlation between neighboring pixels, which in turn may be exploited in the demosaicking step. Let us now consider each block in Figure 1.

SENSOR, APERTURE AND LENS

Although there is a need to measure three (or more) bands at each pixel location, doing so requires the use of more than one sensor and, consequently, drives up the cost of the camera. As a cheaper and more robust solution, manufacturers place a CFA on top of the sensor element. Of the many CFA patterns available, the Bayer array is by far the most popular [8]. Control mechanisms interact with the sensor (shown in Figure 1 as a red-green-blue checkered pattern, the CFA) to determine the exposure (aperture size, shutter speed, and automatic gain control) and the focal position of the lens. These parameters need to be determined dynamically based on scene content. It is conventional to include an infrared blocking filter, called a "hot mirror" (as it reflects infrared energy), along with the lens system, as most of the filters used in CFAs are sensitive in the near-infrared part of the spectrum, as is the silicon substrate used in the sensor.
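The model in (2) can be made concrete with a short sketch. The Python below applies a crude box-blur stand-in for the point spread function H and an RGGB Bayer sampling operator B to a full-color test image; the kernel size and pattern phase are assumptions for illustration, not values taken from the article.

```python
import numpy as np

def optical_blur(image, kernel_size=3):
    """Apply a simple box blur to each channel (a crude stand-in for
    the optical point spread function H)."""
    pad = kernel_size // 2
    padded = np.pad(image, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
    out = np.zeros_like(image, dtype=float)
    for dy in range(kernel_size):
        for dx in range(kernel_size):
            out += padded[dy:dy + image.shape[0], dx:dx + image.shape[1], :]
    return out / (kernel_size ** 2)

def bayer_sample(image):
    """CFA sampling operator B for an RGGB Bayer pattern (phase assumed):
    red at even rows/even cols, blue at odd rows/odd cols, green elsewhere."""
    rows, cols, _ = image.shape
    mosaic = np.zeros((rows, cols), dtype=float)
    mosaic[0::2, 0::2] = image[0::2, 0::2, 0]   # red sites
    mosaic[0::2, 1::2] = image[0::2, 1::2, 1]   # green sites
    mosaic[1::2, 0::2] = image[1::2, 0::2, 1]   # green sites
    mosaic[1::2, 1::2] = image[1::2, 1::2, 2]   # blue sites
    return mosaic

# g = B{Hf} on a random full-color test image f.
f = np.random.rand(8, 8, 3)
g = bayer_sample(optical_blur(f))
print(g.shape)   # (8, 8): one color sample per pixel location
```

Demosaicking, discussed later in the pipeline, estimates the two missing color samples at each location of the mosaic g.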
The dynamic range of a scene is the contrast ratio between the brightest pixel and the darkest pixel in an image. The human visual system (HVS) can adapt to about four orders of magnitude in contrast ratio, while the sRGB system and typical computer monitors and television sets have a dynamic range of about two orders of magnitude. As a result, spatial detail in darker areas becomes indistinguishable from black and spatial detail in bright areas becomes indistinguishable from white.

To address this problem, researchers have used the approach of capturing multiple images of the same scene at varying exposure levels and combining them to obtain a fused image that represents the highlight (bright) and shadow (dark) regions of an image in reasonable detail [11]. A detailed discussion of this topic is beyond the scope of this article; however, we refer the interested reader to the appropriate references [12]–[14]. Another interesting approach to this problem is to squeeze two sensors into the location of what was formerly one sensor element, each with a different sensitivity to light, capturing two images of different dynamic ranges and hence effectively increasing the net dynamic range of the camera. Commercial cameras are beginning to incorporate high dynamic range (HDR) imaging solutions into their systems, either in software through image processing routines or in hardware by modifying the actual sensor, to facilitate the capture of excessively front- or backlit scenes.

Exposure control usually requires characterization of the brightness (or intensity) of the image: an over- or underexposed image will greatly affect output colors. Depending on the measured energy in the sensor, the exposure control system changes the aperture size and/or the shutter speed, along with a carefully calibrated automatic gain controller, to capture well-exposed images. Both the exposure and focus controls may be based either on the actual luminance component derived from the complete RGB image or simply on the green channel data, which is a good estimate of the luminance signal. The image is divided into blocks [9], [10] as shown in Figure 2(a). The average luminance signal is measured in each of these blocks and later combined to form a measure of exposure based on the type of scene being imaged: a backlit or frontlit scene or a nature shot. In a typical image, the average luminance signal is measured and compared to a reference level, and the amount of exposure is controlled to maintain a constant scene luminance. Backlit and frontlit scenes may be distinguished by measuring the difference between the average luminance signals in the blocks, as shown in Figure 2(b).

[FIG2] Illustration of the division of the image into various blocks over which the luminance signal is measured for exposure control. (a) The image is divided into 24 blocks. (b) Sample partitioning of the scene for contrast measurement.

Focus control may be performed using one of two types of approaches: active approaches, which typically use a pulsed beam of infrared light from a small source placed near the lens system (called an auto-focus assist lamp) to obtain an estimate of the distance to the object of interest, or passive approaches, which make use of the image formed in the camera to determine the best focus. Passive approaches may be further divided into two types: ones that analyze the spatial frequency content of the image and ones that use a phase detection technique.
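The exposure- and focus-control ideas described above can be sketched in a few lines of Python. The grid size (24 blocks, as in Figure 2(a)), the reference luminance, the backlit threshold, and the Laplacian-based focus score below are illustrative assumptions rather than the article's algorithms; the green channel is used as the luminance estimate, as suggested above.

```python
import numpy as np

def block_luminance(image, grid=(4, 6)):
    """Average luminance in each metering block (24 blocks, as in Figure 2(a)).
    Luminance is approximated here by the green channel."""
    green = image[:, :, 1]
    row_bands = np.array_split(green, grid[0], axis=0)
    return np.array([[blk.mean() for blk in np.array_split(band, grid[1], axis=1)]
                     for band in row_bands])

def exposure_decision(image, reference=0.45, backlit_spread=0.35):
    """Compare average scene luminance to a reference level and flag likely
    backlit/frontlit scenes from the spread between block averages
    (reference and threshold values are illustrative assumptions)."""
    blocks = block_luminance(image)
    mean_luma = blocks.mean()
    exposure_adjust = reference / max(mean_luma, 1e-6)   # >1: open up, <1: stop down
    backlit = (blocks.max() - blocks.min()) > backlit_spread
    return exposure_adjust, backlit

def focus_measure(image):
    """Passive, contrast-based focus score: energy of a discrete Laplacian of
    the green channel (higher means more high-frequency detail)."""
    g = image[:, :, 1]
    lap = (-4 * g[1:-1, 1:-1] + g[:-2, 1:-1] + g[2:, 1:-1]
           + g[1:-1, :-2] + g[1:-1, 2:])
    return float(np.mean(lap ** 2))

frame = np.random.rand(120, 180, 3)              # stand-in for a sensor preview frame
adjust, backlit = exposure_decision(frame)
print(f"exposure gain ~{adjust:.2f}, backlit: {backlit}, "
      f"focus score: {focus_measure(frame):.4f}")
```

In a camera, such measures would drive the aperture, shutter speed, gain, and lens position in a feedback loop, as described above.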