Digital Camera and Image Forensics
Katie Bouman

In the past twenty years, much emphasis has been placed on technological breakthroughs in consumer electronics. Whether it be the creation of CDs, DVDs, HDTVs, or MP3s, they were all built on the same basic principle: converting conventional analog information into digital information. Although these electronic systems are common today and may seem simple to understand, the conversion of a fluctuating wave into ones and zeros is much more difficult than it may appear. One example of this major shift is the development of the digital camera. Inspired by the conventional analog camera, which uses chemical and mechanical processes to create an image, the digital camera instead achieves this through a digital image sensor and a computer (Wilson, et al., 2006). In just seconds, a digital camera captures a sample of light that has bounced off a subject and traveled through a series of lenses, focuses it on a sensor that records the light electronically by breaking the light pattern down into a series of pixel values, and performs a full-color rendering that includes color filter array interpolation, color calibration, anti-aliasing, infrared rejection, and white point correction. A great deal of processing goes into doing just that (Adams, et al., 1998). Andreas Vesalius (1514-1564) was one of the first to dissect and examine the human body. As a result of Vesalius's extensive work, many corrections were made in medicine, and scientists were able to use this enhanced knowledge to extend their theories (Szaflarski, September 22, 2004). Although cameras, and even more so digital cameras, were not invented until centuries later, many of the characteristics of cameras and how they work are very similar to the eye, and this understanding was essential to their development.
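The color filter array interpolation step named above can be illustrated with a minimal sketch. It assumes an RGGB Bayer layout and plain bilinear averaging; real camera pipelines use far more sophisticated, edge-aware algorithms, so this only shows the idea that missing color samples are estimated from neighboring pixels:

```python
def bayer_color(row, col):
    """Which color an RGGB Bayer sensor records at (row, col)."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

def interpolate_green(mosaic, row, col):
    """Estimate the missing green value at a red or blue pixel by
    averaging its in-bounds green neighbors (bilinear interpolation)."""
    neighbors = [(row - 1, col), (row + 1, col), (row, col - 1), (row, col + 1)]
    values = [mosaic[r][c] for r, c in neighbors
              if 0 <= r < len(mosaic) and 0 <= c < len(mosaic[0])]
    return sum(values) / len(values)

# A tiny 4x4 sensor readout: each pixel holds only one color sample.
mosaic = [
    [120,  80, 118,  82],
    [ 78,  40,  76,  42],
    [122,  84, 119,  81],
    [ 80,  44,  79,  41],
]

green_at_red = interpolate_green(mosaic, 2, 2)  # green estimate at a red site
print(green_at_red)  # prints 80.0, the mean of the four green neighbors
```

The same neighbor-averaging idea, applied per channel across the whole mosaic, yields a full three-channel image from one-sample-per-pixel data.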
In addition, understanding of the eye helped propel curiosity about light and its properties (Watson, 2006). The color detected by the cone cells in the back of one's eye, and at the back of a camera, is part of the electromagnetic spectrum; refer to Figure 1. The area of the spectrum that humans are able to see is called the visible spectrum (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004b). The electromagnetic spectrum is the entire range of wavelengths and frequencies, extending from gamma rays, which have the shortest wavelengths, to the longest waves, radio waves (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a). Although colored light is part of the spectrum, there is more to light than meets the eye. Only about 300 nm of the spectrum is visible (400 nm to 700 nm); the rest, unseen by the naked eye, is referred to as the "invisible spectrum" (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004b). There are seven types of electromagnetic radiation, including visible light, and light travels in waves (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a). In order of wavelength from longest to shortest, they are: radio waves, microwaves, infrared, the visible spectrum, ultraviolet light, X-rays, and gamma rays (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a); refer to Figure 1. All electromagnetic radiation moves at the same speed through a vacuum, 3.0 × 10^8 meters/second; while moving through matter rather than a vacuum, however, it slows slightly. In fact, the denser the material the light wave is moving through, the slower it tends to travel (Davis, et al., 2002). A longstanding debate persisted as to whether light travels in waves or is composed of a stream of particles, and many noteworthy physicists have argued both sides of the question.
In fact, light exhibits behaviors of both a stream of particles and a wave: light travels as a wave composed of photons (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a; Davis, et al., 2002; The Franklin Institute, September 20, 2004). A photon, an uncharged particle that has no mass, is the smallest unit of electromagnetic energy. Each photon contains a certain amount of energy, defined by Einstein as E = h × f, where E is the energy of the photon, h is Planck's constant (6.63 × 10^-34 J·s), and f is the frequency of the given light source (Kudenov, 2003a). The greater the distance between successive waves, the less energy each photon contains (Campbell, et al., 1997; Davis, et al., 2002; The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a). For example, microwaves' distance between waves (their wavelength) ranges from 10^6 nm to 10^9 nm; therefore, each microwave photon contains less energy than a gamma-ray photon, whose wavelength ranges from 10^-3 nm to 10^-5 nm. However, a microwave contains more energy per photon than does a radio wave, which ranges from 10^9 nm to 10^3 meters (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004a; 2004b). Objects and events with more energy create higher-energy radiation than cool objects or events with less energy. Thus, the more heat emitted, the shorter the waves are in the electromagnetic spectrum (Watson, 2006). While wavelength (λ) is the measure of one period of a wave as it moves through space, frequency (ν) is the number of waves that pass a certain point during a certain period of time (usually a second), ν = 1/T where T is the period, and is measured in cycles per second, or hertz (Hz). Refer to Figure 2.
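These relations can be checked numerically. The following sketch uses the constants quoted in the text (h = 6.63 × 10^-34 J·s, c = 3.0 × 10^8 m/s); the example wavelengths are illustrative values, not exact band boundaries:

```python
PLANCK_H = 6.63e-34   # Planck's constant, J*s
C = 3.0e8             # speed of light in a vacuum, m/s

def frequency_hz(wavelength_m):
    return C / wavelength_m                        # f = c / lambda

def period_s(wavelength_m):
    return 1.0 / frequency_hz(wavelength_m)        # T = 1 / f

def photon_energy_j(wavelength_m):
    return PLANCK_H * frequency_hz(wavelength_m)   # E = h * f

red = photon_energy_j(700e-9)      # 700 nm, long-wavelength end of visible
violet = photon_energy_j(400e-9)   # 400 nm, short-wavelength end of visible
microwave = photon_energy_j(1e-2)  # a 1 cm microwave

# Shorter wavelength -> higher frequency -> more energy per photon:
assert violet > red > microwave
```

A red photon works out to roughly 2.8 × 10^-19 J, a violet one to roughly 5.0 × 10^-19 J, matching the text's point that energy per photon grows as wavelength shrinks.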
Therefore a wave, such as a radio wave, that has a longer wavelength than another wave, such as a gamma ray, will have a lower frequency than the shorter wave (Davis, et al., 2002). The visible spectrum ranges from approximately 400 nm to around 700 nm. White light, the light that comes from the sun and many other sources, is composed of all colors, which result from different wavelengths of light. When white light hits a prism, each wavelength bends by a different amount as it passes through, which causes the colors to separate (Levine, 2000; Sekuler, et al., 2002; Szaflarski, September 22, 2004; Davis, et al., 2002). These colors are often memorized in their order of appearance after passing through a prism, longest wavelength first, as ROY G. BIV: red, orange, yellow, green, blue, indigo, and violet (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004b). Light waves can be transmitted, absorbed, or reflected. The object that the white light wave hits determines which wavelengths will be reflected, absorbed (their energy converted to heat), or transmitted (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004c; The Franklin Institute, September 20, 2004). In a white object, all the colors are reflected. In a black object, on the other hand, all the colors are absorbed by the substance and no light waves are emitted (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004c; The Franklin Institute, September 20, 2004). Every atom contains electrons, which can be pictured as attached to the nucleus by springs that vibrate at specific frequencies. When a light wave of that specific frequency hits the electrons, they start to vibrate and the energy from the light turns into vibrational motion. The vibrating electrons then react with the electrons of the surrounding atoms, creating thermal energy as the light wave is absorbed.
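The ROY G. BIV ordering is simply a sort by wavelength, longest first. The nanometer values below are approximate textbook centers for each hue, chosen only for illustration:

```python
# Approximate central wavelengths (nm) for the visible hues; illustrative
# values only, since the real bands are continuous and their edges fuzzy.
colors_nm = {
    "red": 680, "orange": 610, "yellow": 580, "green": 530,
    "blue": 470, "indigo": 440, "violet": 410,
}

# Sort from longest wavelength to shortest, as the colors exit a prism:
roygbiv = sorted(colors_nm, key=colors_nm.get, reverse=True)
print(roygbiv)
# prints ['red', 'orange', 'yellow', 'green', 'blue', 'indigo', 'violet']
```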
This absorbed light is not seen; the reflected and transmitted light is. Color occurs when the light causes electrons to vibrate in small spurts, and these small spurts of vibration re-emit the energy as a light wave (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004c). The color of a green leaf is not actually contained within it (Sekuler, et al., 2002; Levine, 2000; The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004c; The Franklin Institute, September 20, 2004). The green color seen is the light that is reflected because the frequency of the wave does not match the frequency of the electrons in the leaf (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004c). The primary colors of light are red, green, and blue, referred to as RGB. When mixed together, these colors combine to create light's secondary colors: magenta (from blue and red), cyan (from blue and green), and yellow (from green and red). When colors of light are mixed together, the color produced becomes closer to white than the starting colors were (The Franklin Institute, September 20, 2004). Color addition is a method used to produce different colors of light: the three primary colors, red, blue, and green, are mixed together at varying intensities to create a wide range of colors, and computer monitors and televisions both use it (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004d; The Franklin Institute, September 20, 2004). Monitors and televisions produce all their colors by mixing phosphors (The Physics Classroom and Mathsoft Engineering & Education, Inc., 2004d; Brain, 2006). Phosphors are coated on the inside of screens and emit visible light when struck by an electron beam. In a black-and-white screen there is only one type of phosphor, which glows white when struck and remains black when not.
However, in a color screen, three different types of phosphors, arranged as dots or stripes, use color addition to formulate a final color (Brain, 2006).
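The additive mixing described above can be sketched in a few lines. The per-channel maximum rule and the 8-bit (0-255) channel values are illustrative assumptions for modeling light sources at full intensity, not how any particular display hardware combines light:

```python
def add_light(*colors):
    """Additively combine (R, G, B) light sources, channel by channel."""
    return tuple(max(channel) for channel in zip(*colors))

RED, GREEN, BLUE = (255, 0, 0), (0, 255, 0), (0, 0, 255)

# The secondary colors named in the text emerge from pairs of primaries,
# and all three primaries together give white:
assert add_light(BLUE, RED) == (255, 0, 255)            # magenta
assert add_light(BLUE, GREEN) == (0, 255, 255)          # cyan
assert add_light(GREEN, RED) == (255, 255, 0)           # yellow
assert add_light(RED, GREEN, BLUE) == (255, 255, 255)   # white
```

Varying the three intensities between 0 and 255 instead of holding them at full strength is what lets a monitor's phosphor triads span a wide range of colors.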