Exposing Digital Forgeries in Color Filter Array Interpolated Images


Alin C. Popescu and Hany Farid

Abstract

With the advent of low-cost and high-resolution digital cameras, and sophisticated photo editing software, digital images can be easily manipulated and altered. Although good forgeries may leave no visual clues of having been tampered with, they may, nevertheless, alter the underlying statistics of an image. Most digital cameras, for example, employ a single sensor in conjunction with a color filter array (CFA), and then interpolate the missing color samples to obtain a three-channel color image. This interpolation introduces specific correlations which are likely to be destroyed when tampering with an image. We quantify the specific correlations introduced by CFA interpolation, and describe how these correlations, or lack thereof, can be automatically detected in any portion of an image. We show the efficacy of this approach in revealing traces of digital tampering in lossless and lossy compressed color images interpolated with several different CFA algorithms.

I. INTRODUCTION

When it first surfaced, the digital image of Jane Fonda and Senator John Kerry sharing a stage at an anti-war rally was explosive¹. It was also a fake. The image was created by digitally splicing together two separate images. Because of the ease with which digital media can be manipulated, these types of digital forgeries are becoming more common. As a result, photographs no longer hold their unique stature as a definitive recording of events.

Digital watermarking has been proposed as a means by which an image can be authenticated (see, for example, [1], [2] for general surveys). The major drawback of this approach is that a watermark must be inserted at the time of recording, which would limit this approach to specially equipped digital cameras. This method also relies on the assumption that the digital watermark cannot be easily removed and reinserted; it is not yet clear whether this is a reasonable assumption (e.g., [3]).

In contrast to these approaches, we and others have proposed techniques for detecting tampering in digital images that work in the absence of any digital watermark or signature [4], [5], [6], [7], [8], [9], [10]. These techniques work on the assumption that although digital forgeries may leave no visual clues of having been tampered with, they may, nevertheless, alter the underlying statistics of an image. Most digital cameras, for example, capture color images using a single sensor in conjunction with an array of color filters. As a result, only one third of the samples in a color image are captured by the camera, the other two thirds being interpolated. This interpolation introduces specific correlations between the samples of a color image. When creating a digital forgery these correlations may be destroyed or altered. We describe the form of these correlations, and propose a method that quantifies and detects them in any portion of an image². We show the general effectiveness of this technique in detecting traces of digital tampering, and analyze its sensitivity and robustness to simple image distortions (compression, noise, and gamma correction).

A. C. Popescu is with the Computer Science Department at Dartmouth College. Corresponding author: H. Farid, 6211 Sudikoff Lab, Computer Science Department, Dartmouth College, Hanover, NH 03755 USA (email: [email protected]; tel/fax: 603.646.2761/603.646.1672). This work was supported by an Alfred P. Sloan Fellowship, a National Science Foundation CAREER Award (IIS-99-83806), a departmental National Science Foundation Infrastructure Grant (EIA-98-02068), and under Award No. 2000-DT-CX-K001 from the Office for Domestic Preparedness, U.S. Department of Homeland Security (points of view in this document are those of the authors and do not necessarily represent the official position of the U.S. Department of Homeland Security).

¹The actress Jane Fonda was well known for her strong views against the involvement of the United States in the Vietnam war. After serving in the Vietnam war, John Kerry spoke out against the war. The photograph of Fonda and Kerry appeared as Kerry was running for President of the United States, and came at a time when his involvement in the anti-war movement was under attack.

²Portions of this work also appear in the Ph.D. dissertation [11].
    r_{1,1}  g_{1,2}  r_{1,3}  g_{1,4}  r_{1,5}  g_{1,6}  ...
    g_{2,1}  b_{2,2}  g_{2,3}  b_{2,4}  g_{2,5}  b_{2,6}  ...
    r_{3,1}  g_{3,2}  r_{3,3}  g_{3,4}  r_{3,5}  g_{3,6}  ...
    g_{4,1}  b_{4,2}  g_{4,3}  b_{4,4}  g_{4,5}  b_{4,6}  ...
    r_{5,1}  g_{5,2}  r_{5,3}  g_{5,4}  r_{5,5}  g_{5,6}  ...
    g_{6,1}  b_{6,2}  g_{6,3}  b_{6,4}  g_{6,5}  b_{6,6}  ...
      ...

Fig. 1. The top-left portion of a CFA image obtained from a Bayer array. The red, r_{2i+1,2j+1}, and blue, b_{2i,2j}, pixels are sampled on rectilinear lattices, while the green, g_{2i+1,2j} and g_{2i,2j+1}, pixels are sampled twice as often on a quincunx lattice. Notice that at each pixel location only a single color sample is recorded.

II. COLOR FILTER ARRAY INTERPOLATION ALGORITHMS

A digital color image consists of three channels containing samples from different bands of the color spectrum, e.g., red, green, and blue. Most digital cameras, however, are equipped with a single CCD or CMOS sensor, and capture color images using a color filter array (CFA). The most frequently used CFA, the Bayer array [12], employs three color filters: red, green, and blue. The red and blue pixels are sampled on rectilinear lattices, while the green pixels are sampled on a quincunx lattice, Fig. 1. Since only a single color sample is recorded at each pixel location, the other two color samples must be estimated from the neighboring samples in order to obtain a three-channel color image. Let S(x,y) denote the CFA image in Fig. 1, and \tilde{R}(x,y), \tilde{G}(x,y), \tilde{B}(x,y) denote the red, green, and blue channels constructed from S(x,y) as follows:

\tilde{R}(x,y) = \begin{cases} S(x,y) & \text{if } S(x,y) = r_{x,y} \\ 0 & \text{otherwise} \end{cases}   (1)

\tilde{G}(x,y) = \begin{cases} S(x,y) & \text{if } S(x,y) = g_{x,y} \\ 0 & \text{otherwise} \end{cases}   (2)

\tilde{B}(x,y) = \begin{cases} S(x,y) & \text{if } S(x,y) = b_{x,y} \\ 0 & \text{otherwise} \end{cases}   (3)

where (x,y) span an integer lattice. A complete color image, with channels R(x,y), G(x,y), and B(x,y), needs to be estimated. These channels take on the non-zero values of \tilde{R}(x,y), \tilde{G}(x,y), and \tilde{B}(x,y), and replace the zeros with estimates from neighboring samples.
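As an illustration of Equations (1)-(3), the following Python sketch splits a Bayer mosaic into the three zero-filled channels. It is only a sketch under the layout of Fig. 1 (the paper's 1-based indexing puts red at odd rows and columns, which corresponds to even rows and columns in 0-based NumPy indexing); the function name and array conventions are ours, not the authors'.

```python
import numpy as np

def split_cfa_channels(S):
    """Build the zero-filled channels R~, G~, B~ of Eqs. (1)-(3) from a
    single-channel Bayer mosaic S laid out as in Fig. 1 (0-based indexing:
    red at even rows/even columns, blue at odd rows/odd columns, green on
    the remaining quincunx sites)."""
    R = np.zeros_like(S)
    G = np.zeros_like(S)
    B = np.zeros_like(S)
    R[0::2, 0::2] = S[0::2, 0::2]   # red samples  (r_{2i+1,2j+1} in the paper's notation)
    B[1::2, 1::2] = S[1::2, 1::2]   # blue samples (b_{2i,2j})
    G[0::2, 1::2] = S[0::2, 1::2]   # green samples on red rows
    G[1::2, 0::2] = S[1::2, 0::2]   # green samples on blue rows
    return R, G, B
```

Each returned channel keeps the recorded samples in place and holds zeros elsewhere; the demosaicking step described next replaces those zeros with interpolated estimates.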
The estimation of the missing color samples is referred to as CFA interpolation or demosaicking. CFA interpolation has been extensively studied and many methods have been proposed (see, for example, [13] for a survey, and [14], [15], [16] for more recent methods). We will briefly describe seven demosaicking techniques for the Bayer array that are used in our studies. In selecting these seven algorithms, we have tried to cover a range of CFA algorithms, from the simple (bilinear), to ones that are employed in commercially available digital cameras (gradient-based), to more recent algorithms that may one day be employed in commercial digital cameras.

A. Bilinear and Bicubic

The simplest methods for demosaicking are kernel-based interpolation methods that act on each channel independently. These methods can be efficiently implemented as linear filtering operations on each color channel:

R(x,y) = \sum_{u,v=-N}^{N} h_r(u,v)\,\tilde{R}(x-u, y-v)   (4)

G(x,y) = \sum_{u,v=-N}^{N} h_g(u,v)\,\tilde{G}(x-u, y-v)   (5)

B(x,y) = \sum_{u,v=-N}^{N} h_b(u,v)\,\tilde{B}(x-u, y-v)   (6)

where \tilde{R}(\cdot), \tilde{G}(\cdot), \tilde{B}(\cdot) are defined in Equations (1)-(3), and h_r(\cdot), h_g(\cdot), h_b(\cdot) are linear filters of size (2N+1) \times (2N+1). Different forms of interpolation (nearest neighbor, bilinear, bicubic [17], etc.) differ in the form of the interpolation filter used. For the Bayer array, the bilinear and bicubic filters for the red and blue channels are separable. The 1-D filters are given by:

h_l = \begin{pmatrix} 1/2 & 1 & 1/2 \end{pmatrix}, \qquad h_c = \begin{pmatrix} -1/16 & 0 & 9/16 & 1 & 9/16 & 0 & -1/16 \end{pmatrix},   (7)

and the 2-D filters are given by h_l^T h_l and h_c^T h_c. The bilinear and bicubic filters for the green channel, however, are no longer separable and take the form:

h_l = \frac{1}{4} \begin{pmatrix} 0 & 1 & 0 \\ 1 & 4 & 1 \\ 0 & 1 & 0 \end{pmatrix}, \qquad h_c = \frac{1}{256} \begin{pmatrix} 0 & 0 & 0 & 1 & 0 & 0 & 0 \\ 0 & 0 & -9 & 0 & -9 & 0 & 0 \\ 0 & -9 & 0 & 81 & 0 & -9 & 0 \\ 1 & 0 & 81 & 256 & 81 & 0 & 1 \\ 0 & -9 & 0 & 81 & 0 & -9 & 0 \\ 0 & 0 & -9 & 0 & -9 & 0 & 0 \\ 0 & 0 & 0 & 1 & 0 & 0 & 0 \end{pmatrix}.   (8)

Using bilinear interpolation, for example, the color samples at pixel locations (3,3), (3,4), and (4,4) of Fig. 1 are given by:

R(3,3) = r_{3,3}, \quad G(3,3) = \frac{g_{2,3} + g_{3,2} + g_{3,4} + g_{4,3}}{4}, \quad B(3,3) = \frac{b_{2,2} + b_{2,4} + b_{4,2} + b_{4,4}}{4}   (9)

G(3,4) = g_{3,4}, \quad B(3,4) = \frac{b_{2,4} + b_{4,4}}{2}, \quad R(3,4) = \frac{r_{3,3} + r_{3,5}}{2}   (10)

B(4,4) = b_{4,4}, \quad R(4,4) = \frac{r_{3,3} + r_{3,5} + r_{5,3} + r_{5,5}}{4}, \quad G(4,4) = \frac{g_{3,4} + g_{4,3} + g_{4,5} + g_{5,4}}{4}.   (11)
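To make the link between Equations (4)-(8) and an implementation concrete, here is a minimal bilinear demosaicking sketch in Python (NumPy/SciPy). It takes the zero-filled channels of Equations (1)-(3), e.g., as produced by the split_cfa_channels sketch above; the function name, the 'same'-mode zero-padded borders, and the absence of clipping are our simplifications, not part of the paper.

```python
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaic(R_t, G_t, B_t):
    """Bilinear CFA interpolation as the linear filtering of Eqs. (4)-(6),
    using the bilinear kernels of Eqs. (7)-(8)."""
    h1 = np.array([0.5, 1.0, 0.5])
    h_rb = np.outer(h1, h1)               # 2-D red/blue kernel, h_l^T h_l
    h_g = np.array([[0, 1, 0],
                    [1, 4, 1],
                    [0, 1, 0]]) / 4.0     # green kernel of Eq. (8)
    R = convolve2d(R_t, h_rb, mode='same')
    G = convolve2d(G_t, h_g,  mode='same')
    B = convolve2d(B_t, h_rb, mode='same')
    return np.dstack((R, G, B))
```

Because the zero entries of each channel contribute nothing to the convolution and the unit center weight preserves the recorded samples, this sketch reproduces the worked values of Equations (9)-(11) at interior pixels such as (3,3), (3,4), and (4,4).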