Color Threshold Functions: Application of Contrast Sensitivity Functions in Standard and High Dynamic Range Color Spaces


Minjung Kim, Maryam Azimi, and Rafał K. Mantiuk
Department of Computer Science and Technology, University of Cambridge

Abstract

Contrast sensitivity functions (CSFs) describe the smallest visible contrast across a range of stimulus and viewing parameters. CSFs are useful for imaging and video applications, as contrast thresholds describe the maximum color reproduction error that is invisible to the human observer. However, existing CSFs are limited. First, they are typically only defined for achromatic contrast. Second, even when they are defined for chromatic contrast, the thresholds are described along the cardinal dimensions of linear opponent color spaces, and are therefore difficult to relate to the dimensions of more commonly used color spaces, such as sRGB or CIE L∗a∗b∗. Here, we adapt a recently proposed CSF to what we call color threshold functions (CTFs), which describe thresholds for color differences in more commonly used color spaces. We include color spaces with a standard dynamic range gamut (sRGB, YCbCr, CIE L∗a∗b∗, CIE L∗u∗v∗) and a high dynamic range gamut (PQ-RGB, PQ-YCbCr, and ICtCp). Using CTFs, we analyze these color spaces in terms of coding efficiency and contrast threshold uniformity.

Introduction and Background

Contrast thresholds describe the minimum difference in luminance or chromaticity that a human observer can detect. Contrast thresholds vary with image and viewing parameters, such as spatial frequency [1], luminance [2], color [3], and the size of the stimulus [4]. The dependence of contrast thresholds on such parameters is described by contrast sensitivity functions (CSFs).

Having an accurate CSF is important for image and video coding applications, as contrast thresholds provide a limit to the amount of color reproduction error that is noticeable to a human observer. For example, CSFs were used to create transfer functions for encoding high dynamic range color values [5, 6] in a perceptually uniform manner. In particular, the Perceptual Quantizer (PQ) is a non-linear function based on Barten's CSF [7], used to code High Dynamic Range (HDR) content [6], akin to the gamma encoding used in Standard Dynamic Range (SDR) [8]. For a fixed bit-depth, PQ assigns more code-words to luminance levels with lower thresholds and fewer code-words to levels with higher thresholds, thus maximizing visual quality while efficiently allocating code-words.
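As a concrete illustration of the PQ non-linearity discussed above, the following sketch implements the PQ encoding and decoding defined in SMPTE ST 2084 (the constants come from that standard; the function names and the 10-bit example are ours, not from the paper).

```python
import numpy as np

# Constants of the PQ transfer function (SMPTE ST 2084)
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875


def pq_encode(luminance):
    """Absolute luminance in cd/m^2 -> normalized PQ signal in [0, 1]."""
    y = np.clip(np.asarray(luminance, dtype=float) / 10000.0, 0.0, 1.0)
    return ((C1 + C2 * y ** M1) / (1.0 + C3 * y ** M1)) ** M2


def pq_decode(signal):
    """Normalized PQ signal in [0, 1] -> absolute luminance in cd/m^2."""
    e = np.clip(np.asarray(signal, dtype=float), 0.0, 1.0) ** (1.0 / M2)
    y = np.maximum(e - C1, 0.0) / (C2 - C3 * e)
    return 10000.0 * y ** (1.0 / M1)


# Example: 10-bit code values assigned to a few luminance levels
levels = [0.005, 0.1, 1.0, 100.0, 1000.0, 10000.0]
codes = np.round(pq_encode(levels) * 1023).astype(int)
```

Because the curve rises steeply at low luminance, a disproportionate share of the 10-bit code values is spent on dark levels, which is the code-word allocation behaviour described above.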
However, many existing CSFs only describe achromatic contrast, neglecting the detection thresholds in chromatic directions. This is true even for Barten's CSF, meaning that PQ is only appropriate for coding luminance, even though the industry standard is to use PQ for all color channels [9, 10]. In addition, existing chromatic CSFs [11, 12] are usually reported in the Derrington-Krauskopf-Lennie (DKL) color space [13], a linear color opponent space that is physiologically relevant but has limited application outside of vision science. However, it is not obvious how contrast thresholds in DKL would translate to thresholds in other color spaces across their color components due to non-linearities such as PQ encoding.

In this work, we adapt our spatio-chromatic CSF¹ [14] to predict color threshold functions (CTFs). CTFs describe detection thresholds in color spaces that are more commonly used in imaging, video, and color science applications, such as sRGB and YCbCr. The spatio-chromatic CSF model from [14] can predict detection thresholds for any point in the color space and for any chromatic and achromatic modulation. In addition, the CSF was fitted to data that span 0.0002 cd/m² (scotopic) to 10,000 cd/m² (photopic), which makes it appropriate for predicting thresholds in HDR color spaces. Using this CSF, we numerically solve for detection thresholds in any non-linear color space. Our work lends insight into the coding efficiency and the uniformity of contrast thresholds in different color spaces.

¹ The code and more details on the spatio-chromatic CSF can be found at https://www.cl.cam.ac.uk/research/rainbow/projects/hdr-csf/.

A Device-Independent CSF

The spatio-chromatic CSF from [14] is the basis for the CTFs presented in this work. The CSF was developed to account for contrast threshold measurements from 0.125 cycles per degree (cpd) to 32 cpd, from 0.0002 cd/m² to 10,000 cd/m², and for different hues [16, 17, 18, 15, 19, 20]. A critical feature of this CSF is that it can accurately describe contrast thresholds for any luminance and chromaticity combination, meaning we can map contrast thresholds from the native color space of the CSF onto color coordinates of other color spaces.

In Fig. 1, we show the CSF in DKL color space, a linear transformation of LMS color space:

\Delta_{DKL} = \begin{bmatrix} 1 & 1 & 0 \\ 1 & -L_0/M_0 & 0 \\ -1 & -1 & (L_0 + M_0)/S_0 \end{bmatrix} \begin{bmatrix} \Delta L \\ \Delta M \\ \Delta S \end{bmatrix}    (1)

(L_0, M_0, S_0) is the white point in LMS color space and (\Delta L, \Delta M, \Delta S) is the color coordinate with respect to the white point. For the CSFs in Fig. 1 and the CTFs derived from them, we used the CIE 2006 cone fundamentals [21] and assumed a D65 white point. \Delta_{DKL} is the color coordinate in DKL, representing position along the achromatic, red-green, and yellow-violet axes. The contrast thresholds are expressed using cone contrast, a vector length in LMS space:

C_{cone} = \sqrt{ \frac{1}{3} \left[ \left( \frac{\Delta L}{L_0} \right)^2 + \left( \frac{\Delta M}{M_0} \right)^2 + \left( \frac{\Delta S}{S_0} \right)^2 \right] }    (2)

The scale factor 1/3 ensures that C_{cone} \in [0, 1]. At threshold, the probability of detection is P_det = 0.5. Using DKL space with cone contrast allows a device-independent representation of the CSF.

[Figure 1 (plots omitted): Spatio-chromatic contrast threshold functions (CSFs) along the cardinal directions of the DKL color space (achromatic, red-green, yellow-violet) [14, 15]. Left column: CSFs as a function of spatial frequency (0.125 to 32 cpd). Right column: CSFs as a function of luminance (0.001 to 10,000 cd/m², with the SDR and HDR ranges indicated). Left y-axis: contrast threshold (cone contrast). Right y-axis: contrast sensitivity (1 / cone contrast).]
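To make Eqs. (1) and (2) concrete, here is a minimal sketch of the DKL mapping and the cone-contrast measure for an LMS increment around a given background. The LMS white point is passed in as a parameter rather than hard-coded, and the function names are ours, not from the paper or its code release.

```python
import numpy as np

def lms_increment_to_dkl(d_lms, lms_white):
    """Eq. (1): map an LMS increment (dL, dM, dS) around the white point
    (L0, M0, S0) into DKL coordinates (achromatic, red-green,
    yellow-violet)."""
    L0, M0, S0 = lms_white
    M = np.array([
        [ 1.0,  1.0,            0.0],
        [ 1.0, -L0 / M0,        0.0],
        [-1.0, -1.0, (L0 + M0) / S0],
    ])
    return M @ np.asarray(d_lms, dtype=float)


def cone_contrast(d_lms, lms_white):
    """Eq. (2): cone contrast, a normalized vector length in LMS space.
    The 1/3 factor keeps the value in [0, 1]."""
    r = np.asarray(d_lms, dtype=float) / np.asarray(lms_white, dtype=float)
    return np.sqrt(np.mean(r ** 2))  # mean over 3 channels == (1/3) * sum
```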
[…] we search along the chosen dimension of the target color space for the increment Δc that yields the detection threshold. For example, consider sRGB. We start at (R_D65, G_D65, B_D65) and search for ΔR such that (R_D65 + ΔR, G_D65, B_D65) results in P_det = 0.5, assuming a guess rate of 0. To obtain P_det, we convert from sRGB to LMS and query the CSF.

Since Δc is an increment in the non-linear target color space, not a contrast, we refer to our thresholds as increment thresholds or color increment thresholds, and to the functions as Color Threshold Functions (CTFs). We show increment thresholds (Δc > 0) but not decrement thresholds (Δc < 0), because the underlying CSF has comparable positive and negative contrast thresholds, resulting in similar increment and decrement thresholds in the target color space.

Below, we report increment thresholds for RGB (SDR and HDR), YCbCr (SDR and HDR), CIE L∗a∗b∗ and CIE L∗u∗v∗ (SDR only), and ICtCp (HDR only). For SDR, we used the sRGB non-linearity on ITU-R BT.709 color primaries [22], assuming a display luminance range of 0.1 cd/m² to 100 cd/m². For HDR, we used PQ as the transfer function on ITU-R BT.2020 color primaries [10], assuming a display luminance range between 0.005 cd/m² and 10,000 cd/m². The values of the increment thresholds for SDR and HDR cannot be directly compared, because the SDR color gamut is smaller than that of HDR, meaning that an increment threshold of 0.1 corresponds to a different amount of physical change in SDR than in HDR.

In addition, all plots show the maximum quantization error assuming different bit-depths for each color space (horizontal dotted lines). The maximum quantization error is calculated as:

max_{eq} = 0.5 \, \frac{v_{max} - v_{min}}{2^b - 1},    (3)

where b is the bit-depth, and v_max and v_min are the maximum and minimum values to be encoded, respectively. Note that v_min can be negative for some color channels (e.g., a∗ of CIE L∗a∗b∗). When the error is above the threshold, quantization artifacts are likely visible; when the error is below it, the artifacts are likely invisible, but code-words are wasted.

RGB

We show CTFs in RGB with the sRGB non-linearity (Fig. 2), with PQ encoding (PQ-RGB; Fig. 3), and without any non-linearities at HDR luminance levels (linear RGB; Fig. 4). Without any non-linearities, thresholds rise with increasing luminance. In comparison, the CTFs for sRGB and PQ-RGB are approximately flat, […]
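The increment-threshold search described above reduces to a one-dimensional bisection along a single channel. The sketch below is ours: the CSF query is abstracted behind a detection_probability(background, test) callable, a hypothetical stand-in for converting both colors to LMS and evaluating the model of [14], and a helper for Eq. (3) is included for completeness.

```python
import numpy as np

def max_quantization_error(v_min, v_max, bit_depth):
    """Eq. (3): half of one quantization step when the range
    [v_min, v_max] is encoded with 2**bit_depth code-words."""
    return 0.5 * (v_max - v_min) / (2 ** bit_depth - 1)


def increment_threshold(background, channel, detection_probability,
                        hi=1.0, target=0.5, tol=1e-4):
    """Bisect for the channel increment at which the probability of
    detection reaches `target` (0.5 at threshold). Assumes detection
    probability grows monotonically with the increment and that `hi`
    is large enough to be clearly visible."""
    background = np.asarray(background, dtype=float)
    lo = 0.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        test = background.copy()
        test[channel] += mid
        if detection_probability(background, test) < target:
            lo = mid   # still invisible: try a larger increment
        else:
            hi = mid   # already detectable: try a smaller increment
    return 0.5 * (lo + hi)


# Hypothetical usage: threshold for the R channel around a mid-grey
# sRGB background, given some callable `p_det` wrapping the CSF:
#   delta_r = increment_threshold([0.5, 0.5, 0.5], 0, p_det)
```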