Programación Matemática y Software (2010) Vol. 2, No. 2. ISSN: 2007-3283. Received: August 17, 2010. Accepted: November 25, 2010. Published online: December 30, 2010.

Measuring perceived color difference using YIQ NTSC transmission color space in mobile applications

Yuriy Kotsarenko, Fernando Ramos TECNOLÓGICO DE MONTERREY, CAMPUS CUERNAVACA.

Resumen: This work introduces several formulas that measure the difference between colors in perceptual terms using the YIQ color space. The classical formulas and their derivatives based on the CIELAB and CIELUV color spaces require many arithmetic transformations of the input values, commonly given as red, green and blue components, and are therefore too heavy to implement on mobile devices. The alternative formulas proposed in this work, based on the YIQ color space, are simple and can be computed quickly, even in real time. A comparison between the classical formulas and the proposed ones is included, using two different groups of experiments. The first group of experiments focuses on evaluating the perceived difference obtained with the different formulas, while the second group measures the performance of each formula in terms of speed when processing images. The experimental results indicate that the formulas proposed in this work are very close in perceptual terms to those of CIELAB and CIELUV, but are significantly faster, which makes them good candidates for measuring color differences on mobile devices and in real-time applications.

Abstract: Alternative formulas are presented for measuring the perceived difference between two color samples defined in the YIQ color space. The classical color difference formulas such as CIELAB, CIELUV and their derivatives require many arithmetic transformations of the input values, commonly given as red, green and blue components, making these formulas cumbersome to implement on mobile devices. The alternative formulas based on the YIQ color space presented in this work are lightweight and can be computed quickly, even in real time. A comparison is made between the newly introduced formulas and the classical ones using two different sets of experiments. The first set of experiments focuses on evaluating the perceived difference of colors picked using the different formulas. The second set benchmarks the color difference formulas to evaluate their performance when processing images. The experimental results show that the newly introduced formulas are close in perceptual terms to CIELAB and CIELUV but are significantly faster, making them good candidates for measuring color difference on mobile devices and in real-time applications.

KEYWORDS: COLOR DIFFERENCE, COLOR METRIC, YIQ, PERCEPTUAL UNIFORMITY, COLOR SPACE

Introduction

There are many applications where two or more colors need to be compared for some purpose. Examples of such applications include, but are not limited to, stereo vision algorithms, recognition algorithms used in surveillance, the textile industry and image processing. As the number of mobile devices grows, there is an increasing range of devices, each with a unique set of processing capabilities. In the typical software applications running on these systems, colors are usually defined by their respective red, green and blue values, also called the RGB color space. A comparison of two sample colors usually involves calculating the distance or difference between them. If the sample colors have their respective coordinates specified as (r1, g1, b1) and (r2, g2, b2), then the difference between the two colors can be calculated as:

$$\Delta E_{rgb} = \sqrt{(r_2 - r_1)^2 + (g_2 - g_1)^2 + (b_2 - b_1)^2} \quad (1)$$

The values of r1, g1, b1 and r2, g2, b2 are specified in the range of [0, 1]. In some performance-critical applications, the square root is omitted and the value of

$$\Delta E_{rgb}^2 = (\Delta r)^2 + (\Delta g)^2 + (\Delta b)^2 \quad (1.1)$$

is used instead. The equation (1.1) is both easy to implement and fast to compute. However, arbitrary variations of values in RGB space are not perceived as uniform by the human eye. This fact led to the development of more perceptually uniform color spaces described later on in this work. In addition, not all colors visible to the human eye can be modeled by the RGB color space [1]. The RGB color space itself can be defined in many ways depending on the three chosen primaries, making the space dependent on the particular device used. In the context of this work, the values of the RGB color space are specified according to the sRGB standard described in Rec. 709 [2].

In an informal article "Colour Metric" posted on the CompuPhase web site [3], Thiadmer Riemersma suggested a variation of the equation (1), which apparently gives better results in perceptual terms; he justifies it by the fact that the modification is used in their products such as EGI, AniSprite and PaletteMaker. The proposed modification is the following:

$$\Delta E = \sqrt{\left(2 + \frac{\bar{r}}{256}\right)\Delta r^2 + 4\,\Delta g^2 + \left(2 + \frac{255 - \bar{r}}{256}\right)\Delta b^2} \quad (2)$$

where $\bar{r} = (r_1 + r_2)/2$ and the r1, g1, b1, r2, g2 and b2 values are given in the range of [0, 255]. According to Thiadmer Riemersma, the formula was derived from several tests using a program which displayed a 64-color palette and a sample color that the user had to match to an existing color on the palette; in addition, the program also suggested a matching color depending on one of the available color difference equations.
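For illustration, the two metrics above can be sketched in Python as follows (a minimal sketch; the function names are ours, not from the paper):

```python
import math

def delta_e_rgb(c1, c2):
    # Equation (1): plain Euclidean distance; components in [0, 1].
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(c1, c2)))

def delta_e_riemersma(c1, c2):
    # Equation (2): red-mean weighted distance; components in [0, 255].
    r_mean = (c1[0] + c2[0]) / 2.0
    dr, dg, db = (a - b for a, b in zip(c1, c2))
    return math.sqrt((2 + r_mean / 256) * dr * dr
                     + 4 * dg * dg
                     + (2 + (255 - r_mean) / 256) * db * db)
```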

The equation (2) has an elegant form and can actually accept integer values for components – a good point for mobile applications – but it lacks experimental data to justify the selection of the weight coefficients, and the program used by the author to deduce the equation (2) displayed a very limited palette to the users, making the reliability of the proposed formula inconclusive.

A color space that can encompass all the colors visible to the human eye has been proposed – the CIE 1931 XYZ color space, which is considered to be a CIE standard. The selection of the XYZ functions and the development of the color space were described by J. Schanda [4]. The coordinates specified in RGB color space can be transformed to CIE XYZ as the following [2], [5]:

$$\begin{bmatrix} X \\ Y \\ Z \end{bmatrix} = \begin{bmatrix} 0.412453 & 0.357580 & 0.180423 \\ 0.212671 & 0.715160 & 0.072169 \\ 0.019334 & 0.119193 & 0.950227 \end{bmatrix} \begin{bmatrix} r \\ g \\ b \end{bmatrix} \quad (3)$$

where X, Y and Z are the new coordinates in CIE XYZ color space and r, g and b are the linear RGB coordinates [5], [6]. The transformation from the non-linear r', g' and b' coordinates to their linear counterparts is made by applying [2], [6]:

$$V_L = \begin{cases} \dfrac{V_{sRGB}}{12.92}, & 0 \le V_{sRGB} \le 0.03928 \\[2mm] \left(\dfrac{V_{sRGB} + 0.055}{1.055}\right)^{2.4}, & 0.03928 < V_{sRGB} \le 1 \end{cases} \quad (4)$$

where V_sRGB should be replaced by the r', g' and b' values to obtain the linear values of r, g and b. The coefficients used in the equation (4) are specified according to the Rec. 709 standard [7], but other specifications exist [2].

The CIE 1931 XYZ color space is primarily used for specifying other perceptually uniform color spaces. One such color space that is also considered a CIE standard is the CIE 1976 L*, a*, b* (CIELAB) color space, which can encompass all the colors visible to the human eye and is somewhat perceptually uniform [2], [9]. The coordinates in CIELAB color space can be calculated from CIE XYZ values as the following [2], [4]:

$$L^* = 116\, f\!\left(\frac{Y}{Y_n}\right) - 16 \quad (5)$$

$$a^* = 500 \left( f\!\left(\frac{X}{X_n}\right) - f\!\left(\frac{Y}{Y_n}\right) \right) \quad (6)$$

$$b^* = 200 \left( f\!\left(\frac{Y}{Y_n}\right) - f\!\left(\frac{Z}{Z_n}\right) \right) \quad (7)$$

$$f(t) = \begin{cases} t^{1/3}, & t > \left(\frac{6}{29}\right)^3 \\[1mm] \frac{1}{3}\left(\frac{29}{6}\right)^2 t + \frac{4}{29}, & \text{otherwise} \end{cases} \quad (8)$$

where f(t) is a temporary transform function and Xn, Yn, and Zn are the coordinates of the particular white point used in the calculations. According to Rec. 709 [5], the white point D65 has the coordinates Xn = 0.9505, Yn = 1 and Zn = 1.0891 (calculated from its x and y coordinates of 0.3127 and 0.329 respectively on the CIE diagram [1], [6], [7], [9]). The values of a* and b* can vary in the [-100, 100] range for typical images, while the value of L* is usually defined in the range of [0, 100]. The color difference between a pair of two samples can therefore be calculated as:

$$\Delta E_{Lab}^* = \sqrt{(\Delta L^*)^2 + (\Delta a^*)^2 + (\Delta b^*)^2} \quad (9)$$

There are many color difference formulas derived from the CIELAB color difference (for instance, several variants are measured in [10]) and in many cases they can be generalized to the following form [10], [11]:

$$\Delta E_{gen}^* = \sqrt{\left(\frac{\Delta L^*}{k_L S_L}\right)^2 + \left(\frac{\Delta C^*}{k_C S_C}\right)^2 + \left(\frac{\Delta H^*}{k_H S_H}\right)^2 + R} \quad (10)$$

where R, according to M. R. Luo, G. Cui and B. Rigg [11], is an interactive term between chroma and hue differences (in many variants it is set to zero [10]); kL, kC, and kH are parameters given by the user that, according to M. Melgosa, J. J. Quesada, and E. Hita [10], depend on the experimental conditions based on the environment of the application; SL, SC, and SH are the correction factors or so-called weighting functions that apparently improve the perceptual uniformity of the equation. The parameters ΔC* and ΔH* are the variations in chroma and hue respectively; they can be calculated according to J. Schanda [4] (in a variant such as CIEDE2000 they are calculated in a similar way [12]):

$$C_i^* = \sqrt{(a_i^*)^2 + (b_i^*)^2}, \quad i = 1, 2 \quad (11)$$

$$\Delta C^* = C_2^* - C_1^* \quad (12)$$

$$h_i^* = \tan^{-1}\!\left(\frac{b_i^*}{a_i^*}\right), \quad i = 1, 2 \quad (13)$$

$$\Delta H^* = 2\sqrt{C_1^* C_2^*}\, \sin\!\left(\frac{h_2^* - h_1^*}{2}\right) \quad (14)$$

The simplified equation using the aforementioned cylindrical values of C* and h* according to J. Schanda [4] is the following:

$$\Delta E_{LCH} = \sqrt{(\Delta L^*)^2 + (\Delta C^*)^2 + (\Delta H^*)^2} \quad (15)$$

The equations (9), (10) and (15) represent the human perception of colors more closely (unless CIELAB is not really perceptually uniform [9]) and they provide a way of calculating color differences for all colors that can be seen by the human eye (as opposed to the RGB-based color difference formulas described earlier). However, they require transformations from the original RGB coordinates to CIELAB, and the equation (15) requires a further transformation of the coordinates to a cylindrical coordinate system, making the CIELAB equations cumbersome to implement on mobile devices, where arithmetical capabilities can be quite limited. The generalized equation (10) for the CIELAB color difference variants can be difficult to implement for the end user, who may not know the details of the development of the equation well enough to properly compute the arguments R, kL, kC, kH, SL, SC, and SH. Thus, the equation (10) is not only cumbersome to implement, but also difficult to use. The equations based on the CIELAB color space have successfully been used on still images but are not suitable for real-time processing [2].

An alternative perceptually uniform color space has been proposed as a CIE standard – the CIE L*, u*, v* (CIELUV) color space. The color coordinates in CIELUV color space can be calculated from CIE XYZ values as the following [2], [4]:

$$L^* = \begin{cases} 116 \left(\frac{Y}{Y_n}\right)^{1/3} - 16, & \frac{Y}{Y_n} > \left(\frac{6}{29}\right)^3 \\[1mm] \left(\frac{29}{3}\right)^3 \frac{Y}{Y_n}, & \text{otherwise} \end{cases} \quad (16)$$

$$u^* = 13 L^* (u' - u_n') \quad (17)$$

$$v^* = 13 L^* (v' - v_n') \quad (18)$$

where u' and v' are the parameters from the CIE 1976 UCS diagram [4], and u'n and v'n are the coordinates of the reference white point. The coordinates u' and v' (and their respective white point values) can be calculated according to J. Schanda [4] as:

$$u' = \frac{4X}{X + 15Y + 3Z} \quad (19)$$

$$v' = \frac{9Y}{X + 15Y + 3Z} \quad (20)$$

The values of u* and v* may vary in [-100, 100] for typical images, while L* is usually defined in the range of [0, 100]. Note that L* in CIELUV space refers to the same L* as in the CIELAB color space. The equation (16) for calculating L* is slightly different from the equation (5) as they have been taken from different sources, but it can be determined graphically that they represent practically the same curve. In the works of C. Poynton [2] and J. Schanda [4] it can be verified that the L* parameter in both CIELAB and CIELUV is the same. The color difference in CIELUV space can be determined as [13], [14]:

$$\Delta E_{Luv}^* = \sqrt{(\Delta L^*)^2 + (\Delta u^*)^2 + (\Delta v^*)^2} \quad (21)$$

The CIELUV color space can also be defined using cylindrical coordinates similar to CIELAB as described earlier in this work, with its accompanying color difference equation, but it is not described here because of size constraints.

The equation (21) shares similar characteristics with the CIELAB color difference equation (9). It predicts the color difference between the samples close to how it is seen by the human eye, but it has the same weakness of being difficult to implement on mobile devices with limited mathematical capabilities. In fact, the CIELUV equation (21) has a more complex implementation than the CIELAB equation (9) because of the intermediary values u', v' and their respective equivalents for the white point. According to E. M. Granger [9], the CIELUV space is a better alternative for measuring perceived color differences than CIELAB, suggesting further improvements to the luminance parameter L*.

More complex color difference metrics have been developed, such as the DIN99 color space [15] with its own color difference formula and its derivatives [16]. The DIN99 color space is considered a German standard; its coordinates are calculated by transforming CIELAB coordinates using logarithmic functions and rotation. The DIN99-based color difference equations are more complex and require more mathematical operations, making them worse candidates for mobile devices and applications.
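To make the cost of the CIELAB route concrete, the chain of equations (3)-(9) can be sketched in Python (a sketch under the constants quoted above; the function names are ours):

```python
import math

def srgb_to_linear(v):
    # Equation (4): undo the sRGB transfer function; v in [0, 1].
    return v / 12.92 if v <= 0.03928 else ((v + 0.055) / 1.055) ** 2.4

def linear_rgb_to_xyz(r, g, b):
    # Equation (3): linear RGB with Rec. 709 primaries to CIE XYZ.
    return (0.412453 * r + 0.357580 * g + 0.180423 * b,
            0.212671 * r + 0.715160 * g + 0.072169 * b,
            0.019334 * r + 0.119193 * g + 0.950227 * b)

def xyz_to_lab(x, y, z, white=(0.9505, 1.0, 1.0891)):
    # Equations (5)-(8), using the D65 white point quoted in the text.
    def f(t):
        return t ** (1.0 / 3.0) if t > (6.0 / 29.0) ** 3 \
            else (29.0 / 6.0) ** 2 * t / 3.0 + 4.0 / 29.0
    fx, fy, fz = (f(c / w) for c, w in zip((x, y, z), white))
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)

def delta_e_lab(c1, c2):
    # Equation (9) after converting both non-linear sRGB triplets to CIELAB.
    labs = [xyz_to_lab(*linear_rgb_to_xyz(*map(srgb_to_linear, c)))
            for c in (c1, c2)]
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(*labs)))
```

Counting the operations involved (one power function per channel in equation (4), a matrix multiply, three cube roots per sample) illustrates why this chain is expensive compared to a single linear transform.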

An alternative model proposed by S. L. Guth, called the ATD Model for Color Vision [17], has a different approach to measuring color difference, using two separate formulas: the short color difference and the large color difference. These two formulas are calculated by a series of transformations, starting with the x, y and z chromaticity coordinates on the CIE diagram (not the CIE XYZ values), which are consecutively transformed to new sets of coordinates in roughly five steps, after which the aforementioned color difference formulas can be calculated. The entire process uses a lot of multiplications and temporary variables. The color difference formulas based on the ATD Model have their own advantages, such as being closer to human perception as suggested by E. M. Granger [9] and closely predicting the MacAdam results [18] (parts of the CIE chromaticity diagram which are perceived by the human eye as the same color), making them a good alternative when a perceptually uniform color distance measure is critical. However, the intermediary calculations required for computing the ATD color difference formulas are a significant obstacle for using the ATD Model on mobile devices, especially in real-time.

Another approach to modeling colors and calculating the color difference has been described by M. Sarifuddin and R. Missaoui [19]. According to the authors, their new HCL color space is perceptually uniform (although further studies are required to justify this) and its coordinates can be calculated from non-linear (presumably) RGB coordinates:

$$Q = e^{\alpha \gamma} \quad (22)$$

$$\alpha = \frac{\min(r', g', b')}{\max(r', g', b')} \cdot \frac{1}{Y_0} \quad (23)$$

$$L_{saf} = \frac{Q \cdot \max(r', g', b') + (1 - Q) \cdot \min(r', g', b')}{2} \quad (24)$$

$$C_{saf} = \frac{Q}{3} \left( |r' - g'| + |g' - b'| + |b' - r'| \right) \quad (25)$$

$$H_{saf} = \tan^{-1}\!\left(\frac{g' - b'}{r' - g'}\right) \quad (26)$$

where Q is a luminosity tuning parameter, α is a temporary variable, γ = 3 and Y0 = 100. The values of r', g' and b' are given in the range of [0, 255]. The value of Lsaf is thought to have values in the approximate range of [0, 216], while Hsaf takes values in the range of [-π/2, π/2] (although the authors provide a solution for redefining Hsaf in the full range of [-π, π]). The range of Csaf is not explained in the work. The color difference described in [19] is therefore calculated as:

$$\Delta E_{HCL} = \sqrt{A_L (\Delta L_{saf})^2 + A_{CH} \left( C_1^2 + C_2^2 - 2 C_1 C_2 \cos(\Delta H) \right)} \quad (27)$$

where A_L = 1.4456 and A_CH = ΔH + 0.16.

The main advantage of the equation (27) is that its underlying HCL color coordinates can be calculated directly from non-linear RGB values. The HCL color space also attempts to model the luminance L (although according to C. Poynton [2] the term is misapplied, as luminance is better described by the approximation of L*) in the way it is perceived by the human eye. However, according to C. Poynton [8], components of red, green and blue having the same value do not generate the same perceived brightness. The latter means that Lsaf is not actually the perceived brightness of the color, making the equation (27) perceptually non-uniform (that is, small changes in the resulting color difference do not represent small changes in perceived colors). The usage of trigonometric functions is another drawback of the equation, as they may be difficult to compute on mobile devices.

The alternative color metric

In television, a color space denoted YIQ has been introduced by the National Television System Committee (NTSC) [20], [19] for the color broadcast. The color space is composed of the component Y called luma [2], [8], [21], which is proportional to the gamma-corrected luminance of a color, and two other components I and Q, the combination of which describes both the hue and saturation of the color [21]. According to W. K. Pratt [21], the reasons for transmitting the YIQ components in television were that the Y signal alone could be used with existing monochrome receivers to display black and white video, and that the I and Q signals could have their transmission bandwidth limited without noticeable image degradation.

The values of Y, I and Q can be directly calculated from the non-linear r', g' and b' components in the following way [21]:

$$\begin{bmatrix} Y \\ I \\ Q \end{bmatrix} = \begin{bmatrix} 0.29889531 & 0.58662247 & 0.11448223 \\ 0.59597799 & -0.27417610 & -0.32180189 \\ 0.21147017 & -0.52261711 & 0.31114694 \end{bmatrix} \begin{bmatrix} r' \\ g' \\ b' \end{bmatrix} \quad (28)$$

The reverse transformation from YIQ color space back to RGB is accomplished as:

$$\begin{bmatrix} r' \\ g' \\ b' \end{bmatrix} = \begin{bmatrix} 1.00000000 & 0.95608445 & 0.62088850 \\ 1.00000000 & -0.27137664 & -0.64860590 \\ 1.00000000 & -1.10561724 & 1.70250126 \end{bmatrix} \begin{bmatrix} Y \\ I \\ Q \end{bmatrix} \quad (29)$$

The value of Y is defined in the range of [0, 1], while I and Q are defined in ranges of approximately [-0.5, 0.5] for typical images. It is important to note that the luma component Y, although directly related to color brightness, does not represent the luminance and should not be confused with L* in the previously described CIELAB and CIELUV color models.

The color difference in YIQ color space would be the distance between two color points, which can be calculated as the following:

$$\Delta E = \sqrt{(\Delta Y)^2 + (\Delta I)^2 + (\Delta Q)^2} \quad (30)$$

where ΔY = Y1 − Y2, ΔI = I1 − I2 and ΔQ = Q1 − Q2. However, in the above equation the three terms Y, I and Q are not equally proportional in perceived terms – the range of the possible values is different for each one of the components, and the eye is more sensitive to the brightness of the color than to chroma [20]. In order to compensate for these irregularities, three coefficients are introduced:

$$\Delta E_{yiq} = \sqrt{k_Y (\Delta Y)^2 + k_I (\Delta I)^2 + k_Q (\Delta Q)^2} \quad (31)$$

where kY, kI and kQ are the weighting coefficients to compensate for the proportions of the Y, I and Q coordinates.
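The forward transform (28) and the plain distance (30) amount to very little code; a Python sketch follows (the function names are ours):

```python
import math

def rgb_to_yiq(r, g, b):
    # Equation (28): non-linear r', g', b' in [0, 1] to YIQ.
    return (0.29889531 * r + 0.58662247 * g + 0.11448223 * b,
            0.59597799 * r - 0.27417610 * g - 0.32180189 * b,
            0.21147017 * r - 0.52261711 * g + 0.31114694 * b)

def delta_e_yiq_plain(rgb1, rgb2):
    # Equation (30): unweighted Euclidean distance in YIQ.
    return math.sqrt(sum((a - b) ** 2
                         for a, b in zip(rgb_to_yiq(*rgb1), rgb_to_yiq(*rgb2))))
```

Note that for any gray sample (r' = g' = b') the I and Q rows cancel out exactly, which is easy to verify from the coefficients in equation (28).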

The NTSC has allocated different bandwidth values for each of the signals [20] – approximately 4 MHz, 1.4 MHz and 0.6 MHz to Y, I and Q respectively. Based on that conclusion, the experimental weighting coefficients can be specified as kY = 4, kI = 1.4, and kQ = 0.6. The first step would be processing the coefficients so that they sum to unity (this way their proportions can be seen):

$$k_Y' = \frac{k_Y}{k_Y + k_I + k_Q} \approx 0.6667 \quad (32)$$

$$k_I' = \frac{k_I}{k_Y + k_I + k_Q} \approx 0.2333 \quad (33)$$

$$k_Q' = \frac{k_Q}{k_Y + k_I + k_Q} \approx 0.1000 \quad (34)$$

Since finding the above coefficients is a matter of tuning, during the experiments it was found that taking the square root of the above values gives better perceptual results. The resulting color difference equation with the weighting coefficients normalized is the following:

$$\Delta E_{yiq} = \sqrt{0.5053\, (\Delta Y)^2 + 0.2990\, (\Delta I)^2 + 0.1957\, (\Delta Q)^2} \quad (35)$$

In performance-critical applications, the color difference equation (35) can have its square root omitted and the value of ΔE²yiq used instead. If each component of Y, I and Q is stored using 8 bits in the range of [0, 255], the alternative fast color difference equation using fixed-point coefficients is the following:

$$\Delta E_{fixed}^2 = \left( 129\, (\Delta Y_i)^2 + 76\, (\Delta I_i)^2 + 50\, (\Delta Q_i)^2 \right) \gg 8 \quad (36)$$

where Yi = trunc(Y · 255), Ii = trunc(128 + I · 256) and Qi = trunc(128 + Q · 256); the integer weights are the equation (35) coefficients scaled by 256, and the final right shift by 8 divides the weighted sum back by 256. The values of Ii and Qi are clamped to fit in the range of [0, 255] in the case of overflow.

The equations (35) and (36) have a very compact form and are easy to compute on mobile devices. The general equation (31) and its specific variant (35) are to be implemented on systems powerful enough to work with real numbers, while the equation (36) is recommended for systems with severe arithmetic limitations. The resulting quality and performance of the equations are described below.

Analysis of Perceived Color Difference

The first set of experiments was focused on evaluating the perceived difference in colors calculated by the different formulas. The experiments consisted of two techniques: ordering a set of given colors with the minimal perceived difference to produce the softest gradient, and construction of color maps by picking the closest color to the existing neighbors to produce a soft-looking image. It is important to note that there is no exact solution to the tests described in this work and the color selection is subjective even to humans, so a reasonable balance is looked for. The judge of the experiments is, of course, the human eye. In the tests, each of the equations is called either by the name of its color space (e.g. RGB, LAB, LUV, YIQ and so on), by its own name (e.g. CIEDE2000), or by the author's name and the color space (e.g. Riemersma RGB).

The initial test was made using a uniform palette of colors in RGB space with the combination of R, G and B in four steps each. The original unsorted palette and the newly generated palettes are shown on the figure 1. The algorithm sorts the palette starting from the middle gray and continues by finding the next closest color match using one of the color difference formulas. Note that as the algorithm progresses, there are fewer colors left in the palette and thus less chance of success at the end. The resulting palette is a single horizontal row, which is, however, split into multiple rows on the illustrated images.

Figure 1. Sorting 64 colors in the uniform palette set: original unsorted (a), RGB (b), Riemersma RGB (c), LAB (d), LUV (e), DIN99 (f), HCL (g), CIEDE2000 (h), ATD95 (i) and YIQ (j).

Judging the resulting palettes on the figure 1, it appears that Riemersma RGB is much better than RGB, while DIN99, ATD95 and YIQ are the best of the group. LAB, LUV and HCL made some good and bad choices, LUV being slightly better than LAB and HCL. ATD95 arranged colors perceptually quite well even though it failed slightly in darker areas, while DIN99 made some mistakes in hue selection; YIQ made mistakes when comparing red shades, while handling blue and green shades quite well. CIEDE2000 performed slightly worse than ATD95, DIN99 and YIQ by making mistakes both in perceived brightness and hue.

The next set of images has been made using a random palette of 64 colors with a uniform random distribution. The random palette made the color ordering more difficult by giving more saturated colors and a palette unbalanced in terms of pure colors. The resulting palettes are shown on the figure 2. Riemersma RGB again gives better results than the original RGB, performing well with less saturated colors, while LAB and LUV give mixed results on less saturated colors but perform better with the proper hue ordering. DIN99 gives some really good choices, as does ATD95, with CIEDE2000 falling slightly behind. HCL properly detected hues but did not order them properly and performed worse with perceived brightness. YIQ handled the changes of hue and perceived brightness well, again making mistakes with colors whose hues differ slightly but sufficiently to be noticed, although overall it performed quite well.

Figure 2. Sorting 64 colors in the random palette set: original unsorted (a), RGB (b), Riemersma RGB (c), LAB (d), LUV (e), DIN99 (f), HCL (g), CIEDE2000 (h), ATD95 (i) and YIQ (j).

The next experimental technique consisted of a hexagonal map, which is filled by picking colors from the source palette and placing them on the map [22]. The process begins in the center, gradually filling the map in circles. Although there might be more colors in the palette, the algorithm stops when the entire map has been filled. The uniqueness of this experiment is that the placed color is tested against all the already filled neighbors using the color difference formula to achieve the least perceived difference, which shows the color difference between multiple colors instead of sample color pairs. The first set of images has been generated using a random palette of 288 colors with a uniform random distribution, and the starting target sample in the center is black. The results are shown on figure 3.

Figure 3. Accommodation of a randomly generated palette of 288 colors on the hexagonal map: original unsorted (a), RGB (b), Riemersma RGB (c), LAB (d), LUV (e), DIN99 (f), HCL (g), CIEDE2000 (h), ATD95 (i) and YIQ (j).

From the results shown on the figure 3 it can be seen that RGB, although it performs better with several colors involved, still made some poor choices among colors with the same perceived hue, while Riemersma RGB improved on it quite a bit. LAB, LUV and HCL, although they grouped colors with similar hue, accommodated colors that are quite different in perceived terms, making the outcome quite poor. DIN99 and CIEDE2000 performed quite well in most cases, with the error growing gradually toward the end. ATD95 in this scenario performed more poorly, especially in the darker colors. In this case, YIQ also performed quite well, especially in red and green hues, making some mistakes in the area of blue. In fact, LUV is the only formula that handled blue colors well on average.
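The greedy gradient-ordering procedure used in the first technique can be sketched as follows (a simplification; `diff` stands for any of the color difference functions discussed, and the tie-breaking is ours):

```python
def sort_palette(colors, start, diff):
    # Greedy ordering: begin from `start` (e.g. middle gray) and repeatedly
    # pick the remaining color with the smallest difference to the last pick.
    remaining = list(colors)
    ordered, current = [], start
    while remaining:
        current = min(remaining, key=lambda c: diff(c, current))
        remaining.remove(current)
        ordered.append(current)
    return ordered
```

As fewer colors remain, the chance of a close match drops, which matches the remark above that mistakes accumulate toward the end of the sorted row.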

The next set of images has been generated using a uniform RGB palette with each of the components having five steps, making 125 colors in total. There are fewer colors in this experiment, creating a bigger difference between colors in perceived terms. In addition, the initial color in the middle is white instead of black. The resulting set of images is illustrated on the figure 4.

Figure 4. Accommodation of a uniform RGB palette of 125 colors on the hexagonal map: original unsorted (a), RGB (b), Riemersma RGB (c), LAB (d), LUV (e), DIN99 (f), HCL (g), CIEDE2000 (h), ATD95 (i) and YIQ (j).

It can be seen that both RGB and Riemersma RGB performed well in the beginning, with Riemersma RGB giving better results in the area of blue colors. Both LAB and LUV performed poorly with very few good choices. DIN99, CIEDE2000 and ATD95 performed surprisingly more poorly than RGB and Riemersma RGB, making some good and bad choices, where CIEDE2000 made the best choices in the green-blue area. YIQ performed well in the area of perceived brightness (where many other algorithms failed), making mistakes with the colors having similar brightness but somewhat different hue. The last set of tests was particularly difficult for the color difference formulas, where the majority was guided by the perceived color hue, resulting in large perceived mistakes.

From the results of the previously mentioned experiments it can be concluded that many color difference formulas perform well when used on a sample pair of colors but perform more poorly when used on larger groups of colors. On average, the LAB and LUV color difference formulas performed poorly when compared to others such as DIN99, CIEDE2000 and ATD95. RGB and Riemersma RGB work better in groups than on sample color pairs. Apparently, in these experiments DIN99 was the best performer in terms of perceived differences. Surprisingly, YIQ in many cases performed quite well, sometimes being one of the best in the group, although its weakness is colors with the same perceived brightness and a small hue difference sufficient to be perceived.

Performance Analysis

The color difference formulas used in the previous set of experiments have also been subjected to a performance evaluation. The evaluation consisted of ten image pairs with the size of 640x480 pixels filled with random color data. The algorithm called the color difference formula between the pixels of each image pair, measuring the time it takes to process all ten images. The pixel data of each image had colors specified both in sRGB and CIE XYZ color spaces – if the algorithm required a different color space (e.g. CIE LUV), it had to perform the necessary conversion on a per-pixel basis. The experiments have been performed on a workstation with an Intel Core 2 Quad Q6600 processor on a 1066 MHz bus and DDR2 800 MHz RAM with an average of 5 ms access time. The application ran in a single thread. The results are shown on the table 1.

Table 1. Benchmarks of different color difference formulas.

Formula      Total Duration (ms)   Frame Rate at 640x480 (frames/second)
RGB          258                   38.83
Rie. RGB     297                   33.73
LAB          3307                  3.02
LUV          1763                  5.67
DIN99        6299                  1.59
HCL          2098                  4.77
CIEDE2000    7231                  1.38
ATD95        4430                  2.26
YIQ          386                   25.91
YIQi         125                   80.00

From the results described on the table 1 it can be seen that many color difference formulas can hardly be used for real-time processing, considering that even on a relatively powerful machine the resulting frame rate is quite low even for a small resolution. The frame rates shown on the aforementioned table might be much lower on mobile devices and their applications. The performance of RGB, Riemersma RGB and YIQ is quite acceptable, with the integer variation of YIQ (YIQi in the table) being the best candidate for real-time processing. It is important to note that integer variants of RGB and Riemersma RGB were not tested, but they are most likely to outperform the integer variation of YIQ.
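The timing procedure can be reproduced with a small harness along these lines (a sketch only; the buffer size and the formula under test are parameters, not the exact code used by the authors):

```python
import random
import time

def benchmark(diff, width=640, height=480, pairs=10, seed=1):
    # Build `pairs` pairs of random pixel buffers and time `diff` across
    # every corresponding pixel pair, as in the evaluation described above.
    rnd = random.Random(seed)
    n = width * height
    buffers = [([tuple(rnd.random() for _ in range(3)) for _ in range(n)],
                [tuple(rnd.random() for _ in range(3)) for _ in range(n)])
               for _ in range(pairs)]
    t0 = time.perf_counter()
    for a, b in buffers:
        for p, q in zip(a, b):
            diff(p, q)
    return (time.perf_counter() - t0) * 1000.0  # total duration in ms
```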

Conclusions and future work

In this work a new color difference formula based on the YIQ color space has been introduced. The formula strikes a balance between giving perceptually uniform results and being fast to compute. The classical color difference formulas, such as those based on CIE LAB and CIE LUV, are either too complex to compute on mobile devices or do not produce results in perceptual terms. The newly proposed color difference formula comes in two variations, each with its own purpose: the high-precision variant useful in general computer systems, and the integer variant using fixed-point arithmetic for computing the color difference quickly on more restricted processing equipment.

The experiments described in this work show that the RGB based color difference does not produce adequate results in perceptual terms for practical use. The color difference formulas directly applied in CIE LAB and CIE LUV space are better than RGB, but fail in some circumstances. More complex formulas such as the DIN99 color difference, the CIEDE2000 formula and the ATD95 large color difference produce better results in perceived terms, with the DIN99 color difference being the best on average in the group. The proposed color difference formula based on the YIQ color space offers a balance, with a perceptual quality close to CIEDE2000 and DIN99, while being very fast to compute. It is proposed as a good candidate for mobile devices and applications.

There are areas where the newly proposed color difference formula can be improved. Specifically, the weighting coefficients kY, kI and kQ have been derived experimentally and analytically, but may not produce optimal results under a particular set of conditions. More experiments are required to fine-tune the weighting coefficients and produce better results.

The proposed color difference formula based on the YIQ color space makes mistakes with colors having similar perceived brightness but hues different enough to be noticed. A potential deformation of the I and Q components may improve the accuracy of the color difference formula.

References

[1] Hill, Francis S. Computer Graphics Using OpenGL. Prentice Hall, 2000.

[2] Poynton, Charles. Digital Video and HDTV: Algorithms and Interfaces. Morgan Kaufmann, 2003.

[3] Riemersma, Thiadmer. "Colour metric." CompuPhase. May 26, 2008. http://www.compuphase.com/cmetric.htm (accessed October 26, 2008).

[4] Schanda, Janos. Colorimetry: Understanding the CIE System. Wiley-Interscience, 2007.

[5] Malacara-Hernandez, Daniel. Color Vision and Colorimetry: Theory and Applications. SPIE Press, April 2002.

[6] Kang, Henry R. Computational Color Technology. SPIE Press, May 2006.

[7] "Recommendation ITU-R BT.709-5." Parameter values for the HDTV standards for production and international programme exchange. International Telecommunication Union, 2008.

[8] Poynton, Charles. "Frequently-Asked Questions about Color." Charles Poynton Web Site. November 28, 2006. http://www.poynton.com/ColorFAQ.html (accessed October 22, 2008).

[9] Granger, E. M. "Is CIE L*a*b* Good Enough for Desktop Publishing?" Proceedings of SPIE, the International Society for Optical Engineering (Society of Photo-Optical Instrumentation Engineers) 2170 (1994): 144-148.

[10] Melgosa, M., J. J. Quesada, and E. Hita. "Uniformity of some recent color metrics tested with an accurate color-difference tolerance dataset." Applied Optics (Optical Society of America) 33, no. 34 (1994): 8069-8077.

[11] Luo, M. R., G. Cui, and B. Rigg. "The Development of the CIE 2000 Colour-Difference Formula: CIEDE2000." Color Research & Application 26, no. 5 (2001): 340-350.

[12] Sharma, Gaurav, Wencheng Wu, and Edul N. Dalal. "The CIEDE2000 Color-Difference Formula: Implementation Notes, Supplementary Test Data, and Mathematical Observations." Color Research & Application 30, no. 1 (February 2005): 21-30.

[13] Imai, F. H., N. Tsumura, and Y. Miyake. "Perceptual color difference metric for complex images based on Mahalanobis distance." Journal of Electronic Imaging 10, no. 2 (2001): 385-393.

[14] Carter, Robert C., and Ellen C. Carter. "CIE L*u*v* Color-Difference Equations for Self-Luminous Displays." Color Research & Application, 1983: 252-253.

[15] Büring, Hendrik. "Eigenschaften des Farbenraumes nach DIN 6176 (DIN99-Formel) und seine Bedeutung für die industrielle Anwendung." 8. Workshop Farbbildverarbeitung der German Color Group, October 2002: 11-17.

[16] Cui, G., M. R. Luo, B. Rigg, G. Roestler, and K. Witt. "Uniform Colour Spaces Based on the DIN99 Colour-Difference Formula." Color Research & Application 27, no. 4 (2002): 282-290.

[17] Guth, Lee S. "Further Applications of the ATD Model for Color Vision." Proc. SPIE (The International Society for Optical Engineering) 2414 (April 1995): 12-26.

[18] MacAdam, David L. "Visual Sensitivities to Color Differences in Daylight." Journal of the Optical Society of America 32, no. 5 (May 1942): 247-273.

[19] Sarifuddin, M., and Rokia Missaoui. "A New Perceptually Uniform Color Space with Associated Color Similarity Measure for Content-Based Image and Video Retrieval." ACM SIGIR Workshop on Multimedia Information Retrieval, Salvador, Brazil, 2005.

[20] Hearn, Donald, and Pauline M. Baker. Computer Graphics, C Version. Prentice Hall, 1996.

[21] Pratt, William K. Digital Image Processing. 3rd Edition. Wiley-Interscience, 2001.

[22] Kotsarenko, Yuriy. "Quality measurement of existing color metrics using hexagonal color fields." 7º Congreso Internacional de Cómputo en Optimización y Software, Cuernavaca, Mexico, 2009.


M. Sc. Yuriy Kotsarenko graduated as a Computer Systems Engineer from ITESM, Campus Cuernavaca, in December 2004. At the same institution he earned a Master's degree in Computer Science in December 2006 with the thesis "Perceptual Color Similarity Measurement". He is currently a graduate student and a candidate for the degree of Doctor of Computer Science at the same institution. His doctoral thesis is supervised by Dr. Fernando Ramos Quintana. His specialization includes computer graphics, computer vision and optics. He is also an expert software developer, running his own software and video game development company.

Dr. Fernando Ramos Quintana is Director of the Graduate Program in Informatics and Computing at ITESM, Campus Cuernavaca. His specialization includes combinatorial optimization, wireless communications, cooperative robotics and bioinformatics. He leads multiple research projects, including research on formal computational models for gene regulation; the bioremediation of soils, water and crops contaminated with pesticides using the recombinant esterase enzyme of the tick Boophilus Microplus expressed in the yeast Kluyveromyces Lactis; and medical robotics, among others.