
COLOR REPRODUCTION OF FACIAL PATTERN AND ENDOSCOPIC IMAGE BASED ON COLOR APPEARANCE MODELS

December 1996

Francisco Hideki Imai

Graduate School of Science and Technology, Chiba University, JAPAN

Dedicated to my parents

COLOR REPRODUCTION OF FACIAL PATTERN AND ENDOSCOPIC IMAGE BASED ON COLOR APPEARANCE MODELS

A dissertation submitted to the Graduate School of Science and Technology of Chiba University in partial fulfillment of the requirements for the degree of

Doctor of Philosophy

by Francisco Hideki Imai December 1996

Declaration

This is to certify that this work has been done by me and it has not been submitted elsewhere for the award of any degree or diploma.

Countersigned: Yoichi Miyake, Professor

Signature of the student: Francisco Hideki Imai

The undersigned have examined the dissertation entitled

COLOR REPRODUCTION OF FACIAL PATTERN AND ENDOSCOPIC IMAGE BASED ON COLOR APPEARANCE MODELS presented by ______Francisco Hideki Imai______, a candidate for the degree of Doctor of Philosophy, and hereby certify that it is worthy of acceptance.

______ Date

Advisor: Yoichi Miyake, Professor

EXAMINING COMMITTEE

______Yoshizumi Yasuda, Professor

______Toshio Honda, Professor

______Hirohisa Yaguchi, Professor

______Atsushi Imiya, Associate Professor

Abstract

In recent years, many imaging systems have been developed, and it has become increasingly important to exchange image data through computer networks. It is therefore required to reproduce color independently of each imaging device. In studies of device-independent color reproduction, colorimetric color reproduction has been performed, namely, colors with the same tristimulus values are reproduced. However, even if the tristimulus values are the same, the color appearance is not always the same under different viewing conditions. Therefore, color appearance prediction must be introduced into the color reproduction of images. In this dissertation, a new color reproduction method based on color appearance prediction is introduced for the color reproduction of facial pattern and endoscopic images. First, the spectral reflectance of skin color was analyzed by principal component analysis, and it was shown that the spectral reflectance of skin can be estimated by three principal components. On the basis of these experimental results, the spectral reflectances of a facial pattern taken by an HDTV camera were estimated, and computer simulation of colorimetric color reproduction was performed using the obtained spectral reflectances. Color appearance models of von Kries, Fairchild, and LAB were also introduced into this color reproduction algorithm. The images calculated by these models were compared with color hardcopies under various illuminants ("A", "Day Light", "Horizon", and "Cool White") using the memory matching technique. It was found that the Fairchild model is the most significant for estimating the color appearance of a skin color chart under different illuminants. However, the model was not significant when applied to the estimation of color appearance in facial patterns. Therefore, a modified Fairchild model was introduced to estimate the color appearance of facial patterns. Psychophysical experiments showed that this model estimates the color appearance of facial patterns well compared to other models.

These experiments were performed in a dark environment. However, in practice an image on a CRT is usually viewed under environmental illumination. Color appearance models must be applied to reproduce images on a CRT under various illuminants, especially for the remote diagnosis of endoscopic images. Gamut-mapping is necessary for images which cannot be reproduced within the gamut of the CRT. However, up to this time, there has been no way to quantify how gamut-mapping affects the perceived color appearance. A psychophysical metric based on the Mahalanobis distance was introduced to evaluate the influence of gamut-mapping on the color appearance instead of the conventional color difference. The distance is calculated from the covariance matrix of metric lightness, chroma and hue angle obtained by psychophysical experiments.

ACKNOWLEDGMENTS

First of all, I would like to thank Prof. Yoichi Miyake for his supervision and for giving me the opportunity to work at his image processing laboratory. I also would like to thank Dr. Hideaki Haneishi and Dr. Norimichi Tsumura for their fruitful comments and advice. A special note of thanks goes to Prof. Hirohisa Yaguchi for his collaboration. I would like to express my gratitude to Dr. Nobutoshi Ojima of Kao Corporation for his cooperation and useful advice. Thanks are also due to Kenji Koyama, Toshiki Saito and Takafumi Horiuchi for their help in the psychophysical experiments, and many thanks to Reiko Fukumasu for her kindness and patience in being a model for the pictures. I am also indebted to Mr. Yasuaki Yokoyama for his help in using the computers and devices at the laboratory, and to Mr. Kenji Hara for his kind collaboration. Special thanks to everyone who participated in the psychophysical experiments as observers. I am deeply grateful to my parents, brother, and sister for their love and affection. Finally, I would like to acknowledge the financial support given by the Chiba Prefecture Government from April 1993 to March 1995 and by the Brazilian Government (CNPq, Ministry for Science and Technology) for the scholarship from September 1996 to the end of the course. I am also grateful to many other people, whose names cannot be mentioned one by one, who in some way have contributed to the accomplishment of this research.

Francisco Hideki Imai

CONTENTS

Chapter 1  INTRODUCTION
  1.1 Color reproduction on CRT and hardcopy
    1.1.1 Colorimetric color reproduction
    1.1.2 Color appearance models
    1.1.3 Limitation of the traditional color reproduction methods
  1.2 Purpose and approach of this research
  1.3 Organization of this dissertation

Chapter 2  PREDICTION OF SPECTRAL REFLECTANCE OF A FACIAL PATTERN IMAGE TAKEN BY HDTV CAMERA
  2.1 Introduction
  2.2 Principal component analysis of the spectral reflectances of human skin and their linear approximation
  2.3 Estimation of the skin spectral reflectance
  2.4 Characterization of HDTV camera
  2.5 Conclusion

Chapter 3  COLORIMETRIC COLOR REPRODUCTION OF FACIAL PATTERN IMAGE ON CRT
  3.1 Introduction
  3.2 Tristimulus values of facial pattern image
  3.3 Calibration of CRT
  3.4 Experiments and their results
  3.5 Conclusion

Chapter 4  COLORIMETRIC COLOR REPRODUCTION OF FACIAL PATTERN IMAGE ON HARDCOPY
  4.1 Introduction
  4.2 Color reproduction method for hardcopy under various illuminants
  4.3 Multiple regression analysis using skin spectral reflectances
  4.4 Experimental results
  4.5 Conclusion

Chapter 5  COLOR REPRODUCTION OF FACIAL PATTERN IMAGE BASED ON COLOR APPEARANCE MODELS
  5.1 Introduction
    5.1.1 Chromatic adaptation models
    5.1.2 Viewing techniques for cross-media comparison
  5.2 Color appearance matching method
  5.3 Psychophysical experiments
  5.4 Experimental results
  5.5 Conclusion

Chapter 6  MODIFIED FAIRCHILD CHROMATIC ADAPTATION MODEL FOR FACIAL PATTERN IMAGES
  6.1 Introduction
  6.2 Modified Fairchild chromatic adaptation model
  6.3 Psychophysical experiments to tune
  6.4 Comparison of the modified Fairchild model with other color appearance models
  6.5 Conclusion

Chapter 7  COLOR REPRODUCTION OF ENDOSCOPIC IMAGES UNDER ENVIRONMENTAL ILLUMINATION
  7.1 Introduction
  7.2 Color reproduction method for endoscopic images under environmental illumination
  7.3 Gamut-mapping of endoscopic image
    7.3.1 Gamut-mapping in 1976 CIELUV L*C*h
    7.3.2 Perceptual Mahalanobis distance
    7.3.3 Psychophysical experiments
  7.4 Analysis of gamut-mapping for endoscopic images
  7.5 Conclusion

Chapter 8  CONCLUSION

REFERENCES

LIST OF PRESENTATIONS AND PUBLICATIONS RELEVANT TO THE PRESENT THESIS

LIST OF TABLES

Table 1. Averaged spectral reflectance of the sampled human skin.
Table 2. The first, second and third principal components of the spectral reflectance of human skin.
Table 3. Experimental setting of the CRTs.
Table 4. Coefficients obtained by quadratic curve fitting for CRTs (Nanao FlexScan 56T and AppleColor MO401) in a dark environment.
Table 5. Coefficients obtained by linear curve fitting for CRTs (Nanao FlexScan 56T and AppleColor MO401) in a dark environment.
Table 6. The tristimulus values of six skin color patches under illuminant "A".
Table 7. XL, YL, ZL tristimulus values of reflected light on CRT under various illuminants.
Table 8. Variances and covariances of metric lightness, chroma and hue angle of endoscopic image.
Table 9. Result of psychophysical experiments performed by Hara with the collaboration of physicians.
Table 10. Variation of coefficients f, e, b, g and a in Eq. (7-10).

LIST OF FIGURES

Figure 1. History of the color research.
Figure 2. Distribution of skin colors used in principal component analysis.
Figure 3. Averaged spectral reflectance of human skin.
Figure 4. Cumulative contribution ratio of principal components of the skin spectral reflectance.
Figure 5. The first, second and third principal components of the spectral reflectance of the skin.
Figure 6. Comparison between measured and predicted relative spectral reflectance of human skin.
Figure 7. Diagram of the portrait image acquisition and the prediction of the spectral reflectance of each pixel in the image.
Figure 8. Diagram of the proposed colorimetric color reproduction method to predict the tristimulus values of facial pattern image under various illuminants on a CRT in a dark environment.
Figure 9(a). The relationship between input levels of CRT (Nanao FlexScan 56T) and luminance of phosphor.
Figure 9(b). The relationship between input levels of CRT display (AppleColor MO401) and luminance of phosphor.
Figure 10(a). X-Y and Z-Y relationship for each RGB channel of the CRT (Nanao FlexScan 56T).
Figure 10(b). X-Y and Z-Y relationship for each RGB channel of the CRT (AppleColor MO401).
Figure 11. Relative spectral radiant power and chromaticities of four illuminants used in the experiment.
Figure 12. A facial pattern image reproduced colorimetrically on CRT under four illuminants; (a) "Horizon", (b) "A", (c) "Cool White", (d) "Day Light" illuminants.
Figure 13. Diagram of the color reproduction method for hardcopy of skin under various illuminants.
Figure 14. Diagram of the multiple regression analysis.
Figure 15. Averaged color differences ΔE*ab between skin color patches displayed on CRT and the corresponding hardcopies under four illuminants.
Figure 16. Printed facial pattern image under four different illuminants, (a) "Horizon", (b) "A", (c) "Cool White", (d) "Day Light" illuminants.
Figure 17. Comparison between averaged spectral reflectance of human skin and printed skin color.
Figure 18. Diagram of color reproduction of facial pattern images on CRT based on color appearance models.
Figure 19. Predicted color appearance of skin color patches under various illuminants; (a) illuminant "A", (b) "Horizon", (c) "Cool White", (d) "Day Light".
Figure 20. Color appearance predictions of a facial pattern image under illuminant "A"; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.
Figure 21. Color appearance predictions of a facial pattern image under "Horizon"; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.
Figure 22. Color appearance predictions of a facial pattern image under "Cool White"; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.
Figure 23. Color appearance predictions of a facial pattern image under "Day Light"; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.
Figure 24. Arrangement of psychophysical experiment to compare skin color images on a CRT with a hardcopy in a standard illumination booth using memory matching.
Figure 25. Percentage of selection of each color appearance model for skin color patch reproduction.
Figure 26. Percentage of selection of each color appearance model for facial pattern image reproduction.
Figure 27(a). Number of selections when KM, KS are changed from 2.9 to 3.2 and KL = 2.9.
Figure 27(b). Number of selections when KM, KS are changed from 2.9 to 3.2 and KL = 3.0.
Figure 27(c). Number of selections when KM, KS are changed from 2.9 to 3.2 and KL = 3.1.
Figure 27(d). Number of selections when KM, KS are changed from 2.9 to 3.2 and KL = 3.2.
Figure 28. A facial pattern image reproduced on a CRT for illuminant "A"; (a) XYZ, (b) Modified Fairchild (Custom model), (c) von Kries, (d) Fairchild.
Figure 29(a). Averaged interval scale of model performance for illuminant "A".
Figure 29(b). Averaged interval scale of model performance for "Day Light".
Figure 29(c). Averaged interval scale of model performance for "Cool White".
Figure 30. A skin color patch reproduced on CRT for illuminant "A"; (a) RLAB 96, (b) Modified Fairchild (Custom model), (c) von Kries, (d) LLAB models.
Figure 31. A facial pattern image reproduced on CRT for illuminant "A"; (a) RLAB 96, (b) Modified Fairchild (Custom), (c) von Kries, (d) LLAB models.
Figure 32. Percentage of skin color patch that was selected on CRT as the best one in 3 kinds of illuminants.
Figure 33. Percentage of selection for reproduced facial pattern image chosen as the best one on CRT display in 3 kinds of illuminants.
Figure 34(a). Averaged interval scale of model performance for illuminant "A".
Figure 34(b). Averaged interval scale of model performance for "Day Light".
Figure 34(c). Averaged interval scale of model performance for "Cool White".
Figure 35. Diagram of color reproduction method to reproduce the appearance of endoscopic images on CRT under various illuminants.
Figure 36(a). Spectral reflectance of CRT without radiation from CRT under illuminant "A".
Figure 36(b). Spectral reflectance of CRT without radiation from CRT under "Cool White".
Figure 36(c). Spectral reflectance of CRT without radiation from CRT under "Day Light".
Figure 37. Relationship between L*u*v* and L*C*h color spaces.
Figure 38. Stomach endoscopic image.
Figure 39. Reproduced image on CRT under illuminant "A".
Figure 40. Reproduced stomach endoscopic image under illuminant "A", for f = 0.95, e = 3, b = 1.0, g = 1 and a = 0 of Eq. (7-10).


CHAPTER 1

INTRODUCTION

The study of color has been part of human life and culture since the beginning of recorded history. Figure 1 shows a diagram of the history of color research.

[Figure 1: a diagram tracing the history of color research. Pre-scientific color studies: philosophical speculation on color by Greek and Chinese thinkers (500-300 B.C.) and the search for color organization and representation (15th century). Color science: physical measurement, a physical approach to relate color with corresponding stimuli (17th century); CIE colorimetry, a physical relationship between stimulus and color sensation (1931); color appearance, a psychophysical relationship between stimulus and color perception (1976-1996). Color imaging: cross-media color reproduction, matching tristimulus values and appearance of the original scene with the reproduced images; and the new challenge of matching tristimulus values and appearance of color images between devices with gamut-mapping.]

Figure 1. History of the color research.

In the pre-scientific color studies, Chinese and Greek philosophers only speculated on the nature of color. During the 15th century, the organization and representation of color were explored. Color science began with the establishment of a correspondence between color and its physical stimuli in the 17th century. In the 19th century, the advance of

sciences such as physics and chemistry1 contributed to the development of color science. Color science also developed with the advances in the study of physiology and psychophysics.2 The CIE (Commission Internationale de l'Eclairage) specified colorimetry in 1931, giving a physical relationship between the measured stimulus and the color response based on a standard observer. However, CIE colorimetry cannot predict the appearance of color well, because it is based on eyes adapted to pre-defined laboratory conditions, unlike the environments where most colors are seen. The specification3 and prediction4 of color appearance have been studied in the last 20 years. The study of color appearance considers the influence of environmental factors on the sensation of color. The increasing speed of computing equipment and devices such as scanners, digital cameras, printers and CRT displays (CRTs) allowed the development of techniques5 for color imaging. Color science is applied to color imaging to produce cross-media reproductions. It also gives color scientists new tools with which they can improve the study of human color vision. A new challenge in color imaging systems is to match the color appearance of the original scene with the reproduced images. In the area of multimedia, where we can access color images through the worldwide network of computers, WYSIWYG (what you see is what you get) fidelity in the reproduction of color has become very important. The traditional cross-media reproduction systems6,7 do not consider the original scene. The reproduction of the colors in the original scene can be achieved by considering the original spectral reflectances.8 The reproduction of the skin color appearance is also important to the cosmetic industry. The color appearance reproduction of the original scene is also very important in the area of telemedicine, such as the diagnosis of electronic endoscopic images. The endoscopic image is not always viewed in a dark environment, and the viewing condition is

dependent on the environment of the hospital. In remote diagnosis, however, the endoscopic images are usually sent to physicians in variously illuminated environments. Therefore, we need to consider the outer illuminant condition in the color reproduction system. Unfortunately, the processed endoscopic images cannot always be represented inside the color gamut of the CRT, particularly due to the saturation in the red channel. In such cases, gamut-mapping is required. A new challenge in the endoscopic image reproduction system is to evaluate the influence of gamut-mapping on the color appearance of the reproduced image.

1.1 Color reproduction on CRT and hardcopy

Accurate color reproduction on CRT and hardcopy has been studied by many researchers.9-11 In the following subsections, some types of color reproduction are described.

1.1.1 Colorimetric color reproduction

The study of color science developed in the last century with the advance of physics and physiology.12-21 These studies constituted the basis of colorimetry.22 Colorimetry is based on the measurement of the chromatic properties of objects by instruments such as spectrophotometers and colorimeters.23 In colorimetric color reproduction, the chromaticities and relative luminances of the reproduced colors are equal to those of the original.24 Device-independent color reproduction can be achieved by matching the chromaticities and relative luminances of the original with those of the reproduced images.

1.1.2 Color appearance models

Colorimetric color reproduction works well to reproduce the appearance of colors if the reproduction is viewed under the identical viewing condition of the original image.

However, colorimetric color reproduction is not enough to predict the appearance of colors under various environmental conditions. Many environmental factors influence the appearance of colors, such as the illuminant,25-27 the effect of background and surround,28-32 and the size of the colored area.33 One of the most significant factors affecting color appearance is the change of visual color sensitivities corresponding to changes of the illumination. This phenomenon is known as chromatic adaptation.34-37 There are two types of chromatic adaptation mechanisms: sensory and cognitive.38 Sensory mechanisms are based on the sensitivity control in the photoreceptors and neurons in the first stages of the visual system and consider changes in the white point, the luminance of the illuminant and other aspects of the viewing conditions. Many models have been proposed to measure the appearance of color.39-46 The first model of sensory chromatic adaptation was proposed by Johannes von Kries.47,48 Subsequent models of chromatic adaptation were proposed by C. J. Bartleson,49 K. Richter,50 R. W. G. Hunt,51-54 Y. Nayatani and coworkers,55-64 M. D. Fairchild,65,66 and M. R. Luo and coworkers.67-69 Color spaces such as CIELUV,70 CIELAB,70 RLAB38,71,72 and LLAB73,74 are also used to predict color appearance. There are also neural models such as the ATD model proposed by S. L. Guth.75-77 Another color appearance model is the Retinex theory proposed by E. Land, which considers the spatial distribution of all pixels in the field of view.78-80 On the other hand, cognitive mechanisms are influenced by the observer's knowledge of image content. The quantification of the cognitive mechanisms is very complex because of their psychological nature; thus, no model has been proposed for cognitive mechanisms. Various color appearance models have been applied to the color reproduction of complex images81 and evaluated by the RIT group.38,82-84 These experiments show that a


perfect color appearance model is not available and each model has advantages and disadvantages.

1.1.3 Limitation of the conventional color reproduction methods

The conventional color reproduction methods have three considerable limitations. First, these methods cannot correctly reproduce the color appearance of the original scene under various illuminants on CRT and hardcopy using only three input channels, due to the occurrence of metameric pairs24 between the original and the reproduced image. Metameric pairs are stimuli that are visually identical but spectrally different. A method to estimate the spectral composition of the color stimulus from the input signals is required to predict the tristimulus values in the reproduction on CRT and hardcopy under various illuminants. Second, many reproduction methods using color appearance models have been proposed. However, none of them has been applied specifically to skin color, which is one of the most important colors for the evaluation of reproduction quality. A comparative experiment between color patches and complex images is also not available for reproduction systems using color appearance models. Third, some of these systems include gamut-mapping to reproduce images inside the color gamut of CRT displays. However, these methods do not yet consider how gamut-mapping actually modifies the color appearance of the reproduced image. It is desirable to minimize the effect of gamut-mapping in the color appearance reproduction. Therefore, a perceptual metric based on psychophysical experiments is required to quantify the effect of gamut-mapping on the color appearance of reproduced images.

1.2 Purpose and approach of this research

The purpose of this dissertation is to match the color appearance of an original scene under various illuminants with its reproductions on CRT and hardcopy. In this dissertation, facial patterns and endoscopic images are considered as the original scenes. In particular, this study examines the performance of chromatic adaptation models for facial pattern images. This study also investigates the evaluation of the influence of gamut-mapping on the color appearance of endoscopic images reproduced on a CRT under environmental illumination. Gamut-mapping is required for endoscopic images because these images generally saturate the red channel of the CRT due to their reddish nature. As a first approach to this color appearance matching, the spectral reflectance of the original scene is estimated using principal component analysis. Then, the estimated spectral reflectance is used to predict the tristimulus values of the original scene under various illuminants. The predicted tristimulus values are reproduced on hardcopy by a printer calibration using a database of spectral reflectances. Chromatic adaptation models are introduced to reproduce the color appearance of the hardcopy on a CRT. Next, reproductions on CRT of skin color patches and facial pattern images using various color appearance models are compared by psychophysical experiments. The performance of the color appearance models is compared for various illuminants, and differences between the reproduction of skin color patches and facial pattern images are examined to find an effective model for facial pattern reproduction. A perceptual distance using the covariance matrix of CIELUV metric lightness, chroma and hue angle is defined based on the Mahalanobis distance. This distance is applied to evaluate various gamut-mapping techniques required to reproduce endoscopic images on a CRT under environmental illumination.

1.3 Organization of this dissertation

In Chapter 2, a technique for two-dimensional prediction of the spectral reflectance of a facial pattern from the RGB three-channel image taken by an HDTV camera is proposed based on principal component analysis. Chapter 3 shows a method to reproduce facial pattern images colorimetrically under various illuminants on a CRT in a dark environment, and Chapter 4 describes a method to reproduce facial pattern images colorimetrically as hardcopy. Chapter 5 presents a color reproduction method based on color appearance models, and psychophysical experiments are performed to evaluate various color appearance models. In Chapter 6, a modified Fairchild chromatic adaptation model for facial pattern images is introduced. In Chapter 7, a color appearance reproduction of endoscopic images on a CRT under various illuminants is described, and a perceptual color distance based on the Mahalanobis distance is introduced to evaluate gamut-mapping. In Chapter 8, the conclusion and the summary of the proposed color reproduction methods are presented.


CHAPTER 2

PREDICTION OF SPECTRAL REFLECTANCE OF A FACIAL PATTERN IMAGE TAKEN BY HDTV CAMERA

2.1 Introduction

The spectral reflectance of an object should be measured to predict the color of the object under various illuminants. The spectral reflectance can be represented in a multi-dimensional space. Generally, we can obtain only three-channel data from input devices. The estimation from the three-dimensional space to the multi-dimensional space can be achieved using principal components of the spectral reflectance.85-89 In this chapter, a method to predict the spectral reflectance of a portrait image taken by an HDTV camera is described.

2.2 Principal component analysis of the spectral reflectances of human skin and their linear approximation

Ojima and his coworkers measured one hundred eight spectral reflectances of facial skin from 54 Mongolian subjects (Japanese women) between 20 and 50 years old.90 The Munsell values of the skins had the following range: H = 2YR-8YR, V = 5-7, C = 2-5, and the distribution of these skin colors in the CIE 1976 L*a*b* color space is shown in Fig. 2.

[Figure 2: scatter plots of the measured skin colors in the a*-L* and a*-b* planes.]

Figure 2. Distribution of skin colors used in principal component analysis.

The spectral reflectances were measured at intervals of 5 nm between 400 nm and 700 nm. Therefore, each spectral reflectance is described as a vector o in a 61-dimensional vector space. Figure 3 shows the averaged spectral reflectance of human skin, and these values are given in Table 1.


Figure 3. Averaged spectral reflectance of human skin.


Table 1. Averaged spectral reflectance of the sampled human skin.

Wavelength (nm)   Relative spectral reflectance (%)   Wavelength (nm)   Relative spectral reflectance (%)
400   15.98   555   31.88
405   16.20   560   32.24
410   16.41   565   32.46
415   16.62   570   32.67
420   16.83   575   33.36
425   17.35   580   34.05
430   17.86   585   36.23
435   18.93   590   38.41
440   19.99   595   40.80
445   21.16   600   43.19
450   22.33   605   44.79
455   23.22   610   46.38
460   24.11   615   47.32
465   24.77   620   48.26
470   25.41   625   48.82
475   25.94   630   49.38
480   26.46   635   49.87
485   27.07   640   50.35
490   27.68   645   50.78
495   28.43   650   51.21
500   29.18   655   51.53
505   29.88   660   51.84
510   30.57   665   52.18
515   30.87   670   52.51
520   31.16   675   52.86
525   31.11   680   53.20
530   31.05   685   53.54
535   31.03   690   53.88
540   31.00   695   54.12
545   31.26   700   54.35
550   31.51

The covariance matrix of the spectral reflectances was calculated for the principal component analysis. The eigenvectors of the covariance matrix are called the principal component vectors. The spectral reflectance of human skin can then be expressed as a linear combination of the principal component vectors as follows:

$$ \mathbf{o} = \bar{\mathbf{o}} + \sum_{i=1}^{61} \alpha_i \mathbf{u}_i, \qquad \text{(2-1)} $$

where $\bar{\mathbf{o}}$ is the averaged spectral reflectance, $\mathbf{u}_i$ (i = 1...61) are the eigenvectors, and $\alpha_i$ (i = 1...61) are the weighting coefficients corresponding to the eigenvectors $\mathbf{u}_i$, respectively. The eigenvectors are ordered by the magnitude of their eigenvalues. The cumulative contribution rates of the principal component vectors are shown in Fig. 4.

Figure 4. Cumulative contribution ratio of principal components of the skin spectral reflectance.

From this figure, we can see that the cumulative contribution rate of the first three components is about 99.5%. Therefore, the spectral reflectance of human skin can be represented with approximately 99.5% accuracy by a linear combination of the three principal components u1, u2, u3. The three principal components are shown in Fig. 5.


Figure 5. The first, second and third principal components of the spectral reflectance of skin.

The values of the principal components are given in Table 2. Therefore the Eq. (2-1) can be represented approximately as follows:

$$ \mathbf{o} \approx \bar{\mathbf{o}} + \sum_{i=1}^{3} \alpha_i \mathbf{u}_i = \bar{\mathbf{o}} + (\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3) \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \alpha_3 \end{pmatrix}. \qquad \text{(2-2)} $$

Table 2. The first, second and third principal components of the spectral reflectance of human skin.

Wavelength (nm)   1st Principal Component   2nd Principal Component   3rd Principal Component
400    2.258    0.326   -0.514
405    2.232    0.306   -0.506
410    2.207    0.284   -0.497
415    2.177    0.267   -0.491
420    2.149    0.250   -0.486
425    2.184    0.242   -0.488
430    2.218    0.234   -0.488
435    2.326    0.198   -0.488
440    2.435    0.161   -0.510
445    2.527    0.102   -0.533
450    2.619    0.041   -0.557
455    2.665    0.000   -0.572
460    2.711   -0.040   -0.588
465    2.735   -0.064   -0.589
470    2.758   -0.086   -0.592
475    2.767   -0.112   -0.590
480    2.776   -0.138   -0.589
485    2.800   -0.158   -0.572
490    2.824   -0.177   -0.554
495    2.878   -0.192   -0.508
500    2.933   -0.208   -0.461
505    2.983   -0.251   -0.390
510    3.033   -0.294   -0.321
515    3.044   -0.410   -0.225
520    3.056   -0.526   -0.130
525    3.045   -0.682   -0.018
530    3.035   -0.837    0.094
535    3.019   -0.942    0.172
540    3.003   -1.047    0.250
545    2.971   -1.087    0.266
550    2.940   -1.129    0.283
555    2.915   -1.173    0.307
560    2.891   -1.213    0.330
565    2.897   -1.335    0.424
570    2.902   -1.453    0.517
575    2.927   -1.477    0.564
580    2.950   -1.500    0.611
585    3.028   -1.202    0.526
590    3.107   -0.903    0.441
595    3.154   -0.455    0.319
600    3.202   -0.006    0.199
605    3.195    0.285    0.158
610    3.188    0.576    0.117
615    3.152    0.711    0.130
620    3.117    0.846    0.142
625    3.062    0.899    0.165
630    3.006    0.952    0.186
635    2.944    0.978    0.209
640    2.882    1.004    0.231
645    2.816    1.019    0.257
650    2.751    1.034    0.284
655    2.694    1.040    0.307
660    2.639    1.046    0.331
665    2.585    1.057    0.359
670    2.530    1.069    0.386
675    2.469    1.083    0.416
680    2.407    1.098    0.445
685    2.354    1.106    0.471
690    2.302    1.114    0.497
695    2.253    1.119    0.524
700    2.202    1.124    0.550

Figure 6 shows a comparison between measured and predicted relative spectral reflectance of human skin by principal component analysis. From this figure it is possible to conclude that the linear approximation of the spectral reflectance using three principal components works well. Then, the spectral reflectance of each pixel of image can be predicted from the tristimulus values of each pixel and the spectral radiance of the illuminant, as is performed in electronic endoscopic images91 .


Figure 6. Comparison between measured and predicted spectral reflectance of human skin.
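As an illustration, the principal component analysis and the three-component approximation described in this section could be computed as in the following sketch. It is a minimal example, assuming the 108 measured spectra are stored row-wise in a NumPy array `reflectances` of shape (108, 61); the function and variable names are hypothetical.

```python
import numpy as np

def skin_pca(reflectances, n_components=3):
    """Mean spectrum, leading principal components and cumulative contribution (cf. Fig. 4)."""
    o_bar = reflectances.mean(axis=0)                  # averaged spectral reflectance o-bar
    cov = np.cov(reflectances, rowvar=False)           # 61 x 61 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)             # eigenvalues in ascending order
    order = np.argsort(eigvals)[::-1]                  # reorder by descending eigenvalue
    eigvals, eigvecs = eigvals[order], eigvecs[:, order]
    contribution = np.cumsum(eigvals) / eigvals.sum()  # cumulative contribution ratio
    return o_bar, eigvecs[:, :n_components], contribution

def approximate(o, o_bar, U):
    """Eq. (2-2): project a measured spectrum onto the three components and reconstruct it."""
    alpha = U.T @ (o - o_bar)                          # weighting coefficients alpha_i
    return o_bar + U @ alpha
```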

2.3 Estimation of the skin spectral reflectance

Considering the result of the above principal component analysis, we can estimate the spectral reflectance of skin from the tristimulus values. The tristimulus values can easily be measured by a colorimeter. The spectral reflectance of skin is estimated as follows. As is well known, the tristimulus values X, Y, Z can be calculated by Eq. (2-3):

$$ X = K \sum_{\lambda=400}^{700} E(\lambda)\,\bar{x}(\lambda)\,O(\lambda), \qquad \text{(2-3a)} $$

$$ Y = K \sum_{\lambda=400}^{700} E(\lambda)\,\bar{y}(\lambda)\,O(\lambda), \qquad \text{(2-3b)} $$

$$ Z = K \sum_{\lambda=400}^{700} E(\lambda)\,\bar{z}(\lambda)\,O(\lambda), \qquad \text{(2-3c)} $$

where $\lambda$ is the wavelength, $O(\lambda)$ is the spectral reflectance, $E(\lambda)$ is the spectral radiance of the illuminant, $\bar{x}(\lambda)$, $\bar{y}(\lambda)$, $\bar{z}(\lambda)$ are the color matching functions, and K is a coefficient given by Eq. (2-4):

$$ K = \frac{1}{\displaystyle\int_{\lambda=400}^{700} E(\lambda)\,\bar{y}(\lambda)\,d\lambda}. \qquad \text{(2-4)} $$

In vector notation, Eq. (2-3) can be expressed as follows:

$$ X = K\,\mathbf{e}^{t}\mathbf{X}\,\mathbf{o}, \qquad \text{(2-5a)} $$

$$ Y = K\,\mathbf{e}^{t}\mathbf{Y}\,\mathbf{o}, \qquad \text{(2-5b)} $$

$$ Z = K\,\mathbf{e}^{t}\mathbf{Z}\,\mathbf{o}, \qquad \text{(2-5c)} $$

where $[\,\cdot\,]^{t}$ represents the transpose, the vectors $\mathbf{e}$, $\mathbf{o}$ are the vector notations of $E(\lambda)$ and $O(\lambda)$ respectively, and the matrices $\mathbf{X}$, $\mathbf{Y}$, $\mathbf{Z}$ are the diagonal matrices formed from the color matching functions:

$$ \bar{x}(\lambda) \rightarrow \mathbf{X} = \mathrm{diag}\bigl(\bar{x}_1(\lambda_1),\ \bar{x}_2(\lambda_2),\ \ldots,\ \bar{x}_n(\lambda_n)\bigr), \qquad \text{(2-6a)} $$

$$ \bar{y}(\lambda) \rightarrow \mathbf{Y} = \mathrm{diag}\bigl(\bar{y}_1(\lambda_1),\ \bar{y}_2(\lambda_2),\ \ldots,\ \bar{y}_n(\lambda_n)\bigr), \qquad \text{(2-6b)} $$

$$ \bar{z}(\lambda) \rightarrow \mathbf{Z} = \mathrm{diag}\bigl(\bar{z}_1(\lambda_1),\ \bar{z}_2(\lambda_2),\ \ldots,\ \bar{z}_n(\lambda_n)\bigr). \qquad \text{(2-6c)} $$

From Eq. (2-2), Eq. (2-5) can be written as

$$ X \approx K\,\mathbf{e}^{t}\mathbf{X}\left[\bar{\mathbf{o}} + (\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}\right], \qquad \text{(2-7a)} $$

$$ Y \approx K\,\mathbf{e}^{t}\mathbf{Y}\left[\bar{\mathbf{o}} + (\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}\right], \qquad \text{(2-7b)} $$

$$ Z \approx K\,\mathbf{e}^{t}\mathbf{Z}\left[\bar{\mathbf{o}} + (\mathbf{u}_1\ \mathbf{u}_2\ \mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}\right]. \qquad \text{(2-7c)} $$

Eq. (2-7) can be rewritten as follows:

$$ X \approx K\,\mathbf{e}^{t}\mathbf{X}\bar{\mathbf{o}} + K\,\mathbf{e}^{t}(\mathbf{X}\mathbf{u}_1\ \mathbf{X}\mathbf{u}_2\ \mathbf{X}\mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}, \qquad \text{(2-8a)} $$

$$ Y \approx K\,\mathbf{e}^{t}\mathbf{Y}\bar{\mathbf{o}} + K\,\mathbf{e}^{t}(\mathbf{Y}\mathbf{u}_1\ \mathbf{Y}\mathbf{u}_2\ \mathbf{Y}\mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}, \qquad \text{(2-8b)} $$

$$ Z \approx K\,\mathbf{e}^{t}\mathbf{Z}\bar{\mathbf{o}} + K\,\mathbf{e}^{t}(\mathbf{Z}\mathbf{u}_1\ \mathbf{Z}\mathbf{u}_2\ \mathbf{Z}\mathbf{u}_3)\begin{pmatrix}\alpha_1\\ \alpha_2\\ \alpha_3\end{pmatrix}. \qquad \text{(2-8c)} $$

We can consider the first term of Eq. (2-8) as the contribution of the averaged spectral reflectance to the tristimulus values, and the second term as the contribution of the three eigenvectors. Then, we can rewrite Eq. (2-8) as follows:

$$ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} \approx \begin{pmatrix} \bar{X} \\ \bar{Y} \\ \bar{Z} \end{pmatrix} + \begin{pmatrix} X_1 & X_2 & X_3 \\ Y_1 & Y_2 & Y_3 \\ Z_1 & Z_2 & Z_3 \end{pmatrix} \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \alpha_3 \end{pmatrix}, \qquad \text{(2-9)} $$

where $\bar{X}$, $\bar{Y}$, $\bar{Z}$ are the averaged tristimulus values and $X_i$, $Y_i$, $Z_i$ (i = 1, 2, 3) are the tristimulus values corresponding to the three eigenvectors of the skin spectral reflectance. Then, the coefficients $\alpha_1$, $\alpha_2$, and $\alpha_3$ are given by

$$ \begin{pmatrix} \alpha_1 \\ \alpha_2 \\ \alpha_3 \end{pmatrix} = \begin{pmatrix} X_1 & X_2 & X_3 \\ Y_1 & Y_2 & Y_3 \\ Z_1 & Z_2 & Z_3 \end{pmatrix}^{-1} \left[ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} - \begin{pmatrix} \bar{X} \\ \bar{Y} \\ \bar{Z} \end{pmatrix} \right]. \qquad \text{(2-10)} $$

The spectral reflectance of human skin can then be estimated from these coefficients and the three principal components using Eq. (2-2).
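A minimal sketch of this estimation is given below, assuming the averaged spectrum `o_bar`, the principal components `U` (61 x 3), the illuminant spectrum `E` and the color matching functions `xbar`, `ybar`, `zbar` are all sampled at the same 61 wavelengths; the names are hypothetical.

```python
import numpy as np

def tristimulus(o, E, xbar, ybar, zbar):
    """Eq. (2-3): tristimulus values of a reflectance o under illuminant E."""
    K = 1.0 / np.sum(E * ybar)                          # normalization constant (Eq. 2-4)
    return K * np.array([np.sum(E * xbar * o),
                         np.sum(E * ybar * o),
                         np.sum(E * zbar * o)])

def estimate_reflectance(XYZ, o_bar, U, E, xbar, ybar, zbar):
    """Eqs. (2-9), (2-10) and (2-2): spectral reflectance from measured X, Y, Z."""
    XYZ_mean = tristimulus(o_bar, E, xbar, ybar, zbar)
    # columns: tristimulus values of the three principal components
    M = np.column_stack([tristimulus(U[:, i], E, xbar, ybar, zbar) for i in range(3)])
    alpha = np.linalg.solve(M, np.asarray(XYZ) - XYZ_mean)   # coefficients (Eq. 2-10)
    return o_bar + U @ alpha                                  # reconstruction (Eq. 2-2)
```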

2.4 Characterization of HDTV camera

Figure 7 shows the schematic diagram of the image acquisition and the estimation of the spectral reflectance in each pixel of the image based on the characterization of the HDTV camera. In the previous sections, it was shown how the spectral reflectance of a human face can be estimated from the device-independent tristimulus values X, Y, Z using Eqs. (2-2) and (2-10). In this section, X, Y, Z of each pixel of the original scene are calculated from the R, G, B color channels of the HDTV camera.

The output color channel values Ro, Go, Bo of an ideal HDTV camera can be given by Eq. (2-11):

$$ R_o = \sum_{\lambda=400}^{700} E(\lambda)\,r(\lambda)\,O(\lambda), \qquad \text{(2-11a)} $$

$$ G_o = \sum_{\lambda=400}^{700} E(\lambda)\,g(\lambda)\,O(\lambda), \qquad \text{(2-11b)} $$

$$ B_o = \sum_{\lambda=400}^{700} E(\lambda)\,b(\lambda)\,O(\lambda), \qquad \text{(2-11c)} $$

where $r(\lambda)$, $g(\lambda)$, $b(\lambda)$ are the spectral sensitivities of the camera, $E(\lambda)$ is the spectral radiance of the illuminant and $O(\lambda)$ is the spectral reflectance of the object.

[Figure 7: the HDTV camera output (R, G, B) is converted by the matrix M1 to the tristimulus values (X, Y, Z), from which the spectral reflectance of each pixel is estimated as O(λ) = ō(λ) + α1 u1(λ) + α2 u2(λ) + α3 u3(λ) using the first, second and third principal components of the spectral reflectance of human skin.]

Figure 7. Diagram of the portrait image acquisition and the prediction of the spectral reflectance of each pixel of the image.

However, the actual output color channel values R', G', B' of an HDTV camera are given by Eq. (2-12):

$$ R' = K_r \cdot f_r(R_o), \qquad \text{(2-12a)} $$

$$ G' = K_g \cdot f_g(G_o), \qquad \text{(2-12b)} $$

$$ B' = K_b \cdot f_b(B_o), \qquad \text{(2-12c)} $$

where $f_r$, $f_g$, $f_b$ are non-linear functions, and $K_r$, $K_g$, $K_b$ are white balance constants.

The non-linearity between the R, G, B levels and luminance was linearized using the following quadratic equations:92

$$ R' = -5.50 + 4.26\times10^{-1} R_o + 2.04\times10^{-3} R_o^{2}, \qquad \text{(2-13a)} $$

$$ G' = -6.06\times10^{-1} + 2.90\times10^{-1} G_o + 2.66\times10^{-3} G_o^{2}, \qquad \text{(2-13b)} $$

$$ B' = -7.37\times10^{-1} + 2.31\times10^{-1} B_o + 3.11\times10^{-3} B_o^{2}. \qquad \text{(2-13c)} $$

Generally, $r(\lambda)$, $g(\lambda)$, $b(\lambda)$ are different from the visual color matching functions. It is therefore necessary to find a color transformation function that gives the device-independent tristimulus values X, Y, Z from the HDTV camera output channels R', G', B'. Ojima and his coworkers showed that the color transformation matrix M1 shown in Eq. (2-14) gives sufficient accuracy for the HDTV camera calibration:92

$$ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = M_1 \begin{pmatrix} R' \\ G' \\ B' \\ 1 \end{pmatrix} \qquad \text{(2-14)} $$

This matrix M1 can be determined by the following technique. The tristimulus values of a series of color patches are measured by a colorimeter, and the same patches are digitized by the HDTV camera. The R', G', B' channel values in Eq. (2-13) are calculated from the output values R, G, B of the HDTV camera. The matrix M1 can then be determined by multiple regression analysis of the measured X, Y, Z tristimulus values and the calculated R', G', B' color channel values of the digitized color patches. The tristimulus values X, Y, Z of thirty-nine selected patches of Japanese skin color were measured by a spectrophotometer (Minolta CM1000). The Munsell values of the patches have the following range: H = 0YR-10YR, V = 5-8, C = 2-5.
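As an illustration of this regression step, the sketch below performs a least-squares fit, under the assumption that the measured tristimulus values and the linearized camera signals of the N patches are available as arrays `XYZ` (N x 3) and `RGB_lin` (N x 3); the names are hypothetical.

```python
import numpy as np

def fit_camera_matrix(RGB_lin, XYZ):
    """Least-squares fit of a 3x4 matrix M1 so that XYZ ~= M1 @ [R', G', B', 1]^T (Eq. 2-14)."""
    A = np.hstack([RGB_lin, np.ones((RGB_lin.shape[0], 1))])  # append constant term
    M1, *_ = np.linalg.lstsq(A, XYZ, rcond=None)               # (4, 3) regression solution
    return M1.T                                                # return as a 3 x 4 matrix

def camera_to_xyz(M1, rgb_lin):
    """Apply the calibration matrix to one linearized pixel (R', G', B')."""
    return M1 @ np.append(rgb_lin, 1.0)
```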

The skin color patches were digitized under metal halide lamp illumination (RDS, with a color temperature of 5,700 K) at a 2° field of view by an HDTV camera (Nikon HQ1500C). The calculated color transform matrix M1 is shown in Eq. (2-15):

$$ M_1 = \begin{pmatrix} 2.09\times10^{-1} & 5.37\times10^{-2} & 5.09\times10^{-2} & -2.41 \\ 1.17\times10^{-1} & 2.07\times10^{-1} & -4.80\times10^{-3} & -2.05 \\ 7.69\times10^{-4} & 7.67\times10^{-3} & 3.65\times10^{-1} & -0.827 \end{pmatrix} \qquad \text{(2-15)} $$

The accuracy of this color transformation is verified by the ΔE*ab color difference93,94 between the tristimulus values Xm, Ym, Zm and Xe, Ye, Ze. Xm, Ym, Zm are measured using a spectrophotometer (Minolta CM1000), and Xe, Ye, Ze are estimated using the matrix M1 in Eq. (2-15). The color difference is calculated as shown in Eq. (2-16):

$$ \Delta E^{*}_{ab} = \sqrt{(L^{*}_m - L^{*}_e)^2 + (a^{*}_m - a^{*}_e)^2 + (b^{*}_m - b^{*}_e)^2}, \qquad \text{(2-16)} $$

where $(L^{*}_m, a^{*}_m, b^{*}_m)$ and $(L^{*}_e, a^{*}_e, b^{*}_e)$ are the CIE 1976 L*a*b* coordinates calculated from Xm, Ym, Zm and Xe, Ye, Ze, respectively.

The averaged color difference ΔE*ab was 1.0 and the maximum color difference was 2.3. This result shows that the color transformation by matrix M1 in Eq. (2-14) has sufficient accuracy to calculate the X, Y, Z values necessary to estimate the spectral reflectance.
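For reference, Eq. (2-16) together with the standard CIELAB conversion can be evaluated as in the following sketch; the white point `XYZ_n` and the function names are assumptions.

```python
import numpy as np

def xyz_to_lab(XYZ, XYZ_n):
    """Convert tristimulus values to CIE 1976 L*a*b* relative to the white point XYZ_n."""
    def f(t):
        return np.where(t > (6 / 29) ** 3, np.cbrt(t), t / (3 * (6 / 29) ** 2) + 4 / 29)
    x, y, z = f(np.asarray(XYZ, float) / np.asarray(XYZ_n, float))
    return np.array([116 * y - 16, 500 * (x - y), 200 * (y - z)])

def delta_e_ab(XYZ_m, XYZ_e, XYZ_n):
    """Eq. (2-16): Euclidean distance between the two CIELAB coordinates."""
    return np.linalg.norm(xyz_to_lab(XYZ_m, XYZ_n) - xyz_to_lab(XYZ_e, XYZ_n))
```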

2.5 Conclusion

In this chapter, a method to predict the spectral reflectance of a portrait image digitized by an HDTV camera is described. The spectral reflectance of Japanese women's skin was estimated with 99.5% accuracy using the first, second and third principal components of the spectral reflectance of skin. The obtained color difference of less than 2.3 indicates that the camera calibration was performed with sufficient accuracy to calculate the tristimulus values.


CHAPTER 3

COLORIMETRIC COLOR REPRODUCTION OF FACIAL PATTERN IMAGE ON CRT

3.1 Introduction

Figure 8 shows a schematic diagram of the colorimetric color reproduction of a facial pattern image on a CRT in a dark environment. A facial pattern image is taken by an HDTV camera under an illuminant with spectral radiance E1(λ).

[Figure 8: the original scene O(λ) under illuminant E1(λ) is captured by the HDTV camera as (R, G, B), which is converted by M1 to (X, Y, Z); the spectral reflectance O(λ) of each pixel is estimated, the tristimulus values (X', Y', Z') under a new illuminant E2(λ) are predicted, and M2 converts them to the CRT input levels (Rc, Gc, Bc).]

Figure 8. Diagram of the proposed colorimetric color reproduction method to predict the tristimulus values of facial pattern image under various illuminants on a CRT in a dark environment.

The tristimulus values X, Y, Z of the portrait image are calculated from the three R, G, B channel values of the HDTV camera using the transformation matrix M1, as shown in the previous chapter. In the proposed method, the two-dimensional spectral reflectance of the object in the scene is calculated from the tristimulus values of the digitized image based on the principal component analysis of the spectral reflectance of human skin. Then, the tristimulus values of the image under a new illuminant with spectral radiance E2(λ) can be predicted from the estimated spectral reflectance. The Rc, Gc, Bc input values of the CRT are calculated by the color transform matrix M2 obtained by the colorimetric calibration of the CRT.

3.2 Tristimulus values of facial pattern image

The tristimulus values X', Y', Z' of skin color under a selected illuminant can easily be calculated from the estimated spectral reflectance O(λ) and the spectral radiance E2(λ) of the illuminant.95 The tristimulus values X', Y', Z' are calculated as follows:

$$ X' = K \int_{\lambda=400}^{700} O(\lambda)\,E_2(\lambda)\,\bar{x}(\lambda)\,d\lambda, \qquad \text{(3-1a)} $$

$$ Y' = K \int_{\lambda=400}^{700} O(\lambda)\,E_2(\lambda)\,\bar{y}(\lambda)\,d\lambda, \qquad \text{(3-1b)} $$

$$ Z' = K \int_{\lambda=400}^{700} O(\lambda)\,E_2(\lambda)\,\bar{z}(\lambda)\,d\lambda. \qquad \text{(3-1c)} $$
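As an illustration, the following sketch evaluates Eq. (3-1) numerically for every pixel of an image whose spectral reflectances have already been estimated as in Chapter 2. It is a minimal example; the array names `reflectances`, `E2` and `cmfs` are assumptions, with all spectra sampled at the same 61 wavelengths.

```python
import numpy as np

def relight_image(reflectances, E2, cmfs):
    """Eq. (3-1): tristimulus values of each pixel under a new illuminant E2.

    reflectances: (H, W, 61) estimated per-pixel spectral reflectances
    E2:           (61,) spectral radiance of the target illuminant
    cmfs:         (61, 3) color matching functions x-bar, y-bar, z-bar
    """
    weights = E2[:, None] * cmfs          # E2(lambda) * cmf(lambda), shape (61, 3)
    K = 1.0 / weights[:, 1].sum()         # normalization: Y' of a perfect reflector is 1
    return K * reflectances @ weights     # (H, W, 3) image of (X', Y', Z')
```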

3.3 Calibration of CRT

Many methods to calibrate a computer-controlled color monitor have been proposed.96-101 In general, it is possible to define a two-stage model for the CRT to transform the Rc, Gc, Bc input values to the tristimulus values X', Y', Z'.102,103 The first stage is a nonlinear transformation from the Rc, Gc, Bc values to the phosphor luminances produced by the digital-to-

analog converter. The second stage is a linear transformation where the monitor phosphor luminances are transformed to the X', Y', Z' values. The calculation of the transformation M2 in the schematic diagram of Fig. 8 was performed as follows. A stable CRT with fixed luminance, contrast, white point, and gamma is used. Color patches are displayed on the CRT in a dark environment to avoid interference with external light sources. The luminance L and the tristimulus values X, Y, Z were measured for the displayed color patches in each channel. The luminance colorimeter is placed at the position of the observer's eyes because the angle of incidence of the beams can influence the measurements.104 The relationship between the input level and the luminance is plotted, and Eq. (3-2) can be derived by fitting quadratic curves.

$$ L_R = a_0 R_c^{2} + a_1 R_c + a_2, \qquad \text{(3-2a)} $$

$$ L_G = b_0 G_c^{2} + b_1 G_c + b_2, \qquad \text{(3-2b)} $$

$$ L_B = c_0 B_c^{2} + c_1 B_c + c_2. \qquad \text{(3-2c)} $$

These equations show the relationship between the luminance and the input levels, where $L_R$, $L_G$, $L_B$ are the luminances of the red, green, and blue phosphors respectively, and $a_i$, $b_i$, $c_i$ (i = 0 to 2) are coefficients.

The tristimulus values X, Y, Z on CRT can be decomposed as follows:

$$ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = \begin{pmatrix} X_R + X_G + X_B \\ Y_R + Y_G + Y_B \\ Z_R + Z_G + Z_B \end{pmatrix}, \qquad \text{(3-3)} $$

where $X_i$, $Y_i$, $Z_i$ (i = R, G, B) are the tristimulus values corresponding to the emission of the red, green, and blue phosphors, respectively. A theoretical relationship between X-Y and Z-Y for each phosphor is given by

$$ X_R = \frac{x_R}{y_R} Y_R, \qquad \text{(3-4a)} $$

$$ X_G = \frac{x_G}{y_G} Y_G, \qquad \text{(3-4b)} $$

$$ X_B = \frac{x_B}{y_B} Y_B, \qquad \text{(3-4c)} $$

$$ Z_R = \frac{z_R}{y_R} Y_R, \qquad \text{(3-4d)} $$

$$ Z_G = \frac{z_G}{y_G} Y_G, \qquad \text{(3-4e)} $$

$$ Z_B = \frac{z_B}{y_B} Y_B. \qquad \text{(3-4f)} $$

However, the linear curve fitting introduces an offset error in Eq. (3-4). This error is considered in the following linear equations:

$$ X_R = a_R Y_R + b_R, \qquad \text{(3-5a)} $$

$$ X_G = a_G Y_G + b_G, \qquad \text{(3-5b)} $$

$$ X_B = a_B Y_B + b_B, \qquad \text{(3-5c)} $$

$$ Z_R = c_R Y_R + d_R, \qquad \text{(3-5d)} $$

$$ Z_G = c_G Y_G + d_G, \qquad \text{(3-5e)} $$

$$ Z_B = c_B Y_B + d_B, \qquad \text{(3-5f)} $$

where $a_i$, $b_i$, $c_i$, $d_i$ (i = R, G, B) are coefficients. From Eqs. (3-3) and (3-5), the following equation is obtained:

$$ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = \begin{pmatrix} a_R Y_R + a_G Y_G + a_B Y_B + b_R + b_G + b_B \\ Y_R + Y_G + Y_B \\ c_R Y_R + c_G Y_G + c_B Y_B + d_R + d_G + d_B \end{pmatrix}. \qquad \text{(3-6)} $$

Eq. (3-6) can be rewritten as follows:

$$ \begin{pmatrix} X \\ Y \\ Z \end{pmatrix} = \mathbf{A} \begin{pmatrix} Y_R \\ Y_G \\ Y_B \end{pmatrix} + \begin{pmatrix} b_R + b_G + b_B \\ 0.0 \\ d_R + d_G + d_B \end{pmatrix}, \qquad \text{(3-7)} $$

where

$$ \mathbf{A} = \begin{pmatrix} a_R & a_G & a_B \\ 1.0 & 1.0 & 1.0 \\ c_R & c_G & c_B \end{pmatrix}. \qquad \text{(3-8)} $$

The luminances can be calculated from Eq. (3-9) as follows:

$$ \begin{pmatrix} L_R \\ L_G \\ L_B \end{pmatrix} = \begin{pmatrix} Y_R \\ Y_G \\ Y_B \end{pmatrix} = \mathbf{A}^{-1} \begin{pmatrix} X - b_R - b_G - b_B \\ Y \\ Z - d_R - d_G - d_B \end{pmatrix}. \qquad \text{(3-9)} $$

Then, using Eqs. (3-3) and (3-8), the transformation from the tristimulus values X, Y, Z to the input levels Rc, Gc, Bc can be achieved.
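A minimal sketch of this inverse transformation is given below, assuming the matrix A of Eq. (3-8), the offset sums of Eq. (3-7), and the quadratic coefficients of Eq. (3-2) have already been fitted (as in Tables 4 and 5); the function and argument names, and the clipping to 8-bit levels, are assumptions.

```python
import numpy as np

def xyz_to_crt_levels(XYZ, A, b_sum, d_sum, quad_coeffs):
    """Invert the two-stage CRT model.

    A:            3x3 matrix of Eq. (3-8)
    b_sum, d_sum: bR + bG + bB and dR + dG + dB of Eq. (3-7)
    quad_coeffs:  [(k0, k1, k2), ...] for the R, G, B channels of Eq. (3-2)
    """
    X, Y, Z = XYZ
    # Eq. (3-9): phosphor luminances from the target tristimulus values
    luminances = np.linalg.solve(np.asarray(A), np.array([X - b_sum, Y, Z - d_sum]))
    levels = []
    for L, (k0, k1, k2) in zip(luminances, quad_coeffs):
        # solve k0*v^2 + k1*v + (k2 - L) = 0 for the input level v of Eq. (3-2)
        roots = np.roots([k0, k1, k2 - L])
        real = roots[np.isreal(roots)].real
        v = real[real >= 0].min() if (real >= 0).any() else 0.0
        levels.append(np.clip(v, 0, 255))
    return np.array(levels)
```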

3.4 Experiments and their results

Two CRTs (Nanao Flex Scan56T Monitor and AppleColor High-Resolution RGB Monitor Model MO401) with fixed luminance, contrast, white point, and gamma were calibrated. These CRTs will also be used in the experiments of Chapter 7. The settings of the CRTs are shown in Table 3.

Table 3. Experimental setting of the CRTs.

CRT                   Luminance (cd/m2)   Contrast   White point   Gamma
Nanao Flex Scan56T    93.3                Maximum    D65           1.80
Apple MO401           71.5                Maximum    D65           1.80

Twenty-six color patches were displayed on each CRT in a dark environment. The luminance L and the tristimulus values X, Y, Z of the displayed color patches in each channel were measured by a luminance colorimeter (TOPCOM BM-7). Figures 9(a) and 9(b) show the relationship between input levels of CRT and luminance of phosphor for Nanao Flex Scan56T CRT and AppleColor MO401 CRT

respectively. From Figs. 9(a) and 9(b) it is possible to determine the coefficients of the quadratic equation (3-2), which are shown in Table 4.


Figure 9(a). The relationship between input levels of CRT (Nanao Flex Scan56T) and luminance of phosphor.

Figures 10(a) and 10(b) show the X-Y and Z-Y relationship for each RGB channel of Nanao Flex Scan56T CRT and AppleColor MO401 CRT respectively.


Figure 9(b). The relationship between input levels of CRT (AppleColor MO401) and luminance of phosphor.

Table 4. Coefficients obtained by quadratic curve fitting for CRTs (Nanao FlexScan 56T and AppleColor MO401) in a dark environment.

Monitor   Nanao Flex Scan56T   Apple MO401
a0         1.216 x 10^-4        1.536 x 10^-4
a1         6.212 x 10^-2        1.482 x 10^-3
a2        -5.592 x 10^-1       -1.745 x 10^-1
b0         3.270 x 10^-4        4.339 x 10^-4
b1         1.863 x 10^-1        9.552 x 10^-2
b2        -1.714 x 10^0        -7.855 x 10^-1
c0         6.138 x 10^-5        5.225 x 10^-5
c1         6.782 x 10^-3        8.492 x 10^-3
c2        -1.122 x 10^-1       -9.007 x 10^-2


Figure 10(a). X-Y and Z-Y relationship for each RGB channel of the CRT (Nanao Flex Scan56T).


Figure 10(b). X-Y and Z-Y relationship for each RGB channel of the CRT (AppleColor MO401).


From Figs. 10(a) and 10(b) it is possible to determine the coefficients shown in Table 5 for linear Eq. (3-7).

Table 5. Coefficients obtained by linear curve fitting for CRTs (Nanao FlexScan 56T and AppleColor MO401) in a dark environment.

Monitor   Nanao Flex Scan56T   AppleColor MO401
aR         1.773 x 10^0         1.791 x 10^0
aG         4.882 x 10^-1        4.867 x 10^-1
aB         2.429 x 10^0         2.420 x 10^0
bR         1.372 x 10^-1        4.909 x 10^-2
bG         3.714 x 10^-2        1.020 x 10^-2
bB        -2.258 x 10^-2        2.220 x 10^-3
cR         7.242 x 10^-2        7.719 x 10^-2
cG         1.934 x 10^-1        1.967 x 10^-1
cB         1.314 x 10^1         1.293 x 10^1
dR         6.663 x 10^-3        2.238 x 10^-4
dG        -5.807 x 10^-2       -4.334 x 10^-2
dB        -1.391 x 10^-1        3.079 x 10^-2

Five facial pattern images with 1920 by 1035 pixels were taken by a HDTV camera under the same conditions used for camera calibration. The model is a Japanese young woman. The tristimulus values X’, Y’, Z’ were calculated using the estimated

32 Chapter 3

spectral reflectance under four illuminants; “Day Light”, “A”, “Cool White”, and “Horizon”. Those spectral radiances and chromaticities are shown in Fig. 11. The calculated tristimulus values were converted to Rc, Gc, Bc values and they are displayed on CRT.

"Horizon" x=0.50 y=0.42

"A" x=0.46 y=0.42

"Day Light" x=0.32 y=0.34 Relative radiant power

"Cool White" x=0.39 y=0.41

Wavelength (nm)

Figure 11. Relative spectral radiant power and chromaticities of four illuminants used in the experiment.

Figure 12 shows the hardcopy of the predicted facial pattern printed directly from the input levels Rc, Gc, Bc without any preprocessing or color correction. Actually, these images should be observed on CRT. We can see that the portrait images under "A" and "Horizon" illuminants seem reddish, because longer wavelength components are predominant in these illuminants as shown in Fig. 11.


Figure 12. A facial pattern image reproduced colorimetrically on CRT under four illuminants; (a) “Horizon,” (b) “A,” (c) “Cool White,” (d) “Day Light” illuminants.


3.5 Conclusion

In this chapter, a colorimetric color reproduction method for portrait images on a CRT in a dark environment was proposed. In this method, the tristimulus values of each pixel of the image were calculated using the estimated spectral reflectance, the spectral radiance of the illuminant and the color matching functions. The calculated tristimulus values were transformed to the R, G, B values of the CRT. A portrait image under various illuminants was reproduced on the CRT. The portrait images under "Horizon" and illuminant "A" seem very reddish because of the predominance of the long-wavelength components in these illuminants. However, the original face viewed under "Horizon" or illuminant "A" does not actually appear so reddish, because of chromatic adaptation, which is not yet considered in this reproduction method.


CHAPTER 4

COLORIMETRIC COLOR REPRODUCTION OF FACIAL PATTERN IMAGE ON HARDCOPY

4.1 Introduction

Color correction is necessary to match the tristimulus values of hardcopies with those in the original scene. The colorimetric calibration of printers for color correction is difficult because this calibration depends on the observing environment and especially on the illuminants. Various color transform corrections have been proposed for colorimetric calibration.6,105,106 In this chapter, the printer calibration is achieved by constructing a color transformation function during the printing process, using a database of printer input values and spectral reflectances measured before printing.

4.2 Color reproduction method for hardcopy under various illuminants

Figure 13 shows the schematic diagram of the colorimetric color reproduction of a facial pattern image on hardcopy under various illuminants. In this reproduction method, the tristimulus values X'', Y'', Z'' of a printed skin color image under an illuminant with spectral radiance E3(λ) were matched to the tristimulus values X', Y', Z' calculated using the three principal components of the spectral reflectance of the skin. To achieve the matching, the matrix M3 was used to transform the CRT input levels Rc, Gc, Bc to the printer input levels Rp, Gp, Bp. The matrix M3 is a function of the spectral radiance E3(λ) of the illuminant under which the hardcopy is viewed, and this matrix is calculated by multiple regression analysis using a database of printer input values and measured spectral


reflectances. It is noted that we do not need to measure the tristimulus values of the color patches under each illuminant for calibration.

[Figure 13: the original scene O(λ) under E1(λ) is captured by the HDTV camera, converted by M1 to (X, Y, Z), and its spectral reflectance is estimated; the tristimulus values (X', Y', Z') under E2(λ) are displayed on the CRT via M2 with input levels (Rc, Gc, Bc), which M3(E3(λ)) converts to the printer input levels (Rp, Gp, Bp) so that the hardcopy viewed under E3(λ) satisfies (X'', Y'', Z'') = (X', Y', Z').]

Figure 13. Diagram of the color reproduction method for hardcopy of skin color image under various illuminants.

4.3 Multiple regression analysis using skin spectral reflectances

Figure 14 shows a schematic diagram of the multiple regression analysis using the database. One hundred eight skin color patches were printed using a laser thermal development and transfer printer (Fujix Pictrography 3000) with printer input levels Rp^n, Gp^n, Bp^n. The spectral reflectance O^n(λ) of each patch was measured by a spectrophotometer (Datacolor Spectraflash 500).

In the on-line calibration, the tristimulus values X^n'', Y^n'', Z^n'' were calculated under a selected illuminant E3(λ) using the database. These tristimulus values for each patch were transformed to the CRT input levels Rc^n, Gc^n, Bc^n using the transformation matrix M2. The coefficients a_{i,j} (i = 1...3, j = 1...11) of matrix M3 in Eq. (4-1) were determined by multiple regression analysis.107

[Figure 14: off-line, the 108 skin color patches (n = 1 to 108) are printed once with input levels (Rp^n, Gp^n, Bp^n) and their spectral reflectances O^n(λ) are measured to form the database; on-line, given the input spectral radiance E3(λ), the tristimulus values (X^n'', Y^n'', Z^n'') are computed from the database, transformed to the CRT input levels (Rc^n, Gc^n, Bc^n) by M2, and multiple regression analysis between these and the printer input levels yields the output matrix M3(E3(λ)).]

Figure 14. Diagram of the multiple regression analysis.

The accuracy of this colorimetric color reproduction was evaluated by averaged color differences of fifty-five skin color patches used in the multiple regression analysis. The color difference was calculated between CRT and hardcopy, with and without color transformation.

$$ \begin{pmatrix} R_P \\ G_P \\ B_P \end{pmatrix} = \begin{pmatrix} a_{1,1} & a_{1,2} & \cdots & a_{1,11} \\ a_{2,1} & a_{2,2} & \cdots & a_{2,11} \\ a_{3,1} & a_{3,2} & \cdots & a_{3,11} \end{pmatrix} \begin{pmatrix} R_C \\ G_C \\ B_C \\ R_C^2 \\ G_C^2 \\ B_C^2 \\ R_C G_C \\ G_C B_C \\ R_C B_C \\ R_C G_C B_C \\ 1 \end{pmatrix} \qquad \text{(4-1)} $$
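A minimal sketch of how the coefficients of Eq. (4-1) could be obtained by least-squares multiple regression over the patch database is given below; the array names `crt_levels` and `printer_levels` are assumptions.

```python
import numpy as np

def polynomial_terms(rgb):
    """The 11 regressor terms of Eq. (4-1) for one (Rc, Gc, Bc) triple."""
    r, g, b = rgb
    return np.array([r, g, b, r * r, g * g, b * b, r * g, g * b, r * b, r * g * b, 1.0])

def fit_printer_matrix(crt_levels, printer_levels):
    """Least-squares fit of the 3x11 matrix M3 from the patch database.

    crt_levels, printer_levels: hypothetical (108, 3) arrays of input levels.
    """
    A = np.vstack([polynomial_terms(rgb) for rgb in crt_levels])   # (108, 11)
    M3, *_ = np.linalg.lstsq(A, printer_levels, rcond=None)        # (11, 3) solution
    return M3.T                                                    # return as 3 x 11

def crt_to_printer(M3, rgb_c):
    """Map one set of CRT input levels to printer input levels using Eq. (4-1)."""
    return M3 @ polynomial_terms(rgb_c)
```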

As shown in Fig. 15, the averaged color difference ΔE*ab was 19.9 without the color transformation and 4.5 with the color transformation. Thereafter, fifty-five skin color patches not used in the multiple regression analysis were printed with the color transformation by M3. The averaged color difference ΔE*ab was 4.9. We can conclude that the proposed color transformation is effective for matching the skin color between the displayed image and the hardcopy. The matrix M3 was calculated using a workstation (SPARCstation II; Sun Microsystems Inc.) in an average time of 25 seconds.


Figure 15. Averaged color differences ΔE*ab between skin color patches displayed on CRT and the corresponding hardcopies under four illuminants, before and after calibration.

4.4 Experimental results

The portraits used in Chapter 3 were reproduced under four kinds of illuminants, “Horizon”, “A”, “Cool White”, and “Day Light”, in a standard illumination booth (Macbeth Spectralight II). The spectral radiance of each illuminant is shown in Fig. 11. The portrait images printed using the transformation matrix M3 are shown in Fig. 16.


Figure 16. Printed facial pattern images under four different illuminants: (a) “Horizon”, (b) “A”, (c) “Cool White”, (d) “Day Light”.

It is noted that the printed images under the “A” and “Horizon” illuminants are not as reddish as the corresponding portrait images on the CRT shown in Fig. 12, because each printed image in Fig. 16 is not being observed under the illuminant used in its calibration. It is also noted that the colors of the printed portrait images under the different illuminants in Fig. 16 are slightly different. If the spectral reflectance of the print were the same as the spectral reflectance of the human face, there would be no difference among the printed images due to the illuminants. The averaged spectral reflectance of the printed skin is shown in Fig. 17. It is clear that the spectral reflectance of human skin and that of the printed skin color are different.


Figure 17. Comparison between the averaged spectral reflectance of human skin and that of the printed skin color.

4.5 Conclusion

In this chapter, skin color images were reproduced colorimetrically on hardcopy under various illuminants. The averaged color difference of 4.9 between the measured and predicted tristimulus values shows sufficient accuracy of the proposed colorimetric color reproduction. It is noticed that the color reproduction of the portrait on the CRT is different from the hardcopy under various illuminants, because chromatic adaptation is not yet considered in the reproduction on the CRT. The color reproduction described in this chapter is not applicable to the lips, hair, eyes, and so on, because only the spectral reflectance of human skin was considered here.


CHAPTER 5

COLOR REPRODUCTION OF FACIAL PATTERN IMAGE BASED ON COLOR APPEARANCE MODELS

5.1 Introduction

Experimental methods to predict the tristimulus values of skin color under various illuminants on CRT and hardcopy were described in Chapters 3 and 4. In this chapter, a method of color reproduction based on color appearance models is described. This reproduction is made by matching the appearance of the image on the CRT with the reproduction on hardcopy under various illuminants. Several color appearance models are introduced into the colorimetric color reproduction on the CRT. However, there is no standard color appearance model for matching skin color appearance; therefore, a comparison between color appearance models is required to select a suitable model for skin color appearance reproduction. In this chapter, an outline of some color appearance models and of viewing techniques for psychophysical experiments is also presented.

5.1.1 Chromatic adaptation models

A) von Kries model

The von Kries coefficient law is based on the complete adaptation of the human color visual system to the white point of the illuminant. The cone fundamental tristimulus values L, M, S are simply multiplied by constant values. The constant values are taken to be the inverses of the respective cone responses for the maximum signal of the illuminant. Then, the responses at the new adapting field, La, Ma, and Sa, can be written as follows:

$$L_a = k_L L, \quad k_L = \frac{L_{Na}}{L_{No}}, \qquad (5\text{-}1a)$$
$$M_a = k_M M, \quad k_M = \frac{M_{Na}}{M_{No}}, \qquad (5\text{-}1b)$$
$$S_a = k_S S, \quad k_S = \frac{S_{Na}}{S_{No}}, \qquad (5\text{-}1c)$$

where L, M, and S are the excitations of the cones on the retina at the original adapting field; kL, kM, and kS are multiplicative factors; LNa, MNa, and SNa are the cone excitations for the white point of the new adapting illuminant; and LNo, MNo, and SNo are the cone excitations for the white point of the original illuminant.
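
As a small illustration, the von Kries scaling of Eq. (5-1) can be written directly in cone space; the white-point values below are placeholders supplied by the caller.

```python
import numpy as np

def von_kries(lms, lms_white_orig, lms_white_new):
    """Eq. (5-1): scale each cone response by the ratio of the two white points."""
    k = np.asarray(lms_white_new, dtype=float) / np.asarray(lms_white_orig, dtype=float)
    return k * np.asarray(lms, dtype=float)
```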

B) CIELAB (LAB) color space

In 1976, the CIE recommended the CIELAB color space as a color-difference metric; it incorporates a modified form of the von Kries model through the ratios X/XN, Y/YN, and Z/ZN, as shown in Eq. (5-2):

$$L^* = 116\left(\frac{Y}{Y_N}\right)^{1/3} - 16, \qquad (5\text{-}2a)$$
$$a^* = 500\left[\left(\frac{X}{X_N}\right)^{1/3} - \left(\frac{Y}{Y_N}\right)^{1/3}\right], \qquad (5\text{-}2b)$$
$$b^* = 200\left[\left(\frac{Y}{Y_N}\right)^{1/3} - \left(\frac{Z}{Z_N}\right)^{1/3}\right], \qquad (5\text{-}2c)$$

where XN, YN, and ZN are the tristimulus values of the illumination white point.


The tristimulus values Xa , Ya , Za at the new adapting field can be calculated as follows:

$$X_a = \left(\frac{a^*}{500} + \frac{L^* + 16}{116}\right)^3 X_N', \qquad (5\text{-}3a)$$
$$Y_a = \left(\frac{L^* + 16}{116}\right)^3 Y_N', \qquad (5\text{-}3b)$$
$$Z_a = \left(\frac{L^* + 16}{116} - \frac{b^*}{200}\right)^3 Z_N', \qquad (5\text{-}3c)$$

where XN', YN', and ZN' are the tristimulus values of the white point of the adapting illuminant. CIELAB was not standardized as a color appearance model; however, it provides a good approximation of color appearance under near-daylight conditions.
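
The following sketch uses CIELAB in this way: Eq. (5-2) is evaluated under the original white point and inverted with Eq. (5-3) under the adapting white point. Only the cube-root branch of the CIELAB formulas is shown, and the white points are inputs to be supplied by the caller.

```python
import numpy as np

def xyz_to_lab(xyz, white):
    """Eq. (5-2): CIELAB coordinates relative to a given white point
    (cube-root branch only; the linear branch for very dark colors is omitted)."""
    fx, fy, fz = np.cbrt(np.asarray(xyz, dtype=float) / np.asarray(white, dtype=float))
    return np.array([116.0*fy - 16.0, 500.0*(fx - fy), 200.0*(fy - fz)])

def lab_to_xyz(lab, white):
    """Eq. (5-3): tristimulus values under the adapting white point."""
    L, a, b = lab
    fy = (L + 16.0) / 116.0
    fx = fy + a / 500.0
    fz = fy - b / 200.0
    return np.array([fx**3, fy**3, fz**3]) * np.asarray(white, dtype=float)

def cielab_corresponding_color(xyz, white_orig, white_new):
    """CIELAB used as a chromatic adaptation transform: hold L*, a*, b* fixed
    and re-express them under the new white point (Eq. (5-3))."""
    return lab_to_xyz(xyz_to_lab(xyz, white_orig), white_new)
```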

C) Fairchild color appearance model

The Fairchild color appearance model uses incomplete chromatic adaptation of the cones to the white point. This model is based on the von Kries coefficient law, with the introduction of a functional expression proposed by Hunt46 for incomplete levels of adaptation, as shown in Eq. (5-4):

$$L' = r_L \, L / L_N, \qquad (5\text{-}4a)$$
$$M' = r_M \, M / M_N, \qquad (5\text{-}4b)$$
$$S' = r_S \, S / S_N, \qquad (5\text{-}4c)$$

where L', M', and S' are the cone excitations considering a certain degree of chromatic adaptation; rL, rM, and rS are parameters representing the degree of chromatic adaptation of the L, M, and S cones, respectively; and LN, MN, SN are, respectively, the L, M, S cone responses to the white point of the illuminant. Equation (5-4) can be expressed in matrix form as shown in Eq. (5-5).

$$\begin{pmatrix} L' \\ M' \\ S' \end{pmatrix} = A \begin{pmatrix} L \\ M \\ S \end{pmatrix}, \qquad (5\text{-}5)$$

The matrix A is

$$A = \begin{pmatrix} a_L & 0 & 0 \\ 0 & a_M & 0 \\ 0 & 0 & a_S \end{pmatrix}, \qquad (5\text{-}6)$$

where

$$a_L = r_L / L_N, \qquad (5\text{-}7a)$$
$$a_M = r_M / M_N, \qquad (5\text{-}7b)$$
$$a_S = r_S / S_N. \qquad (5\text{-}7c)$$

The degree of chromatic adaptation can be calculated as follows:

$$r_L = \frac{1 + Y_N^{\,u} + l_E}{1 + Y_N^{\,u} + 1/l_E}, \qquad (5\text{-}8a)$$
$$r_M = \frac{1 + Y_N^{\,u} + m_E}{1 + Y_N^{\,u} + 1/m_E}, \qquad (5\text{-}8b)$$
$$r_S = \frac{1 + Y_N^{\,u} + s_E}{1 + Y_N^{\,u} + 1/s_E}, \qquad (5\text{-}8c)$$

where YN is the luminance of the illuminant, u is an exponent that defines the shape of the degree-of-adaptation function, and lE, mE, and sE are the fundamental chromaticity coordinates of the adapting stimulus. Originally, Fairchild used an exponent u equal to 0.22, as suggested for a dark environment. This exponent value was set equal to 0.29 in the refinement of the RLAB model by Fairchild.72 The lE, mE, and sE values are calculated as follows:

$$l_E = \frac{3L_N}{L_N + M_N + S_N}, \qquad (5\text{-}9a)$$
$$m_E = \frac{3M_N}{L_N + M_N + S_N}, \qquad (5\text{-}9b)$$
$$s_E = \frac{3S_N}{L_N + M_N + S_N}. \qquad (5\text{-}9c)$$

From Eqs. (5-4) to (5-9), we can see that the adaptation will be less complete for increasing values of the saturation of the adapting stimulus. The equations above also show that the adaptation will be more complete for increasing values of the luminance of the adapting stimulus. The final step in the calculation of the adapted cone signals is a transformation considering the interaction among cones, given by Eq. (5-10). This transformation allows the model to predict the increase of perceived colorfulness with luminance, called the Hunt effect. It also allows the prediction of the increase of contrast with increasing luminance, called the Stevens effect.108

$$\begin{pmatrix} L_a \\ M_a \\ S_a \end{pmatrix} = \begin{pmatrix} 1 & c & c \\ c & 1 & c \\ c & c & 1 \end{pmatrix} \begin{pmatrix} L' \\ M' \\ S' \end{pmatrix}, \qquad (5\text{-}10)$$

where c is calculated as follows:

$$c = 0.2190 - 0.0784 \log_{10}(Y_N). \qquad (5\text{-}11)$$

The entire model to predict the tristimulus values XA , YA , ZA in a second adapting condition from the tristimulus values X, Y, Z in a first adapting condition can be expressed by a single matrix equation as follows,


$$\begin{pmatrix} X_A \\ Y_A \\ Z_A \end{pmatrix} = M^{-1} A_2^{-1} C_2^{-1} C_1 A_1 M \begin{pmatrix} X \\ Y \\ Z \end{pmatrix}, \qquad (5\text{-}12)$$

where matrix M is the transformation from tristimulus values to cone fundamental primaries. Matrices A1 and C1 are, respectively, the matrices of Eqs. (5-6) and (5-10) for adapting condition 1; matrices A2 and C2 are, respectively, the matrices of Eqs. (5-6) and (5-10) for adapting condition 2. Fairchild and Berns incorporated this chromatic adaptation model into the CIE 1976 (L*, a*, b*) color space in the RLAB color appearance model.38 This color space can determine the corresponding colors required for reproduction across changes in media and viewing conditions. The model can also be used for calculating metrics of lightness, chroma, hue, and color difference.
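
To make the chain of Eq. (5-12) concrete, the sketch below builds the A and C matrices from Eqs. (5-6) to (5-11) and applies the full transformation. It is a minimal sketch: the tristimulus-to-cone matrix M (for example, the Hunt-Pointer-Estévez matrix given later in Eq. (5-15)), the white points, and their luminances are inputs supplied by the caller, and the exponent u defaults to the refined value of 0.29 mentioned above.

```python
import numpy as np

def adaptation_matrix(lms_white, Y_N, u=0.29):
    """Matrix A of Eq. (5-6), built from Eqs. (5-7)-(5-9).
    lms_white: cone responses (L_N, M_N, S_N) to the white point; Y_N: its luminance."""
    lms_white = np.asarray(lms_white, dtype=float)
    e = 3.0 * lms_white / lms_white.sum()            # l_E, m_E, s_E   (Eq. 5-9)
    r = (1 + Y_N**u + e) / (1 + Y_N**u + 1.0 / e)    # r_L, r_M, r_S   (Eq. 5-8)
    return np.diag(r / lms_white)                    # a_i = r_i / i_N (Eq. 5-7)

def interaction_matrix(Y_N):
    """Matrix C of Eq. (5-10), with c from Eq. (5-11)."""
    c = 0.2190 - 0.0784 * np.log10(Y_N)
    C = np.full((3, 3), c)
    np.fill_diagonal(C, 1.0)
    return C

def fairchild_transform(xyz, M, lms_white_1, Y_1, lms_white_2, Y_2, u=0.29):
    """Eq. (5-12): corresponding tristimulus values under adapting condition 2."""
    A1, C1 = adaptation_matrix(lms_white_1, Y_1, u), interaction_matrix(Y_1)
    A2, C2 = adaptation_matrix(lms_white_2, Y_2, u), interaction_matrix(Y_2)
    chain = (np.linalg.inv(M) @ np.linalg.inv(A2) @ np.linalg.inv(C2)
             @ C1 @ A1 @ M)
    return chain @ np.asarray(xyz, dtype=float)
```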

D) LLAB color appearance model

In the model proposed by Luo and coworkers,73,74 the Rr, Gr, Br cone responses at the new adapting field are calculated from the Rs, Gs, Bs cone responses at the original adapting field using the following equations:

$$R_r = \left[D\left(\frac{R_{or}}{R_{os}}\right) + 1 - D\right] R_s, \qquad (5\text{-}13a)$$
$$G_r = \left[D\left(\frac{G_{or}}{G_{os}}\right) + 1 - D\right] G_s, \qquad (5\text{-}13b)$$
$$B_r = \left[D\left(\frac{B_{or}}{B_{os}}\right) + 1 - D\right] B_s^{\,\beta}, \quad \text{for } B_s \ge 0, \qquad (5\text{-}13c)$$
$$B_r = -\left[D\left(\frac{B_{or}}{B_{os}}\right) + 1 - D\right] |B_s|^{\,\beta}, \quad \text{for } B_s < 0, \qquad (5\text{-}13d)$$

where β = (Bos/Bor)^0.0834; Ros, Gos, Bos are the cone responses for the white point under the original adapting field; Ror, Gor, Bor are the cone responses for the white point in the environment in which the image will be reproduced; and the D value is 1.0 for hardcopies and 0.7 for a CRT in a dim surround.

5.1.2 Viewing techniques for cross-media comparison

The comparison of color appearance models is based on the recommendations of technical committees such as CIE Technical Committee 1-34, "Testing Colour-Appearance Models", established to provide standard guidelines for coordinated research on the evaluation of color appearance models for hardcopy and CRT images.109-111 The recommended methods use psychophysical experiments to select a suitable color appearance model for cross-media reproduction. The psychophysical experiments for the comparison of color appearance models are based on viewing techniques such as successive-binocular, simultaneous-binocular, simultaneous-haploscopic, successive-Ganzfeld haploscopic, and memory matching.

In the memory matching technique, the illumination booth with the hardcopy and the CRT are placed at 90 degrees from each other, ensuring that observers cannot see the image on the CRT and the hardcopy at the same time. Observers are asked to observe the image in the booth before observing the image on the CRT, and they are allowed to look at the hardcopy and the CRT only once. The successive-binocular viewing technique is the same as memory matching except that observers are allowed to look at both the hardcopy and the CRT for as long as necessary to compare the images. In the simultaneous-binocular technique, the hardcopy in the illumination booth and the image on the CRT are placed side by side, and the observer is instructed to view both with both eyes. In the simultaneous-haploscopic technique, observers are instructed to observe the hardcopy with one eye and the image on the CRT with the other eye; this allows each eye to adapt to a different white point while the images are compared. The successive-Ganzfeld haploscopic viewing technique112 is similar to the simultaneous-haploscopic technique except that observers are not allowed to view the hardcopy and the image on the CRT at the same time: a cover with a neutral field switches over each eye, allowing adaptation while the other eye is observing the image.

These viewing techniques have been compared psychophysically, for example in the experiments performed by Braun and Fairchild.113 Binocular matching114 was preferred over haploscopic matching, because binocular viewing is a more natural way of viewing, is easier (it requires no special device), and is more comfortable for the observer. These experiments also showed that successive viewing was preferred over simultaneous viewing, because successive matching yields shorter matching times115 and interocular interactions can be controlled.116 Based on the considerations above, memory matching, which combines ease of use, successive matching, and binocular matching, is preferred over the other viewing techniques117 for comparing images reproduced using various color appearance models. Therefore, the memory matching technique will be used in the psychophysical experiments.

5.2 Color appearance matching method

Figure 18 shows a schematic diagram of the skin color reproduction method based on color appearance models. Using a color appearance model, the tristimulus values Xa, Ya, and Za after chromatic adaptation can be calculated from the tristimulus values X', Y', Z' obtained by colorimetric color reproduction. First, X', Y', Z' are transformed to the cone fundamental tristimulus values L, M, and S. The Hunt-Pointer-Estévez transformation is used to calculate the L, M, S values, as shown in Eqs. (5-14) and (5-15).38 This transformation is normalized to CIE Illuminant D65.

Figure 18. Diagram of color reproduction of facial pattern images on CRT based on color appearance models.

$$\begin{pmatrix} L \\ M \\ S \end{pmatrix} = M \begin{pmatrix} X' \\ Y' \\ Z' \end{pmatrix} \qquad (5\text{-}14)$$

$$M = \begin{pmatrix} 0.40 & 0.71 & -0.08 \\ -0.23 & 1.17 & 0.05 \\ 0.00 & 0.00 & 0.92 \end{pmatrix} \qquad (5\text{-}15)$$


Thereafter, the calculated L, M, and S values are used to estimate the fundamental tristimulus values La, Ma, and Sa, corresponding to the cone responses after chromatic adaptation, using the color appearance models. The predicted values La, Ma, and Sa were calculated using the von Kries and Fairchild models. Next, Xa, Ya, and Za, which take chromatic adaptation into account, are calculated by the inverse of matrix M. Finally, the predicted tristimulus values Xa, Ya, and Za are reproduced on the CRT using the color transform matrix.
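
As a compact illustration of this pipeline, the sketch below chains Eq. (5-14) with a von Kries adaptation step (the Fairchild transform of Eq. (5-12) can be substituted at the same point) and returns to tristimulus values with the inverse of M. The example call uses patch 2 of Table 6 as input and the standard CIE white points of illuminants A and D65, purely for illustration.

```python
import numpy as np

# Hunt-Pointer-Estevez matrix of Eq. (5-15), normalized to D65
M_HPE = np.array([[ 0.40, 0.71, -0.08],
                  [-0.23, 1.17,  0.05],
                  [ 0.00, 0.00,  0.92]])

def adapt_xyz(xyz, xyz_white_orig, xyz_white_new, M=M_HPE):
    """XYZ under the original viewing condition -> corresponding XYZ under the
    new condition, using a von Kries step (Eq. 5-1) in cone space."""
    lms    = M @ np.asarray(xyz, dtype=float)              # Eq. (5-14)
    lms_no = M @ np.asarray(xyz_white_orig, dtype=float)
    lms_na = M @ np.asarray(xyz_white_new, dtype=float)
    lms_a  = lms * (lms_na / lms_no)                       # von Kries scaling
    return np.linalg.inv(M) @ lms_a                        # back to XYZ

# Example: from illuminant "A" to D65 (CIE white points, illustrative only)
xyz_a = adapt_xyz([72.0, 66.0, 18.0],
                  xyz_white_orig=[109.85, 100.0, 35.58],
                  xyz_white_new=[95.04, 100.0, 108.88])
```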

5.3 Psychophysical experiments

Psychophysical experiments were performed to select a suitable color appearance model for the reproduction of skin color images on a CRT in a dark environment. The images on the CRT were compared with a hardcopy illuminated in a standard illumination booth (Macbeth Spectralight II). The booth contains four illuminants: “Horizon”, “A”, “Cool White”, and “Day Light”. The spectral radiances of these illuminants are shown in Fig. 11. We reproduced five facial pattern images of a Japanese woman and six skin color patches. The chromaticities of the skin color patches were extracted from the facial pattern images, and the tristimulus values of the six color patches under illuminant “A” are shown in Table 6.


Table 6. The tristimulus values of the six skin color patches under illuminant “A”.

Patch     X      Y      Z
  1      105     90     20
  2       72     66     18
  3       75     70     19
  4       38     37     12
  5       47     45     13
  6       72     61     14

Four images were generated on CRT for each illuminant; an image generated by X’, Y’ and Z’ without considering chromatic adaptation, named XYZ image; an image generated by von Kries color appearance model, von Kries image; an image generated by Fairchild model, Fairchild image; and an image generated by CIELAB model, CIELAB image. Figure 19 shows an example of predicted color appearance for the skin color patches under various illuminants. The reproduction for each color appearance model is shown in Figs. 20, 21, 22 and 23 respectively, under illuminant “A”, “Horizon”, “Cool white”, and “Day Light”.


Figure 19. Predicted color appearance of skin color patches under various illuminants; (a) illuminant “A”, (b) “Horizon”, (c) “Cool White”, (d) “Day Light”.


Figure 20. Color appearance predictions of a facial pattern image under illuminant “A”; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.


Figure 21. Color appearance predictions of a facial pattern image under “Horizon”; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.


Figure 22. Color appearance predictions of a facial pattern image under “Cool White”; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.


Figure 23. Color appearance predictions of a facial pattern image under “Day Light”; (a) XYZ, (b) CIELAB, (c) von Kries, (d) Fairchild.


Figure 24 shows a viewing technique using memory matching that was used to select a suitable color appearance model to reproduce skin color images on CRT.

Figure 24. Arrangement of the psychophysical experiment to compare skin color images on a CRT with a hardcopy in a standard illumination booth using memory matching. The booth and the CRT are placed at 90 degrees from each other, both at a viewing distance of about 1 m; the image size is 10 cm × 6 cm for the facial pattern images and 5 cm × 5 cm for the skin color patches.

Both the CRT and the hardcopy were arranged at an equal viewing distance of approximately 1 m from the observer. The images, whether on the CRT or in the illumination booth, were observed with both eyes. The observer could see the CRT and the hardcopy only once to obtain a comparative judgment.


Ten observers took part in this experiment. They were asked to read the following instructions before the experiment: "In this experiment, you must compare a color image in the standard illumination booth with four reproductions displayed simultaneously on the CRT. First, you will look at the printed skin color image in the illumination booth after adapting for a minute. You will be asked to memorize the skin color of the image. Then turn towards the CRT and look at the displayed neutral gray field. After adapting to the neutral gray field for a minute, examine the four reproductions and select the one that looks most like the hardcopy that you memorized. You must choose the reproduction on the CRT based only on color judgment; judge only whether the reproduction looks like the original hardcopy, not its image quality or your preference." The observers were asked to look at each image for about one minute because Fairchild and Reniff84 found that chromatic adaptation at constant luminance was 90% complete after approximately 60 seconds. The position of each model on the CRT was randomized for each set of predicted images.

5.4 Experimental results

Figure 25 shows the percentage of times each of the four models was selected as the best reproduction of the skin color patches on the CRT. Figure 25(a) shows the percentage of selection under the various illuminants; it can be seen that the color appearance of the patches predicted by the Fairchild model was chosen as the best. Figure 25(b) shows the percentage of selection for each illuminant. The incomplete adaptation model of Fairchild was effective under “Horizon”, “Cool White”, and illuminant “A”. However, under “Day Light”, the Fairchild model was selected about as often as the model proposed by von Kries.


Figure 25. Percentage of selection of each color appearance model for skin color patch reproduction; (a) selected model under various illuminants, (b) selected model under each illuminant.

On the other hand, Fig. 26 shows the percentage of selection of each color appearance model for the reproduced facial pattern images. Figure 26(a) shows the percentage of selected models under the various illuminants; the Fairchild model was chosen most often, as in the case of the color patches. Figure 26(b) shows the percentage of selected models for each illuminant.

Figure 26. Percentage of selection of each color appearance model for facial pattern image reproduction; (a) selected model under various illuminants (Face 1 to Face 5), (b) selected model under each illuminant.


It is clear that there is no significant difference among illuminants for each model for facial pattern images. A comparison of the results in Figs. 25 and 26 shows that the performance of the Fairchild model was not as good for facial pattern images as for skin color patches: the model proposed by Fairchild was applied well to skin color patches but was less effective for facial pattern images. Facial skin color is one of the human memorized colors, and we believe that this difference in the performance of the Fairchild model is due to memorized colors.118

5.5 Conclusion

A color reproduction method based on color appearance models was used to match the appearance of skin color on a CRT with hardcopy under various illuminants. The colorimetric color reproduction was compared with images reproduced using the color appearance models proposed by von Kries and Fairchild and the CIELAB color space. The results of psychophysical experiments using memory matching showed that the incomplete chromatic adaptation model proposed by Fairchild is a better model than CIELAB or von Kries. This confirms that an incomplete chromatic adaptation model is effective for complex images.119 However, the model proposed by Fairchild could not be applied to facial pattern images as successfully as to skin color patches. These results show that more studies of the influence of memorized facial skin color are necessary. The effect of surround on CRT120 and the effect of individual variation of color matching functions121,122 could also be considered in skin color reproduction.


CHAPTER 6

MODIFIED FAIRCHILD CHROMATIC ADAPTATION MODEL FOR FACIAL PATTERN IMAGES

6.1. Introduction

The incomplete chromatic adaptation model proposed by Fairchild could be applied well to skin color patches, as shown in the previous chapter. However, this model could not be applied to facial pattern images as successfully as to skin color patches. Therefore, I modified the Fairchild model for facial pattern images.

6.2. Modified Fairchild chromatic adaptation model

Coefficients KL, KM, and KS were introduced into Eq. (5-9) as follows:

$$l_E = \frac{K_L L_N}{L_N + M_N + S_N}, \qquad (6\text{-}1a)$$
$$m_E = \frac{K_M M_N}{L_N + M_N + S_N}, \qquad (6\text{-}1b)$$
$$s_E = \frac{K_S S_N}{L_N + M_N + S_N}. \qquad (6\text{-}1c)$$

In the original Fairchild model, the coefficients KL, KM, and KS are all equal to 3.0 (cf. Eq. (5-9)). These coefficients allow the degree of color balance to be adjusted for facial pattern images.
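
A minimal sketch of the modified degree-of-adaptation calculation is given below; K = (3.0, 3.0, 3.0) reproduces the original Eq. (5-9), and other values adjust the color balance. The white-point cone responses and luminance are inputs supplied by the caller.

```python
import numpy as np

def modified_adaptation_params(lms_white, Y_N, K=(3.0, 3.0, 3.0), u=0.29):
    """Degree-of-adaptation parameters r_L, r_M, r_S computed with the modified
    chromaticity coordinates of Eq. (6-1); K = (3, 3, 3) recovers Eq. (5-9)."""
    L_N, M_N, S_N = (float(v) for v in lms_white)
    total = L_N + M_N + S_N
    e = np.array([K[0] * L_N, K[1] * M_N, K[2] * S_N]) / total   # Eq. (6-1)
    return (1.0 + Y_N**u + e) / (1.0 + Y_N**u + 1.0 / e)         # Eq. (5-8)
```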


6.3. Psychophysical experiments to tune color balance

Psychophysical experiments using memory matching were performed to select the optimum color-balance coefficients for facial pattern images. The coefficients KL, KM, and KS were each varied from 2.9 to 3.2 in intervals of 0.1, giving 64 combinations. These combinations of coefficients were used to reproduce a facial pattern image under three kinds of illuminants (“A”, “Day Light”, “Cool White”). The observer was asked to select on the CRT the image most similar to the original hardcopy, using memory matching as described in Chapter 5. The results of this experiment are shown in Figs. 27(a), (b), (c), and (d).

Figure 27(a). Number of selections when KM and KS are varied from 2.9 to 3.2 and KL = 2.9.

Figure 27(b). Number of selections when KM and KS are varied from 2.9 to 3.2 and KL = 3.0.

Figure 27(c). Number of selections when KM and KS are varied from 2.9 to 3.2 and KL = 3.1.

Figure 27(d). Number of selections when KM and KS are varied from 2.9 to 3.2 and KL = 3.2.

From Figs. 27(a), (b), (c), and (d), KL = 3.1, KM = 2.9, and KS = 3.1 were chosen as the optimum color-balance coefficients for facial pattern images.

6.4. Comparison of the modified Fairchild model with other color appearance models

A) Experiment A (Modified Fairchild, XYZ, von Kries, Fairchild)

The coefficients of color balance in the modified Fairchild incomplete chromatic adaptation equations were set as KL = 3.1, KM = 2.9, KS = 3.1, and the performance of this modified model was compared with the colorimetric color reproduction and with the other color appearance models (von Kries, Fairchild) by psychophysical experiments. The reproduced images under illuminant “A” are shown in Fig. 28.


Figure 28. A facial pattern image reproduced on a CRT for illuminant “A”; (a) XYZ, (b) Modified Fairchild (Custom model), (c) von Kries, (d) Fairchild.

Psychophysical experiments were performed using memory matching. A hardcopy in the booth was compared with a pair of reproductions on the CRT in a dark environment. Ten observers were asked to select the reproduction on the CRT that provided the best match in color appearance with the hardcopy. The choices of reproduction on the CRT were converted to an interval scale of color reproduction quality for the various models using Thurstone's law of comparative judgments.123 Using the statistical procedures described by Torgeson,124 the averaged z-scores for each model were calculated. These z-scores give interval scale values for each model indicating its performance in reproducing the original hardcopy. A 95% confidence interval for the averaged value is calculated by Eq. (6-2):

$$\pm 1.96\,\frac{0.707}{\sqrt{N M}}, \qquad (6\text{-}2)$$

where N is the number of observations for a sample and M is the number of tested models. In our experiment, the confidence interval around each scale value was 0.283, where N = 15 and M = 4. The performances of two models are considered equivalent if the average of one model falls within the error bar of the other. The results of these experiments are shown in Figs. 29(a), (b), and (c), where "Custom" indicates the modified Fairchild model for facial pattern images. The central bar in each plot of Figs. 29(a), (b), and (c) indicates the averaged interval scale value, and the error bars indicate the 95% confidence interval. In the case of the XYZ image under the "Day Light" illuminant it was not possible to calculate the z-score because the XYZ reproduction was not selected at all.
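
The sketch below illustrates the kind of calculation involved: interval scale values from a paired-comparison count matrix via a basic Thurstone Case V reduction, together with the confidence interval of Eq. (6-2). It is a simplified illustration under stated assumptions, not the exact procedure of Torgeson,124 and any count matrix passed in is hypothetical.

```python
import numpy as np
from scipy.stats import norm

def thurstone_case_v(pref_counts):
    """Interval scale values from a paired-comparison count matrix.
    pref_counts[i, j] = number of times model j was preferred over model i.
    (A simplified Case V reduction; ties and extreme proportions need care.)"""
    counts = np.asarray(pref_counts, dtype=float)
    trials = counts + counts.T
    with np.errstate(divide="ignore", invalid="ignore"):
        p = np.where(trials > 0, counts / trials, 0.5)
    np.fill_diagonal(p, 0.5)
    z = norm.ppf(np.clip(p, 0.01, 0.99))    # clip to avoid infinite z-scores
    return z.mean(axis=0)                    # averaged z-score of each model

def confidence_interval(n_observations, n_models):
    """95% confidence interval of Eq. (6-2) as reconstructed above."""
    return 1.96 * 0.707 / np.sqrt(n_observations * n_models)
```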

Figure 29(a). Averaged interval scale of model performance for illuminant “A”.


Figure 29(b). Averaged interval scale of model performance for “Day Light”.

Figure 29(c). Averaged interval scale of model performance for “Cool White”.

The statistical results of these comparisons showed that the modified Fairchild model (Custom model) had the best averaged z-scores for all kinds of illuminants. The XYZ image showed the worst results because this reproduction does not consider chromatic adaptation.


B) Experiment B (Modified Fairchild, von Kries, RLAB 96, LLAB)

The XYZ and Fairchild reproductions were replaced by the RLAB refinement published in 1996 (RLAB 96)72 and the LLAB model73,74 in order to refine and update the psychophysical experiments. These models were tested for both skin color patches and facial pattern images. Figure 30 shows a reproduced skin color patch under illuminant “A”, and Fig. 31 shows the reproduced facial pattern image under illuminant “A”.

Figure 30. A skin color patch reproduced on the CRT display for illuminant “A”; (a) RLAB 96, (b) Modified Fairchild (Custom model), (c) von Kries, (d) LLAB models.


Figure 31. A facial pattern image reproduced on the CRT display for illuminant “A”; (a) RLAB 96, (b) Modified Fairchild (Custom), (c) von Kries, (d) LLAB models.


Six color patches were examined by two observers, and a facial pattern image was observed by ten observers. Figure 32 shows the percentage of times each model's skin color patch was selected on the CRT as the best one under the three kinds of illuminants. Figure 33 shows the corresponding percentage of selection for the reproduced facial pattern image.

Figure 32. Percentage of the skin color patches selected on the CRT as the best one under the three kinds of illuminants.


Figure 33. Percentage of selection of the reproduced facial pattern image chosen as the best one on the CRT display under the three kinds of illuminants.

From Fig. 32, it is clear that the color appearance of the patches predicted by the RLAB 96 model was chosen as the best. In this case the modified Fairchild model did not perform as well as the RLAB 96 model, because its color-balance coefficients were adjusted for facial pattern images and not for skin color patches. From Fig. 33 it can be seen that the LLAB model performed well for illuminant “A” but could not predict the other illuminants used in the experiment as well. On the other hand, the von Kries model performed well for “Day Light” and “Cool White” but gave poor results for illuminant “A”. The RLAB 96 model performed relatively well; however, the modified Fairchild model (Custom model) presented the best overall result under all kinds of illuminants. The results were also analyzed statistically. Figures 34(a), (b), and (c) show the averaged interval scale values for each reproduced facial pattern image under the various illuminants.


Figure 34(a). Averaged interval scale of model performance for illuminant “A”.

Figure 34(b). Averaged interval scale of model performance for “Day Light”.


Figure 34(c). Averaged interval scale of model performance for “Cool White”.

From Figs. 34(a) and (b) it is clear that the von Kries and RLAB 96 models give approximately the same statistical values for facial pattern image reproduction under the “A” and “Day Light” illuminants. From Fig. 34(c) it can also be seen that the von Kries and LLAB models give approximately the same statistical values for reproduction under the “Cool White” illuminant, and that the RLAB 96 and Custom models are likewise approximately the same under “Cool White”.

6.5. Conclusion

It was found from the psychophysical experiments that the performance of color appearance models for facial patterns depends on the illuminant. The modified Fairchild color appearance model (Custom model) performed well for facial pattern images and showed the best averaged z-scores for all kinds of illuminants. We can conclude that the proposed modified Fairchild model performs well in reproducing facial pattern images.


CHAPTER 7

COLOR REPRODUCTION OF ENDOSCOPIC IMAGES UNDER ENVIRONMENTAL ILLUMINATION

7.1 Introduction

In recent years, electronic endoscopes have been widely used for the diagnosis of various kinds of diseases. The color of the stomach and intestine mucosa gives significant information for the diagnosis of many diseases such as gastritis and cancer. Sometimes the physician needs to ask another specialist for the diagnosis. Sending and receiving images from one hospital to another has become possible with the advance of telecommunication systems. However, in these systems the viewing conditions and devices vary case by case, so it is necessary to consider the influence of the environmental conditions on the appearance of the reproduced image. Endoscopic images are generally viewed by the physician in the dark environment of an operating room. However, when the images are viewed in another room, such as a meeting room, it is necessary to consider the influence of the environmental illuminant on the appearance of the endoscopic images. It is also necessary to perform gamut-mapping for images that cannot be reproduced within the gamut of the CRT; however, up to now there has been no way to quantify how gamut-mapping affects the perceived color appearance. In this chapter, a perceptual color difference is defined to evaluate gamut-mapping for endoscopic images reproduced on a CRT under environmental illumination, instead of the traditional color difference. The perceptual color difference is defined by the Mahalanobis distance, which is calculated using the covariance matrix of metric lightness, chroma, and hue angle.


7.2 Color reproduction method for endoscopic images under environmental illumination

Figure 35 shows a schematic diagram of a color reproduction method to reproduce the appearance of endoscopic images on a CRT under various illuminants.

Figure 35. Diagram of the color reproduction method to reproduce the appearance of endoscopic images on CRT under various illuminants.


In Fig. 35, the original R, G, B endoscopic image is transformed to X, Y, Z using the characteristics of the CRT in the dark environment. The color appearance is calculated from the X, Y, Z values and represented by the CIELUV L*, C*, and h values. Gamut-mapping is performed to adjust the L*, C*, h values on CRT 1 to the L*', C*', h' values on CRT 2 under various illuminants. The inverse chromatic adaptation is then performed to obtain X', Y', Z'. The tristimulus values XL, YL, ZL of the light reflected from the CRT surface are subtracted from X', Y', Z' to give the tristimulus values X", Y", Z" of the light emitted from the CRT surface. Finally, X", Y", Z" are transformed to the device-dependent R', G', B' that are displayed on the CRT. XL, YL, ZL can be calculated from the spectral reflectance ρ(λ) of the CRT surface using the following equations:

$$X_L = K \int_{400}^{700} E(\lambda)\,\rho(\lambda)\,\bar{x}(\lambda)\,d\lambda, \qquad (7\text{-}1a)$$
$$Y_L = K \int_{400}^{700} E(\lambda)\,\rho(\lambda)\,\bar{y}(\lambda)\,d\lambda, \qquad (7\text{-}1b)$$
$$Z_L = K \int_{400}^{700} E(\lambda)\,\rho(\lambda)\,\bar{z}(\lambda)\,d\lambda, \qquad (7\text{-}1c)$$
$$K = 100 \Big/ \int_{400}^{700} E(\lambda)\,\bar{y}(\lambda)\,d\lambda, \qquad (7\text{-}1d)$$

where E(λ) is the spectral radiance of the illuminant.
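
A minimal numerical sketch of Eq. (7-1) is shown below; the wavelength grid, the measured spectra, and the CIE color matching functions are assumed to be provided by the caller.

```python
import numpy as np

def reflected_tristimulus(wl, E, rho, xbar, ybar, zbar):
    """Eq. (7-1): tristimulus values of the illuminant light reflected from the
    CRT faceplate.  All inputs are sampled on the same wavelength grid `wl`
    (e.g. 400-700 nm); the measured spectra and the CIE color matching
    functions are assumed to be supplied by the caller."""
    K  = 100.0 / np.trapz(E * ybar, wl)          # Eq. (7-1d)
    XL = K * np.trapz(E * rho * xbar, wl)        # Eq. (7-1a)
    YL = K * np.trapz(E * rho * ybar, wl)        # Eq. (7-1b)
    ZL = K * np.trapz(E * rho * zbar, wl)        # Eq. (7-1c)
    return XL, YL, ZL
```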

The measurement of the spectral reflectance ρ(λ) of the CRT depends on the viewing angle, so it was assumed that the endoscopic images are displayed at the center of the CRT and that the observer watches the images from a fixed position in front of the CRT.

The spectral reflectance ρ(λ) of the CRT was obtained by dividing the radiant power reflected from the CRT by the radiant power reflected from the perfect white diffuser. The radiant power of the three illuminants (“A”, “Cool White”, and “Day Light”, whose relative spectral radiances are given in Fig. 11) was measured on a white diffuser with a reflectance of approximately 90%. Scattering of light inside the CRT was ignored. The spectral reflectance was calculated for the three kinds of illuminants to examine its illuminant dependence, as shown in Figs. 36(a), (b), and (c), respectively under “A”, “Cool White”, and “Day Light”. As expected, the spectral reflectance has approximately the same shape for these three kinds of illuminants.

Figure 36(a). Spectral reflectance of the CRT, without radiation from the CRT, under illuminant “A”.

Figure 36(b). Spectral reflectance of the CRT, without radiation from the CRT, under “Cool White”.

Figure 36(c). Spectral reflectance of the CRT, without radiation from the CRT, under “Day Light”.

XL, YL, and ZL were calculated using the spectral reflectance ρ(λ) for each illuminant, and the values are shown in Table 7.


Table 7. XL, YL, ZL tristimulus values of reflected light on CRT under various illuminants.

         Illuminant “A”    “Cool White”    “Day Light”
XL            6.24             5.84            1.88
YL            5.52             5.83            2.95
ZL            3.68             4.07            4.24

The incomplete chromatic adaptation model proposed by Fairchild was used to predict the color appearance, as in the case of skin color. Chromatic adaptation to a CRT in a dark environment occurs in relation to the white point of the CRT. In the case of an illuminated CRT, Katoh considered mixed chromatic adaptation.10,11,125 Katoh found that the human visual system is 60% adapted to the CRT white point and 40% to the ambient light. However, as Katoh stated, the degree of adaptation depends on the observation time and the distance from the CRT. In medical systems, it is difficult to reproduce the same experimental conditions used by Katoh. Therefore, it is assumed here that chromatic adaptation occurs to the white point of the CRT added to the tristimulus values of the light reflected from the CRT. The tristimulus values X', Y', Z' of the CRT under illumination can be predicted from the X, Y, Z of the CRT in the dark environment using a simple linear transformation of multiplied matrices, as shown in Eq. (7-2), which is rewritten from Eq. (5-12):

$$\begin{pmatrix} X' \\ Y' \\ Z' \end{pmatrix} = M^{-1} A_2^{-1} C_2^{-1} C_1 A_1 M \begin{pmatrix} X \\ Y \\ Z \end{pmatrix}, \qquad (7\text{-}2)$$


7.3 Gamut-mapping of endoscopic images

7.3.1 Gamut-mapping in the 1976 CIELUV L*C*h color space

The objective of gamut-mapping is to find a way to display the processed colors that fall outside the color gamut of the CRT. The gamut-mapping should ensure that the appearance of the reproduction is as close to the original as possible.73 This appearance fidelity of the reproduced image to the original is very important in remote diagnosis with endoscopic images, because the physicians' diagnosis is mainly based on the appearance of color.

The metric lightness L*, chroma C*uv, and hue angle huv were used for gamut-mapping because these color attributes have a psychophysical meaning. The relationship between the tristimulus values and L*, u*, v* is given by Eq. (7-3), and the relationship between L*, u*, v* and L*, C*uv, huv is shown in Fig. 37.

$$L^* = 116\left(\frac{Y}{Y_n}\right)^{1/3} - 16 \quad \text{for } \frac{Y}{Y_n} > 0.008856, \qquad (7\text{-}3a)$$
$$L^* = 903.3\,\frac{Y}{Y_n} \quad \text{for } \frac{Y}{Y_n} \le 0.008856, \qquad (7\text{-}3b)$$
$$C_{uv}^* = \sqrt{(u^*)^2 + (v^*)^2}, \qquad (7\text{-}3c)$$
$$h_{uv} = \tan^{-1}\!\left(\frac{v^*}{u^*}\right), \qquad (7\text{-}3d)$$

where

$$u^* = 13L^*(u' - u_n'), \qquad (7\text{-}3e)$$
$$v^* = 13L^*(v' - v_n'), \qquad (7\text{-}3f)$$
$$u' = \frac{4X}{X + 15Y + 3Z}, \qquad (7\text{-}3g)$$
$$u_n' = \frac{4X_n}{X_n + 15Y_n + 3Z_n}, \qquad (7\text{-}3h)$$
$$v' = \frac{9Y}{X + 15Y + 3Z}, \qquad (7\text{-}3i)$$
$$v_n' = \frac{9Y_n}{X_n + 15Y_n + 3Z_n}, \qquad (7\text{-}3j)$$
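
The conversion of Eq. (7-3) can be sketched as follows; the white point is an input supplied by the caller, and the hue angle is returned in degrees.

```python
import numpy as np

def xyz_to_lch_uv(xyz, xyz_white):
    """CIELUV metric lightness, chroma and hue angle of Eq. (7-3)."""
    X, Y, Z = (float(v) for v in xyz)
    Xn, Yn, Zn = (float(v) for v in xyz_white)
    yr = Y / Yn
    L = 116.0 * yr**(1.0/3.0) - 16.0 if yr > 0.008856 else 903.3 * yr
    up,  vp  = 4*X  / (X  + 15*Y  + 3*Z),  9*Y  / (X  + 15*Y  + 3*Z)
    upn, vpn = 4*Xn / (Xn + 15*Yn + 3*Zn), 9*Yn / (Xn + 15*Yn + 3*Zn)
    u = 13.0 * L * (up - upn)
    v = 13.0 * L * (vp - vpn)
    C = np.hypot(u, v)
    h = np.degrees(np.arctan2(v, u)) % 360.0   # hue angle in degrees
    return L, C, h
```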

Figure 37. Relationship between the L*u*v* and L*C*h color spaces.

7.3.2 Perceptual Mahalanobis distance

Gamut-mapping algorithms are evaluated using psychophysical techniques, as performed by Katoh,126,127 considering differences in metric lightness (ΔL), chroma (ΔC), and hue angle (Δh). However, the weighted color difference ΔE proposed by Katoh, shown in Eq. (7-4), does not consider the correlation between lightness, chroma, and hue angle:

$$\Delta E = \left[\left(\frac{\Delta L^*}{K_l}\right)^2 + \left(\frac{\Delta C_{ab}^*}{K_c}\right)^2 + \left(\frac{\Delta H_{ab}^*}{K_h}\right)^2\right]^{1/2}, \qquad (7\text{-}4)$$

where ΔL*, ΔC*ab, ΔH*ab are the differences in lightness, chroma, and hue, respectively, and Kl, Kc, Kh are mapping coefficients for each attribute of color.


The CIE 1994 total color difference128 shown in Eq. (7-5) uses the correlation between hue and chroma in the estimation of the weighting functions. However, it does not consider the correlations between lightness and hue, or between lightness and chroma:

$$\Delta E_{94}^* = \left[\left(\frac{\Delta L^*}{k_L S_L}\right)^2 + \left(\frac{\Delta C_{ab}^*}{k_C S_C}\right)^2 + \left(\frac{\Delta H_{ab}^*}{k_H S_H}\right)^2\right]^{1/2}, \qquad (7\text{-}5)$$

where kL, kC, kH are parametric factors and SL, SC, SH are weighting functions; SL = 1, SC = 1 + 0.045 C*ab, and SH = 1 + 0.015 C*ab are currently recommended by the CIE.

A perceptual color distance based on the Mahalanobis distance is defined using the covariance matrix of perceptually equivalent metric lightness, chroma, and hue angle of endoscopic images. The Mahalanobis distance129 makes the influence of the distribution of each attribute uniform and is defined as follows:

$$d = \left\{ \begin{pmatrix} \Delta L & \Delta C & \Delta h \end{pmatrix} \begin{pmatrix} V_{LL} & V_{LC} & V_{Lh} \\ V_{CL} & V_{CC} & V_{Ch} \\ V_{hL} & V_{hC} & V_{hh} \end{pmatrix}^{-1} \begin{pmatrix} \Delta L \\ \Delta C \\ \Delta h \end{pmatrix} \right\}^{1/2}, \qquad (7\text{-}6)$$

where VLL, VCC, Vhh are the variances of metric lightness, chroma, and hue angle, respectively, and VLC (VCL), VLh (VhL), VCh (VhC) are the covariances between metric lightness and chroma, lightness and hue angle, and chroma and hue angle, respectively. This perceptual distance can be used to evaluate gamut-mapping. The gamut-mapping that gives the shortest perceptual Mahalanobis distance is considered the best reproduction.
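
A minimal sketch of Eq. (7-6) follows; the covariance matrix is filled here with the experimentally determined values of Table 8 below, and the ΔL, ΔC, Δh values in the example call are illustrative only.

```python
import numpy as np

def mahalanobis_distance(dL, dC, dh, cov):
    """Perceptual color distance of Eq. (7-6); cov is the 3x3 covariance
    matrix of metric lightness, chroma and hue angle."""
    delta = np.array([dL, dC, dh], dtype=float)
    return float(np.sqrt(delta @ np.linalg.inv(cov) @ delta))

# Covariance matrix assembled from the values of Table 8 (below)
cov_endoscopic = np.array([[5.818 ,  2.664 ,  0.2198],
                           [2.664 , 11.29  , -1.544 ],
                           [0.2198, -1.544 ,  1.326 ]])

# Illustrative call: 2 units of lightness, -3 of chroma, 1 degree of hue angle
d = mahalanobis_distance(2.0, -3.0, 1.0, cov_endoscopic)
```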


7.3.3 Psychophysical experiments

Psychophysical experiments were performed to find the covariance matrix that defines the Mahalanobis distance for endoscopic images. For this purpose, the lightness of an endoscopic image was changed by -4, -2, 0, 2, or 4 units, the chroma by -6, -3, 0, 3, or 6 units, and the hue angle by -2, -1, 0, 1, or 2 degrees from the original image. Every possible combination of these three color attributes was prepared, giving 125 versions of the endoscopic image, each displayed in random order. Ten observers, students and staff of our laboratory, were asked to view each pair on the CRT and to judge whether the two images appeared the same or not. The statistical results of this experiment are shown in Table 8.

Table 8. Variances and covariances for metric lightness, chroma and hue angle of endoscopic image.

Attribute                                             Symbol        Value
Variance of metric lightness                          VLL           5.818
Variance of metric chroma                             VCC           11.29
Variance of metric hue angle                          Vhh           1.326
Covariance between metric lightness and chroma        VLC (VCL)     2.664
Covariance between metric lightness and hue angle     VLh (VhL)     0.2198
Covariance between metric chroma and hue angle        VCh (VhC)     -1.544

From Table 8, the variance of chroma is greater than the variance of lightness, which in turn is greater than the variance of hue angle.


The perceptual Mahalanobis distance can then be calculated by Eq. (7-7):

$$d = \left\{ \begin{pmatrix} \Delta L & \Delta C & \Delta h \end{pmatrix} \begin{pmatrix} 0.2047 & -0.0630 & -0.1071 \\ -0.0630 & 0.1247 & 0.1556 \\ -0.1071 & 0.1556 & 0.9530 \end{pmatrix} \begin{pmatrix} \Delta L \\ \Delta C \\ \Delta h \end{pmatrix} \right\}^{1/2}. \qquad (7\text{-}7)$$

The negative off-diagonal terms in the matrix of Eq. (7-7) correspond to the correlations between lightness and chroma and between lightness and hue angle; they decrease the perceptual color distance when ΔG × ΔQ > 0, where G and Q stand for L, C, h. Conversely, the positive off-diagonal terms correspond to the correlation between metric chroma and hue angle and decrease the perceptual color distance when ΔG × ΔQ < 0. Hara and coworkers carried out a psychophysical experiment130,131 with the collaboration of five physicians. They changed the metric lightness, chroma, and hue angle in order to reproduce a preferred endoscopic image on a CRT. In this experiment, they examined only the variances of metric lightness, chroma, and hue angle and the covariance between metric chroma and hue angle, as shown in Table 9. Unfortunately, it is not possible to calculate the perceptual Mahalanobis distance from the results of Hara's experiment.


Table 9. Result of psychophysical experiments performed by Hara with the collaboration of physicians.

Statistical value of color attribute                  Symbol        Value
Variance of metric lightness                          VLL           23.56
Variance of metric chroma                             VCC           34.08
Variance of metric hue angle                          Vhh           7.44
Covariance between metric chroma and hue angle        VCh (VhC)     -8.97

However, they found that the variance of metric chroma was greater than that of lightness, the variance of lightness was greater than that of hue angle, and the covariance between chroma and hue angle had a negative value, like the results of the psychophysical experiments shown in Table 8.

7.4 Analysis of gamut-mapping for endoscopic images

Scaling and translation of the metric hue angle, chroma, and lightness were used for gamut-mapping, as shown in Eq. (7-8):

$$h' = h + \alpha, \qquad (7\text{-}8a)$$
$$C' = \beta C + \gamma, \qquad (7\text{-}8b)$$
$$L' = \phi L + \varepsilon, \qquad (7\text{-}8c)$$

where h, C, L are the hue angle, chroma, and lightness before the mapping; h', C', L' are the mapped hue angle, chroma, and lightness; and α, γ, β, ε, and φ are the coefficients of hue angle translation, chroma translation, chroma scaling, lightness translation, and lightness scaling, respectively. Only one coefficient was used for the hue angle, since both scaling and translation of the hue angle produce a rotation of the mapped points around the lightness axis. The parameters were varied in the intervals described in Table 10; a brief sketch of this parameter search is given after the table.


Table 10. Variation of the coefficients φ, ε, β, γ, and α in Eq. (7-8).

Coefficient    Initial value    Final value    Interval
φ                  0.95             1.00          0.05
ε                  -3               3             1
β                  1.0              1.0           0
γ                  -10              10            1
α                  -1               1             1
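
The parameter search over the grid of Table 10 can be sketched as follows. The gamut test and the averaged Mahalanobis distance of Eq. (7-6) are passed in as caller-supplied placeholder functions, since they depend on the CRT characterization and on the covariance matrix of Table 8; this is an illustrative sketch, not the original analysis program.

```python
from itertools import product

def map_lch(L, C, h, phi, eps, beta, gamma, alpha):
    """Gamut-mapping of Eq. (7-8): scale/translate L and C, rotate h."""
    return phi * L + eps, beta * C + gamma, h + alpha

def search_best_mapping(L, C, h, in_gamut_fraction, averaged_distance):
    """Exhaustive search over the coefficient grid of Table 10."""
    grid = product([0.95, 1.00],     # phi
                   range(-3, 4),     # eps
                   [1.0],            # beta
                   range(-10, 11),   # gamma
                   range(-1, 2))     # alpha
    best = None
    for phi, eps, beta, gamma, alpha in grid:
        Lm, Cm, hm = map_lch(L, C, h, phi, eps, beta, gamma, alpha)
        if in_gamut_fraction(Lm, Cm, hm) < 0.99:      # keep 99% of pixels in gamut
            continue
        d = averaged_distance(Lm - L, Cm - C, hm - h) # averaged Eq. (7-6) distance
        if best is None or d < best[0]:
            best = (d, (phi, eps, beta, gamma, alpha))
    return best
```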

A stomach endoscopic image was reproduced from a CRT (Nanao FlexScan 56T) in a dark environment (corresponding to CRT 1 in Fig. 35) to a CRT (AppleColor High-Resolution RGB Model MO401) under illuminant “A” (CRT 2 in Fig. 35). The characteristics of the CRTs were described in Chapter 3. Figure 38 shows the endoscopic image to be displayed on a CRT in a dark environment.


Figure 38. Stomach endoscopic image.

This image was chosen because it presents dark and bright regions that are generally difficult to reproduce simultaneously inside the color gamut of the CRT.

Figure 39 shows a reproduction of the stomach image of Fig. 38 on a CRT under illuminant “A”. This image could not be reproduced entirely inside the color gamut of the CRT: 2.76% of the pixels were out of gamut in the red channel and 0.86% in the green channel, while all pixels were inside the gamut in the blue channel. Lightness, chroma, and hue angle mapping was applied to the image in Fig. 39, and the perceptual Mahalanobis distance produced by each mapping was calculated for all combinations of the coefficients α, β, γ, ε, and φ varied as shown in Table 10, subject to the condition that at least 99% of the pixels are reproduced inside the color gamut of the CRT.

Figure 39. Reproduced image on CRT under illuminant “A”.


The minimum averaged perceptual Mahalanobis distance was 1.357 for the coefficients φ = 0.95, ε = 3, β = 1.0, γ = 1, and α = 0 of Eq. (7-8). Equation (7-8) can then be rewritten as follows:

$$h' = h, \qquad (7\text{-}9a)$$
$$C' = C + 1, \qquad (7\text{-}9b)$$
$$L' = 0.95L + 3. \qquad (7\text{-}9c)$$

With these coefficients, 99.17% of the pixels in the red channel, 99.99% in the green channel, and 100.00% in the blue channel were reproduced inside the color gamut of the CRT. The reproduced endoscopic image for these coefficients is shown in Fig. 40.

Figure 40. Reproduced stomach endoscopic image under illuminant “A”, for φ = 0.95, ε = 3, β = 1.0, γ = 1, and α = 0 of Eq. (7-8).


This result shows that it is better to adjust the metric lightness and chroma simultaneously and keep the hue angle unchanged in order to minimize changes in the color appearance. The hue angle is the appearance attribute that can be distinguished with the highest precision, and the psychophysical experiments showed that the human visual system is very sensitive to changes in hue.

7.5 Conclusion

A method to reproduce endoscopic images on an illuminated CRT was proposed. This color reproduction method uses color appearance models and gamut-mapping. Gamut-mapping was performed in the 1976 CIELUV L*C*h color space by scaling and translation of the color attributes. A perceptual Mahalanobis distance was defined to evaluate the proposed gamut-mapping techniques. This perceptual distance is calculated using the covariance matrix of perceptually equivalent metric lightness, chroma, and hue angle, and it indicates how the color appearance of the reproduced endoscopic image is affected by gamut-mapping. Experiments using this perceptual color distance showed that a mapping that minimizes changes in color appearance was obtained for the coefficients φ = 0.95, ε = 3, β = 1.0, γ = 1, and α = 0 of Eq. (7-8). Therefore, it is better to map lightness and chroma simultaneously while keeping the hue angle unchanged.


CHAPTER 8

CONCLUSION

Many aspects and problems involving the color reproduction of an original scene under various illuminants have been studied, especially for skin color and endoscopic images. First, the spectral reflectances of facial pattern images taken by an HDTV camera were estimated using three principal components of skin spectral reflectance. Computer simulation of colorimetric color reproduction was performed using the obtained spectral reflectances, and the predicted images were reproduced colorimetrically on a CRT and on hardcopy. Color appearance models were also introduced into the colorimetric color reproduction method to predict the color appearance of skin colors under various illuminants on the CRT. Computer simulations of the Fairchild, von Kries, and LAB chromatic adaptation models were compared with the color hardcopies by psychophysical experiments using the memory matching technique. These experiments showed that the incomplete chromatic adaptation model proposed by Fairchild provides a suitable prediction of the color appearance of skin color patches. However, the Fairchild model was not as effective for facial pattern images as for skin color patches. The coefficients of the Fairchild model were therefore adjusted for the reproduction of facial pattern images by psychophysical experiments. The modified Fairchild model showed the best overall performance compared to the other color appearance models under various illuminants.


The proposed color reproduction method based on color appearance models was also applied to endoscopic images on a CRT under environmental illumination. Gamut-mapping is required for images that are not reproduced inside the gamut of the CRT. For this purpose, a new perceptual color distance based on the Mahalanobis distance was introduced for gamut-mapping. The Mahalanobis distance was calculated using the covariance matrix of metric lightness, chroma, and hue angle obtained by psychophysical experiments. I consider that this distance could be used as part of an extension of the CIE 1994 total color difference ΔE*94 as a perceptual color distance. Various gamut-mapping techniques were evaluated using this perceptual color distance. As a result, it was shown that lightness and chroma should be mapped simultaneously while preserving the hue angle for effective color reproduction of endoscopic images. In this dissertation, the scope of the study was limited to specific applications, namely facial pattern and endoscopic image reproduction. However, I believe that the methods developed in this research can be extended to the reproduction of other types of complex images, such as fine art images and landscape pictures.


REFERENCES

(1) Nassau, K., “The fifteen causes of color: the physics and chemistry of color,” Color Res. Appl. 12, 4-26 (1987). (2) Gouras, P., The Perception of colour, MacMillan Press, London, 1991. (3)Hunt, R. W. G., “Chromatic adaptation in image reproduction,” Color Res. Appl. 7, 46-49 (1982). (4)Hunt, R. W. G., “A model of colour vision for predicting colour appearance,” Color Res. Appl. 7, 95-118 (1982). (5) Robertson A. R., “The future of color science,” Color Res. Appl. 7, 16-18 (1982). (6) Haneishi, H., Miyata, K., Yaguchi, H., and Miyake, Y., “A new method for color correction in hardcopy from CRT images,” Journal of Imaging Science and Technology 37, 30-36 (1993). (7) Fairchild, M. D., Lester, A. A., and Berns, R. S., “Accurate Color Reproduction of CRT-Displayed image as projected 35 mm slides,” Journal of Electronic Imaging 5, 87-96, (1993). (8) Wandell, B. A., “The synthesis and analysis of color images,” IEEE Transactions on Pattern Analysis and Machine Intelligence, PAMI-9, 2-13 (1987). (9) Choi, K. F., “Comparison of Color difference perception on soft-display versus hardcopy,” Proc. of IS&T/SID’s 2nd Color Imaging Conference: Color Science, Systems and Applications, 18-21, (1994). (10) Katoh, N., “Appearance match between soft copy and hard copy under mixed chromatic adaptation,” Proc. of the IS&T/SID Color imaging conference: Color science, systems and applications, 22-25, (1995). (11) Katoh, N., “Appearance match between soft copy and hard copy (III),” Proc. of Color Forum Japan ‘95, 33-36, (1995) (in Japanese).


(12) Tonnquist, G., “25 years of color with the AIC - and 25,000 without,” Color Res. Appl. 18, 353-365 (1993). (13) Maxwell J. C., “On the theory of compound colours, and the relations of the colours of the spectrum,” (first published in Philosophical Transactions in 1860), Color Res. Appl. 18, 270-287 (1993). (14) Helmholtz, H. von, Handbuk der physiologischen Optik, 2nd ed., Voss, Hamburg & Leipzig, 1896. (15) Hering, E., Outlines of a theory of the light sense [English translation, L. M. Hurvich and D. Jameson (Harvard U. Press, Cambridge, Mass., 1964)]. (16) Munsell, A. H., A color notation, Munsell Color Co., Boston, 1905. (17) Schrödinger, E., “On the relationship of four- to three-color theory,” (translation by National translation center, commentary by Qasim Zaidi), Color Res. Appl. 19, 37-47 (1994). (18) Judd, D. B., “Hue saturation and lightness of surface colors with chromatic illumination,” J. Opt. Soc. Am. 27, 2-32 (1940). (19) Wright, W. D., Researches on normal and defective colour vision , Kimpton, London, 1946. (20) Wyszecki, G., and Stiles, W. S., Color science: Concepts and Methods, Quantitative Data and Formulae, 2nd Edition, Wiley Sons, New York, 1982. (21) MacAdam, D. L., Color measurement. Theme and Variations, Springer-Verlag, Berlin & Heidelberg, 1981. (22) Richter, M., “The development of color metrics,” Color Res. and Application 9, 69-83 (1984). (23) Hunter, R. S., The measurement of appearance, Wiley, New York, 1975. (24) Hunt, R. W. G., The reproduction of colour, Fountain Press, Kingston-upon- Thames, 226, 1995.


(25) Helson, H., Judd, D. B., Warren, M. H., “Object-color changes from daylight to incandescent filament illumination,” Illumin. Eng. 47, 221-233 (1952). (26) Macdonald, L. W., Luo, M. R., Scrivener, S. A. R., “Factor affecting the appearance of coloured images on a video display monitor,” The Journal of Photographic Science 38, 177-186 (1990). (27) Ives, H. E., “The relation between the color of the illuminant and the color of the illuminated object,” Color Res. Appl. 20, 70-76 (1995). (28) Hunt, R. W. G., Winter, L. M., “Colour adaptation in picture-viewing situations,” The Journal of photographic science 23, 112-115 (1975). (29) Werner, J. S., Walraven J., “Effect of chromatic adaptation on the achromatic locus: the role of contrast, luminance and background color,” Vis. Research 22, 929- 943 (1982). (30) Jacobson, A. R., “The effect of background luminance on color recognition”, Color Res. Appl. 11, 263-269 (1986). (31) Fairchild, M. D., “Considering the surround in device-independent color imaging,” Color Res. Appl. 20, 352-363 (1995). (32) Mihashi, T., Okajima, K., “An effect of surround condition in cross-media color reproduction,” Proc. of Color Forum Japan’ 96, 25-28 (1996) (in Japanese). (33) Blackwell, K. T., Buchsbaum, G., “The effect of spatial and chromatic parameters on chromatic induction,” Color Res. Appl. 13, 166-173 (1988). (34) MacAdam, D. L., “Chromatic adaptation,” J. Opt. Soc. Am. 46, 500-513 (1956). (35) Loomis, J. M., Berger, T., “Effects of chromatic adaptation on color discrimination and color appearance,” Vis. Research 19, 891-901 (1979). (36) Bartleson, C. J., “On chromatic adaptation and persistence,” Color Res. Appl. 6, 153-160 (1981).

(37) Wei, J., Shevell, S. K., "Color appearance under chromatic adaptation varied along theoretically significant axes in color space," J. Opt. Soc. Am. A 12, 36-46 (1995).
(38) Fairchild, M. D., Berns, R. S., "Image color-appearance specification through extension of CIELAB," Color Res. Appl. 18, 178-190 (1993).
(39) Bartleson, C. J., "Predicting corresponding colors with changes in adaptation," Color Res. Appl. 4, 143-155 (1979).
(40) Indow, T., "Global color metrics and color-appearance systems," Color Res. Appl. 5, 5-12 (1980).
(41) Billmeyer, Jr., F. W., "Quantifying color appearance visually and instrumentally," Color Res. Appl. 13, 140-145 (1988).
(42) Wright, W. D., "Why and how chromatic adaptation has been studied," Color Res. Appl. 6, 147-152 (1981).
(43) Beretta, G. B., Color appearance in computer applications - some background information on perceptual color matching, Technical Report NTIS PB94-102167, Canon Information Systems, Inc., 1993.
(44) Hunt, R. W. G., "Measures of colour appearance in colour reproduction," Color Res. Appl. 4, 39-43 (1979).
(45) Hunt, R. W. G., "Chromatic adaptation in image reproduction," Color Res. Appl. 7, 46-49 (1982).
(46) Hunt, R. W. G., Measuring Colour, 2nd Edition, Ellis Horwood, London, 1991.
(47) von Kries, J. A., "Die Gesichtsempfindungen," in W. Nagel, Ed., Die Physiologie der Sinne III, Vieweg, Braunschweig, 109-282, 1904.
(48) von Kries, J. A., "Chromatic adaptation," in Festschrift der Albrecht-Ludwig-Universität, Fribourg (1902). [Translation: D. L. MacAdam, Sources of Color Science, MIT Press, Cambridge, 1970], 109-119.

(49) Bartleson, C. J., "Changes in color appearance with variations in chromatic adaptation," Color Res. Appl. 4, 119-138 (1979).
(50) Richter, K., "Cube-root color spaces and chromatic adaptation," Color Res. Appl. 5, 25-43 (1980).
(51) Hunt, R. W. G., "A model of colour vision for predicting colour appearance," Color Res. Appl. 7, 95-118 (1982).
(52) Hunt, R. W. G., "A model of colour vision for predicting colour appearance in various viewing conditions," Color Res. Appl. 12, 297-314 (1987).
(53) Hunt, R. W. G., "Revised colour-appearance model for related and unrelated colours," Color Res. Appl. 16, 146-165 (1991).
(54) Hunt, R. W. G., "An improved predictor of colourfulness in a model of colour vision," Color Res. Appl. 19, 23-33 (1994).
(55) Takahama, K., Sobagaki, H., Nayatani, Y., "Analysis of chromatic-adaptation effect by a linkage model," J. Opt. Soc. Am. 67, 651-656 (1977).
(56) Nayatani, Y., Takahama, K., and Sobagaki, H., "Formulation of a nonlinear model of chromatic adaptation," Color Res. Appl. 6, 161-171 (1981).
(57) Nayatani, Y., Takahama, K., Sobagaki, H., and Hirono, J., "On exponents of a nonlinear model of chromatic adaptation," Color Res. Appl. 7, 34-45 (1982).
(58) Takahama, K., Sobagaki, H., and Nayatani, Y., "Formulation of a nonlinear model of chromatic adaptation for a light-gray background," Color Res. Appl. 9, 106-115 (1984).
(59) Nayatani, Y., Takahama, K., and Sobagaki, H., "Prediction of color appearance under various adapting conditions," Color Res. Appl. 11, 62-71 (1986).
(60) Nayatani, Y., Hashimoto, K., Takahama, K., and Sobagaki, H., "Whiteness-blackness and brightness response in a nonlinear color appearance model," Color Res. Appl. 12, 121-127 (1987).

(61) Nayatani, Y., Hashimoto, K., Takahama, K., and Sobagaki, H., "A nonlinear color-appearance model using Estévez-Hunt-Pointer primaries," Color Res. Appl. 12, 231-242 (1987).
(62) Nayatani, Y., Takahama, K., Sobagaki, H., and Hashimoto, K., "Color-appearance model and chromatic-adaptation transform," Color Res. Appl. 15, 210-221 (1990).
(63) Nayatani, Y., "Revision of the chroma and hue scales of a nonlinear color-appearance model," Color Res. Appl. 20, 143-155 (1995).
(64) Nayatani, Y., Sobagaki, H., Hashimoto, K., and Yano, T., "Lightness dependency of chroma scales of a nonlinear color-appearance model and its latest formulation," Color Res. Appl. 20, 156-167 (1995).
(65) Fairchild, M. D., "Formulation and testing of an incomplete-chromatic-adaptation model," Color Res. Appl. 16, 243-250 (1991).
(66) Fairchild, M. D., "A model of incomplete chromatic adaptation," Proc. of CIE 22nd Session, 33-34 (1991).
(67) Luo, M. R., Clarke, A. A., Rhodes, P. A., Schappo, A., Scrivener, S. A. R., Tait, C. J., "Quantifying colour appearance. Part I. LUTCHI colour appearance data," Color Res. Appl. 16, 166-180 (1991).
(68) Luo, M. R., Clarke, A. A., Rhodes, P. A., Schappo, A., Scrivener, S. A. R., Tait, C. J., "Quantifying colour appearance. Part II. Testing colour models performance using LUTCHI colour appearance data," Color Res. Appl. 16, 181-197 (1991).
(69) Luo, M. R., Gao, X. W., Rhodes, P. A., Xin, H. J., Clarke, A. A., Scrivener, S. A. R., "Quantifying colour appearance. Part III. Supplementary LUTCHI colour appearance data," Color Res. Appl. 16, 98-113 (1993).
(70) Colorimetry, 2nd ed., CIE Publication No. 15.2, Central Bureau of the CIE, Vienna, 1986.

(71) Fairchild, M. D., "Visual evaluation and evolution of the RLAB color space," Proc. of IS&T and SID's 2nd Color Imaging Conference: Color Science, Systems and Applications, 9-12 (1994).
(72) Fairchild, M. D., "Refinement of the RLAB color space," Color Res. Appl. 21, 338-346 (1996).
(73) Luo, M. R., "Two unsolved issues in colour management - color appearance and gamut mapping," Proc. of World Techno Fair in Chiba '96: Imaging Science and Technology, Evolution and Promise, 136-147 (1996).
(74) Luo, M. R., Lo, M., Kuo, W., "The LLAB(I:c) colour model," Color Res. Appl. 21, 412-429 (1996).
(75) Guth, S. L., "Model for color vision and light adaptation," J. Opt. Soc. Am. A 8, 976-993 (1991).
(76) Guth, S. L., "Further applications of the ATD model for color vision," Proc. of SPIE Vol. 2414, 12-26 (1994).
(77) Guth, S. L., "ATD model for color vision I: background," Proc. of SPIE Vol. 2170, 149-162 (1995).
(78) McCann, J. J., McKee, S. P., Taylor, T. H., "Quantitative studies in Retinex theory," Vis. Research 15, 445-458 (1976).
(79) Land, E., "The Retinex theory of color vision," Scientific American 237, 108-128 (1977).
(80) Land, E. H., McCann, J. J., "Lightness and Retinex theory," J. Opt. Soc. Am. 61, 1-11 (1971).
(81) McCann, J. J., "Color sensations in complex images," Proc. of IS&T and SID's Color Imaging Conference: Transforms & Transportability of Color, 16-23 (1993).

(82) Kim, T. G., Berns, R. S., Fairchild, M. D., "Comparing appearance models using pictorial images," Proc. of IS&T and SID's Color Imaging Conference: Transforms & Transportability of Color, 72-77 (1993).
(83) Braun, K. M., Fairchild, M. D., "Evaluation of five color-appearance transforms across changes in viewing conditions and media," Proc. of the IS&T/SID 1995 Color Imaging Conference: Color Science and Applications, 93-96 (1995).
(84) Fairchild, M. D., Reniff, L., "Time course of chromatic adaptation for color-appearance judgments," J. Opt. Soc. Am. A 12, 824-833 (1995).
(85) Sato, T., Nakano, Y., Iga, T., Nakauchi, S., Usui, S., "Color reproduction based on low dimensional spectral reflectances using the principal component analysis," Proc. of Color Forum Japan '96, 45-48 (1996).
(86) Vrhel, M. J., Gershon, R., Iwan, L. S., "Measurement and analysis of object reflectance spectra," Color Res. Appl. 19, 4-9 (1994).
(87) Vrhel, M. J., Trussell, H. J., "Color correction using principal components," Color Res. Appl. 17, 328-338 (1992).
(88) Tominaga, S., Wandell, B. A., "Standard surface-reflectance model and illuminant estimation," J. Opt. Soc. Am. A 6, 576-584 (1989).
(89) Marimont, D. H., Wandell, B. A., "Linear models of surface and illuminant spectra," J. Opt. Soc. Am. A 9, 1905-1913 (1992).
(90) Ojima, N., Haneishi, H., and Miyake, Y., "The appearance of skin with make-up (III): Estimation of spectral reflectance of skin with the HDTV color image," J. SPSTJ 57, 78-83 (1994) (in Japanese).
(91) Shiobara, T., Zhou, S., Haneishi, H., and Miyake, Y., "Improved color reproduction of electronic endoscopes," Proc. of the IS&T/SID 1995 Color Imaging Conference: Color Science, Systems and Applications, 186-189 (1995).

(92) Ojima, N., Basic research on the quality analysis of make-up skin, Ph.D. thesis, Chiba University, 1994 (in Japanese).
(93) Robertson, A. R., "The CIE 1976 color-difference formulae," Color Res. Appl. 2, 7-11 (1977).
(94) Robertson, A. R., "Historical development of CIE recommended color difference equations," Color Res. Appl. 15, 167-170 (1990).
(95) Billmeyer, Jr., F. W., Fairman, H. S., "CIE method for calculating tristimulus values," Color Res. Appl. 12, 27-36 (1987).
(96) Cowan, W. B., "An inexpensive scheme for calibration of a colour monitor in terms of CIE standard coordinates," Computer Graphics 17, 315-321 (1983).
(97) Barco, J. L. del, Lee, Jr., "Colorimetric calibration of a video digitizing system: algorithm and applications," Color Res. Appl. 13, 180-186 (1988).
(98) Barco, J. L. del, Diaz, J. A., Jimenez, J. R., Rubino, M., "Considerations on the calibration of color displays assuming constant channel chromaticity," Color Res. Appl. 20, 377-387 (1995).
(99) Brainard, D. H., "Calibration of a computer controlled color monitor," Color Res. Appl. 14, 23-34 (1989).
(100) Post, D. L., Calhoun, C. S., "An evaluation of methods for producing desired colors on CRT monitors," Color Res. Appl. 14, 172-186 (1989).
(101) Rich, D. C., Alston, D. L., Allen, L. H., "Psychophysical verification of the accuracy of color and color-difference simulations of surface samples on a CRT display," Color Res. Appl. 17, 45-61 (1992).
(102) Berns, R. S., Motta, R. J., Gorzynski, M. E., "CRT colorimetry. Part I: Theory and practice," Color Res. Appl. 18, 299-314 (1993).
(103) Berns, R. S., Motta, R. J., Gorzynski, M. E., "CRT colorimetry. Part II: Metrology," Color Res. Appl. 18, 315-325 (1993).

(104) Alpern, M., Kitahara, H., Fielder, G. H., "The change in color matches with retinal angle of incidence of the colorimetric beams," Vis. Research 27, 1763-1778 (1987).
(105) Balasubramanian, R., "Color transformation for printer color correction," Proc. of IS&T and SID's Color Imaging Conference: Color Science, Systems and Applications, 62-64 (1994).
(106) Tominaga, S., Iritani, T., "Color specification conversion for CMYK printer," Proc. of Color Forum Japan '96, 13-16 (1996).
(107) Miyata, K., On the color reproduction and evaluation of image quality in hardcopy, Master Thesis, Chiba University, 1991 (in Japanese).
(108) Stevens, J. C., Stevens, S. S., "Brightness function: Effects of adaptation," J. Opt. Soc. Am. 53, 375-385 (1963).
(109) Alessi, P. J., "CIE guidelines for coordinated research on evaluation of colour appearance models for reflection print and self-luminous display image comparisons," Color Res. Appl. 19, 48-58 (1994).
(110) Fairchild, M. D., Alvin, R. L., "Precision of color matches and accuracy of color-matching functions in cross-media color reproduction," Proc. of the IS&T/SID 1995 Color Imaging Conference: Color Science, Systems and Applications, 18-21 (1995).
(111) Fairchild, M. D., "Testing colour appearance models: guidelines for coordinated research," Color Res. Appl. 20, 262-267 (1995).
(112) Fairchild, M. D., Pirrotta, E., Kim, T., "Successive-Ganzfeld haploscopic viewing technique for color-appearance research," Color Res. Appl. 19, 214-221 (1994).
(113) Braun, K., Fairchild, M. D., "Viewing environments for cross-media image comparisons," Proc. of IS&T 47th Annual Conference/ICPS, 391-396 (1994).
(114) Brewer, W. L., "Fundamental response functions and binocular color matching," J. Opt. Soc. Am. 44, 207-212 (1953).

(115) Newhall, S. M., Burnham, R. W., Clark, J. R., "Comparison of successive with simultaneous color matching," J. Opt. Soc. Am. 47, 43-56 (1957).
(116) Eastman, A. A., Brecher, G. A., "The subjective measurement of color shifts with and without chromatic adaptation," Journal of IES 1, 239-246 (1972).
(117) Braun, K., Fairchild, M. D., "Viewing techniques for cross-media image comparisons," Color Res. Appl. 21, 6-17 (1996).
(118) Bolles, R., Mackintosh, I., Hanly, B., "Colour judgment as a function of stimulus conditions and memory colour," Canad. J. Psychol. 13, 175-185 (1959).
(119) Breneman, E. J., "Corresponding chromaticities for different states of adaptation to complex visual fields," J. Opt. Soc. Am. A 4, 1115-1129 (1987).
(120) Satoh, C., "The appearance of face color as contrasted with object color," Proc. of Color Forum Japan '96, 53-56 (1996).
(121) Smith, V. C., Pokorny, J., "Chromatic-discrimination axes, CRT phosphor spectra, and individual variation in color vision," J. Opt. Soc. Am. A 12, 27-35 (1995).
(122) Nayatani, Y., Hashimoto, K., Takahama, K., Sobagaki, H., "Physiological causes of individual variations in color-matching functions," Color Res. Appl. 13, 298-306 (1988).
(123) Thurstone, L. L., "A law of comparative judgment," Psych. Rev. 34, 273-286 (1927).
(124) Torgerson, W. S., "The law of comparative judgment," Theory and Methods of Scaling, Chapter 9, Wiley, New York, 1967.
(125) Katoh, N., "Practical method for appearance match between soft copy and hard copy (II)," Advance Printing of Papers, Workshop "Electronic Photography '94", SPSTJ, 55-58 (1994) (in Japanese).
(126) Itoh, M., Katoh, N., "Gamut compression for computer graphic images," Proceedings of Symposia on Fine Imaging, 85-88 (1995) (in Japanese).

(127) Katoh, N., Itoh, M., "Gamut mapping for computer generated images (II)," Proc. of IS&T and SID Fourth Color Imaging Conference: Color Science, Systems and Applications, 126-129 (1996).
(128) Schanda, J., "CIE colorimetry and colour displays," Proc. of IS&T and SID Fourth Color Imaging Conference: Color Science, Systems and Applications, 230-234 (1996).
(129) Takagi, M., Shimoda, H., Handbook of Image Analysis, University of Tokyo Press, 652-654, 1995 (in Japanese).
(130) Hara, K., Evaluation of preferred color and analysis of mucous with electronic endoscope, Master Thesis, Chiba University, 1992 (in Japanese).
(131) Hara, K., Haneishi, H., Yaguchi, H., Miyake, Y., "On the preferred color reproduction of electronic endoscopic images," Proc. of 23rd Engineering Image Conference, 119-122 (1992) (in Japanese).
