
Adopting ISO Standards for Museum Imaging
Scott Geffert, imagingetc.com, Inc. (originally published 1/2008)

Introduction:

This document takes a critical look at the growing desire within the museum community to utilize open, international standards for the digitization and output of original two- and three-dimensional objects. Over the years, there has been much discussion regarding the long-term viability of digital assets. Unfortunately, most of these discussions have focused on file formats and the life of storage media. While these are critical issues, there is a third and possibly more immediate concern: the current lack of standardization of the process for capture and output. While most people assume that simply implementing an ICC-managed workflow is adequate for museum imaging, there are both legacy issues and recent advancements that users should be aware of.

For those charged with both communicating about and preserving works of art, it is incredibly important that digital images are carefully defined and that objective standard procedures are adhered to. You may ask: “What standards?” It is for this reason that we set out to explore the current and emerging standards that form the basis of today’s imaging practices. While the limitations of traditional film and processing made it almost impossible to achieve consistency, there is absolutely no reason why today’s digital imaging tools cannot be run in such a way as to deliver consistent, accurate representations of artworks. As electronic image distribution has made images and information more accessible than ever, it is essential that the images and information released to the public are carefully created and vetted.

To illustrate the seriousness of this issue, perform a Google image search of any well-known artwork. The results will certainly be chaotic. This exercise demonstrates why standards are critical if we are to achieve consistent, trustworthy images worldwide.

©2008 Scott Geffert www.imagingetc.com

This document is an effort to begin a dialog between the imaging community, the computer industry, manufacturers and software developers. Our goal is to encourage everyone involved to critically evaluate current best practices, and to explore ways we can improve the experience for users worldwide by moving quickly towards emerging OPEN IMAGING STANDARDS and best practices.

While the content is technical in nature, I have tried to incorporate as many practical examples and real-world scenarios as possible to illustrate the most important issues. I have also boldfaced statements that are important to discuss further. The document has been organized around the following outline:

History: A look back at the early days of digital imaging to help underscore the fact that digital imaging is a relatively new technology still experiencing growing pains.

Color Management - the early years: ICC (International Color Consortium) color workflow has provided the basis for controlling the digital imaging process. This section looks back at how color management crossed paths with digital imaging.

The LAB color model and how it relates to photographic exposure: Illustrations regarding the historically close relationship between the LAB color model, human vision and photographic exposure. Here you will begin to see where the existing imaging standards are illogical and why the emerging ISO (International Organization for Standardization) standards are so important as we strive to refine the process.

2007 and beyond: standards evolve and imaging matures: A discussion regarding the move towards ISO standards, the L* function for working spaces, and universal display calibration standards.

A proposed enhancement for ISO consideration: The ISO imaging standard does not currently incorporate a wide gamut L* working space. This section discusses the possible benefits of considering a wide gamut addendum to the proposed standards.

Evaluation and testing of RGB working spaces for ISO standard printing: A detailed summary of recent capture-to-print testing for the Rijksmuseum, Amsterdam, using new captures of artworks from collections around the world. This first-ever test of this scope compares the effect of the RGB working space from calibrated capture to the printing press.

Conclusions and Recommendations: Discussions regarding the testing and how the community can get involved.

Addendum 1 - 4

Acknowledgments: Special thanks to all of the people that made this research possible.

History:

It is important to understand that digital imaging is a relatively new technology and is only a part of a larger trend towards overall digital convergence. As new technologies hit different industries at different times, there is a ripple effect that occurs as people adapt to sometimes dramatic changes in tools and best practices. Unfortunately, during these technological upheavals it is very easy for technology to run ahead of itself to a point of diminishing returns. As camera, computer and software developers compete for the more lucrative consumer marketplace, we are beginning to see an alarming trend away from objective tools and standardization.

In order to gain a proper perspective on the scope of this problem, it is important to take a look back at how digital imaging gained popularity in the late 1980s. While CCD imagers became available for flatbed scanners and video cameras, the personal computer revolution was well underway. It was just a matter of time before these various technologies would cross paths. At the time of this convergence, the most powerful desktop computers were unable to display more than 256 shades of color.

The predominant computer platforms of the time were computers running the Microsoft® Windows™ and Apple® Macintosh™ operating systems. As computer manufacturers looked to display more tones on CRT displays, standards were adopted from the television industry. The Windows™ platform adopted a display gamma function of 2.2, prevalent for televisions at the time, which was well suited to making bar charts look rich and saturated on small color gamut CRT displays. Apple®, on the other hand, adopted a display gamma function of 1.8 in an attempt to drive the display closer to the gamma of printed material. This decision is probably why Apple® Macintosh™ computers became so prevalent in print production. In some ways this became an early, yet passive, form of color management; if you used a Macintosh™ computer, by default you were working in the color space of the Sony® Trinitron™ display (Apple RGB: 6500K white point, 1.8 gamma).

At this stage of development, around 1988, digital imaging, outside of the military and high-end turnkey prepress systems, was a curiosity for most people. With the introduction of Adobe® Photoshop™ v1.0 in 1990, digital imaging became a reality. When Adobe® Photoshop™ was created, it simply deferred to the computer display parameters as a basis for the dynamics of the file. Essentially, if your display was dark, you would lighten the file tonal value numbers to compensate. If your display was too blue, you would adjust the file to add yellow to compensate. As digital printers became available, people would adjust the settings in the printer software to manually compensate for color and density shifts of the output. With careful attention, users could achieve a very high quality and predictable result in this “closed loop” color workflow.

Color Management - the early years:

The problem with a closed loop color workflow is that if you took a file out of one system and into another, the process would break down completely as the closed loop would be broken. To address this problem, Adobe® and others began to ship utilities such as Adobe® Gamma. This software tool allowed you to visually calibrate your computer display in an attempt to normalize the viewing environment. While this was a step in the right direction (users utilized the tool to visually match print output, creating a 1:1 relationship in the workflow), ultimately this approach did little to solve the problem of sharing files. Up until this point in time, Apple® had dominated the imaging market due to its closed design, but Apple®, through a series of missteps, began to falter.

The Adobe Gamma™ utility

In order to help open up the Windows™ platform to imaging, Adobe® needed to come up with a method for managing the very undefined nature of the Windows™ computing platform, where manufacturers and users would literally cobble systems together from any and all types of displays and display cards. In 1998, Adobe® adopted the ICC (International Color Consortium) color-managed workflow method.

Apple Computer®, Adobe®, Agfa®, Eastman Kodak®, Linotype-Hell®, and others created a cross-platform standard for device profiling in 1993. This group became known as the ICC (International Color Consortium). Devices such as scanners, digital cameras, monitors and output devices could be characterized through the creation of ICC profiles. These small files contain numeric data describing the characteristics of these devices, and can be used throughout a Color Management System (CMS), which is based on characterizing each device in the workflow and editing in a device-independent “RGB working color space”. ICC profiles describe the color characteristics of a particular device, defining a mapping between the source or target color space and a profile connection space (PCS). This PCS is generally the LAB color space (defined in the next section).

Conceptually, this model makes perfect sense, as it liberates the file from the platform, but it was not without flaws as implemented in 1998.
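The profile connection idea can be sketched in a few lines of code. This is a toy illustration only, not a real color management module: the function names are invented, and the "profiles" are stand-in formulas (a rough luminance weighting on the input side, an assumed 1.8-gamma response on the output side) rather than measured device characterizations.

```python
# Toy sketch of the ICC profile-connection idea (not a real CMM):
# a source profile maps device values into a device-independent PCS
# (here a placeholder LAB triple), and a destination profile maps
# PCS values out to the target device. Names are illustrative.

def scanner_to_pcs(rgb):
    """Hypothetical source profile: device RGB -> LAB (PCS)."""
    r, g, b = (c / 255.0 for c in rgb)
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b      # rough relative luminance
    lightness = 116 * y ** (1 / 3) - 16 if y > 0.008856 else 903.3 * y
    return (lightness, 0.0, 0.0)                   # neutral toy patch

def pcs_to_printer(lab):
    """Hypothetical destination profile: LAB (PCS) -> printer code value."""
    lightness = lab[0]
    y = ((lightness + 16) / 116) ** 3 if lightness > 8 else lightness / 903.3
    return round(255 * y ** (1 / 1.8))             # assumed 1.8-gamma device

# Any source can reach any destination through the shared PCS:
printer_value = pcs_to_printer(scanner_to_pcs((118, 118, 118)))
```

The point of the sketch is structural: because both ends meet in the same device-independent PCS, any characterized source can be connected to any characterized destination without a purpose-built closed loop between the two.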

It is around this time frame that museums and libraries around the world began adopting digital photography in significant numbers. After almost a decade of digital imaging, museums and other cultural institutions have still done a poor job of adopting universal standards. The ICC workflow method was adopted by the ISO standards organization in December 2005 as ISO 15076-1:2005.

In the ICC color-managed workflow, color transformations are applied to image data based on unique device characterizations. The working color space is supposed to be a device-independent definition of RGB parameters.

When Adobe® shipped Photoshop™ v5.0, users experienced terrible incompatibilities with legacy files. Images that printed perfectly prior to updating Photoshop™ would fail miserably afterward. At the time, most users simply shut off the color management options altogether. If you look back at articles from this time period, you will understand just how difficult this transition was for users. An article written by a publishing expert, Don Hutcheson, sums up the challenges users faced during this period (http://findarticles.com/p/articles/mi_m3065/is_n12_v27/ai_21095562). To help users get a handle on the color workflow, Adobe® eventually decided to brand their own working space. The Adobe RGB (1998) working space was adapted from the film and television standard SMPTE 240M. At the same time, Microsoft®, Hewlett Packard® and other companies in Silicon Valley began to push the sRGB (standardized RGB) color space as an environment describing the average computer display. Both of these environments were based upon 2.2 gamma.

In my opinion, it is at this stage that the industry lost its way regarding digital imaging standards. Confusion over working spaces spawned a cottage industry of alphabet-soup, poorly documented working color spaces that continues to this day. Bruce RGB, ColorMatch RGB, DonRGB, ProPhoto RGB, Apple RGB and sRGB all became available at this time, and all include some form of legacy gamma transformation function in their descriptions. Gamma transformation functions incorporated into popular working color spaces and display calibrations are at the core of the current instability in digital imaging practices. Ideally, a working color space should be truly device independent. Unfortunately, this is not the case with current working space options.

The LAB color model and photographic exposure:

It is important to take a closer look at the underpinnings of digital imaging in the color-managed workflow to gain a better understanding of the problem with gamma transformations and photographic exposure. The core of the ICC color-managed workflow is the LAB color model. LAB is a method of describing color in a three-dimensional mathematical model. The original LAB model was, surprisingly, created by a painter in 1905 to better describe the relationships of paint colors. One interesting aspect of the LAB color model is that it has survived the test of time with only minor modifications, and is universally accepted across different industries as the best model to describe the colors that humans can experience. A critical foundation of the LAB color model is the luminance axis (L*). The model breaks down luminance (or brightness) into 100 steps from black to white, placing middle gray at a value of 50.

The model incorporates the fact that humans do not see light energy in a linear fashion. The human eye responds to relative luminance differences.

Stated differently, lightness perception is roughly logarithmic. You can detect an intensity difference between two patches when the ratio of their intensities differs by more than about one percent.

“That's why we think of exposure in terms of zones, f-stops, or EV (exposure value), where a change of one unit corresponds to halving or doubling the exposure. The eye's relative sensitivity is expressed by the Weber-Fechner law, ΔL ≈ 0.01 L, or ΔL/L ≈ 0.01, where ΔL is the smallest luminance difference the eye can distinguish.”

(See the Imatest Stepchart documentation for more on this topic.)

For example: an object with 50% physical reflectance is perceived by humans as much lighter than it actually is. This is why, years before digital imaging came along, photographers relied on 18% gray cards to control exposure. If you measure an 18% gray card with a modern spectrophotometer, you will read a 50,0,0 LAB value. In fact, if you measure any photographic or graphic arts target, the middle gray value will always be at or near the 50 luminance value.
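The relationship between physical reflectance and perceived lightness described above can be checked directly with the standard CIE 1976 L* formula. The following sketch computes L* from relative luminance; note how the 18% gray card lands almost exactly at the middle of the 0-100 scale:

```python
def luminance_to_Lstar(Y):
    """CIE 1976 L* from relative luminance Y (0.0-1.0)."""
    if Y > 0.008856:                       # (6/29)**3, the CIE threshold
        return 116.0 * Y ** (1.0 / 3.0) - 16.0
    return 903.3 * Y                       # linear toe for very dark values

# An 18% gray card lands at (almost exactly) the middle of the L* scale:
print(round(luminance_to_Lstar(0.18), 1))   # ~49.5
```

The cube-root term is the formal expression of the non-linear response discussed above: doubling the physical luminance does not double the perceived lightness.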

The bottom line is that the non-linear response of human vision is built into the LAB model; it is referred to as L-star, or L*. Photographic exposure methods were well established prior to the adoption of digital imaging and are based on this same model.

The above illustrates how the same luminance values (in blue) translate to common RGB working spaces. Notice the wide range of “acceptable” translations.

Once you gain an understanding of the LAB model and accept the fact that traditional photographic exposure methods are built upon the same premise, you begin to understand why it has become so difficult for people to arrive at standard practices for digital imaging. The computer industry accidentally broke a model that had served photography well for many years. Instead of adopting a truly device-independent RGB working space for media-agnostic imaging, the computer industry adopted an output-centric implementation of the RGB working space. In short, when you are using Adobe RGB (1998) and a 2.2 gamma display calibration, you are essentially editing your images to the fingerprint of a 1970’s CRT television set. The reason this model has worked as well as it has for the past decade is that the ICC workflow compensates for these unnecessary gamma transformations. The problem this situation poses for museum imaging is that the working color space chosen by the user directly impacts the exposure values of the camera or scanner being used to capture images of artworks; even the much-touted RAW capture data is affected. As long as workflow gamma environments are based upon 1.8 or 2.2 gamma transformation functions, there will be instability in museum digital collections worldwide, as these hard-wired, output-centric gamma gradations bias the capture and editing RGB values that provide the numeric basis for proper exposure.

The following graphs illustrate the life of a digital file from capture to print.

The first graph represents taking a 21 step LAB scale, converting it to popular RGB working spaces, and reading the RGB values from the Photoshop™ color measure tool. Notice how Adobe RGB (1998), ProPhoto RGB and sRGB distort the tonal relationships. The magenta line indicates the original source values.

Graph 1: Distortions in RGB Values (LAB to RGB Working Spaces)
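The distortions in Graph 1 can be reproduced numerically for the middle gray patch alone. The sketch below uses idealized pure-gamma encodings as stand-ins for the working-space tone curves (sRGB's true curve is piecewise, but close to 2.2 overall), and shows how the same L* = 50 value lands on noticeably different 8-bit code values:

```python
# How the same L* = 50 middle gray lands on different 8-bit values
# depending on the working space's tone curve. The pure-gamma
# encodings below are simplified stand-ins for the real curves.

Y = ((50 + 16) / 116) ** 3               # relative luminance of L* = 50 (~0.184)

gamma_22 = round(255 * Y ** (1 / 2.2))   # Adobe RGB (1998)-style encoding
gamma_18 = round(255 * Y ** (1 / 1.8))   # ProPhoto RGB-style encoding
lstar    = round(255 * 50 / 100)         # L*-encoded space (eciRGBv2-style)

print(gamma_22, gamma_18, lstar)         # 118 100 128
```

Only the L*-encoded space keeps middle gray at the midpoint of the code-value range; the gamma-based spaces shift the same perceptual tone to different numbers, which is exactly the "wide range of acceptable translations" the graphs illustrate.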

The second graph represents the measured luminance values of Epson® prints of the 21 step scale from each working space with no color management. The third graph represents the measured luminance values of Epson® prints created with a custom ICC profile. This test, which anyone can perform, clearly illustrates that the ICC profile will always drive the tonal values back to the perceptually linear state.

Graph 2: The gamma distortions carry through to the un-calibrated output. Graph 3: The ICC printer profile compensates for the distortions, but valuable data is lost during the transformation.

The problem with legacy gamma in working spaces is compounded when digital camera manufacturers build RGB histograms and readouts toward Adobe RGB (1998) or sRGB values by default, as opposed to LAB capture values. Without knowing it, users worldwide are building inconsistencies into their stored data. Think of this as capturing all of your images with up to a -1/3 stop exposure error. Most ICC users have not spent time focusing on input issues, as color management was developed first and foremost as a method for controlling output. Few people truly understand measured digital capture.

Does your camera histogram display capture or output values? If it displays output values, what are they based on?

If digital camera and RAW processing software is designed properly, the user should be able to determine the exposure using LAB capture values independent of the desired working (destination) color space. In this manner, the RAW data is always correct to the target values, and more importantly, the exposure of the RAW data will always be consistent. If the user wishes to read RGB values based on the selected destination (working) color space, this should also be possible. For example: when capturing with a Leaf digital camera, capture histograms and values are presented in the LAB capture space. If the user selects an RGB destination color space, the correct RGB values are presented. This approach is easy for users to understand and it is technically an ideal model for all camera and RAW processing software developers to follow. Currently most digital camera software applications hide the source capture values from the user, or they present undefined RGB values that make it difficult for users to know if the exposure readouts are based on capture or output values.

This screen shot indicates a perfect exposure using Leaf® Capture™ software. Notice how the Luminance value of the chart’s middle gray patch reads 50L, and the histogram lines up.
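The exposure check described above can be expressed as a small routine: read a captured gray patch in LAB values and compare it against the chart's published target, independent of any destination working space. This is a sketch of the idea only; the function names and the 1 L* tolerance are illustrative, not any vendor's actual API or specification.

```python
# Sketch of the LAB-based exposure check: compare a captured gray
# patch against the chart's published target L*, independent of the
# destination working space. Tolerance and names are illustrative.

def Lstar_from_linear(Y):
    """CIE 1976 L* from relative linear luminance Y (0.0-1.0)."""
    return 116 * Y ** (1 / 3) - 16 if Y > 0.008856 else 903.3 * Y

def check_exposure(patch_linear_Y, target_Lstar=50.0, tolerance=1.0):
    """Return (measured L*, within-tolerance flag) for a gray patch."""
    measured = Lstar_from_linear(patch_linear_Y)
    return measured, abs(measured - target_Lstar) <= tolerance

measured, ok = check_exposure(0.184)     # raw linear luminance of the patch
print(round(measured, 1), ok)            # ~50.0 True when exposure is on target
```

Because the comparison happens in LAB capture values, the same check yields the same verdict regardless of which RGB working space the file is later exported to, which is the consistency argument made above.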

Recently, color experts around the world have started to take this matter seriously. Last year the ECI (European Color Initiative) adopted a working color space incorporating the L* (perceptually linear) function. This working color space is similar in scope to the Adobe RGB (1998) gamut and follows the exact tone curve of the LAB model, and thus human visual response. At the same time, software tools from a wide range of display manufacturers and profiling-tool vendors have incorporated the L* calibration model. The eciRGBv2 working space has recently been ratified by the ISO standards commission (ISO 22028). While Adobe RGB may also be included in the ISO standards, Adobe® wishes to keep control of Adobe RGB, so the working space may have to be published as a technical specification instead of an international standard.

The graph indicates that the L* function of the eciRGBv2 working color space maintains a perfect 1:1 relationship from the source LAB luminance values to RGB values. Other common working spaces alter the tonal relationships. (L* shown in Magenta)

In addition to the L* function, the D50 white point of eciRGBv2 (indicated in white) is in line with the LAB model, in contrast to Adobe RGB (1998) (indicated in red). In this color gamut graph, the “+” marks indicate the white points of the working color spaces. Notice how in the Adobe RGB (1998) working space the white point (6500K) is located far off-center.

2007 and beyond: imaging matures and worldwide standards evolve:

With these advances in ISO standardization, for the first time in the history of digital imaging, users worldwide can agree on the tonal relationships and exposure methods required for high quality capture and reproduction. Every manufacturer will be able to offer the same histogram readouts in a common environment, and images edited to human perception can be published to any media with the benefit of ICC profiled devices. Some argue that moving towards the L* workflow would repeat the perceived shortcomings of the sRGB (standardized RGB) environment. An Adobe engineer once stated: “They already tried that with sRGB and it caused more problems than it solved. It's kind of like saying world peace could be achieved if everyone spoke Latin.” Actually, I quite disagree with this viewpoint. Moving towards the L* workflow moves closer to the ideal of a truly device-independent RGB working color space, enhancing the ICC workflow by eliminating unnecessary data transformations and ambiguity regarding exposure and tone, while preserving the benefits of the ICC workflow.

To be clear: continuing to create images in working spaces with gamma transformations other than L* will remain viable, as ICC color management will compensate for the differences. The real issue, at this point in the evolution and maturing of the technology, is that the time has come to correct this legacy shortcoming and begin taking full advantage of what has improved over the past decade. To be fully successful in cross-media publishing and automated data-driven workflows, the community and industry need to universally agree on the foundations of a digital image file and how it is displayed. By maintaining a 1:1 relationship with captured tones throughout the entire capture and edit workflow chain, photographers, print vendors, designers, and labs can all agree on the tonal basis of a digital image file. If you think the current status quo is acceptable, try taking the same image file to ten different photo labs; the results will vary greatly, to say the least. The core reason for this is that no one can agree on what a digital file is. The industry has done little to resolve the problem. ISO standards worked for film and they can work for digital imaging as well.

Standardized display calibration:

When it comes to displaying digital images, it is no longer acceptable for Microsoft® Windows™ and Apple® Macintosh™ computers to differ in their display characteristics. Now that the majority of display calibration tools offer the L* calibration function, all displays can be calibrated to the same exact standards. For the first time since the mass adoption of digital imaging, the community can move towards true universal and open standards driven by sound science, not the whims of the computer industry. Ideally, if enough end users become empowered with this knowledge, computer manufacturers will begin to respond by using L* calibration by default. We do not expect this to happen overnight, but if digital imaging is to mature, and cultural institutions truly wish to create universal assets that will stand the test of time, now is the time to begin adopting universal standards.

A proposed enhancement for ISO consideration:

The one possible caveat to the ISO standard regards the use of wide gamut working color spaces such as ProPhoto RGB. ProPhoto RGB is currently an ideal capture, edit, and archive space for 16-bit “master” digital captures and scans. As ProPhoto RGB was based on the limits of human vision, as opposed to the limits of printers and displays, this environment has certain advantages. Unfortunately, even though ProPhoto RGB was originally engineered to utilize the L* gamma function, it was ultimately put forth with a 1.8 gamma gradation. ProPhoto RGB continues to be used by many cultural institutions and growing numbers of commercial photographers. We have started testing a very straightforward solution for bringing ProPhoto RGB forward by creating a version (we call it ProStarRGB) that is ProPhoto RGB with the L* gamma function.

When you consider that a consumer-level inkjet printer like the Epson SPR2400, using glossy paper, has a wider color gamut than both Adobe RGB (1998) and eciRGBv2, you begin to understand why we feel that wide gamut working spaces are important for digital preservation. While we cannot predict the future, it is not difficult to envision a printer five years from now that will print a significantly wider color gamut than today’s devices. Wide gamut working spaces are one way to “future proof” today’s captures.

Diagram: Red = Adobe RGB (1998); White = eciRGB; Yellow = Epson SPR2400 glossy paper; Blue = ProPhoto RGB / ProStarRGB

This minor change would allow for the following workflow scenario:

Capture and edit “master” files in ProStarRGB on L* calibrated displays using measured photography methods and custom ICC profiles. These “master” images would be stored and could be converted on the fly via a digital asset management (DAM) system or non-destructive image editing applications to any destination working space or printer space for true cross-media publishing. Think of this as Camera Raw™ for high bit depth, wide color gamut TIFF files.

We feel strongly that working color space considerations are critical for all imaging users, thus we are sharing the ProStarRGB specification as a fully open source option, and we are hoping that it can be included in the ISO 22028 definition as a complement to the eciRGBv2 definition. As a first step towards this process we have included the ProStarRGB working space in recent digital capture-to-print tests, and encourage others to test and discuss this option. We do not encourage the spread of ill-documented working space environments. Our goal is to ensure that possible candidates are properly tested and scrutinized before being considered viable standards.

Evaluation and testing of RGB working spaces for ISO standard printing:

While discussions related to color management often seem far removed from the day-to-day process of imaging, we have always found it useful to incorporate real-world examples and testing to validate the science. Recently, as part of an evaluation of ISO printing standards for the Rijksmuseum in Amsterdam, The Netherlands, we set out to test the effect of current and proposed calibration methods and RGB working spaces on a very carefully maintained press, verified to be well within the ISO 12647-2 international printing standard. The primary goal of this test was to verify the emerging standards prior to locking down a major revision of the museum’s photography program processes and the implementation of a museum-wide DAM system with integrated ICC conversion capabilities.

To evaluate the standards across a wide range of challenging source material we contacted the following museums to participate in the test in addition to the material from the Rijksmuseum (captured using the Leaf Aptus 75):

Metropolitan Museum of Art, New York (Leaf Aptus 75)
Solomon R. Guggenheim Museum, New York (Sinar 54H)
Yale Center for British Art, Connecticut (Hasselblad 39MS)
Victoria and Albert Museum, London (Sinar 54H)

We asked each museum to capture a new image by carefully exposing the camera to chart values and to provide the RAW camera files of the GretagMacbeth® (X-Rite®) DCSG chart, as well as a painting captured under the same conditions. We specifically requested challenging artworks with wide dynamic range.

The Kitchen Maid c.1658 Johannes Vermeer oil on canvas 45,5 x 41 cm SK-A-2344 The Rijksmuseum, Amsterdam

Once the images were received, we opened them in the respective capture applications, verified the exposure values, and created custom ICC profiles. The custom profiles were applied to the painting captures with no subjective editing. The images were then exported from the capture applications to the following working color spaces:

Adobe RGB (1998) (2.2 gamma)
eciRGBv2 (L* gamma)
ProPhoto RGB (1.8 gamma)
ProStarRGB (ProPhoto RGB modified to L* gamma)

Files were laid out into four separate test forms based on each working color space. It is important to note that each test form was created from unique exports of the files from the respective capture applications, not simply profile conversions in Photoshop™. The test targets require some explanation as they are unique to this test.

We were looking to explore the effect of the color working space on ICC calibrated RGB TIFF files, thus targets were created by translating the actual measured values of the GretagMacbeth® Color Checker™ (24 patch) as well as the newer GretagMacbeth® Digital Color Checker™ SG to the four different working spaces. In addition, we created a 100 step LAB grayscale chart to test linearity. This LAB grayscale chart was converted to each of the four RGB working spaces. These targets are not normally utilized for evaluating output. Our goal was to follow the life of the digital capture from working space to output to determine if we could maintain a 1:1 tonal relationship from capture to print.

100 step LAB grayscale:

This chart begins as a 100 step LAB grayscale file created using Adobe® Photoshop™. The file is then converted to each RGB working space.

When output in print, if the process is successful, you should be able to measure the LAB values using a spectrophotometer and the numbers should match the original values. More importantly, if the numbers do not match, this chart will help identify specific problems in the reproduction process.
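The construction of such a grayscale target can be sketched numerically. The snippet below builds a 0-100 L* ramp and encodes it into 8-bit code values two ways: an L*-encoded space (where the mapping is 1:1 by construction) and an idealized pure-gamma-2.2 space as a stand-in for an Adobe RGB (1998)-style curve. The encodings are simplified illustrations, not the exact working-space definitions.

```python
# Sketch of building the grayscale target: a 100 step (0-100) L* scale
# encoded into a working space's 8-bit code values. In an L*-encoded
# space the mapping is 1:1 by construction; a pure-gamma space bends
# the same steps onto different code values. Encodings are idealized.

def linear_Y(Lstar):
    """CIE inverse: relative luminance from L*."""
    return ((Lstar + 16) / 116) ** 3 if Lstar > 8 else Lstar / 903.3

def encode(Lstar, gamma=None):
    """8-bit code value: L* encoding if gamma is None, else pure gamma."""
    if gamma is None:
        return round(255 * Lstar / 100)
    return round(255 * linear_Y(Lstar) ** (1 / gamma))

steps = range(101)                                   # L* = 0, 1, ..., 100
lstar_scale = [encode(L) for L in steps]             # evenly spaced ramp
g22_scale   = [encode(L, 2.2) for L in steps]        # gamma-2.2 ramp
```

Printing the two lists side by side shows the same perceptual steps landing on different code values, which is why the chart can localize where in the chain a tonal error was introduced.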

LAB to RGB Q-14 grayscale values overlay:

This chart begins as a LAB luminance value scale based upon averaged measurements of the standard Kodak® Q-14 grayscale. The LAB file is then converted in Photoshop™ to the various working color spaces. This chart also includes the various luminance to RGB value translations. The advantage of this chart is that it can be physically photographed and the digital file can be positioned as a layer over a capture of the actual chart as a visual and numeric form of validation. Most importantly, the Q-14 grayscale predates digital imaging and is recognized worldwide.

Published LAB values of the GretagMacbeth® DCSG and Color Checker™ were also translated from LAB to the various RGB working spaces for the test forms.

Once each test form was completed, the layered RGB files were saved, and then files were converted to CMYK via Photoshop™ using the “Coated FOGRA39_GCR_bas.icc” profile. This profile conforms to the ISO 12647-2 international printing standard. The files were proofed on an Epson® 7880 printer using a calibrated Perfect Proof™ RIP. Plates were created and output on a press verified to be well within the ISO 12647-2 international printing standard.

When proofs were evaluated under ISO standard viewing conditions, the visual differences between the results of the four color spaces were minimal. Upon closer inspection, the L* based working spaces appeared to translate with better color separation, and the mid-tones were lighter. It was also immediately clear that all of the images appeared too dense in the upper tones. One might assume that the RGB files were the cause of this problem, but the tone increase was also clearly visible on the 100 step LAB chart.

By measuring a 21 step sampling of the 100 step LAB scale with the GretagMacbeth® EyeOne® spectrophotometer, we were able to identify the exact nature of the density increase (tone value increase, or dot gain) that we were seeing. By entering these output luminance measurements into an Excel spreadsheet, we were able to calculate the LAB difference at each target step.
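The spreadsheet step is simple to reproduce: pair each step's source L* with the spectrophotometer reading of the printed step and take the difference. The printed-L* readings below are hypothetical illustrations, not our measured data.

```python
# Source L* values for the first steps of a 21 step sampling,
# and hypothetical (illustrative, not measured) EyeOne readings of the print.
source_L = [100, 95, 90, 85, 80, 75, 70]
printed_L = [97.0, 90.5, 84.0, 78.5, 73.0, 68.0, 63.5]

# Positive delta = the press printed this step darker than the source
delta_L = [round(s - p, 1) for s, p in zip(source_L, printed_L)]

for s, p, d in zip(source_L, printed_L, delta_L):
    print(f"source L* {s:>5}  printed L* {p:>5}  darkening {d:+}")
```

Graphing `source_L` against `printed_L` gives exactly the kind of chart shown below.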

[Graph: source LAB values compared to output LAB measurements across the 21 sampled steps.]

The numeric differences were then translated to RGB value differences to construct a Photoshop™ tone curve compensation. By using this approach we were able to compensate for the tone value increase (TVI) of the press. Notice that no adjustments were made to the shadow areas (below the point at which the input and output cross over each other, at approximately 40 to 45 LAB). Adding density to these areas of the file would not necessarily result in increased density on the output.
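As a rough sketch of how such a compensation curve can be constructed in an L*-encoded working space (where the 8-bit code value is simply 255 × L*/100, so an L* correction maps directly to a code-value correction). The step data and the crossover value are illustrative assumptions, not our measured press data.

```python
def curve_point(source_L, printed_L, crossover=42.5):
    """Map one measured step to a Photoshop-style curve point (input, output)
    in an L*-encoded space, where code value = 255 * L* / 100.

    Above the crossover, the file is lightened by the amount the press
    darkened that step; below it, no compensation is applied (adding
    density there would not reliably print darker)."""
    code_in = round(255.0 * source_L / 100.0)
    if source_L <= crossover:
        return (code_in, code_in)           # leave shadows untouched
    lift = source_L - printed_L             # measured darkening at this step
    code_out = round(255.0 * min(source_L + lift, 100.0) / 100.0)
    return (code_in, code_out)

# Illustrative (source L*, printed L*) pairs for a few sampled steps
steps = [(90, 84.0), (70, 63.5), (50, 45.0), (40, 36.0)]
points = [curve_point(s, p) for s, p in steps]
print(points)
```

The resulting (input, output) pairs are the control points one would enter into a Photoshop™ curve; note the last step sits below the crossover and is passed through unchanged.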

The tone curve adjustment was applied to each painting image, in each test form, and it was also applied to one of the two 100 step LAB scales. The second 100 step LAB scale was left as a control to measure the before and after effects of the curve adjustment.

The test forms with the tone compensation curve applied were output on the press and the results were improved. While this method was useful for our tests, it is not an especially practical method of dealing with the printing process. It is very difficult for print shops to maintain tight tolerances for tone value increase. There are so many factors from plate to press to paper stock that make this a moving target. However, in the case of our tests, you can see that the Photoshop™ curve adjustment did in fact correct for the tone value increase, as the following graph illustrates:

[Graph: source LAB values (blue) compared to measurements of a 21 step sampling of the 100 step LAB grayscale before (magenta, Silk Adobe Output LAB) and after (yellow, Silk Adobe Adjusted Output LAB) the TVI compensation curve adjustment.]

It is a bit disturbing that, after all of the efforts of the industry to manage printing presses, the compensations for dot gain are not fully automatic, and input does not automatically equal output. Solving this problem is clearly outside the scope of this document and the testing connected to it. What is clear is that a perfectly accurate image file does not always translate to press properly. We encourage everyone to include the 100 step LAB target on press runs to try to get a handle on exactly what is happening with the tones in their image files. You will find that the tone value increase will vary wildly depending on the press and paper stock. The good news is that if you own a spectrophotometer you can enter your measurements in a spreadsheet and quickly graph the results. While these measurements will not directly solve your printing problems, they will allow you to understand the nature of the problems.

Conclusions:

The move towards L* gamma for displays and working space has proven to be completely viable for museum capture-to-print applications. We have also found that files created in this environment display more accurately in non-ICC compliant applications such as DAM systems and web browsers, as the RGB data is inherently perceptually linear.

Visual comparison of press proofs indicates that even though the results are all acceptable, the eciRGBv2 images are more open in the midtones, and the color separation is more distinct. We also found that the ProStarRGB working space is fully viable as an archive and editing environment. Further testing and consideration of ProStarRGB by the standards organizations, as a possible wide gamut complement to eciRGBv2, would be advisable.

The testing indicates that the current ISO printing standards are simply too broad for museum work. While the ISO standard allows for overall color shifts of over 5 delta E, for museum work, especially paintings, this is simply too coarse. More importantly, tone value increase and the spread between CMYK values all have to be monitored carefully, as color and density work hand in hand. During print testing we also found that variances in the color of the black ink directly impacted the quality of the printed work. All of the ISO specifications need to be adhered to and, frankly, exceeded in order to be successful.

As part of the testing for the Rijksmuseum, a document outlining a set of specifications has been drawn up. These specifications are all within the ISO standard, but specify tighter tolerances across the board. These tolerances were tested and verified across multiple press runs to prove that typical printers could realistically achieve the desired results. With these clear, objective parameters in place, print jobs can be more predictable.

Test forms and suggested parameters will be distributed to print vendors and the results will be evaluated against the control proof, press outputs and the original artworks. As measurements are collected across a larger sampling we can begin to explore ways to better compensate for press related loss of image quality.

For the time being, it is suggested that original calibrated captures are archived and if adjustments are necessary, that either a derivative file or adjustment layer is created and edited to compensate for the press. We look forward to improvements in Adobe® Photoshop™ software that would allow for non-destructive editing and an editing history that can be saved as metadata (similar to RAW file processors). This would allow for editing adjustments to be saved without the file size overhead of adjustment layers. In the longer-term we look forward to resolving the core print-related issues.

There have been several other interesting findings as a result of this testing:

• Three-dimensional object photography and two-dimensional works on paper reproduced quite well using standard ICC camera calibration and exposure methods. Paintings, especially predominantly dark 17th century works, suffer from the combined effects of tonal compression and press dot gain (tone value increase, or TVI).

• It is clear that permanently altering the data of a color-managed painting capture file is not advisable. Adjustment layers, or separate saved versions are the only logical solutions to improve the appearance of paintings that do not translate well to press. Unfortunately, many museum employees spend hours perfecting images; in most cases these edits are subjectively performed over and over as the image files move through various production cycles. This workflow only leads to collections of images that bear little resemblance to the original artworks in terms of color and density. In the long term, these adjustments hopefully can be better defined, and fully automated via DAM driven ICC conversions. This assumes that printers can consistently adhere to tighter tolerances.

In the example below, this Vermeer painting from the Rijksmuseum collection was photographed using measured capture techniques. The density and color are so accurate you can capture spectral measurements of the surface of the painting with a spectrophotometer and pick up the same LAB values as the image capture.

The image on the right is a simulation based on the dot gain (TVI) of the press. Note how dark and dull this painting can become. Due to the skill of the artist, we have the impression that the woman's bonnet is white, but in fact it is very dark relative to pure white. At the same time, the dark areas of the painting are much darker than the press can reproduce. The midtones are actually the only portion of the painting that survives with a measurable 1:1 relationship. If the file is subjectively adjusted to make the bonnet lighter, it looks washed out and loses its subtle color.

The Kitchen Maid, c. 1658, Johannes Vermeer, Oil on canvas, 45,5 x 41 cm, SK-A-2344, The Rijksmuseum, Amsterdam

• We found that all too often people point to capture as the cause of undesirable output. Most users simply do not trust, or fully understand, the process. In most cases, subjective editing only complicates the problem. Many assume that the press is the gold standard, when in fact it is the least stable device in the print production chain.

• We found that printers have had a very difficult time with the very recent transition to ink jet proofing from traditional (and much more stable) proofing technologies. The testing indicates that with very careful measurement and calibration, ink jet proofers can provide an excellent press match, but the manufacturers’ default settings can lead to judging a false reality. The maintenance of the proofer becomes critically important.

• Print vendors seem to have abandoned the idea of creating ICC profiles of presses, as they are simply too unstable. The emphasis has shifted to matching the ink jet proofer calibrated to ISO standards. Unfortunately, many printers have had trouble maintaining the proofing process. This may help explain why so many are experiencing print problems.

• We found that L* display calibration using a D50 white point and a luminance of 160 cd/m² provided a very accurate representation of the proofs and actual printed results under ISO standard viewing conditions. Further testing will be performed at various sites to develop guidelines for managing legacy files as display hardware and standards evolve.

• The most important finding, after over a year of testing, is that for the first time in memory, everyone I speak to about this topic who has working experience with capture, editing, or printing has the same positive reaction when exposed to the idea of L* in display calibration and working spaces. Many admit that they never really understood gamma and why Macs and PCs were different. Most admit that they were advised to use one gamma or another at a trade show, at a workshop, or by a print vendor. Everyone I discuss this topic with enjoys a moment of clarity, but then in many cases they hesitate to change until someone they know decides to make the change. It is this reaction that has driven me to distribute this document and encourage people to take control of their imaging workflow. Since the fall of 2007, users worldwide have started to make the change, thanks to the ECI organization's decision to adopt the L* based eciRGBv2 working space.

Recommendations:

As a result of this testing we feel that any museum that is launching a new digitization effort should seriously consider adopting the eciRGBv2 working space and ISO standards for printing, proofing, and display calibration. We always recommend testing with your own materials and vendors before adopting new practices.

For sites currently utilizing Adobe RGB (1998), we recommend switching to eciRGBv2 and exploring ISO standards for print, proofing, and display calibration. It is not recommended to convert your existing Adobe RGB (1998) legacy files to eciRGBv2; a proper color-managed workflow will compensate for the differences. Again, we recommend testing with your own materials and vendors. We suggest that test assets be created in both environments and the results evaluated alongside your current workflow to identify any potential issues.

For sites currently utilizing the ProPhoto RGB working space and 1.8 gamma display calibration, switching to eciRGBv2 will not provide the wide gamut you are accustomed to. Print results indicate that the visual differences on output are minor. For the time being, it may be best to continue as is until further testing of the ProStarRGB working space, and possible acceptance by standards organizations, takes place. Technically there is no reason that ProStarRGB and L* display calibration could not be used today, but to be very conservative, we suggest that more users test and share the results of this environment.

It is interesting to note that Adobe® Camera RAW and Adobe® Lightroom™ both appear to utilize a linearized version of ProPhoto RGB as their internal working space, thus there is already a case to be made for incorporating these settings into the international standards for high bit depth “master” image files and RAW files.

Note: For those concerned with delivering files to outside vendors that request specific color spaces rest assured that performing a profile conversion from the linear space to a 1.8 or 2.2 gamma space will actually result in a cleaner transformation. This is due to the fact that the transformation occurs from a perceptually linear state. If your DAM system offers conversion on the fly, assets can be translated to any destination upon request by the user eliminating problems with derivatives and file versioning.
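The per-channel tone-curve portion of such a conversion can be sketched as follows; a real ICC conversion also handles primaries and gamut, which this simplification ignores. Decoding the perceptually linear L* code to luminance and re-encoding with a simple 2.2 gamma is a smooth, well-conditioned mapping:

```python
def lstar_code_to_Y(code):
    """Decode an 8-bit value from an L*-encoded space to relative luminance."""
    L = 100.0 * code / 255.0
    return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3

def Y_to_gamma22_code(Y):
    """Re-encode luminance with a simple 2.2 gamma (Adobe RGB-style curve)."""
    return round(255.0 * Y ** (1.0 / 2.2))

# Code 128 (~L* 50) re-encodes to ~119 under gamma 2.2
# (compare 118 for exact L* 50); white maps to white.
for code in (64, 128, 192, 255):
    print(code, "->", Y_to_gamma22_code(lstar_code_to_Y(code)))
```

Because every step starts from a perceptually even code distribution, tones are not bunched before the gamma re-encoding, which is the intuition behind the "cleaner transformation" claim above.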

What can manufacturers do to help?

• It is time that camera manufacturers agree on some very basic foundation capabilities and software functionality to help restore faith in digital assets. The current and disturbing trend has been to simply push RAW file formats at the expense of objective numeric capture. While this “feel good” solution is easy to market, and on the surface makes users happy, there are simply no consistent standards to latch onto to build repeatable workflow solutions. Though the process is admirably non-destructive, it is almost guaranteed that users will not be able to output the same image ten years from now as they are able to do today. When it comes to cultural heritage applications, and especially object conservation record photography, where an image is historically significant, the current trend towards subjective process is simply unacceptable.

• Digital cameras and scanners should all have the ability to read out RAW LAB capture values. Histograms should also reflect LAB capture values. Currently, most digital cameras read out and display histograms based on sRGB or Adobe RGB (1998) 2.2 gamma gradations. If RGB readouts are the only readouts, then the user should be able to select the destination color space. These options should be properly documented in user manuals.

• All RAW processing software should support custom ICC color profiles, and ideally, there should be a built-in process to create custom ICC input profiles. The industry is notorious for going against any standards that can make one camera brand match another. This makes perfect sense for the manufacturers, but does little to help the end user.

• All RAW processing software should enable source LAB readouts as well as RGB readouts in the selected destination color space. Some applications have adopted a percentage readout for color picker values. This is acceptable, but only if these readouts correlate to LAB or specific destination RGB readouts. For example: if you open, in Adobe® Lightroom™, a file created in Adobe® Photoshop™ with a properly exposed middle gray value (50 LAB), you would expect a 50% readout, but currently the values do not match up. The middle gray value reads as 46% in Lightroom™. If you adjust the file to read 50% in Lightroom™ and open the same image in Photoshop™, the file will look lighter. To make matters worse, there is no documentation on why this is the case. If software developers are afraid that these technical settings are too advanced for users, they can simply offer an advanced tab so high-end users are not excluded and penalized, as currently is the case in Adobe® Camera RAW™ and Lightroom™. (See Addendum 3: Discrepancies between Adobe® Photoshop™ and Lightroom™)

• It would be incredibly valuable for Adobe® or others to implement a method of non-destructive editing of high bit depth TIFF image files. Currently, users are forced to save separate versions of images, or images with one or multiple adjustment layers that greatly increase file size. A simple curve adjustment layer can currently almost double the file size. RAW processing software instead records the user's adjustment history in image metadata and applies those settings on the fly; bringing this functionality to TIFF files would be a great leap forward.

While these seem like strong demands, they are easily implemented via firmware or software updates. The Leaf® digital cameras already offer a very elegant workflow that allows for this functionality today, and years ago, the Kodak® DCS Pro Back had EV and percentage readouts directly on the camera's display. Leaf® and Sinar® digital cameras offer built-in ICC profiling. LaserSoft Imaging® SilverFast DC Pro™ and scan software also offer built-in profiling. Curiously, the DCS Pro Back software had one of the most elegant custom profiling functions ever, but this camera is no longer on the market.

Apple® Aperture™, Adobe® Lightroom™, and Adobe® Camera RAW™ currently do not offer custom ICC profiling. Instead, each company strives to create preset settings that are hard-wired into the application. In many ways, the user is powerless to use objective capture methods with these tools. Frequent software updates often include reworked camera profiles that can shift image colors without warning. Phase One® Capture One™ and SilverFast DC Pro™ are the only RAW processor applications that allow the user to turn off color presets, create true ICC profiles, and select any output color space.

We would like to see Apple® Aperture™, Adobe® Lightroom™, as well as Hasselblad® Phocus™ software incorporate built-in profiling and the other requested features. Regardless of which working space one chooses to utilize, the software tools should support the flexibility required by professional users.

What can users do to help?

• Stay on top of the technical process, and perform tests until the workflow can be fully documented.

• Share successes and failures with others. You would be surprised to find that in most cases, the problem you may think is isolated to your site is an issue with others worldwide.

• Be vocal with manufacturers and sales representatives; don’t settle for quick fixes and buggy software. The cameras and software you faithfully use are extremely expensive. As a customer you deserve professional support and technical feedback.

• Band together with like users to build common requirements for product purchases. Place feature requirements on purchase orders and RFP’s so the manufacturers know your needs loud and clear at the time of purchase. Post-purchase is not the time to find out a camera does not offer the features you need.

• Build relationships with factory representatives, especially if you do not have a dedicated reseller and your purchasing department uses competitive bidding for hardware purchases.

Addendum 1:

Test form artworks:

1) Old Woman Cutting Her Nails, 1655–60 Style of Rembrandt (Dutch, second or third quarter 17th century) Metropolitan Museum of Art, New York Bequest of Benjamin Altman, 1913 (14.40.609)

2) May Day, William Collins (1788-1847), c. 1811-12, Oil on canvas, 37 x 44 in. (94 x 111.8 cm), Yale Center for British Art, New Haven, Gift of Jean M. Harford, B1997.20

3) Street in Delft, Jan Vermeer, c. 1657-1658, Oil on canvas, 54.3 x 44 cm, Rijksmuseum, Amsterdam, M-SK-A-4995

4) Black Lines, Vasily Kandinsky, December 1913, Oil on canvas, 51 x 51 5/8 inches, Solomon R. Guggenheim Museum, New York, Solomon R. Guggenheim Founding Collection, Gift, Solomon R. Guggenheim, 37.241

5) The Kitchen Maid, Johannes Vermeer, c. 1658, Oil on canvas, 45,5 x 41 cm, Rijksmuseum, Amsterdam, SK-A-2344

6) St. Catherine, Vittore Crivelli (c. 1444 - c. 1501), Italian (Venice), c. 1481, Egg tempera on panel, Victoria and Albert Museum, London, 765A-1865

Tools used for testing:

Software:
Adobe® Photoshop™ CS3
Leaf® Capture™ v11.0.1
Sinar® Captureshop™ 5.5.1
Hasselblad® Flexcolor™ v4.8.1
Microsoft® Excel™

Color Management:
GretagMacbeth® (X-Rite®) Profile Maker 5.5, Measure Tool, EyeOne Share
GretagMacbeth® (X-Rite®) EyeOne Pro and iO automated reader
BasICColor Display (used for ISO L* display calibration)
BasICColor Catch (process control software specifically designed to verify print output)

Targets:
Kodak® Q-14 Grayscale
GretagMacbeth® (X-Rite®) Color Checker (24 patch) and Color Checker DC
Imagingetc 100 Step LAB Grayscale
Imagingetc Q-14 Grayscale LAB Values Overlay
Imagingetc DCSG LAB Values Overlay
Imagingetc Color Checker Values Overlay

Digital Cameras:
Rijksmuseum, Amsterdam (Leaf® Aptus 75)
Metropolitan Museum of Art, New York (Leaf® Aptus 75)
Solomon R. Guggenheim Museum, New York (Sinar® 54H)
Yale Center for British Art, New Haven (Hasselblad® 39MS)
Victoria and Albert Museum, London (Sinar® 54H)

Standards Organizations:
International Organization for Standardization (ISO)
International Color Consortium (ICC)
European Color Initiative (ECI)
Metamorfoze (the Netherlands' national programme for the preservation of paper heritage)

Addendum 3: Discrepancies between Adobe® Photoshop™ and Lightroom™

To illustrate how the current trend in image editing applications can be dangerous for the long-term viability of digital assets at cultural institutions, try this experiment:

Step 1: Create a new image in Adobe® Photoshop™ CS3 in the Adobe RGB (1998) working space. Using the color picker, enter the following LAB values: 50, 0, 0. Select the entire image area and fill it with this value. Note that the RGB values are 118, 118, 118. Save the file in TIFF format (8 or 16 bit).

Step 2: Import the file into Lightroom™. Using the eyedropper, read the RGB values. You will find that the RGB values read 46.6%, 46.6%, 46.6%. Adjust the exposure slider until the RGB values read 50%, 50%, 50% and export the image to Adobe RGB (1998) working space.

Step 3: Open the exported image in Photoshop™ CS3. You will find that the LAB values went from 50, 0, 0 to 54, 0, 0 and the RGB values went from 118, 118, 118 to 127, 127, 127. This means that the same RAW file of a properly exposed target processed in Adobe® Lightroom™ and Adobe® Camera RAW will process differently. The root cause is that Photoshop™ and Lightroom™ are based upon different gamma transformation functions. Today every raw processor will yield different results due to lack of standards.
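The numbers in this experiment can be reproduced with a few lines of arithmetic, under two assumptions: that Adobe RGB (1998) encodes with a pure gamma of 563/256 (~2.2), and that Lightroom's percentage readout applies the sRGB tone curve (a commonly reported explanation, not something documented by Adobe):

```python
def lab_L_to_Y(L):
    """CIE L* to relative luminance."""
    return ((L + 16.0) / 116.0) ** 3 if L > 8.0 else L / 903.3

def adobe_rgb_encode(Y):
    """Adobe RGB (1998) tone curve: pure gamma 563/256 (~2.2), 8-bit."""
    return 255.0 * Y ** (256.0 / 563.0)

def srgb_encode(Y):
    """sRGB tone curve (what Lightroom's percentage readout appears to use)."""
    return 12.92 * Y if Y <= 0.0031308 else 1.055 * Y ** (1.0 / 2.4) - 0.055

Y = lab_L_to_Y(50.0)
print(round(adobe_rgb_encode(Y)))         # Photoshop's Adobe RGB value: 118
print(round(100.0 * srgb_encode(Y), 1))   # Lightroom's readout: 46.6%

# Adjusting the Lightroom readout to 50% and re-encoding in Adobe RGB:
Y_adj = ((0.50 + 0.055) / 1.055) ** 2.4
print(round(adobe_rgb_encode(Y_adj)))     # ~126-127 (Photoshop reads 127)
```

The two tone curves are close but not identical, so the same pixel reads differently in each application; this mismatch is the gamma transformation difference described above.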

Addendum 4: eciRGBv2 and L* display calibration to be included in the Metamorfoze preservation imaging guidelines.

Metamorfoze is the Netherlands' national program for the preservation of paper heritage. The program started in 1997. The Metamorfoze program is a collaborative effort of the Koninklijke Bibliotheek (National Library of the Netherlands) and the Nationaal Archief (National Archives of the Netherlands).

L*, eciRGBv2 and Metamorfoze: What you see is what you get.

Recently, Metamorfoze published a draft version of the Guidelines Preservation Imaging Metamorfoze. Within a few months, version 1.0 will be published. This version will include minor as well as major improvements and additions. It is very probable that this version will contain the advice to switch from monitor gamma 2.2 to L* monitor gamma. Version 1.0 will probably also recommend applying the eciRGBv2 color space instead of Adobe RGB (1998). eciRGBv2 is a D50 color space and is particularly suitable for conversions to the color spaces used in printing. This advice applies to a workflow in which 16 bit RAW files have to be developed into an 8 bit color space.

The essence, the basic principle, of the Metamorfoze guidelines is: everything that is visually perceptible in the original must also be perceptible in the digital derivative, in the same contrast ratio. The derivative must be as good (as is technically possible within a realistic workflow) as the original (paper cultural heritage). Correct exposure (within tolerances of +/- 1/12th f-stop) and correct tonal capture are essential in order to realize this goal. We analyze the tonal capture across the entire range - highlights, middle tones, and the dark parts of the exposure - with the help of a technical target (cf. Guidelines at: www.metamorfoze.nl).

With a monitor gamma of 2.2, correct tonal capture is not possible. Middle gray in the original, patch 7 on the Kodak Gray Scale (optical density 0.75), shifts to patch 6 (optical density 0.65) in the digital derivative image. Everyone understands the importance of preserving and transferring the correct contrast ratio when we are talking about digitizing cultural heritage, such as for example the letters of Vincent van Gogh. After all, a pencil drawing by Van Gogh in a letter must look like a pencil drawing in the digital derivative image and not like a drawing made with a pen or a felt-tip.

L* is a monitor gamma based on the way the human eye experiences contrast. Middle gray in the original remains middle gray in the digital derivative image. Besides this great advantage, L* provides slightly more room in the darker areas, patches 17, 18 and 19 on the Kodak Gray Scale. In other words, with a correct exposure and conversion of a 16 bit RAW file to a specific 8 bit file using the L* monitor gamma, the blacks will not block up so quickly. In the highlights, patches A, 1, 2 and 3, the spacing between patches is slightly smaller.

Working digitally is easier than working analog. When developing film, factors such as temperature, developing time and regeneration play an important role, and these factors are difficult to keep stable. In the digital workflow, specific settings have to be chosen to develop the RAW data, and these can easily be applied over and over again: tonal curve correction as necessary, monitor gamma, color space and, finally, storage format.

The KB and many other cultural heritage institutions are on the verge of preservation imaging. Let’s all make the right decisions now.

Hans van Dormolen
Quality Manager, Metamorfoze
www.metamorfoze.nl / [email protected]
Koninklijke Bibliotheek - National Library of the Netherlands, The Hague

Kodak Gray Scale: optical densities and corresponding values per patch. The values in this table are based on bit depth 8 and monitor gamma 2.2 and L*.

Patch no   Optical density   Pixel value (2.2)   Pixel value (L*)
A          0.05              242                 244
1          0.15              218                 223
2          0.25              196                 203
3          0.35              177                 185
4          0.45              159                 169
5          0.55              143                 153
6          0.65              129                 139
M          0.75              116                 126
8          0.85              105                 113
9          0.95              94                  102
10         1.05              85                  91
11         1.15              77                  82
12         1.25              69                  73
13         1.35              62                  64
14         1.45              56                  56
15         1.55              50                  49
B          1.65              45                  43
17         1.75              41                  36
18         1.85              37                  31
19         1.95              33                  25
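Both pixel-value columns follow directly from the optical densities: reflectance is Y = 10^-D, the gamma 2.2 value is 255 × Y^(1/2.2), and the L* value is 255 × L*(Y)/100. The short sketch below reproduces the published table from the densities alone:

```python
densities = [0.05, 0.15, 0.25, 0.35, 0.45, 0.55, 0.65, 0.75, 0.85, 0.95,
             1.05, 1.15, 1.25, 1.35, 1.45, 1.55, 1.65, 1.75, 1.85, 1.95]
patches = ["A", "1", "2", "3", "4", "5", "6", "M", "8", "9",
           "10", "11", "12", "13", "14", "15", "B", "17", "18", "19"]

def pixel_gamma22(D):
    """8-bit value under monitor gamma 2.2 for optical density D."""
    Y = 10.0 ** -D                          # reflectance from density
    return int(255.0 * Y ** (1.0 / 2.2) + 0.5)

def pixel_lstar(D):
    """8-bit value under L* monitor gamma for optical density D."""
    Y = 10.0 ** -D
    L = 116.0 * Y ** (1.0 / 3.0) - 16.0 if Y > 0.008856 else 903.3 * Y
    return int(255.0 * L / 100.0 + 0.5)

for p, D in zip(patches, densities):
    print(f"{p:>2}  {D:.2f}  {pixel_gamma22(D):>3}  {pixel_lstar(D):>3}")
```

Running this reproduces every row of the table, including middle gray (patch M: 116 under gamma 2.2 versus 126 under L*) and the larger shadow spacing under L* in patches 17-19.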

Endnote:

While it may seem that I have singled out Adobe® as the guilty party in this document, it is important to realize that Adobe® has enabled so much positive change over the years in the way that we communicate visually. I have tried to point out that the problems are simply growing pains of a relatively new way of working. Fortunately, or unfortunately, many traditional media converged through the computer at the same time in the late 1980s. The pace of change and increased competition between manufacturers and software developers, combined with a lack of user education, has allowed the technology to run ahead of itself. This document is an effort for all involved to consider some minor but significant improvements to the current process. We feel strongly that users worldwide will benefit from an increased emphasis on standardization when it comes to digital imaging and print production. Unfortunately, creating change requires that people get directly involved in the discussions and testing required to influence the computer industry.

Computer manufacturers, camera manufacturers, and software developers need to realize that moving towards open imaging standards will not result in lost revenue nor will it impede innovation. In fact the opposite result is more likely, as users will be able to enjoy a more productive and consistent experience, and will be more likely to purchase additional hardware and software.

While I am not an expert in the early history of photography, it is obvious that at some point many years ago photographers became frustrated with the lack of consistency and eventually adopted standardized f-stops, shutter speeds, light meters, photographic targets, ISO film speeds, etc. Clearly, without these foundations photography would not have become a universal medium. What we are experiencing today is a similar need for movement towards solid universal standards.

I can only expect to bring these issues to the surface to help open up the dialog for change by voicing the common concerns of the wide cross section of imaging users that we work with on a regular basis. I hope that Adobe®, Apple®, Microsoft®, the camera manufacturers and others begin to appreciate that it's time to embrace open standards. We are more than happy to help provide support for any organization that wishes to move towards international imaging standards.

Testing and evaluation will continue during 2008, with a focus on running the RGB working space test form in the US using SWOP and GRACoL standards. Special attention will be given to process stability and exploring methods for controlling the effects of tone value increase to improve the accuracy and predictability of artwork reproduction.

Thank you,

Scott Geffert President, Imagingetc Inc. email: [email protected]

Acknowledgments:

We would like to thank Jan Willem Sieburgh, Managing Director, the Rijksmuseum, Amsterdam, for encouraging the RGB working space testing and allowing us to incorporate the works from other museums in the press testing.

We would also like to thank the following people for their support:

Cecile van der Harten, Manager of the Photography Studio, and the entire photography studio staff of The Rijksmuseum, Amsterdam
Rob Hendriks, ICT Manager, the Rijksmuseum, Amsterdam

Barbara Bridgers, Manager, the Studio, and the photo studio staff of the Metropolitan Museum of Art, New York
David Heald, Manager, the Photo Studio, Solomon R Guggenheim Museum, New York
James Stevenson, Manager, the Photograph Studio, Victoria and Albert Museum, London
Melissa Gold Fournier, Associate Museum Registrar, Yale Center for British Art, New Haven

Karl Koch, Color Solutions GmbH
Ben Dekamp, Rene Boshuis, Wim Van Dijk and the crew at Wifac
Hans van Dormolen, Quality Manager, Metamorfoze Research & Development, Koninklijke Bibliotheek - National Library of The Netherlands

We would also like to thank Katrin Eismann and the inaugural Masters of Digital Imaging program participants at the School of Visual Arts in New York. My partner, Howard Goldstein, and I had the privilege of working with this great group of students. Their quest for answers in the Materials and Processes of Digital Imaging course directly led to our development of the verification process used to get to the bottom of working color space and output problems. We also explored the current lack of standards that faces all imaging users. Through SVA, we had the opportunity to share these technical issues with a team of Adobe® engineers, which will directly lead to advancing the state of the technology. Thanks to:

Allen Furbeck, Andria Phillips, Benjamin Bobkoff, Brendan Austin, Christina Tisi-Kramer, David Lehman, Heayeon Yoon, Jun Won Yoh, June Young Lim, Kristy May, Na Yeon Jin, Ned Castle, Nick Himmel, Sean Mcgiver.
