Technical Article

Better automatic white-balanced pictures using high-accuracy color sensors

Dave Moon, ams

Images captured using smartphones today vastly outnumber those captured with all other types of digital cameras, including point-and-shoot cameras and digital single lens reflex (DSLR) cameras dating from the 1990s. The convenience of a smartphone coupled with its image quality makes it such an attractive option that it is rendering dedicated point-and-shoot cameras nearly obsolete.

Despite the sophistication of smartphone camera image processing algorithms, their automatic white-balance (AWB) algorithms can fail spectacularly under certain conditions: they are easily confused by scenes from which they cannot extract a clean white point.

This article discusses the effectiveness of white-balancing technology and how using a high-accuracy color sensor can help a smartphone camera’s AWB algorithm to produce a better picture.

Camera AWB algorithms

Smartphone AWB algorithms have evolved over the years, from the simplest gray-world algorithm, to white-patch algorithms, and now to gamut-mapping algorithms. All three types are prone to fail when a scene is dominated by one prominent color, such as a grass field or a wall painted with a bold color, or in mixed-lighting environments such as an office lit by a combination of natural sunlight and artificial lighting.

AWB algorithms fail in both these scenarios because they need to correct for the light source color, but in these cases they cannot measure the light source directly. Instead, they must rely on certain assumptions about the scene to infer the color of the light sources from a subset of image pixels.

The gray-world algorithm assumes that the average color of the scene is gray, which is not always true; in a scene dominated by one prominent color, the assumption clearly fails. Similarly, there might not be a white patch in the image from which the white-patch algorithm can cleanly extract a white point and function properly. In mixed-lighting environments combining natural sunlight and artificial lighting, all three algorithms can erroneously lock on to the color of one of the two light sources, ignoring the color of the other. The human eye, on the other hand, averages the multiple white points illuminating the scene and arrives at a white-balance compromise.
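As an illustration, the gray-world approach can be sketched in a few lines of Python. This is a simplified, hypothetical implementation rather than the code any particular smartphone pipeline uses; pixel values are assumed to be linear RGB in the range [0, 1].

```python
def gray_world_awb(pixels):
    """Estimate per-channel gains under the gray-world assumption:
    the average color of the scene is achromatic (gray), so each
    channel is scaled so its mean matches the green channel mean.
    `pixels` is a list of (r, g, b) tuples with values in [0, 1]."""
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    # Green is conventionally the reference channel in camera pipelines.
    gains = (avg_g / avg_r, 1.0, avg_g / avg_b)
    # Apply the gains, clipping to the valid range.
    return [(min(1.0, p[0] * gains[0]),
             min(1.0, p[1] * gains[1]),
             min(1.0, p[2] * gains[2])) for p in pixels]
```

On a scene with a genuine color cast this pulls the channel averages together; on a scene legitimately dominated by one color (the failure case described above), it wrongly removes that color.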

XYZ color science for AWB algorithm enhancements

Smartphone OEMs looking to differentiate in a market that has recently slowed down can now offer a new feature called ‘True White’ to set their camera’s picture quality apart. True-white AWB enhancements are now possible because of modern optical filter techniques, the accuracy of which approaches that of the human eye. Color sensors implementing these filter techniques are available at a cost point suitable for consumer electronics applications. These high-accuracy XYZ color sensors are ideally suited for measuring the color temperature in mixed-lighting scenes, and enable electronics manufacturers to deploy camera solutions which benefit from true-white AWB enhancements.

The optical filters are deposited directly on the die of optical sensor products. Traditional RGB color sensors offer CCT measurement accuracy of ±10%. By contrast, the accuracy of on-wafer XYZ color filters is in the range of ±1% to ±5%.

The need for accurate CCT measurement stems from the colorimetric standard established in 1931 and known as the CIE xy color space (see Figure 1).

Fig. 1: The standard CIE chromaticity diagram explained (Image credit: https://www.pinterest.com/pin/538039486704316392/)

Artificial light sources tend to be at warmer color temperatures, with residential lighting tending to be the warmest (2700K – 3100K). Office lighting is typically 3100K – 4500K. Daylight color temperatures range from 6000K at noon to as high as 15000K just before sunrise, or in the shade on a cloudless day.

The CIE chromaticity diagram in Figure 1 captures the human perception of visible light wavelengths between 380nm and 780nm in the electromagnetic spectrum. Figure 2 shows the normalized spectral sensitivity of the human eye’s cone cells. The eye contains three types of cone cells.


Fig. 2: The normalized spectral sensitivity of human cone cells of short, middle and long wavelength types (Image credit: https://en.wikipedia.org/wiki/CIE_1931_color_space)

The eye’s response is driven by neural signals from the short, middle and long cone cells of the retina, which have peak sensitivity to wavelengths in the blue, green and red portions of the visible light spectrum respectively. The wavelength sensitivities of the cones span a rather wide range and overlap each other; each curve is normalized in the diagram for the sake of clarity. The relative response of the three types of cone cells in the retina is sufficient to explain human color vision: color can be characterized by numerous sets of color matching functions, all of which are linear transformations of the cone response functions and, by extension, of each other.

Figure 3 shows how the middle (M) wavelength response was used to define the photopic luminosity function, which in turn defines illuminance (in lux), because the green wavelengths are closest to what humans see. Humans are more sensitive to green and less sensitive to red and blue.

Fig. 3: The green photopic response is closest to what humans see – from the CIE photopic luminosity function (Image credit: https://en.wikipedia.org/wiki/Photopic_vision)

Lux is a measure of the amount of visible light illuminating a point on a surface from all directions above the surface, and is the unit of measurement for illuminance. The XYZ tristimulus response (see Figure 4) provides another model for human vision. It is known as the CIE1931 2° Standard Observer, and provides a connection between visible-spectrum wavelengths and the colors perceived by people.


Fig. 4: The CIE1931 2° Standard Observer color matching functions, or XYZ tristimulus human eye response (Image credit: https://en.wikipedia.org/wiki/CIE_1931_color_space)

Color can be divided into brightness (or luminance, measured in lux) and chromaticity (measured in xy chromaticity parameters). The chromaticity diagram in Figure 1 is a tool which specifies how the human eye will experience light with a given spectrum. It does not specify the colors of objects, since the chromaticity observed while looking at an object depends on the ambient lighting surrounding the user.
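The split between luminance and chromaticity is a simple normalization of the tristimulus values. A minimal sketch (tristimulus values may be on any consistent scale):

```python
def xyz_to_xy(X, Y, Z):
    """Project CIE XYZ tristimulus values onto the xy chromaticity
    plane. Luminance (carried by Y) is factored out by the
    normalization, leaving only chromaticity."""
    s = X + Y + Z
    if s == 0:
        return (0.0, 0.0)  # no light: chromaticity undefined
    return (X / s, Y / s)
```

For example, the D65 daylight white point (X = 95.047, Y = 100.0, Z = 108.883) maps to approximately (0.3127, 0.3290), the familiar white point near the center of Figure 1.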

The visual system in humans is very complex, and is tightly coupled to our brain’s processing engine, the visual cortex. It is capable of identifying the color of an object even as lighting conditions change. The way that humans see colors is not fixed; rather, it is a relative perception of the change in colors when the light source changes. It is a dynamic relationship between an object’s surface, the type of light source and our eyes.

Our visual system adjusts the relative response of the long, medium and short cone cells in response to the spectral content of the light. Human eyes have a chromatic adaptation mechanism for adjusting to different ambient light conditions. This is how we react to make white and grey objects look white and grey under different ambient illuminant conditions. The optical gain adjustments of this chromatic adaptation principle are illustrated in Figure 5.

Fig. 5: Chromatic adaptation
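This gain-adjustment principle is commonly modeled as a von Kries-style diagonal transform: each channel is scaled by the ratio of the destination white point to the source white point. A minimal sketch, assuming responses are expressed as (L, M, S) cone (or sensor-channel) triples; real adaptation transforms such as Bradford first move into a sharpened cone space, which is omitted here:

```python
def von_kries_gains(src_white, dst_white):
    """Per-channel gains that adapt colors seen under the source
    illuminant's white point so they appear as under the destination
    white point. Both whites are 3-tuples in the same response space."""
    return tuple(d / s for s, d in zip(src_white, dst_white))

def adapt(color, gains):
    """Apply the diagonal (von Kries) adaptation to one color triple."""
    return tuple(c * g for c, g in zip(color, gains))
```

By construction, adapting the source white point itself yields the destination white point exactly, which is the behavior the eye approximates when it makes white objects continue to look white.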


A camera’s image processing AWB algorithms, unfortunately, are not as capable of measuring the dynamic relationship between an object’s surface and changing lighting conditions as the human eye is. They lack the ability to perform chromatic adaptation, which is needed to understand different ambient light conditions and react to make white and grey objects look white and grey under different conditions.

An XYZ color sensor with its spectral power distribution (SPD) response is shown in Figure 6.

Fig. 6: The XYZ spectral power distribution of the TCS3430 (Image credit: https://ams.com/documents/20143/36005/TCS3430_DS000464_3-00./e7dde8f1-c089-5b48-01b8-2298637f6cfd)

The XYZ spectral response is based on the human eye, thereby providing more accurate information on how people perceive a color. While there are methods to convert RGB values to XYZ, the RGB spectral response functions are not an exact color-matching function, so the resulting values from the conversion do not match how the human eye perceives color. By closely matching the color response of the human eye, the data from an XYZ sensor can detect differences in color similar to the way a person would. Using a high-accuracy XYZ color sensor that outputs a measure of the CIE XYZ tristimulus values of incident light provides the best results when measuring the ambient lighting conditions.
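For reference, the standard conversion from linear sRGB to XYZ (per IEC 61966-2-1, D65 white point) is a fixed 3×3 matrix. The caveat above still applies: this matrix assumes ideal sRGB primaries, whereas a real camera sensor’s RGB spectral responses deviate from them, which is exactly why RGB-derived XYZ values do not match human perception the way a true XYZ sensor does.

```python
# Standard linear-sRGB -> XYZ matrix (D65 reference white), rows X, Y, Z.
SRGB_TO_XYZ = [
    [0.4124, 0.3576, 0.1805],
    [0.2126, 0.7152, 0.0722],
    [0.0193, 0.1192, 0.9505],
]

def srgb_linear_to_xyz(r, g, b):
    """Convert linear (gamma-removed) sRGB values in [0, 1] to CIE XYZ.
    Y is normalized so that sRGB white (1, 1, 1) gives Y = 1.0."""
    return tuple(m[0] * r + m[1] * g + m[2] * b for m in SRGB_TO_XYZ)
```

Feeding in sRGB white (1, 1, 1) returns approximately (0.9505, 1.0, 1.0890), the D65 white point scaled to Y = 1.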

Figure 7 shows the Planckian locus: the solid curve in the middle of the CIE chromaticity diagram.


Fig. 7: The CIE1931 color space chromaticity diagram, illustrating the Planckian locus (Image credit: https://en.wikipedia.org/wiki/CIE_1931_color_space)

Each dot on the locus corresponds to a black-body color temperature with a corresponding CCT value.
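A common way to turn a measured xy chromaticity into a CCT value is McCamy’s cubic approximation, sketched below. It is only valid for chromaticities near the Planckian locus, roughly over the 2,856K to 6,500K range covering the artificial and daylight sources discussed earlier:

```python
def cct_from_xy(x, y):
    """Approximate the correlated color temperature (in kelvin) of a
    chromaticity (x, y) near the Planckian locus, using McCamy's
    cubic polynomial in the inverse slope n."""
    n = (x - 0.3320) / (0.1858 - y)
    return 449.0 * n**3 + 3525.0 * n**2 + 6823.3 * n + 5520.33
```

For example, the D65 white point (0.3127, 0.3290) evaluates to roughly 6500K, matching its nominal daylight color temperature.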

Using an XYZ color sensor for better camera AWB enhancements

A TCS3430 XYZ color sensor with a diffuser senses light over a hemispherical field-of-view (FoV), similar to the FoV of the human eye, and more accurately captures light from all of the light sources present, thus dramatically improving the white balance of an image. Figure 8 shows an example in which the AWB algorithm gets confused and is not able to cleanly extract a white point from the scene.

Fig. 8: Pictures taken with AWB corrected (left) and confused (right)

The image on the right has its blue background grayed out and the person’s skin tones have more of an orange or yellowish color, while the image on the left is more natural and properly white-balanced.

Figure 9 shows how true-white capabilities can be enabled through the use of a high-accuracy XYZ color sensor, which helps a camera’s AWB algorithms avoid failure and take the best possible pictures.

Fig. 9: Subset of AWB algorithm functions in which the TCS3430 information is processed


Figure 10 shows a subset of the AWB algorithm flow, highlighting where the TCS3430 XYZ tristimulus values can be input into the flow for processing. Since the RGB image sensor has a limited FoV, it might not be able to measure the color temperature of a scene accurately under all illuminant conditions. The TCS3430, with a diffuser positioned above the aperture of its package, opens up the FoV and significantly improves the camera’s ability to accurately measure the color temperature of a scene and improve the white balance of the picture.

True-white picture taking can be accomplished by using the measured CIE color temperature of the scene to adjust the AWB algorithm. The camera’s AWB algorithm is configured to avoid the auto-mode presets and instead use a preset color temperature from a CIE standard illuminant, F2 at 4,230K. From the central region of the CIE diagram in Figure 10, it is clear how the measured CIE color temperature from the TCS3430 is used: the white point is transitioned from the preset to the scene illuminant’s white point using an industry-standard chromatic adaptation algorithm.

Fig. 10: CIE chromaticity coordinate diagram showing the starting preset standard illuminant and the measured XYZ processed information

This true-white technique has been used by several smartphone OEMs through Android APIs to control the per-channel RGGB gains (in the Bayer domain), with the resulting correction used to modify the RGB-to-RGB matrix of the AWB algorithm.
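A hypothetical sketch of what applying such per-channel gains to an RGGB Bayer mosaic amounts to; in practice the Android camera HAL programs the four gains into the ISP hardware, and the function and parameter names here are purely illustrative:

```python
def apply_wb_gains(raw, width, gains):
    """Apply per-channel white-balance gains to a flat, row-major RGGB
    Bayer mosaic (even rows R,G,R,G...; odd rows G,B,G,B...).
    `gains` is (r_gain, gr_gain, gb_gain, b_gain), the four values an
    AWB algorithm would program into the camera pipeline."""
    out = list(raw)
    r_g, gr_g, gb_g, b_g = gains
    for i, v in enumerate(out):
        row, col = divmod(i, width)
        if row % 2 == 0:
            # Even row: R at even columns, Gr at odd columns.
            out[i] = v * (r_g if col % 2 == 0 else gr_g)
        else:
            # Odd row: Gb at even columns, B at odd columns.
            out[i] = v * (gb_g if col % 2 == 0 else b_g)
    return out
```

The gains themselves would be derived from the sensor-measured white point, e.g. boosting red and cutting blue to neutralize a cool-daylight cast.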

DXOMark: improved AWB camera pictures using an XYZ color sensor

The DXOMark pictures in Figure 11 show how a color sensor that measures the CIE chromaticity coordinates in low-light conditions provides the best AWB picture results, with better detail and significantly lower noise levels.


Fig. 11: DXOMark pictures – left: smartphone using a color sensor; right: without (Image credit: https://www.dxomark.com/huawei-p20-pro-camera-review-innovative-technologies-outstanding-results/)

Summary

Smartphone camera AWB algorithms are all prone to fail when a scene is dominated by one prominent color, or in mixed-lighting environments with both natural sunlight and artificial lighting. In mixed-lighting environments, AWB algorithms can erroneously lock onto only one of the lighting colors and ignore the other light source’s contribution. Gray-world, white-patch, and gamut-mapping algorithms fail under these conditions because they need to correct for the light source color. They cannot, however, always accurately measure the color temperature of the lighting environment, and must instead rely on certain key assumptions about the scene to infer the color of the light sources from a subset of image pixels.

As a result, in pictures taken of scenes in which the camera’s AWB algorithm gets confused and cannot cleanly extract a white point, or of scenes dominated by one prominent color such as a grass field or a wall painted with a bold color, the background color can be grayed out and people’s skin tones take on an orange or yellowish color. Smartphone OEMs can both differentiate and monetize the recent availability of new XYZ color sensors like the TCS3430 through true-white enhanced camera AWB features. Better pictures can be taken by using an XYZ color sensor to assist a camera’s image processing AWB algorithm. By optimally measuring the color temperature of a scene, a better picture can be achieved: people look more natural, scene colors are captured accurately, and the result is a properly white-balanced picture.


Author information
Your name: Dave Moon
No 1 Target Area for publication?: ☒Europe ☒USA ☒China ☒Korea ☒Taiwan ☒Japan
What kind of medium shall cover the article?: General electronics magazine ☒
Your Email: [email protected]
Your Line Manager: Matt Hubbard
Tel: +1 469-298-4283

Biography
Dave Moon is a Senior Product Marketing Manager for the Advanced Optical Solutions group at ams AG. He has over 25 years’ experience working in the semiconductor industry and has held various applications, systems, and product definition positions at Texas Instruments, Agere Systems, Lucent Microelectronics, and AT&T Bell Labs. Dave received a Bachelor of Electrical Engineering from the University of Delaware and a Master of Science in Electrical Engineering from The Johns Hopkins University.

For further information
ams AG
Tel: +1 (469) 298-4283
[email protected]
www.ams.com
