
APPLICATION NOTE

White Balance for Allied Vision GigE Cameras
V1.0.0 2016-Feb-24

Introduction

White Balance and Auto White Balance explained

The human visual system has the ability to map "white" to the sensation of white, even though an object has a different radiance when it is illuminated with different sources. In other words, if you were shown a sheet of white paper under natural daylight or under incandescent or fluorescent light, you would say that it was white. This phenomenon is called color constancy. When an image is captured by a digital camera, the sensor response at each pixel depends on the illumination. That is, each pixel value recorded by the sensor is related to the color temperature1 of the light source. When a white object is illuminated with a warm (low color temperature) lamp, it will appear reddish in the recorded image. Similarly, it will appear bluish under a lamp with a cold (high) color temperature. The purpose of white balance2 (WB) is to process the image so that it visually looks the same, regardless of the source of light [6].

Almost all sensors used in digital cameras are sensitive only to the intensity of light and not to its spectral components (different wavelengths). That is why producing a color image requires color filters, most often an RGB Bayer filter. Different filter elements have different spectral transmittance depending on the colorants used. The sensors themselves have different spectral sensitivities and, even for the same manufacturer, show differences in their spectral characteristics [2], [5].

Figure 1. Changes in the spectrum of light: transmittance (a) and reflectance (b)

1 The color temperature of a light source is the temperature of an ideal black-body radiator that radiates light of a color comparable to that of the light source [3]
2 Also known as color balance, gray balance, or neutral balance [6]

For a given scene, the chromaticity of the image produced by a camera depends on the following [2]:

• Light source (illuminant)

• Light transmission through transparent or translucent materials – transmittance, and/or reflection of light – reflectance (see Figure 1)

• Spectral response of the sensor

A change in any one of the three factors above will change the chromaticity of the image produced by the camera. Figure 2 shows the same image with different white balance adjustments: only the image in the middle looks "natural" (accurate white balance for the illuminant). Various automatic white balance (AWB) algorithms have been proposed in the literature [2]: Gray World theory, Retinex theory, perfect reflector, standard deviation-weighted gray world, etc. Most of these algorithms make certain assumptions about the color distribution of the image. They differ in the way the illumination is estimated.

Figure 2. Three images showing chromaticity differences caused by white balance that is not adjusted to the illuminant (left: incorrect white balance – too cold; middle: accurate white balance; right: incorrect white balance – too warm)

Gray World Theory

One of the simplest and most often used assumptions about the scene is the so-called Gray World Theory: on average, the colors of a visual scene integrate to gray, i.e.

Ravg = Gavg = Bavg .

The most direct solution for automatic correction of the white balance is to calculate the average values for each color channel of the captured image (Ravg, Gavg, Bavg) and to use them to calculate correction ratios (CorrR, CorrB), with the green channel used as the base value [2]:

CorrR = Gavg / Ravg ; CorrB = Gavg / Bavg (1)

The obtained coefficients are used to correct the color values of the pixels in the image (see Figure 3). In the image processing path of the camera, each color channel can be controlled through its own digital gain; if this gain is 1.0, the pixel value is not changed. The digital gain range is 0.0 to 4.0 in most designs, which is enough to compensate for chromaticity shifts due to the reasons mentioned in the previous section.

[Block diagram: the sensor's red, green, and blue channels feed the WB control module, which applies the CorrR and CorrB correction gains to the red and blue channels.]

Figure 3. White Balance control module (simplified block diagram)
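
To make equation (1) concrete, the following Python/NumPy sketch applies the same Gray World correction in software to a demosaiced RGB image. It is an offline illustration of what the WB control module of Figure 3 does with digital gains, not camera code; the function name and the [0, 1] value range are our own choices.

```python
import numpy as np

def gray_world_correction(rgb):
    """Estimate and apply Gray World white balance to an RGB image.

    rgb: float array of shape (H, W, 3) with values in [0, 1].
    Returns the corrected image and the correction factors (CorrR, CorrB).
    """
    # Per-channel averages Ravg, Gavg, Bavg
    r_avg, g_avg, b_avg = rgb.reshape(-1, 3).mean(axis=0)

    # Equation (1): the green channel is the reference (gain 1.0)
    corr_r = g_avg / r_avg
    corr_b = g_avg / b_avg

    corrected = rgb.copy()
    corrected[..., 0] *= corr_r  # digital gain on the red channel
    corrected[..., 2] *= corr_b  # digital gain on the blue channel
    return np.clip(corrected, 0.0, 1.0), (corr_r, corr_b)
```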

Allied Vision GigE cameras use Gray World Theory as the basis for the auto white balance modes (Once or Continuous). If the sensor is exposed to uniform white (or gray) illumination, the resulting image should match the chromaticity of the light by producing equal average signals (Ravg = Gavg = Bavg) on all three color channels; the auto white balance algorithm increases or decreases the correction factors CorrR and CorrB to achieve this result.
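
The exact control loop inside the camera is not documented here, but its behavior can be modeled loosely in software: nudge CorrR and CorrB until the channel averages of the balanced image are equal. The sketch below is only such a model, not the camera's actual implementation; the tol and rate parameters roughly mirror the BalanceWhiteAutoAdjustTol and BalanceWhiteAutoRate features described later.

```python
import numpy as np

def awb_step(rgb, corr_r, corr_b, rate=0.5, tol=0.02):
    """One iteration of a Gray World feedback loop (software model only)."""
    balanced = rgb * np.array([corr_r, 1.0, corr_b])
    r_avg, g_avg, b_avg = balanced.reshape(-1, 3).mean(axis=0)

    err_r = g_avg / r_avg - 1.0  # > 0: red average still below green
    err_b = g_avg / b_avg - 1.0  # > 0: blue average still below green
    if abs(err_r) > tol:         # tolerance band: leave small errors alone
        corr_r *= 1.0 + rate * err_r
    if abs(err_b) > tol:
        corr_b *= 1.0 + rate * err_b
    return corr_r, corr_b
```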

White Balance in Allied Vision GigE cameras

All Allied Vision GigE cameras use a white balance control similar to the one in Figure 3. White balance can be done manually, or automatically in continuous or one-shot mode. The best choice is determined by the type of application.

Manual White Balance

The user has direct access to the white balance gains (GCred and GCblue) applied to the red and blue channels, while the green channel gain remains 1.0:

GCred = 1 / CorrR; GCblue = 1 / CorrB (2)

Caution must be exercised for white balance gains lower than 1.0 – the respective channels (red or blue) can saturate prematurely, producing chromaticity errors in the highlights. The default white balance gains are not calibrated and their range is not normalized. Consequently, images captured with multiple cameras of the same type may differ slightly because of small variations from sensor to sensor and from camera to camera.
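
As a sketch of how these manual gains could be written to a camera programmatically, the snippet below assumes the VimbaPython binding (which postdates this note; adapt it to the SDK you actually use) and the GenICam feature names listed in the feature table later in this document. The gain values are arbitrary examples.

```python
from vimba import Vimba  # VimbaPython binding (assumed)

GC_RED, GC_BLUE = 1.26, 1.37  # example manual white balance gains

with Vimba.get_instance() as vimba:
    with vimba.get_all_cameras()[0] as cam:
        # Write the red channel gain (the green channel stays at 1.0)
        cam.get_feature_by_name('BalanceRatioSelector').set('Red')
        cam.get_feature_by_name('BalanceRatioAbs').set(GC_RED)
        # Write the blue channel gain
        cam.get_feature_by_name('BalanceRatioSelector').set('Blue')
        cam.get_feature_by_name('BalanceRatioAbs').set(GC_BLUE)
```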


Note: Applying the same white balance gain values (GCred and GCblue) to different cameras does not guarantee a perfect chromaticity match for images taken with multiple cameras of the same type. This is particularly important for image stitching applications (e.g. panorama or aerial imaging).

Figure 4. Stitching four images from cameras with the same white balance gains; the scene is a white wall uniformly illuminated with studio lamps (image was slightly altered to make the differences more obvious)

In Figure 4, four cameras of the same type and using the same white balance gains captured the same scene (white wall uniformly illuminated with studio lamps) in identical conditions. Stitching the images together made the white balance differences visible.

Note: In Manual white balance mode, the application is entirely responsible for adjusting the white balance gains to obtain matching image chromaticity from multiple cameras (even of the same type).

Auto White Balance

As mentioned, all Allied Vision GigE cameras use Gray World Theory as the primary algorithm to achieve automatic white balance. The auto white balance feature can be used as:

• One-time operation: the white balance is adjusted while streaming for the time interval required to reach the color balance target;

• Continuous operation: the white balance is continuously adjusted while streaming.

Gray World Theory produces images with a fair white balance in most conditions and is relatively easy to implement. In image stitching applications, however, auto white balance can produce disappointing results, because the same color balance must be preserved for all the images being stitched while auto white balance is applied to each image individually.
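
For the one-time (Once) operation, a minimal sketch, again assuming the VimbaPython binding, would trigger the adjustment and then acquire frames until the camera reports that the target has been reached:

```python
from vimba import Vimba  # VimbaPython binding (assumed)

with Vimba.get_instance() as vimba:
    with vimba.get_all_cameras()[0] as cam:
        cam.get_feature_by_name('BalanceWhiteAuto').set('Once')
        # The algorithm only runs while images are being acquired;
        # BalanceWhiteAuto falls back to 'Off' once the target is reached.
        while str(cam.get_feature_by_name('BalanceWhiteAuto').get()) != 'Off':
            cam.get_frame()  # keep frames flowing so the adjustment can converge
```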

Note: Do not enable auto white balance in order to achieve chromaticity matching from multiple cameras in image stitching applications.


White Balance features

Allied Vision GigE cameras expose the following features to control white balance:

BalanceRatioAbs (Visibility: Beginner; Type: Float; Min 0.8, Max 3, Increment 0.1)
Adjusts the white balance gain of the red or blue channel (see BalanceRatioSelector); the green channel gain is always 1.00.

BalanceRatioSelector (Visibility: Beginner; Type: Enumeration)
Selects the red or blue channel to adjust with BalanceRatioAbs.

BalanceWhiteAuto (Visibility: Beginner; Type: Enumeration)
Method used to set the white balance gain values:
Off: automatic mode is off
Once: auto white balance runs until the target is achieved, then BalanceWhiteAuto returns to Off
Continuous: auto white balance always runs

BalanceWhiteAutoAdjustTol (Visibility: Beginner; Type: Integer; Min 0, Max 50, Increment 1)
Tolerance, in percent, allowed from the ideal white balance gain values within which the auto white balance does not run; this prevents needless small adjustments from occurring with every image.

BalanceWhiteAutoRate (Visibility: Beginner; Type: Integer; Min 1, Max 100, Increment 1)
Rate of auto white balance adjustments, from 1 (slowest) to 100 (fastest); use this control to slow down the white balance adjustments when necessary.
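
As an illustrative configuration (again assuming the VimbaPython binding; the tolerance and rate values are arbitrary), continuous auto white balance can be combined with a tolerance band and a reduced adjustment rate to avoid visible color pumping:

```python
from vimba import Vimba  # VimbaPython binding (assumed)

with Vimba.get_instance() as vimba:
    with vimba.get_all_cameras()[0] as cam:
        # Ignore deviations of up to 5 % from the ideal gains ...
        cam.get_feature_by_name('BalanceWhiteAutoAdjustTol').set(5)
        # ... and apply any remaining adjustments slowly
        cam.get_feature_by_name('BalanceWhiteAutoRate').set(20)
        cam.get_feature_by_name('BalanceWhiteAuto').set('Continuous')
```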

User calibration of White Balance

Image stitching applications need white balance calibration in order to achieve overall chromaticity consistency of the resulting image. The white balance calibration is best performed in the lab (see Figure 5): the cameras are placed in front of a white screen (or wall) illuminated with at least two identical white light sources (lamps) with a color temperature between 5000 K and 6500 K (5500 K is recommended [4]). The most important quality of the lamps is the stability of their light output during the white balance calibration.


[Setup diagram: four cameras (Camera 1 to Camera 4) aimed at a white screen that is uniformly illuminated by two 5500 K lamps.]

Figure 5. White Balance calibration setup

The white balance calibration is performed as follows:

1. With the lamps turned on, point the cameras at the central area of the screen, where the best uniformity of the light is expected.

2. If the lamps are not electronically stabilized, wait 15 to 30 minutes for them to warm up.

3. Use a camera viewer (e.g. Vimba Viewer, which comes with the Vimba SDK) and take a set of images from the cameras with the default white balance gains. If they have the same chromaticity, white balance calibration is not required and the process ends. Otherwise, continue with the next step.

4. Turn auto white balance on by setting BalanceWhiteAuto to Continuous and wait a few seconds until the images show the same chromaticity, then turn auto white balance off (BalanceWhiteAuto = Off).

5. Save the results by reading BalanceRatioAbs for the red and blue channels. These two values (named GCredref and GCblueref) are the white balance calibration results and can be used as a reference to balance the colors consistently in image stitching applications.

6. Once the cameras are placed in a different environment (e.g. in an imaging system), the reference white balance gains can be multiplied by adjustment factors in order to change the white balance. The same set of adjustment factors is used for all the calibrated cameras.

Example

The four cameras in Figure 5 are calibrated for white balance with the 5500 K lamps, obtaining the following reference white balance gains:

• Camera 1: GCredref = 1.26 and GCblueref = 1.37


• Camera 2: GCredref = 1.23 and GCblueref = 1.39

• Camera 3: GCredref = 1.28 and GCblueref = 1.33

• Camera 4: GCredref = 1.22 and GCblueref = 1.35

These reference white balance gains are saved in the application (or as a User Set in the camera). Take a set of real images in the environment where the cameras will be used; if the illuminant is around 5500 K, the images will look natural and provide matching chromaticity. If the illuminant is different (e.g. 3500 K, warm light), a set of white balance adjustment factors can be calculated from these images using equations (1) – e.g. GCredadj = 0.82 (reduce red) and GCblueadj = 1.17 (enhance blue). If the white balance calibration was done properly, the same values should result from each image. The same set of adjustments can be applied to all cameras that were calibrated together. Applying the white balance adjustment factors means calculating the white balance gains and applying them using BalanceRatioSelector and BalanceRatioAbs:

GCred(i) = GCredref(i) ∙ GCredadj; GCblue(i) = GCblueref(i) ∙ GCblueadj (3)

where i = 1…n and n is the total number of cameras. In the example above, for camera 1:

GCred(1) = GCredref(1) ∙ GCredadj = 1.26 ∙ 0.82 = 1.0332

GCblue(1) = GCblueref(1) ∙ GCblueadj = 1.37 ∙ 1.17 = 1.6029
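
A possible way to apply equation (3) across all calibrated cameras is sketched below, assuming the VimbaPython binding; the camera IDs are placeholders, and the reference gains and adjustment factors are the example values from above.

```python
from vimba import Vimba  # VimbaPython binding (assumed)

# Reference gains (GCredref, GCblueref) from the lab calibration,
# keyed by camera ID (placeholder IDs; use your cameras' real IDs)
REFERENCE_GAINS = {
    'DEV_CAM1': (1.26, 1.37),
    'DEV_CAM2': (1.23, 1.39),
    'DEV_CAM3': (1.28, 1.33),
    'DEV_CAM4': (1.22, 1.35),
}

# Adjustment factors for the current illuminant (example from the text, ~3500 K)
GC_RED_ADJ, GC_BLUE_ADJ = 0.82, 1.17

def apply_adjusted_gains(cam, red_ref, blue_ref):
    """Equation (3): scale the reference gains and write them to the camera."""
    cam.get_feature_by_name('BalanceRatioSelector').set('Red')
    cam.get_feature_by_name('BalanceRatioAbs').set(red_ref * GC_RED_ADJ)
    cam.get_feature_by_name('BalanceRatioSelector').set('Blue')
    cam.get_feature_by_name('BalanceRatioAbs').set(blue_ref * GC_BLUE_ADJ)

with Vimba.get_instance() as vimba:
    for cam in vimba.get_all_cameras():
        with cam:
            red_ref, blue_ref = REFERENCE_GAINS[cam.get_id()]
            apply_adjusted_gains(cam, red_ref, blue_ref)
```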

References

[1]. EMVA Standard 1288 – Characterization and Presentation of Specification Data for Image Sensors and Cameras, Rev. 3.00, 29 Nov 2010.
[2]. G. Zapryanov, D. Ivanova, I. Nikolova – Automatic White Balance Algorithms for Digital Still Cameras – a Comparative Study, Information Technologies and Control, no. 1, 2012.
[3]. Wikipedia – Color temperature
[4]. Wikipedia –
[5]. Wikipedia –
[6]. Wikipedia – Color balance


For technical support, please contact [email protected]. For comments or suggestions regarding this document, please contact [email protected].

Disclaimer Due to continual product development, technical specifications may be subject to change without notice. All trademarks are acknowledged as property of their respective owners. We are convinced that this information is correct. We acknowledge that it may not be comprehensive. Nevertheless, Allied Vision cannot be held responsible for any damage in equipment or subsequent loss of data or whatsoever in consequence of this document. For the latest version of this document, please visit our technical documentation website. Copyright © 2016 Allied Vision Technologies GmbH. All rights reserved. This document was prepared by the staff of Allied Vision Technologies Canada (“Allied Vision”) and is the property of Allied Vision, which also owns the copyright therein. All rights conferred by the law of copyright and by virtue of international copyright conventions are reserved to Allied Vision. This document must not be copied, or reproduced in any material form, either wholly or in part, and its contents and any method or technique available there from must not be disclosed to any other person whatsoever without the prior written consent of Allied Vision.
