
Cameras and Image Processing

Computational Photography
CSE 291, Lecture 2

Announcements

• Assignment 1 will be released today – Due Apr 8, 11:59 PM

Traditional photography

• The acquisition of images by recording light (or other electromagnetic radiation), either electronically (using an image sensor) or chemically (using film)
• Typically, a lens is used to focus the light reflected or emitted from objects in the scene into an image on the light-sensitive surface inside the camera during a timed exposure

Geometric image formation

• How do 3D world points project to 2D image points?

The projective camera

• Extrinsic parameters: since the camera may not be at the origin, there is a rigid transformation between world coordinates and camera coordinates
• Intrinsic parameters: since scene units (e.g., cm) differ from image units (e.g., pixels) and the coordinate system may not be centered in the image, we capture this with a 3x3 transformation comprised of focal length, principal point, pixel aspect ratio, and skew

\[
\begin{bmatrix} x \\ y \\ w \end{bmatrix}
=
\underbrace{\mathbf{K}}_{\substack{3 \times 3 \\ \text{intrinsic parameters}}}
\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \\ 0 & 0 & 1 & 0 \end{bmatrix}
\underbrace{\begin{bmatrix} \mathbf{R} & \mathbf{t} \\ \mathbf{0}^\top & 1 \end{bmatrix}}_{\substack{4 \times 4 \\ \text{rigid transformation (extrinsic parameters)}}}
\begin{bmatrix} X \\ Y \\ Z \\ T \end{bmatrix}
\]
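As a concrete illustration, here is a minimal NumPy sketch (not from the slides) that projects a 3D world point through this pipeline; the intrinsics K and extrinsics R, t below are made-up example values:

```python
import numpy as np

# Assumed example intrinsics: focal length f (pixels), principal point (cx, cy),
# square pixels, zero skew
f, cx, cy = 800.0, 320.0, 240.0
K = np.array([[f, 0.0, cx],
              [0.0, f, cy],
              [0.0, 0.0, 1.0]])

# Assumed example extrinsics: identity rotation, world origin 5 units in front of the camera
R = np.eye(3)
t = np.array([0.0, 0.0, 5.0])

# 4x4 rigid transformation (world -> camera) and 3x4 canonical projection
E = np.block([[R, t[:, None]], [np.zeros((1, 3)), np.ones((1, 1))]])
P0 = np.hstack([np.eye(3), np.zeros((3, 1))])

# Project a homogeneous world point [X, Y, Z, T]
X_world = np.array([1.0, 2.0, 10.0, 1.0])
x_hom = K @ P0 @ E @ X_world          # [x, y, w]
x_img = x_hom[:2] / x_hom[2]          # pixel coordinates
print(x_img)
```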

The reason for lenses

We need light, but big pinholes cause blur

Thin lens, image of point

[Figure: thin lens with optical center O, focal point F, object point P, and image point P']

All rays passing through the lens and starting at P converge upon P'. So the light-gathering capability of the lens is given by the area of the lens, and all the rays focus on P' instead of becoming blurred as with a pinhole.

Thin lens, image plane

[Figure: thin lens and image plane; P focuses at P' on the image plane, while Q focuses at Q' off the image plane]

A price: whereas the image of P is in focus, the image of Q isn't.

Thin lens, aperture

[Figure: thin lens with an aperture; object point P is imaged at P' on the image plane]

• Smaller aperture -> less blur
• Pinhole -> no blur
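To make "smaller aperture -> less blur" concrete, here is a small numeric sketch (not from the slides) using the standard thin-lens relation 1/f = 1/z_o + 1/z_i and similar triangles for the blur-circle diameter; all distances are made-up example values:

```python
def blur_circle_diameter(f, aperture_d, sensor_dist, object_dist):
    """Diameter of the blur circle on the sensor for a point at object_dist,
    when the sensor sits at sensor_dist behind a thin lens of focal length f."""
    z_i = 1.0 / (1.0 / f - 1.0 / object_dist)   # in-focus image distance for the point
    return aperture_d * abs(sensor_dist - z_i) / z_i

f = 0.05                                  # 50 mm focal length (meters)
sensor = 1.0 / (1.0 / f - 1.0 / 2.0)      # sensor placed to focus an object 2 m away
for aperture in (0.025, 0.0125, 0.003):   # roughly f/2, f/4, f/16
    c = blur_circle_diameter(f, aperture, sensor, object_dist=4.0)
    print(f"aperture {aperture*1000:.1f} mm -> blur {c*1e6:.0f} um")
```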

Lens distortion

• Radial distortion
• Tangential distortion

Images of straight lines should be straight
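For reference, a minimal sketch (not from the slides) of the widely used Brown–Conrady model combining radial and tangential distortion in normalized image coordinates; the coefficient values below are arbitrary placeholders:

```python
def distort(x, y, k1, k2, k3, p1, p2):
    """Apply radial (k1, k2, k3) and tangential (p1, p2) distortion to
    normalized, undistorted image coordinates (x, y)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2.0 * p1 * x * y + p2 * (r2 + 2.0 * x * x)
    y_d = y * radial + p1 * (r2 + 2.0 * y * y) + 2.0 * p2 * x * y
    return x_d, y_d

# Arbitrary placeholder coefficients; straight scene lines bow outward or inward
# in the image depending on the sign of k1.
print(distort(0.3, 0.2, k1=-0.2, k2=0.05, k3=0.0, p1=0.001, p2=0.001))
```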

Photometric image formation

• What is the brightness (color) of the projected point?

Need bidirectional reflectance distribution function (BRDF) at point on surface

Lighting, reflectance, and shading

BRDF

Image acquisition

Digitization, one row of image

Digitization, whole image

Number of quantization levels

[Figure: the same image quantized to 256, 128, 64, 32, 16, 8, 4, and 2 levels]
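As a rough illustration (not from the slides), reducing the number of quantization levels can be sketched as follows, assuming an 8-bit grayscale image stored as a NumPy array:

```python
import numpy as np

def quantize(img_u8, levels):
    """Quantize an 8-bit image to the given number of gray levels."""
    step = 256 // levels
    return (img_u8 // step) * step + step // 2   # midpoint of each bin

img = np.arange(256, dtype=np.uint8).reshape(16, 16)   # toy gradient "image"
for levels in (256, 128, 64, 32, 16, 8, 4, 2):
    q = quantize(img, levels)
```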

Image sensing pipeline

Image processing

Image processing

• A discipline in which both the input and output of a process are images
  – There are usually other input parameters to the process

Demosaicing (CFA)

[Figure: image sensor CFA samples and interpolated (lower-case) pixel values]

Bayer pattern
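As one concrete (and simplistic) approach, here is a bilinear demosaicing sketch for an RGGB Bayer mosaic, assuming the mosaic is a single-channel float NumPy array; this is only an illustration, not the interpolation used by any particular camera pipeline:

```python
import numpy as np
from scipy.ndimage import convolve

def demosaic_bilinear(mosaic):
    """Bilinear demosaicing of an RGGB Bayer mosaic (float array in [0, 1])."""
    H, W = mosaic.shape
    r_mask = np.zeros((H, W)); r_mask[0::2, 0::2] = 1.0
    b_mask = np.zeros((H, W)); b_mask[1::2, 1::2] = 1.0
    g_mask = 1.0 - r_mask - b_mask

    # Interpolation kernels: green averages its 4 measured neighbors,
    # red/blue average their 4 diagonal (and 2-away) measured neighbors
    k_g  = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0

    r = convolve(mosaic * r_mask, k_rb, mode='mirror')
    g = convolve(mosaic * g_mask, k_g,  mode='mirror')
    b = convolve(mosaic * b_mask, k_rb, mode='mirror')
    return np.dstack([r, g, b])
```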

Image processing

• Color spaces
• Color mapping
• White balancing and color balancing

Image coding

• Common standards
  – Video
    • Recommendation ITU-R BT.601 (standard-definition television (SDTV))
    • SMPTE standard 240M (precursor to Rec. 709)
    • Recommendation ITU-R BT.709 (high-definition television (HDTV))
  – Image
    • sRGB
    • Adobe RGB
    • Wide gamut RGB (or Adobe Wide Gamut RGB)
    • ProPhoto RGB (or reference output medium metric (ROMM) RGB)

Different codings, different pixel values

Color specification:

• Chromaticity coordinates
  – (x, y, z) where x + y + z = 1
  – Usually specified by (x, y), where z = 1 − x − y
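For reference (a standard relation, not spelled out on the slide), the chromaticity coordinates are the CIE tristimulus values normalized by their sum:

\[
x = \frac{X}{X + Y + Z}, \qquad
y = \frac{Y}{X + Y + Z}, \qquad
z = \frac{Z}{X + Y + Z} = 1 - x - y
\]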

The CIE 1931 chromaticity diagram


• Set of chromaticities
  – Red
  – Green
  – Blue
  – White (point)

Standard illuminants

Hue of each standard illuminant, calculated with Y = 0.54

Chromaticities of common video standards

Recommendation ITU-R BT.709
Color   CIE x    CIE y    CIE z
Red     0.6400   0.3300   0.0300
Green   0.3000   0.6000   0.1000
Blue    0.1500   0.0600   0.7900
White   0.3127   0.3290   0.3583   (D65)

SMPTE standard 240M
Color   CIE x    CIE y    CIE z
Red     0.6300   0.3400   0.0300
Green   0.3100   0.5950   0.0950
Blue    0.1550   0.0700   0.7750
White   0.3127   0.3290   0.3583   (D65)

Recommendation ITU-R BT.601, 625 lines
Color   CIE x    CIE y    CIE z
Red     0.6400   0.3300   0.0300
Green   0.2900   0.6000   0.1100
Blue    0.1500   0.0600   0.7900
White   0.3127   0.3290   0.3583   (D65)

Recommendation ITU-R BT.601, 525 lines
Color   CIE x    CIE y    CIE z
Red     0.6300   0.3400   0.0300
Green   0.3100   0.5950   0.0950
Blue    0.1550   0.0700   0.7750
White   0.3127   0.3290   0.3583   (D65)

Chromaticities of common image standards

sRGB
Color   CIE x    CIE y    CIE z
Red     0.6400   0.3300   0.0300
Green   0.3000   0.6000   0.1000
Blue    0.1500   0.0600   0.7900
White   0.3127   0.3290   0.3583   (D65)

Adobe RGB
Color   CIE x    CIE y    CIE z
Red     0.6400   0.3300   0.0300
Green   0.2100   0.7100   0.0800
Blue    0.1500   0.0600   0.7900
White   0.3127   0.3290   0.3583   (D65)

Wide gamut RGB (or Adobe Wide Gamut RGB)
Color   CIE x    CIE y    CIE z
Red     0.7347   0.2653   0.0000
Green   0.1152   0.8264   0.0584
Blue    0.1566   0.0177   0.8257
White   0.3457   0.3585   0.2958   (D50)

ProPhoto RGB (or reference output medium metric (ROMM) RGB)
Color   CIE x    CIE y    CIE z
Red     0.7347   0.2653   0.0000
Green   0.1596   0.8404   0.0000
Blue    0.0366   0.0001   0.9633
White   0.3457   0.3585   0.2958   (D50)

Chromaticity diagrams

[Figure: Rec. 709 / sRGB gamut (35.9% of visible colors) vs. Adobe RGB gamut (52.1% of visible colors)]

Chromaticity diagrams

[Figure: Rec. 709 / sRGB gamut (35.9% of visible colors) vs. Wide gamut RGB gamut (77.6% of visible colors)]

Chromaticity diagrams

[Figure: Rec. 709 / sRGB gamut (35.9% of visible colors) vs. ProPhoto RGB gamut (90% of visible colors)]

Academy Color Encoding Specification (ACES)

ACES
Color   CIE x      CIE y      CIE z
Red     0.73470    0.26530    0.00000
Green   0.00000    1.00000    0.00000
Blue    0.00010   -0.07700    1.07690
White   0.32168    0.33767    0.34065   (approximately D60)

100% of visible colors

Nonlinear encoding

• All of these standards use nonlinear encoding (gamma encoding)
  – Video
    • Recommendation ITU-R BT.601 (standard-definition television (SDTV))
    • SMPTE standard 240M (precursor to Rec. 709)
    • Recommendation ITU-R BT.709 (high-definition television (HDTV))
  – Image
    • sRGB
    • Adobe RGB
    • Wide gamut RGB (or Adobe Wide Gamut RGB)
    • ProPhoto RGB (or reference output medium metric (ROMM) RGB)

Nonlinear encoding and conversion to linear

• Typical CRT monitors have a transfer function of gamma = 2.2
• Image and video standards were designed to be directly displayed on CRTs
  – Pixel values are encoded to approximate gamma = 2.2
• Nonlinear to linear (floating point) using a lookup table
• Linear to nonlinear by calculation

[Figure: lookup table mapping between linear and nonlinear values]

Nonlinear R'G'B' color space and linear RGB color space

• Example: sRGB

[Figure: sRGB transfer curve plotting RGB linear vs. sRGB nonlinear, with the slope of the sRGB nonlinear curve shown in log-log space]
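A minimal sketch (not from the slides) of the sRGB nonlinear-to-linear and linear-to-nonlinear conversions, using the piecewise sRGB transfer function and its standard constants; the last line builds the kind of lookup table mentioned on the previous slide:

```python
import numpy as np

def srgb_to_linear(v):
    """sRGB nonlinear value(s) in [0, 1] -> linear RGB."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

def linear_to_srgb(v):
    """Linear RGB value(s) in [0, 1] -> sRGB nonlinear."""
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.0031308, 12.92 * v, 1.055 * v ** (1.0 / 2.4) - 0.055)

# 256-entry lookup table for 8-bit nonlinear-to-linear conversion
lut = srgb_to_linear(np.arange(256) / 255.0)
```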

XYZ color space

• Encompasses all color sensations the average person can experience
• Standard reference
  – Many other color space definitions are based on XYZ
• Y is luminance
• Z is quasi-equal to blue stimulation
• X is a linear combination of cone response curves chosen to be nonnegative
• The plane parallel to the XZ plane at a given Y contains all possible chromaticities at that luminance

RGB color space to XYZ color space

\[
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
=
\begin{bmatrix} r_X & g_X & b_X \\ r_Y & g_Y & b_Y \\ r_Z & g_Z & b_Z \end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}
\]

\[
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
=
\begin{bmatrix}
r_x (r_X + r_Y + r_Z) & g_x (g_X + g_Y + g_Z) & b_x (b_X + b_Y + b_Z) \\
r_y (r_X + r_Y + r_Z) & g_y (g_X + g_Y + g_Z) & b_y (b_X + b_Y + b_Z) \\
r_z (r_X + r_Y + r_Z) & g_z (g_X + g_Y + g_Z) & b_z (b_X + b_Y + b_Z)
\end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}
\]

\[
\begin{bmatrix} X \\ Y \\ Z \end{bmatrix}
=
\begin{bmatrix} r_x & g_x & b_x \\ r_y & g_y & b_y \\ r_z & g_z & b_z \end{bmatrix}
\begin{bmatrix} r_X + r_Y + r_Z & 0 & 0 \\ 0 & g_X + g_Y + g_Z & 0 \\ 0 & 0 & b_X + b_Y + b_Z \end{bmatrix}
\begin{bmatrix} R \\ G \\ B \end{bmatrix}
\]

RGB color space to XYZ color space

• Substitute in the RGB white point \(W_{RGB} = \begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}\) and the XYZ white point \(W_{XYZ} = \begin{bmatrix} w_x / w_y \\ 1 \\ w_z / w_y \end{bmatrix}\):

\[
\begin{bmatrix} w_x / w_y \\ 1 \\ w_z / w_y \end{bmatrix}
=
\begin{bmatrix} r_x & g_x & b_x \\ r_y & g_y & b_y \\ r_z & g_z & b_z \end{bmatrix}
\begin{bmatrix} r_X + r_Y + r_Z & 0 & 0 \\ 0 & g_X + g_Y + g_Z & 0 \\ 0 & 0 & b_X + b_Y + b_Z \end{bmatrix}
\begin{bmatrix} 1 \\ 1 \\ 1 \end{bmatrix}
\]

\[
\begin{bmatrix} w_x / w_y \\ 1 \\ w_z / w_y \end{bmatrix}
=
\begin{bmatrix} r_x & g_x & b_x \\ r_y & g_y & b_y \\ r_z & g_z & b_z \end{bmatrix}
\begin{bmatrix} r_X + r_Y + r_Z \\ g_X + g_Y + g_Z \\ b_X + b_Y + b_Z \end{bmatrix},
\quad \text{solve for} \quad
\begin{bmatrix} r_X + r_Y + r_Z \\ g_X + g_Y + g_Z \\ b_X + b_Y + b_Z \end{bmatrix}
\]

\[
M_{\mathrm{RGB\_to\_XYZ}}
=
\begin{bmatrix} r_x & g_x & b_x \\ r_y & g_y & b_y \\ r_z & g_z & b_z \end{bmatrix}
\begin{bmatrix} r_X + r_Y + r_Z & 0 & 0 \\ 0 & g_X + g_Y + g_Z & 0 \\ 0 & 0 & b_X + b_Y + b_Z \end{bmatrix}
\]
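This derivation codes up directly; here is a minimal NumPy sketch (not from the slides) that builds M_RGB_to_XYZ from the primaries' (x, y) chromaticities and the white point, using the Rec. 709 / sRGB primaries and D65 white from the earlier table as example inputs:

```python
import numpy as np

def rgb_to_xyz_matrix(xy_r, xy_g, xy_b, xy_w):
    """Build M_RGB_to_XYZ from the (x, y) chromaticities of the primaries and white point."""
    def xyz_chromaticity(x, y):
        return np.array([x, y, 1.0 - x - y])   # z = 1 - x - y

    # Columns: chromaticities of the R, G, B primaries
    P = np.stack([xyz_chromaticity(*xy_r),
                  xyz_chromaticity(*xy_g),
                  xyz_chromaticity(*xy_b)], axis=1)

    # White point in XYZ with Y normalized to 1: (w_x / w_y, 1, w_z / w_y)
    wx, wy = xy_w
    W = np.array([wx / wy, 1.0, (1.0 - wx - wy) / wy])

    # Solve P @ s = W for the per-primary sums (r_X + r_Y + r_Z, ...)
    s = np.linalg.solve(P, W)
    return P @ np.diag(s)

# Example: Rec. 709 / sRGB primaries with D65 white
M = rgb_to_xyz_matrix((0.6400, 0.3300), (0.3000, 0.6000),
                      (0.1500, 0.0600), (0.3127, 0.3290))
```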

• Estimating the appearance of a sample under a different illuminant
  – Convert between different white points
• LMS color space
  – Response of the three types of cones in the human eye
  – Long, medium, and short wavelengths
• XYZ to LMS
  – Bradford transformation matrix
• Chromatic adaptation
  – Adaptation matrix

Color mapping (using chromaticities)

• Image 1 to image 2
  – Calculate the transformation matrix from image 1 RGB to XYZ (M_RGB1_to_XYZ1) and the transformation matrix from image 2 RGB to XYZ (M_RGB2_to_XYZ2)
    • White points in XYZ are the same as in RGB
  – If the white points are the same
    • M_RGB1_to_RGB2 = inv(M_RGB2_to_XYZ2) * M_RGB1_to_XYZ1
  – Else (white points are different, include chromatic adaptation)
    • Bradford transformation matrix (M_XYZ_to_LMS)
    • Map white points to LMS and calculate the adaptation matrix (M_LMS1_to_LMS2)
    • Compose the transformation matrices (see the sketch below)
      – M_RGB1_to_RGB2 = inv(M_RGB2_to_XYZ2) * inv(M_XYZ_to_LMS) * M_LMS1_to_LMS2 * M_XYZ_to_LMS * M_RGB1_to_XYZ1
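A minimal NumPy sketch of this composition (not from the slides), using the commonly published Bradford XYZ-to-LMS matrix and a simple diagonal (von Kries style) adaptation between the two white points in LMS:

```python
import numpy as np

# Bradford XYZ -> LMS matrix (commonly published values)
M_XYZ_to_LMS = np.array([
    [ 0.8951,  0.2664, -0.1614],
    [-0.7502,  1.7135,  0.0367],
    [ 0.0389, -0.0685,  1.0296],
])

def rgb1_to_rgb2_matrix(M_RGB1_to_XYZ1, M_RGB2_to_XYZ2, white1_XYZ, white2_XYZ):
    """Compose the image-1-RGB -> image-2-RGB matrix, adding Bradford chromatic
    adaptation when the two white points differ."""
    if np.allclose(white1_XYZ, white2_XYZ):
        return np.linalg.inv(M_RGB2_to_XYZ2) @ M_RGB1_to_XYZ1

    # Map both white points to LMS and build the diagonal adaptation matrix
    lms1 = M_XYZ_to_LMS @ white1_XYZ
    lms2 = M_XYZ_to_LMS @ white2_XYZ
    M_LMS1_to_LMS2 = np.diag(lms2 / lms1)

    return (np.linalg.inv(M_RGB2_to_XYZ2)
            @ np.linalg.inv(M_XYZ_to_LMS)
            @ M_LMS1_to_LMS2
            @ M_XYZ_to_LMS
            @ M_RGB1_to_XYZ1)
```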

Note: luminance Y and luma Y'

• Luminance is calculated from linear RGB
  – Y coordinate of XYZ
• Luma is calculated from nonlinear R'G'B'

• Example: sRGB
  Y  = 0.21263903 * R  + 0.71516871 * G  + 0.072192319 * B
  Y' = 0.21263903 * R' + 0.71516871 * G' + 0.072192319 * B'
  (Coefficients are the middle row of M_RGB_to_XYZ)
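A short sketch (not from the slides) showing that luma computed from nonlinear sRGB values differs from luminance computed from linearized values, reusing the coefficients above and the sRGB transfer function; the pixel value is arbitrary:

```python
import numpy as np

coeffs = np.array([0.21263903, 0.71516871, 0.072192319])

def srgb_to_linear(v):
    v = np.asarray(v, dtype=np.float64)
    return np.where(v <= 0.04045, v / 12.92, ((v + 0.055) / 1.055) ** 2.4)

rgb_nonlinear = np.array([0.8, 0.4, 0.2])            # an arbitrary (R', G', B') pixel
luma      = coeffs @ rgb_nonlinear                   # Y' from nonlinear R'G'B'
luminance = coeffs @ srgb_to_linear(rgb_nonlinear)   # Y from linear RGB
print(luma, luminance)                               # the two values differ
```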

Remember

• Common video and image standards use nonlinear encoding and may use different chromaticities
• Chromaticities define gamuts that contain a percentage of the visible colors
  – Some contain more than others
• Mapping pixel data between these standards requires mapping to the XYZ color space and possibly the LMS color space
• Luminance is different from luma
