
Computer Graphics and Imaging, UC Berkeley CS184/284A, Spring 2017

Depth of Field
(Image from London and Upton)


Lecture 22: Cameras & Lenses III

Depth of Field (from London and Upton)

• Depth of field is the range of object depths that are rendered with acceptable sharpness in an image

Circle of Confusion for Depth of Field

• Set the circle of confusion as the maximum permissible blur spot, measured on the image plane, that will appear sharp under final viewing conditions
• For prints from 35mm film, 0.025mm (on film) is typical
• For digital sensors, 1 pixel is typical (e.g. 1.4 micron for cell phones)
• Larger if intended for viewing at web resolution, or if the lens is poor

Excerpt from [Canon, EF Lens Work III] (the excerpt's Figure-15 relates the ideal focal point to the permissible circle of confusion, Figure-16 shows depth of field vs depth of focus, Figure-17 relates depth of focus to aperture, and Photo-1 shows a hyperfocal-distance example):

Depth of field: the area in front of and behind a focused subject in which the photographed image appears sharp. In other words, the depth of sharpness to the front and rear of the subject where image blur in the focal plane falls within the limits of the permissible circle of confusion. Depth of field varies according to the lens focal length, aperture value and shooting distance, so if these values are known, a rough estimate of the depth of field can be calculated using the following formulas:

Front depth of field = d·F·a² / (f² + d·F·a)
Rear depth of field = d·F·a² / (f² - d·F·a)
(f: focal length, F: F-number, d: minimum circle of confusion diameter, a: subject distance, measured from the first principal point to the subject)

Near point limiting distance = (hyperfocal distance × shooting distance) / (hyperfocal distance + shooting distance)
Far point limiting distance = (hyperfocal distance × shooting distance) / (hyperfocal distance - shooting distance)
(Shooting distance: distance from the focal plane to the subject)

Circle of confusion: since all lenses contain a certain amount of spherical aberration and astigmatism, they cannot perfectly converge rays from a subject point to form a true image point (i.e., an infinitely small dot with zero area). Images are formed from a composite of dots (not points) having a certain area, or size, called "circles of confusion." Since the image becomes less sharp as the size of these dots increases, one way of indicating the quality of a lens is by the smallest dot it can form, its "minimum circle of confusion." The maximum allowable dot size in an image is called the "permissible circle of confusion."

Permissible circle of confusion: the largest circle of confusion that still appears as a "point" in the image. Image sharpness as sensed by the human eye depends on the degree of image enlargement or projection, the viewing distance, and the resolution of human eyesight, so in practical work it is possible to set allowances for producing images which, although actually blurred to a certain degree, still appear sharp to the observer. For 35mm single lens reflex cameras, the permissible circle of confusion is about 1/1000 to 1/1500 of the film diagonal, assuming the image is enlarged to a 5"×7" (12 cm × 16.5 cm) print and viewed from a distance of 25-30 cm / 0.8-1 ft. EF lenses are designed to produce a minimum circle of confusion of 0.035 mm, the value on which calculations for items such as depth of field are based.

In general, depth of field is characterised by the following attributes:
(1) Depth of field is deep at short focal lengths, shallow at long focal lengths.
(2) Depth of field is deep at small apertures, shallow at large apertures.
(3) Depth of field is deep at far shooting distances, shallow at close shooting distances.
(4) Front depth of field is shallower than rear depth of field.

Depth of focus: the area in front of and behind the focal plane in which the image can be photographed as a sharp image. Depth of focus is the same on both sides of the image plane (focal plane), regardless of the lens focal length, and is determined by multiplying the minimum circle of confusion by the F-number. With modern autofocus SLR cameras, focusing is performed by detecting the state of focus in the image plane using a sensor that is optically equivalent (1:1 magnification) to the film plane, and automatically controlling the lens to bring the subject image within the depth of focus.

Hyperfocal distance: using the depth of field principle, as a lens is gradually focused to farther subject distances, a point will eventually be reached where the far limit of the rear depth of field is equivalent to "infinity." The shooting distance at this point, i.e., the closest shooting distance at which "infinity" falls within the depth of field, is called the hyperfocal distance:

Hyperfocal distance = f² / (d·F)

Thus, by presetting the lens to the hyperfocal distance, the depth of field will extend from a distance equal to half the hyperfocal distance to infinity. This method is useful for presetting a large depth of field and taking snapshots without having to worry about adjusting the lens focus, especially when using a wide-angle lens. (For example, when the EF 20mm f/2.8 USM is set to f/16 and the shooting distance is set to the hyperfocal distance of approximately 0.7m/2.3ft, all subjects within a range of approximately 0.4m/1.3ft from the camera to infinity will be in focus.)
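A minimal numeric sketch of these formulas (function names are mine, not Canon's); it reproduces the EF 20mm f/2.8 USM example from the excerpt: at f/16 with a 0.035 mm circle of confusion the hyperfocal distance comes out to about 0.71 m, and focusing there places the near limit at about 0.36 m, i.e. half the hyperfocal distance.

def hyperfocal_distance(f, F, d):
    # Hyperfocal distance = f^2 / (d * F); all lengths in mm
    return f**2 / (d * F)

def front_depth_of_field(f, F, d, a):
    # Front depth of field = d*F*a^2 / (f^2 + d*F*a)
    return d * F * a**2 / (f**2 + d * F * a)

def rear_depth_of_field(f, F, d, a):
    # Rear depth of field = d*F*a^2 / (f^2 - d*F*a); this blows up (goes to
    # infinity) as the subject distance approaches the hyperfocal distance
    return d * F * a**2 / (f**2 - d * F * a)

f, F, d = 20.0, 16.0, 0.035          # EF 20mm at f/16, CoC = 0.035 mm
H = hyperfocal_distance(f, F, d)     # ~714 mm, i.e. ~0.7 m
near_limit = H - front_depth_of_field(f, F, d, a=H)
print(H, near_limit)                 # ~714 mm and ~357 mm (= H/2)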

Depth of Field

Circle of confusion C on the sensor; aperture diameter A = f/N (N is the F-number).

Similar triangles on the image side:
C / A = (d_N - d_S) / d_N = (d_S - d_F) / d_F

Thin lens equation for the far, focus, and near planes:
1/D_F + 1/d_F = 1/f
1/D_S + 1/d_S = 1/f
1/D_N + 1/d_N = 1/f

Solving for the far and near limits of acceptable focus:
D_F = D_S f² / (f² - N C (D_S - f))
D_N = D_S f² / (f² + N C (D_S - f))
DOF = D_F - D_N

DOF Demonstration

http://graphics.stanford.edu/courses/cs178/applets/dof.html
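As a complement to the applet, a minimal numeric sketch of the formulas above (variable and function names are mine). For a 50mm lens at f/1.8 focused at 3 m with a 0.03 mm circle of confusion, the depth of field comes out to roughly 0.4 m:

def dof_limits(D_S, f, N, C):
    """Far and near limits of acceptable focus for focus distance D_S,
    focal length f, f-number N, circle of confusion C (all lengths in mm)."""
    D_F = D_S * f**2 / (f**2 - N * C * (D_S - f))   # beyond the hyperfocal distance this denominator goes negative: far limit is infinity
    D_N = D_S * f**2 / (f**2 + N * C * (D_S - f))
    return D_F, D_N

D_F, D_N = dof_limits(D_S=3000.0, f=50.0, N=1.8, C=0.03)
print(D_F - D_N)   # total depth of field, ~380 mm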

Hyperfocal Distance

The focus distance that maximizes the depth of field (such that infinity is at the limit of acceptable sharpness)

(Diagram: lens focused at the hyperfocal distance H; the depth of field extends from H/2 to infinity)

D_F = D_S f² / (f² - N C (D_S - f))       D_N = D_S f² / (f² + N C (D_S - f))

As D_F → ∞:  D_S = H = f²/(NC) + f,  and  D_N = H/2  (calculation omitted)
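The slide labels the last step "calculation omitted"; a short way to fill it in (my own derivation, using the D_F and D_N expressions above):

\[
D_F \to \infty \;\Longleftrightarrow\; f^2 - NC(D_S - f) = 0 \;\Longleftrightarrow\; D_S = \frac{f^2}{NC} + f \equiv H
\]
\[
\text{and at } D_S = H,\; NC(H - f) = f^2, \text{ so } D_N = \frac{H f^2}{f^2 + NC(H - f)} = \frac{H f^2}{2 f^2} = \frac{H}{2}.
\]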

Ansel Adams, Mount Williamson, Clearing Storm

Other Focus / DOF Situations to Consider

• How does sensor size affect defocus blur and DOF? E.g. consider cell phone vs 35mm format sensors
• For a given lens & f-stop, how does moving closer/further from the subject (and adjusting focus onto the subject) affect defocus / DOF of other objects?
• In 1:1 macro, does focal length affect DOF?
• What is the lens-sensor separation for the hyperfocal condition, for full-resolution viewing vs web-resolution viewing?

If you understand these, you understand lenses!

Bokeh

• Bokeh is the shape and quality of out-of-focus blur
• For small, out-of-focus lights, bokeh takes on the shape of the lens aperture

(Image: M Yashna, flickr, 40mm f/3.0)

Bokeh (image: diyphotography.net)

Heart-shaped bokeh?

Bokeh (Dino Quinzani, Leica Noctilux 50mm, f/0.95)

Why does the bokeh vary across the image?

The Psychological Effect of Shallow Depth of Field

Dr. Joanne Liu, the president of Doctors without Borders, spoke on 10/7/15 in Geneva. (Denis Balibouse/Reuters)
Hillary Clinton spoke during a campaign event at Cornell College in Mount Vernon, Iowa, on 10/7/15. (Scott Morgan/Reuters)
https://www.youtube.com/watch?v=W5cbk0xVnzA

Exposure

• H = T x E
• Exposure = time x irradiance
• Exposure time (T)
  • Controlled by shutter speed (discussed last lecture)
• Irradiance (E)
  • Power of light falling on a unit area of sensor
  • Controlled by f-stop (aperture and focal length)

Exposure Controls in Photography

Aperture size
• Change the f-stop by opening / closing the aperture (if camera has iris control)

Shutter speed
• Change the duration over which the sensor integrates light

ISO gain
• Change the amplification (analog and/or digital) between sensor values and digital image values

Constant Exposure: F-Stop vs Shutter Speed

Example: these pairs of aperture and shutter speed give equivalent exposure

F-Stop 1.4 2.0 2.8 4.0 5.6 8.0 11.0 16.0 22.0 32.0

Shutter 1/500 1/250 1/125 1/60 1/30 1/15 1/8 1/4 1/2 1

If the exposure is too bright/dark, may need to adjust f-stop and/or shutter up/down.
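A quick sketch (mine, not from the lecture) checking that the pairs in the table hold exposure constant: sensor irradiance scales as 1/N², so T/N² should be approximately the same for every column.

f_stops  = [1.4, 2.0, 2.8, 4.0, 5.6, 8.0, 11.0, 16.0, 22.0, 32.0]
shutters = [1/500, 1/250, 1/125, 1/60, 1/30, 1/15, 1/8, 1/4, 1/2, 1]
for N, T in zip(f_stops, shutters):
    # Exposure H = T * E, and E is proportional to 1/N^2, so T / N^2 tracks exposure
    print(f"f/{N:<4}  {T:.4f} s   T/N^2 = {T / N**2:.5f}")
# All pairs give T/N^2 close to 1e-3; the small deviations arise because
# marked f-stops and shutter speeds are rounded to standard values.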

Constant Exposure: Depth of Field vs Motion Blur

(Images: f/4 at 1/125 sec, f/11 at 1/15 sec, f/32 at 1/2 sec)

• Photographers must trade off depth of field and motion blur for moving subjects

Shallow Depth of Field Can Create a Stronger Image

From Peterson, Understanding Exposure. 200mm, f/4, 1/1000 (left) and f/11, 1/125 (right)

Motion Blur Can Help Tell The Story

From Peterson, Understanding Exposure. 1/60, f/5.6, 180mm

Fastest Photography Lens F-Stop?

Leica Noctilux-M 50mm f/0.95 ASPH Lens

Hari Subramanyam, https://www.flickr.com/photos/dementedjesus/

ISO (Gain)

Third variable for exposure
• Film: trade sensitivity for grain
• Digital: trade sensitivity for noise
• Multiply signal before analog-to-digital conversion
• Linear effect (ISO 200 needs half the light that ISO 100 does)

More on this in a later lecture.

ISO Gain vs Noise in Canon T2i (credit: bobatkins.com)

Real Compound Lenses

Recall: Snell's Law of Refraction

η_i sin θ_i = η_t sin θ_t

Recall: Snell's Law of Refraction

(Diagram: incident ray ω_i, surface normal n, transmitted ray ω_t at an interface)

Medium            Index of refraction η*
Vacuum            1.0
Air (sea level)   1.00029
Water (20°C)      1.333
Glass             1.5-1.6
Diamond           2.42

* index of refraction is wavelength dependent (these are averages)

η_i sin θ_i = η_t sin θ_t
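A small sketch applying Snell's law with the indices from the table above (function and variable names are mine, not from the lecture):

import math

def refract_angle(theta_i_deg, eta_i, eta_t):
    """Return the transmitted angle in degrees, or None on total internal reflection."""
    s = eta_i / eta_t * math.sin(math.radians(theta_i_deg))
    if abs(s) > 1.0:
        return None  # total internal reflection
    return math.degrees(math.asin(s))

print(refract_angle(30.0, 1.00029, 1.5))   # air -> glass: bends toward the normal (~19.5 deg)
print(refract_angle(30.0, 1.5, 1.00029))   # glass -> air: bends away from it (~48.6 deg)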

Real Refraction Through A Lens Is Not Ideal – Aberrations

(Figure: spherical aberration, from a text on digital correction of lens aberrations. The surrounding excerpt notes that real lenses combat aberrations by combining many glass elements in computer-optimized designs, from Wollaston's single-meniscus landscape lens through Chevalier's and Dallmeyer's multi-element improvements to modern zoom lenses, and that a light field camera can instead re-sort aberrated rays in software toward where they should ideally have converged.)

Real plano-convex lens (spherical surface shape). Lens does not converge rays to a point anywhere.

Real Lenses vs Ideal Thin Lenses (image: ilovephotography.com)

Real lens:
• Real optical system
• Multiple physical elements in compound design
• Optical aberrations prevent rays from converging perfectly

Ideal thin lens:
• Theoretical abstraction
• Assume all rays refract at a plane & converge to a point
• Quick and intuitive calculation of main imaging effects

Modern Lens Designs Are Highly Complex (image: ilovephotography.com)

Photographic lens cross section

Modern Lens Designs Are Highly Complex (image: ilovehatephoto.com)

4 element mobile phone lens (on 24x36mm sensor)

Modern Lens Designs Are Highly Complex [Apple]

Modern Lens Designs Are Highly Complex (image: Zeiss flickr.com account)

Microscope objective

Example Lens Formula: Double Gauss

Data from W. Smith, Modern Lens Design, p 312

Radius (mm)   Thick (mm)   nd      V-no   Aperture (mm)
  58.950        7.520      1.670   47.1     50.4
 169.660        0.240                       50.4
  38.550        8.050      1.670   47.1     46.0
  81.540        6.550      1.699   30.1     46.0
  25.500       11.410                       36.0
                9.000                       34.2
 -28.990        2.360      1.603   38.0     34.0
  81.540       12.130      1.658   57.3     40.0
 -40.770        0.380                       40.0
 874.130        6.440      1.717   48.0     40.0
 -79.460       72.228                       40.0
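One possible way to carry this prescription in code, e.g. as input to the kind of lens ray tracer sketched later in the lecture. The Surface class and field names are mine; I treat the row with no listed radius as the flat aperture stop (radius 0.0), record air gaps as n = 1.0, and drop the Abbe (V) numbers for brevity.

from dataclasses import dataclass

@dataclass
class Surface:
    radius: float     # signed radius of curvature (mm); 0.0 marks a flat surface / the stop
    thickness: float  # distance along the axis to the next surface (mm)
    n: float          # index of refraction of the medium behind the surface (1.0 = air)
    aperture: float   # clear aperture diameter (mm)

DOUBLE_GAUSS = [
    Surface( 58.950,  7.520, 1.670, 50.4),
    Surface(169.660,  0.240, 1.000, 50.4),
    Surface( 38.550,  8.050, 1.670, 46.0),
    Surface( 81.540,  6.550, 1.699, 46.0),
    Surface( 25.500, 11.410, 1.000, 36.0),
    Surface(  0.000,  9.000, 1.000, 34.2),  # aperture stop
    Surface(-28.990,  2.360, 1.603, 34.0),
    Surface( 81.540, 12.130, 1.658, 40.0),
    Surface(-40.770,  0.380, 1.000, 40.0),
    Surface(874.130,  6.440, 1.717, 40.0),
    Surface(-79.460, 72.228, 1.000, 40.0),  # final gap is the distance to the film plane
]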

Tracing Through Real Lens Designs

200 mm telephoto 35 mm wide-angle

50 mm double-gauss 16 mm fisheye

From Kolb, Mitchell and Hanrahan (1995)

Ray Tracing Through Real Lens Designs

200 mm telephoto

Notice shallow depth of field (out of focus background)

Ray Tracing Through Real Lens Designs

16 mm fisheye

Notice distortion in the corners (straight lines become curved)

Ray Tracing Real Lens Designs

Monte Carlo approach
• At every sensor pixel, compute the integral of rays incident on the pixel area, arriving from all paths through the lens

Algorithm (for a pixel)
• Choose N random positions x' in the pixel
• For each position x', choose a random position x'' on the back element of the lens
• Trace the ray from x' to x'', refracting through the lens elements until it either misses the next element (kill the ray) or exits the lens (path trace through the scene)
• Weight each ray according to the radiometric calculation on the next slide to estimate the irradiance E(x')
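A structural sketch of this per-pixel loop, assuming a hypothetical lens/scene interface: sample_pixel_point, sample_rear_point, trace_through_lens, and scene_radiance are placeholders of mine, not code from the lecture. The per-ray weight follows the cos θ' cos θ'' / ||x'' - x'||² factor from the radiometry on the next slide.

import math

def estimate_pixel_irradiance(n_samples, sample_pixel_point, sample_rear_point,
                              rear_area, trace_through_lens, scene_radiance):
    """Monte Carlo estimate of irradiance E(x') averaged over one pixel.

    sample_pixel_point()     -> random 3D point x' on the pixel (film plane, z axis = optical axis)
    sample_rear_point()      -> random 3D point x'' on the rear lens element
    rear_area                -> area of the sampled rear-element region
    trace_through_lens(a, b) -> exiting ray, or None if the ray is blocked inside the lens
    scene_radiance(ray)      -> radiance L carried back along the exiting ray
    """
    total = 0.0
    for _ in range(n_samples):
        x_film = sample_pixel_point()
        x_rear = sample_rear_point()
        ray = trace_through_lens(x_film, x_rear)
        if ray is None:
            continue                      # killed: the ray missed an interior element (vignetted)
        L = scene_radiance(ray)           # path trace the surviving ray into the scene
        d = [b - a for a, b in zip(x_film, x_rear)]
        r2 = sum(c * c for c in d)
        cos_theta = abs(d[2]) / math.sqrt(r2)   # film plane and rear element assumed parallel
        total += L * cos_theta * cos_theta * rear_area / r2
    return total / n_samples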

Radiometry for Tracing Lens Designs

(Excerpt from Kolb, Mitchell and Hanrahan 1995. Figure 5: to trace a ray from a point through a thick lens, a point on the exit pupil is chosen and the ray through its image is used to sample the scene. Figure 6: geometry for computing the irradiance at a point x' on the film plane from the disk of the back lens element: differential area dA'' at x'', distance r between x' and x'', axial distance Z, angles θ' at the film and θ'' at the disk.)

4 Radiometry and Sampling

In this section we describe how we compute exposure on the film plane.

4.1 Exposure

Sensor response is a function of exposure, the integral of the irradiance at a point x' on the film plane over the time that the shutter is open. If we assume that irradiance is constant over the exposure period, and that exposure time T is fixed,

H(x') = E(x') T    (4)

where E(x') is the irradiance at x', T is the exposure duration, and H(x') is the exposure at x'. This model is a simplification of the exposure process in physical systems, where the exposure at a point is dependent upon the shape and movement of the shutter.

In order to compute E(x'), we integrate the radiance at x' over the solid angle subtended by the exit pupil, which is represented as a disk D, as shown in Figure 6:

E(x') = ∫_{x'' in D} L(x'' → x') cos θ' cos θ'' / ||x'' - x'||²  dA''    (5)

If the film plane is parallel to the disk, this can be rewritten as

E(x') = (1/Z²) ∫_{x'' in D} L(x'' → x') cos⁴ θ'  dA''    (6)

where Z is the axial distance from the film plane to the disk. This formula differs from that described by Cook et al., which assumed each ray has the same weight. It is also important to perform the integral using a disc-shaped exit pupil, rather than a rectangular one; using a rectangular pupil causes the depth of field to be computed incorrectly, since points not in focus will then have rectangular "circles" of confusion on the film plane.

The weighting in the irradiance integral leads to variation in irradiance across the film plane due to the lens system. There are two simple analytical ways to estimate this effect: the cos⁴ law and the differential form factor to a disk.

1. If the exit pupil subtends a small solid angle from x', θ' can be assumed to be constant and equal to the angle between x' and the center of the disk. This allows us to simplify (5) to

E(x') ≈ L (A/Z²) cos⁴ θ'    (7)

where Z is the axial distance from the film plane to the disk and A is the area of the disk. If Z is assumed to be the focal length, (7) can be written

E(x') ≈ L (π/4) (1/n²) cos⁴ θ'    (8)

where n is the f-number of the lens. Equation (7) is the one most often found in optics texts, while (8) appears in many photographic texts. Note that both assume a small solid angle.

2. For larger solid angles, a more accurate way to estimate the variation in irradiance is to compute the differential form factor from a point on the film plane to a disk (equation (9), which may be computed analytically). This correctly accounts for the finite size of the disk and the variation in angle as we integrate over the disk.

In real lens systems these analytical formulas overestimate the exposure. This is due to vignetting, the blocking of light by lens elements other than the aperture stop when a ray passes through the system at a large angle to the axis. Vignetting can be a significant effect in wide-angle lenses and when using a lens at full aperture. Fortunately, the ray tracing algorithm described in the last section accounts for this blockage, and hence computes the exposure correctly.

(Figure 7: irradiance on the film plane resulting from a uniform unit radiance field imaged through the double-Gauss lens at full aperture, as a function of distance from the center of the film; curves compare the standard, form factor, cos⁴, and vignetted computations.)

Connection to Thin Lens Model

Spherical Lens & Paraxial Approximation

(Diagram: a lens with spherical surfaces of radii R1 and R2, surrounding medium of index n_A, lens glass of index n_L; focal length f = ?)

Assume:
• Lens has negligible thickness
• Ray is very close to the axis (paraxial)
• All angles are small, so sin θ ≈ tan θ ≈ θ

What is the focal length?

Spherical Lens & Paraxial Approximation

(Diagram: refraction at the first surface, radius R1: a ray at height h meets the surface with angle of incidence α, refracts to angle θ1 inside the glass, and is deviated by θ2)

(1) n_A sin α = n_L sin θ1    (Snell's Law)
(2) θ2 = α - θ1
(3) ⇒ n_A α ≈ n_L (α - θ2)    (Paraxial approx.)
(4) α ≈ h / R1    (Paraxial approx.)

Spherical Lens & Paraxial Approximation

(Diagram: refraction at the second surface, radius R2: the ray arrives at angle θ2 and height h, meets the surface at incidence angle θ3, refracts to θ4, and exits at angle θ_end toward the axis, crossing it at the focal length f = ?)

(5) n_L sin θ3 = n_A sin θ4    (Snell's Law)

(6) θ3 = θ2 + β    (β is the angle the surface normal makes with the axis at the ray's intersection with the second surface)
(7) θ4 = β + θ_end
(8) ⇒ n_L (θ2 + β) ≈ n_A (β + θ_end)    (Paraxial approx.)
(9) β ≈ -h / R2    (Paraxial approx.; note the sign of R2)
(10) θ_end ≈ h / f    (Paraxial approx.)

Spherical Lens & Paraxial Approximation

(Diagram: the full lens, surface radii R1 and R2, ray height h, surrounding index n_A, lens index n_L, focal length f = ?)

(3, 8) ⇒ (11) θ_end ≈ (n_L/n_A - 1)(α + β)

(subst. 4, 9, 10 into 11) ⇒ (12) h/f = (n_L/n_A - 1)(h/R1 - h/R2)

(13) 1/f = (n_L/n_A - 1)(1/R1 - 1/R2)    (h factors out; applies to all paraxial rays)

Lens Maker's Equation

(Diagram: lens with surface radii R1 and R2, surrounding index n_A, lens index n_L, focal length f = ?)

1/f = (n_L/n_A - 1)(1/R1 - 1/R2)

f, R1 and R2 are signed quantities, positive pointing to the right; radii point from surface to center.
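A one-line check of the equation with the sign convention above (the function is mine; n_A defaults to air). A symmetric biconvex lens with n_L = 1.5 and |R| = 100 mm gives f = +100 mm; flipping the signs of the radii gives the diverging case.

def lensmaker_focal_length(n_L, R1, R2, n_A=1.0):
    # 1/f = (n_L/n_A - 1) * (1/R1 - 1/R2), with signed radii as defined above
    return 1.0 / ((n_L / n_A - 1.0) * (1.0 / R1 - 1.0 / R2))

print(lensmaker_focal_length(1.5, 100.0, -100.0))   # biconvex:  f = +100 mm
print(lensmaker_focal_length(1.5, -100.0, 100.0))   # biconcave: f = -100 mm (diverging)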

Lens Maker's Equation – Convex / Concave

(Diagram: four lens shapes with their signed radii, e.g. R1 > 0 with R2 < 0; R1 < 0 with R2 > 0; R1 < 0 with R2 < 0; R1 > R2 > 0, and the resulting sign of f; the diverging shapes have f < 0)

1/f = (n_L/n_A - 1)(1/R1 - 1/R2)

f, R1 and R2 are signed quantities, positive pointing to the right; radii point from surface to center.

Thick Lens Approximation

Approximate a complex optical system by a "thick lens" in which idealized refraction occurs at two "principal planes", one each for forward and backward propagation. This characterizes the lens by just a few parameters.

[Smith]
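A sketch of the same idea using standard paraxial ray-transfer (ABCD) matrices; this is my illustration, not code from the lecture or from [Smith]. Multiplying the per-surface matrices of a thick biconvex lens collapses the system to an effective focal length and principal-plane offsets, exactly the "few parameters" the approximation keeps.

def mat2(a, b, c, d):
    return ((a, b), (c, d))

def mul(M, N):
    return ((M[0][0] * N[0][0] + M[0][1] * N[1][0], M[0][0] * N[0][1] + M[0][1] * N[1][1]),
            (M[1][0] * N[0][0] + M[1][1] * N[1][0], M[1][0] * N[0][1] + M[1][1] * N[1][1]))

def refraction(n1, n2, R):
    # Paraxial refraction at a spherical surface of radius R (state vector: height, reduced angle)
    return mat2(1.0, 0.0, -(n2 - n1) / R, 1.0)

def translation(d, n):
    # Paraxial propagation a distance d inside a medium of index n
    return mat2(1.0, d / n, 0.0, 1.0)

# Thick biconvex lens in air: n_L = 1.5, R1 = +100 mm, R2 = -100 mm, 10 mm thick
M = mul(refraction(1.5, 1.0, -100.0),
        mul(translation(10.0, 1.5), refraction(1.0, 1.5, 100.0)))

efl = -1.0 / M[1][0]        # effective focal length, ~101.7 mm (thin-lens value: 100 mm)
bfd = -M[0][0] / M[1][0]    # back focal distance, measured from the last surface
rear_pp = bfd - efl         # rear principal plane offset from the last surface (negative: inside the lens)
print(efl, bfd, rear_pp)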

Things to Remember

Effect           Cause
Field of view    Sensor size, focal length
Depth of field   Aperture, focal length, object dist.
Exposure         Aperture, shutter, ISO
Motion blur      Shutter
Grain/noise      ISO

Pinholes and lenses form perspective images
Perspective composition, dolly zoom

Things to Remember

Ideal thin lenses
• Paraxial approximation, lens maker's equation
• Thin lens equation, various applications
• Focusing, defocus blur, depth of field, hyperfocal

Ray-tracing optical designs for real compound lenses
• Monte Carlo ray-tracing of a train of optical elements
• Defocus blur, optical aberrations, bokeh

Acknowledgments

Many thanks to Marc Levoy, Pat Hanrahan, Matt Pharr and Joyce Farrell for presentation resources.

Extra: Auto Focus

Contrast Detection Autofocus

A target object is imaged through the lens onto an image patch on the sensor. The contrast of this image patch is high if the object is in focus, low otherwise. The physical focus of the lens is adjusted until the contrast of the patch is maximized. There are many ways to estimate how in-focus the image patch is: gradient, Sum Modified Laplacian (Nayar), variance, ...

Demo (Levoy, Willet, Adams): https://graphics.stanford.edu/courses/cs178-10/applets/autofocusCD.html
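A tiny sketch (mine) of two of the focus measures listed above, variance and summed squared gradient, applied to a grayscale patch stored as a list of rows; a contrast-detection loop would step the lens focus and keep the position where such a measure peaks.

def variance_measure(patch):
    pixels = [p for row in patch for p in row]
    mean = sum(pixels) / len(pixels)
    return sum((p - mean) ** 2 for p in pixels) / len(pixels)

def gradient_measure(patch):
    h, w = len(patch), len(patch[0])
    total = 0.0
    for y in range(h):
        for x in range(w - 1):
            total += (patch[y][x + 1] - patch[y][x]) ** 2   # horizontal differences
    for y in range(h - 1):
        for x in range(w):
            total += (patch[y + 1][x] - patch[y][x]) ** 2   # vertical differences
    return total

sharp  = [[0, 0, 255, 255]] * 4    # hard edge: high contrast
blurry = [[0, 85, 170, 255]] * 4   # the same edge, blurred: lower measures
print(gradient_measure(sharp), gradient_measure(blurry))
print(variance_measure(sharp), variance_measure(blurry))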

Phase Detection Autofocus

Ray bundles from a target object converge to points at different depths in the camera, depending on the lens focus. In a phase detection AF system, ray bundles passing through different portions of the lens (red and green shown) are brought to focus on separate lenslets with separate AF sensors. Depending on the depth of the focus point, the ray bundles converge to different positions on their respective AF sensors (see the interactive demo). A certain spacing (disparity) between these images corresponds to "in focus."

Demo (Levoy, Willet, Adams): https://graphics.stanford.edu/courses/cs178-10/applets/autofocusPD.html

Phase Detection AF Used in DSLRs

[Canon]

• Distance between the phase-detect images correlates with how far out of focus the target object is (allows "jumping" to the right focus)
• Separate AF units cannot be used with "live view" or video recording

Phase Detection Pixels Embedded in Sensor (image: Canon)

• Modern image sensors have small pixels, and may embed phase detection pixels directly into sensor image arrays
