Modeling and Synthesis of Aperture Effects in Cameras
Computational Aesthetics in Graphics, Visualization, and Imaging (2008)
P. Brown, D. W. Cunningham, V. Interrante, and J. McCormack (Editors)

Douglas Lanman (1,2), Ramesh Raskar (1,3), and Gabriel Taubin (2)
(1) Mitsubishi Electric Research Laboratories, Cambridge, MA (USA)
(2) Brown University, Division of Engineering, Providence, RI (USA)
(3) Massachusetts Institute of Technology, Media Lab, Cambridge, MA (USA)

Abstract

In this paper we describe the capture, analysis, and synthesis of optical vignetting in conventional cameras. We analyze the spatially-varying point spread function (PSF) to accurately model the vignetting for any given focus or aperture setting. In contrast to existing "flat-field" calibration procedures, we propose a simple calibration pattern consisting of a two-dimensional array of point light sources, allowing simultaneous estimation of vignetting correction tables and spatially-varying blur kernels. We demonstrate the accuracy of our model by deblurring images with focus and aperture settings not sampled during calibration. We also introduce the Bokeh Brush: a novel, post-capture method for full-resolution control of the shape of out-of-focus points. This effect is achieved by collecting a small set of images with varying basis aperture shapes. We demonstrate the effectiveness of this approach for a variety of scenes and aperture sets.

Categories and Subject Descriptors (according to ACM CCS): I.3.8 [Computer Graphics]: Applications

1. Introduction

A professional photographer is faced with a seemingly great challenge: how to select the appropriate lens for a given situation at a moment's notice. While there are a variety of heuristics, such as the well-known "sunny f/16" rule, a photographer's skill in this task must be honed by experience. It is one of the goals of computational photography to reduce some of these concerns for both professional and amateur photographers. While previous works have examined methods for refocusing, deblurring, or augmenting conventional images, few have examined the topic of bokeh. In general, a good bokeh is characterized by a subtle blur for out-of-focus points, creating a pleasing separation between foreground and background objects in portrait or macro photography. In this paper we develop a new method to allow post-capture control of lens bokeh for still-life scenes.

To inform our discussion of image bokeh, we present a unified approach to vignetting calibration in conventional cameras. Drawing upon recent work in computer vision and graphics, we propose a simple, yet accurate, vignetting and spatially-varying point spread function model.

1.1. Contributions

The vignetting and spatially-varying point spread function capture, analysis, and synthesis methods introduced in this paper integrate enhancements to a number of prior results in a novel way. The primary contributions include:

i. By exploiting the simple observation that the out-of-focus image of a point light directly gives the point spread function, we show a practical low-cost method to simultaneously estimate the vignetting and the spatially-varying point spread function. (Note that, while straightforward, this method can prove challenging in practice due to the long exposure times required with point sources.)

ii. We introduce the Bokeh Brush: a novel, post-capture method for full-resolution control of the shape of out-of-focus points. This effect is achieved by collecting a small set of images with varying basis aperture shapes. We demonstrate that optimal basis aperture selection is essentially a compression problem, one solution of which is to apply PCA or NMF to training aperture images.
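Contribution (ii) frames basis aperture selection as a compression problem that PCA or NMF can solve. As an illustration of the PCA variant, here is a minimal numpy sketch; the function name, image size, and the random training masks are our illustrative choices, not the paper's implementation (a real training set would contain meaningful aperture shapes):

```python
import numpy as np

def pca_aperture_basis(apertures, k):
    """Return the mean aperture and the top-k PCA basis 'apertures'.

    apertures: (n, h, w) stack of grayscale aperture masks in [0, 1].
    """
    n, h, w = apertures.shape
    X = apertures.reshape(n, h * w).astype(float)
    mean = X.mean(axis=0)
    # Rows of Vt are the principal components of the mean-centered data.
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean.reshape(h, w), Vt[:k].reshape(k, h, w)

# Illustrative stand-in for a set of training aperture shapes.
rng = np.random.default_rng(0)
train = (rng.random((20, 32, 32)) > 0.5).astype(float)
mean_ap, basis = pca_aperture_basis(train, k=4)
print(mean_ap.shape, basis.shape)  # (32, 32) (4, 32, 32)
```

Any desired aperture can then be approximated as the mean plus a weighted sum of the k basis components, which is why a small set of basis exposures suffices.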
(Note that we assume a static scene so that multiple exposures can be obtained with varying aperture shapes.)

This model and calibration procedure should find broad applicability as more researchers begin exploring the topics of vignetting, highlight manipulation, and aesthetics.

© The Eurographics Association 2008.

1.2. Related Work

The topic of vignetting correction can be subsumed within the larger field of radiometric calibration. As described in Litvinov and Schechner [LS05], cameras exhibit three primary types of radiometric non-idealities: (1) spatial non-uniformity due to vignetting, (2) nonlinear radiometric response of the sensor, and (3) temporal variations due to automatic gain control (AGC). Unfortunately, typical consumer-grade cameras do not allow users to precisely control intrinsic camera parameters and settings (e.g., zoom, focal length, and aperture). As a result, laboratory flat-field calibration using a uniform white light area source [Yu04] proves problematic, motivating recent efforts to develop simpler radiometric calibration procedures. Several authors have focused on single-image radiometric calibration, as well as single-image vignetting correction [ZLK06]. In most of these works the motivating application is creating image mosaics, whether using a sequence of still images [GC05, d'A07] or a video sequence [LS05].

Recently, several applications in computer vision and graphics have required high-accuracy estimates of spatially-varying point spread functions. Veeraraghavan et al. [VRA*07] and Levin et al. [LFDF07] considered coded aperture imaging. In those works, a spatially-modulated mask (i.e., an aperture pattern) was placed at the iris plane of a conventional camera. In the former work, a broadband mask enabled post-processing digital refocusing (at full sensor resolution) for layered Lambertian scenes. In the latter work, the authors proposed a similar mask for simultaneously recovering scene depth and high-resolution images. In both cases, the authors proposed specific PSF calibration patterns, including: general scenes under natural image statistics and a planar pattern of random curves, respectively. We also recognize the closely-related work on confocal stereo and variable-aperture photography developed by Hasinoff and Kutulakos [HK06, HK07]. Note that we will discuss their models in more detail in Section 2.1.

Figure 1: Illustration of optical vignetting. From left to right: (a) reference image at f/5.6, (b) reference image at f/1.4, and (inset) illustration of entrance pupil shape as a function of incidence angle and aperture setting [vW07].

2. Modeling Vignetting

Images produced by optical photography tend to exhibit a radial reduction in brightness that increases towards the image periphery. This reduction arises from a combination of factors, including: (1) limitations of the optical design of the camera, (2) the physical properties of light, and (3) particular characteristics of the imaging sensor. In this work we separate these effects using the taxonomy presented by Goldman and Chen [GC05] and Ray [Ray02].

Mechanical vignetting results in radial brightness attenuation due to physical obstructions in front of the lens body. Typical obstructions include lens hoods, filters, and secondary lenses. In contrast to other types of vignetting, mechanical vignetting can completely block light from reaching certain image regions, preventing those areas from being recovered by any correction algorithm.

Optical vignetting occurs in multi-element optical designs. As shown in Figure 1, for a given aperture the clear area will decrease for off-axis viewing angles and can be modeled using the variable cone model described in [AAB96]. Optical vignetting can be reduced by stopping down the lens (i.e., reducing the aperture size), since this will reduce exit pupil variation for large viewing angles.

Natural vignetting causes radial attenuation that, unlike the previous types, does not arise from occlusion of light. Instead, this source of vignetting is due to the physical properties of light and the geometric construction of typical cameras. Typically modeled using the approximate cos⁴(θ) law, where θ is the angle of light leaving the rear of the lens, natural vignetting combines the effects due to the inverse-square fall-off of light, Lambert's law, and the foreshortening of the exit pupil for large incidence angles [KW00, vW07].

Pixel vignetting arises in digital cameras. Similar to mechanical and optical vignetting, pixel vignetting causes a radial falloff of light recorded by a digital sensor (e.g., CMOS) due to the finite depth of the photon well, causing light to be blocked from the detector regions for large incidence angles.

2.1. Geometric Model: Spatially-varying PSF

Recall that a thin lens can be characterized by

    1/f = 1/f_D + 1/D,

where f is the focal length, f_D is the separation between the image and lens planes, and D is the distance to the object plane (see Figure 2). As described by Bae and Durand [BD07], the diameter c of the PSF is given by

    c = (|S - D| / S) * (f^2 / (N (D - f))),

where S is the distance to a given out-of-focus point and N is the f-number. This model predicts that the PSF will scale as a function of the object distance S and the f-number N. As a result, a calibration procedure would need to sample both these parameters to fully characterize the point spread function.
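The blur-diameter formula above is easy to sanity-check numerically. The following sketch is ours, with illustrative values (in millimeters) for a hypothetical 50 mm lens focused at 2 m:

```python
def blur_diameter(f, N, D, S):
    """PSF diameter c = (|S - D| / S) * f^2 / (N (D - f)).

    f: focal length, N: f-number, D: focused object distance,
    S: distance to the out-of-focus point (all lengths in mm).
    """
    return abs(S - D) / S * f**2 / (N * (D - f))

# An out-of-focus point at 4 m, lens wide open at f/1.8:
print(round(blur_diameter(f=50.0, N=1.8, D=2000.0, S=4000.0), 3))  # 0.356
# Stopping down to f/8 shrinks the blur, as the model predicts:
print(round(blur_diameter(f=50.0, N=8.0, D=2000.0, S=4000.0), 3))  # 0.08
```

A point at the focused distance (S = D) gives c = 0, i.e., a perfectly sharp point, as expected.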
However, as noted by Hasinoff and Kutulakos [HK07], the effective blur diameter c̃ is given by the linear relation

    c̃ = (|S - D| / S) * A,

where A is the aperture diameter. Under this approximation, we find that the spatially-varying PSF could potentially be estimated from a single image. In conclusion, we find that the spatially-varying PSF B(s,t;x,y) will scale linearly with the effective blur diameter c̃ such that

    B_c̃(s,t;x,y) = (1/c̃²) B_1(s,t; x/c̃, y/c̃).

Figure 3: Example of piecewise-linear point spread function interpolation. Gray kernels correspond to images of blurred point sources, whereas the red kernel is linearly interpolated from its three nearest neighbors.
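The scaling relation B_c̃ = (1/c̃²) B_1(x/c̃, y/c̃) can be mimicked on a pixel grid. The sketch below is our illustration, not the paper's code; it uses nearest-neighbor resampling and a box kernel in place of a measured PSF, and shows that the 1/c̃² factor preserves the kernel's unit integral:

```python
import numpy as np

def scale_psf(base, c):
    """Rescale a unit-diameter PSF kernel by blur diameter c.

    Nearest-neighbor approximation of B_c(x, y) = (1/c^2) * B_1(x/c, y/c);
    `base` is assumed centered and normalized to unit sum.
    """
    h, w = base.shape
    ys = np.clip((np.arange(int(round(h * c))) / c).astype(int), 0, h - 1)
    xs = np.clip((np.arange(int(round(w * c))) / c).astype(int), 0, w - 1)
    return base[np.ix_(ys, xs)] / c**2

B1 = np.ones((5, 5)) / 25.0   # unit "PSF": a normalized 5x5 box
B3 = scale_psf(B1, 3.0)       # same shape, three times the diameter
print(B3.shape, round(B3.sum(), 6))  # (15, 15) 1.0
```

A measured B_1 would be resampled the same way, giving the predicted PSF at any blur diameter without recalibration.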