
Efficient freeform optimization for computational caustic displays

Gerwin Damberg1 and Wolfgang Heidrich2,1
1 Department of Computer Science, University of British Columbia
2 Visual Computing Center, King Abdullah University of Science and Technology
[email protected]   [email protected]

www.cs.ubc.ca/~gdamberg/

Abstract: Phase-only light modulation shows great promise for many imaging applications, including future projection displays. While images can be formed efficiently by avoiding per-pixel attenuation of light, most projection efforts using phase-only modulators are based on holographic principles, relying on interference of coherent laser light and a Fourier lens. Limitations of this type of approach include scaling to higher power as well as visible artifacts such as speckle and image noise. We propose an alternative approach: operating the spatial phase modulator with broadband illumination by treating it as a programmable freeform lens. We describe a simple optimization approach for generating phase modulation patterns, or freeform surfaces, that, when illuminated by a collimated, broadband light source, project a pre-defined caustic image onto a designated image plane. The optimization procedure is based on a simple geometric image formation model and can be implemented in a computationally efficient manner. We perform simulations and show early experimental results suggesting that an implementation on a phase-only modulator can create structured light fields suitable, for example, for efficient illumination of a spatial light modulator (SLM) within a traditional projector. In an alternative application, the algorithm provides a fast way to compute geometries for static, freeform lens manufacturing.

© 2015 Optical Society of America

OCIS codes: (080.4225) Nonspherical lens design; (120.2040) Displays; (100.3190) Inverse problems; (110.1758) Computational imaging.

References and links
1. L. Lesem, P. Hirsch, and J. Jordan, “The kinoform: a new wavefront reconstruction device,” IBM Journal of Research and Development 13, 150–155 (1969).
2. P. R. Haugen, H. Bartelt, and S. K. Case, “Image formation by multifacet holograms,” Applied Optics 22, 2822–2829 (1983).
3. G. Damberg, H. Seetzen, G. Ward, W. Heidrich, and L. Whitehead, “3.2: High dynamic range projection systems,” in SID Symposium Digest of Technical Papers, vol. 38 (Wiley Online Library, 2007), pp. 4–7.
4. M. Berry, “Oriental magic mirrors and the Laplacian image,” European Journal of Physics 27, 109 (2006).
5. M. Papas, W. Jarosz, W. Jakob, S. Rusinkiewicz, W. Matusik, and T. Weyrich, “Goal-based caustics,” in Computer Graphics Forum, vol. 30 (Wiley Online Library, 2011), pp. 503–511.
6. T. Kiser, M. Eigensatz, M. M. Nguyen, P. Bompas, and M. Pauly, Architectural Caustics: Controlling Light with Geometry (Springer, 2013).
7. Y. Schwartzburg, R. Testuz, A. Tagliasacchi, and M. Pauly, “High-contrast computational caustic design,” ACM Transactions on Graphics (TOG) 33, 74 (2014).
8. Y. Yue, K. Iwasaki, B.-Y. Chen, Y. Dobashi, and T. Nishita, “Pixel art with refracted light by rearrangeable sticks,” in Computer Graphics Forum, vol. 31 (Wiley Online Library, 2012), pp. 575–582.
9. Y. Ohno, “Color rendering and luminous efficacy of white LED spectra,” in Optical Science and Technology, the SPIE 49th Annual Meeting (International Society for Optics and Photonics, 2004), pp. 88–98.

1. Introduction

In this work we propose to use phase-only spatial light modulation combined with broadband illumination for image formation. We achieve this by treating the spatial phase modulator as a programmable freeform lens, and by devising a simple and computationally efficient optimization procedure to derive a lens surface or modulation pattern that forms a caustic representing a predefined target image when illuminated by a collimated, broadband light source. Our research draws from a number of different research fields, including:

Holographic Displays. Early holographic image formation models [1] have been adapted to create digital holograms [2]. Most of the common approaches require coherent light, which has several disadvantages. Coherent light can result in high-resolution artifacts, including screen speckle and diffraction on structures such as the discrete pixel grid of an SLM. On the other hand, using broadband light sources can eliminate screen speckle, and low-frequency modulation patterns can significantly reduce diffraction artifacts. Remaining diffraction is averaged out by the broadband nature of the illumination, resulting in a small amount of blur that can be modeled and compensated for [3].

Freeform Lenses. Recently, there has been strong interest in freeform lens design, both for general lighting applications and for generating images from caustics [4]. In the latter application, we can distinguish between discrete optimization methods that work on a pixelated version of the problem (e.g. [5]) and those that optimize for continuous surfaces without obvious pixel structures (e.g. [6, 7, 8]). The current state of the art [8] defines an optimization problem on the gradients of the lens surface, which then have to be integrated into a height field. In addition to low computational performance, this leads to a tension between satisfying a data term (the target caustic image) and maintaining the integrability of the gradient field.

In our work we derive a simple and efficient formulation in which we optimize directly for the phase function (i.e. the shape of the wavefront in the lens plane) without the need for a subsequent integration step. This is made possible by a new parameterization of the problem that allows us to express the optimization directly in the lens plane rather than the image plane.

2. Freeform Lensing

2.1. Phase Modulation Image Formation

To derive the image formation model for a phase modulator, we consider the geometry shown in Fig. 1: a lens plane and an image plane (screen) are parallel to each other at focal distance f. Collimated light is incident on the lens plane from the normal direction, but a phase modulator in the lens plane distorts the phase of the light, resulting in a curved phase function p(x), which corresponds to a local deflection of the light rays.

Fig. 1. Geometry for the image formation model: phase modulation in the lens plane at focal distance f from the image plane, resulting in curvature of the wavefront (phase function p(x)).

Light deflection. With the paraxial approximation $\sin\varphi \approx \varphi$, we obtain the following equation for the mapping between a point $x$ on the lens plane and $u$ on the image plane:
$$u(x) \approx x + f \cdot \nabla p(x). \tag{1}$$

Intensity modulation. Using the above geometric mapping, we derive the intensity change associated with this distortion as follows. Let $dx$ be a differential area on the lens plane, and let $du = m(x) \cdot dx$ be the differential area of the corresponding region in the image plane, where $m(\cdot)$ is a spatially varying magnification factor. The intensity on the image plane is then given as
$$i(u(x)) = \frac{dx}{du}\, i_0 = \frac{1}{m(x)}\, i_0, \tag{2}$$
where $i_0$ is the intensity of the collimated light incident on the lens plane. In the following we set $i_0 = 1$ for simplicity of notation.


Fig. 2. Intensity change due to distortion of a differential area dx.

The magnification factor $m(\cdot)$ can be expressed in terms of the derivatives of the mapping between the lens and image planes (also compare Fig. 2):
$$m(x) = \left| \frac{\partial}{\partial x} u(x) \times \frac{\partial}{\partial y} u(x) \right| \approx 1 + f \cdot \nabla^2 p(x). \tag{3}$$
This yields the following expression for the intensity distribution in the image plane:
$$i(x + f \cdot \nabla p(x)) = \frac{1}{1 + f \cdot \nabla^2 p(x)}. \tag{4}$$

In other words, the magnification m, and therefore the intensity i(u) on the image plane, can be directly computed from the Laplacian of the scalar phase function in the lens plane.
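To make the forward model concrete, the following Python sketch (our own illustration, not code from the paper) splats rays deflected according to Eq. (1) onto the image plane and thereby approximates the intensity of Eq. (2); the grid resolution, focal distance, and the parabolic test phase are arbitrary assumptions.

```python
# Minimal sketch of the paraxial image formation model of Eqs. (1)-(4):
# rays leaving lens-plane sample x are deflected by f * grad p(x) and
# accumulated on the image plane.
import numpy as np

def render_caustic(p, f, shape=None):
    """Forward-simulate the caustic image i(u) produced by phase function p.

    p     : 2D array, phase (lens height) sampled on the lens plane
    f     : focal distance between lens plane and image plane (pixel units)
    shape : resolution of the image plane (defaults to p.shape)
    """
    h, w = p.shape
    if shape is None:
        shape = (h, w)
    # Eq. (1): u(x) = x + f * grad p(x)  (paraxial deflection)
    gy, gx = np.gradient(p)
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    u = xs + f * gx
    v = ys + f * gy
    # Accumulate unit intensity per lens-plane sample onto the image plane
    # (bilinear splatting); this approximates i(u) = i0 / m(x) of Eq. (2).
    img = np.zeros(shape)
    u0, v0 = np.floor(u).astype(int), np.floor(v).astype(int)
    fu, fv = u - u0, v - v0
    for du, dv, wgt in [(0, 0, (1 - fu) * (1 - fv)), (1, 0, fu * (1 - fv)),
                        (0, 1, (1 - fu) * fv), (1, 1, fu * fv)]:
        uu, vv = u0 + du, v0 + dv
        ok = (uu >= 0) & (uu < shape[1]) & (vv >= 0) & (vv < shape[0])
        np.add.at(img, (vv[ok], uu[ok]), wgt[ok])
    return img

if __name__ == "__main__":
    # Example: a weak parabolic phase acts like a converging lens and
    # concentrates light toward the center of the image plane.
    h, w = 128, 128
    ys, xs = np.meshgrid(np.linspace(-1, 1, h), np.linspace(-1, 1, w),
                         indexing="ij")
    p = -0.5 * (xs**2 + ys**2)            # toy phase function (assumption)
    img = render_caustic(p, f=2000.0)
    print(img.max(), img.sum())           # peak grows; flux nearly conserved
```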

2.2. Optimization Problem

While it is possible to directly turn the image formation model from Equation 4 into an optimization problem, we found that we can achieve better convergence by first linearizing the equation with a first-order Taylor approximation, which yields

$$i(x + f \cdot \nabla p(x)) \approx 1 - f \cdot \nabla^2 p(x), \tag{5}$$
where the left hand side can be interpreted as a warped image $i_p(x) = i(x + f \cdot \nabla p(x))$, for which the target intensity $i(u)$ in the image plane has been warped backwards onto the lens plane using the geometric distortion $u(x)$ produced by a known phase function $p(x)$. With this parameterization, the continuous least-squares optimization problem for determining the desired phase function becomes
$$\hat{p}(x) = \operatorname*{argmin}_{p(x)} \int_x \left( i_p(x) - 1 + f \cdot \nabla^2 p(x) \right)^2 dx. \tag{6}$$
This problem can be solved by iterating between updates to the phase function and updates to the warped image, as shown in Algorithm 1:

Algorithm 1 Freeform lens optimization
  // Initialization
  $i_p^{(0)}(x) = i(u)$
  while not converged do
      // phase update
      $p^{(k)}(x) = \operatorname*{argmin}_{p(x)} \int_x \left( i_p^{(k-1)}(x) - 1 + f \cdot \nabla^2 p(x) \right)^2 dx$
      // image warp
      $i_p^{(k)}(x) = i(x + f \cdot \nabla p^{(k)}(x))$
  end while

After discretization of i(·) and p(·) into pixels, the phase update corresponds to solving a linear least squares problem with a discrete Laplace operator as the system matrix. We can solve this positive semi-definite system using a number of different algorithms, including Conjugate Gradient, BiCGSTAB and Quasi-Minimal Residual (QMR). The image warp corresponds to a texture mapping operation and can be implemented on a GPU. We implement a non-optimized prototype of the algorithm in the Matlab programming environment using QMR as the least squares solver. Table 1 shows runtimes for Algorithm 1 and a selection of artificial and natural test images at different resolutions. It was executed on a single core of a mobile Intel Core i7 clocked at 1.9 GHz with 8 GByte of memory. We note that, due to the continuous nature of the resulting lens surfaces, computing the phase at resolutions as low as 128 × 64 is sufficient for applications such as structured illumination in a projector. We also note that the algorithm could, with slight modifications, be expressed as a convolution in the Fourier domain, which would result in orders of magnitude shorter computation times for single-threaded CPU implementations and even further speed-ups on parallel hardware such as GPUs. With these improvements, computations at, for example, 1920 × 1080 resolution should be possible at video frame rates. In addition, both the contrast of the resulting caustic image and its sharpness (effective resolution) benefit from a higher working resolution.

The progression of this algorithm is depicted in Fig. 3. We show the undistorted target image, from which we optimize an initial phase function. Using this phase function, we update the target image in the lens plane by backward warping the image-plane target. This process increasingly distorts the target image for the modulator plane as the phase function converges. The backward warping step implies a non-convex objective function, but we empirically find that we achieve convergence in only a small number of iterations (5-10).

Fig. 3. Algorithm progression for six iterations: the target i gets progressively distorted by backwards warping onto the lens plane ($i_p^{(k)}$) as the phase function $p^{(k)}$ converges towards a solution. The 3D graphic depicts the final lens height field.
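For concreteness, the sketch below is our own Python re-implementation of the discretized Algorithm 1 (the paper's prototype is Matlab with a QMR solver); the phase update is solved with conjugate gradients on a sparse negated Laplacian, the backward warp uses bilinear interpolation, and boundary handling, solver settings, and iteration counts are our simplifications.

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla
from scipy.ndimage import map_coordinates

def neg_laplacian(h, w):
    """Sparse 5-point stencil for -Laplacian (positive definite here)."""
    def lap1d(n):
        return sp.diags([-1.0, 2.0, -1.0], [-1, 0, 1], shape=(n, n))
    return sp.kron(sp.eye(h), lap1d(w)) + sp.kron(lap1d(h), sp.eye(w))

def optimize_phase(target, f, iters=6, cg_iters=300):
    """Alternate phase update and backward warp (Algorithm 1).

    target : 2D target image i(u), ideally normalized to unit mean (i0 = 1)
    f      : focal distance in pixel units (assumed)
    """
    h, w = target.shape
    A = f * neg_laplacian(h, w)          # system matrix of the phase update
    ip = target.astype(float).copy()     # i_p^(0)(x) = i(u)
    p = np.zeros(h * w)
    ys, xs = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    for _ in range(iters):
        # phase update:  f * (-Laplacian) p  ≈  i_p - 1   (from Eqs. 5/6)
        p, _ = spla.cg(A, (ip - 1.0).ravel(), x0=p, maxiter=cg_iters)
        # image warp:  i_p(x) = i(x + f * grad p(x))   (backward warping)
        gy, gx = np.gradient(p.reshape(h, w))
        ip = map_coordinates(target, [ys + f * gy, xs + f * gx],
                             order=1, mode="nearest")
    return p.reshape(h, w)
```

Negating the Laplacian keeps the linear system positive definite, so conjugate gradients (or QMR, as in the paper's prototype) converges reliably without an explicit integrability constraint.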

Table 1. Run times for various resolution inputs and images (5 iterations of Algorithm 1)

  Image   Resolution   Runtime
  Logo    128 × 64      2.62 s
  Lena    128 × 64      2.14 s
  Wave    128 × 64      1.81 s
  Logo    256 × 128     4.03 s
  Lena    256 × 128     4.75 s
  Wave    256 × 128     3.23 s
  Logo    512 × 256     9.37 s
  Lena    512 × 256    10.22 s
  Wave    512 × 256     5.27 s

3. Simulation Results

We evaluate the performance of our algorithm using two different simulation techniques: a common computer graphics ray tracer, and a wavefront model based on the Huygens principle that simulates diffraction effects at a spectral resolution of 5 nm.

3.1. Ray Tracer Simulation

For the ray tracer simulation we use the LuxRender framework, an unbiased, physically based rendering engine. The setup of the simulation is straightforward: the freeform lens is imported as a mesh, and material properties are set to mimic a physical lens manufactured from acrylic.
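Converting the optimized height field into a triangle mesh for import into a renderer (or a 3D printer) is a simple tessellation step; the following sketch, our own illustration, writes a Wavefront OBJ surface, with the sample spacing, scaling, and file name chosen arbitrarily.

```python
import numpy as np

def heightfield_to_obj(h_field, path, dx=1.0, scale_z=1.0):
    """Write a lens height field as a triangulated Wavefront OBJ surface.

    h_field : 2D array of lens heights h(x) (e.g. from Eq. 9)
    dx      : lateral sample spacing, e.g. in millimetres (assumed)
    scale_z : additional scaling of the height axis (assumed)
    """
    rows, cols = h_field.shape
    with open(path, "w") as f:
        for r in range(rows):
            for c in range(cols):
                f.write(f"v {c * dx} {r * dx} {h_field[r, c] * scale_z}\n")
        idx = lambda r, c: r * cols + c + 1      # OBJ indices are 1-based
        for r in range(rows - 1):
            for c in range(cols - 1):
                f.write(f"f {idx(r, c)} {idx(r, c + 1)} {idx(r + 1, c)}\n")
                f.write(f"f {idx(r, c + 1)} {idx(r + 1, c + 1)} {idx(r + 1, c)}\n")

# Example usage: the back face and side walls of the lens would still need
# to be added in a mesh editor before rendering or printing.
heightfield_to_obj(np.full((64, 128), 2.0), "freeform_lens.obj", dx=0.5)
```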

Fig. 4. LuxRender results of the caustic image. The inset shows the absolute difference to the target image.

A distant spot light provides approximately collimated illumination, and a white surface with Lambertian reflectance properties serves as the screen. The linear, high dynamic range data output from the simulation is tone mapped for display. The results (see Fig. 4) match the target well. Some geometric distortions can be attributed to approximations in the model and the slightly diverging beam of the spot light source.

3.2. Physical Optics Simulation

To analyze possible diffraction effects that cannot be modeled in a ray tracer based on geometric optics principles, we perform a wave optics simulation based on the Huygens principle. We compute a freeform lens surface for a binary test image (see Fig. 5) and illuminate it in simulation with light from a common 3-LED (RGB) white light source (see Fig. 6, dotted line) in 5 nm steps. We integrate over the spectrum using the luminous efficiency of the LED and the spectral sensitivity curves of the CIE color matching functions (see Fig. 6, solid line), as well as a 3×3 transformation matrix and a 2.2 gamma, to map tristimulus values to display/print RGB primaries for each LED die and for the combined white light source (see Fig. 7). As expected, the wavefront simulation reveals chromatic aberrations within the pattern and diffraction off the edge of the modulator, which can be (partially) mitigated, for example, by computing separate lens surfaces for each of R, G and B.
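The spectral-to-RGB mapping described above can be summarized in a few lines; the sketch below is our own illustration under assumptions: the CIE 1931 color matching functions must be supplied from tabulated data, and the sRGB matrix is just one possible choice for the display/print primaries.

```python
import numpy as np

# XYZ -> linear sRGB matrix (one possible choice of display primaries)
XYZ_TO_RGB = np.array([[ 3.2406, -1.5372, -0.4986],
                       [-0.9689,  1.8758,  0.0415],
                       [ 0.0557, -0.2040,  1.0570]])

def spectrum_to_rgb(wavelengths_nm, radiance, cmf_xyz, gamma=2.2):
    """Integrate a sampled spectrum (e.g. 5 nm steps) against the CIE
    color matching functions and map the result to display RGB.

    wavelengths_nm : (N,) sample wavelengths
    radiance       : (N,) simulated spectral radiance at one image point
    cmf_xyz        : (N, 3) CIE 1931 color matching functions at the same
                     wavelengths (loaded from tabulated data)
    """
    d_lambda = np.gradient(wavelengths_nm)               # step size (5 nm here)
    xyz = (radiance[:, None] * cmf_xyz * d_lambda[:, None]).sum(axis=0)
    rgb = XYZ_TO_RGB @ xyz                               # 3x3 transformation
    rgb = np.clip(rgb / max(rgb.max(), 1e-12), 0.0, 1.0) # normalize exposure
    return rgb ** (1.0 / gamma)                          # 2.2 gamma for print
```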

4. Experimental Results

In addition to the simulations, we report early experimental results using the computed freeform lenses both in static form (a physical acrylic lens) and in programmable form (a dynamically addressable phase modulator).

Fig. 5. Binary test pattern (left) and resulting lens height field (right) used in the wave optics simulation.

Fig. 6. Spectra of standard white 3-LED (RGB) [9] (dotted graph) and the CIE standard observer color matching functions (solid graph) used in the wave optics simulation.

Fig. 7. Wave optics simulation for a test lens using standard white 3-LED (RGB) spectra: (a) red LED, (b) green LED, (c) blue LED, (d) combined (RGB) white LED. The simulation was performed at 5 nm intervals and mapped to an RGB color space for print.

Fig. 8. 3D printed refractive lens (left); broadband LED spotlight and rear-projection screen with image (right).

4.1. Static Lenses

For refractive lens surfaces, the phase function p(x) is converted to a geometric model describing the lens shape. We design a lens that is flat on one side and has a freeform height field h(x) on the other side. In the (x,z) plane, the deflection angle φ is related to the incident (θi) and exitant (θo) angles at the height field as follows:

$$\frac{\partial}{\partial x} p(x) \approx \varphi = \theta_o - \theta_i. \tag{7}$$
The analogous relationship holds in the (y,z) plane. In addition, the lens material has a refractive index of n. Using Snell's law, and again the paraxial approximation, we obtain

$$\frac{1}{n} = \frac{\sin\theta_i}{\sin\theta_o} \approx \frac{\theta_i}{\theta_o}. \tag{8}$$

Using Equations 7 and 8, as well as $\theta_i \approx \partial h(x)/\partial x$, we can derive the lens shape as
$$h(x) = h_0 + \frac{1}{n - 1}\, p(x), \tag{9}$$
where $h_0$ is a base thickness for the lens. Figure 8 shows a prototype of a 3D printed lens (42 µm resolution). Improved results and longer focal lengths can be achieved using other fabrication methods [7].
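Converting an optimized phase function into a printable height field following Eq. (9) is a simple rescaling; the snippet below is a minimal sketch, with the refractive index of acrylic and the base thickness chosen as plausible assumptions rather than values from the paper.

```python
import numpy as np

def phase_to_height(p, n=1.49, h0=2.0):
    """Convert a phase function p(x) into a lens height field via Eq. (9).

    p  : 2D array, optimized phase function (same units as the height field)
    n  : refractive index of the lens material (approx. 1.49 for acrylic)
    h0 : base thickness of the lens in the same units (assumed 2.0 mm)
    """
    return h0 + p / (n - 1.0)

# Example: a flat phase produces a plano slab of the base thickness.
assert np.allclose(phase_to_height(np.zeros((4, 4))), 2.0)
```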

4.2. Implementation on Spatial Light Modulators

The phase function p(x) can be implemented directly on a phase-only modulator; in our experiment we use an LCoS-based SLM with a pixel pitch of 8.0 µm and a maximum phase retardation of 2π, the PLUTO SLM by HOLOEYE Photonics AG. Since most high-contrast images, for focal lengths reasonably far away from the modulator, require lens thicknesses of multiple wavelengths, we wrap the phase from Fig. 3 at multiples of 2π, comparable to the grooves of a Fresnel lens (see Fig. 9, left). A broadband, white LED spot light provides collimated light on the reflective phase modulator, and we observe the resulting image on a small Lambertian screen (Fig. 9, right).
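The phase wrapping step itself reduces to a modulo operation followed by quantization to the modulator's addressable levels; the sketch below is our own illustration, and the design wavelength and 8-bit drive assumption are not taken from the paper.

```python
import numpy as np

def wrap_phase_for_slm(p, wavelength_nm=532.0, levels=256):
    """Wrap a continuous phase profile at multiples of 2*pi (Fresnel-style
    grooves) and quantize it to SLM drive levels.

    p             : 2D array, optical path difference in nanometres
    wavelength_nm : design wavelength used for the 2*pi wrap (assumed)
    levels        : number of addressable phase levels (assumed 8-bit)
    """
    phase = 2.0 * np.pi * p / wavelength_nm        # path difference -> radians
    wrapped = np.mod(phase, 2.0 * np.pi)           # wrap at multiples of 2*pi
    return np.round(wrapped / (2.0 * np.pi) * (levels - 1)).astype(np.uint8)
```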

5. Conclusion

We introduce a novel, computationally inexpensive method to compute freeform lenses and propose a new implementation for applications requiring dynamic updates. We validate the approach in simulation and confirm its viability in early experiments.

Fig. 9. Left: wrapped phase function p(x) from Fig. 3. Right: experimental test set-up with phase modulator, a collimated white LED light source, and screen.

6. Acknowledgements Research reported in this publication was supported by MTT Innovation Inc., NSERC, and the King Abdullah University of Science and Technology (KAUST).