CS4340 Digital Special Effects
Semester 2, 2011/2012

Realistic Rendering of Synthetic Objects into Real Scenes

Guest Lecture by Low Kok Lim

School of Computing, National University of Singapore

Goal

- To put synthetic objects (computer-rendered objects) into pictures or video of real scenes such that the results "look right"

- Need to match
  - Scale
  - Camera motion
    - intrinsic & extrinsic camera parameters
  - Illumination

Mystique in X-Men 2 [Frank Vitz, 2003]

Mr. Smith from Matrix Reloaded
(labels: real vs. computer generated)
Taken from http://www.virtualcinematography.org/publications/acrobat/BRDF-s2003.pdf

Match Illumination

- Old (labor-intensive) methods
  - Manually survey positions of light sources, and instantiate similar virtual lights to light the virtual objects
  - Photograph a neutral reference object in the scene, and use it as a guide to manually configure a lighting environment
  - Reflection mapping

- Cannot easily simulate indirect illumination effects between real and virtual objects

Image-Based Lighting (IBL)

- Solves the problem by "faithfully" recording the scene
  - in a high-dynamic-range light probe image

- Use the recorded scene radiance to light the synthetic objects

[Paul Debevec, 2002]

Light probe image

A frame of the short film "Rendering with Natural Light" http://www.debevec.org/RNL/

Light probe image

[Debevec1998]

Overview of IBL Steps

1. Acquire background photographs or video
2. Acquire and assemble the light probe image
3. Construct light-based model
   - Map the light probe to an emissive surface surrounding the scene
4. Identify local scene and model its geometry and reflectance
5. Render the scene as illuminated by the IBL environment
6. Postprocess, tone map and composite the renderings

Detour

- We will come back to the details of the IBL steps later

- Need to first understand
  - High-dynamic-range imaging
    - for faithful recording of scene radiance
  - Global illumination
    - for realistic rendering of synthetic objects and part of the real scene

High Dynamic Range Imaging (HDRI)

Motivation

- Ordinary cameras cannot record the full range of scene radiance in one image
  - typically only 8-11 stops (EV)
- Solution: take multiple images at different exposures (different exposure times) and "combine" them

Multiple exposures -> HDR image -> Tone-mapped image
Images from http://www.cambridgeincolour.com/tutorials/high-dynamic-range.htm

Results

- Combining the multiple exposures, we get
  - the irradiance at each pixel (up to an unknown scale)
    - the HDR image
  - the camera response function
    - the R, G, B channels are generally different
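The "combine" step can be sketched as a per-pixel weighted average of the irradiance estimates from each exposure. This sketch assumes a linear camera response; a real pipeline first recovers the response curve as in [Debevec1997], and the function name and hat weighting here are illustrative, not from the slides.

```python
import math

def merge_exposures(pixel_values, exposure_times):
    """Estimate relative irradiance at one pixel from multiple exposures.

    pixel_values: 8-bit readings (0-255), one per exposure.
    ASSUMES a linear camera response; real cameras need the response
    curve recovered first ([Debevec1997]).
    """
    num = den = 0.0
    for z, dt in zip(pixel_values, exposure_times):
        w = min(z, 255 - z)            # hat weight: distrust clipped pixels
        if w <= 0:
            continue                   # skip under-/over-exposed samples
        num += w * math.log(z / dt)    # log-irradiance estimate from this shot
        den += w
    return math.exp(num / den)         # weighted log-average -> relative irradiance

# One scene point seen at three shutter speeds (reading ~ irradiance * time):
print(merge_exposures([16, 64, 250], [1/250, 1/60, 1/15]))
```

Averaging in the log domain and down-weighting values near 0 and 255 lets the well-exposed shot dominate at each pixel.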

Example

- Exposures from 30 sec to 1/1000 sec, at 1-stop increments

[Debevec1997]

Example

- Response functions of a Fuji 100 ASA negative film

Example

- The HDR image (the false colors show relative radiance values)
  - dynamic range about 25,000:1 (>14 stops)
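A stop is a factor of two in radiance, so the quoted stop count follows directly from the contrast ratio:

```python
import math

contrast = 25000             # dynamic range 25,000:1
stops = math.log2(contrast)  # each stop doubles the radiance
print(round(stops, 1))       # about 14.6 stops, i.e. > 14
```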

[Debevec1997]

Example

Input images

Tone-mapped image
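A tone-mapped result like the one above can be produced by a global tone-mapping operator; here is a sketch of Reinhard's global operator (one common choice; the slides do not say which operator was actually used).

```python
import math

def tone_map(luminances, a=0.18):
    """Reinhard-style global tone mapping: HDR luminance -> [0, 1) range."""
    n = len(luminances)
    # log-average luminance (the "key" of the scene); 1e-6 guards log(0)
    log_avg = math.exp(sum(math.log(1e-6 + L) for L in luminances) / n)
    out = []
    for L in luminances:
        Lm = a * L / log_avg           # scale the scene to the target key a
        out.append(Lm / (1.0 + Lm))    # compress: large values approach 1
    return out

hdr = [0.01, 1.0, 100.0, 10000.0]      # 6 orders of magnitude of luminance
print(tone_map(hdr))                   # all values now displayable in [0, 1)
```

Per-channel color handling and local (spatially varying) operators are omitted from this sketch.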

Images from http://en.wikipedia.org/wiki/Tone_mapping

Application of HDRI

- Recovery of surface BRDF

- Image processing and photography
  - adjusting exposure after image acquisition

Images from http://en.wikipedia.org/wiki/High_dynamic_range_image

Application of HDRI

- Blurring (e.g. simulating out-of-focus)

- Motion blur

Images from http://en.wikipedia.org/wiki/High_dynamic_range_image

Application of HDR Images

- More realistic rendering
  - HDR rendering supported in hardware

Images from http://en.wikipedia.org/wiki/High_dynamic_range_rendering

HDRI References

- Books
  - High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting by Erik Reinhard, Greg Ward, Sumanta Pattanaik, and Paul Debevec, 2005
- Tools
  - HDR Shop: http://gl.ict.usc.edu/HDRShop/
  - Photoshop CS2: http://www.adobe.com/products/photoshop/
  - Photomatix: http://www.hdrsoft.com/
- HDR Image Formats
  - ILM OpenEXR (.exr): http://www.openexr.com/
  - RADIANCE RGBE (.hdr or .rgbe): http://radsite.lbl.gov/radiance/
- Papers
  - [Debevec1997] Paul Debevec et al., "Recovering High Dynamic Range Radiance Maps from Photographs," SIGGRAPH '97
  - [Mitsunaga1999] Tomoo Mitsunaga et al., "Radiometric Self Calibration," CVPR '99

Global Illumination

Global Illumination

- Evaluating the light reflected from a point x by taking into consideration all illumination that arrives at the point

Figure by Frédo Durand, MIT

The Rendering Equation

- Mathematical formulation of global illumination

  L_ref(x, ω_ref) = L_e(x, ω_ref) + ∫_Ω f_r(x, ω_in, ω_ref) L_in(x, ω_in) cos θ_in dω_in

  (integrate over the hemisphere Ω around x)

The Rendering Equation

- Cannot be evaluated analytically
  - in practice, send tons of random rays (Monte Carlo methods)
- It is recursive
  - to evaluate L_ref(x, ω_ref), we need to evaluate L_in(x', ω_in), and so on
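The "send tons of random rays" idea is just a Monte Carlo estimate of the hemisphere integral. In this sketch, `incoming_radiance` and `brdf` are hypothetical stand-ins for ray casting and the material model (they are not named in the slides), and the surface normal is fixed at +z for simplicity.

```python
import math, random

def reflected_radiance(x, w_ref, incoming_radiance, brdf, n_samples=64):
    """One-bounce Monte Carlo estimate of the rendering equation at x:
    average f_r * L_in * cos(theta) / pdf over random hemisphere directions."""
    total = 0.0
    for _ in range(n_samples):
        z = random.random()                        # uniform hemisphere sample
        phi = 2.0 * math.pi * random.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        w_in = (r * math.cos(phi), r * math.sin(phi), z)
        cos_theta = z                              # w_in . n, with n = (0, 0, 1)
        pdf = 1.0 / (2.0 * math.pi)                # uniform hemisphere density
        total += brdf(x, w_in, w_ref) * incoming_radiance(x, w_in) * cos_theta / pdf
    return total / n_samples

# Sanity check: constant L_in = 1 on a Lambertian surface (f_r = albedo/pi)
# should integrate to exactly the albedo, here 0.8.
random.seed(0)
est = reflected_radiance(None, None,
                         lambda x, w: 1.0,
                         lambda x, wi, wo: 0.8 / math.pi,
                         n_samples=20000)
print(est)   # close to 0.8
```

In a full renderer, `incoming_radiance` itself traces a ray and recursively invokes this estimator; that recursion is the "and so on" in the bullet above.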

Some Lighting Effects [Henrik Jensen]

Caustics caused by focusing of light; color bleeding caused by diffuse-to-diffuse interactions

Global Illumination Algorithms

- Ray-tracing approach
  - Whitted ray tracing [Whitted1980]
  - Distributed ray tracing [Cook1984]
  - Path tracing [Kajiya1986]
  - Two-pass ray tracing [Arvo1986]
  - Photon mapping [Jensen1995]*
    - *not a complete GI algorithm
- Finite-element approach
  - Radiosity [Goral1984]

Path Tracing

- For each pixel, shoot multiple random primary rays
- At each intersection, only one secondary ray is shot
  - the secondary ray can be in any direction, not just sampled from the specular lobe
- Each primary ray from the eye and its subsequent secondary rays form a light path
- The ray tree has a branching factor of one
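These bullets can be sketched in a toy "furnace" scene in which every surface emits and reflects the same amount, so the exact answer is known. The scene, its constants, and the Russian-roulette termination are illustrative assumptions, not from the slides.

```python
import random

EMITTED, ALBEDO = 0.5, 0.5

def radiance(depth, max_depth=16):
    """Radiance carried by one light path. Exactly one secondary ray per
    bounce (branching factor one); Russian roulette with continuation
    probability ALBEDO keeps the estimate unbiased without branching."""
    if depth >= max_depth:
        return 0.0
    if random.random() >= ALBEDO:          # path absorbed: only emission
        return EMITTED
    return EMITTED + radiance(depth + 1)   # emission + one more bounce

def render_pixel(paths_per_pixel):
    """Average several random primary rays for one pixel."""
    return sum(radiance(0) for _ in range(paths_per_pixel)) / paths_per_pixel

random.seed(1)
# Exact answer: EMITTED * (1 + ALBEDO + ALBEDO^2 + ...) = 0.5 / (1 - 0.5) = 1.0
print(render_pixel(10))       # few paths: noisy (cf. the 10 paths/pixel image)
print(render_pixel(100000))   # many paths: converges towards 1.0
```

The gap between the 10-path and 100,000-path estimates is exactly the "very high computational cost" trade-off discussed on the next slide.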

Path Tracing

- Simulates complete global illumination
  - but at very high computational cost
- Indirect illumination, such as caustics, exhibits high variance

10 paths / pixel

[Henrik Jensen]

Radiosity

- Implements only diffuse-diffuse interactions
- Scene is discretized into patches, and interactions between patches are considered
- Global illumination solution is computed by solving a set of linear equations
- Solution is view-independent and consists of a constant radiosity (W/m2) for every patch in the scene
  - once the solution is computed, it can be viewed from any viewpoint
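The "set of linear equations" is B_i = E_i + ρ_i Σ_j F_ij B_j (radiosity = emission + reflected gathered radiosity). A minimal sketch with hypothetical two-patch numbers, solved by Jacobi iteration:

```python
def solve_radiosity(emission, reflectance, form_factors, iterations=100):
    """Jacobi iteration for B_i = E_i + rho_i * sum_j F_ij * B_j.
    form_factors[i][j] = F_ij, the fraction of energy leaving patch i
    that reaches patch j."""
    n = len(emission)
    B = list(emission)                  # initial guess: emission only
    for _ in range(iterations):
        B = [emission[i] + reflectance[i] *
             sum(form_factors[i][j] * B[j] for j in range(n))
             for i in range(n)]
    return B

# Two facing patches: patch 0 is a light source, patch 1 is lit indirectly.
E   = [1.0, 0.0]                        # emission
rho = [0.5, 0.5]                        # diffuse reflectance
F   = [[0.0, 0.2],                      # hypothetical form factors
       [0.2, 0.0]]
print(solve_radiosity(E, rho, F))       # one constant radiosity per patch
```

The result is one value per patch with no camera anywhere in the system, which is why a radiosity solution can be re-viewed from any viewpoint without re-solving.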

Radiosity Images

The Cornell Box

[Cornell University Program of Computer Graphics]

Global Illumination References

- Books
  - Advanced Global Illumination, Second Edition by Philip Dutré, Kavita Bala, Philippe Bekaert, 2006
  - Physically Based Rendering: From Theory to Implementation by Matt Pharr & Greg Humphreys, 2004
  - Realistic Ray Tracing, 2nd Edition by Peter Shirley & R. Keith Morley, 2003
  - Realistic Image Synthesis Using Photon Mapping by Henrik Wann Jensen, 2001
  - Principles of Digital Image Synthesis by Andrew S. Glassner, 1995
  - Radiosity and Realistic Image Synthesis by Michael F. Cohen & John R. Wallace, 1993

Global Illumination References

- Non-Commercial Renderers
  - YafRay: http://www.yafray.org/
  - RADIANCE: http://radsite.lbl.gov/radiance/
  - PBRT (Physically-Based Raytracer): http://www.pbrt.org/
  - POV-Ray: http://www.povray.org/ (v3.6 does not support HDR IBL)
  - MegaPOV: http://megapov.inetart.net/
  - Indigo Renderer: http://www.indigorenderer.com/
- Commercial Renderers
  - Mental Ray: http://www.mentalimages.com/
  - Pixar's RenderMan: https://renderman.pixar.com/
  - Maxwell Renderer: http://www.maxwellrender.com/

Global Illumination References

- Papers
  - Rendering equation [Kajiya1986]: J. T. Kajiya, "The Rendering Equation," SIGGRAPH '86
  - Whitted ray tracing [Whitted1980]: T. Whitted, "An Improved Illumination Model for Shaded Display," Comm. ACM, 23(6):343-349, 1980
  - Distributed ray tracing [Cook1984]: R. Cook et al., "Distributed Ray Tracing," SIGGRAPH '84
  - Radiosity [Goral1984]: C. Goral et al., "Modeling the Interaction of Light Between Diffuse Surfaces," SIGGRAPH '84
  - Path tracing [Kajiya1986]
  - Two-pass ray tracing [Arvo1986]: J. Arvo, "Backwards Ray Tracing," Developments in Ray Tracing, SIGGRAPH '86 Course Notes #12
  - Photon mapping [Jensen1995]: H. W. Jensen et al., "Photon Maps in Bidirectional Monte Carlo Ray Tracing of Complex Objects," Computers & Graphics 19(2):215-224, 1995

Image-Based Lighting (IBL)

Overview of IBL Steps

1. Acquire background photographs or video
2. Acquire and assemble the light probe image
3. Construct light-based model
   - Map the light probe to an emissive surface surrounding the scene
4. Identify local scene and model its geometry and reflectance
5. Render the scene as illuminated by the IBL environment
6. Postprocess, tone map and composite the renderings

Example

- Use this example to demonstrate the IBL steps

Background photo

Synthetic objects

[Debevec1998]

1. Acquire Background Photograph

2. Acquire Light Probe Image (HDR)

The pattern is for camera calibration.

Light probe image

3. Construct Light-Based Model

Need to have an approximate 3D model of the environment
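Mapping the probe onto the surrounding emissive surface requires converting each surface direction into probe-image coordinates. A sketch for the "angular map" light-probe format (a common format for Debevec's probes; the +z forward-axis convention here is an assumption):

```python
import math

def direction_to_angular_uv(d):
    """Unit world direction -> (u, v) in [-1, 1]^2 in the angular-map
    format: distance from the image center is proportional to the angle
    between the direction and the forward axis (+z here)."""
    dx, dy, dz = d
    theta = math.acos(max(-1.0, min(1.0, dz)))   # angle from forward axis
    radius = theta / math.pi                     # 0 at center, 1 at the rim
    s = math.hypot(dx, dy)
    if s == 0.0:
        return (0.0, 0.0)                        # straight ahead (degenerate)
    return (radius * dx / s, radius * dy / s)

print(direction_to_angular_uv((0.0, 0.0, 1.0)))  # forward -> image center
print(direction_to_angular_uv((1.0, 0.0, 0.0)))  # sideways -> half radius
```

During rendering, the renderer runs this mapping in reverse: for each ray that escapes to the environment surface, it looks up the probe pixel for that direction and uses its HDR radiance as emission.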

Separation of Scene

4. Identify Local Scene

- Model its geometry and estimate its reflectance

- References
  - Yizhou Yu et al., "Inverse Global Illumination: Recovering Reflectance Models of Real Scenes from Photographs," SIGGRAPH '99
  - Paul Debevec et al., "Estimating Surface Reflectance Properties of a Complex Scene Under Natural Illumination," ACM Transactions on Graphics, 2005

5. Render Local and Synthetic Scene

- Using the light-based model as lighting

6. Compositing

- When the estimate of local scene reflectance is accurate

L_final = α · L_local+synthetic + (1 − α) · L_background

(α is the alpha mask: 1 where the rendered local-plus-synthetic scene covers the pixel)

L_local+synthetic (Tone-mapped) | L_background | L_final

6. Compositing using Differential Rendering

- When the estimate of local scene reflectance is not accurate

L_final = α · L_local+synthetic + (1 − α) · (L_background + L_local+synthetic − L_local)
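Both compositing formulas can be sketched per pixel. The symbols follow the slide equations; the pixel arrays and α mask below are hypothetical examples.

```python
def composite(alpha, L_ls, L_bg, L_local=None):
    """IBL compositing per pixel.
    L_local is None  -> L_final = a*L_ls + (1-a)*L_bg
    L_local given    -> differential rendering:
                        L_final = a*L_ls + (1-a)*(L_bg + L_ls - L_local),
    i.e. the background photo is modified only by the CHANGE the synthetic
    objects cause (shadows, reflections), tolerating reflectance errors."""
    out = []
    for i, a in enumerate(alpha):
        if L_local is None:
            out.append(a * L_ls[i] + (1 - a) * L_bg[i])
        else:
            out.append(a * L_ls[i] + (1 - a) * (L_bg[i] + L_ls[i] - L_local[i]))
    return out

alpha   = [1.0, 0.0]          # pixel 0: synthetic object; pixel 1: background
L_ls    = [0.7, 0.35]         # rendered local + synthetic scene
L_local = [0.7, 0.40]         # rendered local scene alone
L_bg    = [0.8, 0.50]         # background photograph
# Pixel 1 keeps the photo but is darkened by the rendered shadow (-0.05):
print(composite(alpha, L_ls, L_bg, L_local))
```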

L_local+synthetic (Tone-mapped) | L_local (Tone-mapped) | L_final

Other Examples

Other Examples

IBL References

- Books and Notes
  - High Dynamic Range Imaging: Acquisition, Display, and Image-Based Lighting by Erik Reinhard, Greg Ward, Sumanta Pattanaik, and Paul Debevec, 2005
  - HDRI and Image-Based Lighting by Paul Debevec et al., SIGGRAPH 2003 Course #19, http://www.debevec.org/IBL2003/
- Software Tools
  - HDR Shop: http://gl.ict.usc.edu/HDRShop/
- Renderers
  - As listed in the global illumination references
- Papers
  - [Debevec1998] Paul Debevec, "Rendering Synthetic Objects into Real Scenes: Bridging Traditional and Image-Based Graphics with Global Illumination and High Dynamic Range Photography," SIGGRAPH '98

Semi-Automatic Approach

Semi-Automatic Approach

- From a single LDR photo, semi-automatically estimate
  - geometry
  - camera parameters
  - surface properties
  - lighting info

System Overview

Input image -> Scene authoring -> Scene synthesis -> Object insertion

System Overview

Scene authoring

Manual input

- Bounding geometry, supporting geometry: spatial layout [Hedau et al. '09]
- Occluding geometry: spectral matting [Levin et al. '09]
- Light sources: manual input

Area lights

- Bounding cuboid
- Textured billboard (with transparency)
- Extruded polygon

System Overview

Scene synthesis

Scene Synthesis

Physical scene model (area lights, bounding cuboid, textured billboard, extruded polygon) -> Rendered scene

Auto-material estimation & auto-lighting refinement: match the input image and the rendered scene

Material Estimation

Input + geometry -> Retinex-like direct decomposition -> Reflectance

Lighting Estimation

Input image -> Physical model: lights, geometry w/ materials

Lighting Estimation

Input image | Rendered (initial) | Rendered (final)

Lighting Estimation

Result using initial lights | Result using refined lights

External Lighting

External Lighting

Source bounding box | Shaft bounding box

External Lighting

Shadow matting via [Guo et al. '11]

Shaft direction

System Overview

Object insertion

Inserting Objects

- Load scene into 3D modeler
- Insert objects, animations
- Render with any physically based renderer

Final Composite

Additive differential technique [Debevec ‘98]

Results

References

- Kevin Karsch, Varsha Hedau, David Forsyth, Derek Hoiem, "Rendering Synthetic Objects into Legacy Photographs," SIGGRAPH Asia 2011
  - http://kevinkarsch.com/publications/sa11.html

The End
