CS 371 Project 4: Photon Mapping

Figure 1: Sponza scene with indirect illumination rendered by photon mapping.

1 Introduction

1.1 Overview

Photon mapping [1995; 1996] is a popular global illumination algorithm with attractive mathematical properties, especially for simulating caustics. It is widely used in film and game production; for example, it provided lighting for Halo 3 [Chen and Liu 2008] and Alice in Wonderland [Martinez 2010]. Henrik Wann Jensen, the primary inventor of photon mapping, received an Academy Award (“Oscar”) in 2004 for his algorithmic contributions to film rendering.

A common problem in global illumination is that most light transport paths traced backward from the eye never reach a light source, and most paths traced forward from the light sources never reach the eye. Photon mapping is a Monte Carlo algorithm for tracing both kinds of light paths half-way and then loosely connecting them to form full transport paths. It can capture all real-world illumination phenomena, and it is mathematically consistent, meaning that its radiance estimate converges to the true solution as the number of path samples increases.

1.2 Educational Goals

In this project you will implement a photon mapping framework that can render a photorealistic image of any scene for which you can provide a suitable BRDF and geometric model. You will work from resources in the computer science literature, including Jensen's paper [1996] and his SIGGRAPH course notes. Along the way you will:

• Learn to read a computer science paper and implement an algorithm from it.
• Design your own structure for a complex mathematical program.
• Learn to create and perform experimental analysis without explicit guidance.
• Solve the kind of radiometry problems we've seen in lecture on your own.
• Gain experience with Monte Carlo importance sampling, an essential technique for graphics and other high-performance computing domains including computational biology, finance, and nuclear science.
• Gain experience with the hash grid spatial data structure, which offers expected O(1) insert and remove, and gather time linear in the size of the query radius and output.
• Use the trait and iterator software design patterns employed for statically-typed polymorphic data structures in C++.

1.3 Schedule

This is a challenging, pair-programming project that builds on the previous ray tracer project. The program will only add about 100 lines to your existing ray tracer. However, they are mathematically sophisticated lines, and this week you will take responsibility for the program design and experiment design yourself instead of having them given to you in the handout.

Note that you have three extra days and one extra lab session compared to most projects because the project spans Fall reading period. If you plan to take a four-day weekend, I recommend completely implementing and debugging the forward trace before you leave. As a reference, my implementation contained 9 files, 420 statements, and 300 comment lines as reported by iCompile. I spent substantially longer debugging this program than I did for the other projects this semester.

Out: Tuesday, October 5
Checkpoint 1 (Sec. 3.2): Thursday, October 7, 1:00 pm
Checkpoint 2 (Sec. 3.3): Thursday, October 14, 1:00 pm
Due: Friday, October 15, 11:00 pm
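For orientation, the two passes described in Section 1.1 (a forward trace from the lights and a backward trace from the eye) can be outlined as below. This is only a minimal structural sketch under assumed names: buildPhotonMap, renderImage, and the stub types are hypothetical placeholders rather than a required design, and Sections 3.1 and 7 define what each step must actually do.

    // Structural sketch only; every name below is a hypothetical placeholder.
    #include <vector>

    struct Vec3   { float x = 0, y = 0, z = 0; };
    struct Photon { Vec3 position, incidentDirection, power; };  // fields per Section 3.1

    using PhotonMap = std::vector<Photon>;   // stand-in for G3D::PointHashGrid

    // Pass 1 (forward trace): emit photons from the light sources, follow them
    // through the scene, and store a photon in the map at each diffuse bounce.
    void buildPhotonMap(PhotonMap& map, int numEmittedPhotons) {
        for (int i = 0; i < numEmittedPhotons; ++i) {
            // choose a light, sample an emission direction, scatter through the
            // scene, and deposit photons -- the substance of the forward trace
        }
    }

    // Pass 2 (backward trace): trace eye rays exactly as in the previous project,
    // then shade each hit point with explicit direct illumination plus an indirect
    // term gathered from the photon map (the "radiance estimate").
    void renderImage(const PhotonMap& map, int width, int height) {
        for (int y = 0; y < height; ++y) {
            for (int x = 0; x < width; ++x) {
                // image(x, y) = directIllumination(...) + photonGather(map, ...)
            }
        }
    }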
2 Rules/Honor Code

You are encouraged to talk to other students and share strategies and programming techniques. You should not look at any other group's code for this project or use code from other external sources except for: Jensen's publications, the pbrt library, the RTR3 and FCG textbooks, materials presented in lecture, and the G3D library (as limited below). You may look at and use anyone's code from last week's project, with their permission.

During this project, you may use any part of the G3D library and look at all of its source code, including sample programs, with the exception of SuperBSDF::scatter, which you may not invoke or read the source for.

You may share data files and can collaborate with other groups to create test and visually impressive scenes. If you share a visually impressive scene with another group, ensure that you use different camera angles or make other modifications to distinguish your image of it.

On this project you may use any development style that you choose. You are not required to use pure side-by-side pair programming, although you may still find that an effective and enjoyable way of working.

3 Specification

1. Implement the specific variant of the photon mapping algorithm [Jensen 1996] described in Section 3.1.

2. Create a graphical user interface backed by functionality for the following (a sketch of the corresponding RenderSettings fields appears after this list):

   (a) Selecting resolution
   (b) Real-time preview with camera control
   (c) Rendering the view currently observed in the real-time preview
   (d) Setting RenderSettings::numEmittedPhotons
   (e) Setting RenderSettings::maxForwardBounces and RenderSettings::maxBackwardBounces
   (f) Enabling explicit direct illumination (vs. handling direct illumination via the photon map)
   (g) Enabling shadow rays when direct illumination is enabled
   (h) Enabling real-time visualization of the stored photons over the wireframe (see Section 6.1)
   (i) Setting the photon gather radius r, RenderSettings::photonRadius
   (j) Displaying the number of triangles in the scene
   (k) Displaying the forward and backward trace times

3. Document the interfaces of your source code using Doxygen formatting.

4. Devise and render one visually impressive scene of your own creation, as described in Section 3.4.

5. Devise and render as many custom scenes as needed to demonstrate correctness or explore errors and performance, as described in Section 3.4.

6. Produce the reports described in Sections 3.2-3.4 as a cumulative Doxygen mainpage.
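The identifiers RenderSettings::numEmittedPhotons, maxForwardBounces, maxBackwardBounces, and photonRadius are named by the specification above; the sketch below simply collects them in one class so the GUI has something to bind to. The resolution fields, the boolean toggles, their names, and the default values are illustrative assumptions, not mandated by the specification.

    // Sketch of the render settings behind the GUI in requirement 2.  The four
    // identifiers taken verbatim from the specification are the only required
    // names; everything else here is an illustrative assumption.
    class RenderSettings {
    public:
        int   width               = 640;      // (a) resolution -- assumed fields
        int   height              = 480;
        int   numEmittedPhotons   = 100000;   // (d) photons launched in the forward trace
        int   maxForwardBounces   = 5;        // (e) per-photon bounce limit
        int   maxBackwardBounces  = 1;        // (e) per-eye-ray bounce limit
        float photonRadius        = 0.1f;     // (i) gather radius r, in scene units
        bool  explicitDirectIllumination = true;   // (f) assumed name
        bool  shadowRays                 = true;   // (g) assumed name
        bool  visualizePhotons           = false;  // (h) assumed name
    };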
3.1 Details

The photon mapping algorithm was introduced by Jensen and Christensen [1995] and refined by Jensen [1996]. Use those primary sources as your guide to the algorithm. Additional information about the algorithm and implementation is available in Jensen's book [2001] and course notes [2007]. The notes are almost identical to the book and are available online. Section 7 of this project document summarizes mathematical aspects covered in class that are ambiguous in the original papers.

Make the following changes relative to the algorithm described in the 1996 paper. These increase the number of photons needed for convergence by a constant factor but greatly simplify the implementation. The algorithm remains mathematically consistent.

1. Do not implement:

   (a) The projection map; it is a useful optimization for irregular scenes but is hard to implement.
   (b) The illumination maps described in the 1995 paper. Pure photon maps are today the preferred implementation [Jensen 1996].
   (c) The caustic map; just store a single global photon map and do not distinguish between path types.
   (d) Shadow photons; they are an important optimization for area light sources but are unnecessary for point emitters.

2. Use G3D::PointHashGrid instead of a kd-tree to implement the photon map data structure. This gives superior performance [Ma and McCool 2002].

3. Ignore the 1995 implementation of the photon class. Instead use a variant of the 1996 implementation: store position, incident direction, and power for each photon. Both papers recommend compressing photons; use a natural floating-point representation instead of a compressed one. Do not store surface normals in the photons.

4. Use a constant-radius gather sphere (as described on page 7 of the 1996 paper). When you have that working, you may optionally implement the ellipsoid gathering method or the k-nearest-neighbor method and compare performance and image quality.

5. Implement the algorithm that Jensen calls "direct visualization of the photon map"¹ rather than final gathering. This means that you should not make multiple ray casts from a point when estimating indirect illumination, but instead use the radiance estimate from the photon map directly. Beware that the description of the trace from the eye sounds more complicated than it actually is. You're just going to extend your existing ray tracer by adding indirect light to the direct illumination at every point.

¹ Jensen refers to the kind of gathering you're implementing as "visualizing the photon map." This document calls that simply "gathering" and the "radiance estimate," and uses "visualization" to refer to debugging tools.

3.2 Checkpoint 1 Report (due Thursday, Oct. 7, 1:00 pm)

1. Implement the Photon class (it need not have any methods!); a minimal sketch appears at the end of this section.

2. Give pseudo-code for the entire photon mapping algorithm in your Doxygen mainpage.

   Tip: Parts of the photon mapping algorithm will be presented in lecture the Wednesday after the project starts; however, you can begin now because the handout and papers contain all of that information and more.

   (a) Almost all of the information that you need is in this handout, but it is distributed so that you have to think and read instead of copying it. Be sure to read everything in here.
   (b) Use a hierarchical list, with the four major steps at the top level and the details inside. Your full pseudo-code, including some LaTeX for the equations, should be around 40 lines long.
   (c) I recommend using either nested HTML lists (<ol><li>...</ol>) or simple preformatted HTML (<pre>...</pre>) inside a Doxygen \htmlonly ... \endhtmlonly block.
   (d) The details you need are in the implementation section (7), Jensen's notes [2007], and McGuire and Luebke's appendix [2009].
   (e) Use actual (and anticipated actual) names of RayTracer methods and members in the code so that they will be linked correctly when you really implement them.
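As a concrete starting point for item 1 above, here is a minimal sketch, under stated assumptions, of the uncompressed photon record from Section 3.1 (item 3) together with the constant-radius radiance estimate from items 4 and 5. It gathers by brute force over a std::vector so that it is self-contained; in the real program the loop would be a sphere gather against your hash grid, and the brdf callable and all function names here are hypothetical.

    #include <functional>
    #include <vector>

    struct Vec3  { float x = 0, y = 0, z = 0; };
    struct Color { float r = 0, g = 0, b = 0; };

    inline Vec3  operator-(const Vec3& a, const Vec3& b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
    inline float dot(const Vec3& a, const Vec3& b)       { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // Uncompressed photon record (Section 3.1, item 3): position, incident
    // direction, and power.  No surface normal and no methods are required.
    struct Photon {
        Vec3  position;            // world-space hit point
        Vec3  incidentDirection;   // omega_p; pick a sign convention and keep it consistent
        Color power;               // Phi_p, one value per color band
    };

    // Constant-radius radiance estimate (Section 3.1, items 4-5):
    //   L(x, wo) ~= sum over photons p with |x_p - x| <= r of
    //               f(x, omega_p, wo) * Phi_p / (pi * r^2)
    Color estimateIndirectRadiance(const std::vector<Photon>& photonMap,
                                   const Vec3& x, const Vec3& wo, float r,
                                   const std::function<Color(const Vec3&, const Vec3&)>& brdf) {
        Color sum;
        for (const Photon& p : photonMap) {
            const Vec3 d = p.position - x;
            if (dot(d, d) <= r * r) {                 // photon lies inside the gather sphere
                const Color f = brdf(p.incidentDirection, wo);
                sum.r += f.r * p.power.r;
                sum.g += f.g * p.power.g;
                sum.b += f.b * p.power.b;
            }
        }
        const float area = 3.14159265f * r * r;       // gather disc area, pi r^2
        return Color{sum.r / area, sum.g / area, sum.b / area};
    }

The division by pi r^2 converts the gathered power into a radiance estimate by treating the photons inside the sphere as if they arrived on a disc of area pi r^2 around x, as in the radiance estimate of the 1996 paper.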