Real-Time High Dynamic Range Image-Based Lighting

César Palomo
Department of Computer Science, PUC-Rio, Rio de Janeiro, Brazil

ABSTRACT

In this work we present a real-time method for lighting virtual objects using measured scene radiance. To compute the illumination of CG objects inserted in a virtual environment, the method uses previously generated high dynamic range (HDR) image-based lighting information of the scene and performs all calculations in graphics hardware to achieve real-time performance. This work is not intended to be a breakthrough in real-time processing or in HDR image-based lighting (IBL) of CG objects, but we still expect to provide the reader with up-to-date knowledge of the IBL technique, an in-depth understanding of the HDR rendering (HDRR) pipeline in current graphics hardware, and interesting image post-effects which deliver higher realism for the final viewer of the scene.

Keywords

Image-based lighting, High Dynamic Range, Real-time Rendering, Shading, Graphics Hardware Programming

1. INTRODUCTION

We describe the use of a previously captured HDR map with lighting information of a scene to illuminate computer-generated virtual objects as if they were seamlessly placed in the captured scene. All computations are performed in graphics hardware, by means of extensive use of vertex and fragment shaders, to enable real-time interaction and display. We provide effects such as reflection, refraction, the Fresnel effect and chromatic dispersion [29] as options for the lighting equation. Furthermore, post-effects such as bloom [12] are produced to give the final scene a more realistic appearance. We make the simplification of not performing inter-reflection among the computer-generated objects in order to achieve a high frame rate.

Video games in particular can benefit from this technique, since it creates more realistic scenes than standard lighting and has allowed developers and artists to produce incredible effects in games like Half-Life 2: Lost Coast (Valve Software®), Oblivion (Bethesda Softworks®) and Far Cry (Ubisoft®), to make the list short.

The rest of this paper is organized as follows. In the next section we discuss related work relevant to this paper. Section 3 provides background on the image-based lighting technique and on how the lighting calculations are done, since this is the core of our method of lighting the virtual objects. Section 4 presents the concepts involved in HDR rendering: acquiring HDR images, storage, rendering, tone mapping and commonly used post-effects. Section 5 describes in depth the general method used in this work, joining the use of graphics hardware with the IBL and HDRR techniques described in the previous sections to produce real-time, realistic scenes. Section 6 presents the results obtained, and we conclude in Section 7.

2. RELATED WORK

Debevec [5] uses high dynamic range images of incident illumination to render synthetic objects into real-world scenes. However, that work employed non-interactive global illumination rendering to perform the lighting calculations. Significant work has since been done to approximate these full illumination calculations in real time using graphics hardware. As a particular example, [17] and [18] use multi-pass rendering methods to simulate arbitrary surface reflectance properties. In this work we use a multi-pass rendering method to perform all the illumination and post-effect calculations needed to render the final scene at a highly interactive frame rate, similar to Kawase's 2003 work [19].

3. IMAGE-BASED LIGHTING TECHNIQUE

Image-based lighting can be summarized as the use of real-world images of a scene to build a model representing a surrounding surface, and the later use of this model's lighting characteristics to correctly illuminate subjects added to the 3D scene. From this description we can identify two main decisions which need to be made when using IBL: how to represent the surrounding scene as a model, and how to perform the illumination calculations for the subjects added to the scene. Subsection 3.1 discusses the main kinds of environment maps commonly used with IBL and makes a brief comparison among them, while subsection 3.2 lists the lighting effects used in this work.

3.1 Environment Mapping Techniques

In short, environment mapping (EM) simulates objects reflecting their surroundings. This technique assumes that an object's environment is infinitely far from the object, and that there is no self-reflection. If those assumptions hold, the environment surrounding the subject can be encoded in an omnidirectional image known as an environment map. The method was introduced by Blinn and Newell [3].

All EM methods start with a ray from the viewer to a point on the reflector. This ray is then reflected or refracted with respect to the normal at that point, and the resulting direction is used as an index into an image containing the environment, to determine the color of the point on the surface.

3.1.1 Spherical Environment Mapping

The early method described in 1976 by Blinn and Newell [3] is known as spherical environment mapping. For each environment-mapped pixel, the reflection vector is transformed into spherical coordinates, which in turn are mapped to the range [0, 1] and used as (u, v) coordinates to access the environment texture. Despite being easy to implement, this technique has several disadvantages, as described in [1], such as view-point dependency, distortions at the environment map's poles and the required computational time.

3.1.2 Cubic Environment Mapping

Ten years later, in 1986, Greene [8] introduced the EM technique which is by far the most popular method implemented in modern graphics hardware, due to its speed and flexibility. The cubic environment map is obtained by taking six projected faces of the scene that surrounds the object. The cube shape allows for linear mapping in all directions to six planar texture maps. For that reason, the resulting reflection does not undergo the warping or the damaging singularities associated with a sphere map, particularly at the edges of the reflection.

For its characteristics of being view-independent, not presenting singularities and being commonly implemented in current graphics hardware, the cube map has been our choice in this work.

3.2 Lighting calculations

Having decided how to represent the model of the surrounding scene, the illumination model for the IBL technique still needs to be chosen. Below we briefly present the physical basis for the selected illumination models available in this work. A deeper explanation can be found in [29].

3.2.1 Reflection

When an incident vector I from the viewer's position reaches the object's surface at a point P, the reflection vector R is calculated taking into account the normal N at point P and the incident angle θI, as depicted in Figure 3.2.1. This vector R is used to access the cube map texture at the correct face. If we assume that the object is a perfect reflector, such as a mirror, the vector R can be computed in terms of the vectors I and N with Equation 1:

R = I − 2N(N · I)    (1)
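As a minimal sketch, the mirror-like lookup can be written as the following GLSL fragment shader; the uniform and varying names are illustrative rather than taken from our implementation, and the built-in function reflect computes exactly Equation 1:

    // Sketch of reflective environment mapping (Equation 1).
    // All names here are illustrative.
    uniform samplerCube envMap; // cubic environment map of the scene
    varying vec3 normal;        // surface normal N at point P
    varying vec3 incident;      // incident vector I, from the eye to P

    void main()
    {
        vec3 N = normalize(normal);
        vec3 I = normalize(incident);
        // R = I - 2N(N.I); equivalently, the built-in reflect(I, N).
        vec3 R = I - 2.0 * N * dot(N, I);
        gl_FragColor = textureCube(envMap, R);
    }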
3.2.2 Refraction

When light passes through a boundary between two materials of different density (air and glass, for instance), the light's direction changes, since light travels more slowly in denser materials. Snell's law describes what happens at this boundary with Equation 2, where η1 and η2 are the refraction indices of media 1 and 2, respectively:

η1 sin θI = η2 sin θT    (2)

Basically, we use the built-in GLSL function refract to compute the vector T used to look up the environment map. The vector T is calculated in terms of the vectors I and N and the ratio of the indices of refraction, η1/η2.
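Under the same assumptions and naming as the previous sketch, the refractive lookup reduces to a single call to refract with the ratio η1/η2:

    // Sketch of refractive environment mapping (Equation 2).
    uniform samplerCube envMap;
    uniform float etaRatio;     // eta1/eta2; about 0.67 for air into glass
    varying vec3 normal;
    varying vec3 incident;

    void main()
    {
        vec3 N = normalize(normal);
        vec3 I = normalize(incident);
        // refract applies Snell's law and returns vec3(0.0)
        // in the case of total internal reflection.
        vec3 T = refract(I, N, etaRatio);
        gl_FragColor = textureCube(envMap, T);
    }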
3.2.3 Fresnel effect

In real scenes, when light hits a boundary between two materials, some light reflects off the surface and some refracts through the surface. The Fresnel equations describe this phenomenon precisely, but since they are complex, it is common to use the simplified version depicted in Equation 3. In this case, both the reflection and the refraction vectors are calculated and used to look up the environment map, and the resulting color at the incident point is calculated as shown in Equation 4:

reflCoef = max(0, min(1, bias + scale · (1 + I · N)^power))    (3)

finalColor = reflCoef · reflCol + (1 − reflCoef) · refracCol    (4)
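Equations 3 and 4 map almost directly to shader code. In the sketch below, bias, scale and power are application-tuned uniforms (the equations do not fix their values), and GLSL's built-in mix implements the blend of Equation 4:

    // Sketch of the simplified Fresnel approximation (Equations 3 and 4).
    uniform samplerCube envMap;
    uniform float etaRatio;           // eta1/eta2
    uniform float bias, scale, power; // empirical Fresnel parameters
    varying vec3 normal;
    varying vec3 incident;

    void main()
    {
        vec3 N = normalize(normal);
        vec3 I = normalize(incident);
        vec4 reflCol   = textureCube(envMap, reflect(I, N));
        vec4 refracCol = textureCube(envMap, refract(I, N, etaRatio));
        // Equation 3: approximate reflection coefficient, clamped to [0, 1].
        float reflCoef = clamp(bias + scale * pow(1.0 + dot(I, N), power),
                               0.0, 1.0);
        // Equation 4: reflCoef*reflCol + (1 - reflCoef)*refracCol.
        gl_FragColor = mix(refracCol, reflCol, reflCoef);
    }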
3.2.4 Chromatic dispersion

The assumption that refraction depends only on the surface normal, the incident angle and the ratio of the indices of refraction is in fact a simplification of what happens in reality. In addition to the mentioned factors, the wavelength of the incident light also affects the refraction. This phenomenon is known as chromatic dispersion, and it is what happens when white light enters a prism and emerges as a rainbow. Figure 3.2.4 illustrates chromatic dispersion conceptually: the incident illumination (assumed to be white) is split into several refracted rays.
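A common shader-level approximation of this effect, sketched below under the assumption of one refraction ratio per color channel (the ratio values shown are illustrative), performs three refracted lookups and keeps only the matching channel of each:

    // Sketch of chromatic dispersion: one transmitted vector per channel.
    uniform samplerCube envMap;
    uniform vec3 etaRatioRGB;   // per-channel eta1/eta2, e.g. (0.65, 0.67, 0.69)
    varying vec3 normal;
    varying vec3 incident;

    void main()
    {
        vec3 N = normalize(normal);
        vec3 I = normalize(incident);
        // Refract once per wavelength band and sample the environment.
        float r = textureCube(envMap, refract(I, N, etaRatioRGB.r)).r;
        float g = textureCube(envMap, refract(I, N, etaRatioRGB.g)).g;
        float b = textureCube(envMap, refract(I, N, etaRatioRGB.b)).b;
        gl_FragColor = vec4(r, g, b, 1.0);
    }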
4. HIGH DYNAMIC RANGE RENDERING

Since conventional monitors have limitations of contrast and dynamic range, with common contrast ratios between 500:1 and 3000:1, and HDR images have pixel values which far exceed those limitations, we need to use a method called tone mapping to display those images on conventional monitors. Tone mapping operators are simply functions which map values from [0, ∞) to [0, 1). Several operators have been developed and give different results, as can be seen in [7], [20], [22], [23] and [2]. Some of them are simple and efficient enough to be used in real time.
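As one concrete example of such a simple operator, the sketch below scales the HDR color by an exposure factor and applies the global mapping x/(1 + x), which takes any non-negative value into [0, 1); the operator choice and the names are illustrative assumptions, not a prescribed pipeline:

    // Sketch of a simple global tone mapping pass over an HDR render target.
    uniform sampler2D hdrScene; // floating-point texture holding the HDR scene
    uniform float exposure;     // user-controlled exposure factor
    varying vec2 texCoord;

    void main()
    {
        vec3 hdr = texture2D(hdrScene, texCoord).rgb * exposure;
        vec3 ldr = hdr / (1.0 + hdr); // maps [0, infinity) into [0, 1)
        gl_FragColor = vec4(ldr, 1.0);
    }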
