Shiny Pixels and Beyond: Real-Time Raytracing at SEED


Towards effortless photorealism through real-time raytracing
Tomasz Stachowiak, SEED – Electronic Arts

"PICA PICA"
▪ Exploratory mini-game & world
▪ For our self-learning AI agents to play, not for humans ☺
  ▪ "Imitation Learning with Concurrent Actions in 3D Games" [Harmer 2018]
▪ Uses SEED's Halcyon R&D engine
▪ Goals
  ▪ Explore hybrid rendering with DXR
  ▪ Clean and consistent visuals
  ▪ Procedural worlds [Opara 2018]
  ▪ No precomputation

Why use raytracing?
(Comparison images: path tracing with ~15 seconds of accumulation, path tracing at 1 sample per pixel, real-time hybrid with raytracing, no raytracing)

Is this a fair comparison?
▪ "Classic" game solutions exist
  ▪ Local reflection volumes
  ▪ Planar reflections
  ▪ Lightmaps
  ▪ "Sky visibility" maps
  ▪ …

Zero artist overhead
1. Enable raytracing
2. Ship it

Making it believable

Quality at the core
▪ Embrace physically-based rendering
  ▪ Ensure correct attribute ranges
  ▪ Applicable to stylized rendering too
▪ Always compare to ground truth
  ▪ Know when to cut corners
▪ Focus on good workflows
  ▪ Iteration yields quality

Materials
▪ Multiple microfacet layers
  ▪ Inspired by [Weidlich 2007]
▪ Rapidly try different looks
▪ Energy conserving
▪ Automatic Fresnel between layers
▪ All lighting & rendering modes
  ▪ Raster, path-traced reference, hybrid
(Images: objects with multi-layered materials)

Light transport
▪ Light-matter interaction creates varying phenomena
  ▪ Useful to model separately
  ▪ Different optimization strategies
  ▪ Varying difficulty
▪ Correct by default
  ▪ Consistency is key
  ▪ Avoid intensity hacks [Ritchell 2011]

Shadows
▪ Not just for grounding objects
▪ Greatly enhance micro-detail
  ▪ Contact hardening
▪ Softness needed to convey scale
  ▪ Hard shadows are visually busy

Post-processing
▪ It's part of light transport
▪ Replace hacks with physically-based effects
▪ Try to match a real lens

Art direction
▪ Even a tech demo needs good art
  ▪ The power of the Utah Teapot is finite
▪ Tech and content should evolve together
▪ Rendering is not a solved problem
  ▪ Raytracing helps, yet edge cases remain
  ▪ Play to the strengths of both
▪ Anastasia Opara's talk on Friday
  ▪ 11:20, Room 313+314

Making it fast

Back to school
▪ Hardware is finally getting there
  ▪ Still a lot of work left for us
▪ Use rays sparingly
  ▪ Trace only where necessary
  ▪ Reconstruct, denoise
▪ Redundancy in spatial and temporal domains
▪ Adaptive techniques
  ▪ Tune ray count to importance
▪ Lots of research ahead!
(Images: "Juggler" - Eric Graham, 1987; Turner Whitted, 1980)
Hybrid Rendering Pipeline
▪ Deferred shading (raster)
▪ Direct shadows (raytrace or raster)
▪ Direct lighting (compute)
▪ Reflections (raytrace or compute)
▪ Global illumination (raytrace)
▪ Ambient occlusion (raytrace or compute)
▪ Transparency & translucency (raytrace)
▪ Post processing (compute)

Raytraced Reflections
▪ Launch rays from the G-buffer
▪ Raytrace at half resolution
  ▪ ¼ ray/pixel for reflection
  ▪ ¼ ray/pixel for reflected shadow
▪ Reconstruct at full resolution
  ▪ Arbitrary normals
  ▪ Spatially-varying roughness

Reflection Pipeline
▪ Importance sampling -> screen-space reflection -> raytracing -> envmap gap fill -> spatial reconstruction -> temporal accumulation -> bilateral cleanup

Reflection Sampling
▪ Microfacet importance sampling
  ▪ Normal distribution function (NDF) part of the BRDF
▪ Very few rays, so quality matters
  ▪ Low-discrepancy Halton sequence
  ▪ Cranley-Patterson rotation per pixel [PBRT]
▪ Re-roll if the ray falls below the horizon
▪ (A sketch of this sampling scheme follows after the Reflection Raygen section.)

Layered Material Sampling
▪ One ray for the whole stack
▪ Stochastically select a layer
  ▪ Based on a layer visibility estimate
  ▪ Bottom layers are occluded by top layers
  ▪ View-dependent due to Fresnel; approximate with Schlick
▪ Sample the BRDF of the selected layer
▪ Multiple layers can generate the same direction
  ▪ Query each layer about the generated direction
  ▪ Add up probabilities
▪ (See the layer-selection sketch after the Reflection Raygen section.)

Reflection Raygen
▪ Read the generated ray
▪ TraceRay()
  ▪ Same hit shaders as the path-tracing reference
▪ Output packed in 2x RGBA16F
  ▪ Color R | Color G | Color B | G-buffer depth
  ▪ Dir X * Hit T | Dir Y * Hit T | Dir Z * Hit T | Inverse PDF
(Image: raw raytrace output)
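
The Reflection Sampling section above pairs a low-discrepancy Halton sequence with a per-pixel Cranley-Patterson rotation and re-rolls rays that fall below the horizon. The following is a minimal Python sketch of that scheme, assuming a standard GGX half-vector inversion; the function names (halton, sample_reflection_dir), the retry budget, and the fallback direction are illustrative assumptions, not SEED's shader code.

    # Halton + Cranley-Patterson rotation feeding GGX NDF importance sampling,
    # with a re-roll when the reflected ray falls below the horizon.
    import math
    import random

    def halton(index, base):
        """Radical-inverse Halton sample in [0, 1)."""
        result, f = 0.0, 1.0 / base
        while index > 0:
            result += f * (index % base)
            index //= base
            f /= base
        return result

    def cranley_patterson(u, offset):
        """Per-pixel rotation: shift the sample and wrap back into [0, 1)."""
        return (u + offset) % 1.0

    def sample_ggx_half_vector(u1, u2, roughness):
        """Importance-sample a GGX NDF around +Z (the surface normal)."""
        a = roughness * roughness
        cos_theta = math.sqrt((1.0 - u1) / (1.0 + (a * a - 1.0) * u1))
        sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
        phi = 2.0 * math.pi * u2
        return (sin_theta * math.cos(phi), sin_theta * math.sin(phi), cos_theta)

    def reflect(v, h):
        """Reflect direction v about half-vector h (both unit length)."""
        d = sum(vi * hi for vi, hi in zip(v, h))
        return tuple(2.0 * d * hi - vi for vi, hi in zip(v, h))

    def sample_reflection_dir(view_ts, roughness, frame_index, pixel_offsets,
                              max_retries=8):
        """One reflection direction in tangent space (normal = +Z)."""
        for attempt in range(max_retries):
            i = frame_index * max_retries + attempt + 1
            u1 = cranley_patterson(halton(i, 2), pixel_offsets[0])
            u2 = cranley_patterson(halton(i, 3), pixel_offsets[1])
            h = sample_ggx_half_vector(u1, u2, roughness)
            r = reflect(view_ts, h)
            if r[2] > 0.0:          # keep only rays above the surface horizon
                return r
        return (0.0, 0.0, 1.0)      # fallback: along the normal

    # Usage: one ray for a pixel, with a random per-pixel rotation offset.
    rng = random.Random(42)
    offsets = (rng.random(), rng.random())
    print(sample_reflection_dir((0.3, 0.1, 0.95), roughness=0.4,
                                frame_index=0, pixel_offsets=offsets))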
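
The Layered Material Sampling section picks one layer per ray using a view-dependent visibility estimate and then sums the PDFs of every layer that could have produced the sampled direction. Below is a small Python sketch of that selection logic, using Schlick Fresnel as the visibility approximation; the toy lobes, f0 values and helper names are placeholders, not the engine's material model.

    # Stochastic layer selection: a top layer reflects roughly F of the energy,
    # so the layers beneath see (1 - F); selection probabilities follow from
    # that, and the final PDF sums the per-layer PDFs for the chosen direction.
    import random

    def schlick_fresnel(f0, cos_theta):
        return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

    def layer_visibilities(layers, cos_theta):
        """Estimate how much energy each layer reflects toward the viewer."""
        weights, transmitted = [], 1.0
        for layer in layers:
            f = schlick_fresnel(layer["f0"], cos_theta)
            weights.append(transmitted * f)   # energy this layer reflects
            transmitted *= (1.0 - f)          # energy passed to layers below
        total = sum(weights) or 1.0
        return [w / total for w in weights]   # selection probabilities

    def sample_layered(layers, cos_theta, rng):
        """Pick one layer, sample its lobe, sum PDFs over all layers."""
        probs = layer_visibilities(layers, cos_theta)
        u, cdf, chosen = rng.random(), 0.0, len(layers) - 1
        for i, p in enumerate(probs):
            cdf += p
            if u <= cdf:
                chosen = i
                break
        direction = layers[chosen]["sample"](rng)
        # Multiple layers can generate the same direction: query each layer's
        # PDF for it and accumulate, weighted by its selection probability.
        pdf = sum(p * layer["pdf"](direction) for p, layer in zip(probs, layers))
        return direction, pdf

    # Usage with two toy lobes that both return the mirror direction.
    rng = random.Random(1)
    mirror = (0.0, 0.0, 1.0)
    layers = [
        {"f0": 0.04, "sample": lambda r: mirror, "pdf": lambda d: 1.0},  # coating
        {"f0": 0.90, "sample": lambda r: mirror, "pdf": lambda d: 1.0},  # base
    ]
    print(sample_layered(layers, cos_theta=0.7, rng=rng))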
Why Stochastic?
▪ Gives the right answer, but
  ▪ Produces noise
  ▪ Thrashes caches
▪ Alternative: post-filter
  ▪ Needs an arbitrarily large kernel
  ▪ Introduces bleeding
  ▪ Sensitive to aliasing
▪ Meet in the middle?
  ▪ Bias samplers
  ▪ Combine with spatial filtering
  ▪ High variance -> increase bias
▪ Image Space Gathering [Robison 2009]

Spatial Reconstruction
▪ Based on Stochastic Screen-Space Reflections [Stachowiak 2015]
▪ For every full-res pixel, use 16 half-res ray hits
  ▪ Poisson-disk distribution around the pixel
▪ Scale by the local BRDF, divide by the ray PDF
▪ Ratio estimator [Heitz 2018]:

    result = 0.0
    weightSum = 0.0
    for pixel in neighborhood:
        weight = localBrdf(pixel.hit) / pixel.hitPdf
        result += color(pixel.hit) * weight
        weightSum += weight
    result /= weightSum

Bilateral Cleanup
▪ Follows spatial reconstruction and temporal accumulation
▪ Secondary bilateral filter
  ▪ Much dumber than the reconstruction
  ▪ Introduces blur
▪ Only run where variance is high
  ▪ Variance from spatial reconstruction
  ▪ Temporally smoothed with hysteresis = 0.5
▪ Variable kernel width and sample count
▪ Followed by temporal anti-aliasing
▪ (A variance-gating sketch follows after the reprojection sections below.)
(Images: raw raytrace output, +spatial reconstruction, +temporal accumulation, +bilateral cleanup, variance estimate, +temporal anti-aliasing)

Temporal Reprojection
▪ Two sources of velocity
  ▪ Per-pixel motion vector
  ▪ Reprojection of the hit point
▪ Neither reprojection method is robust on its own

Dual-Source Reprojection
▪ Sample history using both reprojection strategies
▪ Estimate local statistics during spatial reconstruction
  ▪ RGB mean and standard deviation of the used hit samples
▪ Weigh contributions
  ▪ dist = (rgb – rgb_mean) / rgb_dev
  ▪ w = exp2(-10 * luma(dist))

Reprojection Clamp
▪ Build a color box from the local statistics
  ▪ rgb_min = rgb_mean – rgb_dev
  ▪ rgb_max = rgb_mean + rgb_dev
▪ Clamp reprojected values to the box
  ▪ rgb = clamp(rgb, rgb_min, rgb_max)
▪ Same idea as in temporal anti-aliasing [Karis 2014]
▪ Weigh clamped contributions
  ▪ rgb_sum += val * w
▪ Introduces bias
▪ Fixes most ghosting
▪ (A combined reprojection/clamp sketch follows below.)
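
A combined Python sketch of the Dual-Source Reprojection weighting and the Reprojection Clamp above: both history samples are clamped to the mean ± deviation box and weighted with w = exp2(-10 * luma(dist)). The absolute value in the distance, the luma coefficients and the resolve_history helper are assumptions for illustration, not the Halcyon implementation.

    # Two reprojected histories (surface motion vector vs. reflected hit point)
    # are clamped to local statistics and blended by how well each one fits.

    def luma(rgb):
        r, g, b = rgb
        return 0.299 * r + 0.587 * g + 0.114 * b

    def history_weight(rgb, rgb_mean, rgb_dev):
        # dist = (rgb - rgb_mean) / rgb_dev;  w = exp2(-10 * luma(dist))
        dist = tuple(abs(c - m) / max(d, 1e-4)
                     for c, m, d in zip(rgb, rgb_mean, rgb_dev))
        return 2.0 ** (-10.0 * luma(dist))

    def clamp_to_box(rgb, rgb_mean, rgb_dev):
        # Clamp history to [mean - dev, mean + dev], as in TAA [Karis 2014].
        return tuple(min(max(c, m - d), m + d)
                     for c, m, d in zip(rgb, rgb_mean, rgb_dev))

    def resolve_history(surface_history, hit_history, rgb_mean, rgb_dev):
        """Blend the two reprojection sources by fit to local statistics."""
        total_w, accum = 0.0, (0.0, 0.0, 0.0)
        for rgb in (surface_history, hit_history):
            rgb = clamp_to_box(rgb, rgb_mean, rgb_dev)
            w = history_weight(rgb, rgb_mean, rgb_dev)
            accum = tuple(a + c * w for a, c in zip(accum, rgb))
            total_w += w
        if total_w <= 0.0:
            return rgb_mean                 # no usable history: fall back
        return tuple(a / total_w for a in accum)

    # Usage: statistics come from the hit samples used in spatial reconstruction.
    mean, dev = (0.5, 0.4, 0.3), (0.1, 0.1, 0.1)
    print(resolve_history((0.52, 0.41, 0.29), (0.9, 0.2, 0.1), mean, dev))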
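
For the bilateral cleanup pass, the slides only state that the filter runs where the temporally smoothed (hysteresis = 0.5) variance is high, with a variable kernel width and sample count. The sketch below shows one way such gating could look; the threshold and radius mapping are invented for illustration and are not SEED's tuned values.

    # Variance-gated bilateral cleanup: smooth the variance over time, then map
    # it to a kernel radius, with 0 meaning "skip the blur for this pixel".

    def smooth_variance(prev_variance, new_variance, hysteresis=0.5):
        """Temporal smoothing of the per-pixel variance estimate."""
        return hysteresis * prev_variance + (1.0 - hysteresis) * new_variance

    def cleanup_kernel(variance, threshold=0.05, max_radius=4):
        """Map smoothed variance to a kernel radius (0 = pass skipped)."""
        if variance < threshold:
            return 0                              # low variance: don't blur
        # Widen the kernel (and implicitly the sample count) with noisiness.
        scale = min(1.0, (variance - threshold) / (4.0 * threshold))
        return max(1, round(scale * max_radius))

    # Usage: a quiet pixel keeps its reconstruction, a noisy one gets blurred.
    print(cleanup_kernel(smooth_variance(0.01, 0.02)))   # -> 0 (skipped)
    print(cleanup_kernel(smooth_variance(0.30, 0.50)))   # -> 4 (wide kernel)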
Transparency & Translucency
▪ Refractions
  ▪ Order-independent (OIT)
  ▪ Variable roughness
  ▪ Multiple IOR transitions
  ▪ Absorption
▪ Translucency
  ▪ Light scattering inside a medium
  ▪ Inspired by Translucency in Frostbite [Barré-Brisebois 2011]
(Images: glass and translucency)

Integration
▪ Multiple samples required for rough refractions and translucency
▪ Multi-layer screen-space filtering is complicated (more on that below)
▪ Accumulate in texture space instead
  ▪ Stable integration domain
  ▪ Static texture allocation: 512x512 per object
  ▪ Time-sliced, ~1M rays/frame

Translucency Breakdown
▪ For every valid position & normal
  ▪ Flip the normal and push the ray origin inside
  ▪ Launch rays in a uniform sphere distribution
  ▪ (Importance-sample the phase function)
▪ (A texture-space accumulation sketch follows below.)
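
The Integration and Translucency Breakdown sections accumulate translucency in a 512x512 per-object texture, time-sliced at roughly 1M rays per frame, flipping the normal and pushing the ray origin inside the medium before sampling a uniform sphere of directions. A Python sketch of that loop follows; trace_inside, the push-in distance and the blend factor are stand-ins for the engine's ray query and temporal accumulation, not SEED's code.

    # Texture-space translucency: each valid texel launches one ray from just
    # inside the surface and blends the result into its running estimate.
    import math
    import random

    def uniform_sphere_direction(rng):
        z = 2.0 * rng.random() - 1.0                  # cos(theta) in [-1, 1]
        phi = 2.0 * math.pi * rng.random()
        r = math.sqrt(max(0.0, 1.0 - z * z))
        return (r * math.cos(phi), r * math.sin(phi), z)

    def accumulate_translucency(texels, accum, trace_inside, rng,
                                rays_per_frame=1_000_000, push_in=1e-3,
                                blend=0.1):
        """One frame of time-sliced accumulation over the valid texels.

        texels: list of (position, normal) for texels covered by geometry.
        accum:  per-texel running radiance estimate being refined.
        A real implementation would rotate the starting texel between frames.
        """
        budget = min(rays_per_frame, len(texels))
        for i in range(budget):
            position, normal = texels[i]
            flipped = tuple(-n for n in normal)        # flip the normal ...
            origin = tuple(p + push_in * f for p, f in zip(position, flipped))
            direction = uniform_sphere_direction(rng)  # ... sample the sphere
            radiance = trace_inside(origin, direction)
            accum[i] = (1.0 - blend) * accum[i] + blend * radiance
        return accum

    # Usage with a stand-in tracer that returns constant in-scattered light.
    rng = random.Random(7)
    texels = [((0.0, 0.0, 0.0), (0.0, 0.0, 1.0))] * 8
    accum = [0.0] * len(texels)
    print(accumulate_translucency(texels, accum, lambda o, d: 1.0, rng))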