
Towards effortless photorealism through real-time raytracing

Tomasz Stachowiak
SEED – Electronic Arts

"PICA PICA"
▪ Exploratory mini-game & world
▪ For our self-learning AI agents to play, not for humans ☺
▪ "Imitation Learning with Concurrent Actions in 3D Games" [Harmer 2018]

▪ Uses SEED's Halcyon R&D engine
▪ Goals
  ▪ Explore hybrid rendering with DXR
  ▪ Clean and consistent visuals
  ▪ Procedural worlds [Opara 2018]
  ▪ No precomputation

Why use raytracing?
(Comparison images: path tracing at 1 sample per pixel with ~15 seconds of accumulation; real-time hybrid with raytracing; no raytracing)

Is this a fair comparison?
▪ "Classic" game solutions exist
  ▪ Local reflection volumes
  ▪ Planar reflections
  ▪ "Sky visibility" maps
  ▪ …

Zero artist overhead

1. Enable raytracing

2. Ship it

Making it believable

Quality at the core
▪ Embrace physically-based rendering
  ▪ Ensure correct attribute ranges
  ▪ Applicable to stylized rendering too
▪ Always compare to ground truth
  ▪ Know when to cut corners
▪ Focus on good workflows
  ▪ Iteration yields quality

Materials
▪ Multiple microfacet layers
  ▪ Inspired by [Weidlich 2007]
▪ Rapidly try different looks
▪ Energy conserving
▪ Automatic Fresnel between layers
▪ All lighting & rendering modes
  ▪ Raster, path-traced reference, hybrid
(Image: objects with multi-layered materials)

(Image: multi-layered materials [Weidlich 2007])

Light transport
▪ Light-matter interaction creates varying phenomena
  ▪ Useful to model separately
  ▪ Different optimization strategies
  ▪ Varying difficulty
▪ Correct by default
  ▪ Consistency is key
  ▪ Avoid intensity hacks

[Ritschel 2011]

Shadows
▪ Not just for grounding objects
▪ Greatly enhance micro-detail
  ▪ Contact hardening
▪ Softness needed to convey scale
  ▪ Hard shadows visually busy

Post-processing
▪ It's part of light transport
▪ Replace hacks with physically-based effects
▪ Try to match a real lens

Art direction
▪ Even a tech demo needs good art
  ▪ Power of the Utah Teapot is finite
▪ Tech and content should evolve together
  ▪ Rendering is not a solved problem
  ▪ Raytracing helps, yet edge cases remain
  ▪ Play to the strengths of both
▪ Anastasia Opara's talk on Friday
  ▪ 11:20, Room 313+314

Making it fast

Back to school
▪ Hardware finally getting there
▪ Still a lot of work left on our side

"Juggler" - Eric Graham, 1987 ▪ Use rays sparingly ▪ Trace only where necessary ▪ Reconstruct, denoise ▪ Redundancy in spatial and temporal domains ▪ Adaptive techniques ▪ Tune ray count to importance ▪ Lots of research ahead!

(Image: Turner Whitted, 1980)

Hybrid Rendering Pipeline

Deferred shading (raster)
Direct shadows (raytrace or raster)
Direct lighting (compute)
Reflections (raytrace or compute)
Global Illumination (raytrace)
Transparency & Translucency (raytrace)
Post processing (raytrace or compute)

Raytraced Reflections

▪ Launch rays from G-Buffer
▪ Raytrace at half resolution
  ▪ ¼ ray/pixel for reflection
  ▪ ¼ ray/pixel for reflected shadow

▪ Reconstruct at full resolution
  ▪ Arbitrary normals
  ▪ Spatially-varying roughness

Reflection Pipeline

Importance sampling → Screen-space reflection → Raytracing → Envmap gap fill
→ Spatial reconstruction → Temporal accumulation → Bilateral cleanup

Reflection Sampling

▪ Microfacet importance sampling
  ▪ Normal distribution function (NDF) part of BRDF
▪ Very few rays, so quality matters
  ▪ Low-discrepancy Halton sequence
  ▪ Cranley-Patterson rotation per pixel [PBRT]
  ▪ Re-roll if ray below horizon
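The sample generation above can be sketched as follows (Python; a minimal sketch — `sample_2d` and the per-pixel rotation input are illustrative names, not from the talk):

```python
def halton(index, base):
    # Radical inverse of `index` in the given base -> low-discrepancy value in [0, 1).
    result, f = 0.0, 1.0 / base
    while index > 0:
        result += f * (index % base)
        index //= base
        f /= base
    return result

def sample_2d(sample_index, pixel_rotation):
    # Halton (base 2, 3) point, decorrelated per pixel with a
    # Cranley-Patterson rotation: add the pixel's random offset mod 1.
    u = halton(sample_index, 2)
    v = halton(sample_index, 3)
    return ((u + pixel_rotation[0]) % 1.0,
            (v + pixel_rotation[1]) % 1.0)
```

The resulting 2D point would then drive NDF importance sampling; if the generated ray falls below the horizon, the sample index is advanced and the draw repeated.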

Layered Material Sampling
▪ One ray for the whole stack
▪ Stochastically select a layer
  ▪ Based on layer visibility estimate
  ▪ Bottom layers occluded by top layers
  ▪ View-dependent due to Fresnel
  ▪ Approximate with Schlick
▪ Sample BRDF of selected layer
▪ Multiple layers can generate same direction
  ▪ Query each layer about generated direction
  ▪ Add up probabilities
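The layer selection and PDF summation above can be sketched as (Python; the layer interface — `.f0`, `.sample()`, `.pdf()` — is a hypothetical stand-in for the real BRDF stack):

```python
import random

def schlick(f0, cos_theta):
    # Schlick's Fresnel approximation for a layer's reflectance.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def sample_layer_stack(layers, cos_theta, rng=random.random):
    # `layers` is a top-to-bottom list. Estimate each layer's visibility:
    # the light reaching layer i is what the layers above did not reflect
    # (view-dependent via Fresnel, approximated with Schlick).
    weights, transmitted = [], 1.0
    for layer in layers:
        f = schlick(layer.f0, cos_theta)
        weights.append(transmitted * f)
        transmitted *= (1.0 - f)
    total = sum(weights)
    probs = [w / total for w in weights]
    # Stochastically pick one layer proportionally to its visibility.
    u, chosen, accum = rng(), len(layers) - 1, 0.0
    for i, p in enumerate(probs):
        accum += p
        if u < accum:
            chosen = i
            break
    # Sample the chosen layer's BRDF for a direction...
    direction = layers[chosen].sample()
    # ...then query *every* layer that could have generated this direction
    # and add up the selection-weighted probabilities.
    pdf = sum(p * layer.pdf(direction) for p, layer in zip(probs, layers))
    return direction, pdf
```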

Reflection Raygen
▪ Read generated ray

▪ TraceRay()

▪ Same hit shaders as path-tracing reference

▪ Output packed in 2x RGBA 16F

Raw raytrace output, packed in two RGBA16F targets:
  Target 0: Color R | Color G | Color B | G-buffer depth
  Target 1: Dir X × Hit T | Dir Y × Hit T | Dir Z × Hit T | Inverse PDF

Why Stochastic?
▪ Gives right answer, but
  ▪ Produces noise
  ▪ Thrashes caches
▪ Alternative: post-filter
  ▪ Needs arbitrarily large kernel
  ▪ Introduces bleeding
  ▪ Sensitive to aliasing
▪ Meet in the middle?
  ▪ Bias samplers
  ▪ Combine with spatial filtering
  ▪ High variance → increase bias
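Since the second packed target stores the ray direction pre-multiplied by the hit distance, both can be recovered from the vector's length. A minimal decode sketch (Python; `decode_ray_hit` is an illustrative name):

```python
import math

def decode_ray_hit(payload):
    # `payload` is the second RGBA16F texel: (dir.x*t, dir.y*t, dir.z*t, 1/pdf).
    # The hit distance t is the vector's length; the unit direction is the
    # vector divided by t.
    vx, vy, vz, inv_pdf = payload
    t = math.sqrt(vx * vx + vy * vy + vz * vz)
    direction = (vx / t, vy / t, vz / t)
    return direction, t, inv_pdf
```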

Image Space Gathering [Robison 2009]

Spatial reconstruction
▪ Based on Stochastic Screen-Space Reflections [Stachowiak 2015]
▪ For every full-res pixel, use 16 half-res ray hits
  ▪ Poisson-disk distribution around pixel
▪ Scale by local BRDF
▪ Divide by ray PDF
▪ Ratio estimator [Heitz 2018]

    result = 0.0
    weightSum = 0.0
    for pixel in neighborhood:
        weight = localBrdf(pixel.hit) / pixel.hitPdf
        result += color(pixel.hit) * weight
        weightSum += weight
    result /= weightSum

Temporal accumulation

Bilateral cleanup
▪ Secondary bilateral filter
  ▪ Much dumber than reconstruction
  ▪ Introduces blur
▪ Only run where variance high
  ▪ Variance from spatial reconstruction
  ▪ Temporally smoothed with hysteresis = 0.5
▪ Variable kernel width, sample count

(Images: variance estimate; bilateral cleanup; temporal anti-aliasing)

Temporal Reprojection
▪ Two sources of velocity
  ▪ Per-pixel motion vector
  ▪ Reprojection of hit point
▪ Neither reprojection method robust

(Images: per-pixel motion vector vs. reprojection of hit point)

Dual-Source Reprojection
▪ Sample history using both reprojection strategies
▪ Estimate local statistics during spatial reconstruction
  ▪ RGB mean
  ▪ RGB standard deviation
  ▪ Of used hit samples
▪ Weigh contributions
  ▪ dist = (rgb – rgb_mean) / rgb_dev
  ▪ w = exp2(-10 * luma(dist))

Reprojection Clamp
▪ Build color box from local statistics
  ▪ rgb_min = rgb_mean – rgb_dev
  ▪ rgb_max = rgb_mean + rgb_dev
▪ Clamp reprojected values to box
  ▪ rgb = clamp(rgb, rgb_min, rgb_max)
  ▪ Same idea as in temporal anti-aliasing [Karis 2014]
▪ Weigh clamped contributions
  ▪ rgb_sum += val * w

▪ Introduces bias
▪ Fixes most ghosting

Transparency & Translucency
▪ Refractions
  ▪ Order-independent (OIT)
  ▪ Variable roughness
  ▪ Multiple IOR transitions
  ▪ Absorption
▪ Translucency
  ▪ Light scattering inside a medium
  ▪ Inspired by Translucency in Frostbite [Barré-Brisebois 2011]
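The dual-source history weighting and color-box clamp described above can be sketched as (Python; the Rec. 709 luma weights, the abs() in the distance normalization, and the epsilon guards are assumptions):

```python
def luma(rgb):
    # Rec. 709 luma weights (an assumption; any luminance proxy works).
    return 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]

def reproject(candidates, rgb_mean, rgb_dev):
    # `candidates`: history colors fetched via the two reprojection
    # strategies (per-pixel motion vector, reflected hit point).
    # Each is clamped to the local statistics box, then weighted by how
    # close it lies to the short-term mean: w = exp2(-10 * luma(dist)).
    rgb_min = tuple(m - d for m, d in zip(rgb_mean, rgb_dev))
    rgb_max = tuple(m + d for m, d in zip(rgb_mean, rgb_dev))
    rgb_sum, w_sum = [0.0, 0.0, 0.0], 0.0
    for rgb in candidates:
        clamped = tuple(min(max(c, lo), hi)
                        for c, lo, hi in zip(rgb, rgb_min, rgb_max))
        dist = tuple(abs(c - m) / max(d, 1e-6)
                     for c, m, d in zip(clamped, rgb_mean, rgb_dev))
        w = 2.0 ** (-10.0 * luma(dist))
        rgb_sum = [s + c * w for s, c in zip(rgb_sum, clamped)]
        w_sum += w
    return tuple(s / max(w_sum, 1e-6) for s in rgb_sum)
```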

Glass and Translucency

Integration
▪ Multiple samples required for rough refractions and translucency
▪ Multi-layer screen-filtering complicated
  ▪ More on that later!
▪ Accumulate in texture-space instead
  ▪ Stable integration domain
  ▪ Static texture allocation
  ▪ 512×512 per object
  ▪ Time-sliced, ~1M rays/frame

Translucency Breakdown
▪ For every valid position & normal
▪ Flip normal and push (ray) inside
▪ Launch rays in uniform sphere dist.
  ▪ (Importance-sample phase function)
▪ Compute lighting at intersection
▪ Gather all samples
▪ Update value in texture

Translucency Filtering

▪ Can denoise spatially and/or temporally

▪ Temporal: build an update heuristic
  ▪ Reactive enough for moving lights & objects
  ▪ Exponential moving average can be OK

▪ Variance-adaptive mean estimation
  ▪ Based on exponential moving average
  ▪ Adaptive hysteresis

Transparency
▪ Similar approach to translucency
▪ Launch ray using view's origin and direction
▪ Bend based on medium's index of refraction
▪ Sample a BSDF for rough refraction
▪ Trace a ray in the scene & sample lighting
▪ Tint result
  ▪ Chromatic aberration from interface
  ▪ Beer-Lambert absorption in medium

Indirect diffuse
▪ Important for consistency
▪ Minimize artist overhead
▪ We want a technique with:
  ▪ No precomputation
  ▪ No parametrization (UVs, proxies)
  ▪ Support for dynamic scenes
  ▪ Adaptive refinement
(Images: indirect diffuse off / indirect diffuse on)

Spatial Storage
▪ Can't afford to solve every frame
  ▪ Our GI budget: 250k rays / frame
▪ World space vs screen space
  ▪ World space is stable
  ▪ Screen-space fully dynamic
▪ World space surfels
  ▪ Easy to accumulate
  ▪ Distribute on the fly
  ▪ Smooth result by construction
  ▪ Position, normal, radius, info

Surfel Skinning

Surfel Screen Application
▪ Render like deferred light sources
  ▪ Diffuse only
▪ Smoothstep distance attenuation
  ▪ Mahalanobis metric
▪ Angular weight: squared dot of normals
▪ Inspired by [Lehtinen 2008]

▪ World-space 3D culling
  ▪ Uniform grid
  ▪ Each cell holds list of surfels
  ▪ Find one cell per pixel, use all from list
▪ Combine with Screen Space AO [Jimenez 2016]

Surfel Placement

Surfel Spawning From Camera @ 1% speed

Surfel Placement Algorithm


▪ Iterative screen-space hole filling
▪ Calculate surfel coverage for each pixel
▪ Find lowest coverage in tile (16×16 pixels)
▪ Probabilistically spawn surfel on pixel
  ▪ Chance proportional to pixel's projected area
  ▪ G-Buffer depth and normal
▪ Continue where coverage below threshold
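One iteration of the placement loop above can be sketched as (Python; the projected-area estimate, the data layout, and all constants are illustrative assumptions):

```python
import random

def projected_area(depth, ndotv, fov_scale=1.0):
    # Hypothetical estimate of a pixel's world-space footprint from
    # G-Buffer depth and normal: grows with the square of view depth,
    # shrinks at grazing angles.
    return fov_scale * depth * depth / max(ndotv, 0.05)

def try_spawn_surfel(tile_pixels, coverage_threshold, rng=random.random):
    # One hole-filling step for a 16x16 tile.
    # `tile_pixels`: list of dicts with 'coverage', 'depth', 'ndotv'.
    # Returns the pixel a surfel was spawned on, or None.
    pixel = min(tile_pixels, key=lambda p: p['coverage'])
    if pixel['coverage'] >= coverage_threshold:
        return None  # tile already covered well enough
    # Spawn probabilistically, with a chance proportional to the
    # pixel's projected area.
    area = projected_area(pixel['depth'], pixel['ndotv'])
    chance = min(1.0, area)
    return pixel if rng() < chance else None
```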

Irradiance Calculation
▪ Path trace from surfels
  ▪ Unidirectional, explicit light connections
▪ More samples for new surfels
▪ Limit light path length
  ▪ Sample previous GI at last bounce
  ▪ "Infinite" bounces amortized over frames
(Diagram: sky, light source, surfel, scene geometry)

(Diagram: light source, surfel, query surfels at hit point)

Monte Carlo Integration
▪ Monte Carlo assumes immutable integrand
▪ Plain old mean estimation

  x̄₀ = 0
  x̄ₙ₊₁ = lerp(x̄ₙ, xₙ₊₁, 1/(n+1))

▪ Can’t use in dynamic GI
  ▪ Hard reset undesirable
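The "plain old" mean estimation above can be written as an incremental update (Python; `mean_update` is an illustrative name):

```python
def mean_update(mean, x, n):
    # Incremental mean estimation over n previous samples:
    # x_bar_{n+1} = lerp(x_bar_n, x_{n+1}, 1/(n+1)).
    # The blend factor shrinks forever, so a change in the integrand
    # (dynamic GI) is averaged away instead of being tracked.
    k = 1.0 / (n + 1)
    return mean + (x - mean) * k
```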

Interactive GI in Frostbite [Hillaire 2018]

Adaptive Integration
▪ Give up on fully converging
▪ Exponential moving average
  x̄₀ = 0
  x̄ₙ₊₁ = lerp(x̄ₙ, xₙ₊₁, k)
▪ Estimate short-term statistics
  ▪ Mean, variance
▪ Adapt blend factor k
  ▪ Reduce if variance high
  ▪ Increase if mean far from short-term statistics
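The adaptive blend factor can be sketched as (Python; the short-term estimator, the constants, and the adaptation rule are illustrative assumptions, not the talk's exact heuristic):

```python
def lerp(a, b, t):
    return a + (b - a) * t

class AdaptiveMean:
    # Exponential moving average whose blend factor k adapts:
    # high short-term variance -> small k (trust history);
    # mean drifting away from the short-term window -> larger k (react).
    def __init__(self, k_min=0.02, k_max=0.5, window=0.1):
        self.mean = 0.0        # long-term EMA (the published value)
        self.short_mean = 0.0  # short-term mean estimate
        self.short_var = 0.0   # short-term variance estimate
        self.window = window
        self.k_min, self.k_max = k_min, k_max

    def update(self, x):
        # Short-term statistics via a fixed fast EMA.
        self.short_mean = lerp(self.short_mean, x, self.window)
        dev = x - self.short_mean
        self.short_var = lerp(self.short_var, dev * dev, self.window)
        # Distance of the long-term mean from the short-term
        # distribution, normalized by its deviation.
        sigma = self.short_var ** 0.5
        distance = abs(self.mean - self.short_mean) / (sigma + 1e-6)
        k = min(self.k_max, max(self.k_min, self.k_min * (1.0 + distance)))
        self.mean = lerp(self.mean, x, k)
        return self.mean
```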

(Plot: adaptive integration in 1D — short-term distribution bounds, adaptive mean, exponential moving average)

Shadows
▪ Hard shadows trivial
  ▪ Launch a ray towards light position
  ▪ Check for any hit
▪ Soft shadows easy
  ▪ Sample direction from uniform cone [PBRT]
  ▪ Cone width determines penumbra
▪ … Easy but noisy
  ▪ 1 visibility sample per pixel not enough
  ▪ Filter instead of shooting more

Shadow Filtering
▪ Based on Nvidia's SVGF [Schied 2017]
▪ Temporal accumulation
▪ Multi-pass weighted blur
  ▪ Kernel size driven by variance estimate
▪ No shadow-specific heuristics

(Over)simplified diagram of SVGF:
Input → Accumulate temporally → Estimate variance → Blur → Blur some more → Output

(Images: hard raytraced shadows; soft raytraced shadows, unfiltered; soft raytraced shadows, filtered)

Shadow Filtering
▪ Modified to reduce ghosting
  ▪ TAA-like color clip
  ▪ Variance-based bounds (5×5 kernel) [Salvi 2016]
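The uniform-cone direction sampling used for soft shadows earlier can be sketched as (Python; the cone mapping follows [PBRT], while the orthonormal-basis construction is an assumed detail):

```python
import math

def sample_cone(axis, cos_theta_max, u1, u2):
    # Uniformly sample a direction inside a cone around `axis` (unit
    # vector). cos_theta_max sets the cone width, i.e. the penumbra size.
    cos_theta = 1.0 - u1 * (1.0 - cos_theta_max)
    sin_theta = math.sqrt(max(0.0, 1.0 - cos_theta * cos_theta))
    phi = 2.0 * math.pi * u2
    # Build an orthonormal basis (t, b, axis) around the axis.
    ax, ay, az = axis
    if abs(az) < 0.999:
        tx, ty, tz = -ay, ax, 0.0
    else:
        tx, ty, tz = 1.0, 0.0, 0.0
    l = math.sqrt(tx * tx + ty * ty + tz * tz)
    tx, ty, tz = tx / l, ty / l, tz / l
    bx = ay * tz - az * ty
    by = az * tx - ax * tz
    bz = ax * ty - ay * tx
    x = math.cos(phi) * sin_theta
    y = math.sin(phi) * sin_theta
    return (x * tx + y * bx + cos_theta * ax,
            x * ty + y * by + cos_theta * ay,
            x * tz + y * bz + cos_theta * az)
```

A shadow ray would be traced along the returned direction towards the light; any hit means occlusion.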

(Images: baseline SVGF vs. +TAA-like clip)

Research Directions

Partial Coverage
▪ Current denoising assumes one visibility sample
  ▪ Hacks everywhere else
▪ Transparency
  ▪ Deep framebuffers?
▪ Participating media
  ▪ Raymarch in hit shaders?
  ▪ Blending non-trivial
▪ Defocus effects
  ▪ Depth of field
  ▪ Motion blur
  ▪ Really hard on transparency
(Image: transparency in the Maxwell Renderer)

Primary Visibility
▪ Limits of raster on regular grids
  ▪ Temporal anti-aliasing got us far
▪ Need more samples, but cheap
▪ Adaptive sample allocation
  ▪ [Marrs 2018], [Mallett 2018], [Elias 2000]
▪ Coarse shading [Vaidyanathan 2014]
  ▪ Trade sampling quality for stability
▪ Temporally reuse samples [Corso 2017]
▪ … or "just" prefilter everything
(Image: pbrt — Matt Pharr, Wenzel Jakob, Greg Humphreys)

▪ No more Sponza, please!
  ▪ Sorry, Sponza!
▪ Foliage is a great case study

Unified Perceptual Integration
▪ Importance of rendering phenomena varies
  ▪ Signal frequency varies
  ▪ Perceptual acuity varies
▪ Distribute ray/shading budget accordingly
▪ Existing research – adapt to real-time?
▪ Analogies in the film industry
  ▪ RenderMan + RSL
    ▪ Shaders raytrace arbitrarily
    ▪ Manual tweaking of budgets
  ▪ Open Shading Language
    ▪ Shaders define irradiance closures
    ▪ Integrator allocates rays
(Image: Adaptive Wavelet Rendering [Overbeck 2009])

Filtering
▪ Spatiotemporal reconstruction
  ▪ Exploit spatiotemporal continuity
  ▪ Trades noise for spatiotemporal blur
  ▪ Detect discontinuities, trim kernels
▪ Spatial: à-trous [Schied 2017]
▪ Temporal: forward projection [Schied 2018]
▪ Domain-specific knowledge
  ▪ PICA PICA reflections
  ▪ Area light shadows [Heitz 2018]
▪ Machine learning [Chaitanya 2017]
▪ Filter while integrating [Overbeck 2009]

Indirect lighting
▪ Raytracing doesn't solve real-time GI
  ▪ Open problem even in offline rendering
  ▪ Variance still way too high
▪ Reduce frequency
  ▪ Prefiltering
  ▪ Path-space filtering
▪ Denoising/reconstruction
▪ Integrate in a smart way (MLT, etc.)
(Image: [Veach 1997])
▪ Combine with other techniques
  ▪ Cone tracing
  ▪ (Ir)radiance caching [Jarosz 2008]
  ▪ Screen-space techniques
  ▪ Virtual point lights [Tokuyoshi 2017]

Shading Coherence
▪ Incoherent rays thrash caches, ruin execution
  ▪ BVH data
  ▪ Hit shader invocation
  ▪ Texture and mesh loading
▪ Bottleneck shifts away from ray intersection
  ▪ Towards shading and data access [Eisenacher 2013]
▪ Build coherence of rays and/or hit points
  ▪ Disney's Hyperion – batches of 30-60 million rays
  ▪ Candidate for hardware implementation
▪ Make shaders RT-friendly
  ▪ Object/texture space shading [Munkberg 2016]
  ▪ Coherent rays by construction [Segovia 2006]
(Chart: rendering cost break-down in a non-trivial scene [Eisenacher 2013])

Summary
▪ Replace fine-tuned hacks with unified approaches
  ▪ Reconstruction and filtering instead
  ▪ Correct by default
▪ Real-time visuals with almost path-traced quality
  ▪ Achievable today on consumer hardware
▪ Use rays wisely
  ▪ PICA PICA shoots ~2.25 rays per pixel
▪ We're at the start of the journey
  ▪ Plenty of research opportunities

Thanks
▪ SEED: Johan Andersson, Colin Barré-Brisebois, Jasper Bekkers, Joakim Bergdahl, Ken Brown, Dean Calver, Dirk de la Hunt, Jenna Frisk, Paul Greveson, Henrik Halen, Effeli Holst, Andrew Lauritzen, Magnus Nordin, Niklas Nummelin, Anastasia Opara, Kristoffer Sjöö, Ida Winterhaven, Graham Wihlidal
▪ NVIDIA: Tomas Akenine-Möller, Nir Benty, Jiho Choi, Peter Harrison, Alex Hyder, Jon Jansen, Aaron Lefohn, Ignacio Llamas, Henry Moreton, Martin Stich
▪ Microsoft: Chas Boyd, Ivan Nevraev, Amar Patel, Matt Sandy

SEED // Search for Extraordinary Experiences Division

STOCKHOLM – LOS ANGELES – MONTRÉAL – REMOTE

WWW.EA.COM/SEED

WE'RE HIRING!

Questions?

References

▪ [Barré-Brisebois 2011] Colin Barré-Brisebois and Marc Bouchard. "Approximating Translucency for a Fast, Cheap and Convincing Subsurface Scattering Look", available online.
▪ [Barré-Brisebois 2017] Colin Barré-Brisebois. "A Certain Slant of Light: Past, Present and Future Challenges of Global Illumination in Games", available online.
▪ [Chaitanya 2017] Chakravarty R. Alla Chaitanya et al. "Interactive Reconstruction of Monte Carlo Image Sequences using a Recurrent Denoising Autoencoder", available online.
▪ [Corso 2017] Alessandro Dal Corso et al. "Interactive Stable …", available online.
▪ [Eisenacher 2013] Christian Eisenacher, Gregory Nichols, Andrew Selle, Brent Burley. "Sorted Deferred Shading for Production Path Tracing", available online.
▪ [Elias 2000] Hugo Elias. "Radiosity", available online.
▪ [Harmer 2018] Jack Harmer, Linus Gisslén, Henrik Holst, Joakim Bergdahl, Tom Olsson, Kristoffer Sjöö and Magnus Nordin. "Imitation Learning with Concurrent Actions in 3D Games", available online.
▪ [Heitz 2018] Eric Heitz, Stephen Hill, Morgan McGuire. "Combining Analytic Direct Illumination and Stochastic Shadows", available online.
▪ [Hillaire 2018] Sébastien Hillaire. "Real-time Raytracing for Interactive Global Illumination Workflows in Frostbite", available online.
▪ [Igehy 1999] Homan Igehy. "Tracing Ray Differentials", available online.
▪ [Jarosz 2008] Wojciech Jarosz, Craig Donner, Matthias Zwicker, Henrik Wann Jensen. "Radiance Caching for Participating Media", available online.
▪ [Jimenez 2016] Jorge Jimenez, Xian-Chun Wu, Angelo Pesce, Adrian Jarabo. "Practical Realtime Strategies for Accurate Indirect Occlusion", available online.
▪ [Karis 2014] Brian Karis. "High-Quality Temporal Supersampling", available online.
▪ [Lehtinen 2008] Jaakko Lehtinen, Matthias Zwicker, Emmanuel Turquin, Janne Kontkanen, Frédo Durand, François Sillion, Timo Aila. "A Meshless Hierarchical Representation for Light Transport", available online.
References (cont'd)

▪ [Mallett 2018] Ian Mallett, Cem Yuksel. "Deferred Adaptive Compute Shading", available online.
▪ [Marrs 2018] Adam Marrs, Josef Spjut, Holger Gruen, Rahul Sathe, Morgan McGuire. "Adaptive Temporal Antialiasing", available online.
▪ [Munkberg 2016] Jacob Munkberg et al. "Texture Space Caching and Reconstruction for Ray Tracing", available online.
▪ [Opara 2018] Anastasia Opara. "Creativity of Rules and Patterns", available online.
▪ [Overbeck 2009] Ryan S. Overbeck, Craig Donner, Ravi Ramamoorthi. "Adaptive Wavelet Rendering", available online.
▪ [PBRT] Matt Pharr, Wenzel Jakob and Greg Humphreys. "Physically Based Rendering", book, http://www.pbrt.org/.
▪ [Ritschel 2011] Tobias Ritschel et al. "The State of the Art in Interactive Global Illumination", available online.
▪ [Robison 2009] Austin Robison, Peter Shirley. "Image Space Gathering", available online.
▪ [Salvi 2016] Marco Salvi. "An Excursion in Temporal Supersampling", available online.
▪ [Schied 2017] Christoph Schied et al. "Spatiotemporal Variance-Guided Filtering: Real-Time Reconstruction for Path-Traced Global Illumination", NVIDIA Research, available online.
▪ [Schied 2018] Christoph Schied et al. "Gradient Estimation for Real-Time Adaptive Temporal Filtering", available online.
▪ [Segovia 2006] Benjamin Segovia, Jean-Claude Iehl, Bernard Péroche. "Non-interleaved Deferred Shading of Interleaved Sample Patterns", available online.
▪ [Stachowiak 2015] Tomasz Stachowiak. "Stochastic Screen-Space Reflections", available online.
▪ [Tokuyoshi 2017] Yusuke Tokuyoshi, Takahiro Harada. "Stochastic Light Culling for VPLs on GGX Microsurfaces", available online.
▪ [Vaidyanathan 2014] Karthik Vaidyanathan et al. "Coarse Pixel Shading", available online.
▪ [Veach 1997] Eric Veach. "Robust Monte Carlo Methods for Light Transport Simulation", available online.
▪ [Weidlich 2007] Andrea Weidlich and Alexander Wilkie. "Arbitrarily Layered Micro-Facet Surfaces", available online.
▪ [Williams 1983] Lance Williams. "Pyramidal Parametrics", available online.