
Towards effortless photorealism through real-time raytracing
Tomasz Stachowiak
SEED – Electronic Arts

Video

“PICA PICA”
▪ Exploratory mini-game & world
▪ For our self-learning AI agents to play, not for humans ☺
  ▪ “Imitation Learning with Concurrent Actions in 3D Games” [Harmer 2018]
▪ Uses SEED’s Halcyon R&D engine
▪ Goals
  ▪ Explore hybrid rendering with DXR
  ▪ Clean and consistent visuals
  ▪ Procedural worlds [Opara 2018]
  ▪ No precomputation

Why use raytracing?
(Comparison images: path tracing with ~15 seconds of accumulation · path tracing at 1 sample per pixel · real-time hybrid with raytracing · no raytracing)

Is this a fair comparison?
▪ “Classic” game solutions exist
  ▪ Local reflection volumes
  ▪ Planar reflections
  ▪ Lightmaps
  ▪ “Sky visibility” maps
  ▪ …

Zero artist overhead
1. Enable raytracing
2. Ship it

Making it believable

Quality at the core
▪ Embrace physically-based rendering
  ▪ Ensure correct attribute ranges
  ▪ Applicable to stylized rendering too
▪ Always compare to ground truth
  ▪ Know when to cut corners
▪ Focus on good workflows
  ▪ Iteration yields quality

Materials
▪ Multiple microfacet layers
  ▪ Inspired by [Weidlich 2007]
▪ Rapidly try different looks
▪ Energy conserving
▪ Automatic Fresnel between layers (see the sketch below)
▪ All lighting & rendering modes
  ▪ Raster, path-traced reference, hybrid
(Images: objects with multi-layered materials · multi-layered materials [Weidlich 2007])

Light transport
▪ Light-matter interaction creates varying phenomena
▪ Useful to model separately
  ▪ Different optimization strategies
  ▪ Varying difficulty
▪ Correct by default
  ▪ Consistency is key
  ▪ Avoid intensity hacks [Ritchell 2011]

Shadows
▪ Not just for grounding objects
▪ Greatly enhance micro-detail
▪ Contact hardening
▪ Softness needed to convey scale
  ▪ Hard shadows visually busy

Post-processing
▪ It’s part of light transport
▪ Replace hacks with physically-based effects
▪ Try to match a real lens

Art direction
▪ Even a tech demo needs good art
  ▪ Power of the Utah Teapot is finite
▪ Tech and content should evolve together
  ▪ Rendering is not a solved problem
  ▪ Raytracing helps, yet edge cases remain
  ▪ Play to the strengths of both
▪ Anastasia Opara’s talk on Friday
  ▪ 11:20, Room 313+314

Making it fast

Back to school
▪ Hardware finally getting there
▪ Still a lot of work on us
▪ Use rays sparingly
  ▪ Trace only where necessary
  ▪ Reconstruct, denoise
  ▪ Redundancy in spatial and temporal domains
▪ Adaptive techniques
  ▪ Tune ray count to importance
▪ Lots of research ahead!
(Images: “Juggler” – Eric Graham, 1987 · Turner Whitted, 1980)
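Referring back to the Materials slide: a minimal sketch of how an energy-conserving two-layer stack with automatic Fresnel between layers can be expressed. This is illustrative only, not PICA PICA’s actual material code; the function names and the lobe stand-ins are hypothetical.

def schlick_fresnel(f0, cos_theta):
    # Schlick's approximation of Fresnel reflectance at a given angle.
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def shade_two_layer(cos_theta, coat_f0, evaluate_coat, evaluate_base):
    # The top (coat) layer reflects a Fresnel-weighted fraction of the energy;
    # only the remainder reaches the base layer, so the stack stays energy conserving.
    f = schlick_fresnel(coat_f0, cos_theta)
    return f * evaluate_coat(cos_theta) + (1.0 - f) * evaluate_base(cos_theta)

# Example: a dielectric coat (f0 ~ 0.04) over a diffuse base.
result = shade_two_layer(
    cos_theta=0.7,
    coat_f0=0.04,
    evaluate_coat=lambda c: 1.0,              # stand-in for a specular lobe
    evaluate_base=lambda c: 0.18 / 3.14159)   # stand-in for a Lambertian base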
Hybrid Rendering Pipeline
▪ Deferred shading (raster)
▪ Direct shadows (raytrace or raster)
▪ Direct lighting (compute)
▪ Reflections (raytrace or compute)
▪ Global illumination (raytrace)
▪ Ambient occlusion (raytrace or compute)
▪ Transparency & translucency (raytrace)
▪ Post processing (compute)

Raytraced Reflections
▪ Launch rays from G-Buffer
▪ Raytrace at half resolution
  ▪ ¼ ray/pixel for reflection
  ▪ ¼ ray/pixel for reflected shadow
▪ Reconstruct at full resolution
  ▪ Arbitrary normals
  ▪ Spatially-varying roughness

Reflection Pipeline
Importance sampling → Screen-space reflection → Raytracing → Envmap gap fill → Spatial reconstruction → Temporal accumulation → Bilateral cleanup

Reflection Sampling
▪ Microfacet importance sampling
  ▪ Normal distribution function (NDF) part of BRDF
▪ Very few rays, so quality matters
▪ Low-discrepancy Halton sequence
▪ Cranley-Patterson rotation per pixel [PBRT]
▪ Re-roll if ray below horizon
(see the sampling sketch after the “Why Stochastic?” slide)

Layered Material Sampling
▪ One ray for the whole stack
▪ Stochastically select a layer
  ▪ Based on layer visibility estimate
  ▪ Bottom layers occluded by top layers
  ▪ View-dependent due to Fresnel
  ▪ Approximate with Schlick
▪ Sample BRDF of selected layer
▪ Multiple layers can generate same direction
  ▪ Query each layer about generated direction
  ▪ Add up probabilities

Reflection Raygen
▪ Read generated ray
▪ TraceRay()
▪ Same hit shaders as path-tracing reference
▪ Output packed in 2x RGBA 16F

Raw raytrace output (2x RGBA 16F):
▪ Texture 0: Color R | Color G | Color B | G-buffer depth
▪ Texture 1: Dir X * Hit T | Dir Y * Hit T | Dir Z * Hit T | Inverse PDF

Why Stochastic?
▪ Gives the right answer, but
  ▪ Produces noise
  ▪ Thrashes caches
▪ Alternative: post-filter
  ▪ Needs arbitrarily large kernel
  ▪ Introduces bleeding
  ▪ Sensitive to aliasing
▪ Meet in the middle?
  ▪ Bias samplers
  ▪ Combine with spatial filtering
  ▪ High variance -> increase bias
(Image Space Gathering [Robison 2009])
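A minimal sketch of the per-pixel sample generation described in the Reflection Sampling slide: a low-discrepancy Halton point shifted by a Cranley-Patterson rotation, as in [PBRT]. The function names and the way the per-pixel rotation is supplied are assumptions, not the engine’s code.

def halton(index, base):
    # Radical inverse of 'index' in the given base.
    f, result = 1.0, 0.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

def reflection_sample_2d(frame_index, pixel_rotation):
    # One 2D sample per pixel per frame: a Halton (base 2, 3) point shifted by a
    # per-pixel rotation (toroidal shift), so neighbouring pixels see decorrelated
    # but still well-distributed sequences.
    u = (halton(frame_index, 2) + pixel_rotation[0]) % 1.0
    v = (halton(frame_index, 3) + pixel_rotation[1]) % 1.0
    return u, v

# Example: frame 7, hypothetical per-pixel rotation.
u, v = reflection_sample_2d(frame_index=7, pixel_rotation=(0.31, 0.77))

The resulting (u, v) pair would then drive microfacet NDF importance sampling of the selected lobe; per the slide, a sample whose reflected direction ends up below the horizon is simply re-rolled.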
Spatial Reconstruction
(Image build: raw raytrace output → +spatial reconstruction)
▪ Based on Stochastic Screen-Space Reflections [Stachowiak 2015]
▪ For every full-res pixel, use 16 half-res ray hits
  ▪ Poisson-disk distribution around pixel
▪ Scale by local BRDF
▪ Divide by ray PDF
▪ Ratio estimator [Heitz 2018]

result = 0.0
weightSum = 0.0
for pixel in neighborhood:
    # Ratio estimator: weight each half-res hit by the local BRDF over the ray PDF.
    weight = localBrdf(pixel.hit) / pixel.hitPdf
    result += color(pixel.hit) * weight
    weightSum += weight
result /= weightSum

(Image build: +spatial reconstruction → +temporal accumulation → +bilateral cleanup)

+Bilateral cleanup
▪ Secondary bilateral filter
  ▪ Much dumber than reconstruction
  ▪ Introduces blur
▪ Only run where variance high
  ▪ Variance from spatial reconstruction
  ▪ Temporally smoothed with hysteresis = 0.5
▪ Variable kernel width, sample count
(Image build: variance estimate → +bilateral cleanup → +temporal anti-aliasing)

Temporal Reprojection
▪ Two sources of velocity
  ▪ Per-pixel motion vector
  ▪ Reprojection of hit point
▪ Neither reprojection method robust
(Images: per-pixel motion vector · reprojection of hit point)

Dual-Source Reprojection
▪ Sample history using both reprojection strategies
▪ Estimate local statistics during spatial reconstruction
  ▪ RGB mean
  ▪ RGB standard deviation
  ▪ Of used hit samples
▪ Weigh contributions
  ▪ dist = (rgb – rgb_mean) / rgb_dev
  ▪ w = exp2(-10 * luma(dist))

Reprojection Clamp
▪ Build color box from local statistics
  ▪ rgb_min = rgb_mean – rgb_dev
  ▪ rgb_max = rgb_mean + rgb_dev
▪ Clamp reprojected values to box
  ▪ rgb = clamp(rgb, rgb_min, rgb_max)
  ▪ Same idea as in temporal anti-aliasing [Karis 2014]
▪ Weigh clamped contributions
  ▪ rgb_sum += val * w
▪ Introduces bias
▪ Fixes most ghosting
(see the clamp and weighting sketch after the “Integration” slide)

Transparency & Translucency
▪ Refractions
  ▪ Order-independent (OIT)
  ▪ Variable roughness
  ▪ Multiple IOR transitions
  ▪ Absorption
▪ Translucency
  ▪ Light scattering inside a medium
  ▪ Inspired by Translucency in Frostbite [Barré-Brisebois 2011]
(Image: glass and translucency)

Integration
▪ Multiple samples required for rough refractions and translucency
▪ Multi-layer screen-filtering complicated
  ▪ More on that later!
▪ Accumulate in texture-space instead
  ▪ Stable integration domain
  ▪ Static texture allocation
  ▪ 512*512 per object
  ▪ Time-sliced, ~1M rays/frame
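A minimal sketch of the history clamp and weighting from the Dual-Source Reprojection and Reprojection Clamp slides, for a single pixel. Variable names follow the slides; taking the absolute value of the normalized distance before the luma weight is an assumption made here so the weight stays in (0, 1].

def luma(rgb):
    # Rec. 709 luma weights.
    return 0.2126 * rgb[0] + 0.7152 * rgb[1] + 0.0722 * rgb[2]

def clamp_and_weigh_history(history_rgb, rgb_mean, rgb_dev):
    # Color box built from the local statistics gathered during spatial
    # reconstruction: mean +/- one standard deviation per channel.
    rgb_min = [m - d for m, d in zip(rgb_mean, rgb_dev)]
    rgb_max = [m + d for m, d in zip(rgb_mean, rgb_dev)]

    # Clamp the reprojected history sample to the box (as in TAA [Karis 2014]).
    clamped = [min(max(c, lo), hi)
               for c, lo, hi in zip(history_rgb, rgb_min, rgb_max)]

    # Samples far from the local distribution contribute less.
    # abs() is an assumption here; the slides write luma(dist) directly.
    dist = [abs(c - m) / max(d, 1e-6)
            for c, m, d in zip(history_rgb, rgb_mean, rgb_dev)]
    w = 2.0 ** (-10.0 * luma(dist))
    return clamped, w

Per the slides, both history sources (per-pixel motion vector and hit-point reprojection) would be accumulated as rgb_sum += val * w and normalized by the total weight.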
Translucency Breakdown
▪ For every valid position & normal
▪ Flip normal and push (ray) inside
▪ Launch rays in uniform sphere dist.
  ▪ (Importance-sample phase function)
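A minimal sketch of the translucency ray setup from the breakdown above: flip the normal, push the origin slightly inside the surface, and pick a direction from a uniform sphere distribution. The bias constant is a hypothetical placeholder, and as the slide notes, the phase function could instead be importance-sampled.

import math

def uniform_sphere_direction(u1, u2):
    # Uniformly distributed direction on the unit sphere from two random numbers.
    z = 1.0 - 2.0 * u1
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def translucency_ray(position, normal, u1, u2, bias=1e-3):
    # Flip the surface normal and push the ray origin inside the object,
    # then scatter in a uniform sphere distribution.
    flipped = tuple(-n for n in normal)
    origin = tuple(p + bias * f for p, f in zip(position, flipped))
    direction = uniform_sphere_direction(u1, u2)
    return origin, direction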