CMSC427 Computer Graphics

Matthias Zwicker, Fall 2018

Today: advanced shading techniques
• Bump mapping
• Shadows
• Shadow mapping
• Shadow mapping in OpenGL

Bump mapping
• Surface detail is often the result of small perturbations in the surface geometry
  – Modeling such detail with actual geometry would require an impractical number of triangles
• Bump mapping alters the surface normal instead
  – Normals are encoded in texture maps
  – Provides the illusion of small-scale surface detail
  – Does not change the geometry (triangles)
• Requires per-pixel shading using a fragment program

[Figure: a plane rendered with and without bump mapping, and the bump texture used]

Two steps:
1. Generating and storing bump maps
2. Rendering with bump maps

Generating bump maps
• Usually done in a pre-process
• Input
  – Texture map that encodes small surface displacements: a height field h(u, v)
  – E.g., a grayscale image interpreted as height values
• Output
  – Texture map that encodes the normals of the displaced surface
  – This texture is stored as an image and read by the application
• The (unnormalized) normal of the height field is (-∂h/∂u, -∂h/∂v, 1)
• In the discrete case, approximate the derivatives by central differences, e.g.
  ∂h/∂u ≈ (h(u+Δu, v) - h(u-Δu, v)) / (2Δu)
  – Usually Δu = Δv = 1
• Normalize the length of the normal!

Storing bump maps
• Encode the normal direction in the RGB color channels
  – The coordinates of a unit normal lie in [-1, 1]³
  – Map the range [-1, 1] to [0, 255] for all channels

Rendering with bump maps
• When applying a bump map to a curved surface, how are the normals stored in the bump map related to the surface?
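The generation and storage steps above can be sketched in a few lines of plain Python. This is a minimal illustration, not part of any API; `bump_map_normals` and `pack_rgb` are invented names.

```python
# Sketch of bump-map generation: height field -> per-texel unit normals -> RGB.
import math

def bump_map_normals(height, du=1.0, dv=1.0):
    """Compute unit normals of a height field by central differencing."""
    h = len(height)       # rows (v direction)
    w = len(height[0])    # columns (u direction)
    normals = [[None] * w for _ in range(h)]
    for v in range(h):
        for u in range(w):
            # Central differences, clamped at the borders.
            hu = (height[v][min(u + 1, w - 1)] - height[v][max(u - 1, 0)]) / (2 * du)
            hv = (height[min(v + 1, h - 1)][u] - height[max(v - 1, 0)][u]) / (2 * dv)
            n = (-hu, -hv, 1.0)
            length = math.sqrt(n[0] ** 2 + n[1] ** 2 + n[2] ** 2)
            normals[v][u] = tuple(c / length for c in n)  # normalize!
    return normals

def pack_rgb(n):
    """Map a unit normal from [-1, 1]^3 to RGB bytes in [0, 255]."""
    return tuple(round((c * 0.5 + 0.5) * 255) for c in n)

# A flat height field yields the straight-up normal (0, 0, 1), which packs
# to the familiar "bump map blue" (128, 128, 255).
flat = [[0.0] * 4 for _ in range(4)]
print(pack_rgb(bump_map_normals(flat)[1][1]))  # (128, 128, 255)
```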
• Normals are defined relative to the local tangent and normal vectors

[Figure: normals in the bump map, and the bump map applied to a curved surface relative to the local tangent vector]

Rendering with bump maps
• Bump map normals are defined in tangent space
  – Defined by two tangent vectors and the normal
• We define a tangent space for each triangle
  – Texture coordinates provide a parameterization of each triangle, i.e., a parametric patch x(u, v)
  – Compute the tangent vectors using partial derivatives of the parameterization
• For shading, we need to transform normals from tangent space to camera space

Tangent space
• A triangle with texture coordinates can be expressed as a parametric surface x(u, v)
  – Triangle vertices in object space: p0, p1, p2
  – Texture coordinates: (u0, v0), (u1, v1), (u2, v2)
• Interpolation constraints: we know x(ui, vi) = pi at the three vertices
• Solve for the affine function x(u, v) using the three constraints at the vertices; its partial derivatives are the tangent vectors
• Tangent space is defined by two tangent vectors (the tangent t and the "bi-tangent" b) and the normal vector n
• t, b, n are defined in object-space coordinates
• The tangent and bi-tangent are not orthogonal in general
• No normalization is necessary

Normal in object space
• The normal map stores normals in tangent coordinates
  – Basis vectors t, b, n
• We can transform a normal from tangent space to object space
  – Given values [bm0, bm1, bm2] from the bump map
  – Unpacked from [0, 1] to the range [-1, 1]
  – The object-space normal is then bm0·t + bm1·b + bm2·n

Storing tangent vectors
Before rendering
• For each triangle, compute the tangent and bi-tangent vectors
• At each vertex, average the tangent and bi-tangent vectors over the adjacent triangles to get smooth transitions between triangles
• Store the tangent vector as an additional vertex attribute
  – Only one tangent vector and the normal are necessary
  – The second tangent vector is computed on the fly

Rendering
Vertex shader
• Per-vertex input
  – Vertex position, normal, and tangent vector in object space
  – Bump map texture coordinates
• Compute the bi-tangent vector
• Transform everything to camera space using the modelview matrix
• Output to the fragment shader (will be interpolated to each pixel)
  – Vertex position, tangent, bi-tangent, and normal vector in camera space
  – Bump map texture coordinates

Rendering
Fragment shader
• Transform the normal [bm0, bm1, bm2] stored in the bump map to camera coordinates
  – Use the t, b, n basis to transform to object space
  – Use the modelview matrix to transform from object space to camera space
  – Normalize
• Perform lighting in camera coordinates

Caveats
• Need a mesh with texture coordinates to define the tangent space
• Avoid triangles with zero area in texture space
  – No valid tangent space can be computed for them
• Avoid triangles with negative area in texture space
  – May happen when the texture is mirrored
• Avoid non-uniform stretching of the bump map

Combination with env. map
• "Environment mapped bump mapping" (EMBM)
• Use the bump-mapped normal to compute the reflection vector, then look up the cube map
  http://zanir.wz.cz/?paged=3&lang=en
• Use an additional 'dirt' texture to modulate the strength of the reflection from the environment map

Tutorials
• Caution: slightly different derivation
  http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-13-normal-mapping/
• OpenGL Shading Language book
  – Its bump mapping example performs shading in tangent space

Today
• Bump mapping
• Shadows
• Shadow mapping
• Shadow mapping in OpenGL

Why are shadows important?
• Cues on scene lighting
• Contact points
• Depth cues
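Before going further into shadows, the tangent-space construction from the bump-mapping part above can be made concrete. This sketch solves the two edge constraints e1 = Δu1·t + Δv1·b and e2 = Δu2·t + Δv2·b for t and b, then transforms a bump-map normal into object space; all helper names are invented for the example.

```python
# Per-triangle tangent space and the tangent-to-object normal transform.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def scale(a, s): return tuple(x * s for x in a)
def add3(a, b, c): return tuple(x + y + z for x, y, z in zip(a, b, c))

def tangent_space(p0, p1, p2, uv0, uv1, uv2):
    """Solve the interpolation constraints for tangent t and bi-tangent b."""
    e1, e2 = sub(p1, p0), sub(p2, p0)
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    det = du1 * dv2 - du2 * dv1   # zero for degenerate texture-space triangles!
    t = scale(sub(scale(e1, dv2), scale(e2, dv1)), 1.0 / det)
    b = scale(sub(scale(e2, du1), scale(e1, du2)), 1.0 / det)
    return t, b

def tangent_to_object(bm, t, b, n):
    """Unpack bump-map values from [0,1] to [-1,1], express in the t, b, n basis."""
    x, y, z = (2.0 * c - 1.0 for c in bm)
    return add3(scale(t, x), scale(b, y), scale(n, z))

# Axis-aligned triangle in the xy-plane: t and b coincide with the x and y axes.
t, b = tangent_space((0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0), (1, 0), (0, 1))
print(t, b)  # (1.0, 0.0, 0.0) (0.0, 1.0, 0.0)
# The neutral bump-map value (0.5, 0.5, 1.0) gives back the unperturbed normal.
print(tangent_to_object((0.5, 0.5, 1.0), t, b, (0, 0, 1)))  # (0.0, 0.0, 1.0)
```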
Why are shadows important?
• Realism, e.g., self-shadowing

[Figure: scene with and without self-shadowing]

Terminology
• Umbra: fully shadowed region
• Penumbra: partially shadowed region
• An (area) light source and an occluder together cast a shadow with umbra and penumbra onto a receiver

Hard and soft shadows
• Point and directional lights lead to hard shadows with no penumbra
• Area light sources lead to soft shadows with a penumbra

[Figure: hard shadow from a point light vs. soft shadow from an area light]

Shadows for interactive rendering
• Focus on hard shadows
  – Soft shadows are often too expensive to compute in interactive graphics
• Two main techniques
  – Shadow mapping
  – Shadow volumes
• Many variations and subtleties; still an active research area

Today
• Bump mapping
• Shadows
• Shadow mapping
• Shadow mapping in OpenGL

Shadow mapping
http://en.wikipedia.org/wiki/Shadow_mapping
Main idea
• A scene point is lit by the light source if it is visible from the light source
• Determine visibility from the light source by placing a camera at the light source position and rendering the scene

Two-pass algorithm
First pass
• Render the scene with the camera placed at the light source position
• Store the depth image (the shadow map)
Second pass
• Render the scene from the camera (eye) position
• At each pixel, compare the distance to the light source with the corresponding depth value stored in the shadow map
  – If the pixel's distance is larger than the stored value, the pixel is in shadow
  – If it is smaller or equal, the pixel is lit

Issues
• Limited field of view of the shadow map
• Z-fighting
• Sampling problems
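The two-pass comparison above can be illustrated with a deliberately simplified toy: a directional light shining straight down the -y axis over a 1D "shadow map" keyed by x column. All names and the point scene are invented; a real implementation rasterizes depth on the GPU in light clip space.

```python
# Toy two-pass shadow mapping for a directional light looking down -y.
import math

def first_pass(scene_points):
    """Render from the light: per shadow-map texel, keep the depth of the
    closest surface seen from the light (here: the highest y per x column)."""
    shadow_map = {}
    for x, y in scene_points:
        texel = round(x)     # crude rasterization into texels
        depth = -y           # the light looks down, so depth grows with -y
        shadow_map[texel] = min(shadow_map.get(texel, math.inf), depth)
    return shadow_map

def second_pass_lit(point, shadow_map):
    """Render from the eye: compare the point's depth from the light with the
    stored shadow-map depth; strictly larger means the point is in shadow."""
    x, y = point
    stored = shadow_map.get(round(x), math.inf)
    return -y <= stored

occluder = [(0.0, 2.0), (1.0, 2.0)]             # small roof at height 2
ground = [(0.0, 0.0), (1.0, 0.0), (5.0, 0.0)]   # floor at height 0
shadow_map = first_pass(occluder + ground)

print(second_pass_lit((0.0, 0.0), shadow_map))  # False: under the roof, shadowed
print(second_pass_lit((5.0, 0.0), shadow_map))  # True: away from the roof, lit
```

Note that the equal-depth case returns "lit", matching the rule on the slide; the next sections show why, with finite precision, this case needs a bias.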
Limited field of view
• What if a scene point is outside the field of view of the shadow map?
• Use six shadow maps arranged in a cube
• Requires a rendering pass for each shadow map!

Z-fighting
• In theory, the depth values of points visible from the light source are equal in both rendering passes
• Because of limited resolution, the depth of a pixel visible from the camera can come out larger than the stored shadow map value, so the pixel is wrongly considered in shadow
• Need to add a bias in the first pass to make sure visible pixels are lit

Solution
• Add a bias when rendering the shadow map
  – Move the geometry away from the light by a small amount
• Finding the correct amount of bias is tricky

[Figure: correct bias vs. not enough bias vs. too much bias]

Sampling problems
• A shadow map pixel may project to many image pixels
• Ugly stair-stepping artifacts

Solutions
• Increase the resolution of the shadow map
  – Not always sufficient
• Split the shadow map into several slices
• Tweak the projection for shadow map rendering
  – Light space perspective shadow maps (LiSPSM)
    http://www.cg.tuwien.ac.at/research/vr/lispsm/
  – With GLSL source code!
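The bias trade-off described above can be made concrete with plain numbers; all depth values here are invented for illustration.

```python
# Shadow acne and the bias fix, in numbers.

def lit(pixel_depth, shadow_map_depth, bias=0.0):
    """Pixel is lit if its depth from the light does not exceed the
    (bias-shifted) shadow-map depth."""
    return pixel_depth <= shadow_map_depth + bias

stored = 10.0    # depth written for a surface in the first pass
seen = 10.003    # the same surface, re-computed at limited precision

print(lit(seen, stored))             # False: the surface shadows itself (acne)
print(lit(seen, stored, bias=0.01))  # True: a small bias keeps the surface lit

# Too much bias detaches shadows from their occluders: a point 0.05 units
# behind a blocker is now wrongly treated as lit.
print(lit(stored + 0.05, stored, bias=0.1))  # True, but should be shadowed
```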
Solutions
• Combination of splitting and LiSPSM
  – Basis for most serious implementations
  – For a list of advanced techniques, see http://en.wikipedia.org/wiki/Shadow_mapping

[Figure: basic shadow map vs. light space perspective shadow map]

Percentage closer filtering
• Goal: avoid stair-stepping artifacts
• Similar to texture filtering, but with a twist
  http://http.developer.nvidia.com/GPUGems/gpugems_ch11.html
• Instead of looking up one shadow map pixel, look up several
• Perform the depth test for each shadow map pixel
• Compute the percentage of lit shadow map pixels
• Supported in hardware for small filters (2×2 shadow map pixels)
• Larger filters (more shadow map lookups) can be used at a performance cost
• Fake soft shadows: the larger the filter, the softer the shadow boundary

Advanced shadow mapping
• See https://en.wikipedia.org/wiki/Shadow_mapping

Today
• Bump mapping
• Shadows
• Shadow mapping
• Shadow mapping in OpenGL

Shadow mapping with OpenGL
• Tutorial
  http://www.opengl-tutorial.org/intermediate-tutorials/tutorial-16-shadow-mapping/
• Recommended book: "OpenGL Shading Language" by Randi Rost

First pass
• Render the scene with the camera placed at the light source position
• Compute the light view (look-at) matrix
  – Similar to computing the camera matrix from a look-at point and up vector
  – Compute its inverse to get the world-to-light transform
• Determine the view frustum such
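The look-at construction for the first pass above can be sketched in plain Python, assuming the usual OpenGL convention that a camera looks down its local -z axis; all helper names are invented, and a real implementation would build a full 4x4 matrix.

```python
# Light view frame and its inverse, the world-to-light transform.
import math

def norm(v):
    l = math.sqrt(sum(c * c for c in v))
    return tuple(c / l for c in v)

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def world_to_light(light_pos, target, up):
    """Rows of the rotation are the light frame axes; translation is -R*pos,
    i.e., the inverse of the light-to-world (look-at) transform."""
    z = norm(tuple(l - t for l, t in zip(light_pos, target)))  # backward axis
    x = norm(cross(up, z))
    y = cross(z, x)
    rows = (x, y, z)
    trans = tuple(-sum(r[i] * light_pos[i] for i in range(3)) for r in rows)
    # 3x4 matrix [R | t] mapping world points into light coordinates
    return [list(rows[j]) + [trans[j]] for j in range(3)]

def apply(m, p):
    return tuple(sum(m[j][i] * p[i] for i in range(3)) + m[j][3] for j in range(3))

# Light at (0, 5, 0) looking straight down at the origin; the up vector must
# not be parallel to the viewing direction, so we pick (0, 0, -1).
M = world_to_light((0.0, 5.0, 0.0), (0.0, 0.0, 0.0), up=(0.0, 0.0, -1.0))
print(apply(M, (0.0, 0.0, 0.0)))  # (0.0, 0.0, -5.0): 5 units in front of the light
```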
