05 Detail mapping
Steve Marschner
CS5625 Spring 2016

Predicting Reflectance Functions from Complex Surfaces
Stephen H. Westin   James R. Arvo   Kenneth E. Torrance
Program of Computer Graphics, Cornell University, Ithaca, New York 14853

Abstract

We describe a physically-based Monte Carlo technique for approximating bidirectional reflectance distribution functions (BRDFs) for a large class of geometries by directly simulating optical scattering. The technique is more general than previous analytical models: it removes most restrictions on surface microgeometry. Three main points are described: a new representation of the BRDF, a Monte Carlo technique to estimate the coefficients of the representation, and the means of creating a milliscale BRDF from microscale scattering events. These allow the prediction of scattering from essentially arbitrary roughness geometries. The BRDF is concisely represented by a matrix of spherical harmonic coefficients; the matrix is directly estimated from a geometric optics simulation, enforcing exact reciprocity. The method applies to roughness scales that are large with respect to the wavelength of light and small with respect to the spatial density at which the BRDF is sampled across the surface; examples include brushed metal and textiles. The method is validated by comparing with an existing scattering model, and sample images are generated with a physically-based global illumination algorithm.

[Figure 1: Applicability of Techniques. A hierarchy of scales: macroscopic geometry at object scale (roughly 100-1000 mm and above), mesoscopic texture and bump maps (roughly 10-100 mm), milliscale (mesoscale) texels around 1 mm, and the microscale BRDF (roughly 0.01-0.1 mm).]

CR Categories and Subject Descriptors: I.3.7 [Computer Graphics]: Three-Dimensional Graphics and Realism.
Additional Key Words: spherical harmonics, Monte Carlo, anisotropic reflection, BRDF

1 Introduction

Since the earliest days of computer graphics, experimenters have recognized that the realism of an image is limited by the sophistication of the model of local light scattering [3, 12]. Non-physically-based local lighting models, such as that of Phong [12], although computationally simple, exclude many important physical effects and lack the energy consistency needed for global illumination calculations. Physically-based models [2, 5, 15] reproduce many effects better, but cannot model many surfaces, such as those with anisotropic roughness. Models that deal with anisotropic surfaces [8, 11] fail to assure physical consistency.

This paper presents a new method of creating local scattering models. The method has three main components: a concise, general representation of the BRDF, a technique to estimate the coefficients of the representation, and a means of using scattering at one scale to create a BRDF for a larger scale. The representation used makes it easy to enforce the basic physical property of scattering reciprocity, and its approximation does not require discretizing scattering directions as in the work of Kajiya [8] and Cabral et al. [1].

The method can predict scattering from any geometry that can be ray-traced: polygons, spheres, parametric patches, and even volume densities. Previous numerical techniques were limited to height fields, and analytical methods have been developed only for specific classes of surface geometry. The new method accurately models both isotropic and anisotropic surfaces such as brushed metals, velvet, and woven textiles.
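As a reading aid for the abstract's key idea (a hedged sketch in generic notation, not notation taken from the paper itself): expanding the BRDF in spherical harmonics over both the incident and outgoing directions turns the unknown function into a matrix of coefficients, and Helmholtz reciprocity becomes a simple symmetry constraint on that matrix.

```latex
% Sketch of the coefficient-matrix representation described in the abstract.
% Y_j, Y_k are spherical harmonic basis functions; M is the coefficient matrix.
f_r(\omega_i, \omega_o) \;\approx\; \sum_{j}\sum_{k} M_{jk}\, Y_j(\omega_i)\, Y_k(\omega_o),
\qquad
f_r(\omega_i, \omega_o) = f_r(\omega_o, \omega_i)
\;\Longleftrightarrow\;
M = M^{\mathsf{T}}.
```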
Figure 1 shows several representations used in realistic rendering, along with approximate scale ranges where each is applicable. At the smallest scale (size ≪ 1 mm), which we call microscale, the BRDF accurately captures the appearance of a …

Texture Maps
• Most flexible part of graphics hardware
• Textures can modulate
  – Material
    § Diffuse, specular/roughness (gloss maps)
  – Geometry
    § Positions: displacement mapping
    § Normals: bump mapping, normal mapping
  – Lighting
    § Environment mapping
    § Reflection mapping
    § Shadow mapping
slide courtesy of Kavita Bala, Cornell University

Displacement and Bump/Normal Mapping
• Mimic the effect of geometric detail / meso geometry
  – Also called detail mapping
[images: geometry, bump mapping, displacement mapping]
slide courtesy of Kavita Bala, Cornell University

Displacement Mapping
p′(u, v) = p(u, v) + h(u, v) n(u, v)
slide courtesy of Kavita Bala, Cornell University

Displacement Maps: where?
slide courtesy of Kavita Bala, Cornell University

Displacement Maps: vertex map
slide courtesy of Kavita Bala, Cornell University

Displacement Maps
• Pros
  – Gives you very complex surfaces
• Cons
  – Gives you very complex surfaces
  – Or boring results with small numbers of vertices
• Relationship with tessellation shaders
slide courtesy of Kavita Bala, Cornell University

Original / Tessellated / Displacement Mapped
slide courtesy of Kavita Bala, Cornell University

Bump Mapping
• "Simulation of Wrinkled Surfaces", Blinn 78
• Blinn: keep the surface, use new normals
slide courtesy of Kavita Bala, Cornell University

Bump Mapping
• Alter the normals of the surface
  – Only affects shading normals
• Also mimics the effect of small-scale geometry
  – Detail map
  – Except at silhouettes
  – Adds perceived bumps, wrinkles
slide courtesy of Kavita Bala, Cornell University

Bump Mapping
slide courtesy of Kavita Bala, Cornell University

Bump Mapping
slide courtesy of Kavita Bala, Cornell University
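To make the displacement-mapping formula above concrete, here is a minimal CPU-side sketch; the array layout, nearest-neighbour lookup, and `scale` parameter are assumptions for illustration, not something from the slides.

```python
import numpy as np

def displace_vertices(positions, normals, uvs, height_map, scale=1.0):
    """Apply p'(u, v) = p(u, v) + h(u, v) * n(u, v) to every vertex.

    positions, normals: (N, 3) arrays; uvs: (N, 2) array in [0, 1];
    height_map: (H, W) array of scalar heights. Nearest-neighbour lookup
    keeps the sketch short; a real implementation would filter.
    """
    h, w = height_map.shape
    # Convert UVs to integer texel coordinates (nearest neighbour).
    px = np.clip(np.round(uvs[:, 0] * (w - 1)).astype(int), 0, w - 1)
    py = np.clip(np.round(uvs[:, 1] * (h - 1)).astype(int), 0, h - 1)
    heights = height_map[py, px] * scale            # h(u, v) per vertex
    return positions + heights[:, None] * normals   # p + h * n

# Usage: displace a two-vertex "mesh" along +z using a 1x2 height map.
pos = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
nrm = np.array([[0.0, 0.0, 1.0], [0.0, 0.0, 1.0]])
uv  = np.array([[0.0, 0.0], [1.0, 0.0]])
hmap = np.array([[0.0, 0.5]])
print(displace_vertices(pos, nrm, uv, hmap))        # [[0 0 0], [1 0 0.5]]
```

In a real renderer this step would run in a vertex or tessellation shader, which is why the slides relate displacement mapping to tessellation shaders.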
How to change the normal?
• First, need some frame of reference
  – The normal is modified with respect to that frame
  – Have a tangent-space basis: t and b
  – Normal, tangent, and bitangent vectors
slide courtesy of Kavita Bala, Cornell University

Heightfield: Blinn's original idea
• Single scalar, more computation to infer N′
slide courtesy of Kavita Bala, Cornell University

Perturbed normal given a height map
• The normal is determined by the partial derivatives of the height in the local frame of the displacement map: n_disp = (−h_u, −h_v, 1)
• Approximation: heights are small compared to the radius of curvature (constant normal)
• Then the displaced surface is locally a linear transformation of the height field
• The normal transforms by the adjoint matrix (as normals always do)
• Perform 4 lookups to get 4 neighboring height values
• Subtract to obtain finite-difference derivatives

Height Field Bump Maps
• Older technique, less memory
• Texture map value is a height
• Gray-scale value: light is +, dark is −
slide courtesy of Kavita Bala, Cornell University

Bump Mapping
• Look up b_u and b_v
• N′ = N + b_u T + b_v B
• N′ is not normalized
slide courtesy of Kavita Bala, Cornell University

Rendering with Bump Maps
• Perturb N to get N′ using the bump map
• Transform L to the tangent space of the surface
  – Have N, T (tangent), bitangent B = T × N
• Compute N′ · L
slide courtesy of Kavita Bala, Cornell University

http://www.youtube.com/watch?v=1mdR2imNeZI
slide courtesy of Kavita Bala, Cornell University

Normal Maps
• Preferred technique for bump mapping on modern graphics cards
• Store new normals in a texture map
  – Encodes (x, y, z) mapped to [−1, 1]
• More memory but lower computation
[images: normal map, height map]
slide courtesy of Kavita Bala, Cornell University

• Store: colorComponent = 0.5 * normalComponent + 0.5
• Use: normalComponent = 2 * colorComponent − 1
slide courtesy of Kavita Bala, Cornell University

Normal Map
slide courtesy of Kavita Bala, Cornell University

Creating Normal Maps
• First create complex geometry
• Simplify (at modeling time) to a simple mesh with a normal map
slide courtesy of Kavita Bala, Cornell University

Displacement Maps vs. Normal Maps
slide courtesy of Kavita Bala, Cornell University

Compare with the opposite view
Original / Tessellated / Displacement Mapped
slide courtesy of Kavita Bala, Cornell University

Unreal 3
slide courtesy of Kavita Bala, Cornell University

2M polys
slide courtesy of Kavita Bala, Cornell University

5k
slide courtesy of Kavita Bala, Cornell University

Creating Normal Maps
slide courtesy of Kavita Bala, Cornell University

Which space is the normal map in?
• World space
  – Easy computation
    § Get normal
    § Get light vector
    § Compute shading
  – Can we use the same normal map for…
    § two walls?
    § a rotating object?
• Object space
  – Better, but cannot be reused for symmetric parts of an object
slide courtesy of Kavita Bala, Cornell University
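Two small helpers sketching the ideas above, assuming unit-length frame vectors and a scalar height texture; the function names and the central-difference spacing are made up for illustration. The first builds the perturbed normal (−h_u, −h_v, 1) from finite differences of four height lookups; the second decodes a stored normal-map texel and evaluates N′ · L with the light transformed into tangent space.

```python
import numpy as np

def normal_from_heights(h_left, h_right, h_down, h_up, texel_size=1.0):
    """Four height lookups -> finite differences -> n_disp = (-h_u, -h_v, 1)."""
    h_u = (h_right - h_left) / (2.0 * texel_size)   # central difference in u
    h_v = (h_up - h_down) / (2.0 * texel_size)      # central difference in v
    n = np.array([-h_u, -h_v, 1.0])
    return n / np.linalg.norm(n)

def decode_normal(rgb):
    """Undo the color encoding: n = 2 * c - 1, then renormalize."""
    n = 2.0 * np.asarray(rgb, dtype=float) - 1.0
    return n / np.linalg.norm(n)

def shade_diffuse(texel_rgb, light_world, T, B, N):
    """N'.L with the light vector transformed into the (T, B, N) tangent frame."""
    n_ts = decode_normal(texel_rgb)          # perturbed normal, tangent space
    tbn = np.stack([T, B, N])                # rows = tangent-frame axes (world space)
    l_ts = tbn @ light_world                 # world -> tangent space
    return max(float(np.dot(n_ts, l_ts)), 0.0)

# Usage: a bump sloping up in +u, lit from directly above.
n = normal_from_heights(0.0, 0.2, 0.1, 0.1)          # tilts away from +u
texel = 0.5 * n + 0.5                                 # encode as a color
frame = (np.array([1.0, 0, 0]), np.array([0, 1.0, 0]), np.array([0, 0, 1.0]))
L = np.array([0.0, 0.0, 1.0])
print(shade_diffuse(texel, L, *frame))                # < 1.0 because of the tilt
```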
Which space is the normal in?
• Tangent-space normals
  – Can be reused for deforming surfaces
  – Transform lighting into this space and shade
slide courtesy of Kavita Bala, Cornell University

Parallax Mapping
• Problem with normal mapping
  – No self-occlusion
  – Supposed to be a height field, but we never see this occlusion across different viewing angles
• Parallax mapping
  – Positions of objects move relative to one another as the viewpoint changes
slide courtesy of Kavita Bala, Cornell University

Parallax Mapping
• Want T_ideal
• Use T_p to approximate it
[diagram: view vector v, height h]
slide courtesy of Kavita Bala, Cornell University

Parallax Offset Limiting
• Problem: at steep viewing angles, the offset can be too large
• Limit the offset
slide courtesy of Kavita Bala, Cornell University

Parallax Offset Limiting
• Widely used in games – the standard in bump mapping
[images: normal mapping, parallax mapping, offset limiting]
slide courtesy of Kavita Bala, Cornell University

1,100-polygon object with 1.5-million-polygon parallax occlusion mapping
slide courtesy of Kavita Bala, Cornell University
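A minimal sketch of the parallax offset and its offset-limiting variant, assuming a tangent-space view vector (z along the surface normal) and a height already sampled at the original coordinates; the function name and `scale` factor are illustrative, and sign conventions vary between implementations.

```python
import numpy as np

def parallax_offset(uv, height, view_ts, limit_offset=False, scale=0.05):
    """Shift texture coordinates toward the viewer by the sampled height.

    uv:       (2,) original texture coordinates
    height:   scalar height sampled at uv, in [0, 1]
    view_ts:  (3,) unit view vector in tangent space, pointing toward the eye
    Without limiting: offset = h * v.xy / v.z  (blows up at grazing angles).
    With limiting:    offset = h * v.xy        (the widely used approximation).
    """
    v = np.asarray(view_ts, dtype=float)
    h = height * scale
    if limit_offset:
        offset = h * v[:2]
    else:
        offset = h * v[:2] / max(v[2], 1e-4)   # clamp to avoid huge offsets
    return np.asarray(uv, dtype=float) + offset

# Usage: a grazing view direction shows why limiting matters.
uv = np.array([0.5, 0.5])
grazing = np.array([0.95, 0.0, 0.31])           # nearly parallel to the surface
print(parallax_offset(uv, 1.0, grazing))                       # large shift in u
print(parallax_offset(uv, 1.0, grazing, limit_offset=True))    # modest shift
```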