Visualizing 3D Objects with Fine-Grain Surface Depth
Roi Méndez Fernández

Total Pages: 16

File Type: PDF, Size: 1020 KB

Visualizing 3D models with fine-grain surface depth
Author: Roi Méndez Fernández
Supervisors: Isabel Navazo (UPC), Roger Hubbold (University of Manchester)

Index

1. Introduction
2. Main goal
3. Depth hallucination
   3.1. Background
   3.2. Output
4. Displacement mapping
   4.1. Background
      4.1.1. Non-iterative methods
         a. Bump mapping
         b. Parallax mapping
         c. Parallax mapping with offset limiting
         d. Parallax mapping taking into account the slope
      4.1.2. Unsafe iterative methods
         a. Iterative parallax mapping
         b. Binary search
         c. Secant method
      4.1.3. Safe iterative methods
         a. Linear search
         b. Other methods
   4.2. Displacement mapping implementation
      4.2.1. Relief mapping
      4.2.2. Interval mapping
      4.2.3. Relaxed cone stepping
   4.3. Results of implementation
5. Illumination
   5.1. Background
   5.2. HDR images
      5.2.1. Environment map creation
      5.2.2. Median cut sampling
         a. Implementation
   5.3. Results of implementation
6. Geometry images
   6.1. Background
   6.2. Implementation and use
   6.3. Results of implementation and use
   6.4. Conclusions and future work
7. Acknowledgements
References
Appendices

Figures index

Figure 1: Model reconstructed from a set of photographs
Figure 2: Model reconstructed from photographs lit with one point light
Figure 3: Original photograph from a real Mayan building, the derived model relighted with different light conditions, and the same model with another illumination
Figure 4: Flow chart showing the steps in the process
Figure 5: Example of an input photograph pair
Figure 6: Example of a profile of a textured surface and the separation between the above-plane and below-plane surface models
Figure 7: Difference between a render using a depth hallucinated map and a render using a laser scanned map
Figure 8: Original photograph and reconstructed and relighted model
Figure 9: Depth map and computed normal map using the Sobel operator
Figure 10: Displacement mapping scheme
Figure 11: Displacement mapping process scheme
Figure 12: Displacement mapping
Figure 13: Bump mapping
Figure 14: Texture mapping and bump mapping
Figure 15: Parallax mapping
Figure 16: Comparison between bump mapping and parallax mapping
Figure 17: Parallax mapping with offset limiting
Figure 18: Parallax mapping and parallax mapping with offset limiting
Figure 19: Parallax mapping taking into account the slope
Figure 20: Comparison between parallax mapping with offset limiting and parallax mapping taking into account the slope
Figure 21: Comparison between parallax mapping with slope and iterative parallax mapping
Figure 22: Binary search
Figure 23: Binary search results
Figure 24: Secant method
Figure 25: Linear search
Figure 26: Linear search results
Figure 27: Cone stepping
Figure 28: Interval mapping
Figure 29: Comparison between interval mapping and relief mapping with two steps in the interval mapping and binary search step respectively
Figure 30: Comparison between conservative and relaxed cones
Figure 31: An intermediate step during computation of a cone ratio
Figure 32: Some viewing directions and relaxed cone
Figure 33: Cone map and relaxed cone map
Figure 34: Ray-relaxed cone intersection process
Figure 35: Relief mapping artifacts with 10 linear search steps and relief mapping with 20 linear search steps
Figure 37: Interval mapping using 10 linear search steps and interval mapping using 20 search steps.
Recommended publications
  • Advanced Texture Mapping
    Advanced texture mapping. Computer Graphics COMP 770 (236), Spring 2007. Instructor: Brandon Lloyd.
    From last time: physically based illumination models; the Cook-Torrance illumination model (microfacets, geometry term, Fresnel reflection); radiance and irradiance; BRDFs.
    Today's topics: texture coordinates; uses of texture maps (reflectance and other surface parameters, lighting, geometry); solid textures.
    Uses of texture maps: texture maps are used to add complexity to a scene, since it is easier to paint or capture an image than geometry. They can model reflectance (attach a texture map to a parameter), light (environment maps, light maps), and geometry (bump maps, normal maps, displacement maps, opacity maps and billboards).
    Specifying texture coordinates: texture coordinates are needed at every vertex; they are hard to specify by hand, and it is difficult to wrap a 2D texture around a 3D object (figure from Physically Based Rendering).
    Planar mapping: compute texture coordinates at each vertex by projecting the map coordinates onto the model. Cylindrical mapping, spherical mapping, cube mapping, and "unwrapping" the model are shown as figures (images from www.eurecom.fr/~image/Clonage/geometric2.html).
    Modelling surface properties: a texture can supply any parameter of the illumination model (ambient, diffuse, and specular color; specular exponent; roughness) (from www.ronfrazier.net).
    Modelling lighting: light maps supply the lighting directly; good for static environments.
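    The planar/cylindrical/spherical mapping slides above only name the projections, so here is a minimal GLSL sketch (not from the slides; function and variable names are my own) of how (u, v) can be derived from an object-space position for the spherical and cylindrical cases, assuming the model is roughly centred on the mapping axis:

    ```glsl
    // Sketch only: derive texture coordinates from an object-space position.
    // Assumes the model is roughly centred at the origin (hypothetical setup).
    const float PI = 3.14159265358979;

    vec2 sphericalUV(vec3 p)
    {
        vec3 d = normalize(p);
        float u = 0.5 + atan(d.z, d.x) / (2.0 * PI); // longitude mapped to [0, 1]
        float v = 0.5 + asin(d.y) / PI;              // latitude mapped to [0, 1]
        return vec2(u, v);
    }

    vec2 cylindricalUV(vec3 p, float yMin, float yMax)
    {
        float u = 0.5 + atan(p.z, p.x) / (2.0 * PI); // angle around the y axis
        float v = (p.y - yMin) / (yMax - yMin);      // normalized height
        return vec2(u, v);
    }
    ```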
  • Texture Mapping Objectives the Limits of Geometric Modeling
    Texture Mapping. CS 432 Interactive Computer Graphics, Prof. David E. Breen, Department of Computer Science. (Slides: E. Angel and D. Shreiner, Interactive Computer Graphics 6E, Addison-Wesley 2012.)
    Objectives: introduce mapping methods (texture mapping, environment mapping, bump mapping); consider basic strategies (forward vs backward mapping, point sampling vs area averaging).
    The Limits of Geometric Modeling: although graphics cards can render over 20 million polygons per second, that number is insufficient for many phenomena, such as clouds, grass, terrain, and skin.
    Modeling an Orange: consider the problem of modeling an orange (the fruit). Starting with an orange-colored sphere is too simple. Replacing the sphere with a more complex shape does not capture the surface characteristics (small dimples), and it takes too many polygons to model all the dimples.
    Modeling an Orange (2): take a picture of a real orange, scan it, and "paste" it onto a simple geometric model; this process is known as texture mapping. It still might not be sufficient because the resulting surface will be smooth; to change the local shape we need bump mapping.
    Three Types of Mapping: texture mapping (uses images to fill the inside of polygons); environment (reflection) mapping (uses a picture of the environment for texture maps; allows simulation of highly specular surfaces); bump mapping (emulates altering normal vectors during the rendering process).
  • Real-Time Recursive Specular Reflections on Planar and Curved Surfaces Using Graphics Hardware
    REAL-TIME RECURSIVE SPECULAR REFLECTIONS ON PLANAR AND CURVED SURFACES USING GRAPHICS HARDWARE
    Kasper Høy Nielsen, Niels Jørgen Christensen. Informatics and Mathematical Modelling, The Technical University of Denmark, DK 2800 Lyngby, Denmark. {khn,njc}@imm.dtu.dk
    ABSTRACT. Real-time rendering of recursive reflections has previously been done using different techniques. However, a fast unified approach for capturing recursive reflections on both planar and curved surfaces, as well as glossy reflections and interreflections between such primitives, has not been described. This paper describes a framework for efficient simulation of recursive specular reflections in scenes containing both planar and curved surfaces. We describe and compare two methods that utilize texture mapping and environment mapping, while having reasonable memory requirements. The methods are texture-based to allow for the simulation of glossy reflections using image-filtering. We show that the methods can render recursive reflections in static and dynamic scenes in real-time on current consumer graphics hardware. The methods make it possible to obtain a realism close to ray traced images at interactive frame rates.
    Keywords: Real-time rendering, recursive specular reflections, texture, environment mapping.
    1 INTRODUCTION. Mirror-like reflections are known from ray tracing, where shiny objects seem to reflect the surrounding environment. Rays are continuously traced through the scene by following reflected rays recursively until a certain depth is reached, thereby capturing both reflections and interreflections. [...] mapped surface sees the exact same surroundings regardless of its position. Static precalculated environment maps can be used to create the notion of reflection on shiny objects. However, for modelling reflections similar to ray tracing, reflections should capture the actual synthetic environment as accurately as possible, while still being feasible for interactive rendering [...]
  • Environment Mapping
    CS 418: Interactive Computer Graphics. Environment Mapping. Eric Shaffer. Some slides adapted from Angel and Shreiner, Interactive Computer Graphics 7E, Addison-Wesley 2015.
    Environment Mapping: how can we render reflections with a rasterization engine? When shading a fragment, we usually don't know the other scene geometry. Answer: use texture mapping. Create a texture of the environment and map it onto the mirror object's surface. Any suggestions how to generate (u, v)?
    Types of Environment Maps. Sphere mapping: a classic technique, not supported by WebGL. OpenGL supports sphere mapping, which requires a circular texture map equivalent to an image taken with a fisheye lens. Sphere mapping example (figure).
    Sphere Mapping Limitations: visual artifacts are common; sphere mapping is view dependent; acquisition of images is non-trivial (you need a fisheye lens, or have to render from a fisheye lens). Cube maps are easier to acquire or render.
    Acquiring a Sphere Map: take a picture of a shiny sphere in a real environment, or render the environment into a texture (see next slide).
    Why View Dependent? Conceptually a sphere map is generated like ray tracing: it records reflection under orthographic projection from a given view point. What is a drawback of this?
    Cube Map: cube mapping takes a different approach. Imagine an object is in a box, and you can see the environment through that box. Forming a Cube Map (figure). Reflection Mapping (figure).
    How Does WebGL Index into the Cube Map? To access the cube map you compute R = 2(N·V)N − V. Then, in your shader: vec4 texColor = textureCube(texMap, R);
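    The excerpt stops just after the cube-map lookup, so a minimal GLSL fragment-shader sketch of that lookup may be useful. It uses the same construction R = 2(N·V)N − V quoted above (equivalent to the built-in reflect(-V, N)); the uniform and varying names are placeholders, not from the slides:

    ```glsl
    #version 330 core

    uniform samplerCube texMap;  // environment cube map
    in vec3 vNormal;             // surface normal (world or eye space)
    in vec3 vViewDir;            // unit vector from the fragment towards the eye
    out vec4 fragColor;

    void main()
    {
        vec3 N = normalize(vNormal);
        vec3 V = normalize(vViewDir);
        vec3 R = 2.0 * dot(N, V) * N - V;  // mirror reflection direction
        fragColor = texture(texMap, R);    // textureCube() in WebGL 1 / older GLSL
    }
    ```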
  • Texture Mapping Effects
    CS 543 Computer Graphics. Texture Mapping Effects, by Cliff Lindsay. "Top Ten List" courtesy of David Letterman's Late Show and CBS.
    Talk format: a list of texture mapping effects from good to spectacular (my biased opinion). Highlights: define each effect; describe each effect briefly (theory and practice); talk about how each effect extends the idea of general texture mapping (previous talk), including pros and cons; demos of selected texture mapping effects.
    Texture Mapping Effect #10: Light Mapping. Main idea: the static diffuse lighting contribution for a surface can be captured in a texture and blended with another texture representing surface detail. Highlights: eliminates lighting calculation overhead; light maps are low resolution; light maps can be applied to multiple textures. [Images courtesy of flipcode.com]
    Light mapping example: below is a night scene of a castle. No lighting calculation is being performed at all in the scene. Left: no light map applied. Right: light map applied. [Images courtesy of www.gamasutra.com]
    Texture Mapping Effect #9: Non-Photorealistic Rendering. Main idea: recreating an environment that is focused on depicting a style or communicating a motif as effectively as possible, in contrast to photorealistic rendering, which tries to create as real a scene as possible. Highlights of NPR: toon shading; artistic styles (ink, water color, etc.); perceptual rendering. [Robo model with and without toon shading, image courtesy of Michael Arias]
    Non-Photorealistic Rendering. Simple example: Black
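    As a concrete illustration of the blend described for light mapping, the baked light map is commonly just multiplied with the detail texture in the fragment shader. This is a generic sketch with made-up sampler and varying names, not code from the talk:

    ```glsl
    #version 330 core

    uniform sampler2D detailTex;  // surface detail texture
    uniform sampler2D lightMap;   // baked (static) diffuse lighting, low resolution
    in vec2 vDetailUV;            // detail texture coordinates
    in vec2 vLightUV;             // light-map coordinates (often a second UV set)
    out vec4 fragColor;

    void main()
    {
        vec3 base  = texture(detailTex, vDetailUV).rgb;
        vec3 light = texture(lightMap, vLightUV).rgb;
        fragColor = vec4(base * light, 1.0);  // modulate: no per-frame lighting cost
    }
    ```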
  • Real-Time Rendering (Echtzeitgraphik)
    Real-Time Rendering (Echtzeitgraphik). Dr. Michael Wimmer, [email protected]. Shading and Lighting Effects.
    Overview: environment mapping (cube mapping, sphere mapping, dual-paraboloid mapping; reflections, refractions, speculars, diffuse/irradiance mapping), normal mapping, parallax normal mapping, advanced methods.
    Environment Mapping. Main idea: fake reflections using simple textures.
    Environment mapping assumption: index the environment map via orientation (the reflection vector, or any other similar lookup) and ignore the (reflection) position. True if the reflecting object shrinks to a single point, or if the environment is infinitely far away. The eye is not very good at discovering the fake.
    Environment mapping can be an "effect" (usually meaning "fake reflection") or a "technique" (i.e., a GPU feature), in which case it means "a 2D texture indexed by a 3D orientation". Usually the index vector is the reflection vector, but it can be anything else that's suitable.
    Environment mapping uses texture coordinate generation, multitexturing, and new texture targets. The main task is to map all 3D orientations to a 2D texture, independent of the application to reflections. (Figures: face layouts for sphere, cube, and dual-paraboloid maps.)
    Cube Mapping: OpenGL texture targets (top, left, front, right, back, bottom). glTexImage2D(GL_TEXTURE_CUBE_MAP_POSITIVE_X, 0, GL_RGB8, w, h, 0, GL_RGB, GL_UNSIGNED_BYTE,
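    The slides list sphere, cube and dual-paraboloid parameterizations without the lookup math; as one example, the classic sphere-map index for an eye-space reflection vector R can be computed as below. This is my own minimal formulation of the standard formula, not code from the lecture:

    ```glsl
    // Sketch: classic OpenGL-style sphere-map coordinates from an eye-space
    // reflection vector R (e.g. R = reflect(-V, N)).
    vec2 sphereMapUV(vec3 R)
    {
        float m = 2.0 * sqrt(R.x * R.x + R.y * R.y + (R.z + 1.0) * (R.z + 1.0));
        return R.xy / m + 0.5;  // front hemisphere maps into the unit square
    }
    ```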
  • Environment Mapping Good for ? with the Help of Environment Mapping, an Object Can Reflect the Environment
    Environment Mapping. Simone Kriglstein, Günter Wallner.
    Abstract: In this paper environment mapping is presented. First we will discuss environment mapping in general and the methods of parameterization: sphere, cubic and parabolic maps. For sphere, parabolic and cubic maps the way of indexing is shown. Then ways of prefiltering environment maps are shown. In detail we present how this can be done with the Phong reflection model and how the Fresnel term can be used. In addition, environment mapping with the help of BRDFs is presented. Environment mapped bump mapping (EMBM) will also be discussed. In this chapter we discuss mainly how the environment is mapped [...]
    [...] good for? With the help of environment mapping, an object can reflect the environment. For that, texture mapping can be used, but some problems occur. The problem of how the textures are saved must be solved (i.e. to save memory). The second problem is that the calculation mapping the 2-dimensional to 3-dimensional coordinates must be possible in a short time, so that these techniques can be used in interactive applications. For parameterization three different kinds exist; the most famous methods are spherical, cube and parabolic maps. We will discuss each of the methods mentioned above and enumerate the advantages and disadvantages of those parameterizations. Since there are different kinds of surfaces (wood, plastic, ...) with different kinds of reflection properties (Figure 1.2 shows a very good example of reflection), these properties should be taken into account when generating an environment map. The reflection properties can be divided into three parameters: diffuse, glossy and mirror.
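    The paper mentions using the Fresnel term together with environment maps; one common approximation (Schlick's, chosen here for illustration and not necessarily the formulation used in the paper) weights the environment reflection against the diffuse surface colour like this:

    ```glsl
    // Sketch: Schlick's Fresnel approximation blending an environment-map
    // reflection with the diffuse surface colour. F0 is the reflectance at
    // normal incidence (e.g. about 0.04 for plastic-like materials).
    vec3 shadeWithFresnel(vec3 diffuseColor, vec3 envColor, vec3 N, vec3 V, float F0)
    {
        float cosTheta = clamp(dot(normalize(N), normalize(V)), 0.0, 1.0);
        float F = F0 + (1.0 - F0) * pow(1.0 - cosTheta, 5.0);
        return mix(diffuseColor, envColor, F);  // more reflection at grazing angles
    }
    ```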
  • Texture Mapping
    Computer Graphics: Texture Mapping. Teacher: Assoc. Prof. Chengying Gao (高成英), School of Data and Computer Science. E-mail: [email protected]
    Basic conception of texture mapping.
    Limitations: even though the graphics card can display up to ten million polygons per second, it is difficult to simulate all the phenomena in nature: clouds, grass, leaves, landforms, hair, fire and water.
    The importance of surface texture; the quest for visual realism. Texturing example: skybox.
    Model of an orange: consider the method to create a model of an orange (or some other fruit). Coloring a simple sphere orange is not realistic, and using a more complicated sphere shape instead requires too many polygons to display the dimples in the surface.
    Why texture mapping? Texture is variation in surface attributes, like color, surface normals, specularity, transparency, surface displacement, etc. Computer generated images look more realistic if they are able to capture these complex details. It is difficult to represent these details using geometric modeling because they heavily increase computational requirements. Texture mapping is an effective method of "faking" surface details at a relatively low cost.
    What is texture mapping? Texture mapping is the process of transforming a texture onto a surface of a 3D object. It is like mapping a function onto a surface in 3D: the domain of the function could be 1D, 2D or 3D, and the function could be represented by either an array or an algebraic function.
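    As a tiny illustration of the last point (a texture can be an algebraic function rather than an image array), a procedural checkerboard can be written directly in GLSL. This is only a sketch, not material from the lecture:

    ```glsl
    // Sketch: a procedural 2D texture defined by a function instead of an image.
    vec3 checkerboard(vec2 uv, float tiles)
    {
        vec2 cell = floor(uv * tiles);
        float parity = mod(cell.x + cell.y, 2.0);  // alternates between 0 and 1
        return mix(vec3(0.1), vec3(0.9), parity);  // dark and light squares
    }
    ```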
  • Cube-To-Sphere Projections for Procedural Texturing and Beyond
    Journal of Computer Graphics Techniques, Vol. 7, No. 2, 2018. http://jcgt.org
    Cube-to-sphere Projections for Procedural Texturing and Beyond. Matt Zucker and Yosuke Higashi, Swarthmore College.
    Figure 1. Real-time shader illustrating stratified sampling of the sphere: (a) jittered uniform grid on cube; (b) nonlinear warp of each face; (c) projection to sphere via normalization; (d) resulting Voronoi diagram, generated in constant time per fragment.
    Abstract: Motivated by efficient GPU procedural texturing of the sphere, we describe several approximately equal-area cube-to-sphere projections. The projections not only provide low-distortion UV mapping, but also enable efficient generation of jittered point sets with O(1) nearest-neighbor lookup. We provide GLSL implementations of the projections with several real-time procedural texturing examples. Our numerical results summarize the various methods' ability to preserve projected areas as well as their performance on both integrated and discrete GPUs. More broadly, the overall cube-to-sphere approach provides an underexplored avenue for adopting existing 2D grid-based methods to the sphere. As an example, we showcase fast Poisson disk sampling.
    1. Introduction. Spheres are ubiquitous primitives in computer graphics, used to render everything from molecules to planets. Sampling and tessellating the sphere is fundamental to texturing and modeling these common objects. Most users of modern graphics APIs are familiar with cube mapping [Greene 1986], a technique that is commonly used to map between 3D direction vectors and 2D textures tiling the surface of a cube.
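    Step (c) of Figure 1, projection to the sphere via normalization, is simple enough to sketch. The following GLSL fragment is my own minimal version of that idea (for the +Z cube face only) and omits the equal-area warp of step (b); it is not the authors' code:

    ```glsl
    // Sketch: map a point on the +Z cube face to the unit sphere by normalization.
    // uv lies in [0,1]^2 on the face; the nonlinear equal-area warp is omitted.
    vec3 cubeFaceToSphere(vec2 uv)
    {
        vec2 fc = uv * 2.0 - 1.0;             // face coordinates in [-1,1]^2
        vec3 onCube = vec3(fc.x, fc.y, 1.0);  // point on the +Z face of the cube
        return normalize(onCube);             // radial projection onto the sphere
    }
    ```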
  • 15-462 Project 2: Texture Mapping and Shaders
    15-462 Project 2: Texture Mapping and Shaders
    Release Date: Tuesday, February 10, 2009. Due Date: Tuesday, February 24, 2009 at 23:59:59.
    1 Overview. In this project, you will learn to do texture mapping in OpenGL and implement bump mapping, environment mapping and the Fresnel effect in the OpenGL Shading Language (GLSL). Before you start, you will need to download the staff code from the course website and update your project 1 code according to section 4 of this document.
    Starter Code: http://www.cs.cmu.edu/~15462/proj/02/p2.tar.gz
    You may also find the following documents very useful for GLSL programming:
    GLSL Quick Reference Guide: http://www.opengl.org/sdk/libs/OpenSceneGraph/glsl_quickref.pdf
    GLSL Specification: http://www.opengl.org/registry/doc/GLSLangSpec.Full.1.20.8.pdf
    OpenGL Reference Pages: http://www.opengl.org/sdk/docs/man/
    2 Submission Process. Your handin directory may be found at /afs/cs.cmu.edu/academic/class/15462-s09-users/andrewid/p2/. You should submit the entire contents of the handout, your modified files, and any screenshots in this directory. Regardless of what machine you use to code your assignment, it must compile and run correctly on the machines in the Wean 5336 cluster. There are a lot of students in the course and only 25 machines, so you probably don't want to wait until the last minute to test it. To test your code, we will simply cd into the directory, then run make clean; make. Therefore, your code should be placed in the root directory uncompressed (i.e.
  • CG08-Texturing.Pdf
    Computer Graphics: Texturing. Philipp Slusallek.
    Texture: textures modify the input for shading computations, either via (painted) image textures or procedural functions. Example texture maps for reflectance, normals, shadow reflections, ...
    Definition: a texture maps texture coordinates to shading values. Input: 1D/2D/3D/4D texture coordinates, explicitly given or derived from other data (e.g. position, direction, ...). Output: a scalar or vector value. The modified values are used in shading computations:
    - Reflectance: changes the diffuse or specular reflection coefficient (k_d, k_s).
    - Geometry and normal (important for lighting): displacement mapping P' = P + ΔP; normal mapping N' = N + ΔN; bump mapping N' = N(P + tN).
    - Opacity: modulating transparency (e.g. for fences in games).
    - Illumination: light maps, environment mapping, reflection mapping.
    - And anything else ...
    Image textures. Wrap mode: texture coordinates (u, v) lie in [0, 1] x [0, 1]. What if (u, v) is not in the unit square?
    - Repeat: fractional coordinates t_u = u - floor(u), t_v = v - floor(v).
    - Mirror: fractional coordinates t_u = u - floor(u), t_v = v - floor(v); lattice coordinates l_u = floor(u), l_v = floor(v); mirror if odd: if (l_u % 2 == 1) t_u = 1 - t_u; if (l_v % 2 == 1) t_v = 1 - t_v.
    - Clamp: clamp u to [0, 1]: if (u < 0) tu = 0; else if (u > 1) tu = 1; else tu = u; and clamp v to [0, 1]: if (v < 0) tv = 0; else if (v > 1) tv = 1; else tv =
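    Collecting the three wrap modes above into working GLSL gives the sketch below (my own code; the slide's t_u, t_v and l_u, l_v names are kept where possible):

    ```glsl
    // Sketch: per-coordinate wrap modes corresponding to the slides above.
    float wrapRepeat(float u)
    {
        return u - floor(u);  // fractional coordinate: t_u = u - floor(u)
    }

    float wrapMirror(float u)
    {
        float t = u - floor(u);                     // fractional coordinate t_u
        float l = floor(u);                         // lattice coordinate l_u
        return (mod(l, 2.0) == 1.0) ? 1.0 - t : t;  // mirror on odd cells
    }

    float wrapClamp(float u)
    {
        return clamp(u, 0.0, 1.0);
    }

    // Apply the chosen mode to both coordinates, e.g. repeat:
    vec2 wrapUV(vec2 uv)
    {
        return vec2(wrapRepeat(uv.x), wrapRepeat(uv.y));
    }
    ```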
  • Texture Mapping
    Texture Mapping. University of California Riverside.
    Limits of geometric modeling: although modern GPUs can render millions of triangles per second, that's not enough sometimes.
    Texture mapping for detail: this image contains 8 polygons! Texture mapping comparison: no texture vs. with texture (Pixar, Toy Story).
    Other uses of textures: light maps, shadow maps, environment maps, bump maps, opacity maps, animation, looking up reflectances in an image. Image source: [1, 2]
    Texture mapping in the pipeline: geometry and pixels have separate paths through the pipeline; textures are applied in the fragment shader, at the end of the pipeline. This is efficient since relatively few polygons get past the clipper. (Figure: pipeline stages from vertex input through vertex shader, tessellation, geometry shader, post vertex processing, primitive assembly, rasterization, fragment shader, and per-sample operations.)
    uv mapping: a 2D texture is parameterized by (u, v). Assign the polygon vertices texture coordinates and interpolate within the polygon.
    Texturing triangles: store (u, v) at each vertex and interpolate inside triangles using barycentric coordinates. (Figure: texture space and object space; triangle vertices a, b, c with (u, v) = (0.2, 0.8), (0.4, 0.2), (0.8, 0.4) map to A, B, C.)
    p(β, γ) = p_a + β(p_b - p_a) + γ(p_c - p_a)
    u(β, γ) = u_a + β(u_b - u_a) + γ(u_c - u_a)
    v(β, γ) = v_a + β(v_b - v_a) + γ(v_c - v_a)
    Texture mapping with point sampling: map back to the texture image and use the nearest texel. Aliasing: point sampling textures can lead to aliasing artifacts. Magnification and minification.
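    The interpolation formulas above translate directly into code; here is a minimal GLSL sketch (names are mine) that evaluates the texture coordinates inside a triangle from its barycentric coordinates:

    ```glsl
    // Sketch: interpolate per-vertex texture coordinates at barycentric
    // coordinates (beta, gamma), matching u(β,γ) = u_a + β(u_b - u_a) + γ(u_c - u_a)
    // and the corresponding formula for v (both packed into a vec2 here).
    vec2 interpolateUV(vec2 uvA, vec2 uvB, vec2 uvC, float beta, float gamma)
    {
        return uvA + beta * (uvB - uvA) + gamma * (uvC - uvA);
    }
    ```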