Texture Mapping


Outline
• Modeling surface details with images
• Texture parameterization
• Texture evaluation
• Anti-aliasing and textures

Roger Crawfis, Ohio State University

Why use textures?
• Modeling complexity: images let us add detail without adding geometry.

Quote
“I am interested in the effects on an object that speak of human intervention. This is another factor that you must take into consideration. How many times has the object been painted? Written on? Treated? Bumped into? Scraped? This is when things get exciting. I am curious about: the wearing away of paint on steps from continual use; scrapes made by a moving dolly along the baseboard of a wall; acrylic paint peeling away from a previous coat of an oil-base paint; cigarette burns on tile or wood floors; chewing gum – the black spots on city sidewalks; lovers’ names and initials scratched onto park benches…”
– Owen Demers, [digital] Texturing & Painting, 2002

(Example images by Jason Bryan.)

Texture Mapping: the two questions
• Given an object and an image:
– How does the image map to the vertices, or to the set of points defining the object?
• Given an object with an image mapped to it:
– How do we use the color information from the texture image to determine a pixel’s color?

• Problem #1: Fitting a square peg in a round hole (mapping a rectangular image onto an arbitrary surface).
• Problem #2: Mapping from a pixel to a texel.

What is an image? What is a texture?
• How would I rotate an image 45 degrees? How would I translate it 0.5 pixels? Both require samples that fall between the original pixels.
• Given (u,v), we want F(u,v), a continuous reconstruction:
– F(u,v) = { R(u,v), G(u,v), B(u,v) }
– = { I(u,v) }
– = { index(u,v) }
– = { alpha(u,v) }
– = { normals(u,v) }
– = { surface_height(u,v) }
– = ...

What is the source of your texture?
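Translating an image by half a pixel makes the need for a continuous reconstruction concrete: no stored sample sits between two pixels, so a value must be reconstructed there. A minimal 1-D sketch in Python (not from the slides; the function name is illustrative) using linear reconstruction:

```python
def shift_half_pixel(row):
    """Translate a 1-D image row by 0.5 pixels.

    There is no stored value 'between' samples, so we reconstruct a
    continuous signal (here: linear interpolation between neighbors)
    and resample it at the half-pixel positions.
    """
    return [0.5 * (row[i] + row[i + 1]) for i in range(len(row) - 1)]


print(shift_half_pixel([0, 2, 4]))  # samples midway between the originals
```

The same idea in 2D, with a rotation or scale in place of the shift, is exactly the resampling problem texture mapping has to solve.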
Procedural textures
• Procedural image: computed on the fly. Periodic patterns (e.g., a checkerboard) and everything else.
• Image sources: an RGB image, an intensity image, an opacity table.

Checkerboard (scale s = 10):
If (floor(u*s) + floor(v*s)) % 2 == 0
texture(u,v) = 0; // black
Else
texture(u,v) = 1; // white

RGB textures
• Place an image on the object. (Camuto 1998)

Intensity modulation textures
• Multiply the object’s color by that of the texture.

Opacity textures
• A binary mask; really redefines the geometry. (Lao 1998)

Color index textures
• A Microsoft extension for 8-bit textures.
• Also some extensions to SGI’s OpenGL to perform table look-ups after the texture samples have been computed.

Bump mapping
• Modifies the surface normals. More on this later. (Lao 1998)

Displacement mapping
• Modifies the surface position in the direction of the surface normal. (Lao 1998)

Reflection properties
• Kd, Ks
• BRDFs:
– Brushed aluminum
– Tweed
– Non-isotropic (anisotropic) surface microfacets

Texture and texel
• Each pixel in a texture map is called a texel.
• Each texel is associated with a 2D (u,v) texture coordinate.
• The range of u and v is [0.0, 1.0].
• For any (u,v) in (0–1, 0–1), we can find the corresponding value in the texture using some interpolation.

Two-stage mapping
1. Model the mapping: (x,y,z) -> (u,v)
2. Do the mapping

Image-space scan (inverse mapping)
For each scanline, y
For each pixel, x
compute u(x,y) and v(x,y)
copy texture(u,v) to image(x,y)
• Samples the warped texture at the appropriate image pixels.
• Problems:
– Finding the inverse mapping. Use one of the analytical mappings that are invertible, or a bilinear/triangle inverse mapping.
– May miss parts of the texture map.

Texture-space scan (forward mapping)
For each v
For each u
compute x(u,v) and y(u,v)
copy texture(u,v) to image(x,y)
• Places each texture sample at the mapped image pixel.
• Problems:
– May not fill the image.
– A forward mapping is needed.
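The checkerboard procedural texture can be written as a small runnable function. This is an illustrative Python sketch (the slide gives only pseudocode; the function name is mine). It tests the parity of the summed cell indices, so black and white alternate in both u and v:

```python
import math


def checkerboard(u, v, s=10):
    """Procedural checkerboard texture over (u, v) in [0, 1].

    s is the scale: the unit square is divided into an s-by-s grid of
    cells. A cell is black (0.0) when its row and column indices have
    the same parity, white (1.0) otherwise.
    """
    if (math.floor(u * s) + math.floor(v * s)) % 2 == 0:
        return 0.0  # black
    return 1.0      # white
```

Because the function is defined for every real (u, v), not just at pixel centers, it can be evaluated at any resolution without resampling artifacts from a stored image.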
Continuous functions F(u,v)
• We are given a discrete set of values: F[i,j] for i = 0,…,N, j = 0,…,M. How do we get a continuous F(u,v)?
• Nearest neighbor:
– F(u,v) = F[ round(N*u), round(M*v) ]
• Linear interpolation:
– i = floor(N*u), j = floor(M*v)
– interpolate from F[i,j], F[i+1,j], F[i,j+1], F[i+1,j+1]
• Higher-order interpolation:
– F(u,v) = Σi Σj F[i,j] h(u − i/N, v − j/M)
– h is called the reconstruction kernel: Gaussian, sinc function, splines.
– Like linear interpolation, we need to find the neighbors; usually four to sixteen.

Texture Parameterization
• Definition: the process of assigning texture coordinates, or a texture mapping, to an object.
• The mapping can be applied per-pixel or per-vertex.

Mapping to a 3D plane
• A simple affine transformation: rotate, scale, translate.

Mapping to a cylinder
• Rotate, translate, and scale in the uv-plane.
• u -> theta, v -> z
• x = r cos(theta), y = r sin(theta)

Mapping to a sphere
• Impossible without distortion! Severe distortion at the poles.
• u -> theta, v -> phi
• x = r sin(theta) cos(phi)
• y = r sin(theta) sin(phi)
• z = r cos(theta)

Example (Rogers): part of a sphere
x(θ,φ) = sin θ sin φ
y(θ,φ) = cos φ          0 ≤ θ ≤ π/2
z(θ,φ) = cos θ sin φ    π/4 ≤ φ ≤ π/2

(u,v) = (0,0) ⇔ (θ,φ) = (0, π/2)
(u,v) = (1,0) ⇔ (θ,φ) = (π/2, π/2)
(u,v) = (0,1) ⇔ (θ,φ) = (0, π/4)
(u,v) = (1,1) ⇔ (θ,φ) = (π/2, π/4)

• Set up the surface, define the correspondence, and voilà!
• We can even solve for (θ,φ) and (u,v). With A = π/2, B = 0, C = −π/4, D = π/2:
θ(u,v) = (π/2) u          φ(u,v) = π/2 − (π/4) v
u(θ,φ) = θ / (π/2)        v(θ,φ) = (π/2 − φ) / (π/4)
• So it looks like we have the texture space ⇔ object space part done!

All is not good
• Let’s take a closer look:
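The linear-interpolation reconstruction described above can be sketched in Python. The function name is mine, and the grid F is assumed to be a list of lists of size (N+1) × (M+1):

```python
import math


def bilinear(F, u, v):
    """Reconstruct a continuous F(u, v), u and v in [0, 1], from a
    discrete grid F[i][j] by linearly interpolating the four nearest
    samples F[i][j], F[i+1][j], F[i][j+1], F[i+1][j+1]."""
    N = len(F) - 1        # grid has (N+1) rows ...
    M = len(F[0]) - 1     # ... and (M+1) columns
    x, y = u * N, v * M
    # clamp so that i+1, j+1 stay inside the grid at u == 1 or v == 1
    i = min(int(math.floor(x)), N - 1)
    j = min(int(math.floor(y)), M - 1)
    a, b = x - i, y - j   # fractional offsets within the cell
    return ((1 - a) * (1 - b) * F[i][j] + a * (1 - b) * F[i + 1][j]
            + (1 - a) * b * F[i][j + 1] + a * b * F[i + 1][j + 1])
```

Nearest-neighbor reconstruction is the special case where the fractional offsets are rounded away; the higher-order kernels (Gaussian, sinc, splines) replace the hat-shaped weights here with wider ones over more neighbors.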
• We started with squares and ended with curves. It only gets worse for larger parts of the sphere.

Mapping to a cube
• Six planar faces; note the common seam along the u-axis.

Two-pass mappings
• Map the texture to an intermediate surface: plane, cylinder, sphere, or box.
• Map the object to the same surface.

S and O mapping
• Pre-distort the texture by mapping it onto a simple surface such as a plane, cylinder, sphere, or box.
• Then map the result of that onto the object’s surface.
• Texture → intermediate is the S mapping: T(u,v) → (xi,yi).
• Intermediate → object is the O mapping: (xi,yi) → (xo,yo,zo).

S mapping example: cylindrical mapping
s = (θ − θA) / (θB − θA)
t = (z − zA) / (zB − zA)

O mapping
• A method to relate the object surface to the intermediate surface (e.g., the cylinder).
• Bier and Sloan defined four main ways: the reflected ray, the object normal, the object centroid, or the intermediate surface normal (ISN).

Common combinations
• Plane/ISN (projector): works well for planar objects.
• Cylinder/ISN (shrink-wrap): works well for solids of revolution.
• Box/ISN
• Sphere/Centroid: works well for roughly spherical shapes.
• Box/Centroid

Plane/ISN
• Resembles a slide projector.
• Draw a vector from the point (vertex or object-space pixel point) in the direction of the texture plane; the vector intersects the plane at some point, depending on the coordinate system.
• Distortions on surfaces perpendicular to the plane. (Watt)

Cylinder/ISN
• The vector connects the point to the cylinder axis.
• Distortions on horizontal planes. (Watt)

Sphere/ISN
• Draw a vector from the sphere center through the point on the surface and intersect it with the sphere.
• Small distortion everywhere. (Watt)

What is this ISN?
• Intermediate surface normal.
• Needed to handle concave objects properly.
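The cylindrical S mapping can be turned into a short sketch. This is an illustrative Python version (names are mine): it recovers θ from the point on the intermediate cylinder with atan2, then normalizes θ and z into the texture window [θA, θB] × [zA, zB], per the formulas s = (θ − θA)/(θB − θA) and t = (z − zA)/(zB − zA). It assumes the window lies inside atan2's (−π, π] range:

```python
import math


def cylindrical_s_map(x, y, z, theta_a, theta_b, z_a, z_b):
    """S mapping onto a cylinder about the z-axis.

    (x, y, z) is a point on the intermediate cylinder; the result
    (s, t) lies in [0, 1] x [0, 1] when the point falls inside the
    angular window [theta_a, theta_b] and height window [z_a, z_b].
    """
    theta = math.atan2(y, x)                       # angle around the axis
    s = (theta - theta_a) / (theta_b - theta_a)    # normalize angle
    t = (z - z_a) / (z_b - z_a)                    # normalize height
    return s, t
```

The O mapping would then carry (s, t) from the cylinder onto the object along one of the four Bier-Sloan directions.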
• Flip the direction of the vector so that it points into the same half-space as the outward surface normal.
• Problem: a sudden flip in texture coordinates when the object crosses the axis.

Triangle mapping
• Given: a triangle with texture coordinates at each vertex, e.g. (u,v) = (0,0), (1,0), (0,1).
• Find: the texture coordinates at each point within the triangle.

Triangle interpolation
• Triangles define linear mappings:
– u(x,y,z) = Ax + By + Cz + D
– v(x,y,z) = Ex + Fy + Gz + H
• Plug in each vertex and its corresponding texture coordinate.
• Need to handle special cases, e.g. u == u(x,y) or v == v(x).
• In 2D, the equation f(x,y) = Ax + By + C defines a linear function. Knowing the values of f at three locations gives us enough information to solve for A, B, and C: three equations and three unknowns. This works provided the triangle lies in the xy-plane.
• In 3D, a linear function is f(x,y,z) = Ax + By + Cz + D. Four points uniquely determine this equation; hence a tetrahedron has a unique linear function through it.
• We need two 3D functions, u(x,y,z) and v(x,y,z), but a triangle gives only three points. However, x, y, and z are not independent on the triangle: the plane equation of the triangle supplies the missing fourth constraint.
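The 2D case above, a triangle in the xy-plane with a known texture coordinate at each vertex, is three equations in three unknowns. A hypothetical sketch (function name and Cramer's-rule formulation are mine) that solves for A, B, C:

```python
def plane_coeffs(p0, p1, p2, f0, f1, f2):
    """Solve f(x, y) = A*x + B*y + C given f's value at three points.

    p0, p1, p2 are (x, y) vertices of a non-degenerate triangle in the
    xy-plane; f0, f1, f2 are the values (e.g. the u or the v texture
    coordinate) at those vertices. Solved by Cramer's rule.
    """
    (x0, y0), (x1, y1), (x2, y2) = p0, p1, p2
    det = x0 * (y1 - y2) + x1 * (y2 - y0) + x2 * (y0 - y1)
    A = (f0 * (y1 - y2) + f1 * (y2 - y0) + f2 * (y0 - y1)) / det
    B = (f0 * (x2 - x1) + f1 * (x0 - x2) + f2 * (x1 - x0)) / det
    C = (f0 * (x1 * y2 - x2 * y1) + f1 * (x2 * y0 - x0 * y2)
         + f2 * (x0 * y1 - x1 * y0)) / det
    return A, B, C
```

Calling it once with the u values and once with the v values gives the full linear mapping; evaluating A*x + B*y + C at any interior point then interpolates the texture coordinate there.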