Texture Mapping

Recommended publications
Texture Mapping: Textures Provide Detail and Make Graphics Pretty
Textures provide detail and make graphics pretty:
• detail creates immersion
• immersion creates fun

Basic idea: paint pictures on all of your polygons.
• adds color data
• adds (fake) geometric and texture detail
• one of the basic graphics techniques, with tons of hardware support

Texture mapping:
• maps between a region of the plane and an arbitrary surface
• ensures the "right things" happen as the textured polygon is rendered and transformed

Parametric texture mapping:
• texture size and orientation are tied to the polygon
• the texture can modulate diffuse color, specular color, specular exponent, etc.
• separates "texture space" from "screen space"
• UV coordinates lie in the range [0…1]

Retrieving texel color (a sketch of this loop appears below):
• compute the pixel's (u,v) using barycentric interpolation
• look up the texture pixel (texel)
• copy its color to the pixel
• apply shading

How to parameterize? A classic problem: how to parameterize the earth (a sphere)? A very practical, important problem in the Middle Ages…
• latitude & longitude: distorts areas and angles
• planar projection: covers only half of the earth; distorts areas and angles
• stereographic projection: distorts areas
• Albers projection: preserves areas, distorts aspect ratio
• Fuller parameterization

No free lunch: every parameterization of the earth either
• distorts areas,
• distorts distances, or
• distorts angles.

Good parameterizations have low area distortion, low angle distortion, no obvious seams, and are in one piece. How do we achieve this?

Planar parameterization projects the surface onto a plane:
• quite useful in practice
• only partial coverage
• bad distortion where surface normals are perpendicular to the projection direction
In practice, multiple views are combined.

Cube map / skybox. Cube map textures:
• six 2D images arranged like the faces of a cube: +X, -X, +Y, -Y, +Z, -Z
• indexed by an unnormalized direction vector
Cube map vs. skybox: a skybox supplies the background, while cube maps map reflections to emulate a reflective surface (e.g.
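The texel-retrieval steps above condense into a few lines of code. Below is a minimal sketch, assuming a row-major texture with nearest-neighbor lookup; the Vec2, Color, and Texture types are illustrative, not code from the slides.

    // Interpolate per-vertex UVs with barycentric weights, then fetch the
    // nearest texel. Types and filtering are illustrative assumptions.
    #include <algorithm>
    #include <cstdint>
    #include <vector>

    struct Vec2 { float x, y; };
    struct Color { std::uint8_t r, g, b; };

    struct Texture {
        int width = 0, height = 0;
        std::vector<Color> texels;                 // row-major, width*height

        // Nearest-neighbor lookup; (u,v) in [0,1] spans the whole image.
        Color sample(float u, float v) const {
            int x = std::clamp(static_cast<int>(u * width),  0, width  - 1);
            int y = std::clamp(static_cast<int>(v * height), 0, height - 1);
            return texels[y * width + x];
        }
    };

    // Barycentric weights of point p with respect to triangle (a, b, c).
    void barycentric(Vec2 p, Vec2 a, Vec2 b, Vec2 c,
                     float& w0, float& w1, float& w2) {
        float denom = (b.y - c.y) * (a.x - c.x) + (c.x - b.x) * (a.y - c.y);
        w0 = ((b.y - c.y) * (p.x - c.x) + (c.x - b.x) * (p.y - c.y)) / denom;
        w1 = ((c.y - a.y) * (p.x - c.x) + (a.x - c.x) * (p.y - c.y)) / denom;
        w2 = 1.0f - w0 - w1;
    }

    // Color for a pixel inside a screen-space triangle with per-vertex UVs.
    Color shadePixel(const Texture& tex, Vec2 pixel,
                     Vec2 v0, Vec2 v1, Vec2 v2,
                     Vec2 uv0, Vec2 uv1, Vec2 uv2) {
        float w0, w1, w2;
        barycentric(pixel, v0, v1, v2, w0, w1, w2);
        float u = w0 * uv0.x + w1 * uv1.x + w2 * uv2.x;  // interpolated UV
        float v = w0 * uv0.y + w1 * uv1.y + w2 * uv2.y;
        return tex.sample(u, v);                         // texel -> pixel
    }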
Texture / Image-Based Rendering: Texture Maps
Slides from CS348B Lecture 12 (Pat Hanrahan, Spring 2005).

Texture maps control surface color and transparency, environment and irradiance maps, reflectance maps, shadow maps, displacement and bump maps, and the level-of-detail hierarchy.

How is texture mapped to the surface?
• dimensionality: 1D, 2D, 3D
• texture coordinates (s,t); surface parameters (u,v)
• direction vectors: reflection R, normal N, halfway H
• projection: cylinder
• developable surface: polyhedral net
• reparameterizing a surface: the old-fashioned model decal

What does texture control?
• surface color and opacity
• illumination functions: environment maps, shadow maps
• reflection functions: reflectance maps
• geometry: bump and displacement maps

Classic history:
• Catmull/Williams 1974: basic idea
• Blinn and Newell 1976: basic idea, reflection maps
• Blinn 1978: bump mapping
• Williams 1978, Reeves et al. 1987: shadow maps
• Smith 1980, Heckbert 1983: texture-mapped polygons
• Williams 1983: mipmaps
• Miller and Hoffman 1984: illumination and reflectance
• Perlin 1985, Peachey 1985: solid textures
• Greene 1986: environment maps/world projections
• Akeley 1993: Reality Engine
• light fields, BTFs

Texture mapping in one equation: 3D mesh + 2D texture = 2D image.

Surface color and transparency: Tom Porter's bowling pin (source: RenderMan Companion, plates 12 & 13). Reflection maps: Blinn and Newell, 1976. Gazing ball: Miller and Hoffman, 1984; a photograph of a mirror ball maps all directions to a circle, with resolution a function of orientation and the reflection indexed by the normal (a reflection-lookup sketch follows below). Environment maps: Interface, Chou and Williams (ca.
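As a concrete illustration of indexing a map by a direction vector, here is a minimal sketch in the spirit of the Blinn-Newell slides above, assuming a latitude-longitude environment image; the Vec3 type, helper names, and lat-long layout are assumptions for illustration, not code from the lecture.

    // Reflect the view direction about the unit normal, then convert the
    // resulting direction to (s,t) coordinates for a lat-long image.
    #include <cmath>

    struct Vec3 { float x, y, z; };

    constexpr float kPi = 3.14159265358979f;

    float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    // R = I - 2 (N.I) N, with I the incident direction, N the unit normal.
    Vec3 reflect(Vec3 i, Vec3 n) {
        float d = 2.0f * dot(n, i);
        return {i.x - d * n.x, i.y - d * n.y, i.z - d * n.z};
    }

    // Map a direction to (s,t) in [0,1]^2 for a latitude-longitude image.
    void latLongCoords(Vec3 dir, float& s, float& t) {
        float len = std::sqrt(dot(dir, dir));
        float theta = std::acos(dir.y / len);   // polar angle, 0..pi
        float phi = std::atan2(dir.z, dir.x);   // azimuth, -pi..pi
        s = (phi + kPi) / (2.0f * kPi);
        t = theta / kPi;
    }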
Real-Time Rendering Techniques with Hardware Tessellation
M. Nießner (Stanford University), B. Keinert (University of Erlangen-Nuremberg), M. Fisher (Stanford University), M. Stamminger (University of Erlangen-Nuremberg), C. Loop (Microsoft Research), and H. Schäfer (University of Erlangen-Nuremberg). Computer Graphics Forum, Volume 34 (2015), pp. 0–24.

Abstract: Graphics hardware has been progressively optimized to render more triangles with increasingly flexible shading. For highly detailed geometry, interactive applications restricted themselves to performing transforms on fixed geometry, since they could not incur the cost required to generate and transfer smooth or displaced geometry to the GPU at render time. As a result of recent advances in graphics hardware, in particular the GPU tessellation unit, complex geometry can now be generated on-the-fly within the GPU's rendering pipeline. This has enabled the generation and displacement of smooth parametric surfaces in real-time applications. However, many well-established approaches in offline rendering are not directly transferable due to the limited tessellation patterns or the parallel execution model of the tessellation stage. In this survey, we provide an overview of recent work and challenges in this topic by summarizing, discussing, and comparing methods for the rendering of smooth and highly-detailed surfaces in real-time.

1. Introduction: Graphics hardware originated with the goal of efficiently rendering geometric surfaces. GPUs achieve high performance by using a pipeline where large components are performed independently and in parallel. Hardware tessellation has attained widespread use in computer games for displaying highly-detailed, possibly animated, objects. In the animation industry, where displaced subdivision surfaces are the typical modeling and rendering primitive, hardware tessellation has also been identified as a
Deconstructing Hardware Usage for General Purpose Computation on GPUs
Budyanto Himawan (Dept. of Computer Science) and Manish Vachharajani (Dept. of Electrical and Computer Engineering), University of Colorado, Boulder, CO 80309. E-mail: {Budyanto.Himawan, manishv}@colorado.edu

Abstract: The high programmability and numerous compute resources on Graphics Processing Units (GPUs) have allowed researchers to dramatically accelerate many non-graphics applications. This initial success has generated great interest in mapping applications to GPUs. Accordingly, several works have focused on helping application developers rewrite their application kernels for the explicitly parallel but restricted GPU programming model. However, there has been far less work that examines how these applications actually utilize the underlying hardware. This paper focuses on deconstructing how General Purpose applications on GPUs (GPGPU applications) utilize the underlying GPU pipeline.

From the introduction: …performance, in 2001, NVidia revolutionized the GPU by making it highly programmable [3]. Since then, the programmability of GPUs has steadily increased, although they are still not fully general purpose. Since this time, there has been much research and effort in porting both graphics and non-graphics applications to use the parallelism inherent in GPUs. Much of this work has focused on presenting application developers with information on how to perform the non-trivial mapping of general purpose concepts to GPU hardware so that there is a good fit between the algorithm and the GPU pipeline. Less attention has been given to deconstructing how these general purpose applications use the graphics hardware itself. Nor has much attention been given to examining how GPUs (or GPU-like
Texture Mapping: Modeling an Orange
Problem: model an orange (the fruit).
• Start with an orange-colored sphere: too smooth.
• Replace the sphere with a more complex shape: takes too many polygons to model all the dimples.
• Solution: retain the simple model, but add detail as part of the rendering process.

Surface texture: we want to model the surface features of the object, not just the overall shape.
• Solution: take an image of a real orange's surface, scan it, and "paste" it onto a simple geometric model.
• The texture image is used to alter the color of each point on the surface.
• This creates the ILLUSION of a more complex shape.
• This is efficient, since it does not increase the geometric complexity of the model.

Recall the graphics rendering pipeline: Application (CPU), then Geometry, then Rasterizer (GPU), taking a 3D input scene to a 2D output image. The rasterizer stage does per-pixel operations:
• turn geometry into visible pixels on screen
• add textures
• resolve visibility (Z-buffer)

Types of texture mapping:
• texture mapping: uses images to fill the inside of polygons
• bump mapping: alters surface normals
• environment mapping: uses the environment as the texture

Texture mapping fundamentals:
• A texture is simply an image with coordinates (s,t).
• Each part of the surface maps to some part of the texture.
• The texture value may be used to set or modify the pixel value.
• For a 256x256 texture with (s,t) running from (0,0) to (1,1), the interpolated coordinate (0.78, 0.67) lands at texel (199.68, 171.52); the worked arithmetic appears below.

2D texture mapping: how do we map surface coordinates to texture coordinates? 1.
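The slide's numbers follow from scaling the normalized coordinates by the texture size. A minimal sketch of that arithmetic (the variable names are illustrative):

    // Normalized (s,t) scale by the texture dimensions, so fractional texel
    // addresses like (199.68, 171.52) arise and must then be filtered.
    #include <cstdio>

    int main() {
        const int width = 256, height = 256;  // texture size from the slide
        const float s = 0.78f, t = 0.67f;     // interpolated texture coords

        float texelX = s * width;             // 0.78 * 256 = 199.68
        float texelY = t * height;            // 0.67 * 256 = 171.52

        std::printf("(s,t)=(%.2f,%.2f) -> texel (%.2f, %.2f)\n",
                    s, t, texelX, texelY);
        return 0;
    }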
A Snake Approach for High Quality Image-Based 3D Object Modeling
Carlos Hernández Esteban and Francis Schmitt, École Nationale Supérieure des Télécommunications, France. E-mail: {carlos.hernandez, francis.schmitt}@enst.fr

Abstract: In this paper we present a new approach to high quality 3D object reconstruction by using well known computer vision techniques. Starting from a calibrated sequence of color images, we are able to recover both the 3D geometry and the texture of the real object. The core of the method is based on a classical deformable model, which defines the framework where texture and silhouette information are used as external energies to recover the 3D geometry. A new formulation of the silhouette constraint is derived, and a multi-resolution gradient vector flow diffusion approach is proposed for the stereo-based energy term.

From the introduction: …faces. They mainly work for 2.5D surfaces and are very dependent on the light conditions. A third class of methods use the color information of the scene. The color information can be used in different ways, depending on the type of scene we try to reconstruct. A first way is to measure color consistency to carve a voxel volume [28, 18]. But these methods only provide an output model composed of a set of voxels, which makes it difficult to obtain a good 3D mesh representation. Besides, color consistency algorithms compare absolute color values, which makes them sensitive to light condition variations. A different way of exploiting color is to compare local variations of the texture, such as in cross-correlation methods
UNWRELLA Step by Step Unwrapping and Texture Baking Tutorial
Contents: Introduction; Defining Seams in 3DSMax; Applying Unwrella; Texture Baking; Final Result; Unwrella FAQ, user manual.

Introduction: In this comprehensive tutorial we will guide you through the process of creating optimal UV texture maps. Despite the fact that Unwrella is a single-click solution, we have created this tutorial with a lot of material explaining basic Autodesk 3DSMax work and the philosophy behind the "Render to Texture" workflow. This method, known in game development as texture baking, together with deployment of the Unwrella plug-in, achieves the following quality benchmarks efficiently:
- textures with reduced texture-mapping seams
- minimized surface stretching
- automatic creation of the largest possible UV chunks with maximal use of the available space
- preservation of user-created UV seams
- a reduction in production time from 30 minutes to 3 minutes!
Please follow these steps and learn how to utilize this great tool in order to achieve the best results in minimum time during your everyday productions.

Unwrella UV mapping tutorial. Copyright 3d-io GmbH, 2009. All rights reserved. Please visit http://www.unwrella.com for more details.
9. Texture Mapping
From Chapter 9 of the OpenGL Programming Guide.

Chapter Objectives: After reading this chapter, you'll be able to do the following:
• Understand what texture mapping can add to your scene
• Specify texture images in compressed and uncompressed formats
• Control how a texture image is filtered as it's applied to a fragment
• Create and manage texture images in texture objects and, if available, control a high-performance working set of those texture objects
• Specify how the color values in the image combine with those of the fragment to which it's being applied
• Supply texture coordinates to indicate how the texture image should be aligned with the objects in your scene
• Generate texture coordinates automatically to produce effects such as contour maps and environment maps
• Perform complex texture operations in a single pass with multitexturing (sequential texture units)
• Use texture combiner functions to mathematically operate on texture, fragment, and constant color values
• After texturing, process fragments with secondary colors
• Perform transformations on texture coordinates using the texture matrix
• Render shadowed objects, using depth textures

So far, every geometric primitive has been drawn as either a solid color or smoothly shaded between the colors at its vertices—that is, they've been drawn without texture mapping. If you want to draw a large brick wall without texture mapping, for example, each brick must be drawn as a separate polygon. Without texturing, a large flat wall—which is really a single rectangle—might require thousands of individual bricks, and even then the bricks may appear too smooth and regular to be realistic.
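As a taste of the API the objectives above refer to, here is a minimal sketch of creating a texture object, setting its filtering, and specifying its image. It assumes a 64x64 RGBA image supplied by the caller and is a compressed illustration, not the chapter's own example code.

    // Create a texture object, set filter state, and load one image level.
    #include <GL/gl.h>

    // `pixels` points at 64x64 RGBA bytes supplied by the caller (assumed).
    GLuint makeTexture(const void* pixels) {
        GLuint name;
        glGenTextures(1, &name);             // create a texture object
        glBindTexture(GL_TEXTURE_2D, name);  // make it the current 2D texture

        // Filtering: how texels are sampled when magnified or minified.
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);

        // Specify the texture image itself (mipmap level 0, no border).
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, 64, 64, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, pixels);
        return name;
    }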
Programming Guide: Revision 1.4, June 14, 1999. Copyright 1998 3Dfx Interactive, Inc.
Voodoo3: High-Performance Graphics Engine for 3D Game Acceleration
Programming Guide, Revision 1.4, June 14, 1999
Copyright 1998 3Dfx Interactive, Inc. All Rights Reserved
3Dfx Interactive, Inc., 4435 Fortran Drive, San Jose CA 95134. Phone: (408) 935-4400, Fax: (408) 935-4424

Notice: 3Dfx Interactive, Inc. has made best efforts to ensure that the information contained in this document is accurate and reliable. The information is subject to change without notice. No responsibility is assumed by 3Dfx Interactive, Inc. for the use of this information, nor for infringements of patents or the rights of third parties. This document is the property of 3Dfx Interactive, Inc. and implies no license under patents, copyrights, or trade secrets.

Trademarks: All trademarks are the property of their respective owners.

Copyright Notice: No part of this publication may be copied, reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, photographic, or otherwise, or used as the basis for manufacture or sale of any items without the prior written consent of 3Dfx Interactive, Inc. If this document is downloaded from the 3Dfx Interactive, Inc. world wide web site, the user may view or print it, but may not transmit copies to any other party and may not post it on any other site or location.
Antialiasing
CSE 781, Han-Wei Shen.

What is an alias? An alias is a false signal; in telecommunication links it arises from beats between the signal frequency and the sampling frequency (from dictionary.com). It is not just telecommunication: aliasing is everywhere in computer graphics, because rendering is essentially a sampling process. Examples include jagged edges and false texture patterns, both caused by under-sampling, whether of a 1D signal or of a 2D texture on display.

How often is enough? What is the right sampling frequency? The sampling theorem (the Nyquist limit) states that the sampling frequency has to be at least twice the maximum frequency of the signal to be sampled; that is, we need at least two samples per cycle.

Reconstruction: after the (ideal) sampling is done, we need to reconstruct the original continuous signal, using a reconstruction filter. Common reconstruction filters are the box filter, the tent filter, and the sinc filter, sin(πx)/(πx).

Anti-aliased texture mapping has two problems to address, magnification and minification, both of which re-sample the signal at a different resolution.

Magnification is the simpler case: just resample the reconstructed signal. Common methods are nearest neighbor (box filter) and linear interpolation (tent filter); a bilinear sketch follows below.

Minification is harder: the signal's frequency is too high to avoid aliasing, because several texels cover one pixel (under-sampling happens). Possible solutions: increase the width of the ideal low-pass (sinc) filter, which effectively blurs the image; blur the image first (using any method) and then sample it; or increase the sampling rate or reduce the texture frequency. We will discuss: 1.
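A minimal sketch of the bilinear (tent-filter) magnification described above, with illustrative types rather than anything from the lecture notes:

    // Fetch the four texels around the sample point and blend them by the
    // fractional offsets in x and y.
    #include <algorithm>
    #include <cmath>
    #include <vector>

    struct ColorF { float r, g, b; };

    struct TextureF {
        int width = 0, height = 0;
        std::vector<ColorF> texels;   // row-major

        ColorF texel(int x, int y) const {
            x = std::clamp(x, 0, width - 1);
            y = std::clamp(y, 0, height - 1);
            return texels[y * width + x];
        }

        // Bilinear sample at (u,v) in [0,1]^2, texel centers at half-integers.
        ColorF sampleBilinear(float u, float v) const {
            float x = u * width - 0.5f, y = v * height - 0.5f;
            int x0 = static_cast<int>(std::floor(x));
            int y0 = static_cast<int>(std::floor(y));
            float fx = x - x0, fy = y - y0;           // fractional offsets

            ColorF c00 = texel(x0, y0),     c10 = texel(x0 + 1, y0);
            ColorF c01 = texel(x0, y0 + 1), c11 = texel(x0 + 1, y0 + 1);

            auto lerp = [](ColorF a, ColorF b, float t) {
                return ColorF{a.r + (b.r - a.r) * t,
                              a.g + (b.g - a.g) * t,
                              a.b + (b.b - a.b) * t};
            };
            return lerp(lerp(c00, c10, fx), lerp(c01, c11, fx), fy);
        }
    };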
Power Optimizations for Graphics Processors
Power Optimizations for Graphics Processors
B. V. N. Silpa, Department of Computer Science & Engineering, Indian Institute of Technology Delhi. Submitted in fulfillment of the requirements of the degree of Doctor of Philosophy to the Indian Institute of Technology Delhi, March 2011.

Certificate: This is to certify that the thesis titled Power Optimizations for Graphics Processors being submitted by B. V. N. Silpa for the award of Doctor of Philosophy in Computer Science & Engg. is a record of bona fide work carried out by her under my guidance and supervision at the Department of Computer Science & Engineering, Indian Institute of Technology Delhi. The work presented in this thesis has not been submitted elsewhere, either in part or full, for the award of any other degree or diploma.
Preeti Ranjan Panda, Professor, Dept. of Computer Science & Engg., Indian Institute of Technology Delhi

Acknowledgment: It is with immense gratitude that I acknowledge the support and help of my Professor Preeti Ranjan Panda in guiding me through this thesis. I would like to thank Professors M. Balakrishnan, Anshul Kumar, G.S. Visweswaran and Kolin Paul for their valuable feedback, suggestions and help in all respects. I am indebted to my dear friend G Krishnaiah for being my constant support and an impartial critique. I would like to thank Neeraj Goel, Anant Vishnoi and Aryabartta Sahu for their technical and moral support. I would also like to thank the staff of Philips, FPGA, Intel laboratories and IIT Delhi for their help.
Texture Filtering (Jim Van Verth)
Some definitions. Pixel: picture element, on screen. Texel: texture element, in the image (AKA the texture).

Suppose we have a texture (represented here in blue). We can draw the mapping from a pixel to this space, and represent it as a red square. If our scale from pixel space to texel space is 1:1, with no translation, we cover exactly one texel, so rendering is easy. If we translate, the pixel can cover more than one texel, with different areas. If we rotate, the pixel can cover even more texels, with odd-shaped areas. And with scaling up the texture (which scales down the pixel area in texture space), or scaling down the texture (which scales up the pixel area in texture space), we also get different types of coverage; a sketch of turning that footprint into a mip level follows below.

So what are we trying to do? We often think of a texture as blocks of "color" (we'll just assume the texture is grayscale in these diagrams to keep it simple). But really, those blocks represent samples at the texel centers. So what we really have is a set of snapshots of some function, taken a uniform distance apart. We don't really know what the function is, but assuming it's fairly smooth, we can imagine a continuous curve through the samples. Our goal is to reconstruct this function from the samples, so we can compute in-between values. We can draw our pixel (in 1D form) on this graph, where the red lines represent the boundary of the pixel in texture space.
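One common way to act on that footprint (an assumption here, not something the talk spells out) is to pick a mipmap level from the pixel's texel-space derivatives. A minimal sketch:

    // Measure how many texels the pixel covers via UV derivatives across
    // the pixel, then pick a mip level so one sample covers that footprint.
    #include <algorithm>
    #include <cmath>

    // du/dx, dv/dx, du/dy, dv/dy are texel-space changes per screen pixel.
    float mipLevel(float dudx, float dvdx, float dudy, float dvdy) {
        float fx = std::sqrt(dudx * dudx + dvdx * dvdx);  // footprint along x
        float fy = std::sqrt(dudy * dudy + dvdy * dvdy);  // footprint along y
        float footprint = std::max(fx, fy);               // texels per pixel
        // 1 texel/pixel -> level 0; each doubling of coverage -> next level.
        return std::max(0.0f, std::log2(footprint));
    }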