Modelling Revisited

THIS WEEK'S OBJECTIVES

The main goal is to get up to scratch with 3DS Max and do a few of the things we explored in POV-Ray:

1. CSG
2. Material and texture:
   a) Material properties
   b) Texture maps
   c) Image maps
   d) Global illumination
   e) Environment maps
3. Lights

MATERIALS REVISITED: MATERIAL MODELLING IN 3DS MAX

MATERIALS MODELLING

The visual appearance of objects depends on the properties of the materials they are made of. How we represent materials depends on the rendering technique used:
• Realistic techniques require physical parameters, e.g. density, index of refraction, etc.
• Often, intermediate information ("colour") is used instead, to allow more direct control over appearance.

[Figure: Materials in MetaSL, © NVIDIA ARC GmbH]

MODELLING COMPLEXITY

Modelling implies manual creation of objects by artists/modellers, which means a lot of manual work. To achieve the realism and complexity of the real world there are several options:
• Manual modelling
• Scanning
• Procedural generation

[Figure: Surface Appearance Scanner at Columbia University, © Columbia Univ.]

TEXTURES

Textures add complexity to scenes without additional geometry: an image is used to approximate how additional geometry would affect the final rendering. There are many different types:
• Diffuse map (most common, this is what we have used so far): assigns colours to points on an object
• Ambient, specular, gloss maps: assign different illumination parameters (as in the Phong model)
• Bump, normal, displacement maps: affect the illumination by modifying the normal, etc.
• Reflection, refraction maps: approximate more complex global illumination effects

[Figures: diffuse map, gloss map, bump map, reflectance map]

RECAP: PHONG ILLUMINATION MODEL

Light reflected off a point depends on some (or all) of the following vectors:
• n: normal to the surface
• v: viewer direction
• l: light direction
• r: direction of reflection
together with the angle θ between n and l, and the angle φ between v and r.

DIFFUSE REFLECTION is the main contributor to the colour of an object. The colour (intensity) of light reflected off an object is determined by the light intensity R, the diffuse reflectance of the material k_d, and the angle θ between the surface normal and the light direction:

    I_d = R · k_d · cos θ

SPECULAR REFLECTION causes small glossy highlights on the surface of smooth objects. Its intensity is affected by the light intensity R, the specular reflectance of the material k_s, the glossiness/shininess α, and the angle φ between the viewer and the direction of reflection:

    I_s = R · k_s · (cos φ)^α

AMBIENT ILLUMINATION caters for randomly scattered light in the scene: multiply the ambient light in the scene R_a by the ambient reflectivity k_a:

    I_a = R_a · k_a

SHADING ALGORITHM (A.K.A. SHADER)

Repeat the process of calculating illumination across the surface of each object. We can take shortcuts by calculating lighting just once for each vertex and then interpolating between vertices.
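To make the three terms concrete, here is a minimal Python sketch (not lecture code) that evaluates them at a single surface point; a renderer would repeat this per vertex or per pixel as described above. The vector names follow the slide (n, l, v and the derived reflection direction r); the default material values (k_d = 0.7, k_s = 0.3, α = 32, and so on) and the choice to leave the highlight untinted are illustrative assumptions.

```python
import numpy as np

def normalize(v):
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def phong(surface_colour, n, l, v, R=1.0, Ra=0.1,
          kd=0.7, ks=0.3, ka=1.0, alpha=32):
    """Evaluate the Phong terms at a single surface point.

    n, l, v: unit vectors for the surface normal, the direction to the
    light and the direction to the viewer. R is the light intensity,
    Ra the ambient light in the scene; kd, ks, ka are the material
    reflectances and alpha the glossiness (specular exponent).
    """
    n, l, v = normalize(n), normalize(l), normalize(v)
    # Diffuse: I_d = R * k_d * cos(theta), theta between n and l
    I_d = R * kd * max(np.dot(n, l), 0.0)
    # Mirror the light direction about the normal to get r
    r = normalize(2.0 * np.dot(n, l) * n - l)
    # Specular: I_s = R * k_s * cos(phi)^alpha, phi between v and r
    I_s = R * ks * max(np.dot(v, r), 0.0) ** alpha
    # Ambient: I_a = R_a * k_a
    I_a = Ra * ka
    # Tint diffuse and ambient by the surface colour; leave the
    # highlight white (one common, but not the only, convention)
    return surface_colour * (I_d + I_a) + np.array([1.0, 1.0, 1.0]) * I_s

# A red surface lit from above and viewed from above: strong diffuse
# term, full-strength highlight, small ambient contribution.
print(phong(np.array([1.0, 0.0, 0.0]), n=[0, 1, 0], l=[0, 1, 0], v=[0, 1, 0]))
```

Note that the result is an unclamped intensity; a real shader would clamp or tone-map it before display.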
RECAP: THE RENDERING EQUATION

N.B. there is no need to understand this equation in detail, but it is important to know which factors affect it:

    I(x, x') = g(x, x') [ ε(x, x') + ∫_S ρ(x, x', x'') I(x', x'') dx'' ]

where
• I(x, x') is the light received at x from the point x'
• g(x, x') is the occlusion / distance term
• ε(x, x') is the light emitted from x' towards x
• ρ(x, x', x'') is the reflectance: how much of the light arriving at x' from x'' is passed on towards x
• the integral over S gathers the incoming light from all other surface points x''

Local illumination covers the diffuse, specular and ambient terms; global illumination adds reflection and transmission/refraction. Each of the map types below can be read as controlling one factor of this equation.

DIFFUSE MAP (affects the reflected light)

The simplest texturing directly applies a colour to the object; in basic illumination the colour is mostly based on the diffuse component.

SPECULAR MAP (affects the specular reflected light)

• Specular level and specular colour control the intensity of the specular highlights.
• Glossiness ("gloss map") / shininess controls the specular exponent in the Phong light model, or the roughness in other light models.

SPECULAR + GLOSSINESS MAP

A specular map allows a subset of the object to be shiny, without making the whole object uniformly shiny. [Example maps: diffuse colour, specular colour, glossiness (shiny vs. dull regions)]

SELF-ILLUMINATION MAP (affects the emitted light)

A.k.a. emission: models light sources on a surface. It does not depend on the surface normal or on the light sources in the scene.

VISIBILITY MAPPINGS (affect the occlusion term)

• Displacement map: shift in the texture lookup based on the view-ray intersection with a heightfield; can account for occlusions.
• Opacity map: transparency mapping across the primitive.

BUMP MAP (affects all reflected light)

Uses a texture as an input to perturb the geometric representation (the surface normal used for shading). [Figures: bump map alone; object with diffuse texture and bump map]

LIGHT MAPS (affect the incoming light)

An approximation of complex lighting in the scene: pre-computed incident light across 2D surfaces. When there are multiple light sources it can be too complex to calculate how they illuminate surfaces; if the light and object are not expected to move, we can instead use static textures that capture the effect of lighting the object. This is a very popular technique in computer games. [Figures: light maps only; combined textures and lightmaps. © 2003, Keshav Chana and Flipcode]

ENVIRONMENT MAPPING (affects the incoming light)

An approximation of the light coming in from the scene. Global effects such as reflection and refraction require information about light arriving from the rest of the scene, but it is usually too expensive to illuminate the whole scene, so we approximate "the rest of the scene" as images. [Figure: textured environment]

TEXTURE MAPPING

For 2D textures we need a 3D-to-2D mapping from object space to texture space, sometimes called a "projector function".

UV MAPPING

Each vertex in the model needs a (u, v) coordinate, normally defined by the artist and added to the vertex stream; at render time the per-vertex (u, v) values are interpolated across each triangle (see the sketch below).
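As a sketch of those two steps, the following Python snippet interpolates per-vertex (u, v) coordinates across a screen-space triangle using barycentric weights and then looks the result up in a diffuse image. The 2x2 checker texture, the triangle coordinates, and the nearest-texel wrap-around lookup are all assumptions made purely for illustration.

```python
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p with respect to triangle (a, b, c)."""
    (ax, ay), (bx, by), (cx, cy) = a, b, c
    denom = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    w_a = ((by - cy) * (p[0] - cx) + (cx - bx) * (p[1] - cy)) / denom
    w_b = ((cy - ay) * (p[0] - cx) + (ax - cx) * (p[1] - cy)) / denom
    return w_a, w_b, 1.0 - w_a - w_b

def sample_diffuse(texture, uv):
    """Nearest-texel lookup in an H x W x 3 image; (u, v) tile outside [0, 1]."""
    h, w, _ = texture.shape
    u, v = uv % 1.0
    return texture[min(int(v * h), h - 1), min(int(u * w), w - 1)]

# Per-vertex (u, v) coordinates, as an artist would assign them
uv_a, uv_b, uv_c = np.array([0.0, 0.0]), np.array([1.0, 0.0]), np.array([0.0, 1.0])

# A tiny 2x2 black/white checker standing in for a real diffuse image
texture = np.array([[[1, 1, 1], [0, 0, 0]],
                    [[0, 0, 0], [1, 1, 1]]], dtype=float)

# A screen-space triangle and one pixel inside it
tri = [np.array([10.0, 10.0]), np.array([110.0, 10.0]), np.array([10.0, 110.0])]
w_a, w_b, w_c = barycentric(np.array([40.0, 30.0]), *tri)
uv = w_a * uv_a + w_b * uv_b + w_c * uv_c   # interpolated texture coordinate
print(uv, sample_diffuse(texture, uv))
```

A production rasterizer does the same interpolation in hardware and adds perspective correction and filtering, but the mapping idea is the one above.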
LIGHTS

• Positional/point light: the light source is near, so the angle of incoming light changes across the surface; this in turn affects the colour at different points (and distorts shadows).
• Directional light: the light source is far away; because it is very distant, the angle of incoming light remains (almost) constant.

LIGHTING IN MAX

On the Command Panel, under the "Create" tab, select Lights and start with the "Standard" lights for now:
• Spot and Omni are positional lights.
• "Direct" indicates a directional light.
• Target and Free simply refer to how the light can be moved (transformed) in Max: a Target light remains aimed at a certain point when you translate it, while a Free light remains facing a certain direction when you translate it.

LIGHT AND SHADOW

When you create a light source, you can choose to let it cast shadows (by default this is off for some lights). The choice of shadow renderer affects the quality of the shadow: some are more realistic, but conversely cause the rendering to take longer.

MATERIALS

Press M or click on the material icon to bring up the Material Editor window. It stores all the materials currently used in your scene; you can drag any existing material onto objects in your scene, and you can also load in materials from other .MAT libraries.

BASIC SHADERS

Under Shader Basic Parameters, choose a type:
• Phong – we know this one already
• Blinn – a more accurate version of Phong
• Strauss – for metallic objects
• Oren-Nayar-Blinn – for matte surfaces
Selecting a shader changes the nature of the Basic Parameters and Extended Parameters rollouts.

THE MAPS ROLLOUT

Maps allow you to apply a filter or pattern to some of the object's properties; the map decides the colours (or the distortions, in the case of some mappings).

TEXTURES

To texture map an object we most frequently change its diffuse mapping; recall that the diffuse component is what mostly affects an object's colour. In effect, we give Max a function which decides the colour of individual points on the object. The map can be derived from an image or generated procedurally (from some mathematically defined function).

MATERIAL/MAP BROWSER

The Material/Map Browser gives you a list of existing functions you can map to a material; additional ones can be generated or imported from outside sources. Most of these are procedural and can apply to both texture and bump/normal maps; to add your own image, use Bitmap. Once you select a map, the Material Editor window changes: here we selected Marble, so the editor shows options that affect the marble procedure. To go back to the original object (and perhaps add other effects), click "Go To Parent".

HIGHLY REFLECTIVE OBJECTS

If you don't see the Maps rollout, select a Blinn or Phong shader. Tick Reflection, click on the map button to bring up the Material/Map Browser, and select Raytrace.

REFRACTION

For refraction, choose a Refraction map and in the Maps window choose Raytrace. The index of refraction (IOR) value can be changed under Extended Parameters (you may have to click "Go To Parent" to get there).

BUMP MAPS

Tick Bump under the Maps rollout, then click on the map to open the Material/Map Browser. Select a preset pattern, or load one from an image file by selecting Bitmap.

DISPLACEMENT MAPPING

Displacement mapping actually changes the geometry before rendering: displaced "bumps" can occlude and cast shadows on other objects. [Figure: exaggerated displacement mapping with the same texture as the previous slide]

ENVIRONMENT MAP

An environment map (similar to the sky_sphere in POV-Ray) adds a background texture which can add an extra touch of realism to your reflective/refractive (ray-traced) objects. Under the Coordinates rollout, click Environ and choose a mapping, e.g. cylindrical or spherical mapping.
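To show roughly what such a lookup involves, here is a small Python sketch that mirrors a view ray about the surface normal and converts the reflected direction into (u, v) coordinates for a latitude/longitude ("spherical mapping") environment image. The y-up axis convention and the lat-long layout are assumptions for illustration only; Max and POV-Ray handle this internally with their own conventions.

```python
import math

def reflect(d, n):
    """Mirror direction d about the unit normal n: r = d - 2 (d . n) n."""
    dot = sum(di * ni for di, ni in zip(d, n))
    return tuple(di - 2.0 * dot * ni for di, ni in zip(d, n))

def spherical_env_uv(d):
    """Map a unit direction (x, y, z), y up, to lat-long (u, v) in [0, 1]."""
    x, y, z = d
    longitude = math.atan2(z, x)                   # -pi .. pi around the vertical axis
    latitude = math.asin(max(-1.0, min(1.0, y)))   # -pi/2 .. pi/2 up/down
    return ((longitude + math.pi) / (2.0 * math.pi),
            (latitude + math.pi / 2.0) / math.pi)

# A downward view ray hitting an upward-facing mirror reflects straight up,
# which reads from the very top row of the environment image (v = 1).
r = reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0))
print(r, spherical_env_uv(r))
```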