Diploma Thesis: Filtering Reflection Properties of Rough Surfaces

TECHNICAL UNIVERSITY DRESDEN
FACULTY OF COMPUTER SCIENCE
INSTITUTE OF SOFTWARE AND MULTIMEDIA TECHNOLOGY
CHAIR OF COMPUTER GRAPHICS AND VISUALIZATION
PROF. DR. STEFAN GUMHOLD

Diploma thesis for the acquisition of the academic degree Diplom-Informatiker

Filtering reflection properties of rough surfaces

Andreas Ecke
(born 11 November 1987 in Ilmenau)

Supervisor: Prof. Dr. rer. nat. Stefan Gumhold
            Dipl. Medien-Inf. Andreas Stahl

Dresden, September 27, 2012

Task

Filtering of color textures with mipmaps is important to reduce artifacts when rendering a surface at different resolutions. However, this approach does not work for rough surfaces with varying reflection properties. For filtering of reflection properties, solutions like BRDF hierarchies and BTF filters are available, but they have several drawbacks. Commonly used techniques for geometry filtering include geometry hierarchies and the rendering of displacement maps by ray-casting.

The objective of this work is to design a filtering approach for the reflection properties of rough surfaces that can be used to realistically render a surface at different resolutions. This approach should combine BTFs with ray-casting based displacement mapping to allow the rendering of the surface in real-time.

Subgoals:

• Literature research on BTFs, filtering and displacement mapping
• Procedure for synthesis of a BTF from a surface description
• Design of a BTF filter with image correlation
• Procedure for real-time rendering of a surface with displacement mapping
• Investigation of properties of the implementation, such as handling the empty border of the BTF images, selection of the mipmap level, and application to curved surfaces
• Evaluation of the approach concerning correctness and time requirements
• Optional: BTF compression, illumination by sky maps

Statement of authorship

I hereby certify that the diploma thesis I submitted today to the examination board of the faculty of computer science with the title "Filtering reflection properties of rough surfaces" has been composed solely by myself and that I did not use any sources and aids other than those stated, with quotations duly marked as such.

Dresden, September 27, 2012
Andreas Ecke

Abstract

Filtering of color textures with mipmaps is important to reduce artifacts when rendering a surface at different resolutions. However, this approach does not work for rough surfaces with varying reflection properties. For filtering of reflection properties, solutions like BRDF hierarchies and BTF filters are available, but they have several drawbacks. Commonly used techniques for geometry filtering include geometry hierarchies and the rendering of displacement maps by ray-casting. In this thesis, a filtering approach is designed for the reflection properties of rough surfaces that can be used to realistically render a surface at different resolutions. This approach combines BTFs with ray-casting based displacement mapping to allow high-quality rendering of the surface at any resolution in real-time.

Contents

Nomenclature
1 Introduction
    1.1 Structure of this thesis
2 Preliminaries
    2.1 Rough Surfaces
        2.1.1 Per-vertex displacement mapping
        2.1.2 Per-pixel displacement mapping
        2.1.3 Curved surfaces
    2.2 Varying reflectance
        2.2.1 BRDFs
        2.2.2 Models
    2.3 Filtering
        2.3.1 Mipmapping
        2.3.2 Transition between bump rendering algorithms
        2.3.3 BRDF mixture models
    2.4 Bidirectional texture functions
        2.4.1 LOD for BTFs
        2.4.2 Compression and rendering
3 BTF generation
    3.1 Surface rendering
        3.1.1 Lighting models
        3.1.2 Per-pixel displacement mapping
    3.2 Image rectification
        3.2.1 Transformation
        3.2.2 Projection matrix
    3.3 Filtering & Compression
        3.3.1 Laplacian pyramid
        3.3.2 Principal component analysis
        3.3.3 Texture packing
    3.4 Handling of empty border pixels
4 Rendering
    4.1 BTF rendering
        4.1.1 Projection & Directions
        4.1.2 Decompression
        4.1.3 Level of Detail
        4.1.4 Interpolation
    4.2 Transition between the BTF and relief renderer
    4.3 Curved surfaces
        4.3.1 Using BTFs with curved surfaces
5 Evaluation
    5.1 Performance
        5.1.1 Preprocessing
        5.1.2 Rendering
    5.2 Level of detail
    5.3 BTF compression
        5.3.1 Border pixels
        5.3.2 Mean encoding
    5.4 Interpolation
    5.5 Curved surfaces
6 Conclusion
Bibliography
List of Figures
List of Tables

Nomenclature

(n, t, b) or TBN            tangent space consisting of normal vector n, tangent t and bi-tangent b
(s, t) and (u, v)           texture coordinates
m_k                         mean vector resulting from performing PCA on the kth Gaussian pyramid levels
ω = (θ, φ)                  direction consisting of inclination θ and polar angle φ
ω_i = (θ_i, φ_i)            direction of incident irradiance
ω_r = (θ_r, φ_r)            direction of emitted radiance
ρ                           albedo of the surface
σ                           surface roughness for the Oren-Nayar reflection model
depth(u, v)                 value stored in the depth map for texture coordinates (u, v)
E_0                         light intensity
F_0                         reflectance of a surface at normal incidence
f_r(ω_r, ω_i)               general BRDF defined for incident direction ω_i and exitant direction ω_r
f_r(θ_r, θ_i, φ_r − φ_i)    isotropic BRDF
G_k^(ω_i, ω_r)              kth level of the Gaussian pyramid for light direction ω_i and view direction ω_r
h                           half vector between l and v
k_a, k_d and k_s            ambient, diffuse and specular color of a surface
l                           light direction
L_k^(ω_i, ω_r)              kth level of the Laplacian pyramid for light direction ω_i and view direction ω_r
L_r                         resulting color from evaluating the reflection model
lod                         level of detail parameter
m                           surface roughness for the Cook-Torrance reflection model
n                           normal vector
p = (x, y)                  a point on the surface
s                           shininess exponent for the Blinn-Phong reflection model
v                           view direction

1 Introduction

Rendering surfaces is one of the most fundamental tasks in computer graphics, as objects are generally represented by their surfaces only. Surfaces of natural objects can be highly complex, featuring roughness, a certain structure and variable reflection properties. Rendering such surfaces is an equally complex task, and much research has been done to find methods to render them as realistically and as fast as possible.

An example of such a complex surface is a planet in a space simulation. The planet surface has different elevations, like high mountain ranges and flat oceans, and highly varying reflection properties such as specular water, diffuse forests and glistening, snowy mountain tops. This high complexity also holds for surfaces at other scales, such as house fronts with their reflective windows, glossy car paint with dirt splashes, or printed circuit boards. By combining several reflection models and using techniques such as displacement mapping to add geometric detail, these surfaces can be rendered fairly well.
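To make the displacement mapping just mentioned a little more concrete, the following is a minimal, CPU-side sketch of ray-casting through a depth map in tangent space: the view ray is marched downwards in small steps until it dips below the height field. It is an illustration only, not the renderer developed in this thesis; the names sample_depth and raycast_depth_map, the fixed linear-search step count and the nearest-neighbour depth lookup are assumptions made for brevity.

    # Minimal sketch (not the thesis implementation): per-pixel displacement
    # mapping by marching a view ray through a depth map in tangent space.
    # The depth map stores values in [0, 1], with 0 on the surface plane and
    # 1 at the deepest point, matching the depth(u, v) entry in the nomenclature.
    import numpy as np

    def sample_depth(depth_map, u, v):
        """Nearest-neighbour lookup of the depth map at texture coordinates (u, v)."""
        height, width = depth_map.shape
        x = min(int(u * width), width - 1)
        y = min(int(v * height), height - 1)
        return depth_map[y, x]

    def raycast_depth_map(depth_map, uv_entry, view_dir, num_steps=64):
        """Linear search along the view ray until it dips below the height field.

        view_dir is a normalized tangent-space view direction pointing from the
        eye towards the surface, so its z component must be negative. Returns
        the texture coordinates and depth of the intersection.
        """
        # Texture-space offset accumulated while the ray descends one unit of depth.
        offset = -view_dir[:2] / view_dir[2]
        uv = uv_entry
        for i in range(1, num_steps + 1):
            d = i / num_steps                      # current ray depth in [0, 1]
            uv = (uv_entry + offset * d) % 1.0     # texture coordinates at that depth (tiled)
            if sample_depth(depth_map, uv[0], uv[1]) <= d:
                return uv, d                       # the ray has entered the surface
        return uv, 1.0                             # no hit found: clamp to maximum depth

    # Example: a wavy height field viewed at a grazing angle.
    u_coords = np.linspace(0.0, 8.0 * np.pi, 128)
    depth_map = np.tile(0.5 + 0.5 * np.sin(u_coords), (128, 1))
    view = np.array([0.6, 0.0, -0.8])
    hit_uv, hit_depth = raycast_depth_map(depth_map, np.array([0.1, 0.5]), view)
    print("hit at uv =", hit_uv, "depth =", hit_depth)

A GPU implementation would perform this search in the fragment shader and would typically refine the intersection further, for example with a binary search around the last step, as is common for relief mapping.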
However, another important aspect of realistic rendering is filtering of the surface at low resolutions, i.e. when the surface is far away, and this is where these basic methods fail: they produce heavy aliasing artifacts or change the appearance of the filtered surface incorrectly. This is because standard filtering of the base textures is incorrect for general surfaces. This can be seen in our planet example: if the surface color of the planet is described by a simple color texture, filtering this texture would yield a mixture of the blue water, green forests and white mountain tops. In reality, however, the blue water and green forests may not be visible when looking from shallow angles, as they are partially or fully occluded by the mountains, and the filtered texture should then not contain these colors for that particular view direction. Hence, filtering the color texture using the usual methods produces wrong results.

Therefore, the main concern of this thesis is the question of how to filter complex surfaces such that highly realistic and correct real-time rendering is possible at arbitrary resolutions. Given the importance of surface rendering as one of the most fundamental tasks and its huge impact on perceived image quality, a general method that solves this problem will be highly useful in various areas of computer graphics.

The proposed solution consists of constructing a bidirectional texture function (BTF) from the surface description; the BTF consists of images of the surface for many different view and light directions. Each of these images can then be filtered separately using standard methods. While the constructed BTF solves the filtering problem this way, it introduces further difficulties: BTFs are generally too large for direct rendering on the GPU, and because of their small spatial resolution they are inappropriate when rendering the surface at high resolutions, i.e. when the surface is very close.

1.1 Structure of this thesis

In Chapter 2, we introduce the basic concepts and methods needed for the rendering of realistic surfaces: reflection models, methods to add geometric detail such as displacement mapping and, of course, standard approaches to filtering. We also discuss related work, point out its issues, and show how resolving them ultimately leads to a BTF-based filtering approach. The chapter concludes with an introduction to BTFs and how filtering them can be achieved.

The next two chapters explain our approach to surface rendering. In particular, Chapter 3 deals with the preprocessing steps needed, such as the construction of the BTF from the surface description and its filtering and compression, while Chapter 4 explains how the data from the preprocessing stage is finally used in the real-time renderer. Following this, in Chapter 5 we analyze a prototype implementation and evaluate it in terms of quality, run time and memory requirements.
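Before moving on, the sketch below is a concrete illustration of the BTF-based filtering idea proposed above: the BTF is stored as one image per discrete (view, light) direction pair, and an ordinary mipmap pyramid is built for each of these images independently, so that filtering never mixes texels observed under different directions. It is a simplified, CPU-side illustration under assumed data structures; the dictionary keyed by direction indices and the names build_mipmaps and FilteredBTF are illustrative choices, not the compressed representation and renderer developed in Chapters 3 and 4.

    # Minimal sketch (simplified, CPU-side): a BTF stored as one image per
    # (view, light) direction pair, with a separate mipmap pyramid per image.
    # Filtering therefore never mixes texels observed under different direction
    # pairs, which is the property that makes BTF-based filtering correct.
    import numpy as np

    def build_mipmaps(image):
        """Build a mipmap pyramid by repeated 2x2 box filtering (square, power-of-two images)."""
        levels = [image.astype(np.float32)]
        while levels[-1].shape[0] > 1:
            prev = levels[-1]
            h, w = prev.shape[0] // 2, prev.shape[1] // 2
            # Average each 2x2 block of texels into one texel of the next coarser level.
            levels.append(prev.reshape(h, 2, w, 2, -1).mean(axis=(1, 3)))
        return levels

    class FilteredBTF:
        """BTF as a mapping {(view index, light index): mipmap pyramid}."""

        def __init__(self, btf_images):
            self.pyramids = {key: build_mipmaps(img) for key, img in btf_images.items()}

        def sample(self, view_index, light_index, u, v, footprint_texels):
            """Filtered lookup for one direction pair.

            footprint_texels is the size of the screen-pixel footprint measured in
            texels of the finest level; it selects the mipmap level exactly as it
            would for an ordinary color texture.
            """
            pyramid = self.pyramids[(view_index, light_index)]
            lod = int(np.clip(np.log2(max(footprint_texels, 1.0)), 0, len(pyramid) - 1))
            level = pyramid[lod]
            x = min(int(u * level.shape[1]), level.shape[1] - 1)
            y = min(int(v * level.shape[0]), level.shape[0] - 1)
            return level[y, x]

    # Example: two direction pairs, each with a random 64x64 RGB image.
    rng = np.random.default_rng(0)
    images = {(0, 0): rng.random((64, 64, 3)),
              (1, 0): rng.random((64, 64, 3))}
    btf = FilteredBTF(images)
    print(btf.sample(0, 0, 0.25, 0.75, footprint_texels=8.0))

The only point of the example is that level-of-detail selection works per direction pair, exactly as it would for a plain color texture; the memory and resolution problems noted above are what the remaining chapters address.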