Introduction to Computer Graphics
CMPT 361 – Lecture 18
Tom Shermer, Richard (Hao) Zhang

Pixel pipeline

 OpenGL has a pixel pipeline along with the geometry pipeline (from processor memory to the frame buffer)

 There is a whole set of OpenGL functions for pixel manipulation; see the OpenGL reference cards

 Many functions for texture mapping and manipulation

Texture mapping

 How to map an image (texture) onto a surface

 Image-related (2D), but also incorporating 3D data in between: texture mapping goes from image to surface (and then to fragments)

Texture mapping in OpenGL

 In older versions, 1D and 2D texture mapping

 Now 3D/solid textures are supported in OpenGL

 Basic steps of texture mapping (see Chapter 7 in text):

1. Generate a texture image and place it in texture memory on the GPU, e.g., glGenTextures(), glBindTexture(), etc.

2. Assign texture coordinates (references into the texture image) to each fragment (pixel); textures are assigned by texture coordinates associated with points on a surface

3. Apply the texture to each fragment

Why texture mapping?

 Real-world objects often have complex surface details, often of a stochastic nature, e.g., the surface of an orange or the patterns on a wood table

 Using many small polygons and smooth-shading them to capture the details is too expensive

 A "cheat": use an image (it is easy to take a photo and digitize it) to capture the details and patterns, and paint it onto simple surfaces

Effect of texture mapping

Effect of texture mapping

Textures

 Textures are patterns

 e.g., stripes, a checkerboard, or patterns representing and characterizing natural materials (wood, a grass field)

 They can be in 1D, 2D, or 3D form

 2D textures, given as images, are the most common

 The key problem is to map a 2D texture onto an arbitrary (polygonal or curved) surface

 2D texturing techniques can be generalized

Texture synthesis

 A single captured image is often not rich enough for applications

 Textures are often synthesized from a small texture exemplar, e.g., example-based synthesis

 Most natural textures appear random, e.g., use of noise models such as Perlin noise

Some terminology

 Texture map: a 2D image to paint on a surface

 Texels (texture elements): the elements of the texture map – to distinguish them from pixels

 Textures are defined in texture (s, t) coordinates – think of the texture as a continuous image

 Mathematically, texture mapping assigns a unique texture point (s, t) to each point on a surface (one-to-one)

 One pixel may be covered by many texels, or vice versa

Rendering with texture mapping

[figure: texel space – object space – pixel space]

Rendering approach

1. Map the four corners of a screen pixel onto a surface in the scene (e.g., by ray casting)

2. Use texture mapping to map the four surface points to points in the texture map

Rendering approach

3. Use a weighted sum of the texels covered by the quadrilateral in the texture map to color the pixel – this is called area sampling

 Such an approach (area sampling) is much better than the simpler point-sampling approach below

A more simplistic approach

1. Cast a single ray through the center of a pixel P

2. Find the point T in the texture map corresponding to the first ray–surface intersection

3. Use the color of T in the texture map for the pixel P

 This is called point sampling: it can be a severe case of under-sampling (aliasing)

Point vs. area sampling

[figure: aliasing resulting from point sampling of pixels]

Aliasing in texture mapping

[figure: result of area sampling of pixels – we get a shade of gray without seeing the true patterns; this is aliasing, but there is not much we can do due to limited screen resolution]

Two main issues

 Finding the right mapping between the 2D texture and the surface in 3D

 One-to-one

 Minimize distortion, e.g., angle-, area-, or distance-preserving maps

 Honoring certain constraints [Zigelman et al. 02]

 Finding the right color for each pixel, i.e., the right way to sample and combine texels to shade a pixel

 The key is to reduce aliasing effects

Texture mapping on surfaces

[figure: Greiner and Hormann (IMA Workshop on Geometric Design, 2001)]

 Surfaces modeled using triangle meshes

 Assuming the texture (image) is sufficiently large  it has been synthesized/replicated

Interpolation of texture coordinates

 Given texture coordinates at the three vertices of a triangle, how do we find the texture coordinates of an interior point? Note that the triangle is in 3D.

[figure: triangle with (u1, v1), (u2, v2), (u3, v3) at the vertices and (?, ?) at the midpoint; recreated from a Java applet at http://graphics.lcs.mit.edu/classes/6.837]

Texture interpolation in 2D

 Why? To use the scanline and incremental algorithms

 OK for orthographic projections

 But wrong in perspective!

 The textures appear to be warped

 There is no foreshortening

 Bilinear interpolation in 3D would be right

 But is it right to interpolate in screen (2D) space?
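The question on the slide can be made concrete with barycentric weights. A minimal sketch (not from the slides; the triangle, point, and texture coordinates are made-up example values) of interpolating per-vertex (u, v) at an interior point:

```python
def barycentric(p, a, b, c):
    """Barycentric weights of 2D point p with respect to triangle (a, b, c)."""
    (px, py), (ax, ay), (bx, by), (cx, cy) = p, a, b, c
    d = (by - cy) * (ax - cx) + (cx - bx) * (ay - cy)
    wa = ((by - cy) * (px - cx) + (cx - bx) * (py - cy)) / d
    wb = ((cy - ay) * (px - cx) + (ax - cx) * (py - cy)) / d
    return wa, wb, 1.0 - wa - wb

def interp_uv(p, verts, uvs):
    """Interpolate texture coordinates (u, v) at point p inside the triangle."""
    wa, wb, wc = barycentric(p, *verts)
    u = wa * uvs[0][0] + wb * uvs[1][0] + wc * uvs[2][0]
    v = wa * uvs[0][1] + wb * uvs[1][1] + wc * uvs[2][1]
    return u, v

# Example triangle with texture coordinates at its vertices
verts = [(0.0, 0.0), (4.0, 0.0), (0.0, 4.0)]
uvs = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
print(interp_uv((1.0, 1.0), verts, uvs))  # → (0.25, 0.25)
```

The same weights can interpolate any per-vertex attribute, which is why the slide's question about screen space versus 3D matters: the weights differ depending on where they are computed.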

Color interpolation

 But remember Gouraud shading?

 Did we not do interpolation in screen space?

Texture vs. color interpolation

 Yes, Gouraud shading was done in screen space

 This is not right, but we barely notice it since color variation is smooth – not so for textures!

 We can do texture color interpolation in 2D, like Gouraud

Texture interpolation in perspective

 How do we do interpolation correctly in perspective projection?

 First recall how projections are computed

 Assume the projection plane is at d = 1, and focus on x (the case for y is the same); then a world-space point (x, y, z) projects to x/z

Perspective-correct interpolation

Let P' and Q' be the projections of points P = (x_p, y_p, z_p) and Q = (x_q, y_q, z_q). Again, concentrate on x.

Interpolate between P' and Q' in screen space (x_p, x_q, z_p, z_q are in world space):

    R'(t)_x = P'_x + t(Q'_x - P'_x) = x_p/z_p + t(x_q/z_q - x_p/z_p)

Problem statement: given t and texture coordinates at P' and Q', find the texture coordinate corresponding to the point R'(t)_x.

Perspective-correct interpolation

First, find the world-space P and Q corresponding to (screen-space) P' and Q'.

Next, a lerp between P and Q gives us

    R(s) = P + s(Q - P) = (x_p + s(x_q - x_p), z_p + s(z_q - z_p))    (y ignored)

Projecting R(s) onto the image plane z = 1, the x-component of the projection is

    proj(R(s))_x = (x_p + s(x_q - x_p)) / (z_p + s(z_q - z_p))

Perspective-correct interpolation

Recalling that

    R'(t)_x = x_p/z_p + t(x_q/z_q - x_p/z_p)

we set the previous quantity equal to this one, i.e., proj(R(s))_x = R'(t)_x:

    (x_p + s(x_q - x_p)) / (z_p + s(z_q - z_p)) = x_p/z_p + t(x_q/z_q - x_p/z_p)

Now we solve for s, grinding through to

    s = t z_p / (z_q + t(z_p - z_q))

Note that s depends only on t and the z's.

Perspective-correct interpolation

We can use this s to interpolate any attributes at the vertices of our 3D triangle. To do that for texture coordinates u_p at P and u_q at Q:

    u(s) = u_p + s(u_q - u_p) = (u_p/z_p + t(u_q/z_q - u_p/z_p)) / (1/z_p + t(1/z_q - 1/z_p))

So we can interpolate in screen space (with respect to t), but instead of interpolating u and v, we interpolate u/z, v/z, and 1/z. At each pixel, we then divide the interpolated values to get u and v.

Perspective-correct interpolation

A comparison: the numerator above is a lerp of u/z with parameter t, and the denominator is a lerp of 1/z with parameter t.

Next question: what texel value (color) do we get, given a texture coordinate or texture region?
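A small numeric sketch of this derivation (the endpoint values are made up) comparing a naive screen-space lerp of u against the correct recipe of lerping u/z and 1/z and then dividing; the correct result matches interpolating in world space with the parameter s from the slide:

```python
def naive_u(t, up, uq):
    # plain screen-space lerp of u: wrong in perspective
    return up + t * (uq - up)

def correct_u(t, up, uq, zp, zq):
    # lerp u/z and 1/z in screen space, then divide
    u_over_z = up / zp + t * (uq / zq - up / zp)
    one_over_z = 1 / zp + t * (1 / zq - 1 / zp)
    return u_over_z / one_over_z

def world_u(t, up, uq, zp, zq):
    # reference: map screen parameter t to world parameter s, then lerp u
    s = t * zp / (zq + t * (zp - zq))
    return up + s * (uq - up)

up, uq, zp, zq = 0.0, 1.0, 1.0, 4.0  # hypothetical endpoint attributes and depths
t = 0.5
print(correct_u(t, up, uq, zp, zq))  # → 0.2, agrees with world_u
print(naive_u(t, up, uq))            # → 0.5, overshoots toward the far endpoint
```

With the near endpoint at z = 1 and the far one at z = 4, the screen-space midpoint corresponds to a world point much closer to the near endpoint, which is exactly the foreshortening the naive lerp misses.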

Sampling of texture maps

 The resolution of screen space (size of a pixel) rarely matches the resolution of the texture map (size of a texel)

 Under-sampling of the texture map, or minification: one pixel corresponds to a region of texels

 Over-sampling of the texture map, or magnification: one pixel corresponds to a portion of a texel

Magnification of texture map

 What to do with over-sampling? Interpolation (over the triangle, after determining the color at the vertices)

 How do we determine a texel (color) value given a texture coordinate (u, v)?

 Nearest point sampling: use the texel closest to (hit by) the point sample

 Linear filtering: use the average of a group (2  2) of texels close to the point sample
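The two lookups above can be sketched on a tiny grayscale texture stored as a 2D list (the texture values are made up); (u, v) in [0, 1] are mapped to continuous texel coordinates:

```python
def nearest(tex, u, v):
    """Nearest point sampling: use the texel the sample point lands in."""
    h, w = len(tex), len(tex[0])
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return tex[y][x]

def bilinear(tex, u, v):
    """Linear filtering: weighted average of the 2 x 2 texels around the sample."""
    h, w = len(tex), len(tex[0])
    x = min(max(u * w - 0.5, 0.0), w - 1)  # shift so texel centers sit at integers
    y = min(max(v * h - 0.5, 0.0), h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0][x0] * (1 - fx) + tex[y0][x1] * fx
    bot = tex[y1][x0] * (1 - fx) + tex[y1][x1] * fx
    return top * (1 - fy) + bot * fy

tex = [[0.0, 1.0],
       [1.0, 0.0]]  # 2 x 2 checkerboard
print(nearest(tex, 0.3, 0.3))   # → 0.0 (top-left texel)
print(bilinear(tex, 0.5, 0.5))  # → 0.5 (average of all four texels)
```

Under magnification, nearest sampling produces blocky texels while the bilinear average varies smoothly across the pixel, which is the difference the next slide illustrates.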

Point sampling vs. linear filtering

Linear filtering reduces the jaggies over areas where there is texture over-sampling (magnification).

Point sampling vs. linear filtering

In areas where there is under-sampling, moiré patterns (false high frequencies) still appear. This is because a 2  2 filter is too small. By enlarging the filter mask, we can get a smoother shade of gray, but we can't eliminate moiré altogether.

Minification: fast averaging of texels

 When one pixel covers a region of texels, we need to sum up the contributions of the covered texels

 We need to do this quickly – O(1) time per pixel

 Computing texel averages during rasterization can slow things down significantly

 So we do some preprocessing – called prefiltering – of the texture map to enable faster computations:

1. Use of mipmaps

2. Use of a summed area table (SAT)

Mipmaps

 MIP (multum in parvo in Latin) = many things in a small place

 Mipmaps store a texture map in a multiresolution manner – called an image pyramid

 How much extra storage is required?

Mipmaps

 Each texel at level i+1 is the average of a 2  2 area of the texture map at level i

 Level i+1 is a blurred version of level i at reduced size

How to use mipmaps

The rough idea: to rasterize a pixel P in screen space,

1. Determine how much (what size) of the original (level-0) texture map the pixel P covers

2. Find the level i at which a single texel covers about the same area at level 0 as P does

3. Do linear filtering using the mipmap at level i to obtain a color for pixel P
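The pyramid construction and level selection can be sketched as follows (a toy square, power-of-two grayscale texture; the pixel's footprint is given as a side length measured in level-0 texels):

```python
import math

def build_mipmaps(tex):
    """Level i+1 averages each 2 x 2 block of level i, down to 1 x 1."""
    levels = [tex]
    while len(levels[-1]) > 1:
        prev = levels[-1]
        n = len(prev) // 2
        levels.append([[(prev[2*y][2*x] + prev[2*y][2*x+1] +
                         prev[2*y+1][2*x] + prev[2*y+1][2*x+1]) / 4.0
                        for x in range(n)] for y in range(n)])
    return levels

def pick_level(footprint, num_levels):
    """Level at which one texel covers about the same level-0 area as the pixel."""
    level = int(round(math.log2(max(footprint, 1.0))))
    return min(max(level, 0), num_levels - 1)

tex = [[0.0, 0.0, 1.0, 1.0],
       [0.0, 0.0, 1.0, 1.0],
       [1.0, 1.0, 1.0, 1.0],
       [1.0, 1.0, 1.0, 1.0]]
mips = build_mipmaps(tex)
print(len(mips))           # → 3 levels: 4x4, 2x2, 1x1
print(mips[1])             # → [[0.0, 1.0], [1.0, 1.0]]
print(pick_level(2.0, 3))  # pixel covers ~2x2 level-0 texels → level 1
```

The geometric series 1 + 1/4 + 1/16 + ... also answers the storage question on the earlier slide: the whole pyramid costs at most one third more than the base texture.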

Better use of mipmaps

 If the size of the coverage by pixel P lies between levels i and i+1 of the mipmaps:

1. Compute the color of P at both mipmap levels: c_i at level i and c_{i+1} at level i+1

2. Linearly interpolate between c_i and c_{i+1} based on the size of the coverage and the "size" of texels at mipmap levels i and i+1

Result of using mipmaps

[figure: mipmapping with point sampling vs. mipmapping with linear filtering]
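The blend in step 2 can be sketched as follows (a hypothetical helper, not from the slides: c_i and c_{i+1} stand in for the colors already filtered at levels i and i+1, and the fractional part of log2 of the footprint serves as the blend weight, since texels at level i span 2^i level-0 texels):

```python
import math

def blend_levels(ci, ci1, footprint):
    """Lerp the colors from levels i and i+1 by where the pixel's
    footprint falls between the two texel sizes (2^i and 2^(i+1))."""
    f = math.log2(footprint) - math.floor(math.log2(footprint))
    return ci * (1.0 - f) + ci1 * f

# a footprint of 3 level-0 texels falls between level 1 (2 texels)
# and level 2 (4 texels), so the result mixes both levels
print(blend_levels(0.2, 0.8, 3.0))
```

Combined with linear filtering inside each level, this is the scheme commonly known as trilinear filtering.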

Result of using mipmaps

[figure: no prefiltering at all, just point sampling, vs. mipmapping with linear filtering – quite blurry]

Summed area table (SAT)

 Another prefiltering technique to speed up the computation of texel averages

 An entry in a SAT is the sum of all entries above and to the left of that entry (inclusive) in the base table

[figure: base table (texture) and its SAT]

Use of SAT

 We want to compute the average of the texels in an axis-aligned rectangle bounded by (x0, y0) and (x1, y1). This rectangle should approximate the projection of the pixel into the base texture.

 Let T be the SAT. Then the sum of the texels in the rectangle is

    S = T(x1, y1) - T(x0, y1) - T(x1, y0) + T(x0, y0)

 The average of the texels is S / A, where

    A = (y1 - y0) * (x1 - x0)

is the area of the rectangle.

Results of using SAT

[figure: result of using mipmaps with linear filtering (larger filter mask), vs. result of using SATs – sharper and less blurry, but more aliasing (false high frequencies)]
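The SAT construction and the four-lookup average can be sketched on a toy texture (values made up). Note that with the slide's formula, the lower corner (x0, y0) is exclusive and the upper corner (x1, y1) is inclusive:

```python
def build_sat(tex):
    """T[y][x] = sum of all entries at or above-left of (x, y) in tex."""
    h, w = len(tex), len(tex[0])
    T = [[0.0] * w for _ in range(h)]
    for y in range(h):
        row = 0.0
        for x in range(w):
            row += tex[y][x]
            T[y][x] = row + (T[y - 1][x] if y > 0 else 0.0)
    return T

def box_average(T, x0, y0, x1, y1):
    """Average of texels with x0 < x <= x1 and y0 < y <= y1,
    using the four-lookup formula from the slide (O(1) per query)."""
    def t(x, y):
        return T[y][x] if x >= 0 and y >= 0 else 0.0
    s = t(x1, y1) - t(x0, y1) - t(x1, y0) + t(x0, y0)
    return s / ((y1 - y0) * (x1 - x0))

tex = [[1.0, 2.0, 3.0],
       [4.0, 5.0, 6.0],
       [7.0, 8.0, 9.0]]
T = build_sat(tex)
print(box_average(T, 0, 0, 2, 2))  # average of the lower-right 2x2 block → 7.0
```

Every rectangle query costs the same four lookups regardless of how many texels it covers, which is exactly the O(1)-per-pixel behavior minification needs.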

Summary of texture mapping

 Mapping between texture and surface:

 Linear, two-step, optimization-based mappings, etc.

 From texture coordinate to texel value:

 Nearest point sampling or linear filtering

 If one texel covers many pixels – interpolation

 If one pixel covers many texels – fast averaging

 Use of mipmaps – blur things at multiple resolutions

 Use of summed area tables – can be generalized to handle arbitrary polygonal coverage (assignment #3)

Applications

 So far, we have only considered texture maps composed of colors (the diffuse reflection coefficients k_d in our lighting models)

 "Texture" maps can store other attributes, e.g.,

 The shininess parameter for specular reflection

 Normal perturbations, etc.

 One can generate all kinds of interesting effects, and the possibilities are endless

Bump mapping

 Goal: faking bumpy surfaces

 Idea: use textures (bump maps) to alter the surface normals while keeping the geometry unchanged

[figure: original shading, bump map texture, bump-mapped result]

Bump mapping

 A bump map (texture) is composed of a height function H – it stores scalars

 The partial derivatives of the height function are used to perturb the normals over the surface: impose the change in H onto the normals

    Original:  n = p_u  p_v
    Perturbed: n' = n + (H/u) p_u + (H/v) p_v
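The perturbation above can be sketched as follows (a made-up height map on a flat patch; forward finite differences stand in for the partial derivatives of H, and the geometry n, p_u, p_v stays fixed):

```python
def perturbed_normals(H, pu, pv, n):
    """n' = n + (dH/du) pu + (dH/dv) pv, with the derivatives taken by
    finite differences over the height map H (a 2D list of scalars)."""
    h, w = len(H), len(H[0])
    out = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            hu = H[y][min(x + 1, w - 1)] - H[y][x]  # dH/du (forward difference)
            hv = H[min(y + 1, h - 1)][x] - H[y][x]  # dH/dv
            out[y][x] = tuple(n[i] + hu * pu[i] + hv * pv[i] for i in range(3))
    return out

# flat patch: tangents along x and y, normal along z
pu, pv, n = (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)
H = [[0.0, 0.5],
     [0.0, 0.5]]  # a ramp rising in the u direction
print(perturbed_normals(H, pu, pv, n)[0][0])  # → (0.5, 0.0, 1.0)
```

Only the normals change; the surface points themselves are untouched, which is why bump-mapped silhouettes stay smooth.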

Bump mapping

 How expensive is bump mapping?

 Computing the partial derivatives is not a big deal – why not? We can do this in preprocessing and store the results in a lookup table

 Computing the perturbed normals is also OK

 The catch is that, to have the effect we desire, we must do lighting at each point – Phong shading (not Gouraud) – so this is expensive!

Example of bump mapping