CS464 Intro to Rendering Process

Elements of Scene (arrays of objects you create): a list of graphics primitives — polygons, lines, lights.
Rendering Algorithm (the pipeline you configure and control): for each primitive, transform to view, then draw.
Rendered Image (what appears on screen): the framebuffer, our image, drawn at 60 fps.
Appearance is controlled by the view/camera configuration and controls.

Axis Aligned Scene

View Coordinate Space = Object Coordinate Space. Right-handed coordinates: Z comes out of the screen.

Rendered Scene: Overlapping Triangles?

How do we make certain they render properly?

After 2-D projection, Z is lost.

Painter's Algorithm

Keep triangles sorted back to front by Z: render the furthest triangles first.
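As a sketch of this ordering step in plain JavaScript (the triangle record with an average view-space depth `z` is made up for illustration; more negative z means farther away in the right-handed system above):

```javascript
// Painter's algorithm sketch: sort triangles back-to-front by depth,
// then draw in that order so nearer triangles paint over farther ones.

function paintersOrder(triangles) {
  // Copy, then sort ascending by z: farthest (smallest z) first.
  return triangles.slice().sort(function (a, b) { return a.z - b.z; });
}

var scene = [
  { name: "near", z: -1 },
  { name: "far",  z: -10 },
  { name: "mid",  z: -5 }
];

var order = paintersOrder(scene).map(function (t) { return t.name; });
// order is ["far", "mid", "near"]: draw far first, near last.
```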

Pretty good.

Painter's Algorithm

Overlapping shapes cause failure.

Not fixable with the painter's algorithm.

Solution: Z-Buffer. Introduce an additional framebuffer that just holds depth information.
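A minimal sketch of the depth-tested pixel write this extra buffer enables, in plain JavaScript. The names (`setPixel`, `color`, `depth`) and the tiny 4x4 buffer are illustrative, not a real API:

```javascript
// Z-buffer sketch: keep a per-pixel depth buffer alongside the color
// buffer, and only write a pixel when the incoming fragment is closer
// to the viewpoint than what is already stored there.

var W = 4, H = 4;
var color = new Array(W * H).fill(0);          // packed RGB values
var depth = new Array(W * H).fill(-Infinity);  // +Z toward viewer: larger z = closer

function setPixel(x, y, z, rgb) {
  var i = y * W + x;
  if (z > depth[i]) {   // closer than what is there already?
    depth[i] = z;
    color[i] = rgb;
  }                     // otherwise leave the pixel untouched
}

setPixel(1, 1, -5, 0xff0000); // far red fragment
setPixel(1, 1, -2, 0x00ff00); // nearer green fragment wins
setPixel(1, 1, -9, 0x0000ff); // farther blue fragment is rejected
// color at (1, 1) is now 0x00ff00
```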

In the SetPixel function, compare the z-depth of the current primitive against the value in the Z-buffer. If it is closer to the viewpoint, change the pixel and update the Z-buffer; otherwise do not adjust the pixel. This used to be considered expensive; it is standard now.

Framebuffers
• The framebuffer is the interface between the display device and the computer's notion of an image
• A memory array in which the computer stores an image
  – On most computers, a separate memory bank from main memory
  – Many different variations, motivated by the cost of memory

Framebuffers: True-Color

• A true-color (aka 24-bit or 32-bit) framebuffer stores one byte each for red, green, and blue
• Each pixel can thus be one of 2^24 colors
• Pay attention to endian-ness
• How can 24-bit and 32-bit mean the same thing here?

Framebuffers: Indexed-Color

• An indexed-color (8-bit or PseudoColor) framebuffer stores one byte per pixel (also: the GIF image format)
• This byte indexes into a color map
• How many colors can a pixel be?
• Common on low-end displays (cell phones, PDAs, GameBoys)

Framebuffers: illustration of how an indexed palette works. A 2-bit indexed-color image: the color of each pixel is represented by a number; each number corresponds to a color in the palette.
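The lookup described above can be sketched in a few lines of JavaScript. The palette values here are made-up 24-bit RGB integers; with 2 bits per pixel there are only 4 palette entries, with 8 bits there would be 256:

```javascript
// Indexed-color sketch: each pixel stores a small index into a palette
// (color map); resolving an image means replacing each index with the
// palette entry it points at.

var palette = [0x000000, 0xff0000, 0x00ff00, 0x0000ff]; // 2-bit: 4 entries
var pixels  = [0, 1, 1, 3, 2];                          // per-pixel indices

function resolve(pixels, palette) {
  return pixels.map(function (i) { return palette[i]; });
}
// resolve(pixels, palette) yields the actual RGB value for each pixel
```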

Image credits: Wikipedia

Framebuffers: Hi-Color

• Hi-Color is (was?) a popular PC SVGA standard
• Packs pixels into 16 bits:
  – 5 red, 6 green, 5 blue (why would green get more?)
  – Sometimes just 5, 5, 5
• Each pixel can be one of 2^16 colors
• Hi-color images can exhibit worse quantization artifacts than a well-mapped 8-bit image

The Rendering Pipeline: A Tour

Transform → Illuminate → Transform → Clip → Project → Rasterize

Model & Camera Parameters → Rendering Pipeline → Framebuffer → Display

The tour walks this same diagram several times, highlighting in turn the display you know, the framebuffer you know, the rendering pipeline itself, 2-D rendering (rasterization), and finally the 3-D stages.

The Rendering Pipeline: 3-D

Scene graph and object geometry in; rendered result out:

Modeling Transforms: all vertices of the scene in a shared 3-D "world" coordinate system
Lighting Calculations: vertices shaded according to the lighting model
Viewing Transform: scene vertices in a 3-D "view" or "camera" coordinate system
Clipping: exactly those vertices and portions of polygons in the view frustum
Projection Transform: 2-D screen coordinates of the clipped vertices

Rendering: Transformations
• So far, discussion has been in screen space
• But the model is stored in model space (a.k.a. object space or world space)
• Three sets of geometric transformations:
  – Modeling transforms
  – Viewing transforms
  – Projection transforms
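The three transform sets above compose into matrix multiplies applied to each vertex. A minimal JavaScript sketch, with an illustrative translation standing in for the modeling transform and identity matrices for viewing and projection (row-major 4x4 storage; real view and projection matrices would of course be non-trivial):

```javascript
// One vertex carried from model space through world and view space to
// clip space by 4x4 matrix multiplies.

function mulMatVec(m, v) {          // 4x4 row-major matrix times [x,y,z,w]
  var out = [0, 0, 0, 0];
  for (var r = 0; r < 4; r++)
    for (var c = 0; c < 4; c++)
      out[r] += m[r * 4 + c] * v[c];
  return out;
}

var I     = [1,0,0,0, 0,1,0,0, 0,0,1,0, 0,0,0,1];
var model = [1,0,0,5, 0,1,0,0, 0,0,1,0, 0,0,0,1]; // translate x by 5
var view  = I, proj = I;                          // placeholders

var v     = [1, 2, 3, 1];                         // model-space vertex
var world = mulMatVec(model, v);                  // → [6, 2, 3, 1]
var clip  = mulMatVec(proj, mulMatVec(view, world));
```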

Rendering: Lighting
• Illuminating a scene: coloring pixels according to some approximation of lighting
  – Global illumination: solves for the lighting of the whole scene at once
  – Local illumination: a local approximation, typically lighting each polygon separately
• Interactive graphics (e.g., hardware) does only local illumination at run time
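A local-illumination sketch: the Lambert diffuse term for one vertex, the kind of per-vertex calculation the pipeline's lighting stage performs. Vectors are assumed normalized; the function and parameter names are illustrative:

```javascript
// Lambert diffuse: weight the directional light's color by the cosine
// of the angle between the surface normal and the light direction,
// clamped at zero so back-facing surfaces get only the ambient term.

function dot3(a, b) { return a[0]*b[0] + a[1]*b[1] + a[2]*b[2]; }

function diffuse(normal, lightDir, lightColor, ambient) {
  var w = Math.max(dot3(normal, lightDir), 0.0); // clamp back-facing to 0
  return [
    ambient[0] + lightColor[0] * w,
    ambient[1] + lightColor[1] * w,
    ambient[2] + lightColor[2] * w
  ];
}

// A surface facing the light head-on gets the full directional color:
var c = diffuse([0, 0, 1], [0, 0, 1], [0.8, 0.8, 0.8], [0.1, 0.1, 0.1]);
// c ≈ [0.9, 0.9, 0.9]
```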

Rendering: Clipping

• Clipping a 3-D primitive returns its intersection with the view frustum
• Clipping is tricky!

In: 3 vertices → Clip → Out: 6 vertices
In: 1 polygon → Clip → Out: 2 polygons
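Why clipping can add vertices: a single plane-clipping step in the Sutherland-Hodgman style (a standard textbook approach, not necessarily what hardware does) shows a triangle gaining a vertex where its edges cross the plane:

```javascript
// Clip a 2-D polygon against the half-plane x >= 0. Each edge that
// crosses the plane emits an intersection vertex, which is how
// 3 vertices in can become more vertices out.

function clipAgainstX0(poly) {                  // poly: array of [x, y]
  var out = [];
  for (var i = 0; i < poly.length; i++) {
    var a = poly[i], b = poly[(i + 1) % poly.length];
    var aIn = a[0] >= 0, bIn = b[0] >= 0;
    if (aIn) out.push(a);                       // keep inside vertices
    if (aIn !== bIn) {                          // edge crosses the plane
      var t = a[0] / (a[0] - b[0]);             // where x hits 0
      out.push([0, a[1] + t * (b[1] - a[1])]);
    }
  }
  return out;
}

// Triangle with one vertex past the plane: 3 vertices in, 4 out.
var clipped = clipAgainstX0([[-1, 0], [1, -1], [1, 1]]);
```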

Fixed Function Pipeline
• Everything is hardcoded (e.g., lighting)
• Hard to customize effects; only parameter-based 'customization'
• Widely used, e.g., in most games from id Software

Graphics Hardware
• Companies such as Silicon Graphics (SGI) and Evans & Sutherland designed specialized and expensive graphics hardware
• The graphics systems developed by these companies introduced many of the concepts, such as vertex transformation and texture mapping, that we take for granted today
• These systems were very important to the historical development of computer graphics, but because they were so expensive, they did not achieve mass-market success

Commodity Graphics Hardware
• NVIDIA entered the graphics hardware industry and started competing with 3dfx and later ATI
• Developed 3-D graphics accelerators, later known as Graphics Processing Units (GPUs)
• Moved pipeline pieces into hardware, providing huge speedups for graphics applications (mostly games!)
• 200 MHz clock with a 1.5 Gigatexel/sec fill rate

What is a GPU?
• GPU stands for Graphics Processing Unit
• Specialized hardware for graphics
• Frees the CPU from the burden of rendering
• Used in desktops, laptops, mobile devices and so on

The GPU
• The GPU has evolved into a very powerful and flexible processor
• Programmable using high-level languages
• Supports 32-bit and 64-bit IEEE-754 floating point precision
• Capable of TeraFLOPS

FLOPS and Memory Bandwidth

Programmable Pipeline
• NVIDIA had moved all the possible units of the graphics pipeline to hardware
• Game developers wanted more control over the output of their programs
• At the vertex and pixel level, they were constrained by the fixed-function graphics pipeline

• Between 2001 and 2003, NVIDIA introduced programmable vertex shaders and pixel shaders

New Effects Using Shaders

What is a Shader?
• A program that runs on graphics hardware
• Used to be written in assembly language
• Allows a user to perform simple math operations on vectors, matrices and textures, such as add, multiply, dot product, sine, cosine, etc.

• Vertex shaders are executed once per vertex
• Geometry shaders are executed once per 'triangle'
• Fragment/pixel shaders are executed once per pixel on the screen

OpenGL Pipeline

Primitives such as Line(p0, p1) and Triangle(p0, p1, p2) enter the pipeline and are rasterized.

Data Flow in the Graphics Pipeline

Input Assembler → Vertex Processor → Primitive Assembly → Geometry Processor → Rasterization → Fragment Processor → Framebuffer
(each stage reads and writes Video Memory)

With the programmable stages attached:

Input Assembler → Vertex Processor (Vertex Shader) → Primitive Assembly → Geometry Processor (Geometry Shader) → Rasterization → Fragment Processor (Fragment Shader) → Framebuffer

Vertex Shader
• Executed once per vertex
• Transforms input vertices
• Input attributes:
  – Vertex normal
  – Texture coordinates
  – Colors

Geometry Shader
• Geometry composition; runs after primitive assembly
• Executed once per geometry (primitive)
• Input primitives:
  – Points, lines, triangles
  – Lines and triangles with adjacency
• Output primitives:
  – Points, line strips, triangle strips, quad strips
  – [0, n] primitives output

Fragment/Pixel Shader
• Per-pixel (or fragment) composition
• Executed once per fragment
• Operations on interpolated values:
  – Vertex attributes
  – User-defined varying variables

History of Languages
• RenderMan – Pixar; software used to create movies such as Toy Story
• Cg – NVIDIA; the first commercial shading language
• HLSL – Microsoft & NVIDIA (Xbox, etc.)
• GLSL – SGI / OpenGL Architecture Review Board (ARB)
• Stanford RTSL – academic shading language

Fixed Function Tasks Are Bypassed
• Vertex tasks:
  – Vertex transformations
  – Normal transformation, normalization
  – Lighting
  – Texture coordinate generation and transformation

• Fragment tasks:
  – Texture accesses
  – Fog
  – Discard fragment (culling)

This is WebGL: there is no fixed-function pipeline.

Anatomy of GLSL
• Built-in variables
  – Always prefaced with gl_
  – Accessible to both vertex and fragment shaders
• Uniform variables
  – Matrices (i.e., ModelViewMatrix, ProjectionMatrix, their inverses and transposes)
  – Materials (in the MaterialParameters struct: ambient, diffuse, etc.)
  – Lights (in the LightSourceParameters struct: specular, position, etc.)

• Varying variables
  – gl_FrontColor for colors
  – gl_TexCoord[] for texture coordinates

Anatomy of GLSL
• Vertex shaders
  – Have access to vertex attributes such as gl_Color, gl_Normal, gl_Vertex, etc.
  – Also write to special output variables such as gl_Position, gl_PointSize, etc.

• Fragment shaders
  – Have access to special input variables such as gl_FragCoord, gl_FrontFacing, etc.
  – Also write to special output variables such as gl_FragColor, gl_FragDepth, etc.

Structure of a Shader

/**
 * Comments
 */
Global definitions

void main(void) {
    // Body of the function
}

Hello World – Vertex Shader

void main(void) {
    // Pass vertex color to next stage
    gl_FrontColor = gl_Color;

    // Transform vertex position before passing it on
    gl_Position = gl_ModelViewProjectionMatrix * gl_Vertex;
}

Hello World – Geometry Shader

#extension GL_EXT_geometry_shader4 : enable

void main(void) {
    // Iterate over all vertices in the input primitive:
    // the standard pass-through copies each input vertex and emits it
    for (int i = 0; i < gl_VerticesIn; i++) {
        gl_Position = gl_PositionIn[i];
        EmitVertex();
    }
    EndPrimitive();
}

The fragment shader takes in uniforms and varyings, and outputs a color with alpha.
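As a plain-JavaScript sketch of that contract (not real GLSL): a fragment shader is conceptually a function from uniforms and interpolated varyings to an RGBA color. The names `tint` and `lightWeighting` are made up for illustration:

```javascript
// Conceptual model of a fragment shader: uniforms are constant per draw
// call, varyings are interpolated per fragment, and the output is an
// RGBA color with components in 0.0 to 1.0.

function fragmentShader(uniforms, varyings) {
  // Modulate a base color by an interpolated light weighting,
  // like the vLightWeighting computed in the vertex shader example.
  return [
    uniforms.tint[0] * varyings.lightWeighting,
    uniforms.tint[1] * varyings.lightWeighting,
    uniforms.tint[2] * varyings.lightWeighting,
    1.0                                   // opaque alpha
  ];
}

var rgba = fragmentShader(
  { tint: [1.0, 0.5, 0.0] },
  { lightWeighting: 0.5 }
);
// rgba is [0.5, 0.25, 0, 1]
```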

Steps in pipeline: gl_{variables}

Shaders must set certain gl_ variables to draw anything.

Vertex shader gl_ variables:
• vec4 gl_Position: vertex coordinates in normalized clip space
• float gl_PointSize: size if drawing point objects

Fragment shader gl_ variables:
• vec4 gl_FragColor: RGBA color to use, components in 0.0 to 1.0
• vec4 gl_FragData[gl_MaxDrawBuffers]: advanced multi-buffer feature

To see anything, the vertex shader must set gl_Position and the fragment shader must set gl_FragColor.

Example: Compiling a Shader

function getShader(gl, id) {
    var shaderScript = document.getElementById(id);

    var str = "";
    var k = shaderScript.firstChild;
    while (k) {
        if (k.nodeType == 3) {
            str += k.textContent;
        }
        k = k.nextSibling;
    }

    var shader;
    if (shaderScript.type == "x-shader/x-fragment") {
        shader = gl.createShader(gl.FRAGMENT_SHADER);
    } else if (shaderScript.type == "x-shader/x-vertex") {
        shader = gl.createShader(gl.VERTEX_SHADER);
    }

    gl.shaderSource(shader, str);
    gl.compileShader(shader);

    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        alert(gl.getShaderInfoLog(shader));
        return null;
    }

    return shader;
}

Example Vertex Shader

varying vec3 nVertexPosition;
varying vec4 clrVec;
varying vec4 nclrVec;
uniform sampler2D uSampler;

void main(void) {
    // modify the height of the terrain
    nVertexPosition[0] = aVertexPosition[0];
    nVertexPosition[1] = 0.5;
    nVertexPosition[2] = aVertexPosition[2];

    vTextureCoord = aTextureCoord;
    clrVec = texture2D(uSampler, vTextureCoord);
    nclrVec = normalize(clrVec);
    nVertexPosition[1] = nclrVec[0]*nclrVec[0]
                       + nclrVec[1]*nclrVec[1]
                       + nclrVec[2]*nclrVec[2];

    // here we calculate the lighting information
    vec3 transformedNormal = uNMatrix * aVertexNormal;
    float directionalLightWeighting =
        max(dot(transformedNormal, uLightingDirection), 0.0);
    vLightWeighting = uAmbientColor
                    + uDirectionalColor * directionalLightWeighting;

    gl_Position = uPMatrix * uMVMatrix * vec4(nVertexPosition, 1.0);
}

Example Fragment Shader

Shader Program

var fragmentShader = getShader(gl, "shader-fs");
var vertexShader = getShader(gl, "shader-vs");

shaderProgram = gl.createProgram();
gl.attachShader(shaderProgram, vertexShader);
gl.attachShader(shaderProgram, fragmentShader);
gl.linkProgram(shaderProgram);

if (!gl.getProgramParameter(shaderProgram, gl.LINK_STATUS)) { alert("Could not initialise shaders"); }

gl.useProgram(shaderProgram); // specify shader to use for draw calls

Connecting Buffers to Shaders

shaderProgram.vertexPositionAttribute = gl.getAttribLocation(shaderProgram, "aVertexPosition");
gl.enableVertexAttribArray(shaderProgram.vertexPositionAttribute);

shaderProgram.vertexNormalAttribute = gl.getAttribLocation(shaderProgram, "aVertexNormal");
gl.enableVertexAttribArray(shaderProgram.vertexNormalAttribute);

shaderProgram.textureCoordAttribute = gl.getAttribLocation(shaderProgram, "aTextureCoord");
gl.enableVertexAttribArray(shaderProgram.textureCoordAttribute);

shaderProgram.pMatrixUniform = gl.getUniformLocation(shaderProgram, "uPMatrix");
shaderProgram.mvMatrixUniform = gl.getUniformLocation(shaderProgram, "uMVMatrix");
shaderProgram.samplerUniform = gl.getUniformLocation(shaderProgram, "uSampler");

Connecting Buffers to Shaders

// code to handle lighting
shaderProgram.tnMatrixUniform = gl.getUniformLocation(shaderProgram, "uNMatrix");
shaderProgram.useLightingUniform = gl.getUniformLocation(shaderProgram, "uUseLighting");
shaderProgram.ambientColorUniform = gl.getUniformLocation(shaderProgram, "uAmbientColor");
shaderProgram.lightingDirectionUniform = gl.getUniformLocation(shaderProgram, "uLightingDirection");
shaderProgram.directionalColorUniform = gl.getUniformLocation(shaderProgram, "uDirectionalColor");

The End