
CS464 Intro to Computer Graphics

Rendering Process
• Arrays of objects you create, a pipeline you configure and control, and what appears on the screen.
• Elements of the scene: a list of graphics primitives (polygons, lines, lights), a view camera, and configuration controls (appearance is controlled by configuration and shaders).
• Rendering algorithm: for each primitive, transform it to the view and draw its pixels.
• Rendered image: the framebuffer holds our image, drawn at 60 fps.

Scene
• Axis-aligned scene: view coordinate space = object coordinate space.
• Right-handed coordinates: Z comes out of the screen.
• [Slide: example rendered scene.]

Overlapping Triangles
• How do we make certain that overlapping triangles render properly? After 2-D projection, Z is lost.

Painter's Algorithm
• Keep triangles sorted back to front by Z and render the furthest triangles first. Works pretty well.
• Overlapping (interpenetrating or cyclically overlapping) shapes cause failures that are not fixable within the painter's algorithm.

Solution: Z-Buffer
• Introduce an additional framebuffer that holds only depth information.
• In the SetPixel function, compare the z-depth of the current primitive against the value in the Z-buffer. If it is closer to the viewpoint, change the pixel; otherwise do not adjust the pixel.
• Used to be considered expensive; it is standard now.

Framebuffers
• The framebuffer is the interface between the display device and the computer's notion of an image.
• It is a memory array in which the computer stores an image.
  – On most computers, it is a separate memory bank from main memory.
  – Many different variations, motivated by the cost of memory.

Framebuffers: True-Color
• A true-color (aka 24-bit or 32-bit) framebuffer stores one byte each for red, green, and blue.
• Each pixel can thus be one of 2^24 colors.
• Pay attention to endian-ness.
• How can 24-bit and 32-bit mean the same thing here? (The fourth byte is typically padding or an alpha channel.)

Framebuffers: Indexed-Color
• An indexed-color (8-bit or PseudoColor) framebuffer stores one byte per pixel (the GIF image format works the same way).
• This byte indexes into a color map. How many colors can a pixel be?
• Common on low-end displays (cell phones, PDAs, GameBoys).
• [Illustration of how an indexed palette works: a 2-bit indexed-color image, where the color of each pixel is represented by a number and each number corresponds to a color in the palette. Image credit: Wikipedia.]

Framebuffers: Hi-Color
• Hi-Color is (was?) a popular PC SVGA standard.
• Packs pixels into 16 bits:
  – 5 red, 6 green, 5 blue (why would green get more?)
  – Sometimes just 5, 5, 5.
• Each pixel can be one of 2^16 colors.
• Hi-color images can exhibit worse quantization artifacts than a well-mapped 8-bit image.
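The Z-buffer test and the hi-color packing described above are easy to express in code. Below is a minimal sketch assuming a simple software framebuffer; the class name, field names, and the pack565 helper are made up for illustration and are not part of any particular graphics API.

```typescript
// Minimal software framebuffer sketch (hypothetical names, for illustration only).
class SoftwareFramebuffer {
  private color: Uint32Array;   // one 32-bit true-color value per pixel
  private depth: Float32Array;  // the Z-buffer: one depth value per pixel

  constructor(private width: number, private height: number) {
    this.color = new Uint32Array(width * height);
    this.depth = new Float32Array(width * height).fill(Infinity); // everything starts "far away"
  }

  // Z-buffer test: only write the pixel if this fragment is closer than what is stored.
  setPixel(x: number, y: number, z: number, rgba: number): void {
    const i = y * this.width + x;
    if (z < this.depth[i]) {    // closer to the viewpoint (assuming smaller z = nearer here)
      this.depth[i] = z;        // remember the new nearest depth
      this.color[i] = rgba;     // overwrite the color
    }                           // otherwise: leave the pixel untouched
  }
}

// Hi-color packing: 5 bits red, 6 bits green, 5 bits blue in one 16-bit value.
// Green gets the extra bit because the eye is most sensitive to green.
function pack565(r: number, g: number, b: number): number {
  return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3); // r, g, b are 0..255
}
```

Clearing a frame under this scheme means resetting both the color buffer and the depth buffer, which is why hardware APIs clear the two together.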
The Rendering Pipeline: A Tour
• The stages, in order: Model & Camera Parameters → Transform → Illuminate → Transform → Clip → Project → Rasterize → Framebuffer → Display.
• You already know the two ends of the pipeline: the display and the framebuffer.
• Rasterization is the 2-D rendering step; everything before it is the 3-D part of the pipeline.
• [The same pipeline diagram is repeated on several slides, each highlighting one stage.]

The Rendering Pipeline: 3-D
Input: scene graph and object geometry. The stages and their results:
• Modeling Transforms – all vertices of the scene in a shared 3-D "world" coordinate system.
• Lighting Calculations – vertices shaded according to the lighting model.
• Viewing Transform – scene vertices in a 3-D "view" or "camera" coordinate system.
• Clipping – exactly those vertices and portions of polygons that lie in the view frustum.
• Projection Transform – 2-D screen coordinates of the clipped vertices.

Rendering: Transformations
• So far, the discussion has been in screen space, but the model is stored in model space (a.k.a. object space or world space).
• Three sets of geometric transformations:
  – Modeling transforms
  – Viewing transforms
  – Projection transforms

Rendering: Lighting
• Illuminating a scene means coloring pixels according to some approximation of lighting.
  – Global illumination: solves for the lighting of the whole scene at once.
  – Local illumination: a local approximation, typically lighting each polygon separately.
• Interactive graphics (e.g., hardware) does only local illumination at run time.

Rendering: Clipping
• Clipping a 3-D primitive returns its intersection with the view frustum.
• Clipping is tricky! In the illustrated example, a triangle (3 vertices in) clips to a polygon with 6 vertices out, and 1 polygon in can clip to 2 polygons out. A sketch of the per-plane clipping step follows.
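To make the "3 vertices in, more vertices out" behavior concrete, here is a minimal sketch of clipping a polygon against a single plane in the style of the Sutherland-Hodgman algorithm (the slides do not name a specific algorithm; this is one common choice). The full frustum clip repeats this step for each of the six frustum planes. The types and function names are illustrative, not from any particular API.

```typescript
// A vertex in homogeneous clip coordinates (illustrative type).
type Vec4 = { x: number; y: number; z: number; w: number };

// Signed distance of a vertex from a plane (a, b, c, d): a*x + b*y + c*z + d*w.
// Vertices with dist >= 0 are on the "inside" of the plane.
function dist(v: Vec4, p: [number, number, number, number]): number {
  return p[0] * v.x + p[1] * v.y + p[2] * v.z + p[3] * v.w;
}

// Linear interpolation between two vertices at parameter t.
function lerp(a: Vec4, b: Vec4, t: number): Vec4 {
  return {
    x: a.x + t * (b.x - a.x),
    y: a.y + t * (b.y - a.y),
    z: a.z + t * (b.z - a.z),
    w: a.w + t * (b.w - a.w),
  };
}

// One clipping pass: clip a polygon against a single plane.
// A triangle can gain a vertex at every plane it crosses, which is why
// 3 vertices in can become more vertices out.
function clipAgainstPlane(poly: Vec4[], plane: [number, number, number, number]): Vec4[] {
  const out: Vec4[] = [];
  for (let i = 0; i < poly.length; i++) {
    const cur = poly[i];
    const next = poly[(i + 1) % poly.length];
    const dCur = dist(cur, plane);
    const dNext = dist(next, plane);

    if (dCur >= 0) out.push(cur);               // current vertex is inside: keep it
    if ((dCur >= 0) !== (dNext >= 0)) {         // edge crosses the plane: add intersection
      const t = dCur / (dCur - dNext);
      out.push(lerp(cur, next, t));
    }
  }
  return out;                                   // may be empty if the polygon is fully outside
}
```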
Fixed Function Pipeline
• Everything is hard-coded: lighting, texture mapping.
• Hard to customize effects; only parameter-based "customization" is available.
• Widely used in most games from id Software.

Graphics Hardware
• Companies such as Silicon Graphics (SGI) and Evans & Sutherland designed specialized and expensive graphics hardware.
• The graphics systems developed by these companies introduced many of the concepts, such as vertex transformation and texture mapping, that we take for granted today.
• These systems were very important to the historical development of computer graphics, but because they were so expensive, they did not achieve mass-market success.

Commodity Graphics Hardware
• NVIDIA entered the graphics hardware industry and started competing with 3dfx and later ATI.
• It developed 3-D graphics accelerators, later known as Graphics Processing Units (GPUs).
• Moving pieces of the graphics pipeline into hardware provided huge speedups for graphics applications (mostly games!).
• Example of the era: a 200 MHz clock with a 1.5 Gigatexel/sec fill rate.

What is a GPU?
• GPU stands for Graphics Processing Unit: specialized hardware for graphics.
• It frees the CPU from the burden of rendering.
• Used in desktops, laptops, mobile devices, and so on.

The GPU
• The GPU has evolved into a very powerful and flexible processor.
• Programmable using high-level languages.
• Supports 32-bit and 64-bit IEEE-754 floating point.
• Capable of TeraFLOPS. [Chart slide: FLOPS and memory bandwidth.]

Programmable Pipeline
• NVIDIA had moved all the possible units of the graphics pipeline into hardware, but game developers wanted more control over the output of their programs.
• At the vertex and pixel level, they were constrained by the fixed-function graphics pipeline.
• Between 2001 and 2003, NVIDIA introduced programmable vertex shaders and pixel shaders.
• [Two slides of example images: new effects using shaders.]

What is a shader?
• A program that runs on graphics hardware (a minimal example pair is sketched below).
• Shaders used to be written in assembly language.
• A shader lets the user perform simple math operations on vectors, matrices, and textures: add, multiply, dot product, sine, cosine, etc.
• Vertex shaders are executed once per vertex.
• Geometry shaders are executed once per "triangle" (primitive).
• Fragment/pixel shaders are executed once per pixel on the screen.
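As a concrete illustration of the vertex/fragment split described above, here is a minimal GLSL shader pair, written as WebGL-style source strings. The attribute, uniform, and varying names are made up for this sketch.

```typescript
// Minimal GLSL ES 1.00 shader pair (WebGL 1 style); names are illustrative only.

// Runs once per vertex: transform the position and pass the color along.
const vertexShaderSource = `
  attribute vec3 aPosition;          // per-vertex input
  attribute vec3 aColor;
  uniform mat4 uModelViewProjection; // combined modeling, viewing, projection transform
  varying vec3 vColor;               // interpolated and handed to the fragment shader

  void main() {
    vColor = aColor;
    gl_Position = uModelViewProjection * vec4(aPosition, 1.0);
  }
`;

// Runs once per fragment (pixel): decide the final color.
const fragmentShaderSource = `
  precision mediump float;
  varying vec3 vColor;               // value interpolated across the triangle

  void main() {
    gl_FragColor = vec4(vColor, 1.0);
  }
`;
```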
OpenGL Pipeline
• Primitives such as Line(p0, p1) and Triangle(p0, p1, p2) are fed to rasterization.

Data Flow in the Graphics Pipeline
• Input Assembler → Vertex Processor → Primitive Assembly → Geometry Processor → Rasterization → Fragment Processor → Framebuffer, all with access to video memory.
• In the programmable pipeline, the Vertex Processor runs the vertex shader, the Geometry Processor runs the geometry shader, and the Fragment Processor runs the fragment shader.

Vertex Shader
• Executed once per vertex; transforms input vertices.
• Input attributes: vertex normal, texture coordinates, colors.

Geometry Shader
• Handles geometry composition; executed once per geometry (primitive).
• Input primitives: points, lines, triangles, and lines and triangles with adjacency.
• Output primitives: points, line strips, triangle strips, quad strips.
• [0, n] primitives may be output.

Fragment/Pixel Shader
• Handles per-pixel (per-fragment) composition; executed once per fragment.
• Operates on interpolated values: vertex attributes and user-defined varying variables.

History of Shading Languages
• RenderMan – Pixar; software used to create movies such as Toy Story.
• Cg – NVIDIA; the first commercial shading language.
• HLSL – Microsoft & NVIDIA (Xbox etc.).
• GLSL – SGI/3DLabs and the Architecture Review Board (ARB).
• Stanford RTSL – an academic shading language.

Fixed-Function Tasks Are Bypassed
• Vertex tasks: vertex transformations; normal transformation and normalization; lighting; texture coordinate generation and transformation.
• Fragment tasks: texture accesses; fog; discarding fragments (culling).
• This is WebGL = no fixed-function pipeline: your program must supply these steps itself (see the sketch below).

Anatomy of GLSL
• Built-in variables – Always
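Because WebGL has no fixed-function pipeline, every application has to compile, link, and use its own shader program. A minimal sketch follows, assuming a WebGLRenderingContext named gl and the two shader source strings from the earlier example; error handling is reduced to simple exceptions.

```typescript
// Compile one shader stage (vertex or fragment) from GLSL source.
function compileShader(gl: WebGLRenderingContext, type: number, source: string): WebGLShader {
  const shader = gl.createShader(type)!;
  gl.shaderSource(shader, source);
  gl.compileShader(shader);
  if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
    throw new Error(gl.getShaderInfoLog(shader) ?? "shader compile failed");
  }
  return shader;
}

// Link a vertex shader and a fragment shader into a program object.
function createProgram(gl: WebGLRenderingContext, vsSource: string, fsSource: string): WebGLProgram {
  const program = gl.createProgram()!;
  gl.attachShader(program, compileShader(gl, gl.VERTEX_SHADER, vsSource));
  gl.attachShader(program, compileShader(gl, gl.FRAGMENT_SHADER, fsSource));
  gl.linkProgram(program);
  if (!gl.getProgramParameter(program, gl.LINK_STATUS)) {
    throw new Error(gl.getProgramInfoLog(program) ?? "program link failed");
  }
  return program;
}

// Usage: every draw call goes through a program you supplied yourself.
// const program = createProgram(gl, vertexShaderSource, fragmentShaderSource);
// gl.useProgram(program);
```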