InfiniteReality™ Video Format Combiner User's Guide


Document Number 007-3279-001

CONTRIBUTORS

Written by Carolyn Curtis
Illustrated by Carolyn Curtis
Production by Laura Cooper
Engineering contributions by Gregory Eitzmann, David Naegle, Chase Garfinkle, Ashok Yerneni, Rob Wheeler, Ben Garlick, Mark Schwenden, and Ed Hutchins
Cover design and illustration by Rob Aguilar, Rikk Carey, Dean Hodgkinson, Erik Lindholm, and Kay Maitz

© 1996, Silicon Graphics, Inc.— All Rights Reserved

The contents of this document may not be copied or duplicated in any form, in whole or in part, without the prior written permission of Silicon Graphics, Inc.

RESTRICTED RIGHTS LEGEND

Use, duplication, or disclosure of the technical data contained in this document by the Government is subject to restrictions as set forth in subdivision (c) (1) (ii) of the Rights in Technical Data and Computer Software clause at DFARS 52.227-7013 and/or in similar or successor clauses in the FAR, or in the DOD or NASA FAR Supplement. Unpublished rights reserved under the Copyright Laws of the United States. Contractor/manufacturer is Silicon Graphics, Inc., 2011 N. Shoreline Blvd., Mountain View, CA 94043-1389.

Silicon Graphics, the Silicon Graphics logo, Onyx, and IRIS are registered trademarks and IRIX, Sirius Video, POWER Onyx, and InfiniteReality are trademarks of Silicon Graphics, Inc. X Window System is a trademark of Massachusetts Institute of Technology.

Contents

List of Figures
List of Tables
About This Guide
  Audience
  Structure of This Guide
  Conventions
1. InfiniteReality Graphics and the Video Format Combiner Utility
  Channel Input and Output
  Video Format Combinations
  Video Format Combiner Utility and the Sirius Video Option
  Programmable Querying of Video Format Combinations
2. Creating and Setting Video Format Combinations
  Loading the Video Format Combiner GUI
  Bandwidth Used Scale
  Reduced Fill Scale
  Error Indicator
  Downloading a Video Format Combination
  Specifying Channels
  Selecting a Video Format for a Channel
  Resizing and Repositioning a Channel
  Copying a Channel
  Aligning Channels
  Associating a Video Channel With an X Window
  Modifying Channel Attributes
    Field Layout
    Format
    Origin and Size
    Pixel Format
    Dither
    Setup
    Tri-Level Sync
    Cursor Priority
    Channel Placement in the Framebuffer
    Horizontal and Vertical Phase
    Sync
    Gamma
    Alpha Channel
    Gain
  Selecting Channels for a Combination
  Changing Global Attributes for Combinations
    Setting Pixel Depth
    Setting Sync Source and Format
  Saving a Video Format Combination
  Using the Target Hardware Facility
3. Using the Encoder Channel
  Setting Up the Encoder Channel
    Independent Mode
    Dependent Mode
  Modifying Encoder Channel Attributes
  Specifying the Filter Width
4. Using the Combiner With Sirius Video
  Setting Up the Sirius Video Channel
    Independent Mode
    Dependent Mode
  Modifying Sirius Video Channel Attributes
5. Combination Considerations
  Trading Off Format Parameters in a Combination
    Swap Rate
    Transmission Bandwidth
    DAC Output Bandwidth
    Framebuffer Memory and Read/Write Bandwidth
  Emergency Backup Combination
A. Error Messages
Index

List of Figures

Figure 2-1  Video Format Combiner Main Window
Figure 2-2  Select Format Window
Figure 2-3  Video Format Specified for Channel 0
Figure 2-4  Channel Attributes Window
Figure 2-5  Stacked Field Layout
Figure 2-6  Framebuffer
Figure 2-7  Run-Time Panning and Tile Boundaries
Figure 2-8  Combination Attributes Window
Figure 2-9  Framebuffer and Pixel Depth
Figure 2-10 Target Hardware Window
Figure 3-1  Encoder, Roaming in Channel 2
Figure 3-2  Encoder Channel Attributes Window
Figure 3-3  Varying the Width of the Filter
Figure 4-1  Sirius Video Channel Attributes Window
Figure Gl-1 Color Burst and Chrominance Signal
Figure Gl-2 Component Video Signals
Figure Gl-3 Horizontal Blanking
Figure Gl-4 Horizontal Blanking Interval
Figure Gl-5 Waveform Monitor Readings With and Without Setup
Figure Gl-6 Red or Blue Signal
Figure Gl-7 Y or Green Plus Sync Signal
Figure Gl-8 Video Waveform: Composite Video Signal With Setup (Typical NTSC)
Figure Gl-9 Video Waveform: Composite Video Signal (Typical PAL)

List of Tables

Table 2-1  Resizing and Repositioning

About This Guide

The Video Format Combiner is a utility for the InfiniteReality™ video display subsystem for the Silicon Graphics® Onyx® or POWER Onyx™ deskside or rackmount graphics workstation running IRIX™ 6.2 or later. The display subsystem supports programmable pixel timings to allow the system to drive displays with a wide variety of resolutions, refresh rates, and interlace/non-interlace characteristics.

Each InfiniteReality pipe includes either two or eight video output channels. The Video Format Combiner is the utility that you use to select video output formats for the channels and to group the formats of the individual channels into a combination. You also use the Combiner to define the per-channel and global parameters of the combination.

Audience

This guide is written for the sophisticated graphics and video user who wants to develop arrangements of video formats that use InfiniteReality capabilities.

Structure of This Guide

This guide contains the following chapters:

• Chapter 1, "InfiniteReality Graphics and the Video Format Combiner Utility," introduces the main features of InfiniteReality and of the Combiner graphical user interface.
• Chapter 2, "Creating and Setting Video Format Combinations," explains how to use Combiner features to design video format combinations by selecting channels and specifying attributes.
• Chapter 3, "Using the Encoder Channel," describes how to use the Combiner Encoder channel, which encodes its own or another channel's pixels to NTSC or PAL for industrial-quality video.
• Chapter 4, "Using the Combiner With Sirius Video," explains how to configure the Sirius Video™ channel that interfaces with an installed Sirius Video option.
• Chapter 5, "Combination Considerations," gives tips for creating viable video format combinations and explains the Emergency Backup Combination and setting up an analog alpha out channel.
• Appendix A, "Error Messages," presents Video Format Combiner error messages, which appear near the bottom of the Combiner main window, and suggests solutions.

A glossary and an index complete this guide.

Conventions

In command syntax descriptions and examples, square brackets ( [ ] ) surrounding an argument indicate an optional argument. Variable parameters are in italics. Replace these variables with the appropriate string or value.

In text descriptions, IRIX filenames are in italics. The names of keyboard keys are printed in typewriter font and enclosed in angle brackets, such as <Enter> or <Esc>. Messages and prompts that appear on-screen are shown in fixed-width type. Entries that are to be typed exactly as shown are in boldface fixed-width type.

1. InfiniteReality Graphics and the Video Format Combiner Utility

The InfiniteReality video display subsystem takes rendered images from the raster subsystem digital framebuffer and processes the pixels through digital-to-analog converters (DACs) to generate an analog pixel stream suitable for display on a high-resolution RGB video monitor. The display subsystem supports programmable pixel timings so that the system can drive displays with a wide variety of resolutions, refresh rates, and interlace/non-interlace characteristics.

The InfiniteReality hardware includes either two or eight independent video channels in the standard configuration. The second video channel serves a dual role as a standard RGB video channel or as a composite or S-Video encoder.

For the InfiniteReality video graphics display subsystem, Silicon Graphics provides a configuration utility, the Video Format Combiner, ircombine(1G). The Combiner groups video output formats into a video format combination—descriptions of raster sizes and timing to be used on video outputs—to validate that they operate correctly on the InfiniteReality video display subsystem hardware. The Combiner also configures the underlying graphics framebuffer and instructs the video subsystem to convert digital information stored there into a variety of video signals, or channels, ranging from high-resolution to low-resolution outputs. The output can then be displayed on additional monitors or projection devices, or stored on videotape, in any combination. Output can be genlocked to an external reference signal.

This chapter discusses

• channel input and output
• video format combinations
• the Video Format Combiner utility and the Sirius Video option
• programmable querying of video format combinations

Note: For more information on ircombine(1G), consult its reference page.
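To make the idea of a combination concrete, the following sketch models a combination as a small set of channel descriptors plus global attributes, and checks that each channel's input rectangle lies within the framebuffer area managed by X. This is an illustration only, written under stated assumptions: the structure names, fields, and values are invented for the example and are not the data structures or interfaces of ircombine(1G) or any IRIX programming library.

    /*
     * Illustrative sketch only: a minimal data model of what a video format
     * combination describes (channel rectangles in the framebuffer, output
     * formats, and global attributes).  All names and fields are hypothetical.
     */
    #include <stdio.h>

    typedef struct {
        int   fb_x, fb_y;            /* origin of the channel's input area in the framebuffer */
        int   fb_width, fb_height;   /* size of that input area, in pixels */
        int   out_width, out_height; /* active pixels of the output video format */
        float refresh_hz;            /* refresh rate of the output format */
    } Channel;

    typedef struct {
        int     num_channels;        /* InfiniteReality provides two or eight channels */
        Channel channel[8];
        int     pixel_depth;         /* global attribute: framebuffer bits per component */
        int     external_sync;       /* global attribute: genlock to an external reference */
    } Combination;

    /* Check that every channel's input rectangle lies inside the framebuffer
     * area managed by the X Window System.  Channels may overlap each other. */
    static int
    combination_fits(const Combination *c, int managed_width, int managed_height)
    {
        int i;
        for (i = 0; i < c->num_channels; i++) {
            const Channel *ch = &c->channel[i];
            if (ch->fb_x < 0 || ch->fb_y < 0 ||
                ch->fb_x + ch->fb_width  > managed_width ||
                ch->fb_y + ch->fb_height > managed_height) {
                printf("channel %d extends outside the managed framebuffer area\n", i);
                return 0;
            }
        }
        return 1;
    }

    int main(void)
    {
        /* A hypothetical two-channel combination: one high-resolution channel
         * and one NTSC-sized channel sharing the same framebuffer area. */
        Combination c = { 2, { { 0, 0, 1280, 1024, 1280, 1024, 72.0f },
                               { 0, 0,  646,  486,  646,  486, 30.0f } },
                          10, 1 };
        printf("combination %s\n",
               combination_fits(&c, 1280, 1024) ? "fits" : "does not fit");
        return 0;
    }

Chapter 2 describes how the real per-channel attributes (field layout, pixel format, sync, gamma, and so on) and global attributes are set through the Combiner's windows.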
Channel Input and Output

All channels derive their pixels from areas on the raster (framebuffer area). You can use the Combiner to select any area of the screen, or the contents of any window on the screen, as channel input; channels can overlap. A convenience feature allows you to click a specific window on the screen for channel input. You can also copy one channel's format to another.

InfiniteReality's multichannel feature takes the digital, ordered pixel output from the framebuffer and allows you to specify two to eight separate rectangular areas to be sent from the rectangular framebuffer area managed by the X Window System™ to independent component RGB video outputs. Each video channel can have its own video timing. This capability is particularly useful for applications such as visual simulation, virtual reality, or entertainment.
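Using a window's contents as channel input comes down to knowing that window's on-screen origin and size. The fragment below is a generic Xlib sketch of how a program can obtain such a rectangle; it is not the Combiner's own implementation, and the window queried here is simply the root window as a stand-in for a real client window.

    /*
     * Generic Xlib sketch: obtain a window's on-screen rectangle, the kind of
     * information needed to use that window's contents as channel input.
     * The window "win" is assumed to be known; here the root window is used
     * only as a placeholder.
     */
    #include <stdio.h>
    #include <X11/Xlib.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);      /* connect to the X server */
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        Window win = DefaultRootWindow(dpy);    /* stand-in; normally a client window */
        Window root, child;
        int x, y, abs_x, abs_y;
        unsigned int width, height, border, depth;

        /* Size and position relative to the window's parent ... */
        XGetGeometry(dpy, win, &root, &x, &y, &width, &height, &border, &depth);
        /* ... and the absolute screen position of its origin. */
        XTranslateCoordinates(dpy, win, root, 0, 0, &abs_x, &abs_y, &child);

        printf("window rectangle: origin (%d, %d), size %u x %u\n",
               abs_x, abs_y, width, height);

        XCloseDisplay(dpy);
        return 0;
    }

In the Combiner itself, the equivalent point-and-click operation is described in Chapter 2 under "Associating a Video Channel With an X Window."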
Video Format Combinations

After you set up the channels, you can save them as a video format combination. You can download a video format combination into the hardware as the current video configuration, store it as the default configuration to be used at system power-on or graphics initialization, or save it in a video format combination file.
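Not every set of channel formats can be downloaded as a combination; the hardware imposes limits that Chapter 5 discusses under transmission bandwidth, DAC output bandwidth, and framebuffer read/write bandwidth, and the Combiner's Bandwidth Used scale reports the actual figure. The back-of-the-envelope estimate below only illustrates how the demand of a combination adds up across channels; the 20 percent blanking allowance and the 300 Mpixel/s budget are placeholder numbers for the example, not InfiniteReality specifications.

    /*
     * Rough estimate of the video output pixel rate a combination demands.
     * The blanking overhead and the budget below are illustrative placeholders,
     * not InfiniteReality hardware specifications.
     */
    #include <stdio.h>

    struct format { int width, height; double refresh_hz; };

    /* Approximate pixel rate: active pixels per frame times refresh rate,
     * plus an assumed 20% allowance for horizontal and vertical blanking. */
    static double pixel_rate(struct format f)
    {
        return (double)f.width * f.height * f.refresh_hz * 1.20;
    }

    int main(void)
    {
        struct format channels[] = {
            { 1280, 1024, 72.0 },   /* two high-resolution channels ... */
            { 1280, 1024, 72.0 },
            {  646,  486, 30.0 },   /* ... plus an NTSC-sized channel */
        };
        double total = 0.0;
        const double limit = 300e6;   /* assumed budget, pixels per second */
        unsigned i;

        for (i = 0; i < sizeof(channels) / sizeof(channels[0]); i++) {
            double r = pixel_rate(channels[i]);
            printf("channel %u: %.1f Mpixels/s\n", i, r / 1e6);
            total += r;
        }
        printf("total: %.1f Mpixels/s (%s the assumed %.0f Mpixel/s budget)\n",
               total / 1e6, total <= limit ? "within" : "exceeds", limit / 1e6);
        return 0;
    }

Chapter 5, "Combination Considerations," explains the real constraints (swap rate, transmission bandwidth, DAC output bandwidth, and framebuffer memory and read/write bandwidth) and how to trade off format parameters within them.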