US007003588B1

(12) United States Patent    (10) Patent No.: US 7,003,588 B1

Takeda et al.    (45) Date of Patent: Feb. 21, 2006

(54) PERIPHERAL DEVICES FOR A VIDEO GAME SYSTEM

(75) Inventors: Genyo Takeda, Kyoto (JP); Ko Shiota, Kyoto (JP); Munehito Oira, Kyoto (JP); Satoshi Nishiumi, Kyoto (JP)

(73) Assignee: Nintendo Co., Ltd., Kyoto (JP)

(*) Notice: Subject to any disclaimer, the term of this patent is extended or adjusted under 35 U.S.C. 154(b) by 0 days.

(21) Appl. No.: 10/225,488

(22) Filed: Aug. 22, 2002

Related U.S. Application Data

(60) Provisional application No. 60/313,812, filed on Aug. 22, 2001.

(51) Int. Cl.: A63F 13/00 (2006.01)

(58) Field of Classification Search: 710/100, 300-304, 305, 62, 72, 8-13; 463/30, 47, 16. See application file for complete search history.

Primary Examiner—Christopher Shin
(74) Attorney, Agent, or Firm—Nixon & Vanderhye, P.C.

(57) ABSTRACT

A peripheral device is provided for connection to a home video game system having a recess formed in an external surface thereof. The peripheral device includes an electrical component, an electrical connector coupled to the electrical component for connecting to a connector of the home video game system, and a housing. The housing of the peripheral device is configured so that when the peripheral device is inserted in the recess of the home video game system, it is substantially flush with the external surface of the home video game system. In this way, the footprint of the video game system can remain the same, even if peripheral devices are added.

10 Claims, 14 Drawing Sheets



[Drawing Sheets 1-14 (figures not reproduced). Recoverable captions: FIG. 5, "Example Graphics Processor Flow" (Sheet 5); FIG. 6, "Example Input/Output Subsystem" (Sheet 6); FIG. 11 (Sheet 11); the remaining sheets contain the figures listed in the Brief Description of the Drawings.]

PERIPHERAL DEVICES FOR A VIDEO GAME SYSTEM

CROSS-REFERENCE TO RELATED APPLICATION

This application claims priority under 35 U.S.C. 119 to provisional Application No. 60/313,812, filed Aug. 22, 2001, the contents of which are incorporated herein.

FIELD OF THE INVENTION

The present invention relates to computer graphics, and more particularly to interactive graphics systems such as home video game systems. Still more particularly, this invention relates to peripheral devices for connection to a home video game system.

BACKGROUND AND SUMMARY OF THE INVENTION

Many of us have seen films containing remarkably realistic dinosaurs, aliens, animated toys and other fanciful creatures. Such animations are made possible by computer graphics. Using such techniques, a computer graphics artist can specify how each object should look and how it should change in appearance over time, and a computer then models the objects and displays them on a display such as your television or a computer screen. The computer takes care of performing the many tasks required to make sure that each part of the displayed image is colored and shaped just right based on the position and orientation of each object in a scene, the direction in which light seems to strike each object, the surface texture of each object, and other factors.

Because computer graphics generation is complex, computer-generated three-dimensional graphics just a few years ago were mostly limited to expensive specialized flight simulators, high-end graphics workstations and supercomputers. The public saw some of the images generated by these computer systems in movies and expensive television advertisements, but most of us couldn't actually interact with the computers doing the graphics generation. All this has changed with the availability of relatively inexpensive 3D graphics systems such as, for example, the Nintendo 64® and various 3D graphics cards now available for personal computers. It is now possible to interact with exciting 3D animations and simulations on relatively inexpensive computer graphics systems in your home or office.

A problem graphics system designers confronted in the past was how to efficiently couple system components together. A modern 3D graphics system is relatively complex, and requires a number of different connections between different aspects of the system. For example, it is often necessary to interface with a mass storage device such as an optical disk. In addition, in an interactive real time system such as a gaming system, some means must be provided to interface with user-manipulable controls such as hand-held controllers or the like. Sound is typically required, so that interfaces with various sound-producing and sound-supporting components are required. It is also necessary to provide some interfacing means for interfacing the system with a display device of an appropriate configuration. Additionally, it is often desirable to interface the system with a number of other components such as, for example, read only memory, flash memory, various memory cards, modems or other network connections, and debugging facilities for game or other application development. Various solutions to this problem were offered.

One approach would be to use standardized interfaces. Computer equipment manufacturers have developed a number of standardized interfaces in the past to connect with mass storage devices, modems, and other peripheral devices. Using standardized interfaces tends to simplify design efforts and achieve component compatibility and interoperability. The typical personal computer has a number of standardized interfaces so it can be modular and compatible with hardware and peripheral devices designed by a number of different manufacturers. Designing a new personal computer does not require redesign of all of these interfaces.

While the standardized interface approach has certain advantages in the arena of general purpose computing, it may not be suitable for home video game systems. Because a home video game system must be manufactured at low cost and yet achieve maximum performance, it is desirable to optimize each and every aspect of the system, including the system interfaces. The interfaces can be looked at as the highways over which information flows throughout the system. This information traffic should flow as rapidly and efficiently as possible. Using standard interfaces may be easier from a design standpoint, but a standardized interface may not provide the high performance that a customized interface might offer. "One size fits all" makes things easier, but doesn't always result in the best possible fit.

Another issue relates to hardware interoperability. Standardized interfaces provide the advantage that everyone can design components that are compatible with them. For example, when you buy a personal computer having a standardized serial interface, parallel interface and expansion device interface, you know that you can go out and purchase any of a variety of different devices, all of which will be compatible with those standardized interfaces. You can plug in any of a dozen different types of printers to either the serial or the parallel interface of your personal computer, and they will all work. Similarly, any of dozens of different modems or other network cards can be plugged into the PCMCIA card slot of a personal computer or laptop, and all of these different cards will work.

Open standards have the advantage that they achieve hardware interoperability between systems and a wide range of different accessories. This approach is helpful when the system manufacturer is selling a general purpose device that can be used for virtually any application, but makes less sense in the home video game arena, where a given video game manufacturer is responsible for making or licensing all of the various special-purpose accessories for its brand of home video game system.

For example, video game manufacturers in the past have expended substantial time, effort and resources to develop definitive new home video game systems. They want to sell as many of these as possible, and therefore price them very competitively. Like the razor manufacturer who recoups his investment by selling razor blades as opposed to the razor itself, video game system manufacturers rely on controlling access to the installed user base of home video game systems to achieve profits through licensing. If the home video game system used open standards, then competing manufacturers could bypass the company that invested all the time, effort and resources to develop the system to begin with, and could instead market directly to consumers. Accordingly, under this business model, it is important for the system manufacturer to be able to control access to the system.

One technique used successfully in the past to control access to home video game systems was to incorporate security systems that control access to the system.
system can enable the system to accept or reject things US 7,003,588 B1 3 4 plugged into it. As one example, it is possible to include an DETAILED DESCRIPTION OF EXAMPLE integrated circuit chip authentication type device in home EMBODIMENTS OF THE INVENTION video game cartridges. Before the home video game system interoperates with the cartridge or other peripheral device, it FIG. 1 shows an example interactive 3D computer graph may first authenticate the cartridge or other peripheral 5 ics system 50. System 50 can be used to play interactive 3D device by use of the security chip. While this approach can video games with interesting stereo sound. It can also be be highly successful, it requires each peripheral device to used for a variety of other applications. include authentication type information and/or devices. This In this example, system 50 is capable of processing, increases cost. In addition, no security system is impen interactively in real time, a digital representation or model of etrable. Given enough time, effort and resources, any secu 10 a three-dimensional world. System 50 can display some or rity system can be “cracked” to unlock access to the system. all of the world from any arbitrary viewpoint. For example, Thus, further improvements are desirable. system 50 can interactively change the viewpoint in In addition, video game systems typically compete for response to real time inputs from handheld controllers 52a, space atop or around televisions with other electronic com 52b or other input devices. This allows the game player to ponents such as video cassette recorders, DVD players, 15 see the world through the eyes of someone within or outside set-top boxes for cable and satellite systems and the like. of the world. System 50 can be used for applications that do This “competition” can be exacerbated if peripheral devices not require real time 3D interactive display (e.g., 2D display are to be added on to the video game system. Accordingly, generation and/or non-interactive display), but the capability it would be desirable to provide a video game system that of displaying quality 3D images very quickly can be used to provides a relatively small footprint even when peripheral 20 create very realistic and exciting game play or other graphi devices are added to the system. cal interactions. The present application describes a home video game To play a video game or other application using system system having at least one surface provided with one or 50, the user first connects a main unit 54 to his or her color more recesses therein. The video game system also has a television set 56 or other display device by connecting a connector within the recess for connecting peripheral 25 cable 58 between the two. Main unit 54 produces both video devices inserted in the recess to the game processing cir signals and audio signals for controlling color television set cuitry. The peripheral device includes an electrical compo 56. The video signals are what controls the images displayed ment, an electrical connector coupled to the electrical com on the television screen 59, and the audio signals are played ponent for connecting to a connector of the home video back as sound through television stereo loudspeakers 61L, game system, and a housing. The housing of the peripheral 30 device is configured so that when the peripheral device is 61R. The user also needs to connect main unit 54 to a power inserted in the recess of the home video game system, it is source. 
In addition, video game systems typically compete for space atop or around televisions with other electronic components such as video cassette recorders, DVD players, set-top boxes for cable and satellite systems and the like. This "competition" can be exacerbated if peripheral devices are to be added on to the video game system. Accordingly, it would be desirable to provide a video game system that provides a relatively small footprint even when peripheral devices are added to the system.

The present application describes a home video game system having at least one surface provided with one or more recesses therein. The video game system also has a connector within the recess for connecting peripheral devices inserted in the recess to the game processing circuitry. The peripheral device includes an electrical component, an electrical connector coupled to the electrical component for connecting to a connector of the home video game system, and a housing. The housing of the peripheral device is configured so that when the peripheral device is inserted in the recess of the home video game system, it is substantially flush with the external surface of the home video game system. In this way, the footprint of the video game system can remain the same, even if peripheral devices are added. In addition, because the shapes of the peripheral devices are non-standard and unusual, they provide uniqueness that can be used as a basis for excluding unlicensed and unauthorized people from manufacturing components that are compatible with the video game system. This allows a home video game system developer to protect its substantial investment in the development of the system.

BRIEF DESCRIPTION OF THE DRAWINGS

These and other features and advantages provided by the invention will be better and more completely understood by referring to the following detailed description of presently preferred embodiments in conjunction with the drawings, of which:

FIG. 1 is an overall view of an example interactive computer graphics system;

FIG. 2 is a block diagram of the FIG. 1 example computer graphics system;

FIG. 3 is a block diagram of the example graphics and audio processor shown in FIG. 2;

FIG. 4 is a block diagram of the example 3D graphics processor shown in FIG. 3;

FIG. 5 is an example logical flow diagram of the FIG. 4 graphics and audio processor;

FIG. 6 shows an example input/output subsystem block diagram;

FIG. 7 is a block diagram of an example overall more detailed embodiment of an example video game system; and

FIGS. 8-14B show external views of an example video game system showing example external connector configurations.

DETAILED DESCRIPTION OF EXAMPLE EMBODIMENTS OF THE INVENTION

FIG. 1 shows an example interactive 3D computer graphics system 50. System 50 can be used to play interactive 3D video games with interesting stereo sound. It can also be used for a variety of other applications.

In this example, system 50 is capable of processing, interactively in real time, a digital representation or model of a three-dimensional world. System 50 can display some or all of the world from any arbitrary viewpoint. For example, system 50 can interactively change the viewpoint in response to real time inputs from handheld controllers 52a, 52b or other input devices. This allows the game player to see the world through the eyes of someone within or outside of the world. System 50 can be used for applications that do not require real time 3D interactive display (e.g., 2D display generation and/or non-interactive display), but the capability of displaying quality 3D images very quickly can be used to create very realistic and exciting game play or other graphical interactions.

To play a video game or other application using system 50, the user first connects a main unit 54 to his or her color television set 56 or other display device by connecting a cable 58 between the two. Main unit 54 produces both video signals and audio signals for controlling color television set 56. The video signals are what controls the images displayed on the television screen 59, and the audio signals are played back as sound through television stereo loudspeakers 61L, 61R. The user also needs to connect main unit 54 to a power source. This power source may be a conventional AC adapter (not shown) that plugs into a standard home electrical wall socket and converts the house current into a lower DC voltage signal suitable for powering the main unit 54. Batteries could be used in other implementations.

The user may use hand controllers 52a, 52b to control main unit 54. Controls 60 can be used, for example, to specify the direction (up or down, left or right, closer or further away) that a character displayed on television 56 should move within a 3D world. Controls 60 also provide input for other applications (e.g., menu selection, pointer/cursor control, etc.). Controllers 52 can take a variety of forms. In this example, controllers 52 shown each include controls 60 such as joysticks, push buttons and/or directional switches. Controllers 52 may be connected to main unit 54 by cables or wirelessly via electromagnetic (e.g., radio or infrared) waves.

To play an application such as a game, the user selects an appropriate storage medium 62 storing the video game or other application he or she wants to play, and inserts that storage medium into a slot 64 (or other storage medium receiving means such as a tray) in main unit 54. Storage medium 62 may, for example, be a specially encoded and/or encrypted optical and/or magnetic disk. The user may operate a power switch 66 to turn on main unit 54 and cause the main unit to begin running the video game or other application based on the software stored in the storage medium 62. The user may operate controllers 52 to provide inputs to main unit 54. For example, operating a control 60 may cause the game or other application to start. Moving other controls 60 can cause animated characters to move in different directions or change the user's point of view in a 3D world. Depending upon the particular software stored within the storage medium 62, the various controls 60 on the controller 52 can perform different functions at different times.
Example Electronics of Overall System

FIG. 2 shows a block diagram of example components of system 50. The primary components include:

a main processor (CPU) 110,

a main memory 112, and

a graphics and audio processor 114.

In this example, main processor 110 (e.g., an enhanced IBM Power PC 750) receives inputs from handheld controllers 52 (and/or other input devices) via graphics and audio processor 114. Main processor 110 interactively responds to user inputs, and executes a video game or other program supplied, for example, by external storage media 62 via a mass storage access device 106 such as an optical disk drive. As one example, in the context of video game play, main processor 110 can perform collision detection and animation processing in addition to a variety of interactive and control functions.

In this example, main processor 110 generates 3D graphics and audio commands and sends them to graphics and audio processor 114. The graphics and audio processor 114 processes these commands to generate interesting visual images on display 59 and interesting stereo sound on stereo loudspeakers 61R, 61L or other suitable sound-generating devices.

Example system 50 includes a video encoder 120 that receives image signals from graphics and audio processor 114 and converts the image signals into analog and/or digital video signals suitable for display on a standard display device such as a computer monitor or home color television set 56. System 50 also includes an audio codec (compressor/decompressor) 122 that compresses and decompresses digitized audio signals and may also convert between digital and analog audio signaling formats as needed. Audio codec 122 can receive audio inputs via a buffer 124 and provide them to graphics and audio processor 114 for processing (e.g., mixing with other audio signals the processor generates and/or receives via a streaming audio output of mass storage access device 106). Graphics and audio processor 114 in this example can store audio related information in an SDRAM audio memory 126 that is available for audio tasks. Graphics and audio processor 114 provides the resulting audio output signals to audio codec 122 for decompression and conversion to analog signals (e.g., via buffer amplifiers 128L, 128R) so they can be reproduced by loudspeakers 61L, 61R of the television.
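For illustration only, the mixing step mentioned above (combining streaming disc audio, DSP output and external audio input) can be pictured as summing PCM samples with saturation. The following C sketch is not from the patent; the three-source signature, buffer layout and saturation choice are assumptions.

```c
/* Minimal sketch of mixing several 16-bit PCM sources with saturation,
 * in the spirit of the mixing described above. Names are hypothetical. */
#include <stdint.h>
#include <stddef.h>

static int16_t saturate16(int32_t s)
{
    if (s >  32767) return  32767;
    if (s < -32768) return -32768;
    return (int16_t)s;
}

void mix_audio(const int16_t *stream_a, const int16_t *stream_b,
               const int16_t *stream_c, int16_t *out, size_t n)
{
    for (size_t i = 0; i < n; i++) {
        int32_t sum = (int32_t)stream_a[i] + stream_b[i] + stream_c[i];
        out[i] = saturate16(sum);   /* clamp instead of wrapping on overflow */
    }
}
```

Clamping rather than letting the sum wrap keeps loud simultaneous sources from producing harsh overflow artifacts.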
Graphics and audio processor 114 has the ability to communicate with various additional devices that may be present within system 50. For example, a parallel digital bus 130 may be used to communicate with mass storage access device 106 and/or other components. A serial peripheral bus 132 may communicate with a variety of peripheral or other devices including, for example:

a programmable read-only memory (PROM) and/or real time clock (RTC) 134,

a modem 136 or other networking interface such as a broadband adapter (which may in turn connect system 50 to a telecommunications network 138 such as the Internet or other digital network from/to which program instructions and/or data can be downloaded or uploaded), and

flash memory 140.

A further external serial bus 142 may be used to communicate with additional expansion memory 144 (e.g., a memory card) or other devices. Connectors may be used to connect various devices to busses 130, 132, 142. SDRAM audio memory 126 may be associated with a connector to which additional memory (e.g., semiconductor, optical, magnetic, etc.) may be connected. Other peripheral devices may also be connected to this connector, which constitutes a high speed parallel connector.

Example Graphics And Audio Processor

FIG. 3 is a block diagram of an example graphics and audio processor 114. Graphics and audio processor 114 in one example may be a single-chip ASIC (application specific integrated circuit). In this example, graphics and audio processor 114 includes:

a processor interface 150,

a memory interface/controller 152,

a 3D graphics processor 154,

an audio digital signal processor (DSP) 156,

an SDRAM audio memory interface 158,

an audio interface and mixer 1300,

a peripheral controller 162, and

a display controller 164.

3D graphics processor 154 performs graphics processing tasks. Audio digital signal processor 156 performs audio processing tasks. Display controller 164 accesses image information from main memory 112 and provides it to video encoder 120 for display on display device 56. Audio interface and mixer 1300 interfaces with audio codec 122, and can also mix audio from different sources (e.g., streaming audio from mass storage access device 106, the output of audio DSP 156, and external audio input received via audio codec 122). Processor interface 150 provides a data and control interface between main processor 110 and graphics and audio processor 114.

Memory interface 152 provides a data and control interface between graphics and audio processor 114 and memory 112. In this example, main processor 110 accesses main memory 112 via processor interface 150 and memory interface 152 that are part of graphics and audio processor 114. Peripheral controller 162 provides a data and control interface between graphics and audio processor 114 and various peripheral devices mentioned above. SDRAM audio memory interface 158 provides an interface with SDRAM audio memory 126.

Example Graphics Pipeline

FIG. 4 shows a more detailed view of an example 3D graphics processor 154. 3D graphics processor 154 includes, among other things, a command processor 200 and a 3D graphics pipeline 180. Main processor 110 communicates streams of data (e.g., graphics command streams and display lists) to command processor 200. Main processor 110 has a two-level cache 115 to minimize memory latency, and also has a write-gathering buffer 111 for uncached data streams targeted for the graphics and audio processor 114. The write-gathering buffer 111 collects partial cache lines into full cache lines and sends the data out to the graphics and audio processor 114 one cache line at a time for maximum bus usage.

Command processor 200 receives display commands from main processor 110 and parses them—obtaining any additional data necessary to process them from shared memory 112. The command processor 200 provides a stream of vertex commands to graphics pipeline 180 for 2D and/or 3D processing and rendering. Graphics pipeline 180 generates images based on these commands. The resulting image information may be transferred to main memory 112 for access by display controller/video interface unit 164—which displays the frame buffer output of pipeline 180 on display 56.
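As a minimal sketch of the write-gathering idea described above (not the actual hardware or any real API), small uncached CPU writes can be accumulated until a full cache line is available and then emitted one line at a time. The 32-byte line size and the send_line callback below are assumptions chosen for illustration.

```c
/* Sketch of write gathering: small writes are accumulated until a full
 * cache line is available, then pushed out one line at a time. */
#include <stdint.h>
#include <string.h>

#define CACHE_LINE_BYTES 32

typedef struct {
    uint8_t  line[CACHE_LINE_BYTES];
    unsigned fill;                               /* bytes gathered so far   */
    void   (*send_line)(const uint8_t *line);    /* e.g., one bus transfer  */
} write_gather_buf;

void wgb_write32(write_gather_buf *b, uint32_t word)
{
    memcpy(&b->line[b->fill], &word, sizeof word);
    b->fill += sizeof word;
    if (b->fill == CACHE_LINE_BYTES) {           /* full line: push it out  */
        b->send_line(b->line);
        b->fill = 0;
    }
}

void wgb_flush(write_gather_buf *b)              /* pad and push a partial line */
{
    if (b->fill) {
        memset(&b->line[b->fill], 0, CACHE_LINE_BYTES - b->fill);
        b->send_line(b->line);
        b->fill = 0;
    }
}
```

Sending only whole lines keeps each bus transaction full-sized, which is the stated goal of maximizing bus usage.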
Main processor 110 may store graphics command texture fog processing (600b) to achieve inverse range based streams 210, display lists 212 and vertex arrays 214 in main fog effects. Texture environment unit 600 can provide mul memory 112, and pass pointers to command processor 200 tiple stages to perform a variety of other interesting envi via bus interface 150. The main processor 110 stores graph 5 ronment-related functions based for example on color/alpha ics commands in one or more graphics first-in-first-out modulation, embossing, detail texturing, texture swapping, (FIFO) buffers 210 it allocates in main memory 112. The clamping, and depth blending. command processor 200 fetches: Pixel engine 700 performs depth (z) compare (700a) and command streams from main memory 112 via an on-chip pixel blending (700b). In this example, pixel engine 700 FIFO memory buffer 216 that receives and buffers the 10 stores data into an embedded (on-chip) frame buffer memory graphics commands for synchronization/flow control 702. Graphics pipeline 180 may include one or more embed and load balancing, ded DRAM memories 702 to store frame buffer and/or display lists 212 from main memory 112 via an on-chip texture information locally. Z compares 700a' can also be call FIFO memory buffer 218, and performed at an earlier stage in the graphics pipeline 180 vertex attributes from the command stream and/or from 15 depending on the rendering mode currently in effect (e.g., z vertex arrays 214 in main memory 112 via a vertex compares can be performed earlier if alpha blending is not cache 220. required). The pixel engine 700 includes a copy operation Command processor 200 performs command processing 700c that periodically writes on-chip frame buffer 702 to operations 200a that convert attribute types to floating point memory portion 113 of main memory 112 for access by format, and pass the resulting complete vertex polygon data 20 display/video interface unit 164. This copy operation 700c to graphics pipeline 180 for rendering/rasterization. A pro can also be used to copy embedded frame buffer 702 grammable memory arbitration circuitry 130 (see FIG. 4) contents to textures in the main memory 112 for dynamic arbitrates access to shared main memory 112 between graph texture synthesis effects. Anti-aliasing and other filtering can ics pipeline 180, command processor 200 and display con be performed during the copy-out operation. The frame troller/video interface unit 164. 25 buffer output of graphics pipeline 180 (which is ultimately FIG. 4 shows that graphics pipeline 180 may include: stored in main memory 112) is read each frame by display/ a transform unit 300, video interface unit 164. Display controller/video interface a setup/rasterizer 400, 164 provides digital RGB pixel values for display on display a texture unit 500, 56. a texture environment unit 600, and 30 a pixel engine 700. Example Input/Output Subsystem Transform unit 300 performs a variety of 2D and 3D FIG. 6 shows an example input/output subsystem. In this transform and other operations 300a (see FIG. 5). Transform example, the input/output subsystem includes a serial inter unit 300 may include one or more matrix memories 300b for face 1000, an external interface 1100, a disk interface 1200 storing matrices used in transformation processing 300a. 35 and audio interface 1300. 
Serial interface 1000 is used to Transform unit 300 transforms incoming geometry per ver communicate with controllers 52 or other devices that can be tex from object space to screen space; and transforms coupled to one of four serial ports of system 50. External incoming texture coordinates and computes projective tex interface 1100 is used to communicate with a variety of ture coordinates (300c). Transform unit 300 may also per devices such as PROM RTC 134, modem 136, flash memory form polygon clipping/culling 300d. Lighting processing 40 140, 144, etc. via various SPI buses 132, 142. 300e also performed by transform unit 300b provides per Disk interface 1200 is used to communicate with mass vertex lighting computations for up to eight independent storage access device 106 via a parallel bus 130. Audio lights in one example embodiment. Transform unit 300 can interface 1300 is used to stream the audio output data from also perform texture coordinate generation (300c) for an audio buffer in main memory 112 to audio codec 122. embossed type bump mapping effects, as well as polygon 45 In the example embodiment, the external interface 1100 clipping/culling operations (300d). and disk interface 1200 have direct access to memory Setup/rasterizer 400 includes a setup unit which receives controller 152 via a bus 900. In addition, each one of vertex data from transform unit 300 and sends triangle setup interfaces 1000, 1100,1200 and 1300 as well as audio digital information to one or more rasterizer units (400b) perform signal processor 156 share a common bus 902 used to ing edge rasterization, texture coordinate rasterization and 50 communicate between these components and a bus interface color rasterization. 904. The bus interface 904, in turn, can be used to arbitrate Texture unit 500 (which may include an on-chip texture access to graphics unit 180 including embedded DRAM memory (TMEM) 502) performs various tasks related to 702. In the example embodiment, there is also a connection texturing including for example: 906 between DSP 156 and audio interface 1300. retrieving textures 504 from main memory 112, 55 Briefly, disk interface 1200 provides an interface to mass texture processing (500a) including, for example, multi storage access device 106 providing a texture handling, post-cache texture decompression, capability with interrupt. Serial interface 1000 provides a texture filtering, embossing, shadows and lighting serial interface to hand controllers 52 or other serial devices through the use of projective textures, and BLIT with using automatic controller polling and bulk data mode alpha transparency and depth, 60 including a light gun interface. The external interface 1100 bump map processing for computing texture coordinate provides multiple serial peripheral interface (SPI) buses as displacements for bump mapping, pseudo texture and well as a memory mapped area for boot PROM 134. Audio texture tiling effects (500b), and interface 1300 provides an output to audio codec 122 as well indirect texture processing (500c). as an input for streaming audio from mass storage access Texture unit 500 outputs filtered texture values to the 65 device 106. These various interfaces 1000, 1100, 1200, 1300 texture environment unit 600 for texture environment pro provide a shared memory port for direct memory access, cessing (600a). Texture environment unit 600 blends poly with round robin arbitration for access to main memory. 
Example Detailed Overall System Embodiment/Implementation

FIG. 7 shows an example, detailed embodiment/implementation of system 50 shown in FIG. 2. In this example implementation:

A clock generator 1502 provides clocking signals to both main microprocessor 110 and to graphics and audio processor 114.

A 32-bit address bus and a 64-bit data bus connect the graphics and audio processor 114 with the main microprocessor 110.

Main memory 112 is implemented as a pair of 96-megabit 1T-SRAM chips manufactured by MOSYS, Inc.

A multi-pin modem connector 1514 (P6) is used to connect the graphics and audio processor 114 to an external or internal modem.

The boot ROM/real time clock 134 is coupled to the modem connector 1514 (P6) and shares its bus for communication with the graphics and audio processor 114.

Three multi-pin EXI connectors 1516 (P4), 1518 (P5), 1520 (P8) are used to connect the graphics and audio processor 114 to various internal and/or external peripheral or other devices.

A multi-pin serial connector 1510 is used to couple the graphics and audio processor 114 to four controller connectors 1523-0, 1523-1, 1523-2, 1523-3, each of which is connected to a different hand controller 52 or other external input/output device.

A multi-pin disk interface connector 1521 is used to couple the graphics and audio processor 114 to the optical disk drive 106.

The SDRAM 126 may be provided with a multi-pin expansion (high-speed parallel) connector 1549 (P10) that can be used to expand the 128 MB capacity of SDRAM 126 with an additional SDRAM expansion module 126a or to connect to other peripheral devices.

An analog audio/video connector 1550 having multiple pins communicates analog audio and video information between the graphics and audio processor 114 and external devices such as, for example, television sets, external display monitors, external loudspeakers or other input/output devices.

A digital audio/video connector 1552 having multiple pins makes the digital video and audio interface provided by graphics and audio processor 114 available to the outside world for connection to any of a variety of different digital video and/or audio devices.

In one example implementation, the following external connectors may be provided:

four game controller connectors (1523-0, 1523-1, 1523-2, 1523-3);

two memory card slots (S1, S2) associated with connectors 1516 (P4) and 1518 (P5);

one analog audio/video connector (1550);

one digital audio/video connector (1552);

two high speed serial connectors (1514 (P6) and 1520 (P8)); and

one high speed parallel connector (1549 (P10)).

In the example shown, each of connectors 1510, 1514, 1516, 1518, 1520, 1521, 1523-0 through 1523-3, 1549, 1550 and 1552 comprises a mating male and female multi-pin connector that allows connections to be made, broken and remade without destructive or permanent processes such as soldering or permanent (non-deformable) crimping. Use of such connectors allows simple and easy connection and disconnection between different modular portions of system 50 to provide an easy-to-manufacture system that can also be expanded, extended and reconfigured in the field without special equipment and involved processes.
FIGS. 8-14B show an example external view of system 50 within a housing H. Referring specifically to FIG. 8, housing H includes a front side F, a back side B, a top surface T and a bottom surface U. Disk drive 106 may be housed beneath a hinged, openable access cover C that allows the user to insert optical disks 62 into housing H. Controller connectors 1523-0, 1523-1, 1523-2, 1523-3 may be disposed on front surface F (as best shown in FIG. 9). Example connectors 1523 may each comprise a 10-pin female connector configured to accept a mating 10-pin male connector of appropriate configuration. Such a connector may, for example, couple system 50 to a wire-connected handheld controller 52, a light gun, or any other peripheral device. FIG. 10 shows an example peripheral device PD (in this case a receiver (e.g., an infrared or radio frequency receiver) for a wireless handheld controller 52—but it could be any sort of device) connected to connector 1523a and being supported by the connector configuration.

Referring now more particularly to FIG. 11, a digital audio/video connector 1552 and an analog audio/video connector 1550 may be disposed on rear surface B of housing H. Each of these connectors 1550, 1552 can output sound and picture signals. However, the digital connector 1552 provides access to the internal video and audio buses described above—thereby providing additional flexibility in terms of interfacing system 50 with sophisticated digital audio and/or video equipment. A connector CH of an appropriate configuration can be connected to the digital connector 1552, and the other end CH' of such a connector/cable can be connected to any sort of compatible external equipment. Similarly, a connector (not shown) can be coupled to the analog audio/video connector 1550 to provide analog audio and video information to a home television set, display monitor or other such equipment. A two-pin power supply connector 1590 may also be provided on rear panel B for inputting DC power to system 50. For example, an external 12 volt DC power supply can provide 12 volts DC to connector 1590, and internal voltage regulation circuitry can regulate this 12 volt DC power supply level down to appropriate lower intermediate voltage levels.

Referring to FIG. 12, the bottom surface U of housing H may include a number of recesses normally covered by covers (not shown). Removal of such covers exposes the recesses and associated connectors. For example, a recess R1 may include a "high speed port" connector such as the connector 1549 (P10), and a recess R2 may expose and provide connection to a modem connector 1514 (P6). A further recess R3 may expose and provide connection to an additional connector such as the connector 1520 (P8). As best seen in FIG. 13, peripheral devices PD1, PD2 and PD3 can be mechanically configured to fit dimensionally within corresponding recesses R1, R2, R3 so that such peripheral devices can be mounted flush within the generally cubic configuration of housing H. In this way, the footprint of the video game system can remain the same, even if peripheral devices PD1, PD2 and PD3 are added. Such peripheral devices PD1, PD2, PD3 may include a broadband adapter, a modem, or any other sort of electronic or electrical device providing data inputs and/or outputs. Devices PD1, PD2 and PD3 can be modular and can be inserted into or removed from a corresponding recess R at will to provide different expansion or other functionality for system 50. Of course, a connecting cable or wireless communications device could be coupled