
INTERNATIONAL ORGANISATION FOR STANDARDISATION
ORGANISATION INTERNATIONALE DE NORMALISATION
ISO/IEC JTC 1/SC 29/WG 11
CODING OF MOVING PICTURES AND AUDIO

ISO/IEC JTC 1/SC 29/WG 11 m23798
February 2012, San Jose, CA

Title: Additional Use Cases for ARAF
Authors: Marius Preda, Ivica Arsov (on behalf of the TOTEM project)

1 Mobile Mixed Reality/Augmented Reality Games use case

Augmented Reality (AR) and, more generally, Mixed Reality (MR) combine the real world with a virtual scene to create an immersive, interactive real-time environment. The possible applications are almost limitless and span areas such as learning, architecture and industrial production. In gaming, AR/MR offers unique opportunities. Unlike traditional video games, these games are not confined to the screen but interact with the real world. They incorporate real locations and objects into the game, tapping into the player's existing thoughts and emotions, which in turn creates a potentially much richer gaming world and experience. Mobile AR/MR gaming goes a step further: the playing area becomes borderless, and games can be played literally anywhere and at any time. With the advancement of modern cell phone technology, Mobile AR/MR games have the potential to be played by anyone.

2 Background and state of the art

Digital games employ digital technology as part of the game experience. In contrast to non-computer games, digital games "facilitate the play of games that would be too complicated in a non-computerized context" [Salen, 2003]. Game developers can create rich, interactive virtual worlds that real people can inhabit. Through digital technology and the possibility of representing complex behavior, computer games can deliver new and intense experiences to players. Mobile or pervasive games reach beyond desktop- or console-based gaming.
They overcome the spatial, temporal, and social boundaries of conventional computer games by making the real environment intrinsic to play [Montola, 2005]. They bridge the gap between traditional non-computer games and existing video games by offering direct social interaction [Magerkurth, 2005]. Thus, mobile games integrate the social components of traditional games into digital games, extending them into our everyday activities.

Location-based or location-aware games emphasize these aspects of pervasive games [Bell, 2006]. The game content and the player's interaction opportunities depend on the player's current real-world location. The user's local environment (including buildings, plants, people, and so on) might contain items relevant to game play. Moreover, interaction with some of these items might be part of the game. This interaction extends to other players, who might compete against one another or cooperate as part of the game when sharing a certain area, much like common real-world activities.

Finally, pervasive or mobile mixed reality (MR) games [Broll, 2006] are a distinctive kind of location-aware pervasive game: they use MR technology to enhance or modify the players' real environment with virtual content. Instead of being represented by an avatar, as in many traditional computer games, players are their own avatars, moving through the game world by moving through their real environment. Most pervasive MR games are played purely outdoors; however, some facilitate combined indoor and outdoor activities. Pervasive MR games typically combine real and virtual content, storytelling, and the user's imagination to create a new type of entertainment experience. Such systems have the potential to create an ultimate gaming experience [Stapleton, 2003].

One of the first mobile MR (or Augmented Reality) games was ARQuake, developed in 2002 [Piekarski, 2002]. ARQuake is the AR version of the famous first-person shooter Quake.
The aim is identical to the desktop version: the player shoots monsters and collects items. In the AR version, players wear an AR system consisting of a laptop (carried in a backpack) attached to several sensors for position and orientation tracking, such as a differential GPS, as well as a camera for marker-based computer vision tracking.

In 2004, another AR variant of a famous computer game was developed. The story of Human Pacman [Cheok, 2004] is the same as that of the original arcade game Pacman: Pacman (represented by one player) has to collect all the cookies in the world before he is caught by a ghost (represented by another player). The game's system consists of a central server, a wearable client system, a Bluetooth device, and a helper laptop. The client and server communicate via Wi-Fi, and the client is connected to the physical devices via Bluetooth. The user's viewing orientation is tracked by a 3-degrees-of-freedom orientation sensor built into the helmet, and the position is determined via GPS. A special sensor detects whether the player is touched by the enemy player.

NetAttack [Broll, 2006], also developed in 2004, is a scavenger hunt game. It enhances the basic idea and technique of ARQuake with both outdoor and indoor game play. The outdoor player wears a mobile AR system and is tracked by GPS and marker-based computer vision tracking. The outdoor player is supported by an indoor player: the partner inside, sitting at a desktop-based survey system, communicates with the outdoor player via Wi-Fi, guides him, and helps him find game items.

In 2004, AR Soccer [Reimann, 2004] was the first handheld-based AR game. AR Soccer is a computer vision-based smart phone game in which players shoot a penalty by kicking a virtual ball into a virtual goal with their real feet. The smart phone's camera tracks the player's foot. Instead of an optical flow algorithm, the foot is detected and tracked using a 2D edge extraction approach.
In 2005, AR Tennis [Henrysson, 2005] and The Invisible Train [Wagner, 2005] were developed as marker-based games on handheld devices. AR Tennis is a multiuser smart phone game that also uses computer vision-based tracking. In AR Tennis, the smart phones are equipped with markers that can be detected by the cameras of the other players' phones. The tracking data is transmitted via a peer-to-peer Bluetooth connection, enabling the two players to play tennis virtually with their smart phones. The aim of the multiuser game The Invisible Train is to steer a train over a wooden railroad track. The player can interact via the touch screen by changing the speed of the trains and the switches. The Invisible Train is a synchronized multiuser game in which PDAs are connected via Wi-Fi. The tracking is realized by a marker-based computer vision approach.

In Capture the Flag [Cheok, 2006], developed in 2006, a team catches the opponent's flag, which is represented by a small wooden box. This box is equipped with a Bluetooth device and a touch sensor. The touch sensor detects whether the participants pick up or move the box. The box and the smart phone are connected via Bluetooth.

Mr. Virtuoso is an educational game played as video see-through AR on a PDA [Wagner, 2006]. The objective is to sort artworks along a timeline according to their date of creation. The timeline consists of fiducial markers, each carrying one artwork. The virtual expert Mr. Virtuoso provides information about the artworks. The PDAs are synchronized via Wi-Fi using a server.

In 2007, Epidemic Menace [Lindt, 2007], the successor of NetAttack, was developed. Epidemic Menace uses multiple devices, including PDAs and smart phones. It also takes place both indoors and outdoors, and the players communicate via smart phones. The smart phones display a small map of the game area and the user's current position. The PDA supports position tracking.
The outdoor players are optionally equipped with mobile AR systems.

TimeWarp [Herbst, 2008] is a mobile Mixed Reality game set in the old town of Cologne. Players travel in time and space to meet and help the legendary Heinzelmännchen elves, interacting with virtual and real objects alike. The Alchemists [Wetzel, 2008] takes a simpler approach: players collect magical ingredients from the environment to brew potent potions (see Figure 2). What makes this game unique is that it was built with the so-called "Magic Lens Box", an attempt at providing an authoring environment for Augmented Reality games.

The input features exploited in the games presented above are the location/orientation (obtained from a GPS system, or from NFC tags/QR codes spread over the field, …), the accelerometer, the Bluetooth/GSM/Wi-Fi connection, and the image/video captured by the mobile phone's camera. One objective of the project is to build, based on existing APIs, a unified framework to manage and process these inputs.

A second category of information ready to be exploited in mobile mixed reality games is the user/player context. It is used to trigger the right service/action/reaction at the right moment and is at the heart of two very active research subjects: Pervasive Media and Autonomic Computing. Context is derived from sensors or from digital resources (presence services such as Cell ID, user input, …) and may refer to physical, virtual, or social circumstances. Existing games such as ContextKing [Koolwaaij, 2009] already exploit context management frameworks [Böhm, 2008] for the collection, aggregation, and distribution of context information. For localization data, as the main context feature, several solutions have been implemented to ensure its computation whatever type of sensor is used [Dey, 2001; Roman, 2002; Capra, 2003; DaRocha, 2006; Julien, 2006].
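Such a unified input framework can be pictured as a thin publish/subscribe layer that normalises heterogeneous sensor readings (GPS, accelerometer, scanned QR codes, …) into a common event envelope before they reach the game logic. The sketch below is purely illustrative; the class and event names are assumptions for this document, not part of any project API:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class InputEvent:
    """A sensor reading normalised into a common envelope."""
    source: str    # illustrative source tags, e.g. "gps", "accelerometer", "qr-code"
    payload: dict  # sensor-specific data


class InputFramework:
    """Routes heterogeneous sensor inputs to game-logic subscribers."""

    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[InputEvent], None]]] = {}

    def subscribe(self, source: str, handler: Callable[[InputEvent], None]) -> None:
        """Register a handler interested in one kind of sensor input."""
        self._subscribers.setdefault(source, []).append(handler)

    def publish(self, event: InputEvent) -> None:
        """Deliver a sensor event to every handler subscribed to its source."""
        for handler in self._subscribers.get(event.source, []):
            handler(event)


# Usage: game logic reacts uniformly to a GPS fix, regardless of which
# underlying API produced it.
framework = InputFramework()
positions: List[dict] = []
framework.subscribe("gps", lambda e: positions.append(e.payload))
framework.publish(InputEvent("gps", {"lat": 50.94, "lon": 6.96}))
```

The point of the abstraction is that adding a new input modality (say, NFC tags) only requires publishing events under a new source tag; existing game logic is untouched.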
To improve accuracy or to provide higher-level interpretation, sensor fusion is also an active domain reported in the literature [Fox, 2001]. Some commercial efforts, while basic today and mainly based on self-reporting, are available to collect low-level (e.g., location) or high-level (e.g.
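The sensor-fusion idea mentioned above can be illustrated with a minimal inverse-variance weighting sketch, one of the simplest fusion schemes: each sensor's estimate is weighted by how confident that sensor is. The function name and the variance figures are illustrative assumptions, not values from the project:

```python
from typing import List, Tuple


def fuse(estimates: List[Tuple[float, float]]) -> Tuple[float, float]:
    """Inverse-variance weighted fusion of scalar position estimates.

    estimates: (value, variance) pairs from independent sensors.
    Returns the fused value and its (reduced) variance.
    """
    weights = [1.0 / var for _, var in estimates]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / total
    return value, 1.0 / total


# Illustrative example: a GPS fix (variance 25 m^2) fused with a coarser
# Wi-Fi position estimate (variance 100 m^2) along one axis.
value, variance = fuse([(10.0, 25.0), (18.0, 100.0)])
# The fused estimate lies closer to the more reliable GPS reading,
# and its variance is lower than either input's.
```

More elaborate approaches (Kalman or particle filters, as surveyed in [Fox, 2001]) follow the same principle over time-varying state.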