Live-action Virtual Reality Games

Luis Valente1, Esteban Clua1, Alexandre Ribeiro Silva2, Bruno Feijó3

1MediaLab, Institute of Computing, UFF, Brazil
2Instituto Federal do Triângulo Mineiro, Brazil
3VisionLab, Department of Informatics, PUC-Rio, Brazil

{lvalente,esteban}@ic.uff.br, [email protected], [email protected]

Abstract. This paper proposes the concept of “live-action virtual reality games” as a new genre of digital games based on an innovative combination of live-action, mixed-reality, context-awareness, and interaction paradigms that comprise tangible objects, context-aware input devices, and embedded/embodied interactions. Live-action virtual reality games are “live-action games” because a player physically acts out (using his/her real body and senses) his/her “avatar” (his/her virtual representation) in the game stage – the mixed-reality environment where the game happens. The game stage is a kind of “augmented virtuality” – a mixed-reality where the virtual world is augmented with real-world information. In live-action virtual reality games, players wear HMD devices and see a virtual world that is constructed using the physical world architecture as the basic geometry and context information. Physical objects that reside in the physical world are also mapped to virtual elements. Live-action virtual reality games keep the virtual and real worlds superimposed, requiring players to physically move in the environment and to use different interaction paradigms (such as tangible and embodied interaction) to complete game activities. This setup enables the players to touch physical architectural elements (such as walls) and other objects, “feeling” the game stage. Players have free movement and may interact with physical objects placed in the game stage, implicitly and explicitly. Live-action virtual reality games differ from similar game concepts because they sense and use contextual information to create unpredictable game experiences, giving rise to emergent gameplay.

Keywords: live-action virtual reality games, mixed-reality, virtual reality, context-awareness, tangible interaction, embodied interaction, context-aware interaction, smart objects.

1 Introduction

In the past, virtual reality hardware (such as head-mounted display devices) has been restricted mostly to research labs due to factors such as its high cost, the inconvenience of using this hardware (e.g., heavy helmets, lots of cables required, limited mobility), and application specificity (e.g., high-end VR systems, military applications). One of the main goals in virtual reality applications is to immerse the user’s senses in an artificial virtual environment (VE) through an interactive experience. A key factor in how successful this interactive immersive experience is refers to the sense of presence [1]. Recently, there is a growing trend in the industry to bring these kinds of devices to the mass market (e.g., Oculus Rift, Samsung Gear VR, HTC Vive), with affordable prices and small form factors, which opens up possibilities for using these devices in mainstream games and entertainment applications.

The advancement of virtual-reality technologies and context-awareness gives rise to new possibilities of game genres. In this regard, in this paper we present a new game genre based on unique combinations of live-action, mixed-reality, context-awareness, and interaction paradigms comprising tangible objects, context-aware input devices, embedded interactions, and embodied interactions. We call this new genre “live-action virtual reality games”.

We say that live-action virtual reality games are “live-action games” because a player physically acts out his/her “avatar” (his/her virtual representation) in the game stage – the mixed-reality environment where the game happens. The game creates the game stage based on several real-world information sources, such as the physical environment architecture and context information. Live-action virtual reality games keep the virtual and real worlds superimposed as the player experiences the game. The player sees a virtual world through an HMD device (in first-person perspective) and needs to move and walk in the physical environment to play the game. Also, the player may need to use gestures and other body movements in interactions with game elements and other players.
The player may need to interact with physical objects to complete game tasks. Live-action play in live-action virtual reality games is facilitated through lightweight and mobile HMDs, mobile computing, and context-awareness.

This paper is organized as follows. Section 2 presents the live-action virtual reality games vision and its main concepts. Section 3 provides related work, which compares live-action virtual reality games to other types of games based on mixed-realities and to initiatives originated in industry and academia. Section 4 presents concluding remarks.

2 Live-action virtual reality games

This section presents key concepts of live-action virtual reality games, which are: mixed-reality and the game stage, information flow, context-awareness, and the interaction paradigms that live-action virtual reality games provide. This section also discusses how a live-action virtual reality game may stimulate the sense of presence.

2.1 Mixed-reality

Live-action virtual reality games create a mixed-reality environment, which we name the game stage. A “mixed-reality environment” can be defined as an environment that fuses virtual and real-world (physical) information. Figure 1 illustrates a simplified “reality-virtuality” continuum that Milgram and co-authors [2] proposed. This continuum represents a spectrum describing possible combinations of virtual and real elements, resulting in different kinds of “mixed-realities”.

Figure 1. Simplified reality-virtuality continuum, extracted from [2]

The game stage is a kind of “augmented virtuality” because the virtual world is constructed through real-world information sources and enriched through virtual and real-world information. We roughly categorize these real-world information sources as “physical structure information” and “context information”. Live-action virtual reality games output information in the virtual and physical worlds. Physical world output can take form as real-world effects, such as manipulating environment properties (e.g., temperature), generating vibrations, and outputting smells.

2.2 Physical structure as input

A live-action virtual reality game uses physical structure information as basic building blocks to create the game stage. When a player enters the physical environment where the game happens (e.g., a room), the game tracks all architectural elements (such as walls) to use as the basic geometry to create a virtual 3D environment. After creating the virtual world structure, the game may augment this raw structure with virtual content. Metaphorically, we compare this process to the process of mapping images (textures) to raw polygon data in 3D environments. The game keeps the virtual and real worlds superimposed as the player moves in the environment, creating the game stage.

The game may also track physical objects that are part of this physical environment and map them in the virtual world as 3D models. Examples of these physical objects are furniture, interactive objects carried by a player, and players’ own bodies. However, the player does not see the physical world and physical objects – the player only sees virtual representations through the HMD. The virtual world building process may happen in real-time or before the game session (as a preparation step). In this preparation step, the game system may track all relevant physical features to generate a virtual world to be optimized later by an artist, who may augment the virtual world with other virtual elements.
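As a rough sketch of this “texture mapping” metaphor (our illustration; the class and theme names are hypothetical and do not come from any specific engine), tracked architectural elements could be turned into themed virtual surfaces that preserve the physical coordinates, so the virtual geometry stays superimposed on the real one:

```python
from dataclasses import dataclass
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TrackedPlane:
    """An architectural element (e.g., a wall) reported by the tracking system."""
    corners: List[Vec3]  # physical corner positions, in meters
    kind: str            # "wall", "floor", "ceiling", ...

@dataclass
class VirtualSurface:
    """The same geometry, dressed with game-specific virtual content."""
    corners: List[Vec3]
    texture: str

# Hypothetical theme table mapping physical element kinds to virtual dressing.
THEME = {"wall": "dungeon_brick", "floor": "stone_tiles"}

def build_game_stage(planes: List[TrackedPlane]) -> List[VirtualSurface]:
    # Keep the physical corners untouched: the player can then touch a real
    # wall exactly where the virtual wall appears to be.
    return [VirtualSurface(p.corners, THEME.get(p.kind, "undecorated"))
            for p in planes]

room = [
    TrackedPlane([(0, 0, 0), (4, 0, 0), (4, 0, 3), (0, 0, 3)], "wall"),
    TrackedPlane([(0, 0, 0), (4, 0, 0), (4, 5, 0), (0, 5, 0)], "floor"),
]
stage = build_game_stage(room)
```

The essential design constraint is that the virtual dressing never alters the tracked geometry, since the physical world is what the player actually walks through and touches.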
2.3 Context information as input

Context-awareness is a key component of live-action virtual reality games. Dey [3] defines context as “any information that can be used to characterize the situation of an entity. An entity is a person, place, or object that is considered relevant to the interaction between a player and an application, including the user and applications themselves”. A system can be considered “context-aware” if “it uses context to provide relevant information and/or services to the user, where relevancy depends on the user’s task” [3]. In this regard, “context-awareness” means that a live-action virtual reality game is able to adapt the gameplay according to the current context conditions. Context-awareness enables live-action virtual reality games to collect information from the real world and from players (while the game session is happening), using these data as input information. Live-action virtual reality games use context information for two main purposes:

1. Augment the game environment with dynamic information that originates in the real world;
2. Generate game content dynamically to create unpredictable game experiences, giving rise to emergent gameplay.

Here are some examples of information that we consider as “context” in live-action virtual reality games (this list is by no means exhaustive):

• Physical environment information. Examples include temperature, humidity, lighting conditions, time of the day, weather information, and environmental noise;
• Existing research on human-computer interaction [4] suggests that the concept of context can be broad enough to consider the ways a user interacts with a device as context information. In this regard, we include as context information the interactions with input devices and with the physical environment where the game takes place;
• Player information. Examples include physiological state, personal preferences, social network profiles, and personality traits;
• Information derived from the possible forms of relationships and interactions among players – the social context [5].

A live-action virtual reality game is able to sense context information through several means. Some possible alternatives are:

1. The player carries or wears devices that sense context from the environment and/or from the player (e.g., physiological information);
2. The physical place where the game happens may host customized devices that are equipped with sensors – the smart objects. Smart objects may be connected to other smart objects and to central game servers. These devices may provide interaction interfaces to the players (Section 2.5) and may be able to output information (such as audio and images) and create real-world effects (Section 2.4);
3. The physical place may be augmented with dedicated infrastructure that enables the game to sense context information. Examples include installing tracking technology and sensors to collect environmental information. In this solution, these objects are hidden (“invisible”) from players, being part of the overall game infrastructure. On the other hand, smart objects are complex, visible physical objects that reside inside the game stage;
4. The game queries remote information about the player based on the player identity (e.g., social network profiles).

2.4 Information output and real-world effects

Live-action virtual reality games may output information through wearable devices, mobile devices, and environment devices. These devices may output digital media information and create real-world effects (e.g., vibrations, smells, smoke). These devices may be connected to other devices that comprise the game infrastructure, such as input devices, game servers, and other output devices. These devices may also be input devices and context-aware.

The wearable devices are devices worn by a player, which may also work as input devices capable of sensing context information. Examples include HMDs, isolating or non-isolating headphones, gloves, vests, and clothes. These devices might provide haptics feedback as output, for example.

The mobile devices are devices that a player may carry around, touch, and throw away. These devices may output digital information and may be equipped with actuators, motors, and other devices capable of generating real-world effects.

Environment devices are objects that reside in the physical environment and generate effects or behavior in the physical world. Some possibilities for environment devices are (this is by no means an exhaustive list):

1. Simple output devices, not limited to digital media (e.g., audio speakers, smell generators, weather effects, heat generators, smoke generators);
2. Mechanical or electrical devices equipped with actuators (e.g., sliding doors, elevators, wind generators);
3. Smart objects (Section 2.3) equipped with output hardware. An example is the trash can in Fun Theory [6].
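Putting context input and real-world output together, sensed context can drive what the output devices do during a session. A minimal sketch (the device and field names are hypothetical; a real system would sit on top of actual sensor and actuator APIs):

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class ContextSnapshot:
    """Context information sensed during a game session (Section 2.3)."""
    temperature_c: float
    players_nearby: int

@dataclass
class OutputDevice:
    """A wearable, mobile, or environment device that creates real-world
    effects (Section 2.4). Here the effect is only recorded."""
    name: str
    effects: List[str] = field(default_factory=list)

    def emit(self, effect: str) -> None:
        self.effects.append(effect)

def adapt_outputs(ctx: ContextSnapshot, devices: Dict[str, OutputDevice]) -> None:
    """Adapt gameplay output to the current context conditions."""
    if ctx.temperature_c < 15.0:
        devices["vest"].emit("heat_feedback")           # wearable output
    if ctx.players_nearby >= 2:
        devices["room_speaker"].emit("crowd_ambience")  # environment device

devices = {"vest": OutputDevice("vest"),
           "room_speaker": OutputDevice("room_speaker")}
adapt_outputs(ContextSnapshot(temperature_c=12.0, players_nearby=3), devices)
```

Because the same context snapshot can fan out to several device classes, gameplay adaptation stays decoupled from any single piece of hardware.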
2.5 Interaction possibilities

In live-action virtual reality games, players interact with the game and with other players through multiple modalities (e.g., voice, body movements, and gestures), ordinary physical objects, customized input devices, and context-aware devices, supporting tangible, embedded, embodied, and context-aware interaction paradigms.

The player may interact with the game by manipulating mobile devices equipped with sensors and ordinary physical objects (e.g., wood sticks, rocks). The mobile devices may be equipped with networking capabilities. These objects and devices realize the “tangible object metaphor”, which stands for “human-machine interfaces that couple digital information to physical objects” [7]. These interfaces can be implemented through tracking technologies or sensors. Examples of suitable sensors for this purpose are RFIDs, magnetic sensors, proximity sensors, and light sensors. For example, in a general context, tangible objects could be pieces of augmented table-top games [8]. Another example is using wood sticks (an ordinary object) that the game tracks to represent swords (the virtual representation in a game).

Tangible objects may also have embedded computing capabilities. Existing research suggests that tangible objects contribute to increasing player immersion in games [9–11]. In live-action virtual reality games, players may touch, manipulate, and throw these objects and devices (among other possibilities) to interact with the game. The player does not see these objects and devices directly – the game presents their virtual representation to the player through the HMD. These objects and devices may be carried by the player or may be spread in the game stage. For example, a game may present a puzzle for the player to solve that requires finding an object in the environment and placing it on top of another object or device that resides in the environment.

Live-action virtual reality games may use customized input devices or general purpose controllers as interaction interfaces. These devices may include joysticks and custom hardware that features a number of affordances that are specific for the game. An example of a customized controller is a controller modeled as a gun. An example of a general purpose controller is the one described in [12].

Another possibility for interaction in live-action virtual reality games is through wearable devices. These are devices worn by a player, providing embedded, embodied, and context-aware interaction. Examples of these devices include smart bands, brain-machine interfaces, and similar devices. Wearable devices usually are context-aware and may be able to sense the player’s physiological state to use as contextual input information.

Players in live-action virtual reality games have to physically walk to move in the mixed-reality environment to complete game tasks. In this regard, players act out their own avatar, which is a form of full body interaction. Full body interactions in live-action virtual reality games may or may not be mediated by wearable devices. With the expression “full body interactions not mediated by wearable devices”, we refer to interaction paradigms made possible by devices such as the Kinect sensor [13].

Another way to interact in live-action virtual reality games takes form as smart objects (Section 2.3). Smart objects enable interactions of an “implicit nature” [14]. In implicit interactions, the interaction interface is invisible (i.e., based on sensors). Schmidt [15] defines an implicit interaction as “an action, performed by the user that is not primarily aimed to interact with a computerized system but which such a system understands as input”. For example, a game stage may host a smart object containing a proximity sensor that opens a door when someone gets close to it. Implicit interactions occur inadvertently from the player’s point of view. On the other hand, explicit interactions occur through direct (conscious) player commands, meaning that the player has a conscious intention to start something to achieve a goal. As the player does not see the physical world, live-action virtual reality game designers should be careful when designing implicit interactions to avoid accidents that might injure players.
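The proximity-sensor door is easy to sketch. In the toy version below (hypothetical names; a real smart object would involve actual sensor hardware and networking), the sensor reading itself is the input – the player issues no command at all:

```python
class SlidingDoor:
    """Actuated environment device (Section 2.4)."""
    def __init__(self) -> None:
        self.is_open = False

    def slide_open(self) -> None:
        self.is_open = True

class ProximitySmartObject:
    """Smart object that turns an unintentional player action (walking
    close) into game input -- an implicit interaction (Section 2.5)."""
    def __init__(self, door: SlidingDoor, trigger_distance_m: float = 1.5) -> None:
        self.door = door
        self.trigger_distance_m = trigger_distance_m

    def on_distance_reading(self, distance_m: float) -> None:
        # No explicit command is involved: the sensor reading is the input.
        if distance_m <= self.trigger_distance_m and not self.door.is_open:
            self.door.slide_open()

door = SlidingDoor()
sensor = ProximitySmartObject(door)
sensor.on_distance_reading(3.0)  # player still far away: nothing happens
sensor.on_distance_reading(1.2)  # player walks close: the door opens implicitly
```

The safety caveat above applies directly here: because the player only sees the virtual representation of the door, the trigger distance and actuation speed would have to be chosen so the physical door never moves into a player who cannot see it.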
2.6 Mixed-reality infrastructure and management

A game stage requires a dedicated physical place with a customized physical installation, due to the infrastructure required to support the mixed-reality in live-action virtual reality games. Before game sessions take place, the required infrastructure is deployed to the physical environment as a preparation step. Live-action virtual reality game designers and developers may exploit this requirement to their advantage by deploying custom hardware (such as smart objects) in the environment that otherwise would not be generally accessible to end users, which might help in creating more sophisticated game experiences. Using a dedicated, customized installation also helps to implement an “uncertainty handling policy” [16] to remove (or minimize) problems caused by limitations of the involved technologies, such as networking and sensors.

In live-action virtual reality games, the problems of “tracking physical world features” and “keeping the virtual world properly overlaid on the physical world (e.g., synchronized)” are of central importance. Traditional computer vision techniques (such as the ones based on QR codes) do not have the required accuracy to address these problems. So, live-action virtual reality games usually require other solutions based on other kinds of sensors and tracking techniques.

Live-action virtual reality games may require ongoing supervision while game sessions happen. This supervision, also known as “orchestration” [17], aims to anticipate and/or correct technical issues that may happen during a game session, so that the game experience is not broken due to these problems. Orchestration can also be used to help players who are experiencing difficulties in playing the game.
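As one concrete example of such a policy (our illustration, not a policy taken from [16]), a game could reject physically implausible tracker readings so that the virtual world does not visibly “jump” off the physical one:

```python
import math

def corrected_pose(tracked, previous, max_speed_m_s=5.0, dt_s=1 / 60):
    """Keep the last trusted position when a tracking sample implies an
    impossible player movement (sensor glitch, lost marker, etc.)."""
    jump = math.dist(tracked, previous)
    if jump > max_speed_m_s * dt_s:   # farther than a person could move in dt
        return previous               # discard the noisy sample
    return tracked

# A plausible small step is accepted; a 2-meter teleport in one frame is not.
ok = corrected_pose((0.05, 0.0, 0.0), (0.0, 0.0, 0.0))
glitch = corrected_pose((2.0, 0.0, 0.0), (0.0, 0.0, 0.0))
```

A production system would smooth or re-acquire rather than simply hold the last pose, but the principle is the same: uncertainty in the sensing layer is absorbed before it reaches the player’s view.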
Section 3.1 sues that may happen during a game session, so that also compares live-action virtual reality games with the game experience is not broken due to these “alternate reality games”, due to the idea that live- problems. Orchestration also can be used to help action virtual reality games transport the player to a players who are experiencing difficulties in playing “different, alternate” reality – we believe that the the game. term “” might be misleading at first sight for some readers. Section 3.2 compares live-action virtual reality games with existing initia- 2.7 Presence and sense stimulation tives found in the industry and academia. Presence can be commonly defined as “a sense of being in a VE rather than the place in which the 3.1 Games based on mixed-reality and participant’s body is actually located” [1]. Impor- virtual reality tant aspects of presence include the extend of the user field of view, number of sensory systems stim- This section presents some earlier virtual reality ulated by the VR system, the quality of rendering in games, pervasive games, and alternate reality each sensory modality, the quality of tracking, games. among other aspects [1]. Sanchez-Vives and Slater [1] refers to “immersion” as “the technical capa- bility of the system to deliver a surrounding and 3.1.1 Early virtual reality games convincing environment with which the participant The Virtuality Group [19] provides earlier exam- can interact”. Immersion and presence are related ples (in 1990) of using VR in consumer games, as concepts, but they are not synonyms. However, the virtual reality arcade machines. These machines literature on these terms is confusing as work by contained a HMD, speakers, microphone, and joy- Schuemie and coauthors [18] demonstrate. In live- sticks. That system was able to track head and joy- action virtual reality games, a player experiences stick movements through a magnetic tracking sys- the constructed mixed-reality with at least five tem. 
Games released on this platform include Dactyl Nightmare [20] and Grid Busters [21]. Another attempt at bringing VR-based games into the mainstream came from Nintendo and its Virtual Boy [22] in 1995. Although the system was original, it was a commercial failure. King and Krzywinska [23] suggested that the high price and the physical discomfort while playing contributed to the demise of this device.

The earlier virtual reality games focused mainly on immersing the user’s visual sense in the virtual world. The users interacted in the virtual environment with limited mobility – physically walking in the environment was not possible, as the equipment running the simulation was not portable.

3.1.2 Augmented reality games

Live-action virtual reality games are fundamentally different from games based on “augmented reality” (left side of Figure 1), because in live-action virtual reality games the player does not see the physical world. In augmented reality games the player sees the physical world (through a HMD or a mobile device camera) and virtual contents that are placed on top of physical world elements. On the other hand, in live-action virtual reality games the player sees a virtual world that is built based on physical world characteristics, including its architectural elements and contextual information. A classic reference on augmented reality can be found in [24].

A classic, early example of an augmented-reality game is ARQuake [25], an indoor and outdoor game where the player wears a backpack containing a laptop, an orientation sensor, and a GPS sensor, which allows the player to walk freely in a physical environment. The player uses a see-through HMD that is connected to these devices. The ARQuake tracking system is based on GPS, a digital compass, and fiducial markers. The game takes place in a physical environment that is modeled as a Quake 3D level. When the game is played in the real world, ARQuake uses this 3D model to superimpose virtual world information onto the physical world.

3.1.3 Pervasive games

The term “pervasive games” refers to games that are played in the real world, exploring mobility, mixed-realities, and context-awareness, among other aspects. This description of pervasive games is not a consensus in the literature [26].

This kind of game aims to bring the game experience out of a game device into the real world. According to the Oxford English Dictionary [27], “pervasive” means: “having the quality or power of pervading; penetrative; permeative; ubiquitous”. This may suggest that “pervasive games” are games “pervading something” (the real world perhaps) or spread somewhere. In games, pervasiveness can be recognized every time the boundaries of playing expand from the virtual (or fictional) world to the real world, creating mixed-realities. The literature presents various interpretations and scopes for defining what “pervasive games” mean. These interpretations can be roughly divided into “cultural perspectives” (e.g., game studies, which take “pervasive” in its literal sense) and “technological perspectives” (e.g., ubiquitous and pervasive computing). There are some works in the literature that analyze existing interpretations of pervasive games [26, 28]. Also, in the literature there are other terms that are used as synonyms of “pervasive games”, such as “mixed-reality games” and “ubiquitous games”.

From a technology perspective, a pervasive game can be considered as a context-aware system that has these characteristics:

• The game is constantly coming back to the real world, which means that the game is played in physical places and it is not constrained to stationary computing devices;
• The physical world (places, objects) is part of the game and it is combined with the virtual world, creating a mixed-reality;
• Context-awareness is always existent and it is created through pervasive computing technologies (e.g., sensors, context-aware systems);
• The spatial mobility occurs in a physical “open” environment, that is, the “game world boundary” is not “well-defined”, and sometimes it can be unconstrained;
• The players use mobile devices (e.g., smartphones, tablets, custom hardware) to interact with the game and with other players;
• The game may last for several days or weeks (or more), blending with the daily lives of players. The game may define a game world that progresses without player intervention. If some important event happens in the game, the game may notify the player to take action. These aspects are not mandatory;
• The game may focus on promoting social interaction among players. Social interaction in pervasive games may happen directly (face-to-face interaction) or indirectly (mediated through technology). This aspect is not mandatory, as pervasive games may be single-player games.

Like live-action virtual reality games, pervasive games also define a mixed-reality. Pervasive games may also use the elements that we described in Sections 2.2 to 2.6. However, the mixed-reality in pervasive games is of a different nature than the one found in live-action virtual reality games. Pervasive games are based on the idea of a real world augmented with virtual content. This augmentation takes form as using context information collected through sensors placed in the physical environment or in mobile devices that the players use.
This augmentation may also take form as “augmented reality” (Section 3.1.2). The reader can be referred to [29] for a report describing some pervasive games.

3.1.4 Alternate reality games

The term “alternate reality game” (ARG) may suggest that these games bring the player to a kind of reality that is very different from a real-world setting. This idea is metaphorically similar to some of the ideas in live-action virtual reality games, but these game styles are very different. ARGs suggest a surrealistic setting where the game denies its existence “as a game”. The main slogan of those games is “this is not a game”. ARGs use the real world as a platform and create a comprehensive interactive narrative, as massive puzzles that span different media such as websites, emails, and phone calls. Game masters create (real-world and virtual) content and steer the story according to players’ reactions. The game is purposely ambiguous, so that players always question if the game activities are indeed part of the game, or part of real-world life. This includes discovering how to enter the game and guessing if it is over.

An example of an early ARG is The Beast [30], which was part of a marketing campaign for the movie A.I. [31]. The game began with a question, “Who killed Evan Chan”, and then evolved to an interactive story that had been deployed over the Internet and the real world. The game itself was not advertised as a game, and its entry-point was hidden in A.I. movie trailers and posters. After following the clues, the player could access “real-world elements” (like voice mails from the game) that opened up the gate for the storyline. The game designers created fake websites and other content to support the game through puzzles and other interactions. Also, sometimes the game would make phone calls to the players. Some researchers classify ARGs as a kind of “pervasive game” [32, 33]. The reader should refer to [34] for more information about ARGs.

3.2 Related initiatives

This section presents some industry and academic initiatives related to live-action virtual reality games, ending with some comparisons among them.

3.2.1 Industry initiatives with similar concepts

There are some industry and academic initiatives that have similarities with the concept of live-action virtual reality games that we present in this paper. These initiatives seem to be “works in progress”, as information about their current status (especially concerning technical details) is scarce. Also, it seems that these initiatives have not debuted yet.

The first one is “The VOID” [35], a project envisioned by Ken Bretschneider that is planned to launch in 2016. The VOID defines a mixed-reality based on the real-world structure, where the players have to walk to move in the mixed-reality. The environments where the game happens are named “gaming pods”. They intend to provide different kinds of sensations for players, including “elevation changes, touching structures & objects, feeling vibrations, air pressure, cold & heat, moisture, simulated liquids and smell” [36]. Players will use a proprietary HMD and wear proprietary vests and gloves, both capable of providing haptics feedback. The players do not see the physical world.

Another related initiative comes from Kenzan [37], who defined their vision as “real virtuality”. In the vision by Kenzan, players wear HMDs and have their body and movements captured by motion capture devices installed in the physical environment installation. The players also have to walk physically to move in the mixed-reality environment. It seems that players may also interact with physical objects inside the environment, while the game displays their virtual representation (similar to the idea of interacting with physical objects described in this paper).

Survios’ ActiveVR [38] is another related vision, which is based on six elements: immersion (“the perception that a digital environment really exists”), presence (“the perception that you exist within a digital environment”), embodiment (“the perception that you are physically interacting with a digital environment”), free movement (“being completely unbound from physical restrictions in the outside world”), shared space (“the perception that you are naturally communicating with other people in a digital environment”), and dynamic spectating (“being able to view and interact with people playing in VR from the outside world”). At the time of writing this paper, Survios’ website does not provide more information on how they implement this vision. Survios seems to have originated from “Project Holodeck” [39]. The game demo Zombies on the Holodeck [40] from Project Holodeck features users wearing an HMD device and interacting with the environment using wired Razer Hydra controllers. The demo uses these controllers to create a tangible interface. The Razer Hydra controllers are able to detect motion and orientation through embedded sensors.

VRcade [41] envisions a system with “immersive head mounted displays, motion capture systems, and real-time 3D engines” that takes place in “out-of-home installations as small as 15′x15′ or as large as 150′x75′ (~11,000 square feet)”, where “users are able to explore and interact with virtual environments with no wires and completely intuitive movements”. At the time of writing this paper, VRcade’s website does not provide more details about their vision and how their system works.

3.2.2 The “Holodeck” vision and related initiatives

StarTrek’s Holodeck [42] seems to be an idealized and prototypical model of an immersive virtual environment for many virtual reality researchers. The Holodeck corresponds to an empty room that is filled with real content that is generated according to a specific program, creating a “simulated reality”. The Holodeck is described in StarTrek’s website as:

“The generic name, especially in use aboard Federation starships, for the "smart" virtual reality system as evolved by the 2360s — a technology that combines transporter, replicator, and holographic systems. The programs, projected via emitters within a specially outfitted but otherwise empty room, can create both “solid” props and characters as well as holographic background to evoke any vista, any scenario, and any personality — all based on whatever real or fictional parameters are programmed. While personal holoprograms relieve stress and isolation of shipboard life for crew personnel, Holodecks are also used for tasks ranging from scientific simulation to tactical or even covert training. Off starships, many commercial users have equipped facilities with so-called Holosuites.”

Inspired by this original idea, the “NYU Holodeck” is an initiative from the MAGNET group [http://magnet.nyu.edu] at New York University to create a virtual reality environment with mobile HMDs. There is little information about this initiative so far – the few available sources take form as YouTube videos such as [43].

Another unnamed (as far as we know) initiative to create a “Holodeck-room” comes from the Max Planck Institute for Biological Cybernetics. This VR room takes form as the TrackingLab [44], which is a (12.7 m x 11.9 m x 6.9 m) room equipped with 16 motion capture cameras. A related Holodeck-style setup uses a (6 m x 3 m) room with 24 cameras to create a motion capture system based on optical markers; in this Holodeck, a user is able to interact with hand-based gestures, and the focus of the initiative is visualization of scientific and engineering data.

3.2.3 Comparing live-action virtual reality games with the initiatives formerly described

As far as we know, the main difference between live-action virtual reality games and the initiatives described formerly is the augmentation of a virtual world through context-awareness (especially through smart objects). In live-action virtual reality games, we also support the idea of implicit interactions through smart objects.

The key issue being worked on in these initiatives relates to keeping the virtual and physical worlds correctly superimposed (or “tracked”), so that a user is able to touch physical environment elements (such as a wall) and feel like he/she is touching the virtual wall that he/she sees.

We have not been able to find much information about the current state of the industry initiatives that we cited in this paper. The one that seems to be the most advanced up to now is The VOID.

As far as we were able to know, the initiatives inspired by the Holodeck vision seem to be in an informal state when it comes to defining the core concepts that they want to support. The information we were able to get so far about these initiatives illustrates that, in their current state, these systems focus on getting a user to walk in a physical environment while displaying a virtual world. The key issues being worked on relate to tracking technologies.

4 Concluding remarks

There is a growing trend in the industry to bring VR hardware with affordable prices to the mass market. This trend has reignited VR research. Affordable, lightweight, mobile, and powerful VR hardware facilitates new possibilities such as live-action virtual reality games.
wears an Oculus Rift device equipped with trackers In this paper we outlined a vision about a new that are captured by the cameras in the room. genre of digital games – live-action virtual reality Joachim Tesch provides a demonstration of this games (live-action virtual reality games). Key con- system in a video interview [45]. cepts in live-action virtual reality games are live-ac- Marks and co-authors [46] describe yet another tion, mixed-reality, context-awareness, and interac- concept based on StarTrek’s Holodeck, also named tion paradigms based on tangible objects, embed- “Holodeck”. The Holodeck described by Marks ded, embodied, and context-aware interactions. and co-authors [46] comprises a physical tracking In live-action virtual reality games, players enter room, VR headsets (Oculus Rift), and render en- a game stage where they interact with the game and gines. The tracking room corresponds to a (6 m x other players through several devices and physical objects. The game requires the player to act out (PerCol 2011). IEEE Computer Society, Seattle his/her avatar in game, which requires him/her to (2011). walk, move, interact with physical objects, and use 6. Volkswagen: The World’s Deepest Bin | The Fun different interaction modalities. The game stage is Theory, http://www.thefuntheory.com/worlds-dee- mixed-reality environment created according to the pest-bin. physical structure where the game stage resides. 7. Ishii, H., Ullmer, B.: Tangible bits: towards seam- The game stage is augmented with environment de- less interfaces between people, bits and atoms. Pro- vices (physical objects placed in the physical envi- ceedings of the SIGCHI conference on Human fac- ronment) and context information. Smart objects tors in computing systems. 234–241 (1997). enable live-action virtual reality games to sense in- 8. 
Magerkurth, C., Cheok, A.D., Mandryk, R.L., Nil- formation about the physical environment and the sen, T.: Pervasive games: bringing computer enter- current situation of players, which may open up tainment back to the real world. Comput. Entertain. possibilities to create unpredictable game experi- 3, 4–4 (2005). ences and emergent gameplay. 9. Lindt, I., Ohlenburg, J., Pankoke-Babatz, U., Ghel- There are some research and industry initiatives lal, S., Oppermann, L., Adams, M.: Designing trying to come up with concepts that are similar to Cross Media Games. In: Proceedings of 2nd inter- live-action virtual reality games. We were not able national workshop on pervasive gaming applicati- ons. pp. 8–13. , Munich (2005). to find detailed information about these visions and their current status, especially in the industry side. 10. Tuulos, V., Scheible, J., Nyholm, H.: Combining It seems that many of them are focusing in solving Web, Mobile Phones and Public Displays in Large- the tracking problem related to keeping the virtual Scale: Manhattan Story Mashup. In: LaMarca, A., and real worlds correctly superimposed, so that a Langheinrich, M., and Truong, K. (eds.) Pervasive Computing. pp. 37–54. Springer Berlin / Heidel- user is able to walk in both environments simulta- berg (2007). neously. This also enables the user to touch physi- cal walls and “feel” their virtual representation, 11. Waern, A., Montola, M., Stenros, J.: The three-sixty which surely contributes to increase the sense of illusion: designing for immersion in pervasive ga- mes. Proceedings of the 27th international confe- presence in the virtual environment. As far as we rence on Human factors in computing systems. know, context-awareness is a key aspect that differ- 1549–1558 (2009). entiates live-action virtual reality games from other related initiatives. 12. Razer: Razer Hydra, http://www.razerzone.com/ga- ming-controllers/razer-hydra-portal-2-bundle. 13. 
Zhang, Z.: Microsoft Kinect Sensor and Its Effect. Acknowledgments IEEE MultiMedia. 19, 4–10 (2012). 14. Valente, L., Feijó, B.: Extending use cases to sup- The authors would like to thank CNPq, CAPES, port activity design in pervasive mobile games. In: FINEP, , and FAPERJ for the financial Proceedings of the SBGames 2014. pp. 876–885. , support to this research project. Porto Alegre (2014). 15. Schmidt, A.: Implicit human computer interaction References through context. Personal Technologies. 4, 191–199 (2000). 1. Sanchez-Vives, M.V., Slater, M.: From presence to 16. Valente, L., Feijo, B., Leite, J.C.S. do P.: Mapping consciousness through virtual reality. Nat Rev Neu- quality requirements for pervasive mobile games. rosci. 6, 332–339 (2005). Requirements Eng. (2015). 2. Milgram, P., Takemura, H., Utsumi, A., Kishino, F.: 17. Benford, S., Crabtree, A., Flintham, M., Drozd, A., Augmented reality: A class of displays on the rea- Anastasi, R., Paxton, M., Tandavanitj, N., Adams, lity-virtuality continuum. Systems Research. 2351, M., Row-Farr, J.: Can you see me now? ACM 282–292 (1994). Trans. Comput.-Hum. Interact. 13, 100–133 (2006). 3. Dey, A.K.: Understanding and Using Context. Per- 18. Schuemie, M.J., van der Straaten, P., Krijn, M., van sonal and . 5, 4–7 (2001). der Mast, C.A.P.G.: Research on Presence in Virtual Reality: A Survey. CyberPsychology & Behavior. 4, 4. Schmidt, A., Aidoo, K.A., Takaluoma, A., Tuomela, 183–201 (2001). U., Laerhoven, K., Velde, W.: Advanced Interaction in Context. In: Gellersen, H.-W. (ed.) Handheld and 19. Wikipedia: Virtuality (gaming), https://en.wikipedi- Ubiquitous Computing. pp. 89–101. Springer Ber- a.org/w/index.php?title=Virtuality_(gaming)&ol- lin Heidelberg, Berlin, Heidelberg (1999). did=665933692, (2015). 5. Endler, M., Skyrme, A., Schuster, D., Springer, T.: 20. Gaming history: Dactyl Nightmare (1991), Defining Situated Social Context for Pervasive So- http://www.arcade-history.com/?n=dactyl-nightma- cial Computing. 
In: Proc. of the 2nd Workshop on re&page=detail&id=12493. Pervasive Collaboration and Social Networking 21. Gaming history: Grid busters (1991), 33. Montola, M., Stenros, J., Wærn, A.: Pervasive Ga- http://www.arcade-history.com/?n=grid- mes: Theory and Design. Morgan Kaufmann busters&page=detail&id=12498. (2009). 22. Wikipedia: , 34. Szulborski, D.: This Is Not A Game: A Guide to Al- https://en.wikipedia.org/w/index.php? ternate Reality Gaming. Lulu.com (2005). title=Virtual_Boy&oldid=668532507, (2015). 35. The VOID: The VOID, https://thevoid.com/. 23. King, G., Krzywinska, T.: Tomb Raiders and Space 36. The VOID: Gaming Pods, https://thevoid.com/#the- Invaders: Videogame Forms and Contexts. I. B. centers/2. Tauris, London ; New York : New York (2006). 37. Kenzan: Real Virtuality, http://www.kenzan.ch/? 24. Azuma, R.: A survey of augmented reality. Presen- avada_portfolio=real-virtuality. ce. 6, 355–385 (1997). 38. Survios: Active VR, http://survios.com/active-vr/. 25. Piekarski, W., Thomas, B.: ARQuake: The Outdoor Augmented Reality Gaming System. Commun. 39. Project Holodeck: Project Holodeck, http://www.- ACM. 45, 36–38 (2002). projectholodeck.com/. 26. Valente, L., Feijó, B., Leite, J.C.S.P.: Features and 40. Project Holodeck: Zombies on the Holodeck! VR checklists to assist in pervasive deve- Game Designed for Oculus Rift and Razer Hydra, lopment. In: Proceedings of SBGames 2013. pp. https://www.youtube.com/watch? 90–99. Sociedade Brasileira de Computação, São v=ovhwSS86Km8. Paulo (2013). 41. VRcade: VRcade, http://vrcade.com/. 27. Oxford English Dictionary: pervasive, adj, http://www.oxforddictionaries.com/definition/en- 42. CBS Studios: Holodeck, glish/pervasive?q=pervasive, (2015). http://www.startrek.com/database_article/holodeck. 28. Nieuwdorp, E.: The pervasive discourse: an analy- 43. James Andrew: Breaking Free From Physical Walls sis. Comput. Entertain. 5, 13 (2007). 
in Virtual Spaces : Experiences in the Holodeck So Far, https://www.youtube.com/watch? 29. Valente, L., Feijó, B.: A survey on pervasive mobile v=m_WCx7csNhc. games. Departamento de Informática, PUC-Rio, Rio de Janeiro (2013). 44. Max Plank Institute: TrackingLab, http://www.cy- berneum.de/research-facilities/trackinglab.html. 30. Szulborski, D.: The Beast. In: Borries, F., Walz, S.P., and Böttger, M. (eds.) Space Time Play. pp. 45. Computerphile: Holodeck with an Oculus 228–229. Birkhäuser Basel (2007). Rift, https://www.youtube.com/watch? v=7ZPs7knvs7M. 31. Spielberg, S.: A.I. Artificial Intelligence. Warner Bros. (2001). 46. Marks, S., Estevez, J.E., Connor, A.M.: Towards the Holodeck: Fully Immersive Virtual Reality Vi- 32. McGonigal, J.: This might be a game: ubiquitous sualisation of Scientific and Engineering Data. In: play and performance at the turn of the twenty-first Proceedings of the 29th International Conference century, (2006). on Image and Vision Computing New Zealand. pp. 42–47. ACM, New York, NY, USA (2014).