VR Theater: A Journey into the Breath of Sorabol

Changhoon Park
Sang Chul Ahn
Yong-Moo Kwon
Hyoung-Gon Kim
Heedong Ko
Imaging Media Research Center, Korea Institute of Science and Technology
39-1 Hawolgok-Dong, Seongbuk-Gu, Seoul 136-791, Korea

Dept. of Computer Science & Engineering, Korea University
1, 5-Ga Anam-Dong, Seongbuk-Gu, Seoul 136-701, Korea

Taiyun Kim
Dept. of Computer Science & Engineering, Korea University
1, 5-Ga Anam-Dong, Seongbuk-Gu, Seoul 136-701, Korea

Abstract

We have built the world's largest virtual reality (VR) theater for the Gyeongju World Culture EXPO 2000. The VR theater is characterized by a huge shared VR space with tightly coupled user inputs from 651 audience members in real time. The shared 3D virtual environment augments the physical audience space in harmony. Large computer-generated passive stereo images on a huge cylindrical screen provide the sensation of visual immersion. The theater also provides 3D audio, vibration, and olfactory display, as well as keypads for each of the audience members to interactively control the virtual environment. This paper introduces the issues raised and addressed during the design of a versatile VR theater, the production process, and the presentation techniques using the versatile display and interaction capability of the future theater.

1 Introduction

Presence, Vol. 12, No. 2, April 2003, 125–139
© 2003 by the Massachusetts Institute of Technology

Recently, we built a VR theater for the Gyeongju World Culture EXPO 2000, held in Korea's old city of Gyeongju (Kyongju) from September 1 to November 26, 2000. The expo was a remarkable celebration of Korean history and culture, sponsored by the federal and provincial governments. The main theme was "Breath of the New Millennium," but its secondary theme of "Encounter and Harmony" describes its spirit more precisely. By taking visitors back in time to the era of the Silla Kingdom, it encouraged them to draw on the strength of their cultural heritage to promote reconciliation and peace. The era of the liberal, progressive Silla is regarded as the golden age of Korean culture, and its center was Sorabol, the present city of Gyeongju.

The aim of designing and building the VR theater was to construct a versatile public demonstration of VR technology as a new medium for interactive storytelling of diverse kinds of artistic expression and edutainment of virtual heritage for the public. Demonstrating VR content to a very large audience in a VR theater is a relatively new area, and the issues and challenges to be overcome are only beginning to emerge in the literature (Greenhalgh et al., 1999; Stewart, Bederson, & Druin, 1999; Zanella & Greenberg, 2001; Ahn, Kim, Kim, Kwon, & Ko, 2001). In particular, the rapid advances in digital technologies in recent years offer new directions for virtual heritage based on VR technology (Refsland, Ojika, Addison, & Stone, 2000; Kwon, Kim, Ahn, Ko, & Kim, 2001).

Unlike the traditional single-user VR system or multiuser distributed/collaborative VR system, a VR theater consists of a large screen shared by a large audience, similar to an IMAX theater. Although both the VR and IMAX theaters

Park et al. 125

Downloaded from http://www.mitpressjournals.org/doi/pdf/10.1162/105474603321640905 by guest on 01 October 2021 126 PRESENCE: VOLUME 12, NUMBER 2

provide an immersive feeling of presence in the virtual environment by means of large image displays, there is a fundamental difference in how the images are generated and projected. The VR theater displays multichannel computer-generated images in real time by video projectors, whereas, in most installations, an IMAX theater is driven by a large film projector with a single-channel image.

The IMAX camera and projection system was developed in 1970 specifically for realizing a greater mimesis of natural vision. The IMAX system employs a frame size of 48.51 by 60.60 mm, projected onto screens averaging 70 by 100 ft. and covering, from most audience positions, a diagonally measured field of 160 deg.

Omnimax, a modified form of IMAX, is an even closer representation of natural vision, utilizing fisheye optics on both camera and projector. This procedure creates a distorted image on the film that is corrected by projection onto the spherical surface of a planetarium dome. The dome curvature yields a linearly corrected image covering 180 deg., reproducing the full maximum angle of view of the eye. Another distinctive attribute of the Omnimax projection is a curved image, which also conforms to the actual shape of the human field of vision. Although IMAX and Omnimax have been able to reproduce an angle of vision equivalent to that of natural sight, these formats fail to meet the test of total mimesis in the functions of stereoscopic ranging and perception of motion. In shots composed of objects at a distance of approximately 45 ft. or more from the camera, stereoscopic depth perception is unable to differentiate changes in that range. Moreover, the special projection requirements can only be met by installations designed for a specific format (Gutenko). In the film-based IMAX theater or theme park theater, no user interaction is possible, and the story of the movie must be fixed.

The Telephone Pavilion, sponsored by Canada's telephone companies, told the story of Canada and communications. Its presentation Canada 67 was the most popular film at Expo 67. The 22 min. film was executed in Circle-Vision 360 deg., a total wrap-around process with 1,500 people standing in a room surrounded by nine large movie screens. Nine projectors, concealed in the space between screens, projected a completely circular image while twelve synchronized sound channels enveloped the audience in sound. The effect was magical, especially when the camera was in motion in a plane, on a boat, or on a railroad train (Stanton, 1997). But there was no interaction for the audience.

At Expo 67, the Kino-Automat film at the Czechoslovakia pavilion was a sociological and psychological experiment with audience participation by just 127 people in an intimate theater. It was developed by cinematographer Raduz Cincera, who reasoned that, just as children like to make Tinkerbell live by applauding during Peter Pan, an adult audience might become involved in a performance with live audiences. Therefore, at five points in the film's plot, the film stopped and the audience was asked to vote on which way Mr. Novak, the hero, should act. Meanwhile, the actor appeared in the theater in person and appealed to the audience to help solve his problems. Each viewer was asked to press either the red or green button beside him at each decision point. The voting results were registered by seat number and displayed on a border around the screen, so each viewer could see his or her own vote counted. The winning choice would then be screened, making each experience of Kino-Automat unique (Stanton, 1997). Although the audience was allowed to change the story with alternative film sequences, this does not mean real-time interaction was provided.

CINEMATRIX patented an audience participation technique through the use of simple reflectors. Each member of the audience was asked to vigorously wave one of two colored cards to express their vote. The overall level of activity of each color was automatically detected from a video image. In this way, an interactive entertainment system enables thousands of people to simultaneously communicate with a computer, making possible an entirely new class of human-computer interaction (CINEMATRIX, 1998). But the use of only two colored cards restricted the scope of audience interaction.

At the MIT Media Laboratory in 1997, a computer-based theatrical performance was held. It was a two-character theater play, with a human character and an autonomous computerized character. All the computer


character's reactions during the play were based on the human actor's actions as detected by a three-camera visual segmentation system that was invariant to lighting changes. In this system, the human character's actions are recognized not only by tracking gesture information but also by the context provided by the current situation in the story (Pinhanez, 1997). Their research focuses on the actor in the real and the virtual space for computer theater, but the audience's mass interaction is not considered.

In the VR theater of Gyeongju, the image must reflect the result of user interaction in real time. With the advancement of digital cinema equipment and real-time computer image generation, we believe the VR theater will eventually replace IMAX film theaters. Interactive and real-time capabilities, combined with high-resolution immersive imagery, create a much richer understanding of the Silla Kingdom.

2 Hardware System of the Gyeongju VR Theater

The VR theater in Gyeongju, Korea, was built to provide the audience with immersive and interactive virtual experience environments. Besides the interaction system, the VR theater facilitates an immersive environment by a large screen that displays high-resolution passive stereo images. The VR theater uses a large cylindrical screen, and the images projected on this screen are high quality, with a resolution of 3780 × 1024 pixels and a brightness of 4000 ANSI lumens. Six projectors are used to generate the passive stereo images that produce a visually immersive sensation. The sound system enables eight-channel surround 3D sound; the platform under the seats is used as a large subwoofer so that the audience can feel the sound. Another facility providing an immersive feeling is a fragrance control system, controlling the type and amount of fragrance and the time of release. As a whole, the Gyeongju VR theater provides an immersive environment by automatically controlled visual, aural, and olfactory rendering. (See figure 1.)

2.1 Screen

In front of the theater on a stage, a 27 × 8 m cylindrical screen with a radius of 37 m is erected vertically to display images projected by six double-stacked projectors. For passive stereo display, the screen surface must preserve the polarization of the images from the projector after reflection. This means that the screen is highly reflective, and any impurities on the surface can become glaringly obvious. We used a Harkness Hall spectral screen that was manufactured meticulously in an environmentally controlled factory and could be installed on a curved frame. The frame geometry of the screen was a design decision. With a flat screen, the geometry of the projected image would be too distorted for the projector to correct unless the screen were angled toward the projector; then, the image across the discrete angle would look continuous only at the sweet spot but discontinuous from all other viewing positions. With a curved screen, the images viewed from other viewing positions look continuous even though distorted. Furthermore, because the gain characteristics of a reflective screen change rapidly with the viewing angle, the small difference between the viewing angles of the left and right eyes to a flat screen may cause a large enough intensity difference that the viewer may not be able to fuse the stereo paired images. Thus, the screen frame was cylindrically curved with a radius of 37 m to make the intensity of the reflected light as even as possible for the audience and to minimize the disparity in the intensity of reflected light from the screen to the viewers. The intensity of the projected images was bright and smooth enough that stereo fusion was good in all the seats in the hall.

2.2 Projector

We initially considered using CRT projectors because they were the most versatile in geometry correction and in edge blending for tiling images from multiple projectors seamlessly, and were the most adaptive to screen geometry. Although their brightness was low, it could be overcome with a high-contrast capability, by blocking the light into the theater completely and making use of


Figure 1. Hardware facilities of the Gyeongju VR Theater.

a high-gain screen. However, because the throw distance of the CRT projectors was only 10 m, they would have had to be stationed in the middle of the theater or in front of the theater, which would greatly reduce the theater capacity. Furthermore, the logistics of maintaining the projectors on the ground near the crowd was problematic. It would be much better if the projectors were stationed in the projector booth already in place from the 1998 EXPO (more than 30 m back and 4.5 m above the ground). At the time, many video projectors for digital cinema were coming to the market with long throw distances and very bright images at a native resolution of 1280 × 1024 pixels. With passive stereo display and the effective brightness of the projectors, the brightness after the polarizing filters was what mattered. We opted for the BARCO 9300 LCD Light-Valve projectors because they could be positioned in the projector booth, a polarized projected image maintained 80% of the original brightness of 4000 ANSI lumens, and edge blending was possible for tiling. (See figure 2.)

2.3 Computer Image Generator

The VR theater employed an SGI Onyx2 system with six IR2 graphics pipelines for interactive rendering of the virtual environment. The computation is powered by fourteen 300 MHz MIPS R12000 processors and 2 GB of RAM. The graphics pipelines provide an additional 64 MB of fast texture memory. As the modeling task required close attention to details reflecting the archeological findings and historical references faithfully, the size of the visual database was well over the limit of even the Onyx2 system. So, it was a major challenge to realize 30 fps images.

2.4 Keypad

The 651 input keypads were the input channel for audience interaction. We decided to provide interaction devices for the audience to let them experience the difference of VR from other film-based shows. Although


Figure 2. Double-stacked, six-projector array for passive stereo generation.

Figure 3. Designed and installed keypads for audience interaction.

the interaction device should be suited to the contents to be presented, we also had to generalize it considering other contents in the future. Thus, we designed a special keypad that mimicked the functionality of the computer mouse to generalize input capability. The interaction keypad was designed to have six keys (four directional and two selection). The audience was divided into three groups according to the seat groups. We used color to help people easily identify their group, with red, yellow, and blue assigned to the three groups. These colors were not applied to the seats directly, but to the interaction keypads attached to every seat as shown in figure 3.

2.5 3D Sound

The sound system consists of 24 speakers: four front bottom speakers behind the perforated screen, two front subwoofer speakers, eight front top and rear speakers, six speakers mounted on the ceiling, two bottom-stand subwoofer speakers underneath the access floor, and two control room monitor speakers. The


sound source may come from a wireless microphone, a CD, or a PC controlled by Pro Tools, with sixteen channels output to the audio mixing console, a Yamaha O2R, as input sources. The sixteen-channel audio output is mixed down to eight output channels that are routed to the 24 speakers by the crossover network. The sound system was designed to provide 3D surround sound in an eight-channel setting, as well as vibration by the subwoofers attached to the access floor underneath the seats. (See figure 4.)

Figure 4. Configuration of eight-channel surround sound system.

2.6 Aroma Control

To arrive at the full potential of virtual reality, olfactory interfaces were incorporated into the VR theater. Air ducts were installed underneath the floor for ventilation and release of fragrance. The air compressor was stationed outside the front side of the theater, and the main air duct was buried underground to be brought into the theater. The main duct was divided into multiple ducts under the access floor, and vents were installed in the stairway steps to minimize the air intake and exhaust time period for interactive aroma display. The aroma generator has five containers for liquid fragrances, and its controller is connected to a server in the control room so that the computer can select the type of fragrance and control the release time. The generator drops the selected fragrance liquid onto a heated plate, the vapor is released into the air duct, and the ventilation system carries the aroma throughout the theater.

2.7 Building

The theater was already in place from Expo 1998, where it had been used for film projection. One of the key considerations in designing the theater was to host as many visitors as possible, because close to two million visitors were expected for Expo 2000. The building is 36 m wide and 36.75 m deep, with heights tapering from 14 to 10 m, and holds 1,000 people in the audience for each show.

However, for a sophisticated technology demonstration, we had to invent a new seating arrangement. The interior design proceeded with close attention paid to how the hall would be used for various public VR demonstrations in later exhibitions. Figure 5 shows the new seating arrangement with a capacity of 651; the seats are arranged in three columns, and the front/back rows and the seats on either side of the middle column were skewed toward the center of the display screen to provide comfortable viewing angles. For efficient and safe entrances and exits, we made each step wide and low: the slope of the access floor staircases was gradual, and the highest point in the back was 2.5 m above the ground.

3 Software Architecture of the VR Theater

In this section, we describe the main components and functions of the software architecture of the Gyeongju VR theater. The VR theater should provide not only the 3D virtual space but also multimodal interactions to give the audience a much stronger sense of immersion. As shown in figure 6, the software architecture is designed as a distributed system consisting of multiple hosts on the network to integrate the 3D virtual space with various devices. This architecture makes the system extensible, reconfigurable, and scalable. The VR theater can gain new functions or interfaces by simply adding external modules without modifying the kernel. The system can be specialized for specific applications or content by reorganizing only the external modules, and, by increasing the number of servers, application-specific performance can be achieved.
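The kernel-plus-external-modules design described above can be sketched as follows. This is a minimal illustration of the extensibility idea only; the class and event names are our own and not the theater's actual API.

```python
# Sketch: new theater capabilities are integrated by registering external
# modules (EMs) behind one uniform interface; the kernel is never modified.

class ExternalModule:
    """Uniform interface that every external module (EM) implements."""
    name = "base"

    def handle(self, event: dict) -> str:
        raise NotImplementedError

class AromaEM(ExternalModule):
    name = "aroma"

    def handle(self, event):
        return f"release fragrance #{event['slot']} for {event['secs']}s"

class StageLightEM(ExternalModule):
    name = "stage_light"

    def handle(self, event):
        return f"set stage light level to {event['level']}"

class Kernel:
    def __init__(self):
        self._modules = {}

    def register(self, em: ExternalModule):
        # Extending the theater is one more register() call; no kernel edits.
        self._modules[em.name] = em

    def dispatch(self, target: str, event: dict) -> str:
        return self._modules[target].handle(event)

kernel = Kernel()
kernel.register(AromaEM())
kernel.register(StageLightEM())
print(kernel.dispatch("aroma", {"slot": 3, "secs": 10}))
```

Reconfiguring the system for different content then amounts to registering a different set of modules, which mirrors the reorganization property claimed above.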


Figure 5. The layout of the interior design.

The network featured a render server, a device server, and a control server. The mandatory render server included a kernel to render the 3D virtual space in real time and to provide protocols to communicate with the rest of the servers. The optional external modules (EMs) handled the various interfaces of I/O devices and can be classified into two groups: tightly coupled EMs and loosely coupled EMs. For example, a joystick is a tightly coupled EM whereas a graphical user interface (GUI) is a loosely coupled EM. Table 1 shows the interface devices implemented with EMs. The control server managed loosely coupled EMs whereas the device server managed tightly coupled EMs. (See figure 6.)

The kernel aims to facilitate the integration of these


Figure 6. Software system architecture of the VR theater.

Table 1. Implemented I/O Devices of External Modules

Classification                     Sense                    Display
Tightly coupled external module    Joystick                 3D sound
                                   651 keypads              Motion bed
                                   Motion capture
Loosely coupled external module    GUI                      Perfume
                                   Speech recognition       Stage light
                                                            Projector
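The two coupling classes in Table 1 can be contrasted in a small sketch (invented names, not the theater's actual interfaces): tightly coupled EMs are polled synchronously by the device server every frame, while loosely coupled EMs receive occasional asynchronous events from the control server.

```python
# Sketch: per-frame synchronous polling vs. fire-and-forget events.

import queue

class TightlyCoupledEM:
    """Polled once per rendered frame, e.g. a joystick or the 651 keypads."""
    def __init__(self):
        self._state = {"x": 0.0, "y": 0.0}

    def poll(self):
        return dict(self._state)  # fresh device state, every frame

class LooselyCoupledEM:
    """Event-driven, e.g. perfume release or stage light."""
    def __init__(self):
        self.inbox = queue.Queue()

    def send(self, event):
        self.inbox.put(event)     # asynchronous, no per-frame traffic

joystick = TightlyCoupledEM()
perfume = LooselyCoupledEM()

frames = [joystick.poll() for _ in range(3)]  # device server: every frame
perfume.send({"cmd": "release", "slot": 2})   # control server: occasionally

assert len(frames) == 3 and perfume.inbox.qsize() == 1
```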

three servers as well as dynamic management of the virtual space. The block diagram of the kernel is shown in figure 7; it consists of the scenario manager, command manager, event manager, and interaction manager, which will be described in detail.

Figure 7. Block diagram of the kernel component.

3.1 Scenario Manager

The main function of the scenario manager is to interpret and process the script for presentation and interaction. The script enables the author to describe both the static and the dynamic specification of virtual environments. In this implementation, the 3D virtual environments and the multimodal interface are defined over the network of three servers.

Figure 8 shows the structure of the script file. The script file consists of a declaration section and an operation section. The declaration section contains two types of nodes, system and scene, that describe the information needed to initialize the virtual environments. A node is a basic building block of the script, abstracting various concepts and objects of the physical system and the 3D virtual world. In the operation section, operation nodes specify dynamic control of the virtual environment.

Figure 8. Structure of the script file.

A script contains one system node that specifies the configuration of the render server, device server, and control server. Information such as Internet addresses and port numbers is described to support communication among these servers. The system node is a static node that cannot be modified at runtime.

The scene node is used to specify a 3D virtual world as shown in figure 9. Scene is a grouping node that includes a list of nodes such as channel, environment, model, and animation, each of which has its own private function. The channel node has fields that describe a view of a scene, such as view position, viewing frustum, field of view, and so on. The environment node provides a way to simulate atmospheric effects, including time of day, the color of the sky, light position, and so on. The model node specifies meta-information about the 3D model data to be presented, and the animation node is designed for key-framed animation.

As shown in figure 10, the operation node is used for controlling the scene node as time passes. A script contains one or more scene nodes, each identified by its name field. Their sequential relation can be described by means of the next field, which specifies the name of the node for the next scene. Because each scene node needs one operation node, the number of scene nodes and that of operation nodes must be the same. The op_name field of a scene node specifies the name of the corresponding operation node. The operation node has one or more action nodes whose fields allow specifying the concept of time and contain one or more command nodes.

3.2 Command Manager

The scenario manager requests the command manager to process a command when the time specification of the action node is met at the current time. The command manager processes the commands required by the scenario manager and the event manager. Table 2 describes the command list that can be processed by the command manager.

The Set command is used to control the scene that represents a virtual world. By setting the field of a node constituting a scene, the virtual world can be updated. The target argument means the destination, the data argument includes the list of values to be used over the requested duration, and the value argument contains a single value for the field. For example, we can update the background color of a virtual space at time 10 with the following script.

    Action {
      at 10.0
      Set {
        target Channel.sky_color
        value 0.2 0.2 1.0
      }
    }

The Send command enables transmission of an event through the event manager to control a device handled by the control server. The field value of any node can be delivered to the control server. For example, we can send an event defined to turn on the projector connected to the control server.

The Control command activates or deactivates a device on the device server through the interaction manager, and connects the user input to the field of any node at any frame for navigation or manipulation. For example, the following script allows the user to navigate virtual worlds from time 30 by using the joystick.

    Action {
      at 30.0
      Control {
        device_name joystick
        activate on
        target Channel.view_pos
      }
    }

3.3 Event Manager

The event manager handles the transmission of events to and from the control server, based on asynchronous message passing. The event manager allows the render server to send an event to control loosely


Figure 9. Specification of the scene node.

coupled external modules managed by the control server. In addition, the control server is allowed to transmit events, including commands, to the render server through the event manager. For example, the render server can turn on a projector handled by the control server, and a user on the control server is allowed to control the light of a 3D virtual space through a GUI component of the application.
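The asynchronous message passing just described can be sketched with a pair of queues (our own minimal illustration; the event and field names are assumptions, not the theater's protocol):

```python
# Sketch: the event manager forwards events in both directions between the
# render server and the control server without blocking the sender.

import queue
import threading

class EventManager:
    def __init__(self):
        self.to_control = queue.Queue()
        self.to_render = queue.Queue()

    def send_to_control(self, event):  # e.g. "turn on the projector"
        self.to_control.put(event)     # returns immediately (asynchronous)

    def send_to_render(self, event):   # e.g. GUI changes a light's color
        self.to_render.put(event)

em = EventManager()
em.send_to_control({"device": "projector", "action": "on"})
em.send_to_render({"node": "Light.color", "value": (1.0, 1.0, 0.8)})

# A control-server worker drains its queue in the background.
handled = []
def worker():
    while not em.to_control.empty():
        handled.append(em.to_control.get())

t = threading.Thread(target=worker)
t.start()
t.join()
assert handled[0]["device"] == "projector"
```

The sender never waits for the receiver, which is the defining contrast with the interaction manager's synchronous per-frame polling described next.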

3.4 Interaction Manager

The interaction manager connects the tightly coupled external modules on the device server to the scene graph, the abstraction that represents the virtual space on the render server. The interaction manager allows real-time interaction such as navigation or manipulation in the virtual space. For example, a user on the device server can navigate a virtual space by using a force feedback joystick that vibrates when a collision occurs.

Figure 10. Specification of the operation node.

For natural interaction, the interaction manager provides synchronous communication with the device server by polling, that is, by querying the device status. On the render server, a poll is transmitted to the device server at every frame. At this time, information about


collision or terrain following can be transmitted to the control server for devices or for complex dynamic calculation. Then, the device server replies with the recent status of the device. Compared with the event manager, message exchange occurs continuously for the interaction.

Table 2. Command List for the Command Manager

Name      Type              Argument
Set       [node]·[field]    target
          String            value
          String            data
Send      [node]·[field]    target
          String            value
Control   String            device_name
          Boolean           activate
          [node]·[field]    target

4 System Integration and Audience Interaction

This section presents the implementation of the VR theater and the audience interaction based on the hardware and software architecture previously described. In the Gyeongju VR theater, it is assumed that there are three kinds of participants: audience, navigator, and producer (Reaney, 1996). The audience, as the subject of the VR theater, is provided the experience of immersion through multimodal display and audience interaction. The navigator on the stage can be considered a kind of actor who guides the audience into the virtual world and interacts with both the virtual and the real audience space. The producer defines and controls the VR theater offline and online. Thus, various types of interaction exist between the participants and the VR theater.

Figure 11 shows the integration of and relationships among the virtual space, the participants, and the external modules representing specific devices in the VR theater. The audience is allowed to control the direction of butterflies in the virtual space by using their keypads, managed by the device server. Besides the visual display, several display devices such as the aroma system and the stage light on the control server stimulate the senses of the audience. The joystick on the device server enables the navigator to control the view of the virtual camera for guidance. Speech recognition (or the GUI on the control server) allows the producer to operate the VR theater at runtime. The relationships and definition of the VR theater are specified in terms of the script file as described previously, and the specification can be modified easily according to the scenario.

Figure 12 illustrates various action plans for an arbitrary scenario. Actions can be specified by both the script and the GUI. For example, actions to update the virtual space, such as 3D model loading, fade-in effects, and animation, can be defined, and several facilities of the VR theater, such as aroma and stage light, can be controlled.

In the VR theater, various interaction functions using the 651 keypads are provided for audience interaction. The first is statistical processing, which provides the statistics of pressed keys for the whole audience or for each group. This function can be used in cooperative action, for instance, to select a direction by majority rule. The second function provides first-come/first-served services; this function class is a competitive one. In this case, the system receives the inputs from only a specific number of people; after receiving the given number of inputs, the system ignores the rest. A time-limited service is also possible (after the time limit, the system ignores the inputs). In either case, we can specify by commands the number of people to select or the time to cut off the service. These services can be applied to the whole audience or to each group, too. The last function is to read the switch status of a specific seat, using the ID of the seat. This third function class is elementary and can be used for cooperative or competitive interactions or both. The Gyeongju VR theater provides these three interaction function classes, and their usage depends on the VR contents.

In "A Journey into the Breath of Sorabol," we provide the user interaction using statistics because it was easy and intuitive enough for people to learn quickly in the short waiting and running time. Only four directional keys were used, and the inputs from the other


Figure 11. System Integration of the Gyeongju VR theater.

Figure 12. Action control for an arbitrary scene.

two selection keys were ignored in the keypad interaction. The purpose of the implemented audience interaction was to move a flock of butterflies on the screen. Left and right key inputs were programmed to move the flock to the left and right, respectively; up and down key inputs were used to move the flock away from and toward the audience, respectively. The final direction was selected by majority rule. This interaction was enabled only in small periods of the running time. When the interaction was enabled, arrows were displayed on both sides of the screen so that people could recognize that it had started. The arrows were programmed to become solid white as the number of corresponding directional inputs increased. The arrows played the role of a feedback signal to the audience, so that they could recognize the intentions of the other

Park et al. 137

The arrows thus served as a feedback signal, giving each member of the audience an awareness of the other people's intentions. Figures 13(d) and 13(h) show screenshots taken while the interaction was enabled.

The interaction was enabled three times during the running time: in the introductory part, halfway through, and in the last credit scene. We noticed that the audience could not move the flock of butterflies very well in the introductory part, but they quickly became habituated to the interface and could move the butterflies across the screen in the last interaction period. This showed that the implemented interaction was intuitive. When we implemented the interaction using statistics, we worried that the butterflies might not move at all, because the average of the key inputs was simply "don't move." However, people recognized that the motion of the butterflies was bounded by the screen boundary and tried to move them in the other directions at the boundaries. From this we could see that the visual feedback removed the averaging influence.

5 Presentation

The Gyeongju VR theater presented an approximately 15 min. VR movie, "Into the Breath of Sorabol," from September 1 to November 26, 2000. It was the theme movie of the Gyeongju World Culture EXPO 2000 and was seen by almost one million people. The theme of the show was "A Journey into the Breath of Sorabol," a virtual tour of the ancient city of Sorabol (the capital of the Silla Kingdom 1,300 years ago), intended to educate the public about the importance of virtual heritage in the conservation, preservation, and interpretation of our cultural and natural history. During the show, the life of the remarkable Silla culture became a reality, and the experience carried a message about harmony: between tradition and modernity, humanity and nature, technology and culture.

The visitors were guided through seven historical venues depicting the ancient Korean city of Sorabol. Some snapshots from the presentation are shown in figure 13. The show started with an introduction to VR technology and to controlling the flock of butterflies with the keypads. Then the visitors are taken on a journey around the ancient city, starting from Namsan Top-gol ("valley of pagoda") in the southern part of the city, where the common people worshipped. The scent of pine trees during the mountain scene was most effective; sensory information from the olfactory system increased the audience's sense of presence in the VR theater (Woodrow & Eric, 1996).

As dawn breaks, the audience takes flight from the valley to Wolsong ("moon castle") across Woljeong-kyo ("bridge of the moon spirit") and lands on the pond of the moon, now called Wolgi or Anapji. Wolgi, located on the east side of Wolsong, was so named because the moon seemed to emerge from the pond. The ancient Sillans raised valuable birds, animals, and plants, and played with wooden boats in Wolgi, decorating the pond with artificial objects to imitate the fairyland, a world of Taoist hermits in harmony with nature. At the back side of the pond, Imhaejun ("building by the sea") served as a reception hall for state guests, hosting regular festivals in March and September. From the Anapji, the visitors pass through the main street of Sorabol, lined with houses, and are brought to Hwangryongsa, the largest temple in Korean history, with a nine-story pagoda at its center that was more than 80 m high.

The daytime travel comes to an end, and a night scene begins with the astronomical observatory Chomsungdae, the oldest observatory remaining in the Orient. From the observatory, the visitors look into the sky, and the stars coalesce into the form of a flying angel, or Hwarang, an ancient cadet of the Silla kingdom and the main force behind the unification. We owe the finale to Sukkuram, a grotto shrine constructed in 751 AD with the expectation and hope of overcoming all the conflicts and confusion during the unification of the Korean peninsula by the Silla kingdom. Enlarged physical copies of the embossed carvings were hung on the walls of the theater but covered with a dark cloth so that the audience did not notice them when entering.


Figure 13. Snapshots from the presentation of "Into the Breath of Sorabol."

When the scene enters Sukkuram, the embossed carvings on the walls are lit one by one, with the sound moving from the back to the front, until finally the screen is filled with the main Buddha, so that the audience may feel as if they are standing in the middle of the grotto. The grotto shrine is now closed to the public, and those who attended junior and senior high school in Korea, when it was still possible to enter the grotto, may feel nostalgia and thus realize the importance of preserving this heritage for our descendants.

The whole show ends with the flock of butterflies posing a final question to the audience: were they dreaming and then awakened, or not? This recalls the famous butterfly dream of Jangja (Zhuangzi), an ancient Chinese philosopher known to most people. (In his dream, he became a butterfly and roamed the world; after he awoke, he was not sure whether he was in the dream of another existence or whether he was actually awake.)

Moreover, special effects add zest to contents that would otherwise be mostly navigation of historic sites. We used an under-the-sea ripple effect, implemented with multipass rendering, for the Anapji scene. A frog jumps from one lotus flower to another in the pond by means of vertex animation. Time of day, fog, and light maps are used heavily to enhance visual realism. All of these effects, as well as the camera movements, are programmable in a scripting language that controls the entire show. Figure 13(g) shows the entrance scene of the main Buddha of Sukkuram, lit with light maps computed by radiosity.
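A show-control script of this kind (one timeline driving camera paths, effects such as the Anapji ripple and the frog's vertex animation, lighting, and the interaction windows) can be sketched as a simple cue list with a polling scheduler. The cue format, action names, and times below are illustrative assumptions, not the theater's actual scripting language.

```python
# Each cue is (start_time_s, action_name, parameters). A scheduler
# polled every frame fires the cues whose start time has just passed.
CUES = [
    (0.0,   "camera_path",  {"path": "namsan_flyover"}),
    (120.0, "enable_votes", {"duration": 20.0}),   # audience interaction window
    (300.0, "effect",       {"name": "water_ripple", "scene": "anapji"}),
    (305.0, "vertex_anim",  {"actor": "frog", "clip": "lotus_jump"}),
    (600.0, "lighting",     {"time_of_day": "night", "fog": 0.3}),
]

def due_cues(cues, prev_t, now_t):
    """Return the cues whose start time falls in (prev_t, now_t],
    i.e., the cues that became due since the last poll."""
    return [c for c in cues if prev_t < c[0] <= now_t]
```

A per-frame loop would then call `due_cues(CUES, last_time, show_time)` and dispatch each returned action to the corresponding device or renderer module.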


6 Conclusion and Future Work

We felt that the Gyeongju VR theater project was successful in keeping the attention of the audience throughout the show and in making their experience enjoyable. Among the approximately 1.7 million visitors to the Gyeongju World Culture EXPO 2000, more than 67% rated the VR show as the best of a dozen shows. We owe the success to two main factors (Mark, 1999). One was that the technology remained invisible, which kept the audience focused on the contents: the stereo display, visual quality, frame rate, 3D sound, aroma display, and keypads worked in unison with few glitches. Operations of the complex hardware systems, projectors, computers, and ventilation system were all remarkably stable, with fewer than ten of the 2,500 shows given since the opening in September 2000 affected by technical difficulties. This demonstrates that the technology for running a fully automated VR theater is mature and ready for public deployment in various forms, depending on the application needs. The other factor was the contents themselves: in our experience, the VR contents certainly had public appeal.

After EXPO 2000, we reopened the theater to the public with a new version of the show featuring enhanced visual realism. We demonstrated a part of the EXPO 2000 show in the SGI booth at SIGGRAPH 2001, and we initiated a VR contents exchange program with Toppan, Japan, to present our contents in their VR theater and theirs in our Gyeongju VR theater. Recently, a new consortium, DHX (Digital Artistic and Ecological Heritage Exchange: Transcontinental Guidance and Exploration in Globally Shared Cultural Heritage), headed by GMD, was initiated between KIST and European research organizations through the Trans-Eurasia Information Network. We will research tele-immersion issues in a public demonstration environment and exchange contents with each other.

Acknowledgments

This work was partly sponsored by the Ministry of Science and Technology (MOST) fund under the National Research Laboratory (NRL) program and the Development Gamsung Engineering Technology Project, by the Ministry of Information and Communication (MIC) fund under the 3D Cyber Museum Technology Project, and by the Gyeongju World Culture EXPO Organizing Committee.

References

Ahn, S. H., Kim, I. J., Kim, H. G., Kwon, Y. M., & Ko, H. D. (2001). Audience interaction for virtual reality theater and its implementation. Proceedings of the ACM International Symposium on Virtual Reality Software and Technology (VRST 2001), 41–45.

CINEMATRIX. (1998). Retrieved May 5, 2001, from http://www.cinematrix.com/.

Greenhalgh, C., Bowers, J., Walker, G., Wyver, J., Benford, S., & Taylor, I. (1999). Creating a live broadcast from a virtual environment. Proceedings of SIGGRAPH 99, 375–384.

Gutenko, G. The super-realistic cinema: IMAX/OMNIMAX and Showscan. Retrieved November 15, 2001, from http://iml.umkc.edu/comm/html/faculty/Gutenko/papers/pursui.htm.

Kwon, Y. M., Kim, I. J., Ahn, S. C., Ko, H. D., & Kim, H. G. (2001). Virtual heritage system: Modeling, database & presentation. Proceedings of the Virtual Systems and MultiMedia Conference (VSMM 2001), 137–146.

Mark, B. (1999). Transparency in interactive entertainment. Retrieved November 25, 2001, from http://www.prairiearts.com/IEDesign/docs/Transparency.htm.

Pinhanez, C. S. (1997). Computer theater. Proceedings of the Eighth International Symposium on Electronic Arts (ISEA 97).

Reaney, M. (1996). Virtual scenography: The actor, audience, computer theater. Theater Design and Technology, 32(1), 36–43.

Refsland, S. T., Ojika, T., Addison, A., & Stone, R. (2000). Virtual heritage: Breathing new life into our ancient past. IEEE Multimedia, 7(2), 20–21.

Stanton, J. (1997). Experimental multi-screen cinema at Expo 67. Retrieved May 12, 2001, from http://naid.sppsr.ucla.edu/expo67/map-docs/cinema.htm.

Stewart, J., Bederson, B., & Druin, A. (1999). Single display groupware: A model for co-present collaboration. Proceedings of the ACM Conference on Human Factors in Computing Systems (CHI 99), 286–293.

Woodrow, B., & Eric, D. (1996). Comments on the use of olfactory display for virtual environments. Presence: Teleoperators and Virtual Environments, 5(1), 109–121.

Zanella, A., & Greenberg, S. (2001). Reducing interference in single display groupware through transparency. Proceedings of the Sixth European Conference on Computer Supported Cooperative Work (ECSCW 2001).
