
Introduction to Augmented Reality

R. Silva, J. C. Oliveira, G. A. Giraldi

National Laboratory for Scientific Computation, Av. Getulio Vargas, 333 - Quitandinha - Petropolis-RJ Brazil rodrigo,jauvane,[email protected]

ABSTRACT This paper presents an overview of the basic aspects of Augmented Reality (AR) and the main concepts of this technology. It describes the main fields in which AR is applied nowadays and important AR devices, and discusses some characteristics of Augmented Reality systems. Future directions are also discussed.

Keywords: Augmented Reality, Virtual Reality, Scientific Visualization

1 INTRODUCTION

Augmented Reality (AR) is a new technology that involves the overlay of computer graphics on the real world (Figure 1). One of the best overviews of the technology is [4], which defined the field, described many problems, and summarized the developments up to that point. That paper provides a starting point for anyone interested in researching or using Augmented Reality.

Figure 1: AR example with virtual chairs and a virtual lamp.

AR is part of a more general context termed Mixed Reality (MR) [20], which refers to a multi-axis spectrum of areas that cover Virtual Reality (VR), AR, telepresence, and other related technologies.

Virtual Reality is a term used for computer-generated 3D environments that allow the user to enter and interact with synthetic environments [9][27][28]. The users are able to "immerse" themselves to varying degrees in the computer's artificial world, which may either be a simulation of some form of reality [10] or the simulation of a complex phenomenon [33][9].

In telepresence, the fundamental purpose is to extend the operator's sensory-motor facilities and problem-solving abilities to a remote environment. In this sense, telepresence can be defined as a human/machine system in which the human operator receives sufficient information about the teleoperator and the task environment, displayed in a sufficiently natural way, that the operator feels physically present at the remote site [26]. Very similar to virtual reality, in which we aim to achieve the illusion of presence within a computer simulation, telepresence aims to achieve the illusion of presence at a remote location.

AR can be considered a technology between VR and telepresence. While in VR the environment is completely synthetic and in telepresence it is completely real, in AR the user sees the real world augmented with virtual objects.

When designing an AR system, three aspects must be kept in mind: (1) combination of real and virtual worlds; (2) interactivity in real time; (3) registration in 3D.

Wearable devices, like Head-Mounted Displays (HMDs) [28], could be used to show the augmented scene, but other technologies are also available [4].

Besides the three aspects mentioned, another one could be incorporated: portability. In almost all virtual environment systems, the user is not allowed to move around much due to device limitations. However, some AR applications require that the user really walk through a large environment. Thus, portability becomes an important issue.

For such applications, the 3D registration becomes even more complex. Wearable computing applications generally provide unregistered text/graphics information using a monocular HMD. These systems are more of a "see-around" setup than an Augmented Reality system by the narrow definition. Hence, computing platforms and wearable display devices used in AR must often be developed for more general applications (see section 3).

The field of Augmented Reality has existed for just over one decade, but the growth and progress in the past few years have been remarkable [12]. Since [4], the field has grown rapidly. Several conferences specialized in this area were started, including the International Workshop and Symposium on Augmented Reality, the International Symposium on Mixed Reality, and the Designing Augmented Reality Environments workshop.

2 AR Components

2.1 Scene Generator

The scene generator is the device or software responsible for rendering the scene. Rendering is not currently one of the major problems in AR, because only a few virtual objects need to be drawn, and they often do not necessarily have to be realistically rendered in order to serve the purposes of the application [4].

2.2 Tracking System

The tracking system is one of the most important problems in AR systems, mostly because of the registration problem [3]. The objects in the real and virtual worlds must be properly aligned with respect to each other, or the illusion that the two worlds coexist will be compromised. In industry, many applications demand accurate registration, especially in medical systems [16][4].

2.3 Display

The technology for AR displays is still in development, and solutions depend on design decisions. Most of the display devices for AR are HMDs (Head Mounted Displays), but other solutions can be found (see section 3).

When combining the real and virtual worlds, two basic choices are available: optical and video technology. Each of them has tradeoffs depending on factors like resolution, flexibility, field-of-view, registration strategies, among others [4].

Display technology continues to be a limiting factor in the development of AR systems. There are still no see-through displays that have sufficient brightness, resolution, field of view, and contrast to seamlessly blend a wide range of real and virtual imagery. Furthermore, many technologies that begin to approach these goals are not yet sufficiently small, lightweight, and low-cost. Nevertheless, the past few years have seen a number of advances in see-through display technology, as we shall see next.

3 AR Devices

Five major classes of AR systems can be distinguished by their display type: Optical See-Through, Virtual Retinal Systems, Video See-Through, Monitor Based AR and Projector Based AR.

The following sections show the corresponding devices and present their main features.

3.1 Optical See-Through HMD

Figure 2: Optical See-Through HMD.

Optical See-Through AR uses a transparent Head Mounted Display to show the virtual environment directly over the real world (Figures 2 and 3). It works by placing optical combiners in front of the user's eyes. These combiners are partially transmissive, so that the user can look directly through them to see the real world. The combiners are also partially reflective, so that the user sees virtual images bounced off the combiners from head-mounted monitors.

Figure 3: Optical See-Through Scheme.

Prime examples of Optical See-Through AR systems are the various augmented medical systems. The MIT Image Guided Surgery project has concentrated on brain surgery [15]. UNC has been working with an AR-enhanced ultrasound system and other ways to superimpose radiographic images on a patient [23]. There are many other Optical See-Through systems, as this seems to be the main direction for AR.

Recent Optical See-Through HMDs are being built by well-known companies like Sony and Olympus, and have support for occlusion and varying accommodation (the process of focusing the eyes on objects at a particular distance). There are very small prototypes that can be attached to conventional eyeglasses (Figure 4).

Despite these specific examples, there is still a lack of general-purpose see-through HMDs. One issue for Optical See-Through AR is the alignment of the HMD optics with the real world. A good HMD allows adjustments to fit the eye position and comfort of individual users. It should also be easy to move it out of the way when not needed. However, these movements will alter the registration of the VE over the real world and require re-calibration of the system. An expensive solution would be to instrument the adjustments, so the system could automatically compensate for the motion. Such devices are not reported in the literature.

Figure 4: Eyeglass display with holographic element.

3.2 Virtual Retinal Systems

The VRD (Virtual Retinal Display) was invented at the University of Washington in the Human Interface Technology Lab (HIT) in 1991. The aim was to produce a full color, wide field-of-view, high resolution, high brightness, low cost virtual display. Microvision Inc. has the exclusive license to commercialize the VRD technology (Figure 5).

This technology has many potential applications, from head-mounted displays (HMDs) for military/aerospace applications to medical purposes.

The VRD projects a modulated beam of light (from an electronic source) directly onto the retina of the eye, producing a rasterized image (Figure 6). The viewer has the illusion of seeing the source image as if he/she were standing two feet away in front of a 14-inch monitor. In reality, the image is on the retina of his/her eye and not on a screen. The quality of the image he/she sees is excellent, with stereo view, full color, wide field of view and no flickering [13][24].

Figure 5: Virtual Retinal System HMD.

Figure 6: Virtual Retinal System Scheme.

3.3 Video See-Through HMD

Video See-Through AR uses an opaque HMD to display merged video of the VE and the view from cameras on the HMD (Figure 7).

Figure 7: Video See-Through HMD.

Figure 8: Video See-Through Scheme.

This approach is a bit more complex than optical see-through AR, requiring proper location of the cameras (Figure 8). However, video composition of the real and virtual worlds is much easier. There is a variety of solutions available, including chroma-key and depth mapping. The Mixed Reality Systems Lab (MRSL) of Japan presented a stereo video see-through HMD at ISAR 2000. This device addresses some of the parallax problems related to the location of the cameras vs the eyes.

3.4 Monitor Based

Monitor Based AR also uses merged video streams, but the display is a more conventional desktop monitor or a hand-held display. It is perhaps the least difficult AR setup, as it eliminates HMD issues. Princeton Video Image, Inc. has developed a technique for merging graphics into real-time video streams. Their work is regularly seen as the first down line in football games. It is also used for placing logos into various broadcasts.

Figure 9: Monitor Based Scheme.

Figure 10: Monitor Based Example.

3.5 Projection Displays

Projector Based AR uses real world objects as the projection surface for the virtual environment (Figures 11 and 12). It has applications in industrial assembly, product visualization, etc. Projector based AR is also well suited to multiple-user situations. Alignment of the projectors and the projection surfaces is critical for successful applications.

Figure 11: Projector Based AR.

Figure 12: Projector Based AR.

4 Applications

The Augmented Reality technology has many possible applications in a wide range of fields, including entertainment, military training, medicine, engineering and manufacturing. It is expected that other potential areas of application will appear with the dissemination of this technology.

4.1 Medical

Because imaging technology is so pervasive throughout the medical field, it is not surprising that this domain is viewed as one of the more important ones for augmented reality systems. Most of the medical applications deal with image-guided surgery (Figure 13) [15].

Figure 13: Image Guided Surgery.

Pre-operative imaging studies of the patient, such as CT (Computed Tomography) or MRI (Magnetic Resonance Imaging) scans, provide the surgeon with the necessary view of the internal anatomy. From these images the surgery is planned. Visualization of the path through the anatomy of the affected area (where a tumor must be removed, for example) is done by first creating a 3D model from the multiple views and slices in the pre-operative study. The model is then projected over the target surface to help the surgical procedure.

Augmented reality can be applied so that the surgical team can see the CT or MRI data correctly registered on the patient in the operating theater while the procedure is progressing. Being able to accurately register

the images at this point will enhance the performance of the surgical team and eliminate the need for the painful and cumbersome stereotactic frames that are currently used for registration [15].
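The registration step described above can be illustrated with a small sketch. Assuming paired 3D landmarks are available (for example, fiducials located both in the pre-operative scan and on the patient), a least-squares rigid transform can be computed with the classic Kabsch/Procrustes method. This is a generic technique, not the algorithm of any cited system; the function name and the toy fiducial data are invented for illustration.

```python
import numpy as np

def rigid_register(scan_pts: np.ndarray, patient_pts: np.ndarray):
    """Least-squares rigid transform (Kabsch method) mapping scan_pts
    onto patient_pts. Both arrays hold N matched 3D landmarks (Nx3)."""
    cs, cp = scan_pts.mean(0), patient_pts.mean(0)
    H = (scan_pts - cs).T @ (patient_pts - cp)      # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    # Sign correction keeps R a proper rotation (det = +1, no reflection).
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cp - R @ cs
    return R, t

# Toy check: recover a known pose from four synthetic fiducials.
rng = np.random.default_rng(0)
pts = rng.random((4, 3))
theta = 0.3
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([1.0, -2.0, 0.5])
R, t = rigid_register(pts, pts @ R_true.T + t_true)
```

With exact correspondences the recovered `R, t` match the true pose; in practice fiducial localization noise makes the result a least-squares estimate, which is why registration accuracy is such a central concern in the medical systems discussed above.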

Another application for augmented reality in the medical domain is in ultrasound imaging [2]. Using an optical see-through display, the ultrasound technician can view a volumetric rendered image of the fetus overlaid on the abdomen of the pregnant woman. The image appears as if it were inside the abdomen and is correctly rendered as the user moves [25] (Figure 14).

Figure 14: Ultrasound Imaging.

4.2 Entertainment

A simple form of augmented reality has been in use in the entertainment and news business for quite some time. Whenever you are watching the evening weather report, the speaker remains standing in front of changing weather maps. In the studio, the reporter is actually standing in front of a blue screen. This real image is augmented with computer-generated maps using a technique called chroma-keying. Another entertainment area where AR is being applied is game development [31] (Figures 15 and 16).

Figure 15: Games using a virtual table and synthetic objects.

Figure 16: VR-Border Guards, an AR game.

Princeton Electronic Billboard has developed an augmented reality system that allows broadcasters to insert advertisements into specific areas of the broadcast image (Figure 17). For example, while broadcasting a baseball game this system would be able to place an advertisement in the image so that it appears on the outfield wall of the stadium.

The electronic billboard requires calibration to the stadium by taking images from typical camera angles and zoom settings in order to build a map of the stadium, including the locations in the images where advertisements will be inserted. By using pre-specified reference points in the stadium, the system automatically determines the camera angle being used and, referring to the pre-defined stadium map, inserts the advertisement into the correct place.

Figure 17: Advertisement on a football game.
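The chroma-keying idea mentioned above is simple to sketch: pixels close to the key color (the blue screen) are replaced by the corresponding pixels of the computer-generated layer. The following minimal NumPy version is illustrative only (the function name, key color and threshold are invented here); production broadcast systems use far more sophisticated matting.

```python
import numpy as np

def chroma_key(studio: np.ndarray, graphics: np.ndarray,
               key=(0, 0, 255), threshold=90.0) -> np.ndarray:
    """Composite `graphics` wherever `studio` is close to the key color.

    studio, graphics: HxWx3 uint8 RGB frames of equal shape.
    key: backdrop color (pure blue in RGB); threshold is the Euclidean
    RGB distance below which a pixel is treated as backdrop."""
    dist = np.linalg.norm(studio.astype(float) - np.array(key, dtype=float),
                          axis=2)
    mask = dist < threshold              # True where the blue screen shows
    out = studio.copy()
    out[mask] = graphics[mask]           # weather map replaces the backdrop
    return out

# Toy frames: a blue backdrop with one "reporter" pixel at (0, 0).
studio = np.zeros((2, 2, 3), np.uint8)
studio[...] = (0, 0, 255)
studio[0, 0] = (200, 180, 160)
maps = np.full((2, 2, 3), 40, np.uint8)  # stand-in for the weather map
out = chroma_key(studio, maps)
```

The hard threshold makes foreground/background decisions per pixel; real keyers compute a soft alpha matte so that hair and motion-blurred edges blend smoothly.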

4.3 Military Training

The military has been using displays in cockpits that present information to the pilot on the windshield of the cockpit or on the visor of the flight helmet (Figure 18). This is a form of augmented reality display.

By equipping military personnel with helmet-mounted visor displays or a special-purpose rangefinder, the activities of other units participating in the exercise can be imaged. While looking at the horizon during a training session, for example, the display-equipped soldier could see a virtual helicopter rising above the tree line. This helicopter could be being flown in simulation by another participant. In wartime, the display of the real battlefield scene could be augmented with annotation information or highlighting to emphasize hidden enemy units [32].

Figure 18: Military Training.

4.4 Engineering Design

Imagine that a group of designers is working on the model of a complex device for their clients. The designers and clients want to do a joint design review even though they are physically separated. If each of them had a conference room equipped with an augmented reality display, this could be accomplished.

The physical prototype that the designers have mocked up is imaged and displayed in the client's conference room in 3D. The clients can walk around the display looking at different aspects of it. To hold discussions, the client can point at the prototype to highlight sections, and this will be reflected on the real model in the augmented display that the designers are using. Or perhaps at an earlier stage of the design, before a prototype is built, the view in each conference room is augmented with a computer-generated image of the current design built from the CAD files describing it [1] (Figure 19).

Figure 19: AR applied to Engineering Design. This figure shows a real object augmented with virtual tubes.

4.5 Robotics and Telerobotics

In the domain of robotics and telerobotics, an augmented display can assist the user of the system [17][21].

A telerobotic operator uses a visual image of the remote workspace to guide the robot. Annotation of the view would be useful, as it is when the scene is in front of the operator. Besides, augmentation with wireframe drawings of structures in the view can facilitate visualization of the remote 3D geometry.

If the operator is attempting a motion, it could be practiced on a virtual robot that is visualized as an augmentation to the real scene. The operator can decide to proceed with the motion after seeing the results. The robot motion could then be executed directly, which in a telerobotics application would eliminate any oscillations caused by long delays to the remote site. Another use of robotics and AR is in remote medical operation (Figures 20 and 21).
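The "practice on a virtual robot first" idea can be sketched with a toy kinematic model. Here a planar two-link arm stands in for the real manipulator (the link lengths, joint angles and function names are invented for illustration): a planned motion is stepped through in simulation so the predicted end-effector path could be overlaid for the operator to inspect before the real robot moves.

```python
import math

def fk_2link(theta1: float, theta2: float, l1=0.4, l2=0.3):
    """Forward kinematics of a planar 2-link arm: end-effector (x, y)
    for joint angles in radians and link lengths in meters."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y

def preview_motion(start, goal, steps=10):
    """Linearly interpolate joint angles and collect the predicted
    end-effector path -- the overlay a telerobotic UI would draw."""
    path = []
    for i in range(steps + 1):
        a = i / steps
        t1 = (1 - a) * start[0] + a * goal[0]
        t2 = (1 - a) * start[1] + a * goal[1]
        path.append(fk_2link(t1, t2))
    return path

# Preview a motion from the arm stretched out to a bent pose.
path = preview_motion((0.0, 0.0), (math.pi / 2, math.pi / 4))
```

Only after visually approving this predicted path would the commands be sent to the remote site, which is how predictive displays sidestep the round-trip delay discussed above.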

Figure 20: Virtual surgery using robot arms.

Figure 21: Robotics using AR for remote medical operation.

4.6 Manufacturing, Maintenance and Repair

When a maintenance technician approaches a new or unfamiliar piece of equipment, instead of opening several repair manuals they could put on an augmented reality display. In this display the image of the equipment would be augmented with annotations and information pertinent to the repair. For example, the location of fasteners and attachment hardware that must be removed would be highlighted (Figure 22).

Figure 22: AR used to aid mechanical work.

Boeing made an experimental system in which the technicians are guided by the augmented display, which shows the routing of the cables on a generic frame used for all harnesses (Figure 23). The augmented display allows a single fixture to be used for making the multiple harnesses [29].

Figure 23: AR applied to maintenance work.

4.7 Collaborative AR

AR addresses two major issues with collaboration: seamless integration with existing tools and practices, and enhancing practice by supporting remote and co-located activities that would otherwise be impossible.

Collaborative AR systems have been built using projectors, hand-held and head-worn displays. By using projectors to augment the surfaces in a collaborative environment, users are unencumbered, can see each other's eyes, and are guaranteed to see the same augmentations [6].

Examples of collaborative AR systems using see-through displays include both those that use see-through handheld displays and those that use see-through head-worn displays [14] (Figure 24).

5 Visualization Issues

Researchers have begun to address problems in displaying information in AR, caused by

the nature of AR technology or displays. Work has been done on the correction of registration errors and on avoiding hiding critical data due to density problems.

Figure 24: The Studierstube collaborative AR system.

5.1 Visualization Errors

In some AR systems, registration errors are significant and unavoidable. For example, the measured location of an object in the environment may not be known accurately enough to avoid visible registration error. Under such conditions, one approach for rendering an object is to visually display the area in screen space where the object could reside, based upon expected tracking and measurement errors [19]. This guarantees that the virtual representation always contains the real counterpart.

Another approach when rendering virtual objects that should be occluded by real objects is to use a probabilistic function that gradually fades out the hidden virtual object along the edges of the occluded region, making registration errors less objectionable [11].

5.2 Removing real objects from the environment

The problem of removing real objects is more than simply extracting depth information from a scene. The system must also be able to segment individual objects in that environment. A semi-automatic method for identifying objects and their locations in the scene through silhouettes is found in [18]. This enables the insertion of virtual objects and the deletion of real objects without an explicit 3D reconstruction of the environment (Figure 25).

Figure 25: Virtual/Real occlusions. The brown cow and tree are virtual (the rest is real).

5.3 Photorealistic Rendering

A key requirement for improving the rendering quality of virtual objects in AR applications is the ability to automatically capture the environmental illumination information [7][22][30].

For example, [8] presents a method that, using only an uncalibrated camera, allows the capture of object geometry and appearance and then, at a later stage, rendering and AR overlay into a new scene.

6 Conclusions and Future Work

Despite the many recent advances in AR, much work remains to be done. Application development can be helped by using the available libraries. One of them is ARToolKit [5], which provides computer vision techniques to calculate a camera's position and orientation relative to marked cards, so that virtual 3D objects can be overlaid precisely on the markers.

Here are some areas requiring further research if AR is to become commonly deployed.

Ubiquitous tracking and system portability: Several impressive AR demonstrations have generated compelling environments with nearly pixel-accurate registration. However, such demonstrations work only inside restricted, carefully prepared environments. The ultimate goal is a tracking system that supports accurate registration in any arbitrary unprepared environment, indoors or outdoors. Allowing AR systems to go anywhere also requires portable and wearable systems that are comfortable and unobtrusive.

Ease of setup and use: Most existing AR systems require expert users (generally the system designers) to calibrate and operate them. If AR applications are to become commonplace, then the systems must be deployable and operable by non-expert users. This requires more robust systems that avoid or minimize calibration and setup requirements.

Photorealistic and advanced rendering: Although many AR applications only need simple graphics such as wireframe outlines and text labels, the ultimate goal is to render the virtual objects so that they are indistinguishable from the real ones. This must be done in real time, without the manual intervention of artists or programmers. New techniques in image-based rendering must be considered in order to accomplish this task [7].

AR in all senses: Researchers have focused primarily on augmenting the visual sense. Eventually, compelling AR environments may require engaging other senses as well (touch, etc.).

REFERENCES

[1] K. Ahlers and A. Kramer. Distributed augmented reality for collaborative design applications. European Computer Industry Research Center, 3-14, 1995.

[2] S. Andrei, D. Chen, C. Tector, A. Brandt, H. Chen, R. Ohbuchi, M. Bajura, and H. Fuchs. Case study: Observing a volume rendered fetus within a pregnant patient. Proceedings of IEEE Visualization, 17-21, 1993.

[3] R. Azuma. Tracking requirements for augmented reality. Communications of the ACM, 36(7):50-51, 1993.

[4] R. Azuma. A survey of augmented reality. ACM SIGGRAPH, 1-38, 1997.

[5] M. Billinghurst, S. Baldis, E. Miller, and S. Weghorst. Shared space: Collaborative information spaces. Proc. of HCI International, 7-10, 1997.

[6] M. Billinghurst and H. Kato. Mixed reality - merging real and virtual worlds. Proc. International Symposium on Mixed Reality (ISMR '99), 261-284, 1999.

[7] S. Boivin and A. Gagalowicz. Image-based rendering for industrial applications. ERCIM News, 2001.

[8] D. Cobzas, K. Yerex, and M. Jagersand. Editing real world scenes: Augmented reality with image-based rendering. Proc. of IEEE Virtual Reality, 291-292, 2003.

[9] A. Van Dam, A. Forsberg, D. Laidlaw, J. LaViola, and R. Simpson. Immersive VR for scientific visualization: A progress report. IEEE Computer Graphics and Applications, 20(6):26-52, 2000.

[10] P. du Pont. Building complex virtual worlds without programming. EUROGRAPHICS'95 State Of The Art Reports, 61-70, 1995.

[11] A. Fuhrmann et al. Occlusion in collaborative augmented environments. Computers & Graphics, 23(6):809-819, 1999.

[12] R. Azuma et al. Recent advances in augmented reality. IEEE Computer Graphics and Applications, 20-38, 2001.

[13] R. Chinthammit et al. Head tracking using the virtual retinal display. Second IEEE and ACM International Symposium on Augmented Reality, 235-242, 2001.

[14] Z. Szalavri et al. Studierstube: An environment for collaboration in augmented reality. Virtual Reality Systems, Development and Application, 3(1):37-48, 1998.

[15] W. Grimson, G. Ettinger, T. Kapur, M. Leventon, W. Wells, and R. Kikinis. Utilizing segmented MRI data in image-guided surgery. International Journal of Pattern Recognition and Artificial Intelligence, 11(8):1367-97, 1998.

[16] R. L. Holloway. Registration errors in augmented reality systems. Technical Report TR95-016, The University of North Carolina, 1995.

[17] W. S. Kim and P. Schenker. An advanced operator interface design with preview/predictive displays for ground-controlled space telerobotic servicing. Proceedings of SPIE Vol. 2057: Telemanipulator Technology and Space Telerobotics, 96-107, 1993.

[18] V. Lepetit and M. Berger. Handling occlusions in augmented reality systems: A semi-automatic method. Proc. Int'l Symp. Augmented Reality, 137-146, 2000.

[19] B. MacIntyre and E. Coelho. Adapting to dynamic registration errors using level of error (LOE) filtering. Proc. Int'l Symp. Augmented Reality, 85-88, 2000.

[20] P. Milgram and F. Kishino. A taxonomy of mixed reality visual displays. IEICE Transactions on Information Systems, E77-D(12):1321-1329, 1994.

[21] P. Milgram and S. Zhai. Applications of augmented reality for human-robot communications. Proceedings of 1993 IEEE/RSJ International Conference on Intelligent Robots and Systems, 1467-1476, 1993.

[22] Y. Mukaigawa, S. Mihashi, and T. Shakunaga. Photometric image-based rendering for virtual lighting image synthesis. Proc. 2nd Int'l Workshop on Augmented Reality (IWAR '99), 115-124, 1999.

[23] R. Ohbuchi, D. Chen, and H. Fuchs. Incremental volume reconstruction and rendering for 3D ultrasound imaging. Visualization in Biomedical Computing, 1808:312-323, 1992.

[24] H. L. Pryor, T. A. Furness, and E. Viirre. The virtual retinal display: A new display technology using scanned laser light. Proc. 42nd Human Factors Ergonomics Society, 1149, 1998.

[25] F. Sauer, A. Khamene, B. Bascle, L. Schimmang, F. Wenzel, and S. Vogt. Augmented reality visualization of ultrasound images: System description, calibration and features. IEEE and ACM International Symposium on Augmented Reality, 30-39, 2001.

[26] T. B. Sheridan. Musing on telepresence and virtual presence. Presence, 1(1):120-125, 1992.

[27] W. Sherman and A. Craig. Understanding Virtual Reality: Interface, Applications and Design. Morgan Kaufmann Publishers, 2003.

[28] R. Silva and G. Giraldi. Introduction to virtual reality. Technical Report 06/2003, LNCC, Brazil, 2003.

[29] D. Sims. New realities in aircraft design and manufacture. IEEE Computer Graphics and Applications, 14(2):91, 1994.

[30] J. Stauder. Augmented reality with automatic illumination control incorporating ellipsoidal models. IEEE Trans. on Multimedia, 1(2):136-143, 1999.

[31] Z. Szalavri, E. Eckstein, and M. Gervautz. Collaborative gaming in augmented reality. VRST, Taipei, Taiwan, 195-204, 1998.

[32] E. Urban. The information warrior. IEEE Spectrum, 32(11):66-70, 1995.

[33] D. Weimer. Frontiers of Scientific Visualization, chapter 9, "Brave New Virtual Worlds", pages 245-278. John Wiley and Sons, Inc., 1994.
