
Interface Techniques for Tangible Augmented Reality in a Mobile Device Setup for Magic Lens Experiences

Dagny C. Döring, Robin Horst, Linda Rau, Ralf Dörner
RheinMain University of Applied Sciences, Wiesbaden, Germany
E-Mails: [email protected]; Robin.Horst/Linda.Rau/Ralf.Dö[email protected]

Abstract: Tangible Augmented Reality (TAR) is a subclass of Augmented Reality (AR) that includes tangible objects within AR interfaces. For example, utilizing object tracking and methodologies from AR, a real object such as a steering wheel can be used to steer a virtual car and thus becomes part of the user interface of an AR game. In this paper, we introduce four TAR-based user interface techniques for a setup that is based on handheld mobile device technology to create a magic lens AR experience. We relate these interfaces to basic interaction scenarios and state lessons learned during their implementation. Two interfaces are evaluated within a comparative user study. The study indicates that touch interactions on the mobile device performed better in terms of usability and that interactions that were virtually projected onto a planar surface around the tangible object involved challenges regarding hand coordination.

Keywords: Tangible Augmented Reality, Interface Design, Magic Lens Experiences

Figure 1: Illustration of four different tangible augmented reality interfaces, using the example of augmenting a tangible skin model through a mobile tablet device. The (1) screen interface provides touch interactions on the screen, whereas the (2) surface interface projects a virtual interface onto a planar surface, such as a table, around the augmented tangible. The (3) tangible interface technique shifts the interaction space directly to the tangible object, for example by projecting a virtual interface onto the tangible itself. The (4) additional tangible interface extends our setup with a generic tangible object that serves as a user interface for a TAR application.
Published by the Gesellschaft für Informatik e.V. 2020 in B. Weyers, C. Lürig, D. Zielasko (eds.): GI VR / AR Workshop 2020, 24.–25. September 2020, Trier. Copyright © 2020 with the authors. http://dx.doi.org/10.18420/vrar2020_1

1 Introduction

Particularly since the advent of the Internet of Things, IT in private living environments is not confined to dedicated computer devices. This provides an opportunity to employ tangible objects for designing user interfaces that non-experts in IT find literally easier to grasp. One subset of tangible user interfaces can be built with Tangible Augmented Reality (TAR). TAR can mediate information through a virtual extension of a real-world tangible object. For example, a hand-held mobile device can be used as a magic lens [BSP+93] to annotate a tangible object. Users of this TAR setup can interact with both the tangible object and virtual application features. In this work, we differentiate between two classes of TAR interactions. Tangible interactions include interactions with the tangible itself, such as grabbing and rotating. Virtual interactions relate directly to the virtual features of the TAR application that augments the real-world object. While virtual interactions do not influence the state of the tangible, tangible interactions can influence the state of the application. For example, rotating a tangible can intuitively reveal virtual information on its backside.

In this paper, we make the following contributions:

• We explore input and output possibilities and affordances that the mobile device based setup provides and introduce four TAR-based interface techniques.
(1) A touch screen interface, (2) an interface virtually projected through a handheld screen onto a planar surface, such as a table, that the tangible object stands on, (3) an interface that uses only the tangible object itself to provide interactions, and (4) an additional tangible object that serves as a generic interaction device (Fig. 1). We illustrate the interface designs in three basic use cases.

• We state lessons learned during the implementation process of three interfaces, give practical insights into how to implement them, and demonstrate their feasibility.

• We conducted a comparative user study that evaluates the (1) screen interface and the (2) surface interface to draw conclusions on our techniques and their suitability for pattern-based applications.

This paper is organized as follows: The next section discusses related work. In section 3, we depict our TAR-based interface concepts and then describe their implementations in the fourth section. The user study is presented in section 5. The last section provides a conclusion and points out directions for future work.

2 Related Work

Pioneering work by Ullmer and Ishii [IU97] and the Tangible Media Group [Tan] suggests principles of tangible user interfaces that utilize real-world objects as computer input and output devices. In subsequent work, they proposed a model-control-representation (physical and digital) interaction model for such tangible interfaces [UI00], highlighting the integration of physical representations and controls. Koleva et al. [KBNR03] describe the physical objects in tangible UIs according to their coherence with the digital objects. A high coherence exists when linked physical and digital objects can be perceived as the same thing by the user and the physical object is a common object within the application domain. General-purpose tools represent a low coherence in their work. Work by Billinghurst et al. [BKP+08] shapes the term TAR.
They build on the existing principles of tangible user interfaces suggested by Ishii and Ullmer [IU97] and the Tangible Media Group [Tan]. They use an AR visual display, such as a mobile device, and couple it to a tangible physical interface. TAR interfaces involve hand-based interactions. For mobile AR displays, such as a tablet, users usually hold the device during the interactions. Datcu et al. [DLB15] compare different types of hand-based interaction in AR for navigation. Their study indicates that neither two-hand nor one-hand interaction is more effective than tangible interaction. Still, the positioning of the interface in the user's field of view was considered disturbing. In their study, they used glasses as an AR display so that the user interface could make use of both hands. Handheld mobile devices, however, restrict the possibilities of interaction to one hand; TAR interfaces for a mobile device setup are not considered in their research. Henderson and Feiner [HF09, HF08] describe examples of what they call Opportunistic Controls, which they have designed and implemented. They use optical marker tracking and extend existing tactile features on domain objects. These otherwise unused affordances in the environment are used as input and output devices for interactions within AR. The considered affordances are located in the domain-specific environment; affordances of more common objects or environments, such as a table that a tangible lies on, are not in the scope of their work. Work by Becker et al. [BKMS19] introduces a method to create personalized tangible user interfaces from plain paper; however, AR is not targeted there. Current work by Valkov et al. [VML19] presents a semi-passive tangible object that can be used to control virtual information in a tabletop setup such as interactive workbenches or fish-tank VR.
Their haptic object provides programmable frictions and can therefore be seen as a more generic tangible interface that is not tied to specific use cases. The mentioned work shows the variety of potential tangible interfaces that can successfully be used in different setups. Exploring the affordances that a mobile device setup for magic lens experiences can offer for TAR-based interface techniques can be valuable for designing interfaces as well, especially since mobile devices and magic lens experiences such as Pokémon Go [Inc] already belong to our everyday life environment.

3 Mobile Tangible Augmented Reality Interface Techniques

This section presents four TAR-based interface concepts. We show how to design these interfaces for specific scenarios of a domain and illustrate this in three examples. Universal patterns from the knowledge communication domain were used as representatives of other basic cases of TAR applications. These patterns are the following (after [HD19]): The show and tell pattern annotates an object to inform a user about its composition. Compare contrasts aspects of two objects. A chronological sequence pattern illustrates a temporal relation.

Figure 2: Conceptual illustration of our four proposed TAR interfaces regarding their application for short pattern-based AR experiences.

3.1 Screen Interface

Using a visual display for interacting with the virtual content is an established concept for mobile AR technology such as a tablet device. The screen interface is closely related to the typical interactions performed on these touch devices: most interaction takes place on the mobile screen itself. Regarding our screen interface for TAR, app interactions are likewise performed solely on the touch screen. To be able to hold a mobile device and simultaneously interact with its touch screen or the tangible, the mobile device can only be held with one hand. The second hand is free for either app or tangible interactions.
Another possibility is to use a stand for the mobile device so that users have both hands free for simultaneous interaction with both the screen and the tangible. We focused on the hand-held setup, as users of mobile devices are already used to holding the device themselves. More detailed information on this aspect can be found in the work by Rau et al. [RHL+20]. For the show and tell pattern, text annotations are displayed on the screen and connected to the real-world object (Fig. 2). In addition, these annotations serve as buttons that can be triggered to show further details or to activate animations of the sub-objects that they annotate. A menu at the side of the mobile TAR application provides system functionality, such as closing the app or switching to another pattern implementation.
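To illustrate how annotation buttons of this kind can be realized, the following sketch (not taken from the paper; the class names, the simple pinhole camera model, and all parameter values are our assumptions) maps a 2D touch on the magic-lens screen to the nearest annotation anchored to a tracked sub-object:

```python
# Hypothetical sketch: screen-space hit test for annotation buttons in a
# magic-lens TAR app. Each annotation is anchored to a 3D point on the tracked
# tangible (given here in camera space); a touch triggers the annotation whose
# projected anchor lies within a pixel radius of the touch position.
from dataclasses import dataclass

@dataclass
class Annotation:
    label: str
    anchor: tuple           # 3D anchor of the annotated sub-object (camera space, meters)
    radius_px: float = 40.0 # touch tolerance in screen pixels

def project(point, fx, fy, cx, cy):
    """Pinhole projection of a camera-space 3D point to screen pixels."""
    x, y, z = point
    return (fx * x / z + cx, fy * y / z + cy)

def hit_test(touch, annotations, fx=800.0, fy=800.0, cx=512.0, cy=384.0):
    """Return the annotation whose projected anchor is closest to the touch,
    provided it lies within that annotation's touch radius; otherwise None."""
    best, best_d = None, float("inf")
    for a in annotations:
        u, v = project(a.anchor, fx, fy, cx, cy)
        d = ((u - touch[0]) ** 2 + (v - touch[1]) ** 2) ** 0.5
        if d < a.radius_px and d < best_d:
            best, best_d = a, d
    return best

# Example: two annotations on a skin model, one centered in view.
annotations = [
    Annotation("epidermis", (0.0, 0.0, 0.5)),
    Annotation("dermis", (0.1, 0.0, 0.5)),
]
tapped = hit_test((512, 384), annotations)  # touch at the screen center
print(tapped.label)  # → epidermis
```

In a real implementation, the anchor positions would come from the tracking system each frame, so the buttons stay attached to the sub-objects as the user moves the magic lens around the tangible.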