Wearable Haptic Technology for Basic 3D Interaction


Wearable Haptic Technology for 3D Selection and Guidance

Dissertation submitted to Universität Hamburg by Oscar Ariza
Human-Computer Interaction, Fachbereich Informatik
Fakultät für Mathematik, Informatik und Naturwissenschaften, 2020

First reviewer: Prof. Dr. Frank Steinicke
Second reviewer: Dr. Anatole Lécuyer
Chair of the examination committee: Prof. Dr. Timo Gerkmann
Deputy chair of the examination committee: Prof. Dr. Matthias Rarey
Date of the defense: 19 March 2021

Dedicated to "la cambia mundos" (the world-changer)

Abstract

The integration of wearable haptic technology into Virtual Environments (VEs) has enormous potential to improve user performance and accuracy during 3D selection and manipulation tasks, as well as to provide proximity-based guidance during aiming and navigation tasks. However, a lack of haptic feedback is commonly reported in VEs, combined with insufficient alternatives for incorporating tactile, kinesthetic, and other feedback modalities into 3D user interfaces (3DUIs). Hence, there is a need to develop and evaluate novel haptic technology that circumvents these interaction problems and improves usability, reducing the performance gap between traditional interfaces and 3DUIs.

This thesis's primary goal is to improve the usability and performance of basic 3D interaction tasks involving selection and guidance in VEs by integrating and evaluating wearable haptic technology and hand tracking. First, we explore the use of proximity-based multimodal cues supporting 3D selection, specifically for motions involving ballistic and correction phases during common approaching movements of the human arm. We present multimodal combinations of tactile and audio-visual cues that improve performance or reduce errors during 3D object selection. Additionally, we show how feedback combinations can affect the user's selection movements and depth under-/overestimation.
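To make the idea of proximity-based cues concrete, the following sketch (our illustration, not the thesis's actual firmware; the function name, thresholds, and the speed-based phase heuristic are assumptions) maps hand-target distance and hand speed to a vibrotactile amplitude and pulse rate, switching from coarse pulses during the ballistic phase to finer, proximity-scaled feedback during the correction phase:

```python
def proximity_cue(distance_m, speed_m_s, reach_m=0.6, ballistic_speed=0.5):
    """Map hand-target distance to a vibrotactile cue.

    Returns (amplitude 0..1, pulse_rate_hz). All thresholds are
    illustrative, not values from the thesis.
    """
    if distance_m >= reach_m:
        return 0.0, 0.0                      # target out of range: no cue
    proximity = 1.0 - distance_m / reach_m   # 0 when far, 1 at the target
    if speed_m_s > ballistic_speed:
        # ballistic phase: coarse cue at a fixed low pulse rate
        return 0.3 + 0.2 * proximity, 4.0
    # correction phase: amplitude and pulse rate both grow with proximity
    return 0.3 + 0.7 * proximity, 4.0 + 26.0 * proximity
```

A controller loop would call this once per tracking frame and forward the resulting pair to the motor driver of the wearable actuator.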
Second, we present the use of perceptual illusions supported by vibrotactile feedback to reproduce the elongated-arm illusion, enabling the user to manipulate out-of-reach 3D objects while interacting in place. Our results support a persistent illusion of body transfer after brief phases of automatic and synchronized visual-haptic stimulation, instead of the traditional approach based on manual stimulation. Additionally, we present multimodal combinations of haptic feedback to create a plausible touch illusion. Our approach relies on combining different forms of haptic feedback (i.e., kinesthetic, pseudo-haptic, and tactile) to convey sensations of contact, stiffness, and activation in 3DUIs.

Third, we explore vibrotactile technology supporting 3D navigation tasks in VEs. We propose wireless, wearable devices attached to both hemispheres of the user's head that deliver assistive vibrotactile cues for guidance, reducing the time needed to turn toward and locate a target object. Moreover, we evaluate vibrotactile feedback embedded in shoe soles and handheld devices with respect to the sense of presence, user acceptance, and obstacle detection in VEs.

Finally, beyond the studies conducted, we developed and evaluated haptic technology featuring lightweight, unencumbering, and versatile form factors compatible with natural interaction and hand-tracking technologies. We append all designs, specifications, embedded code, firmware, and resources to enable practitioners to replicate the conducted experiments.

Zusammenfassung

The integration of wearable haptic technologies into virtual environments (VEs) offers enormous potential (i) to improve user performance and accuracy when executing 3D selection and manipulation tasks, and (ii) to provide proximity-based guidance during aiming and navigation tasks.
Nevertheless, haptic feedback is rarely employed in VEs, which is due, among other things, to insufficient options for incorporating tactile, kinesthetic, and other modalities into 3D user interfaces (3D UIs). For this reason, there is a need for novel haptic technologies that circumvent these interaction problems and improve usability, thereby narrowing the performance gap between conventional interfaces and 3D UIs. The primary goal of this thesis is to improve basic 3D interaction tasks such as selection and guidance in VEs in terms of usability and throughput by integrating and evaluating wearable haptic technologies and hand tracking.

First, we investigate the use of proximity-based multimodal cues that support 3D selection. In particular, we focus on typical arm movements during the approach to target objects, which comprise ballistic and corrective phases. We present multimodal combinations of tactile and audio-visual cues that improve performance or reduce errors when selecting 3D objects. In addition, we show how feedback combinations can influence the user's selection movements and the under-/overestimation of distance.

Subsequently, we present perceptual illusions supported by vibrotactile feedback that reproduce the impression of an elongated arm. In this way, the user can manipulate out-of-reach 3D objects while interacting in place. Our study results indicate that a persistent body-transfer illusion can be achieved through brief phases of automatic and synchronized visual-haptic stimulation, in contrast to the traditional approach based on manual stimulation.
In addition, we present multimodal combinations of haptic feedback to create a plausible touch illusion. Our approach relies on combining different forms of haptic feedback (kinesthetic, pseudo-haptic, and tactile) to convey sensations of contact, stiffness, and activation for 3D UIs.

Finally, we investigate vibrotactile technologies that support 3D navigation tasks in VEs. We propose wireless, head-worn devices that provide assistive vibrotactile cues, reducing the time needed to locate a target object and perform the corresponding head turn. Furthermore, we evaluate vibrotactile systems embedded in shoe soles and handheld devices with respect to presence, user acceptance, and obstacle detection in VEs.

In addition to the studies conducted, we developed and evaluated a haptic interaction device. It is versatile and, owing to its low weight and high ergonomics, compatible with natural interaction and hand-tracking technologies. All designs, specifications, and resources, as well as embedded code and firmware, are appended to this thesis to enable replication of the experiments.

Contents

1. Introduction
   1.1. Motivation
   1.2. Scenarios
   1.3. Research Goals
   1.4. Outline
   1.5. Publications
        1.5.1. Main Authorship
        1.5.2. Co-Authorship

Part I. Fundamentals

2. Human Factors
   2.1. Tactile Modality
        2.1.1. Mechanoreceptors
   2.2. Kinesthetic Modality
        2.2.1. Perception-action Loop
   2.3. Visual Modality
        2.3.1. Field of View
        2.3.2. Depth Estimation
   2.4. Ergonomy
   2.5. Psychophysics

3. Haptics and 3DUI
   3.1. Haptic Technology
        3.1.1. Types of Interactive Haptic Devices
        3.1.2. Graspable Systems
        3.1.3. Touchable Systems
        3.1.4. Wearable and Contactless Systems
   3.2. Visuo-haptic Illusions
   3.3. Multimodal Feedback
   3.4. 3D Interaction
        3.4.1. Hand Interaction
        3.4.2. 3D Direct Selection
        3.4.3. 3D Guidance
   3.5. Conclusion

Part II. Support for 3D Selection

4. Proximity-based Patterns
   4.1. Motivation
   4.2. Ring-shaped Haptic Device
   4.3. Vibrotactile Patterns
   4.4. Usability Study
        4.4.1. Participants
        4.4.2. Materials
        4.4.3. Methods
        4.4.4. Results
   4.5. Discussion
   4.6. Conclusion

5. Use of Multimodal Feedback
   5.1.
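The abstract describes head-worn devices that cue the hemisphere toward a target to shorten search and turning time. As a flavor of what such guidance logic computes, here is an illustrative sketch under our own assumptions (the thesis's actual firmware is in its appendix; this function name, dead zone, and intensity mapping are hypothetical): the left/right actuator choice follows the signed shortest-path yaw error between the current heading and the target bearing.

```python
def guidance_cue(head_yaw_deg, target_bearing_deg, deadzone_deg=5.0):
    """Pick the head-mounted actuator to drive and its intensity.

    Returns (side, intensity): side is "left", "right", or None, and
    intensity grows with angular error, normalized to 1.0 at 180 degrees.
    All parameter names and thresholds are illustrative.
    """
    # signed shortest-path angle from the current heading to the target
    error = (target_bearing_deg - head_yaw_deg + 180.0) % 360.0 - 180.0
    if abs(error) <= deadzone_deg:
        return None, 0.0                 # already facing the target: no cue
    side = "right" if error > 0 else "left"
    return side, min(abs(error) / 180.0, 1.0)
```

The modulo wrap-around keeps the cue correct across the +/-180 degree boundary, so a target slightly to the user's left is never signaled as a long rightward turn.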