User Interfaces for Mobile Augmented Reality Systems

User Interfaces for Mobile Augmented Reality Systems

Tobias Hans Höllerer

Submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy in the Graduate School of Arts and Sciences, Columbia University, 2004. ©2004 Tobias Hans Höllerer. All Rights Reserved.

Abstract

In this dissertation, we present typical components of, useful services associated with, and user interactions possible with mobile augmented reality systems, based on a comprehensive series of hardware and software infrastructures and application prototypes we developed. We define a practical taxonomy of user interface components for such systems and establish a methodology for adaptive mobile augmented reality interfaces that dynamically rearrange themselves in response to changes in user context.

The research contributions to the state of the art in augmented reality begin with the author's participation in the design of the "Columbia Touring Machine" in 1997, the first example of an outdoor mobile augmented reality system, and his lead in developing later prototypes. We develop a series of hardware and software infrastructures for prototyping mobile augmented reality applications that allow multiple users to participate in collaborative tasks taking place indoors and outdoors.

We present exploratory user interfaces for many different applications and user scenarios, including the Situated Documentaries application framework for experiencing spatially distributed hypermedia presentations. Based on these explorations, we develop a taxonomic categorization of mobile augmented reality interface components and their properties. Virtual and real-world objects alike are considered part of the interface. We tag each component with information about its purpose, its intrinsic properties, its relationship to other objects, and its capabilities and flexibility with regard to various manipulations.

Mobile augmented reality has until now faced a significant challenge: the complexity of the augmented views rapidly increases when many virtual objects compete for screen space to annotate physical entities in the dynamic views of multiple fast-paced roaming users. Responding to this, we develop user interface management techniques for mobile augmented reality. A rule-based reasoning architecture uses the taxonomic classification mentioned above to automatically rearrange augmented reality views in dynamic situations, for example to remove clutter in the augmented view of the world or to react to infrastructural context changes, such as variations in tracking accuracy (a minimal illustrative sketch of this kind of rule follows the table of contents below).

Contents

List of Figures
List of Tables
Acknowledgments

Chapter 1  Introduction
  1.1 Mobile Augmented Reality
  1.2 Thesis Overview
    1.2.1 Properties of MARS Interfaces
    1.2.2 MARS UI Management
    1.2.3 Scope of this Research
    1.2.4 Summary of Contributions
      1.2.4.1 System Designs for Outdoor MARS
      1.2.4.2 Situated Documentaries and Other Application Prototypes
      1.2.4.3 Taxonomy of MARS UI Components
      1.2.4.4 MARS UI Management
      1.2.4.5 Rule-based AR System

Chapter 2  Background and Related Work
  2.1 Components of Mobile AR Systems
  2.2 Historical Overview
  2.3 Mobile AR: Applications and Challenges
    2.3.1 Applications
      2.3.1.1 Assembly and Construction
      2.3.1.2 Maintenance and Inspection
      2.3.1.3 Navigation and Path Finding
      2.3.1.4 Tourism
      2.3.1.5 Architecture and Archaeology
      2.3.1.6 Urban Modeling
      2.3.1.7 Geographical Field Work
      2.3.1.8 Journalism
      2.3.1.9 Entertainment
      2.3.1.10 Medicine
      2.3.1.11 Military Training and Combat
      2.3.1.12 Personal Information Management and Marketing
    2.3.2 Challenges
  2.4 Mobile Augmented Reality Systems
    2.4.1 Mobile Computing Platforms
    2.4.2 Displays for Mobile AR
    2.4.3 Tracking and Registration
    2.4.4 Environmental Modeling
    2.4.5 Wearable Input and Interaction Technologies
    2.4.6 Wireless Communication and Data Storage Technologies
  2.5 Existing Mobile AR Systems

Chapter 3  System Designs
  3.1 System Architectures, Hardware
  3.2 System Architectures, Software
    3.2.1 COTERIE-based MARS
      3.2.1.1 Campus Tour Architecture
      3.2.1.2 Situated Documentaries Extensions
    3.2.2 IN-N-OUT AR – an Indoor/Outdoor Java/COTERIE Hybrid Infrastructure
      3.2.2.1 Development Tools
    3.2.3 JABAR – A Java-Based AR Infrastructure
    3.2.4 CGUI Lab AR – A Central Architecture for AR and View Management
    3.2.5 Ruby – Rule-based Control of AR Interfaces
  3.3 Indoor Tracking for MARS
    3.3.1 Wide-Area Indoor Tracking using Dead-Reckoning

Chapter 4  Analytical Foundation of MARS UIs
  4.1 Scope of MARS User Interfaces
  4.2 Taxonomy of MARS UI Components
    4.2.1 MARS Objects
      4.2.1.1 Environment Objects
      4.2.1.2 UI Templates
      4.2.1.3 Container Objects
      4.2.1.4 Special Objects and Examples
    4.2.2 Object Attributes
      4.2.2.1 Type Attributes
      4.2.2.2 State Attributes
      4.2.2.3 Realization Attributes
      4.2.2.4 Semantic Attributes
    4.2.3 MARS Concepts
      4.2.3.1 Output Media
      4.2.3.2 Reference Frames
      4.2.3.3 Relationships
      4.2.3.4 Goals
      4.2.3.5 Purposes
      4.2.3.6 Events
    4.2.4 MARS Interaction
      4.2.4.1 Interaction Tasks
      4.2.4.2 Input Devices
      4.2.4.3 Output Devices

Chapter 5  Implemented MARS UIs
  5.1 Spatially Distributed Hypermedia
    5.1.1 Situated Documentaries I & II
      5.1.1.1 Navigating the Web of Presentations
      5.1.1.2 Multimedia Presentations
      5.1.1.3 Exploratory UI Design
    5.1.2 Situated Documentary III: Interactive Storytelling
    5.1.3 Summary of UI Concepts
  5.2 Collaboration
    5.2.1 Indoor/Outdoor Collaboration
    5.2.2 Hybrid UIs for Augmented Meetings
    5.2.3 Integrated Web Browser and Communication
    5.2.4 Summary of UI Concepts
  5.3 Navigation
    5.3.1 Indoor Navigation UI
    5.3.2 Situational Awareness
      5.3.2.1 Intuitive Control of an AR World in Miniature
      5.3.2.2 Restaurant Guide
    5.3.3 Summary of UI Concepts
  5.4 MARS UI Design, Problems and Solutions
    5.4.1 MARS UI Design: Lessons Learned
    5.4.2 Analysis of UI Alternatives
    5.4.3 Adaptation
      5.4.3.1 UI Management Pipeline
      5.4.3.2 View Management for AR

Chapter 6  Rule-based Architecture for MARS UI Management
  6.1 Ruby: a Rule-Based System Architecture
    6.1.1 Ruby Architecture
    6.1.2 Ruby and Jess
    6.1.3 Ruby Data Formats
      6.1.3.1 Ruby Objects
      6.1.3.2 Internal Attributes
      6.1.3.3 External Attributes, Relationships, Events, and Goals
    6.1.4 Ruby Rules
    6.1.5 Ruby Control Flow
  6.2 Examples
    6.2.1 Exploring Display Modes for Occluded Infrastructure
    6.2.2 Adapting to Mobile User Context
    6.2.3 Adapting to Tracking Accuracy
  6.3 Discussion

Chapter 7  Conclusions and Future Work
  7.1 Summary of Results
  7.2 Future Work
    7.2.1 Environment Management
    7.2.2 Scalable Real-Time Knowledge Processing
    7.2.3 Support for Hierarchical UI Descriptions
    7.2.4 Usability Evaluation
    7.2.5 Adaptivity in Non-AR UIs
    7.2.6 Additional Interaction Modalities

References

Appendix A  Details of Hardware Prototypes 1996–2003
  A.1 Columbia Touring Machine
  A.2 MARS 1999
  A.3 MARS 2000
  A.4 MARS 2001/2002
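To make the rule-based adaptation described in the abstract concrete: the dissertation's Ruby architecture builds on the Jess rule engine, and the Java sketch below is a simplified, hypothetical rendition of the idea rather than the dissertation's actual code. Components carry taxonomy attributes (purpose, priority, screen-fixed vs. world-fixed realization), and two illustrative rules remove label clutter and fall back to screen-stable elements when tracking accuracy degrades. All class and field names, thresholds, and units are assumptions for this example.

    import java.util.ArrayList;
    import java.util.List;

    // Hypothetical sketch of taxonomy-driven MARS UI management
    // (illustrative only; not the Ruby/Jess implementation itself).
    enum Purpose { LABEL, NAVIGATION_AID, WORLD_IN_MINIATURE }

    class MarsComponent {
        String name;
        Purpose purpose;      // taxonomy: what the component is for
        int priority;         // semantic attribute: importance to current task
        boolean screenStable; // realization: screen-fixed vs. world-fixed
        boolean visible = true;

        MarsComponent(String name, Purpose purpose, int priority, boolean screenStable) {
            this.name = name; this.purpose = purpose;
            this.priority = priority; this.screenStable = screenStable;
        }
    }

    class UiManager {
        static final int MAX_VISIBLE_LABELS = 5;      // assumed clutter budget
        static final double MAX_TRACKING_ERROR = 1.0; // assumed threshold, meters

        // Rule 1: when too many labels compete for screen space,
        // keep only the highest-priority ones visible.
        void declutter(List<MarsComponent> components) {
            List<MarsComponent> labels = new ArrayList<>();
            for (MarsComponent c : components)
                if (c.purpose == Purpose.LABEL && c.visible) labels.add(c);
            labels.sort((a, b) -> b.priority - a.priority);
            for (int i = MAX_VISIBLE_LABELS; i < labels.size(); i++)
                labels.get(i).visible = false;
        }

        // Rule 2: when tracking error grows too large, hide world-registered
        // overlays and rely on screen-stable components instead.
        void adaptToTracking(List<MarsComponent> components, double trackingErrorMeters) {
            if (trackingErrorMeters > MAX_TRACKING_ERROR)
                for (MarsComponent c : components)
                    if (!c.screenStable) c.visible = false;
        }
    }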
Recommended publications
  • A Review About Augmented Reality Tools and Developing a Virtual Reality Application Based on Education
    Academic Journal of Science, CD-ROM. ISSN: 2165-6282 :: 03(02):139–146 (2014). A Review About Augmented Reality Tools and Developing a Virtual Reality Application Based on Education. Mustafa Ulas and Safa Merve Tasci, Firat University, Turkey.
    Augmented Reality (AR) is a technology that has gained popularity in recent years. It is defined as the placement of virtual images over a real view in real time. Many desktop applications use augmented reality, and the rapid development of technology and easily portable mobile devices have shifted application development toward mobile platforms and given rise to new tools. There are many AR toolkits, which differ in the methods they use, their programming languages, the operating systems they support, and so on. A developer must first find the most effective toolkit among them, and this study serves as a guide to developers in choosing the best AR toolkit. The toolkits were examined under three main headings, and parameters such as advantages, disadvantages, platform, and programming language were compared. In addition, information is given about their usage, and a virtual reality application based on education was developed.
    Keywords: Augmented reality, ARToolKit, Computer vision, Image processing.
    Introduction: Augmented reality is basically a snapshot of the real environment brought together with virtual environment applications. Basically, it can be operated on every device that has a camera, a display, and an operating system.
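    The definition above (virtual images placed over a real view in real time) is, at its core, a capture–track–render loop, and every toolkit surveyed supplies some version of these three pieces. The Java skeleton below is a hedged, self-contained sketch of that loop; the Camera, Tracker, and Renderer interfaces are hypothetical stand-ins, not the API of ARToolKit or any other kit.

        // Hypothetical stand-ins for what an AR toolkit provides (not a real API).
        interface Camera   { int[] grabFrame(); }                  // the real view
        interface Tracker  { double[] estimatePose(int[] frame); } // e.g., marker detection
        interface Renderer { void drawOverlay(int[] frame, double[] pose); }

        class ArLoop {
            private final Camera camera;
            private final Tracker tracker;
            private final Renderer renderer;

            ArLoop(Camera c, Tracker t, Renderer r) { camera = c; tracker = t; renderer = r; }

            // One iteration: capture the real view, register the virtual
            // content against it, and composite the overlay in real time.
            void runOnce() {
                int[] frame = camera.grabFrame();
                double[] pose = tracker.estimatePose(frame); // null if nothing tracked
                if (pose != null) renderer.drawOverlay(frame, pose);
            }
        }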
  • User Interface Design Issues in HCI
    IJCSNS International Journal of Computer Science and Network Security, VOL.18 No.8, August 2018, p. 153. User Interface Design Issues in HCI. Waleed Iftikhar, Muhammad Sheraz Arshad Malik, Shanza Tariq, Maha Anwar, Jawad Ahmad, M. Saad Sultan. Department of Computing, Riphah International University, Faisalabad, Pakistan.
    Summary: This paper presents an analysis of the literature on user interface design issues published between 1999 and 2018. It discusses the issues related to design and user interfaces, offers solutions for making designs and interfaces more attractive and understandable, and serves as a guideline for solving the main user interface issues. Securing systems is important for modern applications: internet use has been growing quickly for years, and today's fast-moving lifestyle lets users connect to systems from everywhere, so a system whose functionality users ignore is not a secure one.
    The command line is an interface that lets the user interact with the computer by typing commands directly. Its limitation is that the commands are fixed and cannot be changed, and the computer understands only the exact commands; it is also a black-and-white interface. The graphical user interface lets the user interact with the system in a user-friendly, easy way, includes graphics and pictures, and is attractive to all types of users. This style of interface is also known as WIMP because it uses windows, icons, menus, and pointers.
  • Augmented Reality & Virtual Reality Is Now a Reality for Enterprises
    WHITE PAPER: AUGMENTED REALITY & VIRTUAL REALITY IS NOW A REALITY FOR ENTERPRISES – THE FUTURE IS HERE!
    Abstract: Innovation and next-generation technologies have completely changed the way we work, live and possibly even the way we think. AI, Augmented Reality (AR), Virtual Reality (VR), and Blockchain are just some of the technologies that have affected how we consume art, music, and movies, and how we communicate, shop, and travel. We are in the midst of a complete digital revolution. This perspective paper illustrates a point of view on the use of mixed reality (MR) in today's enterprise environment, and covers virtual reality and augmented reality, market trends, industry examples, and challenges within organizations that are adopting mixed reality. In short, it sheds light on what the future is expected to look like in the context of enterprise disruption with MR.
    Introduction: Johnny Mnemonic, The Lawnmower Man, Minority Report, The Matrix, Terminator 2, Iron Man… Besides captivating audiences with their stunning visual effects, these films all have one thing in common - they showcase how MR technologies could potentially be used in the future. Everyone seems to know what VR headsets are, and the popularity of Pokémon Go almost allows omission of a basic introduction to AR. Though they are often used interchangeably, it is essential to clarify that AR and VR are not the same: VR is experienced "…using special electronic equipment, such as a helmet with an internal screen or gloves fitted with sensors." VR can digitally recreate the environment around you, or give you the impression you are somewhere entirely…
  • Designing for Increased Autonomy & Human Control
    IFIP Workshop on Intelligent Vehicle Dependability and Security (IVDS), January 29, 2021. Designing for Increased Autonomy & Human Control. Ben Shneiderman (@benbendc), Founding Director (1983-2000), Human-Computer Interaction Lab; Professor, Department of Computer Science; Member, National Academy of Engineering. Photo: BK Adams.
    What is Human-Centered AI? Human-Centered AI: amplify, augment, enhance & empower people. Human responsibility. Supertools and Active Appliances. Visual interfaces to prevent/reduce [failures]; explanations; audit trails to analyze failures & near misses; independent oversight → reliable, safe & trustworthy.
    Supertools: digital camera controls, navigation choices, texting autocompletion, spelling correction. Active Appliances: coffee maker, rice cooker, blender; dishwasher, clothes washer/dryer; implanted cardiac pacemakers. NASA Mars Rovers are tele-operated. Da Vinci tele-operated surgery: "Robots don't perform surgery. Your surgeon performs surgery with da Vinci by using instruments that he or she guides via a console." (https://www.davincisurgery.com/) Bloomberg Terminal. A 2-D HCAI framework. Designing the User Interface: Balancing Automation & Human Control (First Edition: 1986).
  • Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration
    ORIGINAL RESEARCH published: 14 June 2021, doi: 10.3389/frvir.2021.697367. Eye See What You See: Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration. Allison Jing*, Kieran May, Gun Lee and Mark Billinghurst. Empathic Computing Lab, Australian Research Centre for Interactive and Virtual Environment, STEM, The University of South Australia, Mawson Lakes, SA, Australia.
    Gaze is one of the predominant communication cues and can provide valuable implicit information such as intention or focus when performing collaborative tasks. However, little research has been done on how virtual gaze cues combining spatial and temporal characteristics impact real-life physical tasks during face-to-face collaboration. In this study, we explore the effect of showing joint gaze interaction in an Augmented Reality (AR) interface by evaluating three bi-directional collaborative (BDC) gaze visualisations with three levels of gaze behaviours. Using three independent tasks, we found that all BDC visualisations are rated significantly better at representing joint attention and user intention compared to a non-collaborative (NC) condition, and hence are considered more engaging. The Laser Eye condition, spatially embodied with gaze direction, is perceived as significantly more effective, as it encourages mutual gaze awareness with relatively low mental effort in a less constrained workspace. In addition, by offering an additional virtual representation that compensates for verbal descriptions and hand pointing, BDC gaze visualisations can encourage more conscious use of gaze cues coupled with deictic references during co-located symmetric collaboration.
    Edited by: Parinya Punpongsanon, Osaka University, Japan. Reviewed by: Naoya Isoyama, Nara Institute of Science and Technology (NAIST), Japan; Thuong Hoang, Deakin University, Australia.
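    The joint-attention effect measured above rests on a simple geometric test that is easy to sketch. The snippet below is an illustrative assumption, not the paper's implementation: two collaborators' 3D gaze points, expressed in a shared AR coordinate frame, count as joint attention when they converge within a small radius.

        // Illustrative sketch (not the paper's code): joint attention as
        // proximity of two collaborators' 3D gaze points {x, y, z}.
        class JointAttention {
            static boolean isJoint(double[] gazeA, double[] gazeB, double radiusMeters) {
                double dx = gazeA[0] - gazeB[0];
                double dy = gazeA[1] - gazeB[1];
                double dz = gazeA[2] - gazeB[2];
                return Math.sqrt(dx * dx + dy * dy + dz * dz) < radiusMeters;
            }
        }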
  • Evaluating Performance Benefits of Head Tracking in Modern Video Games
    Evaluating Performance Benefits of Head Tracking in Modern Video Games. Arun Kulshreshth and Joseph J. LaViola Jr., Department of EECS, University of Central Florida, 4000 Central Florida Blvd, Orlando, FL 32816, USA. [email protected], [email protected]
    ABSTRACT: We present a study that investigates user performance benefits of using head tracking in modern video games. We explored four different carefully chosen commercial games with tasks which can potentially benefit from head tracking. For each game, quantitative and qualitative measures were taken to determine if users performed better and learned faster in the experimental group (with head tracking) than in the control group (without head tracking). A game expertise pre-questionnaire was used to classify participants into casual and expert categories to analyze a possible impact on performance differences.
    …(PlayStation Move, TrackIR 5) that support 3D spatial interaction have been implemented and made available to consumers. Head tracking is one example of an interaction technique, commonly used in the virtual and augmented reality communities [2, 7, 9], that has potential to be a useful approach for controlling certain gaming tasks. Recent work on head tracking and video games has shown some potential for this type of gaming interface. For example, Sko et al. [10] proposed a taxonomy of head gestures for first-person shooter (FPS) games and showed that some of their techniques (peering, zooming, iron-sighting and spinning) are useful in games. In addition, previous studies [13, 14] have…
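    One of the head-gesture techniques named above, peering, amplifies small head rotations into view offsets so a player can glance around cover without turning the whole camera. The Java sketch below illustrates that mapping under stated assumptions; the gain and clamp values are invented for the example and are not taken from the paper.

        // Hedged sketch of a peering-style head-to-camera mapping.
        class HeadTrackedCamera {
            static final double YAW_GAIN = 2.5;   // assumed amplification factors
            static final double PITCH_GAIN = 2.0;

            double cameraYawOffset, cameraPitchOffset; // added on top of stick/mouse aim

            void update(double headYawRadians, double headPitchRadians) {
                cameraYawOffset = clamp(headYawRadians * YAW_GAIN, -0.6, 0.6);
                cameraPitchOffset = clamp(headPitchRadians * PITCH_GAIN, -0.4, 0.4);
            }

            static double clamp(double v, double lo, double hi) {
                return Math.max(lo, Math.min(hi, v));
            }
        }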
  • Exploring the User Interface Affordances of File Sharing
    CHI 2006 Proceedings • Activity: Design Implications • April 22-27, 2006 • Montréal, Québec, Canada. Share and Share Alike: Exploring the User Interface Affordances of File Sharing. Stephen Voida (1), W. Keith Edwards (1), Mark W. Newman (2), Rebecca E. Grinter (1), Nicolas Ducheneaut (2). (1) GVU Center, College of Computing, Georgia Institute of Technology, 85 5th Street NW, Atlanta, GA 30332–0760, USA; {svoida, keith, beki}@cc.gatech.edu. (2) Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, CA 94304, USA; {mnewman, nicolas}@parc.com.
    ABSTRACT: With the rapid growth of personal computer networks and the Internet, sharing files has become a central activity in computer use. The ways in which users control the what, how, and with whom of sharing are dictated by the tools they use for sharing; there are a wide range of sharing practices, and hence a wide range of tools to support these practices. In practice, users' requirements for certain sharing features may dictate their choice of tool, even though the other affordances available through that tool may not be an ideal match to the desired manner of sharing.
    …and folders shared with other users on a single computer, often as a default system behavior; files shared with other computers over an intranet or home network; and files shared with other users around the world on web sites and FTP servers. Users also commonly exchange copies of documents as email attachments, transfer files during instant messaging sessions, post digital photos to online photo album services, and swap music files using peer-to-peer file sharing applications. Despite these numerous venues for and implementations of…
  • Virtual and Augmented Reality
    Virtual and Augmented Reality: An Educational Handbook. By Zeynep Tacgin. This book first published 2020 by Cambridge Scholars Publishing, Lady Stephenson Library, Newcastle upon Tyne, NE6 2PA, UK. British Library Cataloguing in Publication Data: a catalogue record for this book is available from the British Library. Copyright © 2020 by Zeynep Tacgin. All rights for this book reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner. ISBN (10): 1-5275-4813-9; ISBN (13): 978-1-5275-4813-8.
    Table of Contents: List of Illustrations; List of Tables; Preface (What is this book about? What is this book not about? Who is this book for? How is this book used? The specific contribution of this book); Acknowledgements…
  • User Interface for Volume Rendering in Virtual Reality Environments
    User Interface for Volume Rendering in Virtual Reality Environments. Jonathan Klein, Dennis Reuling, Jan Grimm, Andreas Pfau, Damien Lefloch, Martin Lambers, Andreas Kolb. Computer Graphics Group, University of Siegen.
    ABSTRACT: Volume rendering applications require sophisticated user interaction for the definition and refinement of transfer functions. Traditional 2D desktop user interface elements have been developed to solve this task, but such concepts do not map well to the interaction devices available in Virtual Reality environments. …take the gradient or the curvature at the voxel location into account and require even more complex user interfaces. Since Virtual Environments are especially well suited to explore spatial properties of complex 3D data, bringing volume rendering applications into such environments is a natural step. However, defining new user interfaces suitable both for the Virtual Environment and for the volume rendering application is difficult. Previous approaches mainly focused on porting traditional 2D point-and-click concepts to the Virtual Environment [8, 5, 9]. This tends to be unintuitive, to complicate the interaction, and to make only limited use of available interaction devices.
    In this paper, we propose an intuitive user interface for volume rendering specifically designed for Virtual Reality environments. The proposed interface allows transfer function design and refinement based on intuitive two-handed operation of Wand-like controllers. Additional interaction modes such as navigation and clip plane manipulation are supported as well. The system is implemented using the Sony PlayStation Move…
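    The object being edited here, a transfer function, is worth making concrete: it maps a scalar voxel value (e.g., CT density) to color and opacity, and its control points are exactly what a two-handed wand UI would let users drag. The Java sketch below is a minimal 1D version under assumed conventions (positions normalized to [0, 1], piecewise-linear interpolation); it is not the paper's implementation.

        // Minimal 1D transfer function: scalar value -> interpolated RGBA.
        class TransferFunction {
            private final double[] positions; // sorted control-point positions in [0, 1]
            private final double[][] rgba;    // parallel array of {r, g, b, a} values

            TransferFunction(double[] positions, double[][] rgba) {
                this.positions = positions;
                this.rgba = rgba;
            }

            double[] lookup(double v) {
                if (v <= positions[0]) return rgba[0];
                for (int i = 1; i < positions.length; i++) {
                    if (v <= positions[i]) { // linear blend between neighboring points
                        double t = (v - positions[i - 1]) / (positions[i] - positions[i - 1]);
                        double[] lo = rgba[i - 1], hi = rgba[i], out = new double[4];
                        for (int c = 0; c < 4; c++) out[c] = lo[c] + t * (hi[c] - lo[c]);
                        return out;
                    }
                }
                return rgba[rgba.length - 1];
            }
        }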
  • Effective User Interface Design for Consumer Trust: Two Case Studies
    2005:097 SHU MASTER'S THESIS. Effective User Interface Design for Consumer Trust: Two Case Studies. Xiling Zhou and Xiangchun Liu. Luleå University of Technology, MSc Programme in Electronic Commerce, Department of Business Administration and Social Sciences, Division of Industrial Marketing and E-Commerce. 2005:097 SHU - ISSN: 1404-5508 - ISRN: LTU-SHU-EX--05/097--SE.
    ACKNOWLEDGEMENT: This thesis is the result of half a year of work during which we have been accompanied and supported by many people. It is a pleasure to have this opportunity to express our gratitude to all of them. First, we are deeply indebted to our supervisor, Prof. Lennart Persson of the Division of Industrial Marketing at LTU. He helped us with stimulating suggestions and encouragement throughout the research and writing of this thesis; without his never-ending support during this process, we could not have completed it. We would especially like to express our gratitude to all of the participants, who spent their valuable time responding to the interview questions and discussing with us. Finally, we would like to thank our family and friends. I, Zhou Xiling, am very grateful to everyone who gave me support and encouragement during this process. I feel a deep sense of gratitude to my father and mother, who formed part of my vision and taught me the good things that really matter in life. I also want to thank my friend Tang Yu for his never-ending support and good advice. I, Liu Xiangchun, am very grateful to my parents for their endless love and support.
  • Augmented Reality, Virtual Reality, & Health
    University of Massachusetts Medical School, eScholarship@UMMS. National Network of Libraries of Medicine New England Region (NNLM NER) Repository. 2017-3. Augmented Reality, Virtual Reality, & Health. Allison K. Herrera, University of Massachusetts Medical School, et al. Let us know how access to this document benefits you. Follow this and additional works at: https://escholarship.umassmed.edu/ner. Part of the Health Information Technology Commons, Library and Information Science Commons, and the Public Health Commons.
    Repository Citation: Herrera AK, Mathews FZ, Gugliucci MR, Bustillos C. (2017). Augmented Reality, Virtual Reality, & Health. National Network of Libraries of Medicine New England Region (NNLM NER) Repository. https://doi.org/10.13028/1pwx-hc92. Retrieved from https://escholarship.umassmed.edu/ner/42.
    Creative Commons License: This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License. This material is brought to you by eScholarship@UMMS. It has been accepted for inclusion in the NNLM NER Repository by an authorized administrator of eScholarship@UMMS. For more information, please contact [email protected].
    Presenters: Zeb Mathews, University of Tennessee; Corina Bustillos, Texas Tech University; Allison Herrera, University of Massachusetts Medical School; Marilyn Gugliucci, University of New England.
    Outline: Learning Objectives; Introduction & Overview; Augmented Reality & Health; Virtual Reality & Health; Technology; Funding Opportunities; University of New England VR Project. Objectives: explore AR & VR technologies and their impact on health sciences, with examples of projects & research; know how to apply for funding for your own AR/VR health project; learn about one VR project funded by the NNLM. Augmented Reality and Virtual Reality (AR/VR) & Health: What is AR and VR?…
  • Natural Interaction in Augmented Reality Context
    Natural Interaction in Augmented Reality Context. John Aliprantis (1), Markos Konstantakis (1), Rozalia Nikopoulou (2), Phivos Mylonas (2) and George Caridakis (1). (1) University of the Aegean, 81100 Mytilene, Greece; {jalip, mkonstadakis, gcari}@aegean.gr. (2) Ionian University, 49100 Corfu, Greece; [email protected], [email protected]
    Abstract. In recent years, immersive technologies like Virtual and Augmented Reality have been accelerating at an incredible pace, building innovative experiences and developing new interaction paradigms. Current research has widely explored gesture interaction with Augmented Reality interfaces, but it usually requires users to manipulate input devices that can be cumbersome and obtrusive, preventing them from interacting efficiently with the 3D environment. Therefore, Natural User Interfaces and freehand gesture interaction are becoming more and more popular, improving the user's engagement and sense of presence and providing more stimulating, user-friendly and non-obtrusive interaction methods. However, researchers argue about the impact of interaction fidelity on usability and user satisfaction, questioning the level of naturalness that should characterize interaction metaphors. This paper proposes different gesture recognition techniques for three basic interaction categories (translation, rotation and scaling) in a Leap Motion Controller – Augmented Reality framework. A prototype is implemented in order to evaluate the efficiency and usability of the proposed architecture. Finally, experimental results are discussed.
    Keywords: Natural interaction, Augmented reality, Leap motion controller, Gesture recognition.
    1 Introduction. Over the last few years, Augmented Reality (AR) has developed into a cutting-edge technology, providing new ways to interact with computer-generated information. By removing the boundaries between physical and virtual, AR has been able to create more engaging experiences, enhancing the user's enjoyment and satisfaction.
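    The three interaction categories named above (translation, rotation and scaling) can be told apart from frame-to-frame hand data with simple heuristics. The Java sketch below shows one such discrimination under assumed thresholds; the HandFrame fields and cutoff values are illustrative, not the paper's recognizer or the Leap Motion API.

        // Hedged sketch: classify a frame-to-frame hand change as one of
        // the three manipulation gestures discussed in the paper.
        enum Gesture { TRANSLATE, ROTATE, SCALE, NONE }

        class HandFrame {
            double palmX, palmY, palmZ; // palm position in millimeters
            double rollRadians;         // hand roll angle
            double pinchDistance;       // thumb-to-index distance in millimeters
        }

        class GestureClassifier {
            static Gesture classify(HandFrame prev, HandFrame cur) {
                double dx = cur.palmX - prev.palmX;
                double dy = cur.palmY - prev.palmY;
                double dz = cur.palmZ - prev.palmZ;
                double move = Math.sqrt(dx * dx + dy * dy + dz * dz);
                double twist = Math.abs(cur.rollRadians - prev.rollRadians);
                double pinch = Math.abs(cur.pinchDistance - prev.pinchDistance);

                if (pinch > 5.0)  return Gesture.SCALE;     // fingers spreading/closing
                if (twist > 0.15) return Gesture.ROTATE;    // wrist roll dominates
                if (move > 10.0)  return Gesture.TRANSLATE; // palm displacement
                return Gesture.NONE;
            }
        }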