User Interfaces for Mobile Augmented Reality Systems
Total pages: 16. File type: PDF, size: 1,020 KB
Recommended publications
-
A Review About Augmented Reality Tools and Developing a Virtual Reality Application
Academic Journal of Science, CD-ROM. ISSN: 2165-6282, 03(02):139-146 (2014). A Review About Augmented Reality Tools and Developing a Virtual Reality Application Based on Education. Mustafa Ulas and Safa Merve Tasci, Firat University, Turkey. Augmented Reality (AR) is a technology that has gained popularity in recent years. It is defined as the placement of virtual images over a real view in real time. There are many desktop applications that use Augmented Reality. Rapid technological development and easily portable mobile devices have increased the development of applications for mobile devices, and advances in device technology have led to new applications and new tools. There are many AR tool kits, and they differ in many ways, such as the methods used, the programming language, and the supported operating systems. A developer must first find the most effective tool kit among them, and this study is a guide to help developers make that choice. The tool kits are examined under three main headings, and parameters such as advantages, disadvantages, platform, and programming language are compared. In addition, information is given about their usage, and a Virtual Reality application based on education has been developed. Keywords: Augmented reality, ARToolKit, Computer vision, Image processing. Introduction: Augmented reality is basically a snapshot of the real environment with virtual environment applications brought together. Basically, it can be operated on every device that has a camera, a display, and an operating system. -
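As context for the excerpt above, which defines AR as placing virtual images over a real view in real time and names ARToolKit-style marker tracking, the sketch below is a rough, hedged illustration of that basic pipeline. It is not code from the paper; it assumes the opencv-contrib-python package (ArUco module, legacy function API) and an arbitrary 4x4 marker dictionary, and it simply anchors a text label to any detected marker.

```python
# Minimal marker-based AR overlay sketch (illustrative only, not from the paper).
# Assumes opencv-contrib-python with the legacy cv2.aruco function API (OpenCV <= 4.6).
import cv2

dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
cap = cv2.VideoCapture(0)  # default webcam

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None:
        # Outline each detected marker and pin a placeholder "virtual object" label to it.
        cv2.aruco.drawDetectedMarkers(frame, corners, ids)
        for marker in corners:
            x, y = marker[0][0]  # top-left corner of this marker
            cv2.putText(frame, "virtual object", (int(x), int(y) - 10),
                        cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
    cv2.imshow("AR overlay", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```

A full AR tool kit would additionally estimate the marker's 3D pose and render a registered model instead of a flat label, which is exactly the kind of capability the surveyed tool kits differ on.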
User Interface Design Issues in HCI
IJCSNS International Journal of Computer Science and Network Security, VOL.18 No.8, August 2018. User Interface Design Issues in HCI. Waleed Iftikhar, Muhammad Sheraz Arshad Malik, Shanza Tariq, Maha Anwar, Jawad Ahmad, M. Saad Sultan. Department of Computing, Riphah International University, Faisalabad, Pakistan. Summary: This paper presents an analysis of a literature review covering user interface design issues from 1999 to 2018. The study discusses the issues related to design and user interfaces, offers solutions for making designs and user interfaces more attractive and understandable, and serves as a guideline for addressing the main issues of user interface design. Securing systems is important for modern applications: internet use has grown rapidly over the years, and today's mobile lifestyle lets users connect to systems from everywhere, yet a system whose security functionality users ignore is not secure. The command line is an interface that lets the user interact with the computer by typing commands directly; its limitation is that the commands are fixed and cannot be changed, and the computer understands only exact commands. The graphical user interface lets the user interact with the system through graphics and pictures; it is user friendly, easy to use, and attractive to all types of users, whereas the command line is a black-and-white interface. The graphical interface is also known as WIMP because it uses windows, icons, menus, and pointers. -
Augmented Reality & Virtual Reality Is Now a Reality for Enterprises
WHITE PAPER: Augmented Reality & Virtual Reality Is Now a Reality for Enterprises - The Future Is Here! Abstract: Innovation and next-generation technologies have completely changed the way we work, live, and possibly even the way we think. AI, Augmented Reality (AR), Virtual Reality (VR), and Blockchain are just some of the technologies that have affected how we consume art, music, and movies, and how we communicate, shop, and travel. We are in the midst of a complete digital revolution. This perspective paper illustrates a point of view on the use of mixed reality (MR) in today's enterprise environment and covers virtual reality and augmented reality, market trends, industry examples, and challenges within organizations that are adopting mixed reality. In short, it sheds light on what the future is expected to look like in the context of enterprise disruption with MR. Introduction: Johnny Mnemonic, The Lawnmower Man, Minority Report, The Matrix, Terminator 2, Iron Man… Besides captivating audiences with their stunning visual effects, these films all have one thing in common: they showcase how MR technologies could potentially be used in the future. Everyone seems to know what VR headsets are, and the popularity of Pokémon Go almost allows omission of a basic introduction to AR. Though they are often used interchangeably, it is essential to clarify that AR and VR are not the same. VR is experienced "using special electronic equipment, such as a helmet with an internal screen or gloves fitted with sensors"; it can digitally recreate the environment around you, or give you the impression you are somewhere entirely different. -
Designing for Increased Autonomy & Human Control
IFIP Workshop on Intelligent Vehicle Dependability and Security (IVDS), January 29, 2021. Designing for Increased Autonomy & Human Control. Ben Shneiderman (@benbendc), Founding Director (1983-2000), Human-Computer Interaction Lab; Professor, Department of Computer Science; Member, National Academy of Engineering. Photo: BK Adams. Slide topics: What is Human-Centered AI? Human-Centered AI: Amplify, Augment, Enhance & Empower People; Human Responsibility; Supertools and Active Appliances; Visual Interfaces to Prevent/Reduce; Explanations; Audit Trails to Analyze Failures & Near Misses; Independent Oversight → Reliable, Safe & Trustworthy. Supertools: digital camera controls, navigation choices, texting autocompletion, spelling correction. Active Appliances: coffee maker, rice cooker, blender, dishwasher, clothes washer/dryer, implanted cardiac pacemakers. NASA Mars Rovers are tele-operated. Da Vinci tele-operated surgery: "Robots don't perform surgery. Your surgeon performs surgery with da Vinci by using instruments that he or she guides via a console." (https://www.davincisurgery.com/) Bloomberg Terminal. A 2-D HCAI Framework. Designing the User Interface: Balancing Automation & Human Control (First Edition: 1986). -
Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration
ORIGINAL RESEARCH, published 14 June 2021, doi: 10.3389/frvir.2021.697367. Eye See What You See: Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration. Allison Jing, Kieran May, Gun Lee and Mark Billinghurst. Empathic Computing Lab, Australian Research Centre for Interactive and Virtual Environment, STEM, The University of South Australia, Mawson Lakes, SA, Australia. Gaze is one of the predominant communication cues and can provide valuable implicit information such as intention or focus when performing collaborative tasks. However, little research has been done on how virtual gaze cues combining spatial and temporal characteristics impact real-life physical tasks during face-to-face collaboration. In this study, we explore the effect of showing joint gaze interaction in an Augmented Reality (AR) interface by evaluating three bi-directional collaborative (BDC) gaze visualisations with three levels of gaze behaviours. Using three independent tasks, we found that all BDC visualisations are rated significantly better at representing joint attention and user intention compared to a non-collaborative (NC) condition, and hence are considered more engaging. The Laser Eye condition, spatially embodied with gaze direction, is perceived as significantly more effective since it encourages mutual gaze awareness with relatively low mental effort in a less constrained workspace. In addition, by offering an additional virtual representation that compensates for verbal descriptions and hand pointing, BDC gaze visualisations can encourage more conscious use of gaze cues coupled with deictic references during co-located symmetric collaboration. -
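To make the idea of joint gaze more concrete: the study visualises both collaborators' gaze so they can tell when their attention converges. A minimal sketch of that notion (my own illustration with made-up coordinates and threshold, not the authors' implementation) casts each user's gaze as a ray and flags joint attention when both rays pass close to the same target:

```python
# Illustrative joint-attention check from two gaze rays (not the paper's code).
import numpy as np

def point_to_ray_distance(point, origin, direction):
    """Shortest distance from a 3D point to a ray defined by origin + t*direction, t >= 0."""
    d = direction / np.linalg.norm(direction)
    v = point - origin
    t = max(np.dot(v, d), 0.0)            # clamp so the ray does not extend backwards
    closest = origin + t * d
    return np.linalg.norm(point - closest)

def joint_attention(target, gaze_a, gaze_b, threshold=0.05):
    """True if both users' gaze rays pass within `threshold` metres of the target."""
    return (point_to_ray_distance(target, *gaze_a) < threshold and
            point_to_ray_distance(target, *gaze_b) < threshold)

# Example: both users look at an object on the table at (0.5, 0.0, 1.0).
target = np.array([0.5, 0.0, 1.0])
user_a = (np.array([0.0, 0.3, 0.0]), np.array([0.5, -0.3, 1.0]))   # (eye position, gaze direction)
user_b = (np.array([1.0, 0.3, 0.0]), np.array([-0.5, -0.3, 1.0]))
print(joint_attention(target, user_a, user_b))   # True -> render a shared highlight
```

An AR visualisation such as the Laser Eye condition described above could then render a shared highlight or ray whenever a check of this kind succeeds.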
Evaluating Performance Benefits of Head Tracking in Modern Video Games
Evaluating Performance Benefits of Head Tracking in Modern Video Games. Arun Kulshreshth and Joseph J. LaViola Jr., Department of EECS, University of Central Florida, 4000 Central Florida Blvd, Orlando, FL 32816, USA. [email protected], [email protected]. ABSTRACT: We present a study that investigates user performance benefits of using head tracking in modern video games. We explored four different carefully chosen commercial games with tasks which can potentially benefit from head tracking. For each game, quantitative and qualitative measures were taken to determine if users performed better and learned faster in the experimental group (with head tracking) than in the control group (without head tracking). A game expertise pre-questionnaire was used to classify participants into casual and expert categories to analyze a possible impact on performance differences. …PlayStation Move, TrackIR 5) that support 3D spatial interaction have been implemented and made available to consumers. Head tracking is one example of an interaction technique, commonly used in the virtual and augmented reality communities [2, 7, 9], that has potential to be a useful approach for controlling certain gaming tasks. Recent work on head tracking and video games has shown some potential for this type of gaming interface. For example, Sko et al. [10] proposed a taxonomy of head gestures for first person shooter (FPS) games and showed that some of their techniques (peering, zooming, iron-sighting and spinning) are useful in games. In addition, previous studies [13, 14] have -
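One of the head-gesture techniques cited in this excerpt, peering, essentially maps small head translations and rotations onto a camera offset in the game. The sketch below is a hedged illustration of such a mapping; the gain and clamp values are invented tuning parameters, not figures from the study.

```python
# Illustrative mapping from a tracked head pose to an in-game camera "peek" offset.
# Not code from the study; gains and limits are made-up tuning parameters.
from dataclasses import dataclass

@dataclass
class HeadPose:
    x: float      # lateral head translation in metres (positive = right)
    y: float      # vertical head translation in metres (positive = up)
    yaw: float    # head rotation in degrees (positive = looking right)

def peek_offset(pose: HeadPose, translation_gain=2.0, yaw_gain=0.5, max_offset=0.6):
    """Scale head motion into a clamped camera offset so small, comfortable
    head movements produce a useful lean/peer in the game world."""
    def clamp(v):
        return max(-max_offset, min(max_offset, v))
    cam_x = clamp(pose.x * translation_gain)
    cam_y = clamp(pose.y * translation_gain)
    cam_yaw = pose.yaw * yaw_gain
    return cam_x, cam_y, cam_yaw

# Leaning 10 cm to the right while turning the head 20 degrees.
print(peek_offset(HeadPose(x=0.10, y=0.0, yaw=20.0)))   # -> (0.2, 0.0, 10.0)
```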
Exploring the User Interface Affordances of File Sharing
CHI 2006 Proceedings (Activity: Design Implications), April 22-27, 2006, Montréal, Québec, Canada. Share and Share Alike: Exploring the User Interface Affordances of File Sharing. Stephen Voida, W. Keith Edwards, Mark W. Newman, Rebecca E. Grinter, Nicolas Ducheneaut. GVU Center, College of Computing, Georgia Institute of Technology, 85 5th Street NW, Atlanta, GA 30332-0760, USA ({svoida, keith, beki}@cc.gatech.edu); Palo Alto Research Center, 3333 Coyote Hill Road, Palo Alto, CA 94304, USA ({mnewman, nicolas}@parc.com). ABSTRACT: With the rapid growth of personal computer networks and the Internet, sharing files has become a central activity in computer use. The ways in which users control the what, how, and with whom of sharing are dictated by the tools they use for sharing; there are a wide range of sharing practices, and hence a wide range of tools to support these practices. In practice, users' requirements for certain sharing features may dictate their choice of tool, even though the other affordances available through that tool may not be an ideal match to the desired manner of sharing. …and folders shared with other users on a single computer, often as a default system behavior; files shared with other computers over an intranet or home network; and files shared with other users around the world on web sites and FTP servers. Users also commonly exchange copies of documents as email attachments, transfer files during instant messaging sessions, post digital photos to online photo album services, and swap music files using peer-to-peer file sharing applications. Despite these numerous venues for and implementations of -
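The abstract frames sharing in terms of the what, how, and with whom that a tool exposes. Purely as an illustration of those three dimensions (not a structure from the paper), they can be captured in a small policy record that a sharing interface might populate:

```python
# Illustrative "what / how / with whom" sharing policy record (not from the paper).
from dataclasses import dataclass, field

@dataclass
class SharingPolicy:
    what: str                                        # path or resource being shared
    how: str                                         # mechanism: "read-only link", "email attachment", ...
    with_whom: list = field(default_factory=list)    # people or groups granted access

    def grants_access(self, user: str) -> bool:
        return user in self.with_whom or "everyone" in self.with_whom

policy = SharingPolicy(what="/photos/vacation",
                       how="read-only link",
                       with_whom=["alice", "bob"])
print(policy.grants_access("alice"))   # True
print(policy.grants_access("carol"))   # False
```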
Virtual and Augmented Reality
Virtual and Augmented Reality: An Educational Handbook. By Zeynep Tacgin. This book first published 2020 by Cambridge Scholars Publishing, Lady Stephenson Library, Newcastle upon Tyne, NE6 2PA, UK. British Library Cataloguing in Publication Data: a catalogue record for this book is available from the British Library. Copyright © 2020 by Zeynep Tacgin. All rights for this book reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner. ISBN (10): 1-5275-4813-9; ISBN (13): 978-1-5275-4813-8. Table of Contents: List of Illustrations; List of Tables; Preface (What is this book about? What is this book not about? Who is this book for? How is this book used? The specific contribution of this book); Acknowledgements -
User Interface for Volume Rendering in Virtual Reality Environments
User Interface for Volume Rendering in Virtual Reality Environments. Jonathan Klein, Dennis Reuling, Jan Grimm, Andreas Pfau, Damien Lefloch, Martin Lambers, Andreas Kolb. Computer Graphics Group, University of Siegen. ABSTRACT: Volume Rendering applications require sophisticated user interaction for the definition and refinement of transfer functions. Traditional 2D desktop user interface elements have been developed to solve this task, but such concepts do not map well to the interaction devices available in Virtual Reality environments. In this paper, we propose an intuitive user interface for Volume Rendering specifically designed for Virtual Reality environments. The proposed interface allows transfer function design and refinement based on intuitive two-handed operation of Wand-like controllers. Additional interaction modes such as navigation and clip plane manipulation are supported as well. The system is implemented using the Sony PlayStation Move… …the gradient or the curvature at the voxel location into account and require even more complex user interfaces. Since Virtual Environments are especially well suited to explore spatial properties of complex 3D data, bringing Volume Rendering applications into such environments is a natural step. However, defining new user interfaces suitable both for the Virtual Environment and for the Volume Rendering application is difficult. Previous approaches mainly focused on porting traditional 2D point-and-click concepts to the Virtual Environment [8, 5, 9]. This tends to be unintuitive, to complicate the interaction, and to make only limited use of available interaction devices. In this paper, we propose an intuitive 3D user interface for Volume Rendering based on interaction devices that are suitable for Virtual Reality environments. -
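The transfer functions that this interface lets users design and refine are, in the standard formulation, mappings from voxel intensity to colour and opacity. The following is a minimal 1D example of that general concept, with made-up control points; it is not the authors' implementation.

```python
# Illustrative 1D transfer function: voxel intensity -> (R, G, B, opacity).
# A sketch of the standard concept, not the interface described in the paper.
import numpy as np

# Control points: (intensity in [0, 1], (r, g, b, alpha)); values are invented.
control_points = [
    (0.0, (0.0, 0.0, 0.0, 0.0)),   # air / background: fully transparent
    (0.3, (0.8, 0.5, 0.3, 0.1)),   # soft tissue: faint, mostly transparent
    (0.7, (1.0, 1.0, 0.9, 0.8)),   # bone: bright and nearly opaque
    (1.0, (1.0, 1.0, 1.0, 1.0)),
]

def transfer_function(intensity: float):
    """Piecewise-linear interpolation of colour and opacity between control points."""
    xs = [p[0] for p in control_points]
    channels = np.array([p[1] for p in control_points])   # shape (n_points, 4)
    return tuple(np.interp(intensity, xs, channels[:, c]) for c in range(4))

print(transfer_function(0.5))   # colour/opacity halfway between tissue and bone
```

Refining a transfer function then amounts to moving, adding, or deleting such control points, which is the operation a two-handed wand interface would need to expose in the Virtual Environment.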
Effective User Interface Design for Consumer Trust: Two Case Studies
2005:097 SHU MASTER'S THESIS. Effective User Interface Design for Consumer Trust: Two Case Studies. Xiling Zhou and Xiangchun Liu. Luleå University of Technology, MSc Programme in Electronic Commerce, Department of Business Administration and Social Sciences, Division of Industrial Marketing and E-Commerce. 2005:097 SHU - ISSN: 1404-5508 - ISRN: LTU-SHU-EX--05/097--SE. ACKNOWLEDGEMENT: This thesis is the result of half a year of work during which we have been accompanied and supported by many people, and it is a pleasure to have this opportunity to express our gratitude to all of them. First, we are deeply indebted to our supervisor, Prof. Lennart Persson of the Division of Industrial Marketing at LTU. He helped us with stimulating suggestions and encouragement throughout the research and writing of this thesis; without his never-ending support during this process, we could not have completed it. We would especially like to thank all of the participants, who spent their valuable time responding to the interview questions and discussing them with us. Finally, we would like to thank our family and friends. I, Zhou Xiling, am very grateful to everyone who gave me support and encouragement during this process, especially my father and mother, who formed part of my vision and taught me the good things that really matter in life; I also want to thank my friend Tang Yu for his never-ending support and good advice. I, Liu Xiangchun, am very grateful to my parents for their endless love and support. -
Augmented Reality, Virtual Reality, & Health
University of Massachusetts Medical School, eScholarship@UMMS, National Network of Libraries of Medicine New England Region (NNLM NER) Repository, 2017-3. Augmented Reality, Virtual Reality, & Health. Allison K. Herrera, University of Massachusetts Medical School, et al. Repository citation: Herrera AK, Mathews FZ, Gugliucci MR, Bustillos C. (2017). Augmented Reality, Virtual Reality, & Health. National Network of Libraries of Medicine New England Region (NNLM NER) Repository. https://doi.org/10.13028/1pwx-hc92. Retrieved from https://escholarship.umassmed.edu/ner/42. This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License. Presenters: Zeb Mathews, University of Tennessee; Corina Bustillos, Texas Tech University; Allison Herrera, University of Massachusetts Medical School; Marilyn Gugliucci, University of New England. Outline: Learning Objectives; Introduction & Overview; Augmented Reality & Health; Virtual Reality & Health; Technology; Funding Opportunities; University of New England VR Project. Objectives: explore AR & VR technologies and their impact on the health sciences, with examples of projects and research; know how to apply for funding for your own AR/VR health project; learn about one VR project funded by the NNLM. Augmented Reality and Virtual Reality (AR/VR) & Health: What is AR and VR? -
Natural Interaction in Augmented Reality Context
Natural Interaction in Augmented Reality Context. John Aliprantis1, Markos Konstantakis1, Rozalia Nikopoulou2, Phivos Mylonas2 and George Caridakis1. 1 University of the Aegean, 81100 Mytilene, Greece {jalip, mkonstadakis, gcari}@aegean.gr. 2 Ionian University, 49100 Corfu, Greece [email protected], [email protected]. Abstract: In recent years, immersive technologies like Virtual and Augmented Reality have been accelerating at an incredible pace, building innovative experiences and developing new interaction paradigms. Current research has widely explored gesture interaction with Augmented Reality interfaces, but it usually requires users to manipulate input devices that can be cumbersome and obtrusive, preventing them from interacting efficiently with the 3D environment. Therefore, Natural User Interfaces and freehand gesture interaction are becoming more and more popular, improving the user's engagement and sense of presence and providing more stimulating, user-friendly and non-obtrusive interaction methods. However, researchers argue about the impact of interaction fidelity on usability and user satisfaction, questioning the level of naturalness that should characterize interaction metaphors. This paper proposes different gesture recognition techniques for three basic interaction categories (translation, rotation and scaling) in a Leap Motion Controller - Augmented Reality framework. A prototype is implemented in order to evaluate the efficiency and usability of the proposed architecture. Finally, experimental results are discussed. Keywords: Natural interaction, Augmented reality, Leap motion controller, Gesture recognition. 1 Introduction: Over the last few years, Augmented Reality (AR) has developed into a cutting-edge technology, providing new ways to interact with computer-generated information. By removing the boundaries between physical and virtual, AR has been able to create more engaging experiences, enhancing the user's enjoyment and satisfaction.
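The paper proposes recognition techniques for three basic interaction categories: translation, rotation and scaling. As a simplified, hedged sketch of how such a classification could be made from generic two-handed tracking data (not the authors' Leap Motion pipeline, and with invented thresholds), consecutive palm positions can be compared as follows:

```python
# Illustrative classifier for translate / rotate / scale from two tracked hand positions.
# A heuristic sketch, not the Leap Motion pipeline described in the paper.
import numpy as np

def classify_two_handed_gesture(prev_left, prev_right, left, right,
                                move_eps=0.01, scale_eps=0.01, angle_eps=5.0):
    """Compare consecutive frames of left/right palm positions (metres)."""
    prev_mid = (prev_left + prev_right) / 2
    mid = (left + right) / 2

    prev_span = prev_right - prev_left
    span = right - left

    translation = np.linalg.norm(mid - prev_mid)
    scale_change = abs(np.linalg.norm(span) - np.linalg.norm(prev_span))
    cos_angle = np.dot(span, prev_span) / (np.linalg.norm(span) * np.linalg.norm(prev_span))
    angle_change = np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

    if scale_change > scale_eps:
        return "scale"
    if angle_change > angle_eps:
        return "rotate"
    if translation > move_eps:
        return "translate"
    return "idle"

# Hands moving apart by 10 cm is interpreted as a scaling gesture.
print(classify_two_handed_gesture(np.array([-0.1, 0.0, 0.0]), np.array([0.1, 0.0, 0.0]),
                                  np.array([-0.15, 0.0, 0.0]), np.array([0.15, 0.0, 0.0])))
```

A real freehand interface would smooth the tracking data over several frames and add grab or pinch detection before applying the chosen transform to the selected virtual object.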