Three-Dimensional Image Sensing and Visualization with Augmented Reality Xin Shen University of Connecticut - Storrs, [email protected]

Total Pages: 16

File Type: PDF, Size: 1020 KB

University of Connecticut, OpenCommons@UConn, Doctoral Dissertations, University of Connecticut Graduate School, 8-24-2018
Three-Dimensional Image Sensing and Visualization with Augmented Reality
Xin Shen, University of Connecticut - Storrs, [email protected]
Follow this and additional works at: https://opencommons.uconn.edu/dissertations
Recommended Citation: Shen, Xin, "Three-Dimensional Image Sensing and Visualization with Augmented Reality" (2018). Doctoral Dissertations. 1892. https://opencommons.uconn.edu/dissertations/1892

Three-Dimensional Image Sensing and Visualization with Augmented Reality
Xin Shen, Ph.D., University of Connecticut, 2018

Abstract
In recent decades, there have been significant technological advancements in sensors, devices, materials, algorithms, and computational hardware, resulting in extensive improvements in visualization capabilities for real-world objects. Among these improvements, three-dimensional (3D) imaging technologies have received interest from many research groups and may offer advantages over conventional two-dimensional (2D) approaches. In comparison with 2D sensing techniques, which record only the intensity of the scene, passive 3D imaging also captures depth and directional information. Many techniques for 3D imaging have been proposed, such as holography and interferometry, time-of-flight, two-view based stereoscopy, and multi-view techniques for autostereoscopic 3D imaging, to name a few. This dissertation focuses on novel aspects of integral imaging based multi-view 3D imaging systems, 3D information processing, and visualization, organized in three parts.

In the first part, two concepts for integral imaging based dynamic 3D imaging are presented. First, an extended depth-of-focus 3D micro display using a bifocal liquid crystal lens is presented. Second, a head tracking 3D display is presented by means of proper application of the smart pseudoscopic-to-orthoscopic conversion (SPOC) algorithm.

In the second part, novel 3D imaging systems and 3D image processing approaches are proposed. First, recent progress on integral imaging based Multidimensional Optical Sensing and Imaging Systems (MOSIS) is presented for object recognition, material inspection, and integrated visualization. Second, 3D profilometric reconstruction using flexible sensing integral imaging with automatic occlusion removal is presented. Third, spatial-temporal human gesture recognition under degraded conditions using 3D integral imaging is presented.

In the third part, approaches for 3D sensing and visualization with Augmented Reality (AR) are presented. First, an AR based approach for optical visualization and object recognition using 3D axially distributed sensing (ADS) is presented. Second, an eye-fatigue-free 3D augmented display using lenslet based integral imaging is presented. Lastly, an optical see-through augmented reality display based on a dynamic 3D imaging system, with an enhanced depth range for the 3D augmented image, is presented to reduce the accommodation-convergence mismatch problem.
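The abstract repeatedly refers to integral-imaging-based 3D sensing, in which a lenslet or camera array records a set of elemental images and the 3D scene is then computationally reconstructed plane by plane. As a rough, illustrative sketch of that reconstruction step only (not code from the dissertation; the array layout, parameter names, and the integer-pixel shift formula are simplifying assumptions), a basic shift-and-sum back-projection in Python might look like this:

```python
import numpy as np

def reconstruct_depth_slice(elemental, pitch, focal, sensor_w, z):
    """Shift-and-sum (back-projection) reconstruction of one depth slice.

    elemental : (K, L, H, W) array of grayscale elemental images from a
                K x L lenslet/camera grid.
    pitch     : lenslet/camera pitch (same length unit as z, e.g. mm).
    focal     : focal length of each lenslet/camera (mm).
    sensor_w  : physical width of one elemental image on the sensor (mm).
    z         : reconstruction depth (mm).
    """
    K, L, H, W = elemental.shape
    # Approximate pixel disparity between adjacent elemental images
    # for a point located at depth z.
    shift = int(round(W * pitch * focal / (sensor_w * z)))

    out_h, out_w = H + (K - 1) * shift, W + (L - 1) * shift
    acc = np.zeros((out_h, out_w), dtype=np.float64)
    hits = np.zeros_like(acc)
    for k in range(K):
        for l in range(L):
            r, c = k * shift, l * shift
            acc[r:r + H, c:c + W] += elemental[k, l]
            hits[r:r + H, c:c + W] += 1.0
    return acc / np.maximum(hits, 1.0)  # average where shifted images overlap
```

Sweeping z over a range of depths produces a focal stack in which objects near each chosen depth appear sharp while the rest of the scene blurs out, which is the basis of the volumetric reconstruction discussed in the chapters below.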
Three-Dimensional Image Sensing and Visualization with Augmented Reality
Xin Shen
B.S., Xidian University, Xi'an, China, 2010
M.S., Xidian University, Xi'an, China, 2013
M.S., Doshisha University, Kyoto, Japan, 2013
M.S., University of Connecticut, Connecticut, USA, 2016
A Dissertation Submitted in Partial Fulfillment of the Requirements for the Degree of Doctor of Philosophy at the University of Connecticut, 2018

Copyright by Xin Shen, 2018

APPROVAL PAGE
Doctor of Philosophy Dissertation
Three-Dimensional Image Sensing and Visualization with Augmented Reality
Presented by Xin Shen, B.S., M.S.
Major Advisor: Bahram Javidi
Associate Advisor: Rajeev Bansal
Associate Advisor: Helena Silva
University of Connecticut, 2018

ACKNOWLEDGMENTS
While I am writing this dissertation, the memory of my first week in this Ph.D. program feels as if it were yesterday. Five years have passed since the beginning of the program, and over that time I have gained knowledge and wisdom for which I will always be thankful. I would not have been able to write this dissertation without the encouragement, guidance, and collaboration of my advisors, colleagues, family, and friends.

My deepest gratitude goes first and foremost to my advisor, Prof. Bahram Javidi, for his constant encouragement and guidance. I am very fortunate and proud to have joined Prof. Javidi's group for my academic study and research. His insightful vision and foresight guided me in exploring the research field in depth, and I was deeply inspired by his rigorous and earnest research attitude. His collaborations with outstanding researchers and scientists worldwide helped broaden my horizons. With his illuminating instruction, I have been able to keep improving.

I would like to extend my sincerest appreciation to my committee members, Prof. Rajeev Bansal, Prof. Helena Silva, Prof. John Chandy, and Prof. Monty Escabi. Their advice and support greatly helped my research and study at UConn. I would also like to extend my gratitude to Prof. Artur Carnicer, Prof. Pedro Latorre Carmona, Prof. Arun Anand, and Prof. José Manuel Rodríguez Ramos for their guidance and help during their visits to UConn. I will always remember the valuable discussions with them on my study, research, and concerns. I also enjoyed conversations with Prof. Manuel Martinez-Corral, Prof. Yi-Bai Huang, Prof. Adrian Stern, Prof. Hua Hong, and Prof. Myungjin Cho at international conferences.

I wish to truly thank my current lab mates, Adam Markman, Satoru Komatsu, Siddharth Rawat, Hee-Seung Kim, Timothy O'Connor, and Hyun-Woo Kim, for their hard work, collaboration, and help. We have built a great lab culture of supporting each other and improving together. Thanks to my former colleagues, Xiao Xiao, Jingang Wang, Kaleel Mahmood, and Gang Yao, whose guidance helped me become engaged in my research quickly. I would also like to thank my international research mates, Yu-Jen Wang and Simon Jen. I will not forget the summer and winter of their visits to UConn; within a limited period, we collaborated efficiently to achieve our goals. I thank the other visiting scholars, Alba Peinado Capdevila, Prof. Xiaoxi Chen, Chen Yang, Zhiyuan Shen, Faliu Yi, Juan Trujillo, and all the friends I have met at UConn, who made my life colorful.

I am indebted to my parents, Ling Shen and Lianxiang Wang.
None of this would have been possible without their unconditional encouragement and support. Although we have had only limited time together over the past years, I can always feel their love deep in my heart. This dissertation is also dedicated to all my beloved family: my dear grandparents, whom I will always miss, my aunts and uncles, my cousins, and my lovely niece. They have always been there for me, and I am thankful for their continuous love, which keeps me going.

Table of Contents
List of Figures
List of Tables
Part I. Three-Dimensional Imaging and Dynamic Three-Dimensional Optical Display
Chapter 1. Overview of Three-dimensional Integral Imaging Technologies
  1.1 Introduction
  1.2 Principle of 3D Integral Imaging
    1.2.1 Pickup Stage of Integral Imaging
    1.2.2 Reconstruction Stage of Integral Imaging
  1.3 Main Characteristics of 3D Integral Imaging Display
    1.3.1 Display Modes of Integral Imaging
    1.3.2 Depth of Focus and Spatial Resolution
    1.3.3 Viewing Angle
    1.3.4 Viewing Quality
  1.4 Organization of Thesis
Chapter 2. Extended Depth-of-focus Three-Dimensional Micro Integral Imaging Display using a Bifocal Liquid Crystal Lens
  2.1 Introduction
  2.2 Operating Principle of the Micro Integral Imaging Display with Liquid Crystal Lens
  2.3 Experimental Results
  2.4 Conclusion
Chapter 3. Head Tracking Three-dimensional Integral Imaging Display using Smart Pseudoscopic-to-orthoscopic Conversion
Recommended publications
  • A Review About Augmented Reality Tools and Developing a Virtual Reality Application
    Academic Journal of Science, CD-ROM. ISSN: 2165-6282 :: 03(02):139–146 (2014)
    A Review About Augmented Reality Tools and Developing a Virtual Reality Application Based on Education
    Mustafa Ulas and Safa Merve Tasci, Firat University, Turkey
    Augmented Reality (AR) is a technology that has gained popularity in recent years. It is defined as the placement of virtual images over a real view in real time. There are many desktop applications that use Augmented Reality. The rapid development of technology and of easily portable mobile devices has shifted application development toward mobile devices, and advances in device technology lead to new applications and to the creation of new tools. There are many AR tool kits, which differ in aspects such as the methods used, programming language, and supported operating systems. A developer must first find the most effective tool kit among them. This study serves as a guide to help developers choose the best AR tool kit. The tool kits were examined under three main headings, and parameters such as advantages, disadvantages, platform, and programming language were compared. In addition, information is given about their usage, and a Virtual Reality application based on education has been developed.
    Keywords: Augmented reality, ARToolKit, Computer vision, Image processing.
    Introduction: Augmented reality is basically a snapshot of the real environment brought together with virtual environment applications. Basically, it can be operated on any device that has a camera, a display, and an operating system.
  • Augmented Reality & Virtual Reality Is Now a Reality for Enterprises
    WHITE PAPER: AUGMENTED REALITY & VIRTUAL REALITY IS NOW A REALITY FOR ENTERPRISES - THE FUTURE IS HERE!
    Abstract: Innovation and next-generation technologies have completely changed the way we work, live and possibly even the way we think. AI, Augmented Reality (AR), Virtual Reality (VR), and Blockchain are just some of the technologies that have affected how we consume art, music, movies, and how we communicate, shop, and travel. We are in the midst of a complete digital revolution. This perspective paper illustrates a point of view on the use of mixed reality (MR) in today's enterprise environment, and covers virtual reality and augmented reality, market trends, industry examples, and challenges within organizations that are adopting mixed reality. In short, it sheds light on what the future is expected to look like in the context of enterprise disruption with MR.
    Introduction: Johnny Mnemonic, the Lawnmower Man, Minority Report, the Matrix, the Terminator 2, Ironman… Besides captivating audiences with their stunning visual effects, these films all have one thing in common - they showcase how MR technologies could be potentially used in the future. Everyone seems to know what VR headsets are, and the popularity of Pokémon Go almost allows omission of a basic introduction to AR. Though they are often used interchangeably, it is essential to clarify that AR and VR are not the same. VR is defined as a simulation experienced "… using special electronic equipment, such as a helmet with an internal screen or gloves fitted with sensors." VR can digitally recreate the environment around you, or give you the impression you are somewhere entirely …
  • Interaction Methods for Smart Glasses: a Survey
    IEEE Access, DOI 10.1109/ACCESS.2018.2831081 (accepted for publication; content may change prior to final publication).
    Interaction Methods for Smart Glasses: A Survey
    Lik-Hang Lee (1) and Pan Hui (1,2) (Fellow, IEEE). (1) The Hong Kong University of Science and Technology, Department of Computer Science and Engineering; (2) The University of Helsinki, Department of Computer Science. Corresponding author: Pan Hui (e-mail: [email protected]).
    ABSTRACT: Since the launch of Google Glass in 2014, smart glasses have mainly been designed to support micro-interactions. The ultimate goal for them to become an augmented reality interface has not yet been attained due to an encumbrance of controls. Augmented reality involves superimposing interactive computer graphics images onto physical objects in the real world. This survey reviews current research issues in the area of human-computer interaction for smart glasses. The survey first studies the smart glasses available in the market and afterwards investigates the interaction methods proposed in the wide body of literature. The interaction methods can be classified into hand-held, touch, and touchless input. This paper mainly focuses on touch and touchless input. Touch input can be further divided into on-device and on-body, while touchless input can be classified into hands-free and freehand. Next, we summarize the existing research efforts and trends, in which touch and touchless input are evaluated by a total of eight interaction goals.
  • Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration
    ORIGINAL RESEARCH, published 14 June 2021, doi: 10.3389/frvir.2021.697367
    Eye See What You See: Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration
    Allison Jing*, Kieran May, Gun Lee and Mark Billinghurst. Empathic Computing Lab, Australian Research Centre for Interactive and Virtual Environment, STEM, The University of South Australia, Mawson Lakes, SA, Australia
    Gaze is one of the predominant communication cues and can provide valuable implicit information such as intention or focus when performing collaborative tasks. However, little research has been done on how virtual gaze cues combining spatial and temporal characteristics impact real-life physical tasks during face-to-face collaboration. In this study, we explore the effect of showing joint gaze interaction in an Augmented Reality (AR) interface by evaluating three bi-directional collaborative (BDC) gaze visualisations with three levels of gaze behaviours. Using three independent tasks, we found that all BDC visualisations are rated significantly better at representing joint attention and user intention compared to a non-collaborative (NC) condition, and hence are considered more engaging. The Laser Eye condition, spatially embodied with gaze direction, is perceived as significantly more effective as it encourages mutual gaze awareness with a relatively low mental effort in a less constrained workspace. In addition, by offering additional virtual representation that compensates for verbal descriptions and hand pointing, BDC gaze visualisations can encourage more conscious use of gaze cues coupled with deictic references during co-located symmetric collaboration.
    Edited by: Parinya Punpongsanon, Osaka University, Japan. Reviewed by: Naoya Isoyama, Nara Institute of Science and Technology (NAIST), Japan; Thuong Hoang, Deakin University, Australia.
  • Evaluating Performance Benefits of Head Tracking in Modern Video Games
    Evaluating Performance Benefits of Head Tracking in Modern Video Games
    Arun Kulshreshth, Department of EECS, University of Central Florida, 4000 Central Florida Blvd, Orlando, FL 32816, USA, [email protected]
    Joseph J. LaViola Jr., Department of EECS, University of Central Florida, 4000 Central Florida Blvd, Orlando, FL 32816, USA, [email protected]
    ABSTRACT: We present a study that investigates user performance benefits of using head tracking in modern video games. We explored four different carefully chosen commercial games with tasks which can potentially benefit from head tracking. For each game, quantitative and qualitative measures were taken to determine if users performed better and learned faster in the experimental group (with head tracking) than in the control group (without head tracking). A game expertise pre-questionnaire was used to classify participants into casual and expert categories to analyze a possible impact on performance differences.
    From the introduction: "… PlayStation Move, TrackIR 5) that support 3D spatial interaction have been implemented and made available to consumers. Head tracking is one example of an interaction technique, commonly used in the virtual and augmented reality communities [2, 7, 9], that has potential to be a useful approach for controlling certain gaming tasks. Recent work on head tracking and video games has shown some potential for this type of gaming interface. For example, Sko et al. [10] proposed a taxonomy of head gestures for first person shooter (FPS) games and showed that some of their techniques (peering, zooming, iron-sighting and spinning) are useful in games. In addition, previous studies [13, 14] have …"
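The excerpt above names concrete head-gesture techniques from prior work (peering, zooming, iron-sighting, spinning). Purely as an illustration of the general idea, and not the method of the cited study, here is a minimal Python sketch that maps a tracked head pose to an amplified, clamped camera offset of the kind used for "peering"; the HeadPose fields, gain, and limits are assumptions of mine:

```python
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # degrees, positive = head turned left
    pitch: float  # degrees, positive = head tilted up
    x: float      # lateral head offset from screen centre (cm)

def camera_peek_offset(pose: HeadPose, gain: float = 2.0,
                       max_angle: float = 30.0, max_lean: float = 15.0):
    """Map a tracked head pose to an in-game camera adjustment.

    Returns (cam_yaw, cam_pitch, cam_lean). Small head motions are
    amplified by `gain` and then clamped so the player never turns
    the camera far enough to lose sight of the screen.
    """
    def clamp(value, limit):
        return max(-limit, min(limit, value))

    cam_yaw = clamp(pose.yaw * gain, max_angle)
    cam_pitch = clamp(pose.pitch * gain, max_angle)
    cam_lean = clamp(pose.x * 0.5, max_lean)  # lean sideways to peek around cover
    return cam_yaw, cam_pitch, cam_lean
```

In practice the pose would be refreshed every frame from whatever tracker the game supports, and the resulting offsets added to the regular mouse or gamepad camera controls.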
  • An Augmented Reality Social Communication Aid for Children and Adults with Autism: User and Caregiver Report of Safety and Lack of Negative Effects
    bioRxiv preprint doi: https://doi.org/10.1101/164335; this version posted July 19, 2017.
    An Augmented Reality Social Communication Aid for Children and Adults with Autism: User and Caregiver Report of Safety and Lack of Negative Effects
    Ned T. Sahin (1,2)*, Neha U. Keshav (1), Joseph P. Salisbury (1), Arshya Vahabzadeh (1,3). (1) Brain Power, 1 Broadway 14th Fl, Cambridge MA 02142, United States; (2) Department of Psychology, Harvard University, United States; (3) Department of Psychiatry, Massachusetts General Hospital, Boston. *Corresponding author: Ned T. Sahin, PhD, Brain Power, 1 Broadway 14th Fl, Cambridge, MA 02142, USA. Email: [email protected].
    Abstract. Background: Interest has been growing in the use of augmented reality (AR) based social communication interventions in autism spectrum disorders (ASD), yet little is known about their safety or negative effects, particularly in head-worn digital smartglasses. Research to understand the safety of smartglasses in people with ASD is crucial given that these individuals may have altered sensory sensitivity, impaired verbal and non-verbal communication, and may experience extreme distress in response to changes in routine or environment. Objective: The objective of this report was to assess the safety and negative effects of the Brain Power Autism System (BPAS), a novel AR smartglasses-based social communication aid for children and adults with ASD. BPAS uses emotion-based artificial intelligence and a smartglasses hardware platform that keeps users engaged in the social world by encouraging "heads-up" interaction, unlike tablet- or phone-based apps.
  • The Use of Smartglasses in Everyday Life
    University of Erfurt, Faculty of Philosophy
    The Use of Smartglasses in Everyday Life: A Grounded Theory Study
    Dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy (Dr. phil.) at the Faculty of Philosophy of the University of Erfurt, submitted by Timothy Christoph Kessler from Munich, November 2015. URN: urn:nbn:de:gbv:547-201600175. First Assessment: Prof. Dr. Joachim R. Höflich. Second Assessment: Prof. Dr. Dr. Castulus Kolo. Date of publication: 18th of April 2016.
    Abstract: We live in a mobile world. Laptops, tablets and smartphones have never been as ubiquitous as they are today. New technologies are invented on a daily basis, leading to the altering of society on a macro level and to changes in everyday life on a micro level. Through the introduction of a new category of devices, wearable computers, we might experience a shift away from the traditional smartphone. This dissertation aims to examine the topic of smartglasses, especially Google Glass, and how these wearable devices are embedded into everyday life and, consequently, into society at large. The current research models concerned with mobile communication are only partly applicable due to the distinctive character of smartglasses. Furthermore, new legal and privacy challenges for smartglasses arise, which are not taken into account by existing theories. Since the literature on smartglasses is close to non-existent, it is argued that new models need to be developed in order to fully understand the impact of smartglasses on everyday life and society as a whole.
  • Virtual and Augmented Reality
    Virtual and Augmented Reality: An Educational Handbook. By Zeynep Tacgin.
    This book first published 2020 by Cambridge Scholars Publishing, Lady Stephenson Library, Newcastle upon Tyne, NE6 2PA, UK. British Library Cataloguing in Publication Data: a catalogue record for this book is available from the British Library. Copyright © 2020 by Zeynep Tacgin. All rights for this book reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of the copyright owner. ISBN (10): 1-5275-4813-9; ISBN (13): 978-1-5275-4813-8.
    Table of Contents: List of Illustrations; List of Tables; Preface (What is this book about? What is this book not about? Who is this book for? How is this book used? The specific contribution of this book); Acknowledgements …
  • Téléprésence, Immersion et Interaction pour la Reconstruction 3D Temps-Réel (Telepresence, Immersion and Interaction for Real-Time 3D Reconstruction)
    THESIS submitted for the degree of Doctor of the Université de Grenoble. Specialty: Mathematics and Computer Science. Ministerial decree: 7 August 2006. Presented by Benjamin PETIT. Thesis supervised by Edmond BOYER and co-supervised by Bruno RAFFIN, prepared in the LJK and LIG laboratories within the MSTII doctoral school.
    Téléprésence, immersion et interaction pour la reconstruction 3D temps-réel (Telepresence, immersion and interaction for real-time 3D reconstruction). Thesis publicly defended on 21 February 2011 before a jury composed of: Mme. Indira THOUVENIN, Lecturer-researcher at the Université de Technologie de Compiègne, President; Mr. Bruno ARNALDI, Professor at INSA Rennes, Reviewer; Mme. Saida BOUAKAZ, Professor at Université Claude Bernard Lyon 1, Reviewer; Mr. Edmond BOYER, Research Director at INRIA Grenoble, Member; Mr. Bruno RAFFIN, Research Scientist at INRIA Grenoble, Member.
    Acknowledgements: I would like to begin by thanking Edmond and Bruno for supervising my thesis. It was a pleasure to work with you. Thanks also to the members of my jury for agreeing to review this thesis, and for your very constructive comments. During my thesis I had the opportunity to work with different people, and these collaborations were very enriching. I would like to thank in particular Jean-Denis, who helped me get the Grimage platform back on its feet, Thomas, with whom I spent many hours developing the applications and demonstrations of my thesis, and finally Hervé for his excellent support on the Grimage platform. I would also like to thank Clément and Florian for passing on their knowledge of the Grimage platform, and Nicolas and Jean-François for their technical help.
  • Augmented Reality, Virtual Reality, & Health
    University of Massachusetts Medical School, eScholarship@UMMS, National Network of Libraries of Medicine New England Region (NNLM NER) Repository, 2017-3
    Augmented Reality, Virtual Reality, & Health
    Allison K. Herrera, University of Massachusetts Medical School, et al.
    Let us know how access to this document benefits you. Follow this and additional works at: https://escholarship.umassmed.edu/ner
    Part of the Health Information Technology Commons, Library and Information Science Commons, and the Public Health Commons.
    Repository Citation: Herrera AK, Mathews FZ, Gugliucci MR, Bustillos C. (2017). Augmented Reality, Virtual Reality, & Health. National Network of Libraries of Medicine New England Region (NNLM NER) Repository. https://doi.org/10.13028/1pwx-hc92. Retrieved from https://escholarship.umassmed.edu/ner/42
    Creative Commons License: This work is licensed under a Creative Commons Attribution-Noncommercial-Share Alike 4.0 License. This material is brought to you by eScholarship@UMMS. It has been accepted for inclusion in the National Network of Libraries of Medicine New England Region (NNLM NER) Repository by an authorized administrator of eScholarship@UMMS. For more information, please contact [email protected].
    Augmented Reality, Virtual Reality, & Health. Zeb Mathews, University of Tennessee; Corina Bustillos, Texas Tech University; Allison Herrera, University of Massachusetts Medical School; Marilyn Gugliucci, University of New England.
    Outline: Learning Objectives; Introduction & Overview; Augmented Reality & Health; Virtual Reality & Health; Technology; Funding Opportunities; University of New England VR Project.
    Objectives: Explore AR & VR technologies and their impact on the health sciences, with examples of projects and research; know how to apply for funding for your own AR/VR health project; learn about one VR project funded by the NNLM.
    Augmented Reality and Virtual Reality (AR/VR) & Health: What is AR and VR? …
  • Factors Influencing Consumer Attitudes Towards M-Commerce AR Apps
    "I see myself, therefore I purchase": factors influencing consumer attitudes towards m-commerce AR apps
    Mafalda Teles Roxo and Pedro Quelhas Brito, Faculdade de Economia da Universidade do Porto and LIAAD-INESC TEC, Portugal. [email protected]; [email protected]
    Abstract: Mobile commerce (m-commerce) is starting to represent a significant share of e-commerce. The use of Augmented Reality (AR) by brands to convey information about their products - within the store and mainly as mobile apps - makes it possible for researchers and managers to understand consumer reactions. Although attitudes towards AR have been studied, the overall effect of distinct aspects such as the influence of others, imagery, projection, and perceived presence has not, as far as we know, been tackled. Therefore, we conducted a study of 218 undergraduate students, using a pre-test post-test experimental design, to address the following questions: (1) Do AR media characteristics affect consumer attitudes towards the medium in a mobile shopping context? And (2) do the opinion and physical presence of other people influence attitudes towards an m-commerce AR app? The study found that AR characteristics such as projection and imagery positively influence attitudes towards m-commerce AR apps, whereas the social variables did not have any influence.
    Keywords: MAR; m-commerce; consumer psychology; AR-consumer relationship.
    1 Introduction: Simultaneously with the increasing percentage of e-commerce sales resulting from mobile retail commerce (m-commerce), it is estimated that in the U.S., by 2020, 49.2% of online sales will be made using mobile apps (Statista, 2019b). Also, in 2018, approximately 57% of internet users purchased fashion-related products online (Statista, 2019a).
  • Natural Interaction in Augmented Reality Context
    Natural Interaction in Augmented Reality Context
    John Aliprantis (1), Markos Konstantakis (1), Rozalia Nikopoulou (2), Phivos Mylonas (2) and George Caridakis (1). (1) University of the Aegean, 81100 Mytilene, Greece, {jalip, mkonstadakis, gcari}@aegean.gr; (2) Ionian University, 49100 Corfu, Greece, [email protected], [email protected]
    Abstract. In recent years, immersive technologies like Virtual and Augmented Reality have been accelerating at an incredible pace, building innovative experiences and developing new interaction paradigms. Current research has widely explored gesture interaction with Augmented Reality interfaces, but it usually requires users to manipulate input devices that can be cumbersome and obtrusive, preventing them from interacting efficiently with the 3D environment. Therefore, Natural User Interfaces and freehand gesture interaction are becoming more and more popular, improving the user's engagement and sense of presence and providing more stimulating, user-friendly and non-obtrusive interaction methods. However, researchers argue about the impact of interaction fidelity on usability and user satisfaction, questioning the level of naturalness that should characterize the interaction metaphors. The current paper proposes different gesture recognition techniques for three basic interaction categories (translation, rotation and scaling) in a Leap Motion Controller - Augmented Reality framework. A prototype is implemented in order to evaluate the efficiency and usability of the proposed architecture. Finally, experimental results are discussed.
    Keywords: Natural interaction, Augmented reality, Leap motion controller, Gesture recognition.
    1 Introduction: Over the last few years, Augmented Reality (AR) has developed into a cutting-edge technology, providing new ways to interact with computer-generated information. By removing the boundaries between the physical and the virtual, AR has been able to create more engaging experiences, enhancing the user's enjoyment and satisfaction.
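The abstract above proposes gesture recognition for three manipulation categories (translation, rotation, scaling) from hand-tracking input. As a generic, hedged sketch of how such a classifier might separate the three categories from per-frame hand keypoints (this is not the cited paper's algorithm; the keypoint source, thresholds, and priority order are assumptions of mine), consider:

```python
import numpy as np

def classify_manipulation(prev_pts: np.ndarray, curr_pts: np.ndarray,
                          t_thresh=5.0, r_thresh=5.0, s_thresh=0.05):
    """Classify a two-frame hand motion as translation, rotation, or scaling.

    prev_pts, curr_pts : (N, 3) arrays of fingertip/palm keypoints in mm,
                         e.g. as reported by a hand tracker on two
                         consecutive frames.
    Returns 'translation', 'rotation', 'scaling', or 'idle'.
    """
    c0, c1 = prev_pts.mean(axis=0), curr_pts.mean(axis=0)
    translation = np.linalg.norm(c1 - c0)                 # centroid shift (mm)

    spread0 = np.linalg.norm(prev_pts - c0, axis=1).mean()
    spread1 = np.linalg.norm(curr_pts - c1, axis=1).mean()
    scaling = abs(spread1 - spread0) / max(spread0, 1e-6)  # pinch/spread ratio

    # Rotation: mean angular change of the keypoints about the centroid (xy-plane),
    # with angle differences wrapped into [-pi, pi].
    v0, v1 = prev_pts - c0, curr_pts - c1
    ang0 = np.arctan2(v0[:, 1], v0[:, 0])
    ang1 = np.arctan2(v1[:, 1], v1[:, 0])
    rotation = np.degrees(np.abs(np.angle(np.exp(1j * (ang1 - ang0)))).mean())

    if scaling > s_thresh:
        return 'scaling'
    if rotation > r_thresh:
        return 'rotation'
    if translation > t_thresh:
        return 'translation'
    return 'idle'
```

In a real Leap Motion / AR pipeline the keypoints would come from the tracker's per-frame hand data, and the per-frame decision would typically be smoothed over several frames before driving the virtual object.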