The XRSI Definitions of Extended Reality (XR)


XRSI Standard Publication XR-001
XR Safety Initiative, www.xrsi.org
CC BY-NC-SA 4.0

XR Data Classification Framework Public Working Group (XR-DCF-PWG)
XR Safety Initiative, California, USA
Liaison Organization: Open AR Cloud

Abstract

The Extended Reality – Data Classification Framework – Public Working Group (XR-DCF-PWG) at the XR Safety Initiative (XRSI) develops and promotes a fundamental understanding of the properties and classification of data in XR environments by providing technical leadership in the XR domain. XR-DCF-PWG develops tests, test methods, reference data, proof-of-concept implementations, and technical analyses to advance the development and productive use of immersive technology. XR-DCF-PWG's responsibilities include the development of technical, physical, administrative, and management standards and guidelines for the risk-based security and privacy of sensitive information in XR environments. This Special Publication XR-series reports on XR-DCF-PWG's research, guidance, and outreach efforts in XR safety and its collaborative activities with industry, government, and academic organizations. This specific report is an enumeration of terms for the purpose of consistency in communication.

Certain commercial entities, equipment, or materials may be identified in this document in order to describe an experimental procedure or concept adequately. Such identification is not intended to imply recommendation or endorsement by the XR Safety Initiative, nor is it intended to imply that the entities, materials, or equipment are the best available for the purpose.
Table of Contents

Abstract
Acknowledgments
Contributors
SECTION 1
1.1 Introduction
1.2 Purpose and Scope
1.3 Audience
SECTION 2
The XRSI Definition of Extended Reality (XR)
SECTION 3
Common Forms of Extended Reality (XR)
3.1 Virtual Reality (VR)
3.2 Augmented Reality (AR)
3.3 Mixed Reality (MR)
3.4 Volumetric Video Capture
3.5 360 Degree Video
SECTION 4
Common Terms of XR
4.1 Field of View (FOV)
4.2 Degrees of Freedom (DoF)
4.3 Head Mounted Display (HMD)
4.4 Haptics
4.5 Positional Tracking
4.6 Biometric Tracking
4.7 Gaze
SECTION 5
References

Acknowledgments

The founders of the XR Safety Initiative (XRSI), Kavya Pearlman and Ibrahim Baggili, would like to thank the many experts in industry and government who contributed their thoughts to the creation and review of this document. We especially acknowledge our liaison organization, Open AR Cloud (OARC), for its technical insight and active contribution to this effort.
Contributors

Kavya Pearlman – XR Safety Initiative (XRSI)
Jan-Erik Vinje – Open AR Cloud
Ross Newman – University of North Texas, Human/machine Intelligence and Language Technologies Lab (HiLT)
April Boyd-Noronha – XR Safety Initiative (XRSI)
Emily Dare – XR Safety Initiative (XRSI)
Marco Magnano – XR Safety Initiative (XRSI)
Lisa Winter – ZMTVR
Alina Kadlubsky – Open AR Cloud
Vandana Verma – InfoSecGirls
Tamas Henning – XR Safety Initiative (XRSI)
Christopher Burns – Private Entity
Dennis Bonilla – Baltu Studios
Colin Steinmann – Open AR Cloud
Peter Clemenko – CyVR
Zoe Braiterman – Open Web Application Security Project
Jay Silvas – Virtual World Society
Suzanne Borders – BadVR
Ninad Patil – Private Entity
Liv Erickson – XR Software Engineer
Eva Hoerth – WXR Fund / WeMakeRealities

SECTION 1

1.1 Introduction

The XR Safety Initiative (XRSI) developed this document as part of its mission to help build safe virtual environments. XRSI is responsible for developing standards and guidelines, including minimum requirements. However, such standards and guidelines may not apply universally to all XR systems and applications. These guidelines have been prepared for use by XR stakeholders. They may be used by governments, non-governmental organizations, and academics on a voluntary basis. They are not subject to copyright, although attribution is desired.

1.2 Purpose and Scope

Extended Reality (XR) is an evolving paradigm. The XRSI definitions characterize important aspects of XR and are intended to serve as a means for broad comparison of XR technologies and environments. This document also provides a baseline on what Extended Reality (XR) is and how best to use it.
The definitions and taxonomies defined in this document are not intended to prescribe or constrain any particular dimension, timeline, or environment of immersive technologies.

1.3 Audience

The intended audience of this document is XR stakeholders, potentially including developers, technologists, providers, and consumers of immersive technologies.

SECTION 2

The XRSI Definition of Extended Reality (XR)

Extended Reality (XR) is a fusion of all the realities – including Augmented Reality (AR), Virtual Reality (VR), and Mixed Reality (MR) – which consists of technology-mediated experiences enabled via a wide spectrum of hardware and software, including sensory interfaces, applications, and infrastructures. XR is often referred to as immersive video content, enhanced media experiences, and interactive, multi-dimensional human experiences.

"XR does not refer to any specific technology. It's a bucket for all of the realities." – Jim Malcolm, Humaneyes

SECTION 3

Common Forms of Extended Reality (XR)

3.1 Virtual Reality (VR)

Virtual Reality (VR) is a fully immersive, software-generated, artificial digital environment. VR is a simulation of three-dimensional images, experienced by users via special electronic equipment, such as a Head Mounted Display (HMD). VR can create or enhance characteristics such as presence, embodiment, and agency.

3.2 Augmented Reality (AR)

Augmented Reality (AR) overlays digitally-created content on top of the user's real-world environment, viewed through a device (such as a smartphone) that incorporates real-time inputs to create an enhanced version of reality.
Digital and virtual objects (e.g., graphics, sounds) are superimposed on an existing environment to create an AR experience.

3.3 Mixed Reality (MR)

Mixed Reality (MR) seamlessly blends the user's real-world environment with digitally-created content, where both environments can coexist and interact with each other. In MR, the virtual objects behave in all respects as if they are present in the real world: e.g., they are occluded by physical objects, and their lighting is consistent with the actual light.
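The occlusion behavior that distinguishes MR from simple AR overlay can be illustrated as a per-pixel depth comparison: a virtual object is hidden wherever a physical surface is closer to the viewer. The sketch below is a minimal, hypothetical illustration of that idea only; the XRSI definitions do not prescribe any particular implementation, and all names here are invented for the example.

```python
# Minimal sketch of MR-style occlusion (illustrative only, not part of the
# XRSI standard). Depths are distances from the viewer; smaller means closer.

def composite_pixel(real_depth, virtual_depth, real_color, virtual_color):
    """Return the color the viewer sees at one pixel of the blended scene."""
    if virtual_depth is None:        # no virtual content at this pixel
        return real_color
    if real_depth < virtual_depth:   # a physical object is closer: it occludes
        return real_color
    return virtual_color             # the virtual object is in front

# A virtual cube 2.0 m away, partially behind a real pillar 1.5 m away.
real_depths    = [1.5, 1.5, 3.0, 3.0]
virtual_depths = [2.0, 2.0, 2.0, None]
real_colors    = ["pillar", "pillar", "wall", "wall"]
virtual_colors = ["cube", "cube", "cube", None]

frame = [composite_pixel(r, v, rc, vc)
         for r, v, rc, vc in zip(real_depths, virtual_depths,
                                 real_colors, virtual_colors)]
print(frame)  # ['pillar', 'pillar', 'cube', 'wall']
```

The cube is visible only where no real surface sits in front of it, which is the "virtual objects behave as if present in the real world" property described above; real MR systems do the same comparison against a sensed depth map rather than hand-written lists.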