1-2-4 OpenXR
Total pages: 16. File type: PDF, size: 1,020 KB.
Recommended publications
- Introduction to the Vulkan Computer Graphics API
Mike Bailey ([email protected]), Oregon State University. SIGGRAPH 2020, abridged version. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Course notes: http://cs.oregonstate.edu/~mjb/vulkan
Course goals: give a sense of how Vulkan is different from OpenGL, show how to do basic drawing in Vulkan, and leave attendees with working, documented, understandable sample code.
About the author: Mike Bailey is a Professor of Computer Science at Oregon State University, has been in computer graphics for over 30 years, and has had over 8,000 students in his university classes.
Sections: 1. Introduction; 2. Sample Code; 3. Drawing; 4. Shaders and SPIR-V; 5. Data Buffers; 6. GLFW; 7. GLM; 8. Instancing; 9. Graphics Pipeline Data Structure; 10. Descriptor Sets; 11. Textures; 12. Queues and Command Buffers; 13. Swap Chain; 14. Push Constants; 15. Physical Devices; 16. Logical Devices; 17. Dynamic State Variables; 18. Getting Information Back; 19. Compute Shaders; 20. Specialization Constants; 21. Synchronization; 22. Pipeline Barriers; 23. Multisampling; 24. Multipass; 25. Ray Tracing. Section titles that have been greyed out are not included in the abridged note set, i.e., the version made to fit SIGGRAPH's reduced time slot.
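The course's own sample code is not reproduced in this excerpt. As a rough illustration of the kind of setup the early sections cover, the following is a minimal sketch in C that creates a Vulkan instance and counts the available physical devices; it uses only standard Vulkan API calls, but the application name and version choices are illustrative assumptions, not material from the course notes.

```c
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void)
{
    /* Describe the application; the name and version values here are arbitrary. */
    VkApplicationInfo appInfo = {0};
    appInfo.sType            = VK_STRUCTURE_TYPE_APPLICATION_INFO;
    appInfo.pApplicationName = "VulkanSketch";   /* hypothetical name */
    appInfo.apiVersion       = VK_API_VERSION_1_1;

    VkInstanceCreateInfo createInfo = {0};
    createInfo.sType            = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO;
    createInfo.pApplicationInfo = &appInfo;

    /* The instance is the first object every Vulkan program creates. */
    VkInstance instance;
    if (vkCreateInstance(&createInfo, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* Ask how many physical devices (GPUs) this instance can see. */
    uint32_t deviceCount = 0;
    vkEnumeratePhysicalDevices(instance, &deviceCount, NULL);
    printf("Physical devices found: %u\n", deviceCount);

    vkDestroyInstance(instance, NULL);
    return 0;
}
```

Actual drawing additionally requires a logical device, swap chain, pipeline, and command buffers, which is the ground the course sections listed above walk through.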
- GLSL 4.50 Spec
The OpenGL® Shading Language, Language Version 4.50, Document Revision 7, 09-May-2017. Editor: John Kessenich, Google. Version 1.1 authors: John Kessenich, Dave Baldwin, Randi Rost.
Copyright (c) 2008-2017 The Khronos Group Inc. All Rights Reserved. This specification is protected by copyright laws and contains material proprietary to the Khronos Group, Inc. It or any components may not be reproduced, republished, distributed, transmitted, displayed, broadcast, or otherwise exploited in any manner without the express prior written permission of Khronos Group. You may use this specification for implementing the functionality therein, without altering or removing any trademark, copyright or other notice from the specification, but the receipt or possession of this specification does not convey any rights to reproduce, disclose, or distribute its contents, or to manufacture, use, or sell anything that it may describe, in whole or in part. Khronos Group grants express permission to any current Promoter, Contributor or Adopter member of Khronos to copy and redistribute UNMODIFIED versions of this specification in any fashion, provided that NO CHARGE is made for the specification and the latest available update of the specification for any version of the API is used whenever possible. Such distributed specification may be reformatted AS LONG AS the contents of the specification are not changed in any way. The specification may be incorporated into a product that is sold as long as such product includes significant independent work developed by the seller. A link to the current version of this specification on the Khronos Group website should be included whenever possible with specification distributions.
- View the Manual
Base game manual V0.4.
Environment: Remove any objects from the playing area that you might touch or hit with your body while playing, and make sure that lamps or fans are not in the playing area. If you are at the edge of the playing area, avoid large movements, otherwise you could touch the walls with your body or hands/controllers. It is best to stand; stepping movements are not necessary during the game, as you can move and rotate entirely with the controllers.
Technical requirements: To play the game you need a virtual reality headset (VR headset); without one the game will not run. A (free) Steam account is also required, as is the VR software "SteamVR" (when using the HTC VIVE or the Valve Index) or the additional software "Windows Mixed Reality for SteamVR" for using Windows Mixed Reality headsets on Steam with SteamVR. Before the first launch, room setup should have been completed in SteamVR (when using the HTC VIVE or the Valve Index). When using a Windows Mixed Reality headset, there is an option to adjust the floor position via the height setting. The controllers of the VR headset should also be connected to the VR system and charged.
The game: You start the game in the container and walk through the door to the selected construction site. Later, depending on the DLC you own, you can start different locations from here. You can set options on one of the boards by moving the corresponding switch.
- VR Headset Comparison
VR Headset Comparison All data correct as of 1st May 2019 Enterprise Resolution per Tethered or Rendering Special Name Cost ($)* Available DOF Refresh Rate FOV Position Tracking Support Eye Wireless Resource Features Announced Works with Google Subject to Mobile phone 5.00 Yes 3 60 90 None Wireless any mobile No Cardboard mobile device required phone HP Reverb 599.00 Yes 6 2160x2160 90 114 Inside-out camera Tethered PC WMR support Yes Tethered Additional (*wireless HTC VIVE 499.00 Yes 6 1080x1200 90 110 Lighthouse V1 PC tracker No adapter support available) HTC VIVE PC or mobile ? No 6 ? ? ? Inside-out camera Wireless - No Cosmos phone HTC VIVE Mobile phone 799.00 Yes 6 1440x1600 75 110 Inside-out camera Wireless - Yes Focus Plus chipset Tethered Additional HTC VIVE (*wireless tracker 1,099.00 Yes 6 1440x1600 90 110 Lighthouse V1 and V2 PC Yes Pro adapter support, dual available) cameras Tethered All features HTC VIVE (*wireless of VIVE Pro ? No 6 1440x1600 90 110 Lighthouse V1 and V2 PC Yes Pro Eye adapter plus eye available) tracking Lenovo Mirage Mobile phone 399.00 Yes 3 1280x1440 75 110 Inside-out camera Wireless - No Solo chipset Mobile phone Oculus Go 199.00 Yes 3 1280x1440 72 110 None Wireless - Yes chipset Mobile phone Oculus Quest 399.00 No 6 1440x1600 72 110 Inside-out camera Wireless - Yes chipset Oculus Rift 399.00 Yes 6 1080x1200 90 110 Outside-in cameras Tethered PC - Yes Oculus Rift S 399.00 No 6 1280x1440 90 110 Inside-out cameras Tethered PC - No Pimax 4K 699.00 Yes 6 1920x2160 60 110 Lighthouse Tethered PC - No Upscaled -
- Virtual Reality Headsets
Virtual Reality Headsets, presented by Lily Chiang.
VR history:
• Many companies (Virtuality, Sega, Atari, Sony) jumped on the VR hype in the 1990s, but commercialization flopped because both hardware and software failed to deliver on the promised VR vision.
• Any use of VR devices in the 2000s was limited to the military, aviation, and medical industries for simulation and training.
• VR hype resurged after Oculus's successful Kickstarter campaign; the company was subsequently acquired by Facebook for $2.4 bn.
• Investments rushed into the VR industry as major tech firms such as Google, Samsung, and Microsoft and prominent VC firms bet big on the VR revolution.
List of virtual reality headset firms (company, entered, exited, disposition): LEEP Optics, 1979-1998, bankrupt; VPL Research, 1984-1990, bankrupt; Division Group LTD, 1989-1999, acquired; Sega VR, 1991-1994, bankrupt; Virtuality, 1991-1997, acquired; VictorMaxx Systems, 1992-1998, bankrupt; Atari Jaguar VR, 1993-1996, discontinued; Virtual I-O, 1993-1997, bankrupt; eMagin, 1993, ongoing; Virtual Boy, 1994-1995, discontinued; Meta SpaceGlasses, 2012, ongoing; Sulon Cortex, 2012, ongoing; Epson Moverio BT-200, 2012, ongoing; i2i iPal, 2012, ongoing; Star VR, 2013, ongoing; Durovis Dive, 2013, ongoing; Vrizzmo, 2013, ongoing; CastAR, 2013, ongoing; VRAse, 2013, ongoing; Yay3d VR, 2013, ongoing; Altergaze, 2014, ongoing; Archos VR, 2014, ongoing; AirVr, 2014, ongoing; 360Specs, 2014, ongoing; Microsoft HoloLens, 2015, ongoing; Razr OSVR, 2015, ongoing; Cmoar, 2015, ongoing; Dior Eyes VR, 2015, ongoing; Impression Pi
- Attitudes Towards Virtual Reality Gaming and Products
Bachelor's thesis by Eppu Siirtola, Bachelor's Program in International Business, Aalto University School of Business, Mikkeli Campus. Supervisor: Suzanne Altobello. Date of approval: 9 April 2018.
Objectives: The main objective of this study was to explore attitudes towards virtual reality gaming and the products surrounding it, in order to identify the reasons why consumers choose to purchase and adopt these devices. The study also analyzes the components of added value that virtual reality gaming brings on top of a more traditional gaming experience.
Summary: An extensive examination of already published academic research on product adoption and the value concept was conducted to identify the key notions that would act as the cornerstones of the primary research. A quantitative survey was designed for the relevant sample to see whether the earlier findings on product adoption and the value concept resonated with the results gathered from the survey focusing on virtual reality gaming and the products surrounding it.
Conclusions: The research shows that product adoption in the virtual reality gaming market essentially operates under the already established concepts of product adoption.
- Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration
Original research article, published 14 June 2021, doi: 10.3389/frvir.2021.697367. Full title: "Eye See What You See: Exploring How Bi-Directional Augmented Reality Gaze Visualisation Influences Co-Located Symmetric Collaboration". Authors: Allison Jing, Kieran May, Gun Lee, and Mark Billinghurst, Empathic Computing Lab, Australian Research Centre for Interactive and Virtual Environment, STEM, The University of South Australia, Mawson Lakes, SA, Australia. Edited by Parinya Punpongsanon, Osaka University, Japan; reviewed by Naoya Isoyama, Nara Institute of Science and Technology (NAIST), Japan, and Thuong Hoang, Deakin University, Australia.
Abstract: Gaze is one of the predominant communication cues and can provide valuable implicit information such as intention or focus when performing collaborative tasks. However, little research has been done on how virtual gaze cues combining spatial and temporal characteristics impact real-life physical tasks during face-to-face collaboration. In this study, we explore the effect of showing joint gaze interaction in an Augmented Reality (AR) interface by evaluating three bi-directional collaborative (BDC) gaze visualisations with three levels of gaze behaviours. Using three independent tasks, we found that all BDC visualisations are rated significantly better at representing joint attention and user intention compared to a non-collaborative (NC) condition, and hence are considered more engaging. The Laser Eye condition, spatially embodied with gaze direction, is perceived as significantly more effective as it encourages mutual gaze awareness with a relatively low mental effort in a less constrained workspace. In addition, by offering additional virtual representation that compensates for verbal descriptions and hand pointing, BDC gaze visualisations can encourage more conscious use of gaze cues coupled with deictic references during co-located symmetric collaboration.
- Implementing FPGA Design with the OpenCL Standard
White paper WP-01173-3.0. Utilizing the Khronos Group's OpenCL™ standard on an FPGA may offer significantly higher performance at much lower power than is available today from hardware architectures such as CPUs, graphics processing units (GPUs), and digital signal processing (DSP) units. In addition, an FPGA-based heterogeneous system (CPU + FPGA) using the OpenCL standard has a significant time-to-market advantage compared to traditional FPGA development using lower-level hardware description languages (HDLs) such as Verilog or VHDL. (OpenCL and the OpenCL logo are trademarks of Apple Inc., used by permission by Khronos.)
Introduction: The initial era of programmable technologies contained two different extremes of programmability. As illustrated in Figure 1, one extreme was represented by single-core CPU and digital signal processing (DSP) units. These devices were programmable using software consisting of a list of instructions to be executed. These instructions were created in a manner that was conceptually sequential to the programmer, although an advanced processor could reorder instructions to extract instruction-level parallelism from these sequential programs at run time. In contrast, the other extreme of programmable technology was represented by the FPGA. These devices are programmed by creating configurable hardware circuits, which execute completely in parallel. A designer using an FPGA is essentially creating a massively fine-grained parallel application. For many years, these extremes coexisted, with each type of programmability being applied to different application domains. However, recent trends in technology scaling have favored technologies that are both programmable and parallel.
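Since the white paper's own code listings are not part of this excerpt, here is a minimal OpenCL C kernel, given purely as an illustration of the programming model being described; the kernel name and arguments are assumptions, not taken from the paper.

```c
/* Each work-item computes one output element. The same data-parallel source
   can map to many threads on a CPU/GPU, or be compiled by an FPGA OpenCL
   compiler into a deeply pipelined hardware circuit. */
__kernel void vector_add(__global const float *a,
                         __global const float *b,
                         __global float *result)
{
    int gid = get_global_id(0);   /* index of this work-item in the NDRange */
    result[gid] = a[gid] + b[gid];
}
```

A host program would compile this kernel, set its three buffer arguments, and enqueue it over an NDRange equal to the vector length.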
- Advanced Displays and Techniques for Telepresence
Henry Fuchs, UNC Chapel Hill. ARPA-E Telepresence Workshop, April 26, 2016. Coauthors for papers in this talk: Young-Woon Cha, Rohan Chabra, Nate Dierk, Mingsong Dou, Wil Garrett, Gentaro Hirota, Kurtis Keller, Doug Lanman, Mark Livingston, David Luebke, Andrew Maimone, Federico Menozzi, Etta Pisano, MD, Kishore Rathinavel, Andrei State, Eric Wallen, MD, Greg Welch, Mary Whitton, Xubo Yang. [Slide image: surgical consultation telepresence, Andrei State (UNC), 1994.] Support gratefully acknowledged: CISCO, Microsoft Research, NIH, NVIDIA, NSF Awards IIS-CHS-1423059, HCC-CGV-1319567, and II-1405847 ("Seeing the Future: Ubiquitous Computing in EyeGlasses"), and the BeingThere Int'l Research Centre, a collaboration of ETH Zurich, NTU Singapore, UNC Chapel Hill and the Singapore National Research Foundation, Media Development Authority, and Interactive Digital Media Program Office.
Video teleconferencing vs. telepresence: Conventional video teleconferencing uses 2D video capture and display, with a single camera and single display at each site being the common configuration for Skype, Google Hangout, etc. Telepresence, in contrast, provides the illusion of presence in the remote or combined local-and-remote space, provides proper stereo views from the precise location of the user, changes those stereo views appropriately as the user moves, and provides proper eye contact and eye-gaze cues among all the participants. [Slide image: Cisco TelePresence 3000, three distant rooms combined into a single space with wall-sized 3D displays.]
Telepresence component technologies: acquisition (cameras) and 3D reconstruction.
- 2020 IEEE Conference on Virtual Reality and 3D User Interfaces
2020 IEEE Conference on Virtual Reality and 3D User Interfaces (VR 2020), Atlanta, Georgia, USA, 22-26 March 2020. IEEE Catalog Number: CFP20VIR-POD. ISBN (print on demand): 978-1-7281-5609-5. ISBN (online): 978-1-7281-5608-8. ISSN: 2642-5246.
Copyright © 2020 by the Institute of Electrical and Electronics Engineers, Inc. All rights reserved. Copyright and reprint permissions: abstracting is permitted with credit to the source. Libraries are permitted to photocopy beyond the limit of U.S. copyright law, for private use of patrons, those articles in this volume that carry a code at the bottom of the first page, provided the per-copy fee indicated in the code is paid through the Copyright Clearance Center, 222 Rosewood Drive, Danvers, MA 01923. For other copying, reprint or republication permission, write to IEEE Copyrights Manager, IEEE Service Center, 445 Hoes Lane, Piscataway, NJ 08854. This is a print representation of what appears in the IEEE Digital Library; some format issues inherent in the e-media version may also appear in this print version.
Additional copies of this publication are available from Curran Associates, Inc., 57 Morehouse Lane, Red Hook, NY 12571, USA. Phone: (845) 758-0400. Fax: (845) 758-2633. E-mail: [email protected]. Web: www.proceedings.com
Table of contents (beginning): General Chairs Message, xix; Conference Paper Program Chairs Message, xx; IEEE Visualization and Graphics Technical
- New Realities: Risks in the Virtual World
Lloyd's Emerging Risk Report 2018 (Technology).
Lloyd's disclaimer: This report has been co-produced by Lloyd's and Amelia Kallman for general information purposes only. While care has been taken in gathering the data and preparing the report, Lloyd's does not make any representations or warranties as to its accuracy or completeness and expressly excludes to the maximum extent permitted by law all those that might otherwise be implied. Lloyd's accepts no responsibility or liability for any loss or damage of any nature occasioned to any person as a result of acting or refraining from acting as a result of, or in reliance on, any statement, fact, figure or expression of opinion or belief contained in this report. This report does not constitute advice of any kind. © Lloyd's 2018. All rights reserved.
About the author: Amelia Kallman is a leading London futurist, speaker, and author. As an innovation and technology communicator, Amelia regularly writes, consults, and speaks on the impact of new technologies on the future of business and our lives. She is an expert on the emerging risks of The New Realities (VR-AR-MR) and also specialises in the future of retail. Coming from a theatrical background, Amelia started her tech career by chance in 2013 at a creative technology agency, where she worked her way up to become their Global Head of Innovation. She opened, operated, and curated innovation lounges in both London and Dubai, working with start-ups and corporate clients to develop connections and future-proof strategies. Today she continues to discover and bring attention to cutting-edge start-ups, regularly curating events for WIRED UK.
- Standards for Vision Processing and Neural Networks
Radhakrishna Giduthuri, AMD ([email protected]). © Copyright Khronos Group 2017.
Agenda: why we need a standard; Khronos NNEF; Khronos OpenVX. [Slide diagram: an input image (a dog), a network architecture, and a pre-trained network model (weights, …).]
Neural network end-to-end workflow: training frameworks consume datasets and a network architecture to produce a trained network model; third-party tools and vision/AI applications hand that model to a neural network inferencing runtime, which runs on desktop and cloud hardware (cuDNN, MIOpen, MKL-DNN) or on embedded/mobile/desktop/cloud vision and inferencing hardware (GPU, DSP, CPU, custom FPGA).
Problem: neural network fragmentation. Training and inferencing are fragmented: with several NN authoring frameworks and several inference engines, every tool needs an exporter to every accelerator. Inferencing fragmentation also takes a toll on applications: with multiple inference engines targeting different hardware, every application needs to know about every accelerator API.
Khronos APIs connect software to silicon. Khronos is an international industry consortium of over 100 companies creating royalty-free, open standard APIs to enable software to access
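To make the OpenVX item on the agenda concrete, here is a minimal sketch of an OpenVX graph in C that builds and runs a single Gaussian-blur node. It sticks to standard OpenVX 1.x entry points, but the image size and the choice of filter are arbitrary illustrations, not content from the presentation.

```c
#include <VX/vx.h>
#include <stdio.h>

int main(void)
{
    /* An OpenVX application describes its vision pipeline as a graph of
       nodes; the runtime then maps that graph onto whatever accelerator
       (GPU, DSP, CPU, custom hardware) the vendor implementation supports. */
    vx_context context = vxCreateContext();

    /* Illustrative 640x480 8-bit images; real code would import camera frames. */
    vx_image input  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);
    vx_image output = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);

    vx_graph graph = vxCreateGraph(context);
    vxGaussian3x3Node(graph, input, output);   /* one 3x3 Gaussian blur node */

    if (vxVerifyGraph(graph) == VX_SUCCESS) {
        vxProcessGraph(graph);                 /* execute the whole graph */
        printf("OpenVX graph executed\n");
    }

    vxReleaseGraph(&graph);
    vxReleaseImage(&input);
    vxReleaseImage(&output);
    vxReleaseContext(&context);
    return 0;
}
```

The graph-level abstraction is what lets a single application run unchanged across the different GPU, DSP, CPU, and FPGA back ends mentioned in the workflow slide.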