Building the Metaverse One Open Standard at a Time


Khronos APIs and 3D Asset Formats for XR
Neil Trevett, Khronos President and NVIDIA VP Developer Ecosystems | [email protected] | @neilt3d | April 2020
This work is licensed under a Creative Commons Attribution 4.0 International License. © The Khronos® Group Inc. 2020

Khronos Connects Software to Silicon
- Open interoperability standards that enable software to effectively harness the power of multiprocessor and accelerator silicon: 3D graphics, XR, parallel programming, vision acceleration and machine learning.
- A non-profit, member-driven standards-defining industry consortium, open to any interested company.
- All Khronos standards are royalty-free, and a well-defined IP framework protects participants' intellectual property.
- More than 150 members: roughly 40% US, 30% Europe and 30% Asia.

Khronos Active Initiatives
- 3D graphics: desktop, mobile, web, embedded and safety-critical.
- 3D assets: authoring and delivery.
- Portable XR: augmented and virtual reality.
- Parallel computation: vision, inferencing and machine learning.
- Guidelines for creating APIs that streamline system safety certification.

Pervasive Vulkan
- Desktop and mobile GPUs (see http://vulkan.gpuinfo.org/).
- Platforms: desktop, Android (supported since Android 7.0, with Vulkan 1.1 required on Android Q), Apple (via porting layers), media players and consoles.
- Used across virtual reality, cloud services, game streaming, embedded systems and game engines such as the Croteam Serious Engine.
- Note: the version of Vulkan available depends on platform and vendor.

Vulkan Roadmap
- Vulkan 1.1 extensions (maintenance updates plus additional functionality): timeline semaphores, DX/HLSL compatibility, reduced-precision arithmetic, the formal memory model, buffer references and SPIR-V 1.5 (January 2020).
- Roadmap discussions: ray tracing, variable rate shading, bindless resources, accelerated video encode/decode, machine learning primitives and mesh shaders.
- Variable rate shading enables concentration of rendering power at the foveal point in eye-tracked XR systems.

Ray Tracing is a Flexible Technique
- Programmers need programmable flexibility to trace rays through scenes for a wide variety of visual effects; some examples are ambient occlusion, reflections, shadows and global illumination.

Vulkan Ray Tracing
Two ways to launch rays into a scene to generate highly realistic visuals, both using a BVH data structure (the acceleration structure) to enable efficient ray tracing through a 3D scene:
- Ray tracing pipelines: a new type of graphics pipeline with implicit management of ray intersections. The application compiles a set of shaders into the pipeline to provide the desired ray and material processing.
- Ray queries: any type of shader can launch a ray at any time and process the intersection data however it wishes.

The ray tracing pipeline proceeds as follows:
1. The Ray Generation shader launches a 2D/3D grid of rays into a scene contained in an acceleration structure.
2. During acceleration structure traversal, an Intersection shader computes ray intersections; ray-triangle intersections are built in.
3. If an intersection is found, the Any Hit shader is invoked. Multiple intersections are possible, in arbitrary order, and the shader controls how traversal proceeds.
4. The Closest Hit shader is invoked on the closest intersection of the ray, or the Miss shader is invoked if no hit is found. Either can trace more rays.
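To make the "compile a set of shaders into the pipeline" step concrete, below is a minimal, hedged C sketch of assembling a ray tracing pipeline. It assumes the VK_KHR_ray_tracing_pipeline extension as later finalized (this deck predates the final specification) and omits shader-module creation, acceleration-structure builds, the shader binding table and error handling; create_rt_pipeline is an illustrative helper name, not part of any Khronos deliverable.

```c
/* Hedged sketch: one raygen, one miss and one triangles hit group.
 * In a real application the vkCreateRayTracingPipelinesKHR entry point is
 * fetched with vkGetDeviceProcAddr after enabling the extension. */
#include <vulkan/vulkan.h>

VkPipeline create_rt_pipeline(VkDevice device, VkPipelineLayout layout,
                              VkShaderModule raygen, VkShaderModule miss,
                              VkShaderModule closest_hit)
{
    /* One stage per shader the application supplies. */
    VkPipelineShaderStageCreateInfo stages[3] = {
        { .sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO,
          .stage = VK_SHADER_STAGE_RAYGEN_BIT_KHR, .module = raygen, .pName = "main" },
        { .sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO,
          .stage = VK_SHADER_STAGE_MISS_BIT_KHR, .module = miss, .pName = "main" },
        { .sType = VK_STRUCTURE_TYPE_PIPELINE_SHADER_STAGE_CREATE_INFO,
          .stage = VK_SHADER_STAGE_CLOSEST_HIT_BIT_KHR, .module = closest_hit, .pName = "main" },
    };

    /* Shader groups map stages onto the roles in the traversal steps above;
     * the triangles hit group uses the built-in ray-triangle intersection. */
    VkRayTracingShaderGroupCreateInfoKHR groups[3] = {
        { .sType = VK_STRUCTURE_TYPE_RAY_TRACING_SHADER_GROUP_CREATE_INFO_KHR,
          .type = VK_RAY_TRACING_SHADER_GROUP_TYPE_GENERAL_KHR,
          .generalShader = 0, .closestHitShader = VK_SHADER_UNUSED_KHR,
          .anyHitShader = VK_SHADER_UNUSED_KHR, .intersectionShader = VK_SHADER_UNUSED_KHR },
        { .sType = VK_STRUCTURE_TYPE_RAY_TRACING_SHADER_GROUP_CREATE_INFO_KHR,
          .type = VK_RAY_TRACING_SHADER_GROUP_TYPE_GENERAL_KHR,
          .generalShader = 1, .closestHitShader = VK_SHADER_UNUSED_KHR,
          .anyHitShader = VK_SHADER_UNUSED_KHR, .intersectionShader = VK_SHADER_UNUSED_KHR },
        { .sType = VK_STRUCTURE_TYPE_RAY_TRACING_SHADER_GROUP_CREATE_INFO_KHR,
          .type = VK_RAY_TRACING_SHADER_GROUP_TYPE_TRIANGLES_HIT_GROUP_KHR,
          .generalShader = VK_SHADER_UNUSED_KHR, .closestHitShader = 2,
          .anyHitShader = VK_SHADER_UNUSED_KHR, .intersectionShader = VK_SHADER_UNUSED_KHR },
    };

    VkRayTracingPipelineCreateInfoKHR info = {
        .sType = VK_STRUCTURE_TYPE_RAY_TRACING_PIPELINE_CREATE_INFO_KHR,
        .stageCount = 3, .pStages = stages,
        .groupCount = 3, .pGroups = groups,
        .maxPipelineRayRecursionDepth = 1,
        .layout = layout,
    };

    VkPipeline pipeline = VK_NULL_HANDLE;
    vkCreateRayTracingPipelinesKHR(device, VK_NULL_HANDLE, VK_NULL_HANDLE,
                                   1, &info, NULL, &pipeline);
    return pipeline;
}
```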
OpenXR – Cross-Platform Portable AR/VR
- Working group participants span game engines, desktop and console VR, AR, mobile VR, GPUs and UI devices.
- OpenXR is a collaborative design, integrating many lessons from proprietary 'first-generation' XR API designs.
- OpenXR 1.0 is focused on enabling cross-platform applications; an optional device plugin interface will be supported post-1.0.

OpenXR 1.0 Availability
- The provisional specification was released at GDC, March 2019; significant community feedback led to an improved input subsystem, game engine editor support and loader improvements.
- OpenXR 1.0 was ratified and released at SIGGRAPH, July 2019. Next steps are finalizing the conformance test suite and enabling officially conformant implementations.
- OpenXR is available for Windows Mixed Reality headsets and HoloLens 2, plus extensions to support HoloLens 2 hand tracking, eye tracking, spatial mapping and spatial anchors, with many more coming.
- OpenXR supports Oculus Rift and Oculus Quest: the Oculus PC SDK (for Rift) and Android SDK (for Quest) include OpenXR for native C/C++ development.
- The 'Monado' open source OpenXR implementation supports OpenHMD hardware and the Nova Northstar AR HMD.
- An OpenXR 1.0 plugin is available for Unreal Engine 4.24.

OpenXR Win-Win-Win
- XR end users can run the apps they want on their system, reducing market confusion and increasing consumer confidence.
- XR vendors can bring more applications onto their platform by leveraging the OpenXR content ecosystem.
- XR ISVs can easily ship on more platforms for increased market reach.

OpenXR is used with a 3D API
- OpenXR gives the application or engine cross-platform access to XR HMDs and sensors: the XR application lifecycle, input device discovery and events, sensor tracking and pose calculation, frame timing and display composition, and haptics control.
- High-performance, low-latency 3D rendering and composition is handled by a 3D API; OpenXR can be used with Vulkan and with other 3D APIs such as Direct3D, OpenGL and OpenGL ES. Relevant rendering features include multiview, context priority, front-buffer rendering, tiled rendering (beam racing) and variable rate rendering.
- The display system performs composition and optical correction. A minimal frame-loop sketch follows below.
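To show how this division of labour looks in code, here is a hedged C sketch of one OpenXR frame: the runtime owns frame timing and composition while the 3D API renders the views. Session, swapchain and projection-layer setup are assumed to exist elsewhere, and render_views_with_3d_api is a hypothetical helper standing in for the Vulkan/D3D/GL rendering work.

```c
/* Hedged sketch: the per-frame handshake between the application and an
 * OpenXR runtime. Error handling is omitted for brevity. */
#include <openxr/openxr.h>

/* Hypothetical helper that renders both eye views with the chosen 3D API. */
extern void render_views_with_3d_api(XrTime predicted_display_time);

void render_one_frame(XrSession session,
                      const XrCompositionLayerProjection *projection_layer)
{
    /* 1. Ask the runtime when this frame will be displayed. */
    XrFrameState frame_state = { .type = XR_TYPE_FRAME_STATE };
    XrFrameWaitInfo wait_info = { .type = XR_TYPE_FRAME_WAIT_INFO };
    xrWaitFrame(session, &wait_info, &frame_state);

    /* 2. Mark the start of rendering work for this frame. */
    XrFrameBeginInfo begin_info = { .type = XR_TYPE_FRAME_BEGIN_INFO };
    xrBeginFrame(session, &begin_info);

    uint32_t layer_count = 0;
    const XrCompositionLayerBaseHeader *layers[1];
    if (frame_state.shouldRender) {
        /* 3. Render the views with the 3D API (not shown here). */
        render_views_with_3d_api(frame_state.predictedDisplayTime);
        layers[layer_count++] =
            (const XrCompositionLayerBaseHeader *)projection_layer;
    }

    /* 4. Hand the rendered layers back to the runtime's compositor. */
    XrFrameEndInfo end_info = {
        .type = XR_TYPE_FRAME_END_INFO,
        .displayTime = frame_state.predictedDisplayTime,
        .environmentBlendMode = XR_ENVIRONMENT_BLEND_MODE_OPAQUE,
        .layerCount = layer_count,
        .layers = layers,
    };
    xrEndFrame(session, &end_info);
}
```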
Bringing XR to the Web
- The Web will evolve into the Metaverse alongside native XR apps and Web XR apps.
- WebXR lifts OpenXR functionality into the Web stack, with close cooperation between WebXR and OpenXR; Web 3D engines build on WebXR just as native 3D engines build on OpenXR.
- Khronos provides the foundation for native and Web-based 3D/XR.

OpenXR and Edge Server Applications with 5G
- A wireless mobile device with display and sensors sends low-latency sensor data and location-aware content requests to a MEC (Multi-access Edge Computing) server; the needed apps and 3D assets are loaded to the edge server (for example NVIDIA EGX), and the generated augmentations and scenes are sent back.
- The edge server (1) processes sensor data, including machine learning for environmental lighting, occlusion, scene semantics, object reconstruction and UI, and (2) generates imagery from 3D models, including stereo, foveal rendering, ray tracing, optics pre-distortion and varifocal processing.
- The device performs sensor handling and display composition, and the OpenXR API hides the 5G round trip from applications.

glTF 2.0 Scene Description Structure
- .gltf (JSON): node hierarchy, PBR material textures and cameras.
- .bin: geometry (vertices and indices), animation key-frames and skins (inverse-bind matrices).
- .png / .jpg / .ktx2: textures.
- Texture-based PBR materials: metallic-roughness is mandatory, specular-glossiness is optional.
- A small parsing sketch appears at the end of this summary.

glTF Roadmap
- Second-generation Physically-Based Rendering (PBR): a set of coherent extensions covering clear coat (imminent), absorption/attenuation, subsurface scattering and anisotropy, with inspiration from the Dassault Systèmes Enterprise PBR Shading Model (DSPBR) and MDL, and wide industry cooperation.
- Universal textures: JPEG must be fully decompressed into GPU memory, whereas universal textures can be transcoded directly to compressed GPU textures using Basis Universal encoding/transcoding in a KTX2 container. For the FlightHelmet_baseColor texture (2048 x 2048, RGB), the file size is 2,778,518 bytes as PNG, 315,619 bytes as JPEG and 232,104 bytes as a glTF universal texture (ETC1S), while the GPU size drops from 12,582,912 bytes uncompressed to 2,097,152 bytes with Basis Universal in the KTX2 container.
- Further items, from imminent to seeking requirements: instancing (imminent), subdivision surfaces, advanced animation, LOD and streaming, compressed point clouds, cross-asset linking, enhanced metadata, composability, CAD/BIM model support, encryption and security, and 3D printing.
- The Working Group constantly balances feature requests against the 'glTF Prime Directive': remain a universal and easy-to-process delivery format.

3D Commerce – The Opportunity
- 3D Commerce is e-commerce enhanced with the use of 3D models on any platform, including VR and AR.
- Early experience shows increased customer engagement, strengthened brand loyalty and deeper product understanding = $$$!
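As a concrete illustration of the glTF 2.0 structure summarized above, this hedged C sketch parses an asset and walks its node, mesh and material arrays. It assumes the third-party single-header cgltf library, which is not a Khronos deliverable; error reporting is minimal.

```c
/* Hedged sketch: load a .gltf/.bin asset with cgltf and report the arrays
 * that the JSON manifest declares. */
#include <stdio.h>
#include "cgltf.h"   /* https://github.com/jkuhlmann/cgltf */

int inspect_gltf(const char *path)
{
    cgltf_options options = {0};
    cgltf_data *data = NULL;

    if (cgltf_parse_file(&options, path, &data) != cgltf_result_success)
        return -1;
    /* Pull in the external .bin buffers referenced by the JSON. */
    if (cgltf_load_buffers(&options, data, path) != cgltf_result_success) {
        cgltf_free(data);
        return -1;
    }

    printf("%zu nodes, %zu meshes, %zu materials, %zu animations\n",
           (size_t)data->nodes_count, (size_t)data->meshes_count,
           (size_t)data->materials_count, (size_t)data->animations_count);

    /* Metallic-roughness is the mandatory PBR model; specular-glossiness
     * arrives via the KHR_materials_pbrSpecularGlossiness extension. */
    for (cgltf_size i = 0; i < data->materials_count; ++i) {
        const cgltf_material *m = &data->materials[i];
        printf("material %zu: metallic-roughness=%d specular-glossiness=%d\n",
               (size_t)i, m->has_pbr_metallic_roughness,
               m->has_pbr_specular_glossiness);
    }

    cgltf_free(data);
    return 0;
}
```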
Recommended publications
  • Introduction to the Vulkan Computer Graphics API
    Introduction to the Vulkan Computer Graphics API. Mike Bailey ([email protected]), Oregon State University. SIGGRAPH 2020, abridged version. This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License. Course notes: http://cs.oregonstate.edu/~mjb/vulkan

    Course goals:
    - Give a sense of how Vulkan is different from OpenGL.
    - Show how to do basic drawing in Vulkan.
    - Leave you with working, documented, understandable sample code.

    About the presenter: Mike Bailey is a Professor of Computer Science at Oregon State University, has been in computer graphics for over 30 years, and has had over 8,000 students in his university classes. "Welcome! I'm happy to be here. I hope you are too!"

    Sections: 1. Introduction; 2. Sample Code; 3. Drawing; 4. Shaders and SPIR-V; 5. Data Buffers; 6. GLFW; 7. GLM; 8. Instancing; 9. Graphics Pipeline Data Structure; 10. Descriptor Sets; 11. Textures; 12. Queues and Command Buffers; 13. Swap Chain; 14. Push Constants; 15. Physical Devices; 16. Logical Devices; 17. Dynamic State Variables; 18. Getting Information Back; 19. Compute Shaders; 20. Specialization Constants; 21. Synchronization; 22. Pipeline Barriers; 23. Multisampling; 24. Multipass; 25. Ray Tracing. Section titles that have been greyed out are not included in the abridged note set, i.e. the one made to fit in SIGGRAPH's reduced time slot.
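    As a taste of the "Sample Code" and "Physical Devices" material listed above, here is a hedged, minimal C sketch of the very first Vulkan steps: creating an instance and counting physical devices. It is not taken from the course's sample code, and the application name is illustrative.

```c
/* Hedged sketch: everything in Vulkan is spelled out through *CreateInfo
 * structures rather than the implicit context state of OpenGL. */
#include <vulkan/vulkan.h>
#include <stdio.h>

int main(void)
{
    VkApplicationInfo app = {
        .sType = VK_STRUCTURE_TYPE_APPLICATION_INFO,
        .pApplicationName = "vulkan-hello",   /* hypothetical sample name */
        .apiVersion = VK_API_VERSION_1_1,
    };
    VkInstanceCreateInfo create_info = {
        .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO,
        .pApplicationInfo = &app,
    };

    VkInstance instance = VK_NULL_HANDLE;
    if (vkCreateInstance(&create_info, NULL, &instance) != VK_SUCCESS) {
        fprintf(stderr, "vkCreateInstance failed\n");
        return 1;
    }

    /* Enumerate physical devices - the step before creating a logical
     * device, queues and the swap chain covered later in the course. */
    uint32_t gpu_count = 0;
    vkEnumeratePhysicalDevices(instance, &gpu_count, NULL);
    printf("Vulkan reports %u physical device(s)\n", gpu_count);

    vkDestroyInstance(instance, NULL);
    return 0;
}
```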
  • New Realities: Risks in the Virtual World
    Emerging Risk Report 2018, Technology: New realities, risks in the virtual world.

    Lloyd's disclaimer: This report has been co-produced by Lloyd's and Amelia Kallman for general information purposes only. While care has been taken in gathering the data and preparing the report, Lloyd's does not make any representations or warranties as to its accuracy or completeness and expressly excludes to the maximum extent permitted by law all those that might otherwise be implied. Lloyd's accepts no responsibility or liability for any loss or damage of any nature occasioned to any person as a result of acting or refraining from acting as a result of, or in reliance on, any statement, fact, figure or expression of opinion or belief contained in this report. This report does not constitute advice of any kind. © Lloyd's 2018. All rights reserved.

    About the author: Amelia Kallman is a leading London futurist, speaker, and author. As an innovation and technology communicator, Amelia regularly writes, consults, and speaks on the impact of new technologies on the future of business and our lives. She is an expert on the emerging risks of The New Realities (VR-AR-MR), and also specialises in the future of retail. Coming from a theatrical background, Amelia started her tech career by chance in 2013 at a creative technology agency where she worked her way up to become their Global Head of Innovation. She opened, operated and curated innovation lounges in both London and Dubai, working with start-ups and corporate clients to develop connections and future-proof strategies. Today she continues to discover and bring attention to cutting-edge start-ups, regularly curating events for WIRED UK.
  • Exploring Telepresence in Virtual Worlds
    Exploring Telepresence in Virtual Worlds. Dan Zhang (z3378568). A thesis in fulfillment of the requirements for the degree of Doctor of Philosophy, School of Information Systems and Technology Management, UNSW Business School, March 2018.

    Abstract: Virtual worlds, as computer-based simulated environments incorporating various representations of real-world elements, have great potential to not only transform the structures and operation modes of various industries but also change the way people work, do business, learn, play, and communicate. However, the existing sharp distinctions between virtual worlds and the real world also bring critical challenges. To address these challenges, the concept of telepresence—the user's feeling of 'being there' in the virtual environments—is adopted, as it is considered a direct and essential consequence of a virtual world's reality. To cultivate this feeling, it is essential to understand what factors can lead to telepresence. However, some literature gaps on telepresence antecedents impede the understanding of telepresence antecedents and affect the adoption of the telepresence construct in the design of virtual worlds. To address these issues, this study explores the concept of telepresence in the context of virtual worlds. Specifically, by adopting means-end chain (MEC) theory, the study aims to investigate the antecedents of telepresence; to reveal the inter-relationships among these antecedents by building a hierarchical structure; and to develop an innovative approach for user segmentation to understand in-depth individual differences in perceiving telepresence.
  • Standards for Vision Processing and Neural Networks
    Standards for Vision Processing and Neural Networks. Radhakrishna Giduthuri, AMD ([email protected]). © Copyright Khronos Group 2017.

    Agenda: why we need a standard; Khronos NNEF; Khronos OpenVX.

    Neural network end-to-end workflow: training frameworks use datasets and third-party tools to produce a trained network model (architecture plus weights); vision and neural network inferencing runtimes then execute that model on desktop, cloud, embedded or mobile hardware (GPU, DSP, CPU, custom, FPGA), with vendor libraries such as cuDNN, MIOpen and MKL-DNN underneath.

    Problem: neural network fragmentation. On the training and inferencing side, every authoring framework needs an exporter to every inference engine. On the application side, every vision/AI application needs to know about every accelerator API, because each inference engine targets different hardware, and fragmentation takes its toll on applications.

    Khronos APIs connect software to silicon: Khronos is an international industry consortium of over 100 companies creating royalty-free, open standard APIs to enable software to access
  • Khronos Template 2015
    Ecosystem Overview. Neil Trevett, Khronos President, NVIDIA Vice President Developer Ecosystem ([email protected] | @neilt3d). © Copyright Khronos Group 2016. See also http://accelerateyourworld.org/

    Khronos mission: Khronos is an industry consortium of over 100 companies creating royalty-free, open standard APIs to enable software to access hardware acceleration for graphics, parallel compute and vision.

    Vision pipeline challenges and opportunities: growing camera diversity, diverse vision processors and sensor proliferation. Flexible sensor and camera control is needed to generate an image stream, efficient acceleration to process the image stream, and the vision output must be combined with other sensor data on device.

    OpenVX – low power vision acceleration:
    - A higher-level abstraction API targeted at real-time mobile and embedded platforms.
    - Performance portability across diverse architectures: multi-core CPUs, GPUs, DSPs and DSP arrays, ISPs, dedicated hardware and more.
    - Extends portable vision acceleration to very low power domains: it does not require a high-power CPU/GPU complex, has lower precision requirements than OpenCL, and a low-power host can set up and manage the frame-rate graph that the vision engine middleware runs for the application.
    - In rough terms, dedicated vision processing hardware is on the order of 100x more power-efficient than a CPU, with vision DSPs and GPU compute accelerators in between, at the cost of decreasing computation flexibility.

    OpenVX graphs: see the sketch below.
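    The graph-based model the slides describe can be illustrated with a short, hedged C sketch against the standard OpenVX 1.x API: the application declares images and nodes up front, then the runtime verifies and executes the whole graph, which is what lets low-power implementations schedule it across CPU, GPU, DSP or dedicated hardware. The 640x480 resolution and the Gaussian-then-Sobel pipeline are arbitrary choices for illustration.

```c
/* Hedged sketch: a two-node OpenVX vision graph. */
#include <VX/vx.h>

int run_blur_sobel_graph(void)
{
    vx_context context = vxCreateContext();
    vx_graph   graph   = vxCreateGraph(context);

    vx_image input   = vxCreateImage(context, 640, 480, VX_DF_IMAGE_U8);
    /* Virtual images are intermediates the runtime may place wherever is
     * most efficient (on-chip memory, tiles, ...). */
    vx_image blurred = vxCreateVirtualImage(graph, 640, 480, VX_DF_IMAGE_U8);
    vx_image grad_x  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);
    vx_image grad_y  = vxCreateImage(context, 640, 480, VX_DF_IMAGE_S16);

    /* Declare the whole pipeline: Gaussian blur feeding a Sobel filter. */
    vxGaussian3x3Node(graph, input, blurred);
    vxSobel3x3Node(graph, blurred, grad_x, grad_y);

    /* The implementation checks and optimizes the graph once, then it can
     * be executed every frame. */
    vx_status status = vxVerifyGraph(graph);
    if (status == VX_SUCCESS)
        status = vxProcessGraph(graph);

    vxReleaseImage(&input);
    vxReleaseImage(&grad_x);
    vxReleaseImage(&grad_y);
    vxReleaseGraph(&graph);      /* releases the nodes and virtual images */
    vxReleaseContext(&context);
    return status == VX_SUCCESS ? 0 : -1;
}
```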
  • Migrating from OpenGL to Vulkan
    Migrating from OpenGL to Vulkan. Mark Kilgard, January 19, 2016.

    About the speaker: Mark Kilgard is a Principal Graphics Software Engineer in Austin, Texas; a long-time OpenGL driver developer at NVIDIA; author and implementer of many OpenGL extensions; collaborated on the development of Cg, the first commercial GPU shading language; recently working on GPU-accelerated vector graphics (and yes, wrote GLUT in ages past).

    Motivation for the talk, coming from OpenGL and preparing for Vulkan: what kinds of apps benefit from Vulkan, how to prepare your OpenGL code base to transition to Vulkan, how various common OpenGL usage scenarios are re-thought in Vulkan, and re-thinking your application structure for Vulkan.

    An analogy for the different valid approaches:
    - Fixed-function OpenGL is a pre-assembled toy car: fun out of the box, not much room for customization.
    - Modern AZDO OpenGL with programmable shaders (AZDO = Approaching Zero Driver Overhead) is a LEGO kit: you build it yourself, and it comes with plenty of useful, pre-shaped pieces.
    - Vulkan is a Pine Wood Derby kit: you build it yourself to race from raw materials; power tools are used to assemble it, and adult supervision is highly recommended.

    Beneficial Vulkan scenarios:
    - Parallelizable CPU-bound graphics work: if your graphics work is CPU bound and its creation can be parallelized, Vulkan is a friendly fit (see the sketch below).
    - Maximizing a fixed graphics platform budget: you'll do whatever it takes to squeeze out maximum performance.
    - Managing predictable performance, free of hitching: you put a premium on avoiding hitches and can manage your graphics resource allocations.
    Unlikely to benefit: scenarios to reconsider coding to Vulkan.
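    To ground the "parallelizable CPU-bound graphics work" scenario, here is a hedged C sketch of the Vulkan pattern such talks usually point to: each worker thread records its own secondary command buffer from its own command pool, something OpenGL's single-context model cannot express. Pipeline, render pass and framebuffer creation are assumed to exist elsewhere, record_draws_for_range is a hypothetical helper, and pool cleanup is omitted.

```c
/* Hedged sketch: per-thread command recording in Vulkan. */
#include <vulkan/vulkan.h>

/* Hypothetical helper that issues vkCmdDraw* calls for a slice of the scene. */
extern void record_draws_for_range(VkCommandBuffer cmd,
                                   uint32_t first_object, uint32_t count);

/* Intended to run on its own thread - command pools must not be shared. */
VkCommandBuffer record_scene_slice(VkDevice device, uint32_t queue_family,
                                   VkRenderPass render_pass,
                                   VkFramebuffer framebuffer,
                                   uint32_t first_object, uint32_t count)
{
    VkCommandPoolCreateInfo pool_info = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_POOL_CREATE_INFO,
        .queueFamilyIndex = queue_family,
    };
    VkCommandPool pool;
    vkCreateCommandPool(device, &pool_info, NULL, &pool);

    VkCommandBufferAllocateInfo alloc_info = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_ALLOCATE_INFO,
        .commandPool = pool,
        .level = VK_COMMAND_BUFFER_LEVEL_SECONDARY,
        .commandBufferCount = 1,
    };
    VkCommandBuffer cmd;
    vkAllocateCommandBuffers(device, &alloc_info, &cmd);

    /* Secondary command buffers recorded for use inside a render pass must
     * declare that render pass and framebuffer. */
    VkCommandBufferInheritanceInfo inherit = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_INHERITANCE_INFO,
        .renderPass = render_pass,
        .framebuffer = framebuffer,
    };
    VkCommandBufferBeginInfo begin = {
        .sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_BEGIN_INFO,
        .flags = VK_COMMAND_BUFFER_USAGE_RENDER_PASS_CONTINUE_BIT,
        .pInheritanceInfo = &inherit,
    };
    vkBeginCommandBuffer(cmd, &begin);
    record_draws_for_range(cmd, first_object, count);  /* CPU-heavy, parallel */
    vkEndCommandBuffer(cmd);

    /* The main thread later gathers the slices with vkCmdExecuteCommands(). */
    return cmd;
}
```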
  • Augmented Reality and Its Aspects: a Case Study for Heating Systems
    Augmented Reality and its aspects: a case study for heating systems. Lucas Cavalcanti Viveiros. Dissertation presented to the School of Technology and Management of Bragança to obtain a Master's Degree in Information Systems, under the double diploma course with the Federal Technological University of Paraná. Supervised by Prof. Paulo Jorge Teixeira Matos and Prof. Jorge Aikes Junior. Bragança, 2018-2019.

    Dedication: I dedicate this work to my friends and my family, especially to my parents Tadeu José Viveiros and Vera Neide Cavalcanti, who have always supported me to continue my studies, despite the physical distance being a demanding factor from the beginning of the studies due to the change of state and country.

    Acknowledgment: First of all, I thank God for the opportunity, and all the teachers who helped me throughout my journey. Especially the mentors Paulo Matos and Jorge Aikes Junior, who not only provided the necessary support but also the opportunity to explore a recent area that is still under development. Moreover, the professors Paulo Leitão and Leonel Deusdado from CeDRI's laboratory for allowing me to make use of the HoloLens device from Microsoft.

    Abstract: Thanks to the advances of technology in various domains, and the mixing between real and virtual worlds.
  • Metaverse Roadmap Overview, 2007
    A Cross-Industry Public Foresight Project.

    Co-authors: John Smart (Acceleration Studies Foundation), Jamais Cascio (Open the Future), Jerry Paffendorf (The Electric Sheep Company). Contributing authors: Corey Bridges (Multiverse), Jochen Hummel (Metaversum), James Hursthouse (OGSi), Randal Moss (American Cancer Society). Lead reviewers: Edward Castronova (Indiana University), Richard Marks (Sony Computer Entertainment), Alexander Macris (Themis Group), Rueben Steiger (Millions of Us). Lead sponsor and founding partners; Futuring and Innovation Center. Graphic design: FizBit.com. accelerating.org | metaverseroadmap.org

    MVR Summit attendees: distinguished industry leaders, technologists, analysts, and creatives who provided their insights in various 3D web domains, including:
    - Bridget C. Agabra, Project Manager, Metaverse Roadmap Project
    - Janna Anderson, Director of Pew Internet's Imagining the Internet; Assistant Professor of Communications, Elon University
    - Tod Antilla, Flash Developer, American Cancer Society
    - Wagner James Au, Blogger, New World Notes; Author, The Making of Second Life, 2008
    - Jeremy Bailenson, Director, Virtual Human Interaction Lab, Stanford University
    - Betsy Book, Director of Product Management, Makena Technologies/There
    - Patrick Lincoln, Director, Computer Science Department, SRI International
    - Julian Lombardi, Architect, Open Croquet; Assistant VP for Academic Services and Technology Support, Office of Information Technology
    - Richard Marks, Creator of the EyeToy camera interface; Director of Special Projects, Sony CEA R&D
    - Bob Moore, Sociologist, Palo Alto Research Center (PARC), PlayOn project
  • OpenXR 1.0 Reference Guide
    OpenXR 1.0 Reference Guide (page 1). OpenXR™ is a cross-platform API that enables a continuum of real-and-virtual combined environments generated by computers through human-machine interaction, and is inclusive of the technologies associated with virtual reality, augmented reality, and mixed reality. It is the interface between an application and an in-process or out-of-process XR runtime that may handle frame composition, peripheral management, and more.

    Conventions: color-coded names denote function names and structure names; [n.n.n] indicates sections and text in the OpenXR 1.0 specification; £ indicates content that pertains to an extension. Specification and additional resources at khronos.org/openxr.

    OpenXR API overview: a high-level overview of a typical OpenXR application, including the order of function calls, creation of objects, session state changes, and the rendering loop.

    OpenXR action system concepts [11.1]:
    - Create actions and action spaces: xrCreateActionSet (name = "gameplay"), then xrCreateAction (actionSet = "gameplay", name = "teleport", type = XR_INPUT_ACTION_TYPE_BOOLEAN) plus a "teleport_ray" pose action.
    - Set up interaction profile bindings: xrSetInteractionProfileSuggestedBindings, e.g. for /interaction_profiles/oculus/touch_controller bind "teleport" to /user/hand/right/input/a/click and "teleport_ray" to /user/hand/right/input/aim/pose; for /interaction_profiles/htc/vive_controller bind "teleport" to /user/hand/right/input/trackpad/click.
    - Sync and get action states: xrSyncActions (session, activeActionSets = {"gameplay", ...}), then xrGetActionStateBoolean for "teleport"; if (state.currentState) the button is pressed.
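    The action-system flow outlined above can be sketched in C. This hedged example uses the entry-point and enum names from the released OpenXR 1.0 headers (xrSuggestInteractionProfileBindings, XR_ACTION_TYPE_BOOLEAN_INPUT), which differ slightly from the provisional names shown in the excerpt; error handling and xrAttachSessionActionSets are omitted.

```c
/* Hedged sketch: create a "gameplay" action set with a boolean "teleport"
 * action, suggest one Oculus Touch binding, then sync and poll its state. */
#include <openxr/openxr.h>
#include <string.h>

void setup_and_poll_teleport(XrInstance instance, XrSession session)
{
    XrActionSet gameplay;
    XrActionSetCreateInfo set_info = { .type = XR_TYPE_ACTION_SET_CREATE_INFO };
    strcpy(set_info.actionSetName, "gameplay");
    strcpy(set_info.localizedActionSetName, "Gameplay");
    xrCreateActionSet(instance, &set_info, &gameplay);

    XrAction teleport;
    XrActionCreateInfo action_info = { .type = XR_TYPE_ACTION_CREATE_INFO,
                                       .actionType = XR_ACTION_TYPE_BOOLEAN_INPUT };
    strcpy(action_info.actionName, "teleport");
    strcpy(action_info.localizedActionName, "Teleport");
    xrCreateAction(gameplay, &action_info, &teleport);

    /* Suggest a binding for the Oculus Touch interaction profile. */
    XrPath profile, a_click;
    xrStringToPath(instance, "/interaction_profiles/oculus/touch_controller", &profile);
    xrStringToPath(instance, "/user/hand/right/input/a/click", &a_click);
    XrActionSuggestedBinding binding = { .action = teleport, .binding = a_click };
    XrInteractionProfileSuggestedBinding suggested = {
        .type = XR_TYPE_INTERACTION_PROFILE_SUGGESTED_BINDING,
        .interactionProfile = profile,
        .countSuggestedBindings = 1,
        .suggestedBindings = &binding,
    };
    xrSuggestInteractionProfileBindings(instance, &suggested);

    /* Per frame: sync the active action sets, then read the boolean state.
     * (The action set must first be attached with xrAttachSessionActionSets.) */
    XrActiveActionSet active = { .actionSet = gameplay, .subactionPath = XR_NULL_PATH };
    XrActionsSyncInfo sync_info = { .type = XR_TYPE_ACTIONS_SYNC_INFO,
                                    .countActiveActionSets = 1,
                                    .activeActionSets = &active };
    xrSyncActions(session, &sync_info);

    XrActionStateGetInfo get_info = { .type = XR_TYPE_ACTION_STATE_GET_INFO,
                                      .action = teleport };
    XrActionStateBoolean state = { .type = XR_TYPE_ACTION_STATE_BOOLEAN };
    xrGetActionStateBoolean(session, &get_info, &state);
    if (state.currentState) {
        /* button is pressed: trigger the teleport */
    }
}
```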
  • Digital Reality: What's Next After Google Glass?
    Digital Reality: What's Next After Google Glass? July 15, 2021.

    In less than a decade we've moved from Google smart glasses to Oculus virtual reality headsets; what's next? In the third installment of our Convergence series, which examines five growth themes that are shaping the future of investing, Hugo Scott-Gall speaks with Global Research Analysts Bill Benton, CFA, and Drew Buckley, CFA, to discuss a vision of the internet in which the virtual and physical worlds become a seamlessly interconnected realm. Comments are edited excerpts from our podcast, which you can listen to in full below.
    https://media.blubrry.com/the_active_share/b/content.blubrry.com/the_active_share/The_Active_Share_Metaverse_Episode_Revision.mp3

    What is the metaverse? Am I in it? Are you in it?

    Bill: It's actually not a simple question. I don't know if there is one definition. I think of it as effectively doing everything you do in the physical world within a digital ecosystem. People sometimes think about it as a Roblox or Minecraft experience, where you can create things, interact, and spend money virtually. That gets close, but so does the Tencent ecosystem in China. They have a gaming platform, a chatting platform, entertainment payments, and programs that allow you to connect. Do you agree with that, Drew?

    Drew: Yes. If you're a science fiction reader, Ready Player One or Ready Player Two are great descriptions of the metaverse. Effectively, at the end point, as Bill was saying, your whole life is lived in digital form. You have a body, of course, and you sustain that life in the physical world, but everything else you do is digital.
  • Conference Booklet
    Reboot Develop Red 2019, 30th Oct - 1st Nov. Conference booklet.

    Intro: Always Outnumbered, Never Outgunned

    Warmest welcome to the first ever Reboot Develop Red conference. Welcome to breathtaking Banff National Park and welcome to the iconic Fairmont Banff Springs. It all feels a bit like history repeating to me. When we were starting our European older sister, the Reboot Develop Blue conference, everybody was full of doubts on why somebody would ever choose a beautiful yet a bit remote place to host one of the biggest worldwide gatherings of the international games industry. In the end, it turned into one of the biggest and highest-rated games industry conferences in the world. And here we are yet again at the beginning, in one of the most beautiful and serene places on Earth, at one of the most unique and luxurious venues as well, and in the company of some of the greatest minds that the games industry has to offer! And we are here to stay. Our ambition through the next few years is to turn Reboot Develop Red into not just one of the best and biggest annual games industry and game developer conferences in Canada and North America, but in the world! We are committed to staying at this beautiful venue, in this incredible nature and astonishing surroundings, for the next few forthcoming years, and to making it THE annual key gathering spot of the international games industry. We will need all of your help and support on the way!

    Thank you from the bottom of the heart for all the support shown so far, and even more for the forthcoming one!

    _Damir Durovic
  • Portable VR with Khronos OpenXR
    Portable VR with Khronos OpenXR. Kaye Mason, Nick Whiting, Rémi Arnaud, Brad Grantham. July 2017. © Copyright Khronos Group 2016.

    Outline for the session: problem and solution, panelist introductions, technical issues, audience questions, lightning round, how to join, party!

    Problem: without a cross-platform standard, VR applications, games and engines must port to each vendor's APIs. In turn, this means that each VR device can only run the apps that have been ported to its SDK. The result is high development costs and confused customers, limiting market growth.

    Solution: the cross-platform VR standard eliminates industry fragmentation by enabling applications to be written once to run on any VR system, and to access the VR devices integrated into those systems.

    Architecture: OpenXR defines two levels of API interfaces that a VR platform's runtime can use to access the OpenXR ecosystem. Apps and engines use standardized interfaces to interrogate and drive devices, and devices can self-integrate to a standardized driver interface. Standardized hardware/software interfaces reduce fragmentation while leaving implementation details open to encourage industry innovation. A sketch of the portable application entry point follows below.

    OpenXR participant companies (logos).

    Panelist introduction: Dr. Kaye Mason is a senior software engineer at Google and the API Lead for Google Daydream, as well as working on graphics and distortion. She is also the Specification Editor for the Khronos OpenXR standard. Kaye has a PhD in Computer Science, with research focused on using AIs to do non-photorealistic rendering.
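    A hedged C sketch of the "write once, run on any VR system" entry point described above, using the OpenXR 1.0 API that was eventually released (this panel predates the final specification): the same calls work against whichever conformant runtime is installed. The application name is illustrative.

```c
/* Hedged sketch: create an OpenXR instance and ask the installed runtime
 * for a head-mounted display system, without naming any vendor. */
#include <openxr/openxr.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    XrInstanceCreateInfo create_info = { .type = XR_TYPE_INSTANCE_CREATE_INFO };
    strcpy(create_info.applicationInfo.applicationName, "portable-vr-demo");
    create_info.applicationInfo.apiVersion = XR_CURRENT_API_VERSION;

    XrInstance instance;
    if (xrCreateInstance(&create_info, &instance) != XR_SUCCESS) {
        fprintf(stderr, "No OpenXR runtime available\n");
        return 1;
    }

    /* Report which vendor's runtime the loader found. */
    XrInstanceProperties props = { .type = XR_TYPE_INSTANCE_PROPERTIES };
    xrGetInstanceProperties(instance, &props);
    printf("Runtime: %s\n", props.runtimeName);

    /* Ask for an HMD system; the same code runs on any conformant platform. */
    XrSystemGetInfo system_info = { .type = XR_TYPE_SYSTEM_GET_INFO,
                                    .formFactor = XR_FORM_FACTOR_HEAD_MOUNTED_DISPLAY };
    XrSystemId system_id;
    xrGetSystem(instance, &system_info, &system_id);

    xrDestroyInstance(instance);
    return 0;
}
```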