Presenting 2D Web Content in XR

TPAC 2018

Overview
- Background: 2D web content in a 3D XR rendering context
- Potential implementation paths forward
- Open questions, concerns

Background - 2D Content in XR
- Overlays for handheld AR (DOM to Screen)
- Formatted text, control panels, 2D video players (DOM to World)

Background - Compositors, Quad Layers
New platforms and hardware bring new features to explore for rendering 2D content securely and performantly.
Compositors
- Definition, purpose
- Predictive rendering
Quad layers
- Definition, purpose
- How they work and what that means for rendering behavior, e.g. occlusion, depth sensitivity, transparency
- How they might help solve existing problems around exposing DOM content to an XR context

User Agents and Compositor Layers: CSS3D
Rendering context and transform-style: preserve-3d;
- Elements can request to be in the same 3D context as their parent and sibling elements.
- Issues around ordering and the depth buffer are effectively avoided by returning 3D objects to a 2D rendering context when certain CSS attributes are defined on elements in the rendering context.
Instantiate a new rendering context for an element with the will-change: transform; property
- Moves the element to its own compositor layer under the hood for transform and animation optimization, etc.
- Not available in all browsers; limited to the maximum number of layers the underlying platform can support.

Background - DOM to Texture
History of DOM to texture
- Not a new idea.
- First proposed as a way to render 2D web views in WebGL.
- No way to implement it while safely mitigating security concerns around access to cross-origin content pixel data and input injection.

Options we investigated
1. Do nothing
2. Solve for immersive mode on handheld devices only
3. Option 2 + a UA-managed workaround for immersive mode on headset devices
4. Option 2 + quad layers
5. Revisit the feasibility of DOM to texture

Option 2 - DOM to Screen
- Layout and rendering are required, but should elements be required to be in the DOM tree hierarchy?
- Prior art: the Fullscreen API
- Could it be implemented as a quad layer?

Option 4 - DOM to World
DOM quad layer
- Input: static or interactive?
- Support cross-origin content?
- Layout and rendering are required, but should elements be in the DOM tree?
- Differences in underlying platform support for quad layers?
  - Composition order
  - Occlusion
  - Input injection
- How would we extend WebXR to support multiple layers?

Inputs
In order to interact with DOM content, the UA will need to:
- Continue sending pose data as normal through the WebXR APIs
- Compute a raycast intersection for each input source against the quad layer and generate existing DOM events at the corresponding 2D window coordinates (see the sketch below)
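The last bullet hides most of the work. Below is a minimal sketch of the hit-test math, assuming the pointer ray has already been transformed into the quad layer's local space and that the quad is centered at the origin in the z = 0 plane; quadHitToWindowCoordinates and all of its parameter names are illustrative, not part of any shipped WebXR API.

```js
// Minimal sketch of the raycast-to-DOM-event mapping described above,
// assuming the pointer ray is already expressed in the quad layer's local
// space, where the quad is centered at the origin, lies in the z = 0 plane,
// and measures quadWidth x quadHeight in meters.
// All names here are illustrative; none of this is shipped API surface.
function quadHitToWindowCoordinates(rayOrigin, rayDirection,
                                    quadWidth, quadHeight,
                                    pixelWidth, pixelHeight) {
  // Intersect the ray with the quad's plane (z = 0).
  if (Math.abs(rayDirection.z) < 1e-6) return null;  // ray parallel to quad
  const t = -rayOrigin.z / rayDirection.z;
  if (t < 0) return null;                            // quad is behind the ray

  const hitX = rayOrigin.x + t * rayDirection.x;
  const hitY = rayOrigin.y + t * rayDirection.y;

  // Reject hits outside the quad's extents.
  if (Math.abs(hitX) > quadWidth / 2 || Math.abs(hitY) > quadHeight / 2) {
    return null;
  }

  // Map local meters to 2D window (CSS pixel) coordinates, with the origin
  // at the top-left corner as DOM events expect.
  return {
    x: (hitX / quadWidth + 0.5) * pixelWidth,
    y: (0.5 - hitY / quadHeight) * pixelHeight,
  };
}

// The UA could then synthesize ordinary DOM events at that point, e.g.:
//   const hit = quadHitToWindowCoordinates(origin, dir, 1.0, 0.75, 1024, 768);
//   if (hit) element.dispatchEvent(new PointerEvent('pointermove',
//       { clientX: hit.x, clientY: hit.y, bubbles: true }));
```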
Cross-Origin DOM Content
Visual representation
- No new security concerns with a quad layer implementation; the parent document does not get access to the nested document's pixel data.
Inputs
- Security concerns around delivering pose data to both document contexts.
- Pose data must continue to go to the parent document for basic scene rendering.
- Delivering input injection results in two browsing contexts getting the same input data.

Differences in Underlying Platform Support - Platform Capabilities

Platform                                      | Quad Layer  | Occlusion                   | Depth Buffer
ARKit                                         | N           | N                           | N
ARCore                                        | N           | N                           | N
Google VR SDK                                 | N*          | N                           | N
WMR Hololens (MaxQuadLayers set to 0)         | N           | N                           | N
Oculus (ovrLayerQuad)                         | Y, multiple | Y                           | Y
OpenVR (IVRCompositor::Submit)                | Y, limited  | N*                          | N
OpenXR (XrCompositorLayerQuad)                | Y, multiple | Y*                          | Y*
Magic Leap                                    | N*          | N                           | N
WMR Immersive Headsets (HolographicQuadLayer) | Y, multiple | Only with other quad layers | N*

Next Steps
- Rationalize a common abstraction for a quad layer API across the different platforms
- What changes are necessary for the WebXR API to support multiple layers? (A hypothetical sketch follows below.)
- If we add support for a new XR layer type, is this the only time it will be necessary to do so, or are there other features that will need their own layer abstraction?
- Is a quad layer abstraction sufficient for both DOM to Screen and DOM to World?
- Identify collaborators for design, action items
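To make the multiple-layers question concrete, here is a purely hypothetical sketch of how a page might hand the UA a DOM-backed quad layer alongside its normal projection layer. XRQuadLayer, its constructor options (element, interactive, width, height), and the layers render state shown here are assumptions for illustration only; XRWebGLLayer, XRRigidTransform, requestSession, and requestReferenceSpace are existing WebXR surface.

```js
// Purely hypothetical sketch of one way WebXR could be extended to accept
// multiple layers, written only to make the "Next Steps" questions concrete.
// XRQuadLayer and the `layers` render state are assumptions, not shipped or
// specified API at the time of this presentation.
const canvas = document.createElement('canvas');
const gl = canvas.getContext('webgl', { xrCompatible: true });
const session = await navigator.xr.requestSession('immersive-vr');
const refSpace = await session.requestReferenceSpace('local');

// Existing projection layer carrying the app's own WebGL scene.
const projectionLayer = new XRWebGLLayer(session, gl);

// Hypothetical quad layer hosting 2D DOM content ("DOM to World"). The UA,
// not the page, would rasterize and composite the element, so cross-origin
// pixel data is never exposed to the parent document.
const domLayer = new XRQuadLayer(session, {
  element: document.querySelector('#control-panel'),        // 2D content to show
  space: refSpace,                                           // where the quad is anchored
  transform: new XRRigidTransform({ x: 0, y: 1.5, z: -2 }),  // pose within that space
  width: 1.0,        // meters
  height: 0.75,      // meters
  interactive: true, // UA routes raycast hits back to the element as DOM events
});

// Layers would be handed to the XR compositor and composited back to front.
session.updateRenderState({ layers: [projectionLayer, domLayer] });
```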
