Using Google Cardboard to Perform a Visual Field Screening Test - Dhanraj Selvaraj, Iowa State University

Total Pages: 16

File Type: PDF, Size: 1020 KB

Using Google Cardboard to perform a visual field screening test
Dhanraj Selvaraj, Iowa State University
Iowa State University Capstones, Theses and Dissertations: Graduate Theses and Dissertations, 2018
Follow this and additional works at: https://lib.dr.iastate.edu/etd
Part of the Engineering Commons

Recommended Citation
Selvaraj, Dhanraj, "Using Google Cardboard to perform a visual field screening test" (2018). Graduate Theses and Dissertations. 16461. https://lib.dr.iastate.edu/etd/16461

This Thesis is brought to you for free and open access by the Iowa State University Capstones, Theses and Dissertations at Iowa State University Digital Repository. It has been accepted for inclusion in Graduate Theses and Dissertations by an authorized administrator of Iowa State University Digital Repository. For more information, please contact [email protected].

Using Google Cardboard to perform a visual field screening test

by Dhanraj Selvaraj

A thesis submitted to the graduate faculty in partial fulfillment of the requirements for the degree of MASTER OF SCIENCE

Major: Mechanical Engineering

Program of Study Committee:
Greg Luecke, Major Professor
Judy Vance
Stephen Gilbert

The student author, whose presentation of the scholarship herein was approved by the program of study committee, is solely responsible for the content of this thesis. The Graduate College will ensure this thesis is globally accessible and will not permit alterations after a degree is conferred.

Iowa State University
Ames, Iowa
2018

Copyright © Dhanraj Selvaraj, 2018. All rights reserved.

DEDICATION
I dedicate this thesis to my parents, Dr. Eunice Selvaraj and Dr. Selvaraj Rajendran, and my sister, Suganthi Selvaraj, for their constant support, encouragement, devotion to my dreams and ambition to become an engineer.

TABLE OF CONTENTS
LIST OF FIGURES
LIST OF TABLES
NOMENCLATURE
ACKNOWLEDGMENTS
ABSTRACT
CHAPTER 1. INTRODUCTION
  Virtual Reality Applications
  Virtual Reality Headsets
    Oculus Rift & HTC Vive
    Google Cardboard
  Visual Field
  Glaucoma
  Humphrey Field Analyzer
  Visual Field Easy
CHAPTER 2. TECHNICAL DEVELOPMENT
  Visual Field Test
  Google Cardboard
    Field Of View (FOV)
    Unity3D
CHAPTER 3. EXPERIMENTAL SETUP AND RESULTS
  Experimental Setup
    Display Device
    Google Cardboard Viewer
    Setting Up Unity3D
    Developing The Application
  Results
CHAPTER 4. DISCUSSION AND CONCLUSION
  Discussion
  Conclusion
REFERENCES
APPENDIX A. CODE FOR APPLICATION
APPENDIX B. EXCEL DATA RESULTS
  Data results for 3 x 3 light pattern
  Data results for 7 x 7 light pattern

LIST OF FIGURES
Figure 1-1 Oculus Rift [3]
Figure 1-2 HTC Vive [4]
Figure 1-3 Google Cardboard [6]
Figure 1-4 Visual Field of Right Eye [8]
Figure 1-5 Humphrey Field Analyzer
Figure 2-1 Visual Field Test using the Humphrey Field Analyzer (HFA) [16]
Figure 2-2 Diagram of visual field of the right eye. The three regions of the field are the central field (0 to 30 degrees), the peripheral field (30 to 60 degrees), and the temporal crescent. [18]
Figure 2-3 30-2 pattern: The distribution of the target light spots are at equal distances from the horizontal axis at 3 degrees. [8]
Figure 2-4 HFA Visual Field Test Results (Left Eye)
Figure 2-5 HFA Visual Field Test Results (Right Eye)
Figure 2-6 Google Cardboard
Figure 3-1 Nexus 6P
Figure 3-2 Google Cardboard Viewer Parts [25]
Figure 3-3 Track Rotation and Track Position disabled in the Inspector (Screenshot)
Figure 3-4 Field Of View set to 30 degrees (Screenshot)
Figure 3-5 Initialized 7 x 7 Light Pattern
Figure 3-6 Light Pattern (3 x 3)
Figure 3-7 Right Eye Light Pattern Results (3 x 3)
Figure 3-8 Left Eye Light Pattern Results (3 x 3)
Figure 3-9 Left Eye Light Pattern Results (7 x 7)
Figure 3-10 Right Eye Light Pattern Results (7 x 7)
Figure 4-1 Right Eye Visual Field Test Results using Google Cardboard (Left Image) and Right Eye Humphrey Field Analyzer Results (Right Image)

LIST OF TABLES
Table 1 Nexus 6P Specifications [24]
Table 2 Google Cardboard Specifications [25]
Table 3 Right Eye Results (3 x 3 Pattern)
Table 4 Left Eye Results (3 x 3 Pattern)

NOMENCLATURE
VR   Virtual Reality
HFA  Humphrey Field Analyzer
FOV  Field Of View

ACKNOWLEDGMENTS
I would like to thank my Father, God Almighty for blessing me with this opportunity, and helping me with the process, by giving me strength, perseverance, wisdom and everything I needed to accomplish to be where I am today. To Dr. Greg Luecke, my major professor, I am ever grateful to him for
Recommended publications
  • Top 20 Influencers
    Top 20 AR/VR Influencers: What Fits You Best?
    • Sanem Avcil (@Sanemavcil): Founder of Coolo Games, Politehelp & Imprezscion Yazilim Ve Elektronik.
    • Palmer Luckey (@PalmerLuckey): Founder of Oculus Rift, and a well known voice in VR.
    • Chris Milk (@milk): Maker of stuff, Co-Founder/CEO of Within. Focusing on innovative human experiences in VR.
    • Alex Kipman (@akipman): Key player in the launch of Microsoft Hololens. Creator of the Microsoft Kinect motion controller.
    • Philip Rosedale (@philiprosedale): Founder of the 2000s MMO experience, Second Life.
    • Tony Parisi (@auradeluxe): Head of AR and VR Strategy at Unity; began his VR career co-founding VRML in 1994 with Mark Pesce.
    • Kent Bye (@kentbye): Host of the leading VR podcast Voices of VR & Esoteric Voices.
    • Clay Bavor (@claybavor): Vice President of Virtual Reality at Google.
    • Rob Crasco (@RoblemVR): VR Consultant at VR/AR Consulting. Writes monthly articles on VR for Bright Metallic magazine.
    • Benjamin Lang (@benz145): Co-founder & Executive Editor of roadtovr.com, one of the leading VR news sites in the world.
    • Vanessa Radd (@vanradd): Founder, XR Researcher; President, VRAR Association. VR/AR/Experiential Technology.
    • Chris Madsen (@deep_rifter): Director at Morph3D, Ambassador at Edge of Discovery.
    • Helen Papagiannis (@ARstories): PhD; Augmented Reality Specialist. Author of Augmented Human.
    • Cathy Hackl (@CathyHackl): Founder, Latinos in VR/AR. Marketing Co-Chair at VR/AR Association; VR/AR Speaker.
    • Brad Waid (@Techbradwaid): Global Speaker, Futurist, Educator, Entrepreneur.
    • Ambarish Mitra (@rishmitra): Founder & CEO at Blippar, Young Global Leader at WEF, Investor in Augmented Reality, AI & Genomics.
    • Tom Emrich (@tomemrich): VC at Super Ventures; Founder, We Are Wearables; Augmented Reality Futurist.
    • Gaia Dempsey (@fianxu): Co-founder at DAQRI.
  • Getting Real with the Library
    Getting Real with the Library
    Samuel Putnam, Sara Gonzalez, Marston Science Library, University of Florida

    Outline: What is Augmented Reality (AR) & Virtual Reality (VR)? What can you do with AR/VR? How to Create AR/VR. AR/VR in the Library. Find Resources.

    What is Augmented and Virtual Reality? Paul Milgram, Haruo Takemura, Akira Utsumi, Fumio Kishino; Augmented reality: a class of displays on the reality-virtuality continuum. Proc. SPIE 2351, Telemanipulator and Telepresence Technologies, 282 (December 21, 1995).

    What is Virtual Reality? A computer-generated simulation of a lifelike environment that can be interacted with in a seemingly real or physical way by a person, esp. by means of responsive hardware such as a visor with screen or gloves with sensors. "virtual reality, n". OED Online 2017. Web. 16 May 2017. Head mounted display, U.S. Patent Number 8,605,008.

    VR in the 90s (slide images by Dr. Jonathan D. Waldern / Virtuality Group, Wikimedia Commons, curid 32899409, 32525338, 32525505).

    VR with a Phone: 1. Google Daydream View 2. Google Cardboard 3. Samsung Gear VR

    Oculus Rift ● Popular VR system: headset, hand controllers, headset tracker ($598) ● Headset has speakers -> immersive environment ● Requires a powerful PC for full VR

    OSVR Headset ● Open Source ● "Plug in, Play Everything" ● Discounts for Developers and Academics ● Requires a powerful PC for full VR

    Augmented Reality: The use of technology which allows the perception of the physical world to be enhanced or modified by computer-generated stimuli perceived with the aid of special equipment.
  • 13 Cool Things You Can Do with Google Chromecast
    13 Cool Things You Can Do With Google Chromecast
    We bet you don't even know half of these.

    Google Chromecast is a popular streaming dongle that makes for an easy and affordable way of throwing content from your smartphone, tablet, or computer to your television wirelessly. There's so much more you can do with it than just streaming Netflix, Hulu, Spotify, HBO and more from your mobile device and computer to your TV. Our guide on How Does Google Chromecast Work explains more about what the device can do. The seemingly simple, ultraportable plug-and-play device has a few tricks up its sleeve that aren't immediately apparent. Here's a roundup of some of the hidden Chromecast tips and tricks you may not know that can make casting more magical.

    Chromecast Tips and Tricks You Didn't Know:
    1. Enable Guest Mode
    2. Make presentations
    3. Play plenty of games
    4. Cast videos using your voice
    5. Stream live feeds from security cameras on your TV
    6. Watch Amazon Prime Video on your TV
    7. Create a casting queue
    8. Cast Plex
    9. Plug in your headphones
    10. Share VR headset view with others
    11. Cast on the go
    12. Power on your TV
    13. Get free movies and other perks

    Enable Guest Mode: If you have guests over at your home, whether you're hosting a family reunion or having a party, you can let them cast their favorite music or TV shows onto your TV without giving out your WiFi password. To do this, go to the Chromecast settings and enable Guest Mode.
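The article stays at the level of on-screen settings, but the same "throw content to the TV" flow can also be scripted with Google's Cast Web Sender framework. The sketch below is not from the article and rests on a few assumptions: the page has already loaded the Cast sender library (cast_sender.js?loadCastFramework=1), and the media URL is a placeholder.

```typescript
// Minimal Cast Web Sender sketch: connect to a Chromecast session and load a video.
// Assumes https://www.gstatic.com/cv/js/sender/v1/cast_sender.js?loadCastFramework=1
// was loaded by the page; the media URL below is a placeholder.
declare const cast: any;    // cast.framework namespace, provided by the sender library
declare const chrome: any;  // chrome.cast namespace, also provided by the library

(window as any).__onGCastApiAvailable = (isAvailable: boolean) => {
  if (!isAvailable) return;
  // Use the default media receiver and auto-join sessions started from this origin.
  cast.framework.CastContext.getInstance().setOptions({
    receiverApplicationId: chrome.cast.media.DEFAULT_MEDIA_RECEIVER_APP_ID,
    autoJoinPolicy: chrome.cast.AutoJoinPolicy.ORIGIN_SCOPED,
  });
};

// Call this from a button click after the user has picked a Cast device.
async function castVideo(mediaUrl: string): Promise<void> {
  const session = cast.framework.CastContext.getInstance().getCurrentSession();
  if (!session) return;  // no Chromecast connected yet
  const mediaInfo = new chrome.cast.media.MediaInfo(mediaUrl, 'video/mp4');
  await session.loadMedia(new chrome.cast.media.LoadRequest(mediaInfo));
}

castVideo('https://example.com/video.mp4');  // placeholder media URL
```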
  • Utilization of Immersive 360 Degree Spherical Videos and Google Cardboard in Medical Training and Simulation: a Novel and Multi-Dimensional Way of Learning
    Utilization of Immersive 360 Degree Spherical Videos and Google Cardboard in Medical Training and Simulation: A Novel and Multi-dimensional Way of Learning
    Shoeb Mohiuddin, MD CA-2; Daniel Roshan, MD CA-2; Heike Knorpp, MD. University of Illinois at Chicago.

    Financial Disclosure: We have no relevant financial or nonfinancial relationships within the products or services described, reviewed, evaluated or compared in this presentation.

    To view the video, search on the YouTube App for "STA 2016 Abstract Demo" or go to https://youtu.be/yr5EDF_taa8

    Outline ● Generation X & Learning ● Experiential Learning ● 360 Degree Videos ● Google Cardboard Viewers ● Video Demonstration ● Pros & Cons ● Questions

    Generation X, Technology, and Learning: The majority of anesthesia residents are Millennial students with divergent learning needs from their predecessors, who desire interactive learning through experiential and immersive learning. - Psychosocial influences - Physiological changes of the brain that change the way information is processed. Generational differences in learning create challenges for educators to teach trainees that speak a "different language." Be aware of cognitive biases. Keil Centre. http://www.keilcentre.co.uk/news/cognitive-biases/. Accessed: 1/3/2016. (Chu, 2012)

    Educational Preferences of Millennials ● Learning and working in teams ● Structure with achievement-oriented goals ● Engagement and experience ● Visual and kinesthetic educational modalities and environments ● Learning about things that they feel matter to them (Chu, 2012)

    Experiential Learning: Experiential learning is referred to as learning through action, learning by doing, learning through experience, and learning through discovery and exploration.
  • Design Specification
    SpaceTalk Design Specification
    Team: Brian Orlando, Eugene Meng, Edward Roberts, Terrie Chan. Sponsors: NASA Jet Propulsion Lab, MHCI+D.

    Introduction: National Aeronautics and Space Administration (NASA) spacecraft missions require scientists and engineers across the country to share complex mission data with each other. Yet, data such as scientific instrument schedules, spacecraft orientation, and orbital trajectory are difficult to convey with static presentations, texts and charts. Remotely located scientists and engineers need a quick and easy means of discussing complex mission-centric data in order to make decisions, or to gain a holistic understanding of the mission.

    To solve this issue, we've designed SpaceTalk: a collaborative mobile virtual reality (VR) application that creates an interactive and immersive simulation of NASA spacecraft missions. Those with the SpaceTalk app installed on their mobile device can request a VR talk with other users and discuss complex mission information in context. We believe that the accessibility and immersion of mobile VR will enable scientists and engineers to collaborate with each other more efficiently and effectively than ever before, saving NASA precious time and resources in missions.

    This Design Specification is meant to describe the SpaceTalk application in full to create as little friction as possible during the development process. It is composed of an architecture, system flow, interaction model and visual system to create a holistic definition of the application.
  • Introduction to Immersive Realities for Educators
    Introduction to Immersive Realities for Educators

    Contents: 1. Introduction 2. Resources & Examples 3. Recommendations 4. Getting Started 5. VR Applications 6. Recording in VR 7. Using Wander 8. VR Champions 9. FAQs 10. References

    Introduction: How VR in Education Will Change How We Learn and Teach. The ever-evolving nature of technology continues to influence teaching and learning. One area where advances have impacted educational settings is immersive technology. Virtual reality (VR) immersive technologies "support the creation of synthetic, highly interactive three dimensional (3D) spatial environments that represent real or non-real situations" (Mikropoulos and Natsis, 2010, p. 769). The usage of virtual reality can be traced to the 1960s, when cinematographer and inventor Morton Heilig developed the Sensorama, a machine in which individuals watched a film while experiencing a variety of multi-sensory effects, such as wind and various smells related to the scenery. In the 1980s VR moved into professional education and training. The integration of VR in higher education became apparent in the 1990s, and continues to be explored within colleges and universities in the 21st century.

    What does it all mean? VR, AR, MR and What Does Immersion Actually Mean? Terms such as "Virtual Reality" (VR), "Augmented Reality" (AR), "Mixed Reality" (MR), and "Immersive Content" are becoming increasingly common in education and are more user-friendly and affordable than ever. Like any other technology, IR is used to introduce, support, or reinforce course learning objectives, not unlike a text, film, or field trip to a museum. The major difference is that learning can be much more immersive, interactive and engaging.
  • Using Virtual Reality to Engage and Instruct: A Novel Tool for Outreach and Extension
    Using Virtual Reality to Engage and Instruct: A Novel Tool for Outreach and Extension
    Age Group: All Ages!
    Dr. Geoffrey Attardo, Assistant Professor, Email: [email protected], Room 37 Briggs Hall, Department of Entomology and Nematology, University of California, Davis, Davis, California 95616

    Recent developments in computer and display technologies are providing novel ways to interact with information. One of these innovations is the development of Virtual Reality (VR) hardware. Innovations in hardware and software have made this technology broadly accessible, with options ranging from cell phone based VR kits made of cardboard to dedicated headsets driven by computers using powerful graphical hardware. VR based educational experiences provide opportunities to present content in a form where it is experienced in three dimensions and is interactive. This is accomplished by placing users in virtual spaces with content of interest, and allows for natural interactions where users can physically move within the space and use their hands to directly manipulate/experience content. VR also reduces the impact of external sensory distractions by completely immersing the user in the experience. These interactions are particularly compelling when content that is only observable through a microscope (or not at all) can be made large, allowing the user to experience these things at scale. This has great potential for entomological education and outreach, as students can experience animated models of insects and arthropods at impossible scales. VR has great potential as a new way to present entomological content, including aspects of morphology, physiology, behavior and other aspects of insect biology. This demonstration allows users of all ages to view static and animated 3D models of insects and arthropods in virtual reality.
  • Activate Technology & Media Outlook 2021
    Activate Technology & Media Outlook 2021 (www.activate.com)

    Activate growth. Own the future. Technology. Internet. Media. Entertainment. These are the industries we've shaped, but the future is where we live. Activate Consulting helps technology and media companies drive revenue growth, identify new strategic opportunities, and position their businesses for the future. As the leading management consulting firm for these industries, we know what success looks like because we've helped our clients achieve it in the key areas that will impact their top and bottom lines: • Strategy • Go-to-market • Digital strategy • Marketing optimization • Strategic due diligence • Salesforce activation • M&A-led growth • Pricing. Together, we can help you grow faster than the market and smarter than the competition. Get in touch: www.activate.com; Michael J. Wolf, Seref Turkmenoglu, New York, [email protected], [email protected], 212 316 4444.

    12 Takeaways from the Activate Technology & Media Outlook 2021:
    • Time and Attention: The entire growth curve for consumer time spent with technology and media has shifted upwards and will be sustained at a higher level than ever before, opening up new opportunities.
    • Video Games: Gaming is the new technology paradigm as most digital activities (e.g. search, social, shopping, live events) will increasingly take place inside of gaming. All of the major technology platforms will expand their presence in the gaming stack, leading to a new wave of mergers and technology investments.
    • AR/VR: Augmented reality and virtual reality are on the verge of widespread adoption as headset sales take off and use cases expand beyond gaming into other consumer digital activities and enterprise functionality.
  • Moving Web 3D Content into GearVR
    Moving Web 3D Content into GearVR
    Mitch Williams, Samsung / 3d-online, GearVR Software Engineer. August 1, 2017, Web 3D BOF, SIGGRAPH 2017, Los Angeles.

    Samsung GearVR s/w development goals:
    • Build GearVRf (framework): a GearVR Java API to build apps (also works with JavaScript)
    • Play nice with other devices: wands, controllers, I/O devices; GearVR apps run on Google Daydream!
    • Performance, new features: fight for every millisecond (Android OS, Oculus, GPU, CPU, Vulkan; Unity, Unreal, GearVRf)
    • Enable content creation: game developers, 3D artists, UI/UX people, web designers

    Content creation for GearVR:
    • 360 movies
    • Game editors: Unity, Unreal
    • GearVRf (framework): open source Java API, JavaScript bindings
    • WebVR: WebGL; frameworks: A-Frame, Three.js, React, X3DOM
    • 3D file formats: Collada, .FBX, glTF, .OBJ & .mtl, etc. using Jassimp, a Java binding for Assimp (Asset Import Library)
    • X3D

    Why implement X3D in GearVR:
    • Samsung began this effort in February 2016
    • X3D is a widely supported file format: exported by 3DS Max, Blender, Maya, Moto, or export VRML and convert to X3D
    • No other file format had similar capabilities: interactivity via JavaScript; a declarative format that is easy to edit / visualize the scene
    • GearVR is not just a VR game console like Sony PSVR; it is a phone, web access device, camera, and apps platform
    • X3D enables web applications: it complements the game influence in GearVR from Unity and Unreal, and enables new VR web apps including the Google Maps, Facebook, and Yelp JavaScript APIs

    GearVR app build, runtime, live demo ("very daring for a presenter who possesses no art skills"):
    • 3ds Max: 1. Animated textured objects 2. Export VRML97, convert with "Instant Reality" to X3D
    • Android Studio: 1.
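The slides above name the browser-side stack (WebGL with A-Frame or Three.js) and list glTF among the supported formats. As a rough illustration only, and not code from the talk, here is a minimal Three.js scene in TypeScript that loads a glTF asset and adds an "Enter VR" button. It uses today's WebXR path rather than the 2017-era WebVR API, the module paths assume a standard npm install of three, and the model path is a placeholder.

```typescript
// Hypothetical, minimal Three.js scene: load a glTF model and offer an "Enter VR" button.
import * as THREE from 'three';
import { GLTFLoader } from 'three/examples/jsm/loaders/GLTFLoader.js';
import { VRButton } from 'three/examples/jsm/webxr/VRButton.js';

const scene = new THREE.Scene();
const camera = new THREE.PerspectiveCamera(70, window.innerWidth / window.innerHeight, 0.1, 100);
const renderer = new THREE.WebGLRenderer({ antialias: true });
renderer.setSize(window.innerWidth, window.innerHeight);
renderer.xr.enabled = true;                                   // turn on WebXR rendering
document.body.appendChild(renderer.domElement);
document.body.appendChild(VRButton.createButton(renderer));   // adds the "Enter VR" button

scene.add(new THREE.HemisphereLight(0xffffff, 0x444444, 1));  // simple sky/ground light

new GLTFLoader().load('models/example.glb', (gltf) => {       // placeholder asset path
  gltf.scene.position.set(0, 1.2, -2);                        // place the model in front of the viewer
  scene.add(gltf.scene);
});

renderer.setAnimationLoop(() => renderer.render(scene, camera)); // XR-safe render loop
```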
  • Mobile AR/VR with Edge-Based Deep Learning
    Mobile AR/VR with Edge-based Deep Learning
    Jiasi Chen, Department of Computer Science & Engineering, University of California, Riverside. CNSM, Oct. 23, 2019.

    Outline: What is AR/VR? Edge computing can provide (1) real-time object detection for mobile AR and (2) bandwidth-efficient VR streaming with deep learning. Future directions.

    Multimedia spans audio, on-demand video, and live video delivered to end users over the Internet (content creation, compression, storage, distribution), and now virtual and augmented reality. AR/VR sits on the reality-virtuality continuum: reality, augmented reality, augmented virtuality, virtual reality, with mixed reality spanning the middle.

    Who's using virtual reality? Smartphone-based hardware: Google Cardboard, Google Daydream. High-end hardware: Playstation VR, HTC Vive.

    Why VR now? Portability: (1) have to go somewhere, (2) watch it at home, (3) carry it with you. Movies followed this progression, and VR shows a similar portability trend (CAVE, 1992; Virtuality gaming, 1990s; Oculus Rift, 2016), driven by hardware advances from the smartphone revolution.

    Who's using augmented reality? Smartphone-based: Pokemon Go, Google Translate (text processing), Snapchat filters (face detection). High-end hardware: Google Glasses, Microsoft Hololens.

    Is it all just fun and games? AR/VR has applications in many areas: data visualization, education, public safety. What are the engineering challenges? AR: process input from the real world (related to computer vision, robotics). VR: output the virtual world to your display (related to computer graphics).

    How AR/VR works. VR: virtual world generation, render, display. AR: device tracking, real object detection, render, display.

    What systems functionality is currently available in AR/VR? Systems support for VR: game engines (Unity, Unreal).
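The talk's first theme is offloading object detection from the phone to an edge server so AR annotations stay real-time. A hedged sketch of that offload loop, using only standard browser APIs, follows; the /detect endpoint and its JSON response format are hypothetical stand-ins for whatever the edge server actually exposes, not the authors' system.

```typescript
// Sketch of the edge-offload pattern: grab a camera frame, ship a compressed copy
// to an edge server, and draw the returned bounding boxes as an AR overlay.
// The endpoint and response schema below are assumptions for illustration.

interface Detection { label: string; x: number; y: number; w: number; h: number; score: number; }

async function detectOnEdge(video: HTMLVideoElement): Promise<Detection[]> {
  // Copy the current camera frame into an offscreen canvas.
  const canvas = document.createElement('canvas');
  canvas.width = video.videoWidth;
  canvas.height = video.videoHeight;
  canvas.getContext('2d')!.drawImage(video, 0, 0);

  // Compress to JPEG to keep the uplink small, then send it to the edge server.
  const frame: Blob = await new Promise((resolve) =>
    canvas.toBlob((b) => resolve(b!), 'image/jpeg', 0.7));
  const response = await fetch('/detect', { method: 'POST', body: frame }); // hypothetical endpoint
  return (await response.json()) as Detection[];                            // hypothetical schema
}

// Usage: run continuously and draw the latest boxes over the camera view.
async function annotateLoop(video: HTMLVideoElement, overlay: CanvasRenderingContext2D) {
  const boxes = await detectOnEdge(video);
  overlay.clearRect(0, 0, overlay.canvas.width, overlay.canvas.height);
  for (const b of boxes) {
    overlay.strokeRect(b.x, b.y, b.w, b.h);
    overlay.fillText(`${b.label} ${(b.score * 100).toFixed(0)}%`, b.x, b.y - 4);
  }
  requestAnimationFrame(() => annotateLoop(video, overlay));
}
```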
  • Open the Door to the Museum with Virtual Reality
    Open the Door to the Museum with Virtual Reality: Using Google Virtual Reality Applications to Bring Immersive 3-D Images into the AP Art History Classroom
    Cathy Payne. A Capstone in the Field of Museum Studies for the degree of Master of Liberal Arts in Extension Studies, Harvard University Extension School, March 2018.

    Author's Statement: The idea for this capstone project grew organically in my AP Art History classroom when I ordered a Google Cardboard viewer after reading about the technology in the New York Times in November 2015. Much like the story recounted by a journalist in this paper, as soon as I experienced Cardboard I was compelled to share it with young people: in this case, my students. Although Street View had very few images useful to my classroom at the time, over the last two years the image database has grown exponentially, making it a valuable tool for teaching art history. It cannot be overstated how important my students have been to this project, as I have watched them navigate Cardboard in the classroom intuitively, as digital natives. They have shown me which teaching methods work best with VR, and they have made it abundantly clear to me that sometimes the best lessons are learned while wondering/wandering off-script. The joy of teaching and learning has been rediscovered in my AP Art History classroom with Google Cardboard and Street View, and my students have taken ownership and pride over their learning. I hope that the successes of this capstone project can be shared widely by teachers, students, and museums.
  • How to Navigate a 360° Video
    How To Navigate a 360° Video

    There are four common ways to watch a 360° YouTube video:
    • With a browser on your computer
    • In "magic window" mode on your phone or tablet
    • With a phone-based, Google Cardboard-compatible virtual reality (VR) headset
    • With a standalone VR headset
    Note that not all browsers, phones, tablets, and VR headsets support watching 360° YouTube videos.

    Browser: When you play a YouTube 360° video, it will initially look like any other video. The key difference is that you can use your mouse (or finger, if you're using a touch screen) to look left, right, up, and down. These instructions should work for most browsers on most computers. A 360° video often defaults to low resolution, so the first thing you need to do is to increase the resolution. Here are the steps:
    • Open your internet browser (e.g. Google Chrome, Safari)
    • Visit our tutorial video at bit.ly/2YGIvth
    • Click the gear icon in the lower-right corner of the video player to access the Settings options.
    • Click on "Quality," and select "2160s 4K."
    • Start the video. If it does not play correctly, then repeat these steps and select a lower resolution. Use the highest resolution that your network speed will support.
    • If you click and release in the video without moving the mouse, the video will pause. To resume, click the play symbol in the lower left section or simply click again in the main screen of the video.
    If you have a mouse, you can navigate in the 360° video by left clicking and
    • Install the free YouTube App on your device using the App Store.
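The handout covers navigating the stock player by hand; if you instead need to embed and script a 360° video on your own page, the YouTube IFrame Player API can be used. The snippet below is a hedged sketch and not part of the handout: VIDEO_ID is a placeholder (the handout's bit.ly link is left unexpanded), and once the player is running, dragging inside it pans the 360° view just as described above.

```typescript
// Hypothetical embed of a 360-degree YouTube video via the IFrame Player API.
// Assumes the page contains <div id="player"></div>; VIDEO_ID is a placeholder.
declare const YT: any; // global provided by the IFrame API script loaded below

const tag = document.createElement('script');
tag.src = 'https://www.youtube.com/iframe_api';
document.head.appendChild(tag);

// The IFrame API calls this global function once it has finished loading.
(window as any).onYouTubeIframeAPIReady = () => {
  new YT.Player('player', {
    width: 1280,
    height: 720,
    videoId: 'VIDEO_ID',                       // placeholder: any 360-degree video ID
    events: {
      // Autoplay once ready; click-dragging inside the player pans the 360° view.
      onReady: (event: any) => event.target.playVideo(),
    },
  });
};
```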