Visual Cues for Locating Out-Of-View Objects in Mixed Reality


Faculty II – School of Computing Science, Business Administration, Economics, and Law

Visual Cues for Locating Out-of-View Objects in Mixed Reality

A dissertation accepted by the School of Computing Science, Business Administration, Economics and Law of the Carl von Ossietzky University Oldenburg (Germany) to obtain the degree and title Doctor of Natural Sciences (Dr. rer. nat.)

By Uwe Grünefeld, M.Sc.
Born on 11th of June 1990, in Leer, Germany

Supervisors: Dr. Wilko Heuten, Prof. Dr. Susanne Boll-Westermann
Reviewers: Prof. Dr. Susanne Boll-Westermann, Prof. Dr. Hans Gellersen
Day of Defense: 7th of February 2020

Abstract

In the past decade, Mixed Reality has emerged as a promising technology for supporting users with everyday tasks. It allows one to alter perceived reality by blending it with virtual content. Mixed Reality can thus be used to overlay perceived reality with visual cues that empower users to locate relevant objects in the environment, regardless of whether they are in view. This approach seems auspicious for various scenarios, such as a traffic encounter in which a car driver overlooks a cyclist, or the docking process of large container vessels during which the person in charge must monitor several assisting tugboats. In such situations, visual cues that help users to locate relevant objects out of view can improve situational awareness and help to avoid fatal consequences. In the presented work, we follow the “Research through Design” methodology to develop and evaluate visual cues in Mixed Reality that empower users to locate out-of-view objects. Initially, we analyzed three different scenarios, conducting an ethnographic study, an accident analysis, and literature reviews.
Thereafter, inspired by the different scenarios, we reviewed all relevant background and related work to derive three research questions that are studied in depth in the presented work: (RQ1) To what extent can existing off-screen visualization techniques be adapted to cue the direction to out-of-view objects in Mixed Reality?, (RQ2) How can Mixed Reality devices with small fields-of-view be extended to present directional cues to out-of-view objects?, and (RQ3) In what way can the directions and distances of out-of-view objects be visualized for moving or non-moving objects in Mixed Reality?

Our results show that directional cues presented in the user’s periphery are easy to perceive and help the user to locate objects quickly. We showed that using three-dimensional visual cues can result in a lower direction estimation error than using two-dimensional cues. Furthermore, if the used Mixed Reality device suffers from a small field-of-view, radial light displays presented around the screen can be used to cue direction for out-of-view objects. However, sometimes a user must simultaneously locate several out-of-view objects, some of which may be occluded. Directional cues alone are insufficient for such cases, so distance information is required as well. We found that it is possible to convey this information, albeit with increased workload and additional visual clutter. Furthermore, visual cues that help a user to locate out-of-view objects should not disappear when these objects are visible on the screen, since continued assistance when these objects appear in view reduces error rates and improves the overall performance of the visual assistance.

Zusammenfassung

Over the past decade, Mixed Reality has become a promising technology for supporting users with everyday tasks. Mixed Reality makes it possible to blend the perceived reality with virtual content.
In this way, the reality perceived by the user can be overlaid with visual cues that help to locate relevant objects in the environment, even when they lie outside the user’s field of view. This approach is promising for various scenarios, for example in road traffic, where a car driver overlooks a cyclist while turning, or in the case of a docking container vessel, where the person in charge must monitor several assisting tugboats at the same time. In all of these scenarios, visual cues that enable users to locate relevant objects out of view can help to improve situational awareness and to avoid dangerous consequences.

In this thesis, we follow the Research-through-Design methodology to develop and evaluate visual cues in Mixed Reality that enable users to determine the position of out-of-view objects. Initially, we examined three different scenarios by means of an ethnographic study, an accident analysis, and literature reviews. Subsequently, inspired by the different scenarios, we analyzed all relevant background and related work to derive three research questions that are examined in detail in this thesis: (RQ1) To what extent must existing off-screen visualization techniques be adapted to indicate the direction to out-of-view objects in Mixed Reality?, (RQ2) How can Mixed Reality devices with a small field of view be extended to present directional cues to out-of-view objects?, and (RQ3) In what way can the direction and distance to static or moving objects outside the field of view be visualized in Mixed Reality?
Our results show that visual cues presented in the periphery are easy to perceive and help to locate out-of-view objects quickly. We were able to show that three-dimensional visual cues lead to lower direction estimation errors than two-dimensional cues. Furthermore, radial light displays can extend the existing display to indicate directional information when its field of view is problematically small. However, if a user has to locate several out-of-view objects that are also occluded, directional information alone is not sufficient and additional distance information is required. We were able to show that it is possible to present this additional information, albeit at the cost of increased workload and additional visual occlusion. Moreover, visual cues that help the user to locate out-of-view objects should not disappear once these objects become visible on the screen, in order to reduce error rates and make the visual assistance more effective overall.

Acknowledgments

This thesis would not have been possible without many people who supported me during my time as a doctoral student. I would like to express my gratitude to my supervisor Wilko Heuten for the great support, useful discussions, remarks, and engagement throughout the learning process. Furthermore, I would like to thank Susanne Boll for her helpful advice during the discussions and presentations of this work, and for the opportunity to conduct my research. My thanks also go to my committee members Hans Gellersen, Marius Brinkmann, and Martin Fränzle. Thank you for your time and efforts. Your valuable feedback helped me to further improve this thesis. A special thank you goes to my colleagues, my collaborators, my students, my friends, and my family.
Without the help of every single one of you, this work would not have been possible, and I am very thankful that I received your support and advice throughout the years. Last but not least, I would like to thank Dakota Smith, who helped me to proofread my thesis and never declined my attempts to discuss its content with her.

Contents

1 Introduction
  1.1 Scenarios
  1.2 Approach and Methodology
  1.3 Thesis Outline
  1.4 Publications
2 Background
  2.1 Human Visual Perception
  2.2 Cognitive Maps
  2.3 Mixed Reality
3 Related Work
  3.1 Off-Screen Visualization Techniques
  3.2 Head-mounted Peripheral Displays
  3.3 Attention Guidance
4 Conceptual Design
  4.1 Problem Space
  4.2 Strategies for Locating Out-of-View Objects
  4.3 Peripheral Visualization
  4.4 Head-mounted Mixed Reality
  4.5 Challenges
  4.6 Research Questions
5 Adapting Off-Screen Visualization Techniques
  5.1 Comparing Off-Screen Visualization Techniques
  5.2 Beyond Halo and Wedge: 3D Visual Cues
  5.3 FlyingARrow: Flying Towards Out-of-View Objects
  5.4 Summary
6 Extending the Field-of-View of Mixed Reality
  6.1 PeriMR: Prototyping Peripheral Light Displays
  6.2 RadialLightVR: Investigating Radial Displays in Virtual Reality
  6.3 MonoculAR: Single Radial Display for Augmented Reality
  6.4 Guiding Smombies: Low-Cost Glasses for Guiding Attention
  6.5 Summary
7 Encoding the Out-of-View Object’s Position
  7.1 EyeSee360: Direction and Distance of Out-of-View Objects
  7.2 EyeSee360 for Optical See-Through Augmented Reality
  7.3 Comparing Techniques for Moving Out-of-View Objects
  7.4 Summary
8 Conclusions and Outlook for Future Work
  8.1 Synopsis
  8.2 Contributions to the Research Questions
  8.3 Recommendations for the Scenarios
  8.4 Future Work
  8.5 Concluding Remarks
A System Usability Scale Questionnaire
B NASA Raw Task Load Index Questionnaire
Figures
Tables
Bibliography

1 Introduction

Awareness of objects in the environment is an important ability required for everyday life. Without information about the locations of objects around us, there