Understanding Remote Collaboration in Video Collaborative Virtual Environments

A thesis submitted in partial fulfilment of the requirements for the Degree of Doctor of Philosophy in the University of Canterbury

by Jörg Hauber

Supervising Committee
Dr. Andy Cockburn (Supervisor)
Dr. Mark Billinghurst (Co-Supervisor)
Dr. Holger Regenbrecht (Associate Supervisor)
Dr. Dean Owen (Associate Supervisor)

University of Canterbury, 2008

Publications from this dissertation

Material from this dissertation has been previously published in the peer-reviewed papers, posters, and extended abstracts listed below. The chapters of this thesis that relate to each publication are noted, and my individual contribution to each paper is described in the paragraphs that follow the list.

1. Hauber, J., Regenbrecht, H., Hills, A., Cockburn, A. and Billinghurst, M. (2005) Social Presence in Two- and Three-dimensional Videoconferencing. Presented at PRESENCE 2005, 8th Annual International Workshop on Presence (London, Sep. 21-23, 2005). (Chapter 4)

2. Hills, A., Hauber, J. and Regenbrecht, H. (2005) Videos in Space: A Study on Presence in Video Mediating Communication Systems. Presented at the 15th International Conference on Artificial Reality and Telexistence (ICAT 2005) (Christchurch, Dec. 5-8, 2005). (Chapter 5)

3. Regenbrecht, H., Haller, M., Hauber, J. and Billinghurst, M. (2006) Carpeno: Interfacing Remote Collaborative Virtual Environments with Table-Top Interaction. Virtual Reality - Systems, Development and Applications, 10(2), September 2006. Springer, London, 95–107. (Chapter 6)

4. Hauber, J., Regenbrecht, H., Billinghurst, M. and Cockburn, A. (2006) Spatiality in Videoconferencing: Trade-offs between Efficiency and Social Presence. In Proceedings of the 2006 20th Anniversary Conference on Computer Supported Cooperative Work (CSCW '06), Banff, Alberta, Canada, November 04-08, 2006. ACM Press, New York, NY, 413–422. (Chapter 6)

5. Hauber, J. (2006) Improving Video-Mediated Communication with a Collaborative Virtual Environment Approach. Doctoral Colloquium abstract, ACM Conference on Computer Supported Cooperative Work (CSCW '06), Banff, Alberta, Canada, November 2006.

6. Hauber, J., Regenbrecht, H., Billinghurst, M. and Teoh, H. (2006) Presence in Video-Mediated Communication: Does Being There Help to Come Together? Poster, ACM Conference on Computer Supported Cooperative Work (CSCW '06), Banff, Alberta, Canada, November 2006. (Chapter 7)

Most of the papers resulted from collaborative work. My contributions to these papers were as follows (numbers correspond to the list above):

1. This study was run by me with the assistance of Aimee Hills and under the supervision of Dr. Regenbrecht. I was responsible for the experiment design and data analysis, and I wrote the paper with help from Dr. Regenbrecht, Dr. Cockburn, and Dr. Billinghurst.

2. This study was designed and conducted collaboratively by Aimee Hills and me, again under the supervision and with the assistance of Dr. Regenbrecht. However, I analysed the gathered data separately and wrote most of the paper.

3. The Carpeno project was primarily a research effort by Dr. Regenbrecht and Dr. Haller. I helped with the implementation of the prototype and conducted two informal user studies during its demonstration at two local conferences. I also wrote the part of the paper describing the findings of these studies. The experience gathered with both the hardware and the user feedback influenced the design of my follow-up experiments.

4. This was my own research. My supervisors Dr. Regenbrecht, Dr. Billinghurst, and Dr. Cockburn advised me during the design and data-analysis phases and repeatedly reviewed the resulting paper, which I wrote.

5. This extended abstract was written by me with the assistance of my supervisors Dr. Cockburn and Dr. Billinghurst.

6. This poster was designed by me and reflected my ideas for the design of a study that I carried out later in a slightly modified form (see Chapter 7).

Signature                    Date

Dedicated to Sandra

Abstract

Video-mediated communication (VMC) is currently the prevalent mode of telecommunication for applications such as remote collaboration, teleconferencing, and distance learning. It is generally assumed that transmitting real-time talking-head videos of participants in addition to their audio is beneficial and desirable, enabling remote conferencing to feel almost the same as face-to-face collaboration. However, compared to being face-to-face, VMC still feels distant, artificial, cumbersome, and detached. One limitation of standard video-collaboration that contributes to this feeling is that the 3D context between people and their shared workspace, which is given in face-to-face collaboration, is lost. It is therefore not possible for participants to tell from the video what others are looking at, what they are working on, or who they are talking to.

Video Collaborative Virtual Environments (video-CVEs) are novel VMC interfaces which address these problems by re-introducing a virtual 3D context into which distant users are mentally "transported" to be together and to interact with the environment and with each other, represented by their spatially controllable video-avatars. To date, research efforts following this approach have primarily focused on the demonstration of working prototypes. However, maturation of these systems requires a deeper understanding of the human factors that emerge during mediated collaborative processes.

This thesis contributes to that understanding. It investigates the hypothesis that video-CVEs can effectively support face-to-face aspects of collaboration which are absent in standard video-collaboration. The hypothesis is tested in four related comparative user studies involving teams of participants collaborating in video-CVEs, through standard video-conferencing systems, and face-to-face. The experiments apply and extend methods from the research fields of human-computer interaction, computer-supported cooperative work, and presence.

Empirical findings indicate benefits of video-CVEs for user-experience dimensions such as social presence and copresence, but also highlight challenges for awareness and usability that need to be overcome to unlock the full potential of this type of interface.

Table of Contents

List of Figures
List of Tables

Chapter 1: Introduction
  1.1 Research context
  1.2 Problem statement and research hypothesis
  1.3 Contributions
    1.3.1 Methodological contributions
    1.3.2 Substantive contributions
  1.4 Structure of this thesis

Chapter 2: Background
  2.1 Getting a grip on communication
    2.1.1 Shannon and Weaver's model of communication
    2.1.2 Interpersonal communication
    2.1.3 Collaboration
    2.1.4 Criticisms of transmission models
    2.1.5 Summary
  2.2 Mediated communication
    2.2.1 Media characteristics
    2.2.2 Media selection
    2.2.3 Remote collaboration in shared workspaces
  2.3 Video-mediated communication
    2.3.1 Review of empirical work in video-mediated communication
    2.3.2 Improving video-mediated communication
  2.4 Using virtual reality (VR) as a communication medium
    2.4.1 The sense of presence in virtual environments
    2.4.2 Collaborative virtual environments (CVE)
    2.4.3 The affordances of CVEs for remote collaboration
    2.4.4 Examples of CVEs
    2.4.5 Empirical research in CVEs
  2.5 Chapter summary

Chapter 3: Research Approach
  3.1 A CVE approach to video-conferencing
    3.1.1 The video-CVE "cAR/PE!"
  3.2 Method
    3.2.1 A comparative approach
    3.2.2 Independent measures
    3.2.3 Dependent measures
  3.3 Overview of the experiments

Chapter 4: Experiment: "Desert Survival Game"
  4.1 Experiment design
    4.1.1 Experimental task
    4.1.2 Experiment conditions
    4.1.3 Apparatus
    4.1.4 Participants
    4.1.5 Procedure
    4.1.6 Measures
  4.2 Data analysis
    4.2.1 Social presence as quality of the medium
    4.2.2 Social presence as the experience of the other
    4.2.3 Exploratory questionnaire items
    4.2.4 Scale validity
    4.2.5 Age and simulator experience effects
  4.3 Results
    4.3.1 Social presence as a quality of the medium
    4.3.2 Social presence as the experience of the other
    4.3.3 Exploratory questionnaire items
    4.3.4 Scale validity
    4.3.5 Age and simulator experience effects
    4.3.6 User observation in condition vCVE
  4.4 Discussion
  4.5 Experiment summary

Chapter 5: Experiment: "Dream House"
  5.1 Experiment design
    5.1.1 Experimental task
    5.1.2 Experiment conditions
    5.1.3 Apparatus
    5.1.4 Participants
    5.1.5 Procedure
    5.1.6 Measures
  5.2 Data analysis
    5.2.1 Social presence
    5.2.2 Physical presence
    5.2.3 The relation between physical presence and social presence
    5.2.4 Reliability analysis
    5.2.5 Preference ratings
    5.2.6 Exploration of other factors
  5.3 Results
    5.3.1 Differences in social presence
    5.3.2 Differences in physical presence
    5.3.3 Exploring the relationship between social presence and physical presence
    5.3.4 Reliability analysis
    5.3.5 User comments and preference
    5.3.6 Gender and relationship effects
  5.4 Discussion
  5.5 Experiment summary

Chapter 6: Experiment: "Dogs and Owners"
  6.1 Experiment design
    6.1.1 Experimental task
    6.1.2 Experiment conditions
    6.1.3 Apparatus
    6.1.4 Participants
    6.1.5 Procedure
  6.2 Measures
    6.2.1 Social presence
    6.2.2 Copresence
    6.2.3 Usability parameters
    6.2.4 Video recordings
  6.3 Data analysis