
UC San Diego Electronic Theses and Dissertations

Title: RealityFlythrough: a system for ubiquitous video
Author: McCurdy, Neil James
Publication Date: 2007
Permalink: https://escholarship.org/uc/item/4gk7t002
Supplemental Material: https://escholarship.org/uc/item/4gk7t002#supplemental
Peer reviewed | Thesis/dissertation


UNIVERSITY OF CALIFORNIA, SAN DIEGO

RealityFlythrough: A System for Ubiquitous Video

A dissertation submitted in partial satisfaction of the requirements
for the degree Doctor of Philosophy in Computer Science

by

Neil James McCurdy

Committee in charge:

    Professor William G. Griswold, Chair
    Professor James D. Hollan
    Professor Leslie A. Lenert
    Professor Stefan Savage
    Professor Mohan M. Trivedi

2007

Copyright Neil James McCurdy, 2007
All rights reserved.

The dissertation of Neil James McCurdy is approved, and it is
acceptable in quality and form for publication on microfilm:

                                                        Chair

University of California, San Diego
2007


TABLE OF CONTENTS

Signature Page
Table of Contents
List of Figures
List of Tables
Acknowledgments
Vita
Abstract

Chapter 1  Introduction
    1.1  How does RealityFlythrough Work?
    1.2  Why does RealityFlythrough Work?
    1.3  Other Forms of Telepresence
    1.4  How do you Build RealityFlythrough?
        1.4.1  System Architecture
        1.4.2  Engine Architecture
        1.4.3  Handling Dynamic Environments
        1.4.4  Using Transitions to Compensate for Low Frame Rates
        1.4.5  Using Point-Matching to Augment Position Sensors
        1.4.6  Walking Through Hallways
    1.5  Organization of the Dissertation

Chapter 2  Presence
    2.1  Introduction
    2.2  Presence
        2.2.1  Properties of Presence
        2.2.2  Dimensions of Mediated Presence
    2.3  The Virtual Window
    2.4  Visual Eyes
        2.4.1  Goggles'n'Gloves
        2.4.2  CAVE's
        2.4.3  Visual Eyes
    2.5  Space Browser
    2.6  Augmented Virtual Environments
    2.7  Presence in RealityFlythrough
        2.7.1  Properties of Presence
        2.7.2  Content
        2.7.3  Beyond Being There
        2.7.4  Reflections
    2.8  Conclusion

Chapter 3  An Abstraction for Ubiquitous Video
    3.1  Introduction
    3.2  User Experience
    3.3  Requirements
    3.4  System Overview
    3.5  Engine Architecture
        3.5.1  Model-View-Controller
        3.5.2  Still Image Generation
        3.5.3  Transition Planner/Executer
        3.5.4  Camera Repository
    3.6  Evaluation
        3.6.1  Effectiveness of the Abstraction
        3.6.2  System Performance
        3.6.3  Robustness to Change
    3.7  Conclusion
    3.8  Acknowledgments

Chapter 4  Closure: Why the Illusion Works
    4.1  McCloud Closure
    4.2  Cognitive Film Theory as an Explanatory Tool
        4.2.1  The Human Visual System
        4.2.2  Seamless Motion
        4.2.3  Jump Cuts
        4.2.4  Clean Cuts
    4.3  Why does RealityFlythrough Work?
    4.4  Conclusion

Chapter 5  The Smart Camera
    5.1  Introduction
    5.2  Motivation
    5.3  Our Approach
    5.4  Related Work
    5.5  Hazmat Field Study
        5.5.1  Experimental Setup
        5.5.2  Results
        5.5.3  Followup
    5.6  Lab Study
        5.6.1  Experiment Setup
        5.6.2  Results
        5.6.3  Analysis
        5.6.4  Secondary Study
    5.7  Conclusion
    5.8  Acknowledgments

Chapter 6  Putting it All Together
    6.1  Smart Camera
    6.2  Composite Camera
        6.2.1  Generating Composite Images
        6.2.2  Point-matching Inconsistencies
        6.2.3  Integration with RealityFlythrough
        6.2.4  Architectural Considerations
    6.3  Hitchhiking
    6.4  Walking Metaphor
    6.5  Temporal Controls
    6.6  Conclusion

Chapter 7  User Studies
    7.1  How Transitions Affect User Behavior
        7.1.1  Experiment
        7.1.2  Results
    7.2  The Effectiveness of Simple Transitions
        7.2.1  Results
    7.3  The Effectiveness of Complex Transitions
        7.3.1  Experimental Setup
        7.3.2  Experimental Results and Analysis
        7.3.3  Conclusion
    7.4  Using the Complete System
        7.4.1  Experimental Setup
        7.4.2  Experiences of the Jump Group
        7.4.3  Experiences of the Sequence Group
        7.4.4  Experiences of the Transition Group

Chapter 8  Conclusion
    8.1  User Interface
        8.1.1  Spatial Navigation
        8.1.2  Virtual Camera Interface
    8.2  Going Beyond Being There
        8.2.1  Augmented Reality
        8.2.2  Enhancing Temporal Controls
    8.3  Improving Path Plans
    8.4  Increasing Sensory Breadth
    8.5  Understanding Closure
    8.6  From Research to Product
        8.6.1  Multi-viewer Support
        8.6.2  Multi-story and Altitude Support
        8.6.3  Better Hardware
    8.7  Final Thoughts

Bibliography


LIST OF FIGURES

Figure 1.1: A screenshot of a typical RealityFlythrough session.
Figure 1.2: Snapshots of a transition.
Figure 1.3: An illustration of how the virtual cameras project their images onto a wall.
Figure 1.4: Snapshots of a point-matched transition.
Figure 1.5: This figure shows a transition that takes a less than ideal path through two walls. As figure 1.7a shows, the problem is that the user is traveling as the crow flies.
Figure 1.6: This figure shows a transition that is similar to the one shown in figure 1.5 only this time the user does not walk through walls. Figure 1.7b shows the path that was taken.
Figure 1.7: These two figures show the paths that were taken when completing the transitions shown in figures 1.5 and 1.6. …