863

Subject Index

‘Note: Page numbers followed by “f” indicate figures, “t” indicate tables and “b” indicate boxes.’

A Affordances, 112–114 A/D conversion. See Analog to Digital conversion in , 114–117 (A/D conversion) false affordances, 116 AAAD. See Action at a distance (AAAD) reinforcing perceived affordances, 116–117 AAR. See After-action review (AAR) After-action review (AAR), 545f, 630, 630f, 634–635, 645, Absolute input, 198–200 761 Abstract haptic representations, 439 Affordances of VR, 114–117 Abstract synthesis, 495 Agency, 162, 164, 181 Abstraction triangle, 448 Agents, 552–553, 592–593, 614, 684–685 Accelerometers, 198–199, 218 AIFF. See Audio interchange file format (AIFF) Accommodation, 140–141, 273–275, 320, 570, 804 Airfoils, 13 Action at a distance (AAAD), 557–558 AIs. See Artificial intelligences (AIs) Activation mechanism, 554–556, 583 Aladdin’s Magic Carpet Ride VR experience, 185–186, 347, Active haptic displays, 516 347f, 470, 505, 625, 735, 770–771 Active input, 193–196 Alberti, Leon Battista, 28 Active surfaces, 808 Alice system for programming education, Adaptability, 122–123 758–759 Adaptive rectangular decomposition (ARD), 502 AlloSphere, 51–52, 51f, 280 Additive sound creation techniques, 499 Allstate Impaired Driver Simulator, 625, 629f Advanced Realtime Tracking (ART), 53–54, 213f Alpha delta fiber (Aδ fiber), 149 Advanced Robotics Research Lab (ARRL), 369 Alphanumeric value selection, 591–593 Advanced systems, 11–12 Ambient sounds, 436–437, 505 Advanced texture mapping techniques, 469–473 Ambiotherm device, 372, 373f Adventure (games), 12 Ambisonics, 354 Adverse effect, 351 Ambulatory platforms, 242–243 Aestheticism, 414 American Sign Language (ASL), 552–553 Affine transformations, 486–487 Amount/type of information, 196–198 864 | Subject Index

Amplification, 349–350 ARToolKit (ARTK), 48–49, 715–716 Amplifier, 349–350, 349f Ascension Technologies, 41–44, 44f, 46–47, 86–87 Anaglyphic 3D, 270f, 7f, 30, 49, 269–270, 271f ASL. See American Sign Language (ASL) Analog to Digital conversion (A/D conversion), 198, 202–203 Assembly line trainer, 591 Analog to Digital converter (A/D converter), 492 Associability with sense displays, 293, 351–352, 365–366 Angular measurement, 131–132 Asymmetric bimanual manipulations, 563 Animated texture maps, 473 Asynchronous communication, 643–645, 710 Annotation, 642, 644–648, 644f, 648f–649f, 744 Asynchronous time warping. See Time warping Anstey, Josephine, 84–85, 84f–85f, 609f Atari, 35, 35f, 39 Aperture selection technique, 584–585 Atmospheric effects, 139 Aperture-based manipulation, 597 Atmospheric extinction effects, 468–469 App launcher, 751 Audience, 9, 68–71, 69f–70f, 73, 75–76, 82, 84, 86–87, 91, Apparatus for Exhibiting Pictures (Robert Barker), 28 97–98, 100, 111, 141–142, 300–301, 316f, 402–403, Apple Computer, 37, 51–53, 93, 331–332, 338, 800–801. 410, 506, 554, 634, 636, 677–678, 765–769 See also ARKit SDK (Apple Computer); Siri (Apple Audio compression techniques, 806 Computer) Audio displays, 805–806 Application development software, 816–819 Audio input, 252–255 Application software predictions, 819–820 Audio interchange file format (AIFF), 510–511 Applications of VR, 88, 728–730. See also Index of Media Audio signals, 606 Experiences Audition, 119–121 Aquarium VR, 18, 301–305 cochlear nerve, 143–144 AR. See (AR) human physiology for, 142–144 ARCore SDK (Google), 340, 796–797 inner ear, 143 ARD. See Adaptive rectangular decomposition (ARD) limits of human hearing, 144 Argonne Remote Manipulator (ARM), 375, 375f, 377–378 middle ear, 143 Aristotelian physics, 704–705 outer ear, 142–143 ARkit SDK (Apple Computer), 340, 796–797 Augmented reality (AR), 18–26, 262, 431, 551, 691, 792, 805. ARM. See Argonne Remote Manipulator (ARM) See also Virtual reality (VR) Arm flapping, 196 display, 325f ARRL. See Advanced Robotics Research Lab (ARRL) increased use, 792–795 Ars Electronica, 46–47, 47f, 393f, 677–678 Augmented Reality Toolkit (ARTK). See ARToolKit (ARTK) ART. See Advanced Realtime Tracking (ART) Aural displays, 343–357 Art/entertainment, 747–748 amplification, 349–350 Artifacts, 76, 684 associability with sense displays, 351–352 Artificial intelligences (AIs), 640 aural presentation properties, 344–350 , 13, 20 cost, 352–353 Artificial Reality II, 20 encumbrance, 352 Artistic worlds, 748–749 environment requirements, 351 Artists role in VR, 91–92 interface with tracking methods, 351 865 | Subject Index

latency tolerance, 350 B localization, 346–348 Background sound. See Ambient sounds logistic qualities, 350–353 Baked lighting, 470 , 348–349 Baking, 469 noise pollution, 350 Balance. See Vestibulation number of display channels, 348 Bandwidth, 196–198 paradigms, 353–357 Batch-mode rendering, 480–481 combining aural display systems, Bauhaus-Universitat Weimar Virtual Reality and 355–356 Visualization Research Group, 271–272, 311, 431f, hand-based aural displays, 355 590f, 628, 692f head-based aural displays, 354–355 BDI Suture Trainer, 442, 597, 795 stationary aural displays, 353–354 Beam tracing, 498 portability, 352 Beam-scan tracking, 216–217 safety, 352 “Beeping” sound, 419 sound stage, 345–346 Behavioral responses, 172–174 throughput, 352 BIBS. See Body image and body schema (BIBS) user mobility, 350 Bidirectional reflectance distributions functions (BRDFs), Aural illusions, 144–145 468 Aural localization cues, 145–148 Billboarding, 471 Aural perception, 142–148 BIM systems. See Building Information Management human physiology for audition, systems (BIM systems) 142–144 Bimanual interface, 563–565 Aural representation in VR, 432–445 Binocular monoscopic displays, 270 features of sound, 432–433 Binocular stereoscopic display, 303 sound in VR experience, 434–438 Binoculars , 26 Authorship, 80 Biological sensor technologies, 233–234, 234f Auto-enhancement techniques, 817 Biometrics, 81 Auto-stereo displays, 267–268, 803 Bladder AutoCAD, 483 actuators, 369 Autostereoscopic image, 30 bladder-based pressure rendering, 518 Avatars, 14–15, 78–79, 79f, 177 Blazing Saddles, 437, 437f full body, 79f Blender, 687–688 range of motion, 429 Blinn, Jim, 248f, 467 Avzio tool, 756 Blueprint system, 814–815 Axes of translation, 200f, 561b–563b Bluetooth AxonVR, 808 communication game controllers, 333 Azimuth, 561b–563b input device, 207 Aδ fiber. See Alpha delta fiber Body image and body schema (BIBS), 177 (Aδ fiber) Body location illusions, 179 866 | Subject Index

Body motion, 129 California Institute for Telecommunications and Information Body ownership illusion (BOI), 178–181 Technology (CALIT2), 53. See also Qualcomm Body posture and gesture recognition, 249–252 Institute Body referential zones, 251, 569, 589, 590f, 617f, 760f Caloric nystagmus illusion, 157–158 Body tracking, 226 Camera, 213 Body-based displays, 260 camera-based sensors, 193 Body-grounded system, 380 camera-based tracking technology, 800 Boeing’s wire bundle construction application, 728–729 depth of field emulation, 469 BOI. See Body ownership illusion (BOI) tracking technologies, 810 BoilerMaker visualization, 608. See also BoilerMaker in Canonical “killer app”, 799 Index of Media Experiences Canonical axis, 560 Bolas, Mark, xvii-xviii, 41–42, 48–49, 472 Cardboard phone viewer (Google), 48–49, 56–57, 204, 208, “Book problem”, 168 275, 298–299, 331–334, 332f, 629, 789f, 790–791, BOOM ((Fakespace), 40–42, 288–289, 297, 317–319, 319f, 821 678f Carrier media, 61–62 Braille, 443 Carrier/modulator frequency ratio (C/M frequency ratio), BRDFs. See Bidirectional reflectance distributions functions 499 (BRDFs) Cartesian coordinate system, 486, 559–560 Bread crumbs, 607 Cartography, 402–403 Breaking the frame, 280–281, 281f, 301 Cartoon physics, 703–704, 704f Brightness, 139, 266 CastAR HMPD system, 55–56, 335 British ITC television network, 168 Caterpillar Inc. Brooks, Frederick P. Jr., xxi, 34, 388, 767, 769 Virtual Prototyping System, 567, 625. See also Brown University, 594–595 Caterpillar’s Virtual Prototyping System (VPS) in the Building Information Management systems (BIM systems), Index of Media Experiences 793, 820 VR work of, 246f, 434 Built-in object and travel constraints, 557 Cathode ray tubes (CRTs), 34, 263–264 Bump-mapping, 467 CAVE (CAVE Automatic Virtual Environment), 199, 293–294, Burton Box, 36 293f, 295f, 570 Button inputs, 202 application, 623 CAVE2 systems, 18, 54–55 C environment, 440 C fiber, 149 licensee Pyramid Systems, 47–48 C-reps, 413–414 style, 47–48, 96, 140–141, 263, 269, 337, 642–643 C/M frequency ratio. See Carrier/modulator frequency ratio system, 17–18, 41–42, 47f (C/M frequency ratio) Cave of Lascaux experience, 679–680 CAE Healthcare “LapVR” laproscopic simulator, 563–564, Cave paintings, 5, 5f–6f 597 CAVElib VR integration library, 94, 715–716 Café Wall Illusion, 134–135, 135f Celiac Plexus Block Simulator, 442, 597 867 | Subject Index

Celsius scale, 424 Collaborative interaction, 640–651 CFF frequency. See Critical flicker fusion frequency annotation, 645–648 (CFF frequency) communication among people, 641–643 Change blindness, 125–127, 126f floor control, 648–649 redirection, 127f, 667, 669–670 synchronous and asynchronous communication, 643–645 Channel capacity, 196 world congruity, 649–651 “Chaperone” system, 320–321, 321f Collada Digital Asset Exchange (DAE), 483 for HTC Vive, 694 Collision interaction, 709 Chemoreceptors, 121 Color field sequential displays, 265 CHI. See Human–computer interaction (HCI) Colors, 132, 264 Choreographed physics, 706–707 Columbia University, 736f, 748f Chorusing, 500 Combat games, 758–759 Cinema of Attraction, 92–93, 790–791. See also Media of Combining tracking methods, 224 Attraction Combustion processes, 460 Cinematic-VR, 13, 168–169 Command gestures, 250 Cinematrix, 639f, 639b Communication, 65–71, 546 Cinerama, 32 through medium, 61–62 Circular polarization, 310 process, 661 Citizen Kane (motion picture), 87, 727–728 Compass indicators, 608 Clark, Jim, 36, 38, 756, 800 Complete physics , 548–549 Classic motion base platform, 389–390 Complex visual scene rendering, 461–475 Classic Snellen visual acuity, 131–132 rasterization rendering pipeline, 464–465 Classroom, VR in, 789–790 reducing polygon count, 466–469 Climate control device, 519 shading, 465–466 Climb-in style, 287–288 Compromise rendering technique, 474 “Climbball” game, 314b–316b, 316f Computation, 451–454 Clock-style representation of time, 630 Computer graphics, 454 Closed-ear headphones, 354 rendering, 141–142, 480–481. See also Visual rendering Closure, 405–406 Computer program execution, 13 Cloud computing, 454 Computer systems, 117 applications, 252 Computer vision techniques, 213, 340 speech analysis tools, 553 Computer-based image generators, 799–800 “CloudLands” mini-golf game, 759 Computer-based , 7–8, 12–13 Clouds Over Sidra VR movie, 776–777 Computer-mediated interfaces, 115 Cochlear nerve, 143–144 Computer-supported cooperative work (CSCW), 640 Cockpit-style system, 245, 246f, 287–288 Concept representation. See C-reps Cognitively vested, 628 Conduit tool (Mechdyne), 819 Collaborative environments, 14–15 Connotation, 413–414 Collaborative experiences, 634 Constrained manipulation, 557 868 | Subject Index

Constrained space, 242–243 Coverbal gestures, 250 Constrained travel, 611 Crayoland application, 434–435, 671f Constraint(s), 557, 615 Creatorship, 80 of movement, 526–527 Cricket hand-held input device, 45f Constructive solid geometry (CSG), 455, 457–459, 459f Critical flicker fusion frequency (CFF frequency), 133 Consumer sound-production devices, 506 Cross-modal localization effect, 128 Contact-select, 581–582, 582f Cross-modal perception, 127–129 Context, 136–137, 671–681 Cross-reality (XR), 19 point of view, 671–675 Cross-sensory effects and virtual reality, 160–162 venue, 675–681 benefits of cross-modal effects, 160–161 Contextual factors of presence, 171 making VR design choices based on cross-modal effects, Continuous input, 196 162 Continuum properties, 263 negatives of cross-modal effects, 161 Contrast, 265–266, 266f sensorial precedence, 160 Control location, 565–572 Cross-sensory illusion, 137, 145, 153 Control order, 573–574, 616–618 Crowd funding, 53, 794, 800–801 Control visibility, 572–573 Crowd sourcing, 697–698, 794 “-through an aperture” placement CRTs. See Cathode ray tubes (CRTs) option, 571 Crumbs volume visualization tool, 480, 583 “–at hand” placement option, 569 Cruz-Niera, Carolina, 18, 43–44, 43f, 715 “–in-hand” placement option, 567 CSCW. See Computer-supported cooperative work (CSCW) “–in-world” placement option, 566 CSG. See Constructive solid geometry (CSG) “–on panel” placement option, 571 Cubist painter, 88 “–on the display” placement option, 570 Culling, 468–469 Conventional user interface design, 116 Curvature gain, 125 Convergence, 140 Cutaneous cues, 443–444 “Conversation, The” (motion picture), 403 Cutaneous mechanoreception. See Taction Convolution, 496 Cutaneous rabbit illusion, 153 filters, 496–497, 501 Cutaneous receptors, 150 Convolvotron system, 40, 509, 513–514 Cutty Sark Virtual Voyage experience, 621, 677–678, 776, Coordinate display, 610–611 787 Coordinate system, 559–560 Cutty Sark whiskey, 294 transformations, 486–489 Cyark, 818 Coordinate-directed selection, 579 CyberGlove, 41–42 Cornucopia of application areas, 738–753 CyberGrasp hand-based force feedback device, 47, 378f Correct rejection, 112–113 Cybernetic Systems device, 380–381, 381f Cortana (Microsoft), 553 Cybersickness. See Sim-sickness Cortical homunculus, 119 Cyberspace, 15, 18–26, 39–41, 167–168, Cost, 298–300, 352–353, 367–368 637–638, 640 869 | Subject Index

D Desert Research Institute (DRI), 272–273, 608f, 630f, 632f, 645 DAC. See Digital to analog converter (DAC) Design Augmented by Computer system (DAC system), 33 DAC system. See Design augmented by computer system Design review, VR, 744–745 (DAC system) Design tradeoffs, 769–770 “Dactyl Nightmare”, 41–42, 41f, 759, 760f Desktop fishtank systems, 307 DAE. See Collada Digital Asset Exchange (DAE) Desktop metaphor, 545 Dance, 72, 83, 89–90, 90f, 97, 97f Developing Virtual Reality Applications, 731 Daqri, 16, 234f, 325f, 569f, 792–793 Development cycle, 782 Smart Helmet, 24, 57, 324f, 453, 568, 802 DIDI. See Digital Image Design Inc. (DIDI) Smart Glasses, 23f, 297f Diegesis, 11, 187, 405 Dassault Systèmes, 693f, 715–716 Diegetic. See also Nondiegetic Data processing, 741 interface, 565–567, 566f Data sonification. See Sonification world, 768–769 DataGlove, 39, 40f, 715 Digital data entry glove interface device, 37–38 Daydream (Google), 333, 797 Digital Image Design Inc. (DIDI), 44 controller, 206, 206f, 219, 236b, 334, 810 Digital light processor (DLP), 51–52, 263–264 Dead man (aka hill) switch, 382, 441 Digital samples, 531 Decomposition, 476–477 Digital signal processor (DSP), 452, 506–509 Deconstructing presence, 165–168 Digital to analog converter (DAC), 490 DeFanti, Tom, 37, 43–44, 307, 309f Dimmer control for light fixture, 235 Degradation of accuracy over time, 218–219 Direct brain-tap, 222–223 Degree of freedom (DOF), 194, 196–197, 281, 362 Direct neural connections, 812–813 2-DOF, 549 Direct user control, 547–549, 614 input device, 196–197 Directed narratives, 84–87 3-DOF, 520–521 Direction selection, 575–580 tracked HBD, 206 DirectX 3D, 485 5-DOF, 527 DirectX offshoots, 464 6-DOF Discrete input, 196, 197f controls, 549 DISH system, 305, 306f, 488, 694 tracking lack, 334 Disk storage, 6 Degree of realism, 706f Disney, 47–48, 670, 679–680, 735. See also Aladdin’s Magic de la Pena, Nonny, 84 Carpet Ride VR experience Delta3D game engine, 718 DISH VR display system, 305, 306f, 488, 694 Denotation, 413–414 DisneyQuest, 47–50, 48f, 52, 241f, 390, 392f, 621, 784 Denouement, 84–87, 774 Disney Research, 30f, 444, 798 Depth cameras, 702 Center, 639, 790–791 Depth cues, 137 , 97–98, 98f, 285–286, 298, Depth information, 139–140 305, 306f, 488, 694, 770, 784 Depth-based skeletal tracking, 232 Walt Disney VR Studio, 185–186, 347, 347f, 470, 505, 720, 771 870 | Subject Index

Display arrangement, 109 E Display channels, 269f, 266–273, 267f–268f, Ear drum, 142–143 348, 362 “Early Morning Dreams” song, 738 Display of position coordinates, 610 Early reflections impulse responses (ERIRs), 501 Display technologies, 800–801, 803–809. See also Input Ecological validity, 163–164 technologies Education, 746 audio displays, 805–806 Edutainment, 747 haptic displays, 806–808 EEG measurements, 233–234, 234f olfactory and gustatory displays, 808–809 Effective immersion, 661 visual displays, 803–805 Effects boxes. See Sound postprocessors Distance(s), 137–142, 557–558 Egocentric frames of reference, 200, 200f, 486, 559–560, cues, 146–147 561b–563b, 562f, 604–605, 610, 615–616, 616f misperception, 140–141 Electric motors or hydraulics/pneumatics, 375–376 Distributed Virtual Environment library (DIVE), 715 Electromagnetic spectrum, 702 Divide and conquer wayfinding strategy, 602 Electromagnetic tracking systems, 211–213, 212f, 291, 675 DIY smartphone-VR holder, 331–332 Electronic digital computer, 31 DIY tracking system, 298–299 Electronic Visualization Lab (EVL), 18, 37, 43–46, 43f, D LP. See Digital light processor (DLP) 83f, 94, 271–272, 306–307, 308f, 365–366, 408f, Documentation, 747 429f–430f, 644, 756–757, 787–788 DOF. See Degree of freedom (DOF) Elements of Psychophysics (Fechner), 123 Dolby 3D. See Infitec 3D stereoscopy Elevation, 561b–563b Dolinsky, Margaret, 95, 238f, 308f Embodiment, 162, 177–183 Domain, 64 dimensions of, 178–179 Doppler effect, 433 increasing “reality” of virtual world, 183–186 Dots per inch (dpi), 264–265 inducing body ownership illusion, 178–181 Dow Jones Industrial Averages, 420 influence on participants through agency and body own- DPLEX decomposition, 476, 477f–478f ership, 181–183 Drag-and-drop desktop style, 250 transference of object permanence, 184–186 Dry sound samples, 511, 515 Emitting technology, 263–264 DS P. See Digital signal processor (DSP) Encoding, 400 Dual User Option (DUO), 46–47 Encumbrance, 295–297, 297f, 352, 367 Dual-handed fly-through, 623–624 End-effector haptic displays, 359, 374–380 Dungeon (games), 12 components, 375–376 Dungeons and Dragons role-playing features, 376–378 game, 681 interface issues, 378–380 DUO. See Dual User Option (DUO) Entertainment media, 144 Dynamic frame of reference, 564 Environment requirements, 292–293, 351, 365 Dynamic objects, 686 Environmental 4D effects, 358, 372–373, 372f, 444, 445f, Dynamic range of display, 265–266 447, 519–520 871 | Subject Index

Environmental actuators, 372, 372f user testing, 774–775 Equilibrioception. See Vestibular perception VR attaining goals, 725–730 ERIRs. See Early reflections impulse responses (ERIRs) External hand tracking, 811 Error correction, 524–525 Extra-physics modifications, 707 Eureka moment, 740–741 Extravehicular activity (EVA), 549 EVA. See Extravehicular activity (EVA) Eye tracking, 134, 231, 231f, 274–275, 457, 473–474, 576, Evans & Sutherland, 34–36, 452, 799–800 763, 811–812 EVL. See Electronic Visualization Lab (EVL) EyePhones from VPL, 39 EVL CAVE, 94 Eyes, position tracking body, 231, 231f EVL CAVE2 hybrid reality environment, 308f Exemplary VR experiences, 753–761 F Exerting force on virtual object, 596–597 Facebook, 56, 64t–65t, 80, 633, 749–750, 790–791, 794, Exocentric frames of reference, 200, 200f, 218, 486, 800–802, 820 559–560, 561b–563b, 562f, 604–605, 609–610, Fakespace Systems, 47–48, 318–319, 319f, 322b–323b, 323f, 615–616, 616f 800–801 Exocentric view, 609–610 False affordance, 112–113 Exoskeletal device, 379 in virtual reality, 116 Experience Fantastic Contraption puzzle game, 589, 590f conceiving new VR application, 731–737 Faux details, 469 conception and design, 725 FBX. See Filmbox (FBX) cornucopia of application areas, 738–753 FE method. See Finite element method (FE method) creation process, 719–722 Fechner’s Law, 123 first VR application, 721–722 Feedback, 116–117 design tradeoffs, 769–770 loop, 68–71 designing VR experience, 761–777 user interactions, 556 design deliberately, 761–762 Feet, tracking, 232–233 design to engaging audience, 768–769 Feiner, Steve, 161, 251f, 447, 736f design with audience in mind, 765–767 Ferry Porter Law, 134 design with system in mind, 763–764 Fidelity, 363 design with venue in mind, 764 Fiducial landmarks, 215–216 prototype, 762 Fiducial markers, 325–326, 326f document, deploy, and evaluating experience, 775–777 Field of regard (FOR), 278–281, 279f, 431 end of experience, 771–774 Field of view (FOV), 131, 277–278, 278f, 427 exemplary VR experiences, 753–761 Field-sequential color display, 264 information vs., 88–92 File storage formats past and future of VR design, 777–778 audio, 69f, 348, 510–511 playback, 644 haptic, 531–532 social interactions, 769 polygonal, 483–484, 722 user objective, 771 Film Comment, 788 872 | Subject Index

Film: Form and Function (Wead and Lellis), 82 Ford Galaxy VR experience, 772 Filmbox (FBX), 483, 722 Form, 87–88, 362–363 Filters, 495 Formal user analysis studies, 542 Finger surrogate, 380 Four-dimension (4D) Finger-contact gloves, 588–589. See also Pinch Gloves 4D effect, 372. See also Environmental 4D effects Finite element method (FE method), 500–501 4D film, 372. See also Environmental 4D effects First-person POV, 672 Fourier analysis, 499 First-person shooter (FPS), 748–749 4K TV. See Ultra-resolution TV First-person view (FPV), 674 FOV. See Field of view (FOV) Fisher, Scott, 35, 38–42, 678f, 738 FOV2GO phone viewer (MxR Lab/USC/ICT), 331–332, 332f Fishtank virtual reality, 18, 213, 303–305, 793–794 , 473–474 components, 302–303, 302f FPS. See First-person shooter (FPS)/See Frames per second display, 301, 301f (FPS) interface issues, 304 FPV. See First-person view (FPV) Flashback, 82 Frame of reference (FOR), 198–200, 486, 558–563, 615–616 Flat shading, 465 Frame rate, 133–134, 285–286, 664–665 Flat-panel displays, 306, 309 Free view stereoscopic image, 290 Flavor, 129 FreeVR VR Integration Library, 715–716, 815 Flight simulation, 10–14, 30–31, 31f, 245, 287–288, 306, Frequency modulation (FM), 493–494, 499 389–390, 425, 451, 452f, 481, 486, 534–535, Frequency of waves, 490 561b–563b, 562f, 609–610, 639b, 770, 799–800 Fresnel lenses, 275–276 “Flock of Birds” system from Ascension Technologies, 43–44 FRHC. See Force Reflecting Hand Controller (FRHC) Floor control, 648–649 Friction Floor space requirement, 286–287 bitmap, 531–532 Fly-Through, 621–624 friction-based texture rendering, 519 FM. See Frequency modulation (FM) Fröhlich, Bernd, 272, 324–325. See also Bauhaus- Focal distance, 273–275, 274f, 304f Universitat Weimar Virtual Reality and Visualization Fool the user, 667–670 Research Group; German National Research Center Foot-coupled display category, 358–359 for Information Technology (GMD) FOR. See Field of regard (FOR); Frame of reference (FOR) Furness, Tom, 38, 40, 91 Force Fuse button, 75f, 251, 334, 555, 576 display device, 148, 357, 442, 442f, 529 feedback, 148 G force-feedback displays, 357 GAF Viewmaster, 280 force/resistance displays, 360 Gain, 573–574, 616–618 perception, 441 Game controllers, 235, 333–334 rendering, 520–521 Game Developers Conference (GDC), 57, 92 representations, 441–442 Game engine, 713, 715, 717–718 Force Reflecting Hand Controller (FRHC), 362, 362f using game engines in VR development, 718 873 | Subject Index

Gamepads, 235–236 GMD. See German National Research Center for Information gamepad-style controller, 236 Technology (GMD) Garage VR enthusiasts, 785–786 Go-go method, 583–584 Gartner, Inc., 782, 782f Golgi joint organs, 151 Gaze-directed fly-through, 622 Golgi tendon organs, 151 Gaze-directed selection, 576 Google, 675–676, 698, 775–776, 785, 786f, 800–801, 818. Gaze-directed user interface, 568, 569f See also ARCore (Google); Cardboard phone viewer GDC. See Game Developers Conference (GDC) (Google); Daydream (Google) GearVR phone viewer (Samsung/ VR), 207, 207f, 288, Game Engine software plugin, 721–722. 331–333, 790–791 See also Google Earth VR in Experience Index General Haptics Open Software Toolkit (GHOST), 531–532 , 55–56, 234f, 253, 317–318. General-purpose GPUs (GPGPUs), 481–482 See also Soli project (Google); Tango project General-purpose sound renderers, 506 (Google); Tilt Brush by Google in Experience Index Generalization, 410–411 Worldsense, 796–797 Genre, 87–88, 164 Google Earth VR, 437–438, 698f, 756–757, 768, 768f, Geographic maps, 402–403 775–776, 799, 819 GeoMagic Touch display, 376f Gouraud shading, 465 Geometric computation methods, 498 GPGPUs. See General-purpose GPUs (GPGPUs) Georgia Institute of Technology, 423f, 430f, 432, 651, 652f, GPS. See Global Positioning Satellite System (GPS) 787, 788f GPU. See Graphics processing unit (GPU) German National Research Center for Information Grabbing, 560–561 Technology (GMD), 45 grab-with-fist interaction, 547–548 Gernsback, Hugo, 31, 33, 33f Granular synthesis, 499–500 Gestalt approach to perception, 414–417, 415f Graphic(s) Gesture(s), 555–556 engines, 481–482, 800 communication, 552–553 images, 17 inputs, 205 latency tolerance, 285 recognition, 249–252 rendering. See also Visual rendering GHOST. See General Haptics Open Software Toolkit shader algorithms, 466f (GHOST) Graphics processing unit (GPU), 302, 452–453, 481–482 “Ghostbusters: Dimension” experience, 677–678, 678f Grasping, 527 Ghosting effect, 269 Gravity Sketch tool, 631, 756, 817 Github, 98 Green-screen video, 96 Glasses-style, 288 GRiD Compass laptop, 38–39 GLES, 464 Grid structure, 610–611 Global attribute modifications, 598–599 Gridded wall, 693f Global network strategy, 602 GRIP application, 388 Global Positioning Satellite System (GPS), 225 GROPE project, 34 Glyphs, 409 Grounding, 263, 360–361 874 | Subject Index

Group control, 639b end-effector displays, 374–380 Gun prop, 445, 445f fidelity, 363 Gustation, 119–121, 394–395, 447 form, 362–363 perception, 159–160 grounding, 360–361 Gustatory displays, 808 haptic presentation properties, 360–364 Gyroscopes, 204, 218 interface with tracking methods, 365 Gyroscopic sensors, 198–199 kinesthetic and proprioceptic cues, 361 latency tolerance, 364 H logistic properties, 365–368 H3DAPI, 533 mixed displays, 385–386 Hair follicles, 151 number of display channels, 362 Hand and fingers, position tracking body, 228–230, paradigms, 368 228f–229f passive haptic displays, 384–385 Hand controllers, 53–54, 57, 202, 206f, 219, 230, 230f, portability, 366–367 235–236, 236b, 563, 756, 800, 810. See also ROSDs, 380–384 Daydream (Google) controller; Force Reflecting Hand safety, 367 Controller (FRHC); Game controllers; HTC Vive con- size, 364 troller; Move controller; Nintendo Wii remote; Razer spatial resolution, 363–364 Hydra gaming input device; Reactive Grip controller; tactile haptic displays, 368–374 Xbox controllers tactile/cutaneous cues, 361–362 “Hand in space” symbol, 78 temporal resolution, 364 Hand-based aural displays, 355 3D hardcopy, 385–3865 Hand-based displays, 260, 262, 282–283 throughput, 367 Hand-based VR paradigm, 18, 228 user mobility, 365 Hand-held prop/controller, 206 Haptic illusions, 152–154 Handheld VR display, 338–342 Haptic perception, 148–154, 516 components, 339–340 haptic illusions, 152–154 features, 340 haptic localization perception, 154 interface issues, 341 human physiology for somatosensation, 149–152 Hang-gliding interface, 248, 248f Haptic rendering Haptic accelerator, 530 bladder-based pressure rendering, 518 Haptic asset encodings, 531–532 complex haptic scenes with force displays, 525–527 Haptic bitmaps, 531–532 constraint of movement, 526–527 Haptic cues, 186f 5-DOF from two points of contact, 527 Haptic displays, 210, 343–357, 360–368, 387–388, 806–808 trading force for torque, 527 associability with sense displays, 365–366 force (rendering), 520–521 cost, 367–368 friction-based texture rendering, 519 degrees of freedom, 362 hardware, 530 encumbrance, 367 movement damping-based texture rendering, 519 875 | Subject Index

muscle/joint-based rendering, 515–516 Head-based retro-reflective displays. See Head-based pin-based pressure rendering, 518 (mounted) projective/projection displays shape rendering with ROSD, 524 (HMPD) skin-based rendering, 515–516 Head-based visual displays. See Head-based displays software, 532–533 (HBDs) systems, 515–534 Head-butt zoom, 628 thermal rendering, 519 Head-mounted display (HMD), 13, 30f, 96, 118, 260, 318, Haptic representation in VR, 439–445 320f, 473–474, 541–542, 782 features, 439–441 HMD-based system, 32f haptic information, 441–445 HMD-style VR visual displays, 140–141, 288 Haptic retargeting experiment, 525, 669 Head-mounted projective displays (HMPD), 335 Haptic sensor types, 149–151 Head-referenced 3D sound field, 347 Haption, 119–121, 149–152 Head-related transfer functions (HRTFs), 146, 496–497, 504, Hardware computer rendering market, 800 509 Hardware interface software, 815–816 Head-up display (HUD), 431 Harry Potter and the Forbidden Journey ride, 748–749 Heading, 561b–563b My3D phone viewer (Hasbro), 53, 331–333 Headphones, 344, 347–348, 354–355, 354f HBDs. See Head-based displays (HBDs) Heat map visualization, 418 HBPD. See Head-based projection display (HBPD) Height in visual field, 139 HCI. See Human–computer interaction (HCI) Heilig, Morton, 32, 90, 372, 394 HDR. See High dynamic range (HDR) Held-to-the-head apparatus donning style, 288–289, HDTV. See High-definition TV (HDTV) 331–332 “Head crusher” technique, 597 Helmet, 17 Head position information, 227–228, 281–283, 284f Helmet-mounted style, 288 Head-based (mounted) projective/projection displays Hertz (Hz), 285–286 (HMPD), 17, 45, 48–49, 49f, 55–56, 300, 318, Hexapod motion platform, 389–390, 389f 335–338, 336f Hidden affordance, 112–113 components, 335–336 Hierarchical processing, 121–122 features, 336–337 High dynamic range (HDR), 457 interface issues, 337 High-definition TV (HDTV), 453, 798 Head-based aural displays, 354–355 High-fidelity audio devices, 344 Head-based devices, 297 High-level processing, 134 Head-based displays (HBDs), 17, 17f, 199, 260, 262, 282, High-pass filter, 496 317–318, 317f, 692, 742, 817–818. See also Higher experience fidelity, 797–798 Nonocclusive head-based displays (Nonocclusive Higher-end “VR-ready”/“gaming” machines, 491 HBDs) HITLab. See Human Interfaces Technology Laboratory Head-based projective/projection display (HBPD). See (HITLab) Head-based (mounted) projective/projection Hitting objects, 596 displays (HMPD) HMD. See Head-mounted display (HMD) 876 | Subject Index

HMPD. See Head-based (mounted) projective/projection matching display to needs of human, 111 displays (HMPD) embodiment, 162, 177–183 Hobby-serving tools, 815 human perceptual system, 117–162 Hoffman, Hunter, 385, 164, 185, 385f, 742 presence, 162–169 Hold-in-your hand, 289 Human Interfaces Technology Laboratory (HITLab), 48–49, Hollywood quality, 662–663 185, 331–332 HoloLens (Microsoft), 24, 57, 224–225, 225f, 282, 290, Human perceptual system, 109, 117–162 297, 327, 348–349, 453, 480, 555–556, 556f, 736f, aural localization cues, 145–148 795–797, 811 aural perception, 142–148 Home, VR in, 793–794 cross-sensory effects and virtual reality, 160–162 Homuncular flexibility, 177–178 gustation perception, 159–160 Homunculus, 119, 120f haptic perception, 148–154 House of Fables, The, 605 olfactory perception, 158–159 How Maps Work, 604 sensation, 118–129 How to Lie with Maps, 604 vestibular perception, 154–158 HRTFs. See Head-related transfer functions (HRTFs) visual perception, 129–142 HTC Vive, xix, 57, 217, 275–277, 291f Human performance lighthouse system, 57, 208, 216–217, 217f, 541–542, research on, 751 692–693, 796 within VR, 751–753 chaperone system, 320–321, 321f, 692–694, 693f Human physiology controller, 202, 216–217, 217f, 235–236, 589f for somatosensation, 149–152 Hubble Space Telescope repair application, 650 for vestibulation, 155–156 HUD. See Head-up display (HUD) of vision, 129–134 Human audition system, 142 high-level processing, 134 Human communication media, 61–62, 64t–65t light receptors, 129–130 authorship, 80 luminosity, 132–133 interface to virtual world, 72–74 retina, 130–132 issues, 72–81 temporal sensitivity, 133–134 language, 74–80 Human vestibular sense, 447. See also Vestibular Human comprehension, 409–417 Perception generalization, 410–411 Human–computer interaction (HCI), 111, 220, 539 gestalt approach to perception, 414–417 Humankind, 63, 65 representation of ideas, 413–414 Hysteresis, 558 semiotics, 412 Human hearing, limits of, 144 I Human in VR loop IAs. See Intelligent agents (IAs) connecting human to simulation, 109–117 IBM PCs/PC clones, 485 affordances, 112–114 IBR. See Image-based rendering (IBR) affordances in virtual reality, 114–117 Iconic representations, 250, 406, 408f–409f, 411, 411f, 419 877 | Subject Index

ICT. See Institute for Creative Technologies (ICT) Poggendorf Illusion, 137, 137f Idiom of VR, 76–77 Snake Illusion, 136–137, 136f Idiots’ Guide to Virtual World Design, The, 770 Trompe-l’oeil, 96–98 IF. See Interactive fiction (IF) Zöllner illusion, 134–135, 135f IK. See Inverse kinematics (IK) Illustrated Man, The (Ray Bradbury), 32 ILD. See Interaural level difference (ILD) Image-based depth mapping, 221–222 “IllumiRoom” project (Microsoft), 98. See also “RoomAlive” Image-based rendering (IBR), 470, 700 (Microsoft) Imagination, 8 Illusion, 122 Imagined reality, 11 aural Illusions, 144–145 ImmersaDesk, 45–46, 247–248, 306 McGurk effect, 144 Immersion, 8–12, 166, 659–671 Shepard tone, 145 mental, 10–12, 660–661 tritone paradox, 145 physical/sensory, 10–12, 660 ventriloquism effect, 128, 144, 147, 502–503 role of realism in, 661–671 cross-sensory Illusions, 137, 145, 153 Immersive journalism, 84, 748f long arm illusion, 145, 153 Immersive virtual world, 232 marble hand illusion, 145, 181 Immersive visualization, 740–741 vection, 137, 156–157 Immotive narrative experience, 82–87 haptic Illusions, 152–154 Impedance mismatch, 111 cutaneous rabbit illusion, 153 Implicit “go-go” technique, 543–544 pinocchio illusion, 152 Impressionist painter, 88 phantom limb, 153 Impulse response (IR), 496 presence Illusions IMU. See Inertial measurement unit (IMU) body location illusions, 179 In event-based media, 82 Body Ownership Illusion (BOI), 178–181 In-hand style of interface, 567–568 marble hand illusion, 145, 181 Inclination, 218 False body height illusion, 145, 153 Indexed representations, 406 place illusion (PI), 166–167 Indexed sounds, 438 plausibility illusion (Psi), 166–167 Indiana University, 299f, 700f, 789–790, 789f rubber Hand illusion, 178 Advanced Visualization Lab, 296f, 299f, 302f social presence illusion, 167 The Virtual World Heritage Laboratory (VWHL), 372f, 448f, vestibular Illusions, 137, 156–158 670, 670f caloric nystagmus illusion, 157–158 Indirect tracking, 226–227 oculogravic illusion, 157 Inertial and resistant effects, 524 oculogyral illusion, 157 Inertial measurement unit (IMU), 198–199, 233f vection, 137, 156–157 Inertial navigation systems (INS), 218 visual Illusions, 134–137 Inertial tracking technology, 218–219 cafe wall illusion, 134–135, 135f Infitec 3D stereoscopy, 49, 269–270, 271f, 298 Pinna-Gregory illusion, 134–135, 135f Inflatable bladder technology, 369 878 | Subject Index

Information, experience vs., 88–92 Interactive computer-mediated medium, 68–71 Infrared (IR), 290–291 Interactive fiction (IF), 12–13, 12f, 64t–65t, 73–74, 84–85, Inner ear, 143 84f, 403, 405, 409f, 773 Input technologies, 809–813 Interactive medium, 73 direct neural connections, 812–813 Interactive narrative experience, 82–87 new input devices, 811–812 Interactivity, 12–15, 71, 83–84 participants with virtual world, 193–225 collaborative environments, 14–15 absolute vs. relative inputs, 198–200 computationally simulated realities, 13–14 active vs. passive inputs, 194–196 Interaural level difference (ILD), 143–144, 503 amount/type of information, 196–198 Interaural time difference (ITD), 143–144, 503 continuous vs. discrete inputs, 196 Interface type, 191 input classifications, 201–207, 202f, 206f Internal computer representation, 482–484, 510–511, 531–532 physical vs. virtual inputs, 200–201 Internal perception, 125 unencumbered input, 810–811 International Space Station Tour VR experience, 605 Inputs within VR, 226–256, 238f Interocular distance (IOD), 480 body posture and gesture recognition, 249–252 Interposition, 138 physical input devices, 234–248 Interpupillary distance (IPD), 480 position tracking body, 226–234 Interval data, 423 speech recognition, 252–255 Intramolecular electro-static forces, 439 INRIA, 96, 96f Inverse kinematics (IK), 670–671 INS. See Inertial navigation systems (INS) Invisible interface, 250 Inside-out POV, 674–675 IOD. See Interocular distance (IOD) Inside-out tracking, 793 Iowa State University’s Virtual Reality Applications Center Institute for Creative Technologies (ICT), 48–49, 54–55, (VRAC), 49 331–332, 332f, 472f, 788f, 791 IPD. See Interpupillary distance (IPD) Institutionalized virtual reality, 714–715, 798–799 IQ-station, 296f, 310–311 Instrument guidance, 609 IR. See Impulse response (IR); Infrared (IR) Integrated active-stereoscopic glasses, 43–44 IR-band visual tracking system, 292 Intelligent agents (IAs), 640 Isotrak magnetic tracking system, 40 Intention preserving actions, 543–544 ITC Sense of Presence Inventory (SOPI), 168 Intention surprises, 649 ITD. See Interaural time difference (ITD) Interaction with virtual world, 539 Item selection, 580–591 collaborative interaction, 640–651 design basics, 540–544 J interacting with others, 633–651 Jet aircraft, 23 manipulation, 547–600 JNDs. See Just noticeable differences (JNDs) metacommands, 651–652 Job tasks, 814 navigation, 600–633 Judder phenomenon, 286, 665 user interface metaphors, 544–546 Just noticeable differences (JNDs), 123, 363–364 879 | Subject Index

K Laser image range finder system, 326–327, 327f Kelvin scale, 424 Laserbeam, 558 Kickstarter, 53–56, 54f, 331, 335, 783, 791, 794, 800–801, Latching valuator input, 203 811 Late reflection/reverberation impulse response (LRIR), 501, 515 Kill switch. See Dead man (aka kill) switch Late-reflection stage of sound, 497–498 Kinect (Microsoft Xbox input device), 15f, 51–53, 96, Latency tolerance, 283–285, 350, 364, 399–400 195f, 221, 221f, 232, 255, 314b–316b, 640f, 692, Lateral movement, 615 692f–693f, 696f, 793–794 Laurel, Brenda, 35, 41–42, 79–80, 85–86, 91–92, 249f, Kinesthesia, 148, 259 577–578, 622, 648f, 680f Kinesthetic LBE. See Location-based entertainment (LBE) cues, 361, 441–442 LCD. See Liquid crystal display (LCD) modalities, 119–121 LCOS. See Liquid crystal on silicon (LCOS) receptors, 151 , 55–56, 229–230, 255, 383, 584–585, 701, 701f, sensations, 516 796–797, 811 -style, 289–290, 317–318, 322b–323b, 323f, controller, 226, 229, 229f 678f Hovercast VR interface, 588–589 Kiosk platform, 243, 243f LED Detection and Ranging (LEDDAR), 222 Know your audience, 765 LEEP System. See Large Expanse Enhanced Perspective Kreylos, Oliver, 15f, 94–95, 95f, 570–571, 586f, 696f, System (LEEP System) 715–716 Legend Quest game, 773, 773f Krueger, Myron, 20, 37, 306, 314b–316b, 315f, 673, 673f Lenses, 276 Kuka Robot, 391f Lenticular display, 267–268 Kyma/Pacarana package, 507–509, 508f, 513 Level-of-detail culling (LOD culling), 427, 468 Levitation, 97, 97f L Lidar ranging technology, 220, 581, 702, 793 LidarViewer application, 560–561, 587 Lanier, Jaron, 39, 636f, 783 Light pollution, 292 Lag differential, 329 Light receptors, 129–130 Landmark(s), 605–606 Lightfield display, 140–141, 274–276, 320, 455, 475, 804 landmark-directed selection, 580 “Lighthouse” position-tracking system, 57, 216–217 Language, 74–80 Likert scale, 424 elements of time and space, 77–78 Lincoln Wand, The (Roberts), 34 idiom of VR, 76–77 Link, Edwin, 30–31 representations, 406 Liquid crystal display (LCD), 263–265, 800 self representation, 78–80 Liquid crystal on silicon (LCOS), 263–264 Laparoscopic surgery simulation, 378–379, 439, 442, 442f, “Live air” technique, 670, 768–769 526–527, 527f Live communication with collaborators, 692 Large Expanse Enhanced Perspective System (LEEP System), Live world capture, motivations for, 691–696 37–38 Live-capture of VR experience, 96 Large-format stationary displays, 454 880 | Subject Index

Local coordinates, 200 M Localization, 128, 146, 346–348, 502–505 Magellan input device, 549 Location walkabout, 82 , 792–793, 796–797, 811 Location-based entertainment (LBE), 678–679 Magic Lens Location-based VR systems, 444 AR applications, 228, 324, 338, 355 Lock to surface method, 557 effect, 324, 325f, 341 LOD culling. See Level-of-detail culling (LOD culling) interface, 463, 520f, 598–599 Logarithmic perception, 123–124 “Magic School Bus” children’s television program, 746 Logistic properties Magnetometers, 218 associability with sense displays, 293, 351–352, 365–366 Magnetoreception, 395 of aural displays, 350–353 MakeVR tool, 817 cost, 298–300, 352–353, 367–368 Mandala system, 673, 673f donning the apparatus, 287–290, 295–296 Maneuvering, 601 encumbrance, 295–297, 352, 367 Manipulandum, 374 environment requirements, 292–293, 351, 365 manipulandum/end-effector displays, 516 of haptic displays, 365–368 Manipulation, 546–600, 548f interface with tracking methods, 290–292, 351, 365 methods, 547–553, 614 light pollution, 292 operations, 593–599 noise pollution, 350 properties, 553–574 portability, 293–294, 352, 366–367 selection, 575–593 safety, 297–298, 352, 367–368 Mapping, 417–420 throughput, 294–295, 352, 367 Maps, 604–605 user mobility, 290, 350, 365 “Marble hand illusion”, 145, 181 of visual displays, 286–300 Markers, 437–438 Logitech desktop 6-DOF tracking system, 44f Marketing, 745–746 LOLA. See Lunar Orbit Landing Approach (LOLA) Masking, 277 Long arm illusion, 145, 153 Massachusetts Institute of Technology (MIT), 33–34, 38–39, “Longbow” mini-game, 759 44, 552–553, 606, 606f Look to talk method, 253 Mathematical formulas, or computer algorithms, 688 Low-cost VR systems, 52 Mathematical models, 708–709 Low-frequency speakers, 369–370 Mathematical operation, 560 Low-pass filters, 495 Maturation of VR, 785–791 LRIR. See Late reflection/reverberation impulse response Max/MSP tool, 512 (LRIR) Maya, 687–688 Luckey, Palmer, 54–55 McCLoud, Scott, 62, 74, 405–408, 407f–409f, 413 “Lumen” consumer application, 757–758, 759f McGurk effect, 144 Luminosity, 132–133 McLuhan, Marshall, 75, 80 Lunar Orbit Landing Approach (LOLA), 451 Meandering, 601 881 | Subject Index

Mechanical tracking, 209–210, 210f MEMS-based gyroscope, 53 Mechanoreceptor(s), 119, 121 Microsoft, 371f. See also Cortana (Microsoft); HoloLens information, 148–149 (Microsoft); Kinect (Microsoft Xbox input device); Mechdyne Corporation, 49–53, 819 RoomAlive (Microsoft); Xbox controllers Media of Attraction, 56, 92–93, 677–678, 718, 790–791 (Microsoft) Medical sensor technologies, 233–234, 234f Windows platforms, 485, 510–511 Medicine Meets VR (MMVR), 92 Windows , 796–797 “Meditation Chamber” The, 757–758 Middle ear, 143 Meditative experiences, 757–758 MIDI. See Musical instrument digital interface (MIDI) Medium (VR), 9, 61, 80, 788 MiddleVR, 717 communicating through media, 61–62 Midrange sound frequencies, 142–143 communication, 65–71, 66f–67f, 69f Millennium Falcon, 773–774 musical experiences, 71f Mimesis, 9 content, 63–65 of simulated world, 404–405 issues of human communication media, 72–81 Mimetic world, 404–405 as social actor, 166 Mine, Mark, 547, 557–558, 590–592, 592f, 617f, 623, 729 study of virtual reality medium, 81–98 Mini-golf style game, 759 capturing virtual reality experiences, 94–98 Miniature world selection, 589–591 experience vs. information, 88–92 Minimal Reality Toolkit (MRT), 715 form and genre, 87–88 Minimally invasive surgery simulation. See Laparoscopic interactivity, participation, and multipresence, 83–84 surgery simulation narrative, 82–87 Minimally invasive techniques, 25 narrative flexibility, 84–87 Minnesota Scanner, 40–41, 216–217 virtual reality, 81 Mipmap level, 480 apropos, 727–730 Misrepresentation, 403 media of attraction, 92–93 Mission planning and reconnoiter for public safety and virtual worlds suitability for particular media, 65 military operations, 739 Meissner’s corpuscles, 150 Mixed displays, 359, 385–386 Memorable placenames, 606–607 Mixed reality (MR), 19 MEMS. See Microelectromechanical systems (MEMS) Mixed Reality Videos and Trompe-l’oeil, 96–98, 99f Mental immersion, 10–12, 660–661 MMORPGs, 83 Mental model creation, 602–603 MMVR. See Medicine Meets VR (MMVR) Menu, 77 Mobile computing technology, 290 menu-like virtual control, 592 Mobile platform, 341 menu-select technique, 586–589 Mobile-based AR, 792–793 Merkel’s disks, 150 Moby Dick, 163, 727–728, 734–735 Mesh, polygonal, 225f, 469, 482–483, 514f, 700 MoCap system. See system (MoCap system) Metacommands, 651–652 Modal analysis, 500–501 Microelectromechanical systems (MEMS), 218–219, 331 Modality, 121 882 | Subject Index

Modeling, 756 MTV-style, 76 package, 687–688 MUDs. See Multiuser dimensions/dungeons (MUDs) programs, 483 Multimodal sensory input, 425 Modern game engines, 814–815 Multiparticipant environment, 14 Modern GPU systems, 463 Multipass rendering, 470–471 Modern rule-of-thumb frame rate, 665 Multiperson Modern shaders, 468 cockpit, 637 Modern VR systems, 453 interactive graphics, 35 Modern VR-ready game engines, 817–818 Multiple media, 73 Momentary valuator input, 203 Multiple plane haptic models, 522–523 Monitor-based VR, 301 Multiple points of haptic contact, 527 Monochromatic displays, 264 Multiple projectors, 309 Monoscopic image depth cues, 138–139 Multiple-screen displays, 309 Mood-setting ambient sounds, 436–437 Multiple-view texture mapping, 472 Moon effect, 283 Multiplexed rendering, 476–477 Motion, 429 Multiplexing, 266–267, 271–273, 477f–478f bases (platforms), 389–390, 389f–390f color, 271f depth cues, 139–140 for multiple users, 271, 642–643 parallax, 227 polarization, 268–269 Motion Analysis, 38–39 spatial, 267–268 Motion capture system (MoCap system), 37–38, 214 spectral, 269–270, 272 Motion picture medium, 91, 32, 75–76, 83, 91, 133–134, temporal, 268, 272 168, 285–286, 403, 437f, 672, 674, 682–683, 788, Multipresence environment, 14, 83–84 790–791 Multiresolution shading, 474 Motion sickness, 157, 475, 626, 668, 753. See also Multispring model, 523 Sim-sickness Multiuser dimensions/dungeons (MUDs), 22 Motion to photon, 283–285, 427, 475, 665, 798 Multiwall stationary display, 285 MotionStar wireless magnetic tracking system, 46–47 Muscle spindles, 151 Motivations for live world capture, 691–696 Muscle/joint-based rendering, 515–516 Motor homunculus, 119 Muscular/neural tracking, 222–223 Move Controller (from PlayStation), 236b MUSE system, 438 Move from reference, 623–624 Music tool, 757 Move-the-World, 626–627 Musical instrument digital interface (MIDI), 509, 511 Movement damping-based texture rendering, 519 Musical instruments, 696 Movement formula, 573–574, 616–618 Musical sounds, 493–494 Movie makers, 449 Mutual understanding, 91–92 Moving platforms, 391–392 MxR Lab (ICT/USC), 48–49, 54–55, 331–332, 332f, 472f MR. See Mixed reality (MR) “My3D” device, 53 MRT. See Minimal Reality Toolkit (MRT) Myo arm-band, 223f 883 | Subject Index

N system devices, 40 N-ary inputs, 204–205 travel, 612–631 Nalco Fuel Tech BoilerMaker application, 583. See Wayfinding, 601–611 BoilerMaker in Index of Media Experiences Navy Research Lab (NRL), 729–730, 730f Name-to-select, 585–586 NCSA. See National Center for Supercomputing Name-to-talk, 253 Applications; University of Illinois Narrative, 82–87 Neural mapping, 177–178 elaboration, 603 Neuromancer (novel), 21–22, 39 experiences, 799 Neurons, 122 flexibility, 84–87 Neutral File Format (NFF), 483 NASA, 222f, 362f New Technology Hype Cycle, 782 Ames Research Center (ARC), 21, 783 New York Stock Exchange, 420 Astronaut EVA trainer, 549, 550f New York Times, 57, 748f Houston VR Lab, 650, 650f Newtonian physics, 704–705, 705f Hubble Space Telescope Repair mission, 650–651, 650f Newtons/meter (Nt/m), 363 Johnson Space Center, 650f NewtonWorld application, 369–370, 597, 705, 705f Lunar Orbit Landing Approach (LOLA), 451 NexCAVE VR display system, 18, 307, 308f VIEW lab, 39–42 NFF. See Neutral File Format (NFF) Virtual Windtunnel, 594–595, 595f, 756 NFT. See Natural feature tracking (NFT) National Aeronautics and Space Administration. See NASA NICE VR experience, 86 National Center for Supercomputing Applications (NCSA) NIH. See National Institutes for Health (NIH) BattleView military visualization, 610 “NinjaRun”, 624 BayWalk collaborative visualization application, Nintendo 3DS game system, 267–268 647, 648f Nintendo 64 Rumble Pack, 44 Crumbs tool, 756 Nintendo home video game system, 40–41 Virtual Director entertainment production application, Nintendo Wii remote, 51–52, 195f, 216, 236, 236b, 343–344, 585–586, 624, 652, 729, 765 355, 445f, 796, 800, 810–811 National Institutes for Health (NIH), 533 Nociceptors, 121, 150 Natural feature tracking (NFT), 215–216, 810 Noise pollution, 350 Nausea, 123, 156, 159–160, 173, 176, 187–188, 208, 270, 274, Nominal data, 423 283–286, 298, 388–389, 399–400, 447, 570, 665, Non-internet examples, 22 753, 775, 795, 803, 805. See also Motion ­sickness; Non-VR situations, 15 Sim-sickness Nonaffine operation, 486–487 Naval Postgraduate School (NPS), 378, 633f Noncontact displays, 440 Navigation(al), 546, 600–633 Nondiegetic, 431, 567. See also Diegetic awareness, 601 sounds, 437, 437f controller, 236b Nonfiction narrative, 82 maps, 417–418 Nongaming tool, 815 reset function, 627–628 Nongeometric rendering systems, 459–461 884 | Subject Index

Nonocclusive head-based displays (Nonocclusive HBDs), object-based rendering, 455–457 324–330 transference of object permanence, 184–186 components, 325–327 Occlusive head-based virtual reality displays, 318, 323–324 features, 327–328 components, 318–319 interface issues, 328–330 features, 319–321 Nontechnological “displays”, 444 interface issues, 321–323 Nonuniform rational B-splines (NURBS), 455, 457–458 Oceanarium scale, 301 Normal objects (aka artifacts), 684 Ocular system, 133 Notation, 89–90 Oculogravic illusion, 157 Novel(s), 61–62, 73 Oculogyral illusion, 157 Novoview, flight simulator image-generator, 35–36 Oculus VR, 790. See also GearVR phone viewer (Samsung/ NPS. See Naval Postgraduate School (NPS) Oculus VR) NRL. See Navy Research Lab (NRL) Developer Kit 1 (DK-1), 54–55, 54f, 331 Numeric computation methods, 497–498 Developer Kit 2 (DK-2), 54–56, 153f, 231f NURBS. See Nonuniform rational B-splines (NURBS) CV-1, 57, 290–291, 317f, 791 nVidia Corporation, 231f Oculus SDK, 815 nVidia GameWorksVR team, 474 controller, 57, 193, 202, 202f, 230, 235–236, nVidia GeForce 3 ushers, 49 618, 810 nVidia PhysX, 705 Oculus VR CV-1, 791 nVidia VCA, 454 ODE. See Open Dynamics Engine (ODE) nVidia VRworks, 491 ODG display. See Osterhout Design Group display nVidia VRworks-Audio, 512 (ODG display) Nyquist’s sampling theorem, 142 Odorants, 394 ODT. See Omni directional treadmills (ODT) O Odyssey system, 35 Obama, Barack, 680f Off-board sound rendering systems, 513 Object(s), 414 Officer of the Deck application, 606, 614 attribute modification, 597–598 Offline world capture, 697 behavior, 685–687 OLED. See Organic light-emitting diode (OLED) dynamic, 686 Olfaction, 119–121, 394–395, 395f, 447 rigged, 686 Olfactory displays, 808 static, 686 Olfactory perception, 158–159 trigger, 687 Omni (magazine), 165 cointeraction, 709 Omni directional treadmills (ODT), 244, 245f identification, 128 One immersed participant with onlookers, 637 modeling and world layout, 685–690 Opacity, 276–277 object forms, 687–689 Open display, 637 models, 483 Open Dynamics Engine (ODE), 705, 718 object-based rasterization rendering pipeline, 464–465 Open-ear headphones, 354 885 | Subject Index

OpenGL, 464, 485–486, 716 ParaView visualization tool, 717, 756 interception, 717 PARIS. See Personal Augmented Reality Immersive System offshoots, 464, 485. See also GLES; WebGL (PARIS) OpenHaptics library, 532, 716 Participant’s sensory system, 425 OpenSceneGraph, 714–716, 718 Participants with virtual world OpenVR, 815 input technologies, 193–225 “Operation Lindbergh”, 49–50 inputs within virtual reality system, 226–256 Operation planning and reconnoiter for science, 740 position-tracking technologies, 207–225 Optical tracking systems, 213–216 Participation, 83–84 Optics (delivery technology), 275–276 Particle based rendering, 460–461, 460f Optimistic strategy of world congruity, 651 Passive haptics, 116, 363, 696 OptiTrack package, 51–52 block, 669, 669f Optiverse, 609 displays, 359, 384–385, 385f Orbital viewing, 629 objects, 247 Ordinal data, 423 representations, 444–445 Oregon Trail, The (games), 12 technique, 185 Organic light-emitting diode (OLED), 263–264 Passive input, 193–196 Orienting, 128 Path following, 604 Origin, 559–560 Pausch, Randy(cite)(cite)(cite)(cite), 52, 77, 347, 471–472, osgHaptics, 533 505, 605, 621, 758–759 Osmose VR application, 757–758, 772–773, 772f pd. See PureData (pd) Osterhout Design Group display (ODG display), 288, 289f PDA. See Personal Digital Assistant (PDA) Otoconia, 155–156 PenguFly, 251, 251f, 578, 624 Otolith organs, 155–156 Penguin trainers, 30–31 Outer ear, 142–143 Penrose Studio’s “Allumette”, 733–734, 734f Outside-in POV, 674–675 Pepper, John, 30 Oxford Medical Systems, 37–38 Pepper’s Ghost, 30 Perceived representations, 406–408 P Perceptible affordance, 112–113 P-reps. See Physical representation (P-reps) Perception, 162–163 Pac Man videogame, 735, 759 of change, 124–127 Pacarana hardware, 507–509, 513 of form, 414–415 Pacinian corpuscles, 150–151 Perceptual elements, 170–172 Pain (journal), 164 Perceptual processing, 122 Painting, medium of, 5, 5f, 63, 63f, 66f, 67–68, 76, 568f, Personal Augmented Reality Immersive System (PARIS), 754–755, 755f 365–366 Palm-VR display. See Handheld VR display Personal Digital Assistant (PDA), 50 Panoramic painting, 29f “Perspective projection” transformation, 486–487 Pape, Dave, 83f, 435f, 596f, 623, 643f, 670, 671f Pessimistic strategy of world congruity, 651 886 | Subject Index

Phantom device, 44, 364 Newtonian physics, 704–705 Phantom Premium, 363–364, 375–377, 375f object cointeraction, 709 Phantom limbs, 153 of real and virtual worlds, 711–712 “Pharaoh’s Tomb”, 239f simulation/mathematical model, 708–709 Phasic nerve. See Rapidly adapting nerve (RA nerve) simulation software, 705 Philips “WOWvx” 9-zone auto-stereo display, 267–268, 803 static world, 703 Phong-interpolation, 465 world persistence, 709–711 Photogenia, 414 world physics, 707 Photogrammetry, 483, 700. See also Structure from Motion Physiological depth cues, 140 (SfM) PI. See Place illusion (PI) Photographic Message (Barthes), The, 414 Pie menus (RWTH), 588–589, 588f Photopic vision, 133–134 Pilot-Through, 624–626 “Photoportals” project, 590f, 692, 692f Pin actuator technology, 370, 371f Photoreceptors, 119, 121 Pin-based pressure rendering, 518 Physical controls, 235–236, 549, 614 Pin-based texture rendering, 518 Physical emulation of sound, 434 Pinch Gloves, 202, 202f, 588–589, 756 Physical immersion, 9–12, 193, 660, 664f Pinching, 527 Physical input, 200–201 “Ping pong” game (Cinematrix experience), 314b–316b of alphanumeric information, 591 Pinna. See Outer ear devices, 234–248 “Pinna–Gregory Illusion”, 134–135, 135f physical controls, 235–236 “Pinocchio illusion”, 152 platforms, 241–248 PIT. See Protein Interactive Theater (PIT) props, 236–240 Pitch, 561b–563b Physical locomotion, 619–620 Pixel, 263–264. See also Ray-tracing Physical object rendering, 525 pixel-based rendering, 455–457 Physical platform interfaces, 625 Pixel Planes technology, 40–41 Physical reality, 11, 19, 116 Place illusion (PI), 166–167 Physical representation, 68, 312–313, 386f Placeholder, 79, 85–86, 91–92, 577–578, 582 P-reps, 413 application, 647 Physical space, 65 voiceholder annotation tools, 635 Physical stair-stepper device, 614 Plasma displays, 265 Physical synthesis models, 494–495 Plasticity, 122–123 Physical tracking, 51 Plateau of productivity, 785 Physical/sensory immersion, 660 Platforms, 241–248, 241f–242f, 244f Physics, virtual world, 702–712 Plausibility illusion (Psi), 166–167 Aristotelian physics, 704–705 “Play Store”, 791 cartoon physics, 703–704 Plenoptic displays, 274–275, 804 choreographed physics, 706–707 Pneumatic pressure systems, 375–376 extent of world physics, 707–708 Pocket projector technology, 336 887 | Subject Index

Poggendorff illusion, 137, 137f improving position tracking, 223–225 Point and plane models, 522–523 inertial tracking and microelectromechanical system Point of view (POV), 8–9, 82, 659, 671–675 technologies, 218–219 first-person, 672 mechanical tracking, 209–210 inside-out vs. outside-in, 674–675 muscular/neural tracking, 222–223 second-person, 673–674 optical tracking, 213–214 third-person, 674 section, 281 Point to point model, 523 tracking through range-finding technologies, 219–222 Point-based rendering, 460–461 ultrasonic tracking, 210–211 Point-cloud rendering, 460–461, 461f videometric tracking, 215–216 Point-to-select interaction, 328, 582 Position(ing) Pointer beam scope, 558 inputs, 203–204 Pointer-directed fly-through, 622 objects, 594–595 Pointer-directed selection, 575–576 sensors, 198f, 207 Polarization multiplexing, 268–269 tracker, 198 Polarized glasses, 309–310 Post-traumatic stress disorder (PTSD), 164, 742–743 Polhemus, 37–38, 40, 212f Postprocessors, 509 Polygon(al), 458, 467 Posture. See Body posture and gesture recognition count reduction, 466–469 POV. See Point of view (POV) decimation, 469 Powerglove, 40–41 mesh, 469, 482 Precedence effect, 147 method, 457 Predictions of the future and the past, 801–813 Pompeii application, 772–773 application futures, 819–820 Pong video game, 639b display technologies, 803–809 Portability, 293–294, 352, 366–367 input technologies, 809–813 dual-screen fishtank style VR display, 296f software, 813–819 mock up of portable CAVE-style system, 296f Predictive analysis, 223–224 “Portal” game, 400 Presence, 10, 162–169, 170f Pose, 414 breaking, 186–188 Position tracking, 12, 207–225 deconstructing, 165–168 beam-scan tracking, 216–217 determinants and responses of, 169–174 body, 226–234 behavioral responses, 172–174 biological and medical sensor technologies, 233–234 perceptual and visceral elements, 170–172 eyes, 231, 231f evolving ideas regarding, 164–165 feet, 232–233 measuring, 174–177 hand and fingers, 228–230, 228f–229f in other media, 168–169 head, 227–228 uses of studying presence, 163–164 Torso, 231–232 Presentation/interaction styles, 88 electromagnetic tracking, 211–213 Primary hand, 563 888 | Subject Index

Procedure planning and reconnoiter for medicine, 741–742 Push-to-talk method, 253, 253f Procedure visualization, 742 “Put-Me-There” method, 627–628 Product showcase, 745 Pygmalion’s Spectacles (Weinbaum), 31, 31f, 33, 738 Profitability, 726 Programmable graphics system, 37 Q Programmable shaders, 461 Quadcopters, 27f Programmable sound processor, 507–509 Quake-II engine, 758–759 Progressive expansion, 602–603 Qualcomm Institute, 307, 308f–309f. See also California Projection mapping, 97–98 Institute for Telecommunications and Information Projection VR. See Stationary VR Technology (CALIT2) Projection-based AR/VR environment, 48–49 NEXcave, 18, 307, 308f Projection-based stationary systems, 320–321 SunCAVE, 307, 309f Projection-based VR, 305 WAVE (Wide-Angle Virtual Environment), 307, 308f Propagation, 501–502 Qualia, 175 Proprioceptic box, 251. See also Body referential zones Qualitative representations, 420–424, 422f Proprioceptic cues, 361 Quantitative representations, 420–424 Proprioception, 119–121, 148, 154, 259 QuickHaptics API, 532 Proprioceptive drift, 128 Quintessential nonperpetual experiences, 771–772 Proprioceptor, 150 Props, physical input devices, 236–240 R Props for Neurosurgical Visualization, 240 RA nerve. See Rapidly adapting nerve (RA nerve) Protein Interactive Theater (PIT), 272 Radar, 220 Prototype Radar-based Soli project, 811 low-friction walking interface, 244f Radio communications (RF communications), 295–296 tracking system, 215f Radiological Immersive Survey Trainer application (DRI), 645 ProVision VR, 40–41 RAIR. See Responding-as-if-real (RAIR) Proximity, 440 RAM, 6 PRT. See Pulse ranging technology (PRT) Ranging technologies, 222, 701 Psi. See Plausibility illusion (Psi) Rapidly adapting nerve (RA nerve), 150 Psychological factors of presence, 171 Rapture Vest, 361, 361f Psychology research groups, 751 Raspberry-Pi, 482 PTSD. See Post-traumatic stress disorder (PTSD) Rasterization, 456–457, 461–462 Public venue VR experiences, 41–42, 46–47, 47f–48f, 186, rendering pipeline, 464–465 288–289, 294–295, 314b–316b, 323f, 361, 384–385, Ratcheting, 557 637f, 675, 677–678, 680f, 726, 772, 776 Ratio measurement, 423 Pulse ranging technology (PRT), 222 RAVE systems, 41–42 PureData (pd), 512 Ray-casting techniques, 460, 462–463, 462f Purpose-built room platform, 247 Ray-tracers, 458 PUSH display, 322b–323b, 323f Ray-tracing, 456–457, 460, 463f 889 | Subject Index

Razer Hydra gaming input device, 53–54, 195f, 230f, 563, Regionally targeted rendering, 473–474 756 Registration, 24 RB-2. See Reality Built for 2 (RB-2) Reinforcing perceived affordances, 116–117 Reactive Grip controller, 811 Relative input, 198–200 Ready-to-go tools, 795 Remotely operated robot, 27f Real , 237–239 Rendered at zero parallax, 140–141 Real world, 696, 697f Rendering. See also Haptic rendering; Sonic rendering; interface metaphors, 544–545, 545f Visual rendering as part of virtual world, 690–702 of other senses, 534–535 motivations for live world capture, 691–696 olfaction and gustation, 535 motivations for offline world capture, 697 software, 816 world capture technologies, 697–702 systems, 449–454 physics, 711–712 hardware, 451–454 sounds, 349, 492, 492f scene, 453–454 Real-time vestibulation, 534–535 computer graphics, 800 Representation of virtual world, 400–426 computing of direct path, 501 of computer game, 400f real-time-rendered holography, 804 human comprehension, 409–417 rendering, 399, 449, 455–456 issues, 424–426 Realism, 166 mapping, 417–420 components of immersion, 663–665, 664f of other senses, 446–447 fooling user, 667–670 quantitative and qualitative, 420–424 levels of immersion, 665–666 summary, 448–449 redirected walking, 667–668 verisimilitude, 404–409 role in immersion, 661–671 in VR, 427–432 Realism Axis, 406–409, 440–441 Representative behavior of presence, 173 Realistic sounds, 434 Resource budget, 424 Reality, 6 Responding-as-if-real (RAIR), 167 Reality Built for 2 (RB-2), 40–41 Responsive Workbench, 45–46, 306, 570 “Reality Vest 64”, 361 Reticle-directed selection, 576–577 Rec Room experience, 750 Retro-reflective screen, 335 Received information, 406–408 “Return-to-zero” property, 204–205 Received representations, 406–408 Reverb effects, 347, 496 Recording audio, 94 Reverberation, 496 Redirected touching technique, 154, 667 RF communications. See Radio communications Redirected walking, 118, 667–668 (RF communications) change blindness redirection, 669–670 Ride along, 620–621 redirected touching, 668–669 Rigged objects, 686 Refractive optics, 456 Rigid bodies or (aka constellations), 214, 500, 705 890 | Subject Index

Ring platform, 242–243 Sarcos Dextrous Arm Master, 516, 517f River metaphor, 621 Sarcos Uniport system, 375, 376f, 378 Roberts, Larry, 34 Saturday Night Live (SNL), 45, 734–735 Robotic graphics, 380 Sayre Glove, 37 Robotically operated shape displays (ROSDs), 359, Scale operation, 486–487 380–384, 439, 516 Scale-the-World, 627 components, 382 SCAPE project, 45, 48–49, 49f features, 382–383 Scene-graph, 483–486, 484f, 716 interface issues, 383–384 “Scent palette”, 394 Roll, 561b–563b Scents, 808 Roller-coaster vestibular displays, 391–392 Schell Games, 404f, 543–544, 584f, 762, 801 Rome Reborn VR application, 647 Schell, Jesse, 584f, 801. See also “Schell Games” virtual map in, 607 Science, 123–124 Room acoustics, 497 Science Fiction, 31–33, 43–44, 87, 222–223, 255, 738 “RoomAlive” (Microsoft), 98, 221, 221f ScienceSpace, 563–564 Rope (motion picture), 76 Scientific visualization, 756 ROSDs. See Robotically operated shape displays (ROSDs) Scotopic vision, 133–134 Rotation(al), 561b–563b Scrollbar, 604 gain, 125 SDL. See Simple DirectMedia Layer (SDL) operation, 486–487 Second Life, 20–21, 50 vection, 156–157 Second person projected realities, 314b–316b Rubber hand illusion, 128, 178 Second-person POV, 673–674 Ruffini corpuscles, 151 See-through effect, 324–330 Rutgers Dextrous Master, 377f “Seeing’s believing, but feeling’s the truth”, 439–440 RWTH Aachen University, 588–589, 588f Selectivity, 121–122 Self representation, 78–80 S Self-grounded force displays, 526 SA nerve. See Slowly Adapting nerve (SA nerve) Self-grounded systems, 360 Safety, 297–298, 352, 367, 383–384 Self-perception within virtual world, 162–188 Sample/sampling, 492–493 Semicircular canal fluids, 156 array, 492 Semicircular canals, 155 rate, 124 Semiotics, 412 rate, 510 SensAble Technologies (now 3D Systems), 44, 45f, 210, sounds, 434–435 531–532, 716, 756 Samsung GearVR. See GearVR phone viewer (Samsung/ Sensation, 118–129 Oculus VR) cross-modal perception, 127–129 Sandbox application, 430f hierarchical processing and selectivity, 121–122 Sandin, Dan, 43–44, 49–50, 267–268, 803 measuring perception/sensation, 123–124 Sapporo “Virtual Brewery”, 677–678 Penfield’s sensory homunculus, 120f 891 | Subject Index

perception of change, 124–127 designer, 88 plasticity and adaptability, 122–123 language, 250 Sense displays, associability with, 293, 351–352, 365–366 Sila Sveta (dance company), 97 Sense of agency, 11, 10–11, 768–769 Silicon Graphics, Inc. (SGI), 485, 717, 800 Sense of presence, 10, 84, 149, 162–163, 168, 170, 174, 185, Sim-sickness, 805 385f, 690, 696, 768–769 Simple body gestures, 641 Senses, 118–129, 393f, 394–395 Simple DirectMedia Layer (SDL), 512 display, 32, 90, 372, 394 Simple thumb-finger gesture, 555–556, 556f Sensorial precedence, 160 Simple vibration generators, 443 Sensory carryover, 184 Simplified shape rendering models with forces, 521–523 Sensory displays, 260, 388–395, 664–665 Simulated sounds, 435–436 olfaction, gustation, and senses, 394–395 Simulation/mathematical model, 708–709 Sensory feedback, 11 Simulator sickness, 219, 319, 389, 629, 665, 753. Sensory homunculus, 119 See also Motion sickness; Simulator Sickness Sensory substitution, 117, 369, 370f, 373–374, 395, Questionnaire (SSQ) 425–426, 443–444, 446–447, 556, 581, 603 Simulator sickness questionnaire (SSQ), 775. See also events, 437 Sim-sickness haptic representation, 439 Simultaneous localization and mapping (SLAM), 216, 277 SfM. See Structure from motion (SfM) tracking technique, 224–225, 225f, 793, 796–797, 805, SGISilicon Graphics, Inc. (SGI) 810, 812 Shader Sine waves, 498 algorithms, 465 Single point of contact, 520–521 code, 457 Single VR application, 645 Shading, 138, 465–466 Single-bitmap-textured polygon, 472–473 Shadow(s), 138, 141–142 Single-player games, 759 maps, 457 Single-screen fishtank, 313 “ShadowLight”, 756 Single-source video-tracking method, 213 Shape rendering with robotically operated shape display, Siri (Apple Computer), 252–254, 553 524 Situational awareness, 601 Shared experience, 634–640 Situational training, 739 aspects of experience, 635–636 Sizing objects, 594–595 ways, 636–638 Sketchfab, 818 “Shrink-wrapped” tools, 815 Sketchpad application, 33 Shutter glasses, 309–310 Sketchpad-III, 33 Side-bar, 236b Skin SIGGRAPH Computer Graphics Conference, 42–47, 43f, 46f, activator array, 808 49, 306, 639b, 757–758, 767, 769 colors, 182–183 Sight, 428 pressure, 518 Sign(s), 412 shear, 444 892 | Subject Index

Skin (Continued) Social media, 80 skin-based rendering, 515–516 Social presence illusion, 167 stimuli, 516 Social richness, 166 Skype, 50 Software, 813–819 Skyscraper, 216 application development software, 816–819 SLAM. See Simultaneous localization and mapping (SLAM) applications. See Experience Index Slater, Mel, 165–167, 176, 662, 665–666 availability, 798–799 Slater–Usoh–Steed presence questionnaire (SUS presence game engines, 717–718 questionnaire), 165 hardware interface, 815–816 Slowly adapting nerve (SA nerve), 150 integration, 714–717 Smart Helmet. See Daqri—Smart Helmet to manifesting VR experience, 713–719 Smart Glasses. See Daqri—Smart Glasses rendering, 455, 816. See also Haptic rendering; software; Smart Scene, 756 Sonic rendering; software; Visual rendering; software Smart-tablet, 18 systems, 481, 485 Smartphone, 18, 551, 804–805 VR integration software, 714–717 smartphone-based VR display, 292–294, 297, 323 web-delivered virtual reality, 718–719 smartphone-VR head-based displays, 331–335 world creation, 814–815 advent of smartphones with inertial tracking-enabled Solar System Modeler (SSM), 572 devices, 332f Soli project (Google), 220, 220f, 229, 584–585, 811 components, 332–333 Somatosensation, 119–121, 148 features, 333–334 haptic sensor types, 149–151 interface issues, 334 haptic spatial and temporal resolution, 151–152 Wheatstone stereoscope mockup, 331f. See also human physiology for, 149–152 Cardboard phone viewer (Google); Daydream Sonic asset encodings, 510–511 (Google); FOV2GO phone viewer (MxR Lab/USC/ Sonic pressure waves, 142–144 ICT); GearVR phone viewer (Samsung); My3D phone Sonic rendering viewer (Hasbro) complex sounds, 498–505 SMARTTRACK camera-based position tracker, 53–54 linking virtual to real, 514–515 Smell. See Olfaction methods, 148, 489–515 Smell, primary dimensions of, 158 off-board sound rendering systems, 513 Snake Charmer, 381–382 software, 511–514 Snake illusion, 136–137, 136f special-purpose hardware for sound, 505–510 Snap to each other method, 557 Sonification, 38–39, 436 Snap to grid method for translation and/or rotation, 557 events, 437 SNL. See Saturday Night Live (SNL) Sonography, 702 “Snow Crash” (Stephenson), 43–44, 78–79, 738 Sony, 96, 97f. See also Sony PlayStation Social actor within medium, 166 Sony PlayStation, 57, 219, 236b, 759, 791 Social hangout, 749–750 SOPI. See ITC Sense of Presence Inventory (SOPI) Social interactions, 769 “Sorcerer’s Apprentice” system, 36f 893 | Subject Index

Sound(s), 142, 345, 350, 432, 662–663, 690 Static monoscopic image depth cues, 141 features, 432–433 Static objects, 686 filtering, 495–497 Static world, 703 generation, 491–497 Stationary aural displays, 353–354 postprocessors, 506 Stationary displays, 260, 262, 300–301, 343–344, 629 propagation algorithms, 490, 497–498 Stationary screen systems, 96 spatialization, 490 Stationary VR, 43–44, 199, 301 stage, 345–346, 345f, 757 display platform, 247, 306 synthesis, 490 paradigm, 17–18 synthesizer, 506–507, 507f systems, 551 in VR experience, 434–438 visual displays, 227–228 Space and planetary exploration/mapping, 756–757 Statistical methods, 420, 421f Space Navigator input device, 198, 203 Stepping into a picture, 628 Space traversal, 78 Stereo overlap FOV, 278 SpaceBall input device, 203, 549 StereoHaptics team, 444, 798 Spatial acceleration data structures, 461–462 Stereolithography, 386–387, 386f Spatial characteristic cues, 147 Stereophonic headphones, 348 Spatial multiplexing, 267–268 Stereopsis, 139, 141, 266–268, 267f, 320, 663 Spatial presence, 166–167 Stereoscopic Spatial resolution, 264–265, 265f, 363–364 camera, 166, 19, 479–480, 805 Spatialization, 146, 346–347, 502–505 display for multiple people, 311 Speakers, 348, 353–354 display hardware, 30, 32, 33f, 34, 35f, 49–53, 55–56, Specialized sound renderers, 506 269–272, 278f, 303, 318, 803 Spectral additive and subtractive techniques, 499 glasses, 269f, 166, 43–44, 44f, 49, 268–269, 288, Spectral frequency modulation, 499 291–292, 291f, 298, 301, 309–310, 570f Spectral multiplexing, or anaglyphic stereo displays, 269–270 image depth cue, 139 Spectral sound-generation methods, 493–494 rendering, 282, 469–470, 472–473, 480, 482 Spectral synthesis methods, 493–494 textures, 472–473 Spectrum, 689 viewer, 30, 268f, 280, 288–290, 331–332, 331f, Speech recognition, 252–255, 254f 678f, 791 Speed, 617 Stimulus, 121 Spherical coordinate system, 559–560 Stimulus–response pathway, 169 “Spider & Web”, interactive fiction experience, 403 STK. See Synthesis ToolKit (STK) Spotlight, 558 Storage media, 62, 68 Spring and dashpot model, 522 Streakline release points, 594–595 SSQ. See Simulator sickness questionnaire (SSQ). See also Stream processing units, 481–482 Sim-sickness Stream-processors, 509–510 Standard computer interface, 111–112 Strickland, Rachel, 79, 85–86, 249f Stanford University, 38, 41–42, 533, 716 Structure from Motion (SfM), 222, 700, 818 StarVR, 319–320 Structured light depth mapping, 220–221 894 | Subject Index

Structured light-tracking methods, 221 Tactile, 119–121 Studierstube Tracker library, 338 devices, 516 STYLY tool, 814–815 displays, 357, 359 Subtractive sound creation techniques, 499 feedback array, 518 Subwoofers, 369–370 haptic displays, 368–374 Sun Microsystems, 43–44, 306, 754 components, 368–372, 370f SunCAVE, 307, 309f features, 373 Super Cockpit (Furness), 38 interface issues, 373–374 Supernumerary limbs, 178 perception, 443 Surface texture, 518–519 representations, 443–444 gradient, 138–139 sensations, 383, 805 Surface-based methods, 459 tactile/cutaneous cues, 361–362 Surgery, 14, 16f, 49–50, 200, 290, 358, 378–379, 379f, 394. Taction, 148–149 See also Laparoscopic surgery Tactors, 808, 153, 153f, 361, 367–369, 373, 380, 426, 517, 530 Surrogate agent control, 651 Tango project (Google), 327, 796–797 Surround virtual reality, 310–312, 317 Taste aversion, 159–160 components, 307–310 Taste system, 159 display, 305–307 Technical factors of presence, 170 interface issues, 312–316, 313f Technologists role in VR, 91–92 Surround-screen virtual reality, 98 Technology, 90 SUS presence questionnaire. See Slater–Usoh–Steed pres- interface, 117 ence questionnaire (SUS presence questionnaire) technology-based media, 719–720 Suspension of disbelief, 9–10, 172, 184–185, 404, 661, 670 VR technology trigger, 782–783 Sutherland, Ivan, 33–36, 35f–36f, 306, 328, 782, 799–800 Technology Affordances (Gaver), 112–113 Swedish Royal Institute of Technology (KTH), 47–48 TechViz XL tool, 819 “Sword of Damocles, The”, 209, 209f “Teddy” modeling package, 687–688, 688f Symbolic representations, 406 “Tele-hop” travel interface, 541–542, 541f Symbolic Sound Kyma/Pacarana system, 507–509 Tele-medicine, 33, 33f, 49–50, 243f Symbols, 411–412, 413f Teleoperation, 25, 559 Synchronous communication, 643–645 Telephone, 21–22 Syntax, 414 Telepresence, 18–26, 26f, 165, 559 Synthesis ToolKit (STK), 512, 716 operation, 327–328 Synthesized sounds, 493–495, 494f, 507 system, 32f Synthesizers. See Specialized sound renderers Teletact Glove, 369 System calibration, 224 Teleyeglasses, 33, 33f Temperature T actuators, 371–372 Tablet, 551 display, 443 “Tabletop Simulator” application, 750, 801–802 Temple Presence Inventory (TPI), 168 895 | Subject Index

Temporal multiplexing, 268 Thunderstorm, 402, 403f Temporal resolution, 285–286, 364 Tilt Brush by Google, 567–568, 568f, 775–776, 799 Temporal sensitivity, 133–134 Time bar, 630–631 Temporary scene reduction, 480 Time warping, 477–479 Terrain following, 615 Time-of-flight systems (TOF systems), 701 Text inputs, 205 Titans of Space, 756–757 Text-based media, 73 Toirt Samhlaigh visualization application, 83f, 296f, 552f, Texture mapping, 467–468, 467f 570f, 619, 652, 736 Texture maps, 461 Tonic nerve. See Slowly adapting nerve (SA nerve) The Amazing Adventures of Spider-Man, 748–749 Torso “The Lab” experience, 426, 427f position tracking body, 231–232 The VOID. See VOID, The torso-directed selection, 577–578 Thermal rendering, 519 torso-directed walk-or fly-through, 622 Thermoreceptors, 121 Towrope, 621 Thiébaux, Marcus, 91–92, 585–586, 624, 729, 756–757 TPI. See Temple Presence Inventory (TPI) Thing Growing, The (Anstey), 84–85 Tracked handheld devices, 551, 552f Third-person POV, 674 Tracked viewer, 311 Three dimension (3D) Tracking chalk drawing, 98, 99f of facial expressions, 811–812 digitizing device, 483 interface with tracking methods, 290–292, 291f, 351, 365 effects, 347 through range-finding technologies, 219–222 hardcopy, 359, 386–387, 525 technologies, 796 modeling package, 112 Trading force for torque, 527 mouse, 43–44 Traditional computer interface, 787 object, 471–472, 471f Traditional performing arts, 68–71 space, 145 Traffic accidents, 601–602 spatialization of sound, 490 Training, 163–164, 739 stereoscopic visual display, 266–267 Transducers, 698 3D-cursor-select method, 583–584 Transfer functions, 146 3DM application, 756 Transfer media, 66–67 3D Studio MAX, 687–688 Transfer object permanence, 664 360 degree Transference of Object Permanence, 149, 240, movie, 13 384–385 rendering to 360-degree spherical view, 474–475 Transformation, 486 videos, 334 names, 561b–563b 360 panoramic paintings, 28, 29f Translation(al), 196–197 3Dfx Voodoo graphics card, 45–46 force, 375–376 Throughput, 294–295, 352, 367 gain, 125 “Thumbs up” gesture, 230 operation, 486–487 Thumper display, 260 Transmitters, 211 896 | Subject Index

Transportation, 166 Ultrasonic receivers, 43–44 Transputer hardware design, 40–41 Ultrasonic sensor, 699 Travel, 600, 612–631 Ultrasonic tracking, 210–211, 211f, 291 classes of travel methods, 618–629 Ultrasound constraints, 615 imaging systems, 702 controlling, 599 visualization project, 326–327 frame of reference, 615–616 Unaided human senses, 22–23 manipulation method, 614 Unassimilated phase, 677–678 movement formula, 616–618 Uncanny valley, 663 properties of travel interfaces, 613–618 Understanding Augmented Reality (Craig), 24 through time, 630–631 Understanding Comics (McCloud), 74 Treachery of Images, The, 63 Understanding Media (McLuhan), 75 Treadmills or stair-stepping machines, 243–244 Undirected narratives, 84–87 Trials on Tatooine (ILMxLab), 759. See also Trials on Taooine Undo surprises, 649 in the Index of Media Experiences Unit of measurement, 424 Triangular Pixels Ltd., 690f Unity game engine, 51, 483–484, 689, 713, 717–719, Trichromatic arrangement, 130 721–722, 814–815, 817–818 Trichromatic displays, 264 University of Illinois, 48–49, 181, 215f, 330, 330f, 335, 403f, Trick photography, 414 410f, 419f, 609, 789–790, 790f. See also Electronic Trigger, 687 Visualization Lab (EVL); National Center for Tritone paradox, 145 Supercomputing Applications (NCSA) Trompe-l’oeil, mixed Reality Videos and, 96–98, 99f Dan Simons, 125–126, 126f “Twinkle Box, The”, 35–36 University of North Carolina Chapel Hill, 34, 37, 42, 783, 787 Twist transformation, 561b–563b University of Tokyo Cybernetic Systems Laboratory, 380–381 Two (or more) immersed participants, 637 University of Utah, 34, 36, 36f, 39 Two-dimension (2D) University of Virginia’s User Interface Group, 237–239, 238f, computer interface, 113 628, 787 cursor, 571 Unmanned aerial vehicles (UAVs), 26 “Two-point discrimination” test, 151–152 Unreal Engine (UE), 511, 713, 717–718, 721–722, 814–815, 817–818 U UE4, 718 UAVs. See Unmanned aerial vehicles (UAVs) US Navy Research Lab’s Shadwell fire fighting project, 729–730 UE. See Unreal Engine (UE) User adaptation, 329–330 UI. See User Interface (UI) User Interface (UI), 72–73, 109, 539 Ultimate display, 34 elements, 685 Ultimate interface, 240, 545–546, 809 events, 437 Ultra HD 3DTVs, 55–56 metaphors, 544–546 Ultra-resolution TV, 798 manipulation, navigation, and communication, 546 “Ultrasonic haptics” display, 807, 807f prop, 236–237 897 | Subject Index

User mobility, 290, 350, 365 Vestibulation, 119–121, 259, 446–447 User objective, 771 human physiology for, 155–156 User testing, 774–775 Vestibulo-ocular reflex (VOR), 155 VESUB program, 552–553 V Vibrotactile actuators, 369, 516–517 Valuator(s), 203, 235 Vickers, Donald, 36 inputs, 202–203 Video (see-through) method of nonocculsive HMDs, 329 valuator-directed selection, 578–579 Video teleconferencing equipment, 642 Valve, 57, 93, 215f, 475, 704f Videometric tracking system, 213, 215–216 Chaparone System, 320–321, 321f Videoplace prototype, 37 Experiences, Under the Index of Media Experiences. VIEW lab. See Virtual Interface Environment Workstation lab See also “Longbow”, “Portal”,”Portal VR”, “Steam (VIEW lab) VR Tutorial”, and “The Lab” Visual displays, logistic properties of, 286–300 Lighthouse tracking, 216–217, 217f, 800 VIO. See Virtual I/O (VIO) Software, 512, 718, 815 Virginia Tech, 588–589, 588f Vanilla Sound Server (VSS), 511, 513, 716 Virtual Annotation System, The, 647–648 Varifocal displays, 140–141, 274–275, 320, 783–784, 804f Virtual controls, 550–552, 591, 614 VCA. See Visual Computing Appliance (VCA) altering state, 599 Vection, 129, 137, 156–157, 161–162 Virtual Director, camera choreography application, 592–593, Vehicle platform, 245 607, 631 Velcro strap, 222 Virtual environment, 21 Ventriloquism effect, 128, 144, 147, 502–503 Virtual I/O (VIO), 45–46 Venue, 675–681 Virtual image, 20–21 shapes participatory experience, 679–681, 680f–681f Virtual input, 200–201 shapes VR experience, 677–679 Virtual Interface Environment Workstation lab (VIEW lab), 39 Verisimilar representations, 406 “Virtual Jungle Cruise Ride” (DisneyQuest), 390, 392f, 621 Verisimilar sounds, 434–436 Virtual memory, 6 Verisimilitude, 11, 404–409, 663 , 807 Vertigo, 157 Virtual Perambulator, 45–46 Vestibular displays, 388–395 Virtual Portal, 43–44, 306 motion bases (platforms), 389–390 Virtual Prototyping System, 14, 642, 705 moving platforms, 391–392 Virtual reality (VR), 5–6, 16, 28–57, 81, 83, 109, 191, 259, vestibular options, 392–393 261f, 280f, 299f, 659, 725, 781, 785. See also Vestibular illusions, 137, 156–158 Augmented reality (AR) Vestibular perception, 154–158 applications. See Index of Media Experiences, pp. 903–908 human physiology for vestibulation, 155–156 choosing medium of VR, 727–730 vestibular illusions, 156–158 conceiving new VR application, 731–737 vestibular localization perception, 158 adapting from other media, 733–736 Vestibular system, 626 creating new VR experience from scratch, 737 898 | Subject Index

Virtual reality (VR) (Continued) Virtual Reality Peripheral Network (VRPN), 94, 714–716 drawing inspiration from or adapting from existing VR Virtual Space Devices, 46–47 experience, 736–737 Virtual tourism, 697 in home, 793–794 Virtual Windtunnel (NASA), 595f, 756 key elements, 6–18 Virtual world, 6–8, 18, 20–21, 62–65, 399, 681–702 combining elements, 16 increasing reality, 183–186 immersion, 8–12 interface to, 72–74 interactivity, 12–15 object modeling and world layout, 685–690 participants and creators, 6–7 rules, 702–712 virtual world, 7–8 substance of, 682–685, 683f in laboratory, 786–788 agents, 684–685 maturation, 785–791 artifacts, 684 transition, 790–791 user interface elements, 685 media of attraction, 56, 92–93, 677–678, 718, world geography, 682–684 790–791 suitability for particular medium, 65 meeting goals, 725–730 Virtuality Group plc, 85–86, 620, 784 painting, 754–755 Arcade Game System, 41–42, 242f, 681, 738, 766–767, 795 paradigms, 16–18 Dactyl Nightmare, 41–42, 41f, 759, 760f hand-based, 18 Ford Galaxy VR, 772 head based, 17 Legend Quest, 681, 681f, 773, 773f stationary, 17–18 Pac-Man VR, 759 state of, 781–785 Zone Hunter, 620–621 peak of inflated expectations, 783 “Virtualization Gate” experience, 96, 96f plateau of productivity, 785 Virtuix Omni platform, 55–56, 55f, 232–233, 233f, 244, 797, 807 slope of enlightenment, 784–785 Visbox, 310–311 technology trigger, 782–783 Visceral elements, 170–172 trough of disillusionment, 783–784 Visible light spectrum, 129–130 system, 7–8 “VisiCalc” electronic spreadsheet, 799 telepresence, augmented reality, cyberspace and, 18–26 Vision, 119–121 trends, 792–801 human physiology, 129–134 higher experience fidelity, 797–798 vision-enhancing devices, 26 less encumbrance, 795–797 Visual asset encodings, 482–484 new drivers/disruptive technologies, 799–801 Visual Computing Appliance (VCA), 454 software availability, 798–799 Visual depth cues, 137–142 VR-Ready machines, 794–795 distance misperception, 140–141 world building, 756 monoscopic image depth cues, 138–139 Virtual Reality Annual International Symposium (VRAIS), 44, 92 motion depth cues, 139–140 Virtual Reality Gorilla Exhibit, 568, 651 physiological depth cues, 140 Virtual Reality Modeling Language (VRML), 483–484 stereoscopic image depth cue, 139 899 | Subject Index

Visual display(s), 260–343, 430, 803–805 complex visual scene rendering, 461–475 channel, 266–267 computer graphics rendering, 141–142, 480–481 logistic properties, 286–300 rasterization rendering pipeline, 464–465 paradigms, 262, 300–324, 342–343 reducing polygon count, 466–469 fishtank (aquarium) virtual reality display, 301–305 shading, 465–466 handheld VR, 338–342 foveated rendering, 473–474 head-based (mounted) projective displays, 335–338 hardware, 311 head-based displays, 317–330 image-based rendering (IBR), 470, 700 nonocclusive head-based displays, 324–330 latency, 475–480 occlusive head-based virtual reality displays, 318–324 multiplexed rendering, 476–477 smartphone-virtual reality head-based displays, stereoscopic camera, 479–480 331–335 time warping, 477–479 stationary displays, 300–317 library, 485 surround virtual reality display, 305–317 multipass rendering, 470–471 visual presentation properties, 263–286 nongeometric rendering systems, 459–461 Visual dominance, 440 particle-based rendering, 460–461 Visual illusions, 134–137 point-based rendering, 460–461 Visual Molecular Dynamics (VMD), 474, 756 point-cloud rendering, 460–461, 461f Visual perception, 129–142, 427 regionally targeted rendering, 473–474 human physiology of vision, 129–134 software, 485–486 visual depth cues, 137–142 stereoscopic rendering, 282, 469–470, 472–473, 480, 482 visual illusions, 134–137 systems, 454–489. See also Volume rendering; Ray-tracing Visual presentation properties of visual displays, 263–286 geometrical representations, 457–459 FOR, 278–281 object-based rendering, 455–457 color, 264 pixel-based rendering, 455–457 contrast, 265–266 rendering complex visual scenes, 461–475 emitting technology, 263–264 360-degree spherical view, 474–475 focal distance, 273–275 Visual representation in VR, 427–432 FOV, 277–278 incorporation of real world, 431–432 grounding, 263 Visual sense, 23. See also Visual perception head position information, 281–283 Visual-based ventriloquism effect, 148 latency tolerance, 283–285 Visualization, 82 masking, 277 VMD. See Visual Molecular Dynamics (VMD) number of display channels, 266–273 Vocal messages, 438 opacity, 276–277 Vocal sounds, 438 optics (delivery technology), 275–276 Vocal utterances, 556 spatial resolution, 264–265 Voice, 618 temporal resolution, 285–286 frequency, 142 Visual rendering recognition system, 17 900 | Subject Index

Voiceholders, 582, 647 Wavefront file format (.obj), 483 VOID, The Wavelengths, 503–504 “Curse of the Serpent’s Eye” experience, 444, 445f Wayfinding, 600–611 “Ghostbusters Dimension” experience, 447, aids, 603–611 748–749, 749f mental model creation, 602–603 public venue VR experiences, 384–385 Wearable cutaneous devices, 368 Volume rendering, 459–460 Web-based tools, 814–815 VOR. See Vestibulo-ocular reflex (VOR) Web-delivered virtual reality, 718–719 VPL, 37–41, 40f, 206f, 715, 783–784 Weber’s Law, 123 VR. See Virtual reality (VR) WebGL, 464, 485 VR-2 Flight Helmet (Virtual Research Systems), 42 WebVR/WebXR, 718–719 VR-Chem Tool, 756 Weinbaum, Stanley G., 31, 738 VR-Ready machines, 794–795 Wet sound, 493 VRAC. See Iowa State University’s Virtual Reality Wheatstone, Sir Charles, 30, 288–289, 331–332, 331f Applications Center (VRAC) Wheelchair input device, 245 VRAIS. See Virtual Reality Annual International Symposium White boxing, 762 (VRAIS) White noise, 499 vrJuggler VR integration library, 715–716 Wide-Angle Virtual Environment (WAVE), 307, 308f, 510–511 VRML. See Virtual Reality Modeling Language (VRML) Wide-area tracking, 225 “Vroom Server” experience, 294 “Widget window”, 77 VRPN. See Virtual Reality Peripheral Network (VRPN) Wii Balance Board, 203 VRRobot systems, 383–384 Wii Sensor Bar, 216 Vrui Wiimote. See Nintendo Wii remote applications, 695f WIM. See World in miniature (WIM) experience recording, 94–95, 95f WIMP interface, 547 tool, 695f–696f, 715–716, 756 WindowVR from Virtual Research Systems, 210 VR integration library, 570–571 Wingman view, 609–610 Vrui-based visualization applications, 587, 756, 815 Witmer and Singer Presence questionnaire (WS Presence VSS. See Vanilla Sound Server (VSS) questionnaire), 165 Vulkan, 485 Wizard of Oz agent control, 651 Wonder Stories, 33 W World annotation, 644 Wagon wheel effect, 124 World capture technologies, 697–702 Walking-in-place, 624 World congruity, 649–651 Walkthrough paradigm, 621–624 World coordinates, 200 interaction, 87 World creation software, 814–815 Wandering, 601 World events, 437 WAVE audio file format (.wav), 510–511 World geography, 682–684 Waveform sample, 492 World in hand interface, 626 901 | Subject Index

World layout, 685–690 Xbox controllers (Microsoft), 236b, 236, 236b, 314b–316b. World persistence, 709–711 See also “Kinect (Microsoft Xbox input device) implementing persistent virtual worlds, 710–711 XR. See Cross reality (XR) World physics, 500, 702–712. See also Physics; virtual world World Wide Web, 484, 718–719, 784 Y World-capture, 812 Yaw, 561b–563b World-grounded system, 360–361, 380 Yggdrasil, 717 World-in-miniature (WIM), 589–590, 592f, 605, 614 YouTube, 93 WorldToolKit (WTK), 715, 717–718 Wright Patterson AFB, 783 Z WTK. See WorldToolKit (WTK) Zelig (film), 88 Zöllner illusion, 134–135, 135f X Zone Hunter, 620–621 X-ray vision, 339, 793 Zork (games), 12, 405 X3D file format, 483