WEDNESDAY MORNING, 9 MAY 2018                                         NICOLLET C, 8:50 A.M. TO 12:00 NOON

Session 3aAA

Architectural Acoustics, Psychological and Physiological Acoustics, and Speech Communication: Auditory Perception in Virtual, Mixed, and Augmented Environments

Philip W. Robinson, Cochair
Media Technology, Aalto University, PL 15500, Aalto 00076, Finland

G. Christopher Stecker, Cochair
Hearing and Speech Sciences, Vanderbilt University, 1215 21st Ave. South, Room 8310, Nashville, TN 37232

Chair’s Introduction—8:50

Invited Papers

8:55

3aAA1. Validating auditory spatial awareness with virtual reality and vice-versa. G. Christopher Stecker, Steven Carter, Travis M. Moore, and Monica L. Folkerts (Hearing and Speech Sci., Vanderbilt Univ., 1215 21st Ave. South, Rm. 8310, Nashville, TN 37232,
[email protected])

“Immersive” technologies such as virtual (VR) and augmented reality (AR) stand to transform sensory and perceptual science, clinical assessments, and habilitation of spatial awareness. This talk explores some of the general challenges and opportunities for VR- and AR-enabled research, illustrated by specific studies in the area of spatial hearing. In one study, free-field localization and discrimination measures were compared across conditions which used VR to show, hide, or alter the visible locations of loudspeakers from trial to trial. The approach is well suited to understanding potential biases and cuing effects in real-world settings. A second study used headphone presentation to understand contextual effects on the relative weighting of binaural timing and intensity cues. Previous studies have used adjustment and lateralization to “trade” time and level cues in a sound-booth setting, but none have attempted to measure how listeners weight cues in realistic multisensory scenes or in realistic temporal contexts.