Echo-House: Exploring a Virtual Environment by Using Echolocation
Author(s): Andrade, R; Baker, S; Waycott, J; Vetere, F
Published: 2018
Conference Title: OzCHI '18: Proceedings of the 30th Australian Conference on Computer-Human Interaction
Version: Accepted Manuscript (AM)
DOI: https://doi.org/10.1145/3292147.3292163
Copyright Statement: © ACM, 2018. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution. The definitive version was published in OzCHI '18: Proceedings of the 30th Australian Conference on Computer-Human Interaction, ISBN: 978-1-4503-6188-0, https://doi.org/10.1145/3292147.3292163
Downloaded from: http://hdl.handle.net/10072/402097
Griffith Research Online: https://research-repository.griffith.edu.au

Ronny Andrade, Steven Baker, Jenny Waycott, Frank Vetere
The University of Melbourne, Australia
[email protected], [email protected], [email protected], [email protected]

ABSTRACT
The graphics-intensive nature of most virtual environments (VEs) prevents many people with visual impairment from being able to successfully explore them. A percentage of the population of people with visual impairment are known to use echolocation —sound waves and their reflections— to better explore their surroundings. In this paper, we describe the development of an echolocation-enabled VE (Echo-House) and evaluate the feasibility of using echolocation as a novel technique to explore this environment. Results showed that echolocation gave participants an improved sense of space in the VE. However, the evaluation also identified a range of orientation and mobility issues and found that participants needed additional support to gain confidence in their use of echolocation in the VE. Our findings suggest that, with proper support, echolocation has the potential to improve access to VEs for people who are blind or visually impaired by revealing features that would otherwise be inaccessible.

CCS CONCEPTS
• Human-centered computing~Auditory feedback
• Human-centered computing~Accessibility systems and tools

KEYWORDS
Visual impairment; echolocation; virtual environments; exploration; games; accessibility.

ACM Reference format:
Ronny Andrade, Steven Baker, Jenny Waycott and Frank Vetere. 2018. Echo-House: Exploring a Virtual Environment by Using Echolocation. In Proceedings of the 30th Australian Conference on Computer-Human Interaction (OzCHI'18). ACM, New York, NY, USA, 12 pages. https://doi.org/10.1145/3292147.3292163

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
OzCHI'18, 5-7 December 2018, Melbourne, Australia
© 2018 Copyright held by the owner/author(s). Publication rights licensed to ACM. 000-0-0000-0000-0/00/00...$15
https://doi.org/10.1145/3292147.3292163

1 INTRODUCTION
Technology has facilitated access to information and generated new possibilities for entertainment. Virtual experiences, such as virtual environments (VEs) and videogames, and their ever-increasing relevance in popular culture are a prime example of this [9, 18]. However, the high sensory capabilities that most virtual experiences require of their users lead to the exclusion of users with different physical or mental abilities. In the case of people with visual impairment, exclusion from virtual experiences is most evident, as they cannot rely on the visual output that most of these experiences produce.

People with visual impairment need to use nonvisual stimuli to construct a mental map of their surroundings, both in the physical and the virtual world [34]. Despite the development of different approaches to the navigation of virtual spaces, there are unexplored opportunities to incorporate navigation techniques that people with visual impairment use in the physical world into VEs.

In this paper, we aim to explore the feasibility of building a VE that allows for the simulation of echolocation —the use of sound waves and their echo reverberations— and to evaluate the use of echolocation as a novel technique to assist people with visual impairment to explore the environment. To do this, we present a prototype echolocation-enabled VE, Echo-House, and discuss findings from a small-scale proof-of-concept laboratory-based evaluation conducted with participants who have a range of visual impairments. While we have focused on the development of a VE, the overlap between VEs and videogames means that our findings could equally be applied to existing videogames or used to create new and more accessible games.

We begin by introducing the concept of sensory substitution, which underpins the way in which people with visual impairment interact with technology. This will be followed by a review of previous work on echolocation and a review of the existing trends in virtual experiences for people with visual impairment, focusing on videogames. We then describe the design decisions made in implementing the prototype of Echo-House and the methodology followed for its evaluation. Finally, results from the prototype evaluation are presented, and we discuss conclusions regarding the feasibility of using echolocation as a novel technique to explore VEs.

1.1 Sensory substitution
In order for people with visual impairment to navigate the world around them, it is necessary for them to use mechanisms that replace visual sensory input with other types of input. Sensory substitution is the translation of information from one sense to another, with auditory and tactile feedback being the most common senses used to replace visual feedback [26]. Research in this field is extensive, with Bach-y-Rita et al. [6] studying it for decades and trialing systems that use other sensory organs, such as the tongue [2, 5, 23] or the skin [3, 4], to replace visual information.

Specific sensory-substitution devices that aim to assist people with visual impairment to navigate the physical world by replacing visual feedback with tactile or auditory feedback have appeared in the literature [6, 17, 24, 26]. These devices use cameras, ultrasound or optic emitters paired with a receiver to detect deflections of emitted signals. Once a signal is emitted and detected by the receiver, these devices produce either auditory or tactile feedback to indicate characteristics of the environment, such as the distance between the user of the device and potential obstacles. 3D-printed tangible interfaces have also been explored as a means of sensory substitution to assist people with visual impairment.

Connors et al. [15] describe the design, implementation and evaluation of a VE that was modelled after a location in the real world. Players with visual impairment could explore the VE via the use of sound cues. Ohuchi et al. [29] and Connors et al. [15] agree that using sound-based virtual mazes modelled after locations in the real world improves the capabilities of people with visual impairment to create a mental map of an unknown location.

1.2 Echolocation
Echolocation is a skill that has been observed in bats and some aquatic mammals [40]. According to Kolarik et al. [24] and Kupers et al. [25], echolocation in humans is the ability to use self-generated sounds and their echo reverberations to build a mental image of the surrounding spaces. The sound can be generated through a mouth-click, a clap, footsteps, or by hitting a cane on the floor. The use of self-generated sounds and echoes stimulates the visual cortex of the brain of echolocators [24, 39], resulting in a form of sensory substitution by which visual input is replaced by auditory input.

The process of echolocation consists of two steps: transmission, which is the process of producing a sound, usually with the mouth; and reception, which is the process of listening to the emitted sound as it combines with its reverberation over surfaces. Over long distances, there is a perceptible gap between the moment the person hears the sound they have produced and the incoming echoes [24, 40].

Most individuals who echolocate use clicking sounds produced with their tongues [32]. The physical characteristics of the clicking sounds echolocators use have been analyzed: Kolarik et al. [24] and Thaler et al. [40] have found that the mouth-clicks produced by individuals who echolocate last for approximately 3 ms to 10 ms, with an interval of 250 ms between clicks. Sound levels range from 60 to 108 dB SPL, with maximum energy in the frequency range of 8 kHz to 10 kHz. Thaler et al. [40] have produced artificial clicks based on the parameters obtained from gathering a large sample of mouth-clicks produced by expert echolocators, as a step towards the development of computational models of human echolocation.

It is estimated that 20 to 30 per cent of blind individuals can use echolocation to varying degrees [38], and the functional benefits that echolocation offers include: the ability to determine distance, position, size,
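The two-step process (transmission, then delayed reception) and the click parameters reported above lend themselves to a small numeric sketch. The following Python fragment is an illustration only, not the authors' implementation or the signal model used in Echo-House: the Hann-windowed sine burst, the 0.3 attenuation factor, and all function names are assumptions chosen to match the reported ranges (3-10 ms click duration, maximum energy around 8-10 kHz).

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
SAMPLE_RATE = 44100      # samples per second

def make_click(duration_s=0.005, centre_hz=9000.0, rate=SAMPLE_RATE):
    """A crude mouth-click stand-in: a short sine burst at the centre
    frequency, shaped by a Hann window. The 5 ms default duration and
    9 kHz centre frequency sit inside the reported ranges."""
    n = int(duration_s * rate)
    return [
        0.5 * (1 - math.cos(2 * math.pi * i / (n - 1)))   # Hann window
        * math.sin(2 * math.pi * centre_hz * i / rate)
        for i in range(n)
    ]

def echo_delay_s(distance_m):
    """Gap between transmission and reception for a surface
    `distance_m` away: the sound travels out and back, so the
    round trip covers 2 * distance."""
    return 2.0 * distance_m / SPEED_OF_SOUND

def click_with_echo(distance_m, attenuation=0.3, rate=SAMPLE_RATE):
    """Mix a click (transmission) with its delayed, quieter echo
    (reception) into one mono buffer."""
    click = make_click(rate=rate)
    delay = int(round(echo_delay_s(distance_m) * rate))
    out = [0.0] * (delay + len(click))
    for i, s in enumerate(click):
        out[i] += s                        # outgoing click
        out[delay + i] += attenuation * s  # returning echo
    return out

# A wall 5 m away echoes after ~29 ms; at 0.5 m the gap is ~3 ms,
# shorter than the click itself, so the click and echo overlap and
# the echo is heard as a colouring of the sound, not a separate event.
print(f"{echo_delay_s(5.0) * 1000:.1f} ms")
```

A real simulation would also filter the echo by the reflecting material and spatialise it separately for each ear, which this sketch deliberately ignores.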