Exploring Visual-To-Auditory Sensory Substitution Mappings in An


RESEARCH ARTICLE

Stereosonic vision: Exploring visual-to-auditory sensory substitution mappings in an immersive virtual reality navigation paradigm

Daniela Massiceti (1*), Stephen Lloyd Hicks (2), Joram Jacob van Rheede (3)

1 Department of Engineering Science, University of Oxford, Oxford, United Kingdom
2 Nuffield Department of Clinical Neurosciences, University of Oxford, Oxford, United Kingdom
3 Department of Physiology, Anatomy and Genetics, University of Oxford, Oxford, United Kingdom
* [email protected]

Citation: Massiceti D, Hicks SL, van Rheede JJ (2018) Stereosonic vision: Exploring visual-to-auditory sensory substitution mappings in an immersive virtual reality navigation paradigm. PLoS ONE 13(7): e0199389. https://doi.org/10.1371/journal.pone.0199389
Editor: Krish Sathian, Pennsylvania State University, United States
Received: November 6, 2017. Accepted: June 6, 2018. Published: July 5, 2018.
Copyright: © 2018 Massiceti et al. This is an open access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Data Availability: All data files are made available for public download at http://www.robots.ox.ac.uk/~daniela/research/stereosonic_vision/static/stereosonic_vision_data.zip.
Funding: Daniela Massiceti was supported by the University of Oxford Clarendon Fund (http://www.ox.ac.uk/clarendon/about). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript. Stephen Hicks and Joram van Rheede: salary and research expenses were supported by a private donation to the University of Oxford titled "EVE: Expanded Visual Enhancement." The donor, Mr Jiangong Zhang, had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.
Competing interests: The authors have declared that no competing interests exist.

Abstract

Sighted people predominantly use vision to navigate spaces, and sight loss has negative consequences for independent navigation and mobility. The recent proliferation of devices that can extract 3D spatial information from visual scenes opens up the possibility of using such mobility-relevant information to assist blind and visually impaired people by presenting this information through modalities other than vision. In this work, we present two new methods for encoding visual scenes using spatial audio: simulated echolocation and distance-dependent hum volume modulation. We implemented both methods in a virtual reality (VR) environment and tested them using a 3D motion-tracking device. This allowed participants to physically walk through virtual mobility scenarios, generating data on real locomotion behaviour. Blindfolded sighted participants completed two tasks: maze navigation and obstacle avoidance. Results were measured against a visual baseline in which participants performed the same two tasks without blindfolds. Task completion time, speed and number of collisions were used as indicators of successful navigation, with additional metrics exploring detailed dynamics of performance. In both tasks, participants were able to navigate using only audio information after minimal instruction. While participants were 65% slower using audio compared to the visual baseline, they reduced their audio navigation time by an average 21% over just 6 trials. Hum volume modulation proved over 20% faster than simulated echolocation in both mobility scenarios, and participants also showed the greatest improvement with this sonification method. Nevertheless, we do speculate that simulated echolocation remains worth exploring as it provides more spatial detail and could therefore be more useful in more complex environments. The fact that participants were intuitively able to successfully navigate space with two new visual-to-audio mappings for conveying spatial information motivates the further exploration of these and other mappings with the goal of assisting blind and visually impaired individuals with independent mobility.

1 Introduction

Globally more than 250 million people are visually impaired, with over 35 million of this group classified as blind [1, 2]. While certain causes of visual impairment can be prevented or treated, a large proportion of sight loss remains without a cure [3]. New treatment approaches such as retinal prosthetics, optogenetics, and gene therapy offer hope for the future, but at present are at a research or early implementation stage and await evidence of real-life benefit to patients [4].

Vision loss affects the ability to independently carry out activities of daily living [5-8], in part due to its negative impact on mobility and navigation [9-15]. While blind or visually impaired individuals are often able to learn to successfully navigate without vision through orientation and mobility training [16], they face significant challenges not faced by the sighted population [17-20]. Non-sighted navigation requires more planning and cognitive resources [19-22], and blind and visually impaired individuals are at an increased risk of mobility-related accidents, injuries, and falls [23-27].
Even when walking around a familiar environment, the variable presence of obstacles, changes in walking surface or drop-offs can be a significant perceived mobility hazard [28], and even very experienced non-sighted navigators still regularly veer off their intended course [29]. It also remains the case that human living spaces are usually designed with sighted navigation in mind [19]. While academic debate exists around the representation of spatial information in blind individuals, in particular people who are congenitally blind, it is clear that the cognitive ability for representing spatial information is not the main limiting factor in navigation and mobility [20, 30]. Rather, the limitation lies in the rate at which non-sighted navigators are able to acquire spatial information about their current environment, whether that is in order to build an initial cognitive map of the space, or to update their current position in a cognitive map from memory and scan for the presence of any obstacles to safe mobility [20]. While vision is uniquely well-placed to rapidly provide mobility-relevant spatial information [31], it is not the only source of such information. Already, many blind individuals will spontaneously learn to use non-visual cues to their advantage in sensing obstacles in their environment [32, 33]. Sensory Substitution Devices (SSDs) go one step further, converting information normally acquired through one sensory modality into a representation that is compatible with another intact sensory modality, aiming to exploit the increasingly well-understood cross-modal capacities of the brain [34, 35]. Approaches to substituting information received through vision naturally focus on the other spatially informative senses, namely hearing and touch [36].
The first SSDs were pioneered by Bach-y-Rita in the 1960s with his development of the Tactile Vision Sensory Substitution (TVSS) device [37]: subjects received vibrating patterns via an array of pins mounted on their backs and were able to differentiate between oriented parallel lines, simple geometric shapes and capital block letters. Extending this initial work, a host of studies have investigated "seeing" with vibro-tactile and electro-tactile stimulation applied to a number of body surfaces [38-41]. Other SSDs have attempted to present an auditory representation of the visual scene. The vOICe, perhaps the most widely investigated vision-to-audio SSD, scans the visual environment from left to right, and converts the 2D grayscale image into a frequency spectrum or "soundscape" [42]. These efforts and others like EyeMusic [43] have largely focused on converting a 2D camera image into a corresponding sound or set of sounds for the purpose of identifying elements of the visual scene.

Mobility, however, is a 3D task, and requires access to information about the distance of objects in the visual scene and their radial position. An effective SSD for navigation and mobility will therefore need to provide such information as explicitly as possible. This moves away from the concept of an SSD as a generic replacement for vision, and towards seeing it as a mobility tool aimed at providing the user with an improved spatial awareness of their surroundings. This task-specific approach has the additional advantage of reducing the bandwidth of the information needed to be encoded cross-modally, an important consideration [36, 44]. Several SSDs have been developed specifically with this in mind (for detailed surveys of such devices, see [36, 44, 45]).
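The vOICe-style column-scan mapping described above (left-to-right scan, row index to pitch, brightness to loudness) can be sketched in a few lines. This is an illustrative reconstruction, not The vOICe's actual implementation; the frequency range, scan duration, and logarithmic pitch spacing are assumptions.

```python
import numpy as np

def voice_style_soundscape(image, duration_s=1.0, sample_rate=22050,
                           f_min=200.0, f_max=5000.0):
    """Sketch of a vOICe-style mapping: scan a 2D grayscale image
    left to right, one column per time slice. Row position maps to
    frequency (top of image = highest pitch) and pixel brightness
    maps to the amplitude of that frequency component.
    Parameter values are illustrative assumptions."""
    n_rows, n_cols = image.shape
    samples_per_col = int(duration_s * sample_rate / n_cols)
    # Frequencies spaced logarithmically across the chosen band.
    freqs = np.geomspace(f_min, f_max, n_rows)
    out = []
    t0 = 0
    for c in range(n_cols):
        t = (t0 + np.arange(samples_per_col)) / sample_rate
        col = image[::-1, c]  # flip so row 0 (top) gets the highest frequency
        # Sum of sinusoids weighted by pixel brightness in this column.
        slice_ = (col[:, None] * np.sin(2 * np.pi * freqs[:, None] * t)).sum(axis=0)
        out.append(slice_)
        t0 += samples_per_col
    signal = np.concatenate(out)
    peak = np.abs(signal).max()
    return signal / peak if peak > 0 else signal
```

A single bright pixel then produces a short pure tone whose onset time encodes its horizontal position and whose pitch encodes its vertical position.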
Work in this field was initially led by Leslie Kay with the Kay Sonic Torch [46], followed by several others [47-51]. While the first approaches only provided users with a 'virtually extended' cane that reported the distances of objects further afield in their path, the more ambitious devices use
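The abstract's distance-dependent hum volume modulation is described only at a high level in this excerpt. A minimal sketch of such a mapping might pair a distance-driven volume falloff with stereo panning to encode radial position; the linear falloff, maximum range, and constant-power panning law below are assumptions for illustration, not the authors' implementation.

```python
import math

def hum_gains(distance_m, azimuth_rad, max_distance_m=10.0):
    """Sketch of distance-dependent hum volume modulation:
    nearer objects hum louder, and stereo panning encodes the
    object's left/right (radial) position. Falloff shape and
    panning law are illustrative assumptions."""
    # Linear volume falloff with distance, clamped to [0, 1].
    volume = max(0.0, 1.0 - distance_m / max_distance_m)
    # Constant-power pan: azimuth -pi/2 (hard left) .. +pi/2 (hard right).
    pan = (azimuth_rad + math.pi / 2) / math.pi  # 0 = left, 1 = right
    left = volume * math.cos(pan * math.pi / 2)
    right = volume * math.sin(pan * math.pi / 2)
    return left, right
```

An object straight ahead yields equal left/right gains that shrink as it recedes, so a listener can steer by balancing the hum between ears while its loudness signals proximity.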
Recommended publications
  • Cognitive Abilities in Human Echolocation
    Umeå University, Department of Psychology, Master Thesis VT2010. Cognitive Abilities in Human Echolocation. Oskar Andersson. Supervisors: Bo Schenkman, Steven Nordin. Abstract: The human ability to echolocate differs between individuals. Blind individuals perform significantly better than sighted ones. There is very limited research on psychological factors underlying this ability. This study is based on earlier research by Schenkman and Nilsson (2010). The research questions in the present study were whether blind individuals perform better than sighted on general cognitive tasks, whether people who perform well on echolocation tasks also perform well on general cognitive tasks, and whether there is a correlation between performance on echolocation tasks and general cognitive tasks. Participants were 8 blind (5 female and 3 male) and 4 sighted (3 female and 1 male). Range of age was 36-65 years. Cognitive tests used were Digit Span, the California Verbal Learning Test and an auditory reaction time test. The blind performed significantly better than the sighted on learning over trials. There was a significant difference in focused attention, working memory and learning over trials between individuals who performed well on echolocation tasks and the rest of the sample. There was a positive correlation between learning over trials and individuals' ability to echolocate. The conclusion of this report is that blind individuals perform better on learning over trials and that people who perform well on echolocation tasks are superior in the cognitive abilities of focused attention, working memory and learning over trials. Echolocation is often associated with mammals like bats, dolphins and whales. These animals all hold the ability to echolocate with great precision, and echolocation is used for orientation and hunting.
  • SENSORY SUBSTITUTION: LIMITS and PERSPECTIVES Charles Lenay, Olivier Gapenne, Sylvain Hanneton, Catherine Marque, Christelle Genouëlle
    SENSORY SUBSTITUTION: LIMITS AND PERSPECTIVES. Charles Lenay, Olivier Gapenne, Sylvain Hanneton, Catherine Marque, Christelle Genouëlle. In: Yvette Hatwell, Arlette Streri, Edouard Gentaz (eds.), Touching for Knowing, 53, John Benjamins Publishers, pp. 275-292, 2004, Advances in Consciousness Research. DOI: 10.1075/aicr.53.22len. HAL Id: hal-02434266, https://hal.archives-ouvertes.fr/hal-02434266, submitted 9 Jan 2020. Université de Technologie de Compiègne, COSTECH - BIM, Groupe Suppléance Perceptive. A quarter of a century ago, in the preface to « Brain Mechanisms in Sensory Substitution », Paul Bach-y-Rita wrote: "This monograph thus risks becoming outdated in a very short time since the development of refined sensory substitution systems should allow many of the questions raised here to be answered, and some of the conclusions may appear naive to future readers." (Bach-y-Rita, 1972). As it turns out, this prediction is far from having been fulfilled: in spite of their scientific and social interest, their real effectiveness and a certain technological development, prosthetic devices employing the principle of "sensory substitution" are not widely used by the blind persons for whom they were originally destined.
  • Perceiving Invisible Light Through a Somatosensory Cortical Prosthesis
    ARTICLE. Received 24 Aug 2012 | Accepted 15 Jan 2013 | Published 12 Feb 2013. DOI: 10.1038/ncomms2497. Perceiving invisible light through a somatosensory cortical prosthesis. Eric E. Thomson, Rafael Carra & Miguel A.L. Nicolelis. Sensory neuroprostheses show great potential for alleviating major sensory deficits. It is not known, however, whether such devices can augment the subject's normal perceptual range. Here we show that adult rats can learn to perceive otherwise invisible infrared light through a neuroprosthesis that couples the output of a head-mounted infrared sensor to their somatosensory cortex (S1) via intracortical microstimulation. Rats readily learn to use this new information source, and generate active exploratory strategies to discriminate among infrared signals in their environment. S1 neurons in these infrared-perceiving rats respond to both whisker deflection and intracortical microstimulation, suggesting that the infrared representation does not displace the original tactile representation. Hence, sensory cortical prostheses, in addition to restoring normal neurological functions, may serve to expand natural perceptual capabilities in mammals. Affiliations: Department of Neurobiology, Department of Biomedical Engineering, Department of Psychology and Neuroscience, and Center for Neuroengineering, Duke University, Durham, North Carolina, USA; Edmond and Lily Safra International Institute for Neuroscience of Natal (ELS-IINN), Natal, Brazil; present address (R.C.): University of Sao Paulo School of Medicine, Sao Paulo, Brazil. Correspondence to M.A.L.N.
  • Haptic Navigation Aids for the Visually Impaired
    DOCTORAL THESIS. Daniel Innala Ahlmark, Haptic Navigation Aids for the Visually Impaired. Department of Computer Science, Electrical and Space Engineering, Division of EISLAB, Luleå University of Technology, 2016. ISSN 1402-1544. ISBN 978-91-7583-605-8 (print). ISBN 978-91-7583-606-5 (pdf). Supervisors: Kalevi Hyyppä, Jan van Deventer, Ulrik Röijezon. Abstract: Assistive technologies have improved the situation in society for visually impaired individuals. The rapid development of the last decades has made both work and education much more accessible. Despite this, moving about independently is still a major challenge, one that at worst can lead to isolation and a decreased quality of life. To aid in the above task, devices exist to help avoid obstacles (notably the white cane), and navigation aids such as accessible GPS devices. The white cane is the quintessential aid and is much appreciated, but solutions trying to convey distance and direction to obstacles further away have not made a big impact among the visually impaired. One fundamental challenge is how to present such information non-visually.
  • Visuospatial Assistive Device Art: Expanding Human Echolocation Ability
    Visuospatial Assistive Device Art: Expanding Human Echolocation Ability. Aisen Carolina Chacin. Ph.D. Program in Empowerment Informatics, School of Integrative and Global Majors, University of Tsukuba, March 2018. Abstract: This study investigates human echolocation through sensory substitution devices for the augmentation of visuospatial skills, thus replacing vision through acoustic and haptic interfaces. These concepts are tested through two case studies: Echolocation Headphones and IrukaTact. The Echolocation Headphones are a pair of goggles that disable the participant's vision. They emit a focused sound beam which activates the space with directional acoustic reflection. The directional properties of this parametric sound provide the participant a focal echo. This directionality is similar to the focal point of vision, giving the participant the ability to selectively gain an audible imprint of their environment. This case study analyzes the effectiveness of this wearable sensory extension for aiding auditory spatial location in three experiments: optimal sound type and distance for object location, perceptual resolution by just noticeable difference, and goal-directed spatial navigation for open pathway detection. The second case study is the IrukaTact haptic module, a wearable device for underwater tactile stimulation of the fingertips with a jet-stream of varying pressure. This haptic module utilizes an impeller pump which suctions surrounding water and propels it onto the volar pads of the finger, providing a hybrid feel of vibration and pressure stimulation. IrukaTact has an echo-haptic utility when it is paired with a sonar sensor, linearly transposing its distance data into tactile pressure feedback.
  • Electrocorticography Evidence of Tactile Responses in Visual Cortices
    UvA-DARE (Digital Academic Repository). Electrocorticography Evidence of Tactile Responses in Visual Cortices. Gaglianese, A.; Branco, M.P.; Groen, I.I.A.; Benson, N.C.; Vansteensel, M.J.; Murray, M.M.; Petridou, N.; Ramsey, N.F. Final published version, publication date 2020, CC BY license. ORIGINAL PAPER. Citation: Gaglianese, A., Branco, M. P., Groen, I. I. A., Benson, N. C., Vansteensel, M. J., Murray, M. M., Petridou, N., & Ramsey, N. F. (2020). Electrocorticography Evidence of Tactile Responses in Visual Cortices. Brain Topography, 33(5), 559-570. https://doi.org/10.1007/s10548-020-00783-4
  • Sensory Substitution and the Human-Machine Interface
    Trends in Cognitive Sciences Vol. 7 No. 12, December 2003. Sensory substitution and the human-machine interface. Paul Bach-y-Rita (Departments of Orthopedics and Rehabilitation Medicine, and Biomedical Engineering, University of Wisconsin, Madison WI 53706, USA) and Stephen W. Kercel (Endogenous Systems Research Group, New England Institute, University of New England, Biddeford ME 04005, USA). Recent advances in the instrumentation technology of sensory substitution have presented new opportunities to develop systems for compensation of sensory loss. In sensory substitution (e.g. of sight or vestibular function), information from an artificial receptor is coupled to the brain via a human-machine interface. The brain is able to use this information in place of that usually transmitted from an intact sense organ. Both auditory and tactile systems show promise for practical sensory substitution interface sites. This research provides experimental tools for examining brain plasticity and has implications for perceptual and cognition studies more generally. These advances have led to the possibility of new prosthetic devices being potentially accessible at much lower cost to millions of patients. This is stimulating the interest of both research groups and industry, leading to the establishment of new research and development efforts in many countries. This recent explosion of interest in sensory substitution suggests that now is a good time to review progress in the area. Brain plasticity can be defined as the adaptive capacities of the central nervous system: its ability to modify its own structural organization and functioning. It permits an adaptive (or a maladaptive) response to functional demand. Persons who become blind do not lose the capacity to see. Mechanisms of brain plasticity include neurochemical,
  • Robustness in Neurological Systems 13 - 15 November, 2015 Center for Philosophy of Science
    Robustness in Neurological Systems, 13-15 November 2015, Center for Philosophy of Science. ABSTRACTS. Alison Barth, Professor, Department of Biology, and Head, Barth Lab, Carnegie Mellon University. Functional Connectivity is Regulated by SOM Interneurons in Spontaneous Activity. Understanding the dynamic range for synaptic transmission is a critical component of building a functional circuit diagram for the mammalian brain. We find that excitatory synaptic strength between neocortical neurons is markedly suppressed during network activity in mouse somatosensory cortex, with smaller EPSP amplitudes and higher failure rates than previously reported. This phenomenon is regulated by tonic activation of presynaptic GABAb receptors via the activity of somatostatin-expressing interneurons. Optogenetic suppression of somatostatin neural firing was sufficient to enhance EPSP amplitude and reduce failure rates, effects that were fully reversible and also occluded by GABAb antagonists. These data indicate that somatostatin-expressing interneurons can rapidly and reversibly rewire neocortical networks through synaptic silencing, and suggest a critical role for these neurons in gating perception and plasticity. Emilio Bizzi, Department of Brain and Cognitive Sciences and McGovern Institute for Brain Research, MIT. Muscle Synergies: Concept, Principles, and Potential Use in Neurorehabilitation. When the central nervous system (CNS) generates voluntary movement, many muscles, each comprising thousands of motor units, are simultaneously activated and coordinated. Computationally, this is a daunting task, and investigators have striven to understand whether and how the CNS's burden is reduced to a much smaller set of variables. In the last few years, we and our collaborators have searched for physiological evidence of simplifying strategies by exploring whether the motor system makes use of motor modules to construct a large set of movements.
  • Parahippocampal Cortex Is Involved in Material Processing Via Echoes in Blind Echolocation Experts
    Vision Research 109 (2015) 139-148. Parahippocampal cortex is involved in material processing via echoes in blind echolocation experts. Jennifer L. Milne (a), Stephen R. Arnott (b), Daniel Kish (c), Melvyn A. Goodale (a), Lore Thaler (d). a The Brain and Mind Institute, The University of Western Ontario, London, Ontario, Canada; b The Rotman Research Institute, Baycrest, Toronto, Ontario, Canada; c World Access for the Blind, Encino, CA, United States; d Department of Psychology, Durham University, Durham, United Kingdom. Received 1 May 2014; received in revised form 1 July 2014; available online 30 July 2014. Keywords: Texture, Vision, Audition, Multisensory. Abstract: Some blind humans use sound to navigate by emitting mouth-clicks and listening to the echoes that reflect from silent objects and surfaces in their surroundings. These echoes contain information about the size, shape, location, and material properties of objects. Here we present results from an fMRI experiment that investigated the neural activity underlying the processing of materials through echolocation. Three blind echolocation experts (as well as three blind and three sighted non-echolocating control participants) took part in the experiment. First, we made binaural sound recordings in the ears of each echolocator while he produced clicks in the presence of one of three different materials (fleece, synthetic foliage, or whiteboard), or while he made clicks in an empty room. During fMRI scanning these recordings were played back to participants.
  • Durham Research Online
    Durham Research Online. Deposited in DRO: 25 August 2016. Version of attached file: Accepted Version. Peer-review status: Not peer-reviewed. Citation: Thaler, Lore (2015) 'Using sound to get around - discoveries in human echolocation.', Observer, 28 (10). Further information on publisher's website: http://www.psychologicalscience.org/index.php/publications/observer/2015/december-15/using-sound-to-get-around.html. Author Bio: Lore Thaler is a lecturer at Durham University, United Kingdom. Her research focuses on human echolocation and vision. She can be contacted at [email protected]. Using Sound to Get Around: Discoveries in Human Echolocation. By Lore Thaler. The sight of a blind person snapping her fingers, making clicking sounds with her tongue, or stomping her feet might draw stares on a street or in a subway station, but it's the type of behaviour that is opening up a vibrant area of research in psychology.
  • Audioviewer: Learning to Visualize Sound
    AudioViewer: Learning to Visualize Sound. Yuchi Zhang. A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science, The Faculty of Graduate and Postdoctoral Studies, The University of British Columbia (Vancouver), December 2020. © Yuchi Zhang, 2020. Supervisor: Helge Rhodin; examining committee member: Kwang Moo Yi. Abstract: Sensory substitution can help persons with perceptual deficits. In this work, we attempt to visualize audio with video. Our long-term goal is to create sound perception for hearing impaired people, for instance, to facilitate feedback for training deaf speech. Different from existing models that translate between speech and text or text and images, we target an immediate and low-level translation that applies to generic environment sounds and human speech without delay. No canonical mapping is known for this artificial translation task. Our design is to translate from audio to video by compressing both into a common latent space with a shared structure. Our core contribution is the development and evaluation of learned mappings that respect human perception limits and maximize user comfort by enforcing priors and combining strategies from unpaired image translation and disentanglement.
    We demonstrate qualitatively and quantitatively that our AudioViewer model maintains important audio features in the generated video, and that generated videos of faces and numbers are well suited for visualizing high-dimensional audio features since they can easily be parsed by humans to match and distinguish between sounds, words, and speakers.
  • Brain Uploading-Related Group Mind Scenarios
    MIRI Machine Intelligence Research Institute. Coalescing Minds: Brain Uploading-Related Group Mind Scenarios. Kaj Sotala, University of Helsinki, MIRI Research Associate; Harri Valpola, Aalto University. Abstract: We present a hypothetical process of mind coalescence, where artificial connections are created between two or more brains. This might simply allow for an improved form of communication. At the other extreme, it might merge the minds into one in a process that can be thought of as a reverse split-brain operation. We propose that one way mind coalescence might happen is via an exocortex, a prosthetic extension of the biological brain which integrates with the brain as seamlessly as parts of the biological brain integrate with each other. An exocortex may also prove to be the easiest route for mind uploading, as a person's personality gradually moves away from the aging biological brain and onto the exocortex. Memories might also be copied and shared even without minds being permanently merged. Over time, the borders of personal identity may become loose or even unnecessary. Citation: Sotala, Kaj, and Harri Valpola. 2012. "Coalescing Minds: Brain Uploading-Related Group Mind Scenarios." International Journal of Machine Consciousness 4 (1): 293-312. doi:10.1142/S1793843012400173. This version contains minor changes. Introduction: Mind uploads, or "uploads" for short (also known as brain uploads, whole brain emulations, emulations or ems) are hypothetical human minds that have been moved into a digital format and run as software programs on computers. One recent roadmap charting the technological requirements for creating uploads suggests that they may be feasible by mid-century (Sandberg and Bostrom 2008).