Durham Research Online
Deposited in DRO: 25 August 2016
Version of attached file: Accepted Version
Peer-review status of attached file: Not peer-reviewed
Citation for published item: Thaler, Lore (2015) 'Using sound to get around - discoveries in human echolocation.', Observer, 28 (10).
Further information on publisher's website: http://www.psychologicalscience.org/index.php/publications/observer/2015/december-15/using-sound-to-get-around.html

Author Bio: Lore Thaler is a lecturer at Durham University, United Kingdom. Her research focuses on human echolocation and vision. She can be contacted at [email protected].

Using Sound to Get Around
Discoveries in Human Echolocation
By Lore Thaler

The sight of a blind person snapping her fingers, making clicking sounds with her tongue, or stomping her feet might draw stares on a street or in a subway station, but it is the type of behaviour that is opening up a vibrant area of research in psychology. These vision-impaired individuals are using echolocation, the same navigational technique used by bats and some marine mammals: they are learning about the objects in their environment from the echoes that bounce off them.

Echolocation is not only a fascinating subject in its own right but also a suitable paradigm for studying neuroplasticity from several disciplinary perspectives. It is a technique that people (not only blind but also sighted) can learn relatively easily, and it can be used to probe how the brain deals with novel sensory information.

A General Ability

At one time, echolocation in humans was referred to as “facial vision” or “obstacle sense.” In fact, the term echolocation was coined by the zoologist Donald Griffin only in 1944. Initially, the ability to detect obstacles without vision was considered a special skill of a few blind people. Scientists were unsure how it worked, specifically whether the detection of obstacles without vision was mediated by pressure waves on the skin or by sound. A set of experiments conducted in the 1940s showed that sound and hearing were the driving factors. A video of some of these early experiments is available at http://vlp.mpiwg-berlin.mpg.de/library/data/lit39549. Subsequent research showed that both blind and sighted people can develop the skill to avoid obstacles without vision, as long as they have normal hearing. In sum, these studies showed that the “obstacle sense” was not a mysterious skill that only some blind people possessed, but a general human ability. To discover your inner bat and listen to some human echolocation sound clips, go to supplemental audio clips 1-6.

Beyond Obstacle Detection

Initially, echolocation research focused mainly on the detection of obstacles, but subsequent studies progressed to measuring people’s ability to echolocate distance, direction, shape, material, motion, and size. Most studies used “categorical tasks,” which measure participants’ ability to identify something from a limited number of alternatives. In the 1960s, Winthrop Kellogg introduced the psychophysical method to human echolocation research, making more fine-grained measurements of people’s echolocation abilities possible. Researchers have since used psychophysical methods to measure people’s echolocation of location (direction and distance) and of size.
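To give a flavor of what such an adaptive psychophysical procedure looks like, here is a minimal sketch in Python. It is illustrative only: the two-alternative forced-choice task, the simulated observer, and every parameter are invented for the example rather than drawn from any of the studies mentioned above.

```python
import math
import random

def simulated_observer(level, threshold=2.0, slope=1.5):
    """Stand-in for a participant in a hypothetical two-alternative
    forced-choice echolocation task: the probability of a correct
    response rises from chance (0.5) toward 1.0 as the stimulus
    level (e.g., the size of a reflecting object) increases."""
    p_detect = 1.0 / (1.0 + math.exp(-slope * (level - threshold)))
    return random.random() < 0.5 + 0.5 * p_detect

def run_staircase(start=8.0, step=1.0, n_reversals=8):
    """2-down/1-up adaptive staircase: lower the level after two
    consecutive correct responses, raise it after every error. This
    rule converges on the ~70.7%-correct point of the psychometric
    function, giving a continuous threshold estimate."""
    level = start
    correct_in_row = 0
    last_direction = None      # -1 = last change made the task harder
    reversal_levels = []
    while len(reversal_levels) < n_reversals:
        if simulated_observer(level):
            correct_in_row += 1
            if correct_in_row < 2:
                continue       # wait for two correct in a row
            correct_in_row = 0
            direction = -1     # make the task harder
        else:
            correct_in_row = 0
            direction = +1     # make the task easier
        if last_direction is not None and direction != last_direction:
            reversal_levels.append(level)   # record reversal points
        last_direction = direction
        level = max(level + direction * step, step)
    return sum(reversal_levels) / len(reversal_levels)

print(f"Estimated threshold: {run_staircase():.2f}")
```

Averaging the stimulus levels at the reversal points yields a threshold on a continuous scale, which is what makes psychophysical procedures more fine-grained than scoring a fixed set of categorical alternatives.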
Scientists led by Bo Schenkman (Royal Institute of Technology, Stockholm, Sweden) and by Daniel Rowan (University of Southampton, UK) have also made progress in identifying the acoustic features that may be relevant for human echolocation. So far, however, this research has focused on echoes from longer white-noise signals rather than on the mouth clicks that people actually make (see the section “The Sonar Emission”).

While most studies have treated the echolocating person as a passive “perceiver,” echolocation is an active process: in daily life, people move their bodies and heads while they echolocate. Studies have shown that bats “steer their sound beam” to sample the environment during echolocation; similarly, recent investigations by our own group and by the neurobiologist Ludwig Wallmeier (Ludwig-Maximilians-University of Munich, Germany) and colleagues emphasize that movement can be an essential component of successful human echolocation.

The Sonar Emission

In early research, the sounds (i.e., sonar emissions) that people made to generate echoes were not systematically controlled; they included talking, humming, mouth clicks, footsteps, cane-tapping sounds, and other noises. In a study published in 2009, however, researchers led by Juan Antonio Martinez Rojas (University of Alcalá, Spain) analyzed the physical properties of various sounds and concluded that mouth clicks might be particularly useful for human echolocation because they are highly reproducible (i.e., the sound is quite stable across repeated emissions). In addition, the spatial relationship between mouth and ears is fixed, in contrast to ambient sound, footsteps, or cane taps. Because of these factors, people can attribute variations in the audible sound to changes in the environment rather than to changes in the emitted sound itself. The majority of recent investigations into human echolocation accordingly examine mouth-click-based echolocation. Clicks tend to be transients 3–15 milliseconds long, with peak frequencies around 6–8 kilohertz.

Echo-suppression

The human auditory system typically shows a phenomenon termed echo-suppression: when a person hears two sounds in rapid succession, the resulting percept is driven by the first of the two. This is also referred to as the precedence effect. Wallmeier and colleagues have suggested that echo-suppression is reduced during echolocation compared with “regular” spatial hearing. The underlying mechanisms are unclear at present.
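Both of these ideas, the click itself and the lead-lag pairs used to study echo-suppression, can be made concrete in a few lines of Python. The sketch below is an illustration only, not a stimulus from any published study: the article specifies just the duration and peak-frequency ranges of mouth clicks, so the damped-sinusoid waveform, the 3-millisecond echo delay, and the 0.5 attenuation factor are assumptions made for the example.

```python
import numpy as np

FS = 44_100  # audio sample rate in Hz

def synth_click(duration_ms=5.0, peak_hz=7_000.0, fs=FS):
    """Synthesize a click-like transient: a sine carrier at the peak
    frequency under a fast exponential decay. Duration and peak
    frequency fall in the ranges reported for human mouth clicks
    (roughly 3-15 ms and 6-8 kHz); the exact waveform is a guess."""
    n_samples = int(fs * duration_ms / 1000.0)
    t = np.arange(n_samples) / fs
    envelope = np.exp(-t / (duration_ms / 1000.0 / 5.0))  # decays to ~e^-5
    return np.sin(2.0 * np.pi * peak_hz * t) * envelope

def click_with_echo(delay_ms=3.0, attenuation=0.5, fs=FS):
    """Build a lead-lag pair of the kind used in precedence-effect
    research: the direct click followed by a quieter, delayed copy
    standing in for an echo. A 3 ms lag corresponds to a reflecting
    surface roughly half a meter away (round trip at ~343 m/s)."""
    click = synth_click(fs=fs)
    delay = int(fs * delay_ms / 1000.0)
    signal = np.zeros(delay + 2 * len(click))
    signal[:len(click)] += click                             # lead (direct sound)
    signal[delay:delay + len(click)] += attenuation * click  # lag (echo)
    return signal

stimulus = click_with_echo()
```

At lags of a few milliseconds, a listener showing ordinary echo-suppression would typically perceive a single fused click located at the lead sound; the suggestion by Wallmeier and colleagues is that, during active echolocation, the lag contributes more to the percept than this default would predict.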
The Neurobiology of Echolocation

To date, evidence about the brain areas involved in human echolocation comes from studies using neuroimaging methods such as PET and fMRI. In 1999, neuroscientist Anne G. DeVolder (Catholic University of Louvain, Belgium) and colleagues used PET to measure activity in blind and sighted people’s brains while they used an echolocation-based sensory substitution device. The device consisted of a pair of spectacles equipped with an ultrasound speaker, two ultrasonic microphones, two earphones, and a processing unit; it picked up ultrasonic echoes and converted them into audible sounds delivered through the earphones. The pitch of the audible sound conveyed distance, and its binaural intensity balance conveyed direction. In the group of blind subjects, processing of sound from the device was associated with increased brain activity in Brodmann area (BA) 17/18 (i.e., early “visual” cortex). Although the subjects in this study did not echolocate per se, this was the first evidence to suggest that information derived from echolocation may drive early visual cortex in blind people.

Encouraged by these findings, my colleagues and I conducted, in 2011, the first study to measure brain activity during echolocation, in two blind people trained in mouth-click-based echolocation. Using fMRI, we found that while listening to echolocation sounds, as compared with control sounds, both participants showed a significant increase in brain activity in BA17. In this and subsequent studies, we also found that echo-related activity in BA17 is stronger for echoes coming from contralateral space (i.e., a contralateral preference), and that the activity pattern changes as the echoes move away from the center toward the periphery of space (i.e., modulation with eccentricity). A recent fMRI study by Wallmeier and colleagues has since confirmed the involvement of BA17 in echolocation in blind people.

My colleagues and I have also found that echo motion activates brain areas that might coincide with the visual motion area MT+, and that the shape of echolocated surfaces might activate the lateral occipital complex (LOC), a brain area thought to be involved in the visual processing of shape. We have further found that both blind and sighted people show activation in posterior parietal cortex while echolocating path direction for walking, and that the location of this activation might coincide with areas involved in the processing of vision for motor action. In sum, although there are only a few studies to date on the neural substrates of natural echolocation, it is increasingly evident that traditionally “visual” brain areas are involved during echolocation in blind echolocation experts, and that this activation appears to be feature specific.

Echolocation and Blindness

The literature to date suggests that blind people are more sensitive to acoustic reverberations even when they do not echolocate.