© 2019 WUPJ, May 2019, Volume 7

Multisensory Integration of Echolocation and Vision in Mammals

Rebecca E. Whiley and Rachel M. Day*

* Initially submitted for Psychology 4190 at the University of Western Ontario. Both authors contributed equally to this article. For inquiries regarding the article, please email the authors at [email protected] and [email protected].

There is limited research discussing comparative multisensory integration in echolocating mammals. Multisensory integration is the combination of redundant stimulus information from multiple sensory modalities. Through multisensory integration, sensory information from multiple modalities can influence an animal's or human's behaviour. The current paper reviews the experimental findings of studies in bats, toothed whales, and humans that test or indirectly suggest multisensory integration of echolocation, vision, and equilibrioception. We focus on these sensory modalities because they dominate the current literature on multisensory integration with echolocation. Experimental evidence to date strongly supports the importance of multisensory integration of echolocation with vision and/or equilibrioception for processes such as navigation, object recognition, and social communication. Additionally, we discuss opportunities for further research and the importance of comparative approaches for directing studies in humans.

Keywords: multisensory integration, neuroethology, echolocation, vision, mammals

Multisensory integration is the process of synthesizing redundant information from multiple sensory modalities, which often leads to enhanced detection and discrimination of stimuli, as well as reduced reaction times (Noel, Niear, Burg, & Wallace, 2016; Stein & Stanford, 2008). It is well established that sensory information from one modality can influence perception through another, and that information from different modalities is weighted to achieve an estimate with maximum reliability (Ernst & Banks, 2002). Multisensory integration is most likely to occur when cross-modal stimuli are temporally and spatially close together, and also semantically congruent (Spence, 2011).

Although multisensory integration is the foundation of sensory perception in many contexts, the role of echolocation in multisensory integration has not been well evaluated because it is a sense that most humans do not experience (Rosenblum, Dias, & Dorsi, 2017). Echolocation is the use of self-generated sounds that produce feedback to provide a spatial representation of the external environment (Vercillo, Milne, Gori, & Goodale, 2015). Well-known animal echolocators include bats (Dechmann & Safi, 2005) and toothed whales (Kremers et al., 2016), and to a lesser extent some birds (Brinkløv, Elemans, & Ratcliffe, 2017; Coles, Konishi, & Pettigrew, 1987), terrestrial mammals (Forsman & Malmquist, 1988), and even humans (Thaler & Foresteire, 2017).

Most early research on echolocation focused on animal models, and these continue to make up the majority of the available research on the topic (Rosenblum et al., 2017). More recent studies focus on applying knowledge from animal models to human studies of echolocation (Rosenblum et al., 2017). The majority of research on multisensory integration with echolocation has focused on vision, possibly because of the dominance of the visual system in humans (Alais, Newell, & Mamassian, 2010). In this review of differences in abilities, study techniques, and areas for future research in echolocating mammals, we highlight the importance of multisensory integration for sensory perception as well as the role of comparative research for directing future sensory studies in humans.
Multisensory Integration in Animal Echolocators

Animal echolocation was first described in bats (order Chiroptera) by Pierce and Griffin (1938). Bat echolocation has been a highly active area of research for almost eighty years, and current research interests are expanding into the field of multisensory integration (Rosenblum et al., 2017). Echolocation in toothed whales (parvorder Odontoceti) was described in the mid-twentieth century (McBride, 1956), and multisensory integration and cross-modal sensory abilities have been well studied in some species (Kremers et al., 2016).

Bats (Geva-Sagiv, Las, Yovel, & Ulanovsky, 2015) and toothed whales (Harley & DeLong, 2008) can use echolocation to compute target distance by comparing the time between sound emission and the returning echo, and to compute direction by comparing the sounds arriving at each ear. Echolocation can also be used to measure target velocity and infer the detailed shape and texture of objects (Geva-Sagiv et al., 2015). Animals can rapidly adapt their echolocation emissions to reflect their task and attention to specific environmental stimuli (Harley & DeLong, 2008).
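As a rough illustration of the echo-ranging arithmetic (a sketch for orientation only, not drawn from the studies cited above), target distance d follows from the delay Δt between emission and the returning echo, and the two-way Doppler shift Δf relates an emitted frequency f to the relative approach velocity v of the target, where c is the speed of sound:

    d = (c × Δt) / 2        Δf ≈ (2v / c) × f

The division by two reflects the sound's round trip out to the target and back. Assuming c ≈ 340 m/s in air, an echo returning 20 ms after emission would place the target roughly 3.4 m away.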
Echolocators are excellent models for studying multisensory integration because they are active sensors (Rosenblum et al., 2017). Active sensation requires an organism to make an output to gain sensory information, which is usually a vocal output in echolocators (Measor et al., 2017).

Order Chiroptera

The order Chiroptera, the bats, is a highly dynamic mammalian order with extensive geographic distribution, occupying diverse habitats and dietary niches (Yovel & Greif, 2018). There are over 1,300 species of bats, which vary widely in their morphological, sensory, and behavioural adaptations (Yovel & Greif, 2018). Chiroptera is traditionally divided into two suborders: Microchiroptera, the microbats, and Megachiroptera, the megabats (Fenton, 2013). Microbats are laryngeal echolocators, generating biosonar sounds via the larynx and emitting the sounds through their open mouths or, in rare circumstances, their noses (Boonman, Bumrungsri, & Yovel, 2014). Most megabats do not echolocate, with the exception of the genus Rousettus, also known as "fruit bats," which echolocate through clicks made with their tongues or wings (Boonman et al., 2014). Microbats have small eyes and large ears with a tragus, which is likely essential for echolocation (Fenton, 2013), while megabats have larger eyes and small ears (Horowitz, Cheney, & Simmons, 2004). As a result of differences in behaviour and physiology, the two suborders differ in their sensory abilities.

Sensory perception and multisensory integration in bats. Although audition (both active echolocation and passive listening) is the dominant sensory system in bats, they have other well-known mammalian systems including vision, somatosensory perception, vestibular perception, and chemoreception (Yovel & Greif, 2018). They also have magnetoreception, the ability to perceive a magnetic field (Beetz, 2017). The auditory system of bats is highly specialized for sensing and interpreting echolocation calls and confers many critical adaptive benefits (Fenton, 2013).

Behavioural studies have established that bats use multimodal sensing in a variety of contexts (reviewed in Beetz, 2017), but it is important to improve understanding of how different sensory systems integrate. Most research on multisensory integration in bats focuses on navigation and localization, object detection, and social interactions. Recent research in many bat species indicates that multisensory integration of cues from echolocation and other modalities provides more accurate information to coordinate their highly dynamic behaviours (Rosenblum et al., 2017).

Echolocation and vision. Following echolocation, vision is the second most studied sensory system in bats (Fenton, 2013). These sensory modalities provide complementary information because vision allows long-range detection of larger objects and better angular resolution, while echolocation allows detection of small objects with very high accuracy (Boonman, Bar-On, Cvikel, & Yovel, 2013; Yovel & Greif, 2018). By integrating visual and auditory information, bats effectively navigate complex environments, avoid obstacles, and detect food.

Numerous behavioural studies indicate that microbats constantly benefit from the integration of vision and echolocation (Boonman et al., 2013). In a study with free-living little brown bats (Myotis lucifugus), echolocation was their primary sensory modality for orientation, but the bats still relied on visual cues (Orbach & Fenton, 2010). Orbach and Fenton (2010) determined that the bats benefited from multisensory integration because their performance in an obstacle avoidance task worsened with reduced light …

… is large and conspicuous. Boonman and colleagues (2013) compared the range of vision and echolocation in two different species of microbats (Rhinopoma microphyllum and Pipistrellus kuhlii) in a prey detection task. The bats continually integrated cues from the two sensory modalities, searching for prey using echolocation while visually monitoring the nearby background targets or monitoring self-location relative to other landmarks (Boonman et al., 2013). More research is necessary to understand the conditions in which bats vocalize less and rely on the multisensory integration of vision and echolocation.

For megabats, vision is the preferred modality for navigation because of their high visual acuity and angular resolution (Thiele, Rübsamen, & Hoffmann, 1996). Danilovich and colleagues (2015) studied Egyptian fruit bats (Rousettus aegyptiacus) to determine the extent to which one sensory modality modulates information acquired within another sensory modality. Sensory information from vision influences sensory