Naturally occurring auditory-visual synesthesia experienced under dark adaptation

Anupama Nair¹,²

Advisor: David Brang, PhD¹
Co-assessor: Romke Rouw, PhD²

¹University of Michigan, ²University of Amsterdam

Abstract

Synesthesia is a perceptual phenomenon in which stimulation of one sensory modality evokes additional experiences in an unrelated modality (e.g., sounds evoking colors). This condition is thought to arise from increased connectivity between associated sensory areas. However, non-synesthetes can experience these sensations via hallucinogens or as a result of brain damage, raising the possibility that synesthesia exists as a latent feature in all individuals, manifesting only when the balance of activity across the senses has been altered. Indeed, multisensory connections that support the processing of dynamic auditory, visual, and tactile information in the environment are present in all individuals, but it is thought that inhibition of these pathways and the presence of dominant bottom-up information prevents normal multisensory interactions from evoking the subjective experience of synesthesia. The present research explores the conditions necessary to evoke auditory-visual synesthetic experiences in non-synesthetes. First, subjects performed a visual-imagery task in a visually deprived environment while simultaneously being presented with startling sounds from two spatial locations at random, infrequent intervals. The visual-imagery task served to increase top-down feedback to early visual areas, and previously conducted pilot studies had found startling sounds to be especially effective in over-stimulating the multisensory network present in all individuals. Visual synesthetic percepts, evoked by startling sounds, were observed in ~60% of our non-synesthetic subjects across several behavioural experiments.
To identify the neural correlates of this phenomenon, we conducted an EEG study to explore differences in early visual areas for trials in which participants experienced hallucinatory percepts versus trials in which they reported no such experiences. The EEG signals reflected a difference in average ERP activity between the two conditions within 100 ms of sound exposure, implying differential visual cortex activation in the presence of hallucinatory experiences versus their absence. Across all experiments, subjects reported seeing visual images (vivid colors and Klüver's form constants) localized to the position of the speaker. These results indicate a higher prevalence of synesthetic experiences in the general population and a link to normal multisensory processes.

INDEX

Introduction
EXPERIMENT I
  1. Methods
  2. Phase I
  3. Phase II
  4. Analysis procedure
  5. Results
EXPERIMENT II
  1. Methods
  2. Results and analysis
EXPERIMENT III
  1. Methods
  2. Results and analysis
EEG STUDY
  1. Methods
  2. Behavioural data results
  3. EEG results and analysis
     a. Event Related Potentials (ERPs)
     b. Spectral-power analysis
     c. Phase analysis
Discussion
Conclusion
References
Appendix

Introduction

Interactions between different sensory modalities have been a topic of avid interest in psychology and other sciences, with a substantial amount of research devoted to discovering possible connections between the auditory and visual systems in humans and animals. Auditory and visual stimuli presented concurrently or in close proximity interact in unique ways to produce a unified sensory experience. For example, the 'ventriloquism effect' demonstrates the dominance of visual cues over auditory ones in localization tasks (Choe, Welch, Gilford & Juola, 1975), and sheds light on how two temporally or spatially discrepant senses interact with or supersede each other to produce a coherent, unified experience.
Similar studies using pairs of discrepant visual, auditory, and proprioceptive information have been conducted by Pick, Warren & Hay (1969), who were interested in determining the biasing influence of one modality on another; these researchers found visual information to exert biasing effects on the localization of auditory stimuli. Similarly, Bertelson and Aschersleben (1998) report that a sound can be mislocalized (or dragged across space) to coincide with a visual target in a dark room, even with instructions to disregard the visual stimulus. In contrast, the beep-flash or "fission" illusion shows how auditory cues gain precedence over visual cues and possibly alter visual perception in a temporally close context (Innes-Brown & Crewther, 2009). That is, multiple beeps presented in close succession can induce the detection of multiple 'illusory' flashes, even if only a single flash is presented (Shams, Kamitani, & Shimojo, 2000). These studies highlight the conditions under which less dominant sensory information can interact with and influence information presented through purportedly dominant sense modalities. Neurophysiological studies conducted on animals have indicated that auditory stimuli can modulate visual cortex activity alongside active visual stimulation (Brang, Towle, Suzuki et al., 2015). For example, Allman and Meredith (2007) found auditory stimuli to exert multimodal influences on otherwise visually-responsive neurons of the cat posterolateral lateral suprasylvian (PLLS) visual area, but only in the presence of other visual stimuli. Recent research has also suggested a mirror effect in humans, with auditory stimuli capable of modulating activity evoked by visual stimulation in early visual cortex (Mercier et al., 2013).
In line with this finding, McDonald and colleagues (2013) found that peripherally presented salient sounds can activate contralateral occipital neurons; the associated ERP component was termed the auditory-evoked contralateral occipital positivity (ACOP). They found the ACOP to arise from ventral visual areas in the occipital cortex, which is also the source of the visually evoked P1 component. Moreover, the lateral-sound-induced ACOP resembled the response generated by visual stimuli, strengthening the argument for multimodal influences on so-called unimodal neurons. Thus, their findings point to an enhanced activation of the contralateral visual cortex in response to peripherally presented sounds, suggesting that these sound-related visual effects are experienced on the ipsilateral (to the sound) side. Research has also found a facilitative effect of sounds on visual processing under certain conditions. For example, simultaneously presented auditory stimuli can enhance visual sensitivity to targets, specifically low-intensity targets (Noesselt et al., 2010). Some research has also reported faster perceptual and motor responses to visual targets in the presence of concurrent auditory stimulation (Cappe et al., 2010; Brang et al., 2013). For example, Miller (1982) describes the "redundant signals effect", which posits quicker reaction times to bimodal signals ("redundant signal trials") than to unimodal signals ("single signal trials") and could explain enhanced visual processing with concurrent auditory stimulation. The reverse effect is also reported in the literature, i.e., the enhancement of auditory stimulus processing by attending to a visual stimulus (Bulkin & Groh, 2006).
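Miller's (1982) redundant signals effect is typically tested against his race-model inequality: for every time t, the cumulative probability of a response by time t on bimodal trials must not exceed the sum of the unimodal cumulative probabilities, P(RT ≤ t | AV) ≤ P(RT ≤ t | A) + P(RT ≤ t | V); violations imply genuine coactivation rather than a simple race between modalities. A minimal sketch of this test, using entirely hypothetical reaction times (not data from this study):

```python
import numpy as np

def race_model_violation(rt_av, rt_a, rt_v, t_grid):
    """Return a boolean array marking time points where Miller's
    race-model inequality is violated, i.e. where
    P(RT <= t | AV) > P(RT <= t | A) + P(RT <= t | V)."""
    def cdf(rts, t):
        # Empirical cumulative distribution of reaction times at each t.
        return np.mean(np.asarray(rts)[:, None] <= t, axis=0)

    return cdf(rt_av, t_grid) > cdf(rt_a, t_grid) + cdf(rt_v, t_grid)

# Hypothetical reaction times (ms); bimodal trials are markedly faster.
rt_a  = [320, 340, 360, 380, 400]   # auditory-only trials
rt_v  = [330, 350, 370, 390, 410]   # visual-only trials
rt_av = [250, 260, 270, 280, 290]   # bimodal trials

t_grid = np.arange(200, 500, 10)
violated = race_model_violation(rt_av, rt_a, rt_v, t_grid)
print(t_grid[violated])  # time points implying coactivation
```

With these toy values the inequality is violated at early time points, where fast bimodal responses accumulate before either unimodal distribution has begun.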
Specifically, ERP and fMRI studies have shown that visual stimuli presented concurrently with auditory tones can enhance processing of the auditory stimulus, despite a discrepancy in signal locations, provided that the visual stimuli are being attended to (Busse, Roberts, Crist, Weissman, & Woldorff, 2005). These multisensory interactions occur as a function of three important rules. The spatial rule suggests more effective integration of multisensory stimuli originating from a seemingly common location (Meredith & Stein, 1985). The temporal rule states that multisensory stimuli presented at the same time, or at approximately the same time, tend to be integrated (Meredith & Stein, 1983). Studies have also shown that bimodal activation of certain neurons by stimuli presented in close temporal and spatial proximity can exceed the sum of the neuronal responses to each of the unimodal stimuli. In other words, stimuli coincident in space and within the same receptive fields can lead to enhancement of the neuronal response to the stimulus (Meredith & Stein, 1986; Frassinetti, Bolognini, & Làdavas, 2002). This enhancement is amplified when the unimodal stimuli would in turn have elicited relatively weak responses; this rule is called the law of inverse effectiveness (Meredith & Stein, 1983; Stein & Meredith, 1996). While multisensory connections exist in all individuals to facilitate sensory processing, certain individuals experience a hyperactive state of connections between certain sensory regions that often results in multimodal processing of unimodal stimuli. This condition, labeled "synesthesia", results in consistent multisensory perceptual experiences with otherwise unisensory stimuli (Aleman et al., 2001; Sagiv & Ward, 2006). According to Cytowic (1995), "synesthesia is the involuntary physical experience of a cross-modal association." While previously thought to be a relatively rare phenomenon, some studies have found synesthesia to occur more frequently than previously imagined.
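The enhancement and inverse-effectiveness rules above are commonly quantified as the percentage gain of the bimodal response over the most effective unimodal response, in the style of Meredith and Stein's enhancement index. A toy computation with hypothetical spike counts (a sketch to make the rules concrete, not data from this study):

```python
def enhancement_index(av, a, v):
    """Multisensory enhancement (%): gain of the bimodal response
    over the most effective unimodal response."""
    best_unimodal = max(a, v)
    return 100.0 * (av - best_unimodal) / best_unimodal

# Hypothetical mean spike counts per trial.
# Strong unimodal responses: a modest relative enhancement.
strong = enhancement_index(av=30.0, a=20.0, v=18.0)   # 50.0 %
# Weak unimodal responses: a proportionally larger enhancement,
# consistent with the law of inverse effectiveness.
weak = enhancement_index(av=6.0, a=2.0, v=1.5)        # 200.0 %

# Superadditivity: the bimodal response exceeds the unimodal sum.
superadditive = 6.0 > (2.0 + 1.5)
print(strong, weak, superadditive)
```

The weak-stimulus pair yields a far larger percentage gain than the strong pair even though its absolute bimodal response is smaller, which is precisely the pattern inverse effectiveness describes.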
For example, Simner et al. (2006) conducted two surveys testing for the prevalence of chromatic-grapheme synesthetic traits in normal individuals; the results from