Crossmodal studies
Martin Schürmann, School of Psychology, University of Nottingham
• multisensory and crossmodal processing in the normal brain
• compensatory plasticity
• experimental design
Review: Driver and Noesselt, Neuron 2008

Unimodal (unisensory) vs multisensory processing
• sensory processing extensively studied in cognitive neuroscience
• most research on just a single modality (vision, audition, touch)
• real world situations often stimulate several senses at the same time
  – audiovisual speech (dubbed movies)
  – exploring a texture with the hand (motor contribution!)
• multisensory aspects of social perception
  – see another person being touched

Misleading illusions vs multisensory estimates
• (spatial) ventriloquism
  – mislocalization of sound based on temporally coincident visual event at a distant location
• McGurk effect
  – perception of speech sounds biased by seen lip movement
• joint estimates of external properties
  – each modality weighted according to its reliability
  – but how to match input across modalities?

Multisensory processing: rules of integration (= research questions!)
Cat, superior colliculi: integration governed by ...
• spatial coherence
• temporal coherence
• inverse effectiveness
  – unreliable input in one modality
  – sensory deficits in one modality
Stein, Exp Brain Res 1998
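The reliability-weighted "joint estimates" above can be written down directly: if each modality's estimate carries independent Gaussian noise, weighting by inverse variance gives the minimum-variance combined estimate. A minimal sketch in Python; the function name `fuse` and all numbers are illustrative, not taken from the studies cited:

```python
# Minimum-variance cue combination: each modality's estimate is weighted
# by its reliability, i.e. the inverse of its noise variance.
# Illustrative sketch only; the estimates and sigmas are made-up numbers.

def fuse(estimates, sigmas):
    """Combine unimodal estimates weighted by reliability (1/sigma^2).

    Returns the joint estimate and its variance; the joint variance is
    never larger than that of the best single modality.
    """
    reliabilities = [1.0 / s**2 for s in sigmas]
    total = sum(reliabilities)
    joint = sum(r * e for r, e in zip(reliabilities, estimates)) / total
    return joint, 1.0 / total

# Ventriloquism-like case: vision localizes precisely, audition coarsely,
# so the joint location estimate is pulled toward the visual event.
loc, var = fuse([10.0, 14.0], [1.0, 2.0])  # visual at 10, auditory at 14
```

The fused estimate lands close to the more reliable (visual) cue, which is one way to describe spatial ventriloquism. The open question on the slide, how the brain matches inputs across modalities before fusing them, is not answered by this arithmetic.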

Multisensory processing: brain basis (= research question!)
The brain joins information from different sensory modalities to construct an "optimal" representation of the external world
– where in the brain?
– multisensory convergence zones ("heteromodal cortex")?
– or primary sensory cortices?
– cerebral cortex "essentially multisensory"?
Ghazanfar and Schroeder, Trends Cogn Sci 2006

Brain areas with a role in multisensory processing
Classical view: only heteromodal areas are multisensory
Modern view: multisensory input to heteromodal and unimodal areas
Calvert, Cereb Cortex 2001

Examples of crossmodal studies
Multisensory and crossmodal processing in the normal brain:
• audiovisual: speech and non-speech stimuli in MEG, fMRI
• audiotactile: vibrotactile stimuli in fMRI, MEG ...
Crossmodal plasticity after sensory deprivation:
• audiovisual: sight-reading in musicians (MEG study)
• visuotactile: TMS to visual cortex in blind subjects
• audiotactile: vibrotactile activation of auditory cortex in a deaf subject (MEG)
NB additional interaction with motor processes

Multisensory processing, example A1: audiovisual speech
Sams et al, Neurosci Lett 1991 (MEG)
• subjects watched video of actress articulating /pa/ and /ka/
• auditory stimulus always /pa/
• waveforms explained by equivalent current dipoles (ECDs) at the supratemporal cortex
• visual information may have access to auditory cortex

Multisensory processing, example A2: audiovisual speech
Calvert et al, Science 1997 (fMRI): lip-reading (numbers 1 to 10) activates auditory cortex
(purple: auditory; blue: lipreading; yellow: overlap)
Calvert et al, Curr Biol 2000 (fMRI):
• semantically concordant AV speech
• semantically discordant AV speech
• heard speech, seen speech, rest
• interested in areas where A > 0, V > 0, AVcon > A+V, AVdis < A+V
• only region that met all criteria was left STS
Calvert, Cereb Cortex 2001

Multisensory processing, example A3: audiovisual letters
Raij, Uutela, Hari, Neuron 2001 (MEG): integrating auditory (phonemic) and visual (graphemic) aspects of letters
• stimuli that have been associated through learning
• task: lift the left index finger as quickly and accurately as possible when a target stimulus occurs
• right temporo-parieto-occipital junction and left/right STS: stronger interaction for letters than controls
• only in left/right STS: stronger interaction for matching vs non-matching letters
• NB interaction was suppressive!
Interaction sources: grand averages (380-540 ms)
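The four criteria of Calvert et al. (2000) amount to a simple per-voxel screen: both unimodal responses positive, the congruent bimodal response superadditive, the incongruent one subadditive. A minimal sketch, assuming per-condition response estimates are already available; the voxel labels and values are invented for illustration:

```python
# Per-voxel screening with the Calvert et al. (2000) criteria:
# A > 0, V > 0, AVcon > A+V, AVdis < A+V.
# Hypothetical response values; a real analysis would use fMRI betas.

def is_integration_site(a, v, av_con, av_dis):
    return a > 0 and v > 0 and av_con > a + v and av_dis < a + v

voxels = {
    "left STS":   (1.0, 0.8, 2.5, 1.2),  # superadditive for congruent AV
    "control cx": (1.2, 0.1, 1.4, 1.3),  # fails the subadditivity test
}
sites = [name for name, resp in voxels.items() if is_integration_site(*resp)]
```

Because all four tests must pass in the same voxel, the screen is conservative, which is consistent with the slide's finding that only left STS survived.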

Multisensory processing, example A4: audiovisual, non-speech stimuli
Foxe et al, Cogn Brain Res 2000 (ERPs)
• 1000-Hz tones paired with median nerve stimuli (mild "electric shocks")
• somatosensory cx: 50 ms after onset
• auditory cx: 70-80 ms after onset
• supports "early integration in primary sensory areas"
Bushara et al, J Neurosci 2001 (PET)
• visual stimulus: circle on computer screen; auditory stimulus: 2000 Hz, 90 dB tones, both 100 ms
• detection of AV stimulus onset asynchrony activates insula and superior colliculus
Calvert et al, Neuroimage 2001 (fMRI)
• checkerboard reversal and 1000 ms white noise bursts
• if input synchronous, then activation in superior colliculus, insula, left STS, right IPS
• no extra activation in primary sensory areas

Identifying multisensory areas: intersection, conjunction, or interaction?
Calvert, Cereb Cortex 2001
Intersection: statistics for A gives map A, statistics for V gives map V, then A * V
• NB 1: possible coexistence of two separate sets of neurons in the same voxel (Type I error, "false positive")
• NB 2: might overlook an area where weak responses to unimodal stimuli are enhanced for bimodal stimuli (Type II error, "false negative")
Conjunction: statistics on combined A+V data; contrast unimodal stimuli with true bimodal stimuli: [AV-V] * [AV-A]
• ... but if AV is just A+V then this analysis is the same as A * V above
Better: 2 x 2 design with interaction contrast [AV-rest] – [(A-rest) + (V-rest)]
• no Type I error as in NB 1
• no need for unimodal responses to reach significance
• still difficult to interpret if BOLD responses to A, V are close to ceiling
• still difficult if one of the unimodal responses is suppressive (e.g. visual tasks depress responses in auditory cx)
• best use parametric design where intensity of unimodal stimuli is systematically varied
Speech stimuli (interaction method): shows a bimodal response that cannot be predicted from the sum of unimodal responses
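The interaction contrast from the 2 x 2 design, [AV-rest] – [(A-rest) + (V-rest)], reduces algebraically to AV – A – V + rest. A minimal sketch with invented condition means (not data from the cited studies):

```python
import numpy as np

# 2 x 2 interaction contrast [AV-rest] - [(A-rest) + (V-rest)],
# which algebraically reduces to AV - A - V + rest.
# Condition means below are invented for illustration.

def interaction(av, a, v, rest):
    av, a, v, rest = map(np.asarray, (av, a, v, rest))
    return (av - rest) - ((a - rest) + (v - rest))

# Three voxels: superadditive, purely additive, suppressive.
av   = np.array([3.0, 2.0, 1.5])
a    = np.array([1.0, 1.0, 1.0])
v    = np.array([1.0, 1.0, 1.0])
rest = np.array([0.0, 0.0, 0.0])
effect = interaction(av, a, v, rest)
```

Positive values flag superadditive voxels, zero purely additive ones, and negative values suppressive interactions such as the audiovisual-letter effect noted earlier.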

Model of brain areas contributing to AV integration
• STS: synthesis of linguistic information (note also role in communicative signals in general, such as perception of biological motion)
• IPS, superior colliculi: spatially coincident stimuli
• insula, superior colliculi: temporal coincidence of stimuli
Calvert, Cereb Cortex 2001

Feedback from convergence areas to primary cortical areas? – Example A5
Noesselt et al, J Neurosci 2007
• visual transients in the periphery of the subject's visual field (task: monitor fixation mark for changes in brightness)
• sound bursts either temporally coincident or non-coincident with visual transients
• for temporally coincident stimuli, activation in STS and in primary visual cx and primary auditory cx
• analyses of functional coupling suggest feedback from STS to primary areas

Multisensory processing, example A6: auditory cortex activated by visual stimuli
Mind's ear in a musician:
• audiovisual association
• trained musicians reading musical score may hear corresponding sounds in their mind's ear
• studied earlier: spatial pattern of brain activation identified, but no timing information (Sergent et al., 1992)
• goal: analyse time course of brain activation sites during visually triggered auditory imagery in musicians
Subjects: 11 experienced professional-level musicians
Stimuli: four different notes/tones, presented once every 1.5-2 s
MEG: 306-channel whole-head device
Source analysis: minimum current estimates (Uutela et al., 1999)
[Figure: visual without imagery vs visual with imagery; * marks an artefact (inferior surface of the model)]
Schürmann, Raij, Fujiki, Hari, NeuroImage 2002

Conclusion (example A6): MEG shows neural activation in visually triggered auditory imagery
– with well-defined time course in multiple areas
– may reflect recalling of firmly established audiovisual associations

Multisensory processing, example A7: audiotactile interaction
• persons with hearing impairment may perceive sounds (including speech) using their sense of touch
• audiotactile interaction not easily noticed in everyday life; needs experiments like the parchment-skin illusion
  Jousmäki & Hari, Curr Biol 1998; Guest, Catmur, Lloyd, Spence, Exp Brain Res 2002
• tactile stimuli with a "sound-like" temporal pattern
>> Objective: study brain activation by auditory and vibrotactile stimuli: shared brain resources?

Touch activates human auditory cortex: an fMRI study
Schürmann, Caetano, Hlushchuk, Jousmäki & Hari, NeuroImage 2006
Stimuli:
• vibrotactile to fingers and palm: 200-Hz, 500-ms vibrations, SOA 1500 ± 700 ms
• noise bursts in headphones: 500-ms bursts, SOA 1500 ± 700 ms
• touch to fingertips: air pressure pulses, SOA 300–500 ms
(≠ psychophysical study, http://www.ami.hut.fi)
13 subjects, 28.1 ± 1.8 years (22–39), all right-handed, 4 females
General Electric 3T scanner, Helsinki Univ of Technology
Preprocessing: SPM99 (Functional Imaging Laboratory, UCL)
• realignment to correct for movement during scans
• spatial normalization to standard brain
• smoothing kept to minimum; otherwise areas of multisensory co-activation appear larger than they are (statistical thresholds!)
How to prevent artifacts from vibration-induced noise?
• subject touching only one out of two vibrating tubes
• brain activation in contrast Vibrotactile – Control

Touch activates human auditory cortex: results
• minimal smoothing: 3 x 3 x 4 mm; liberal smoothing: 9 x 9 x 12 mm
• NB "extent" of activation varies with smoothing [SI, SII]
• group analysis (random effects, 13 subjects): only left auditory belt area with suprathreshold activation in second-level analysis
• NB is activation bilateral? contrasts for individual subjects (Vibrotactile – Control, Noise – Rest, Touch – Rest): population map shows activation of left and right auditory belt areas
Schürmann, Caetano, Hlushchuk, Jousmäki & Hari, NeuroImage 2006

BOLD signal time series from individual subjects' data
• co-activated area (in group analysis) as region of interest (ROI)
• for each subject, choose cluster next to ROI, plot BOLD time series
• plots show averages across recurring blocks and across subjects – left hemisphere
Schürmann, Caetano, Hlushchuk, Jousmäki & Hari, NeuroImage 2006

Integration of touch and sound in auditory cortex: high-resolution fMRI in macaque monkeys
Kayser, Petkov, Augath, Logothetis, Neuron 2005
[Figure: responses to Tactile&Sound, Tactile, Sound; time scale 100 s]

Activation of auditory belt areas by vibrotactile stimuli: MEG study
Caetano and Jousmäki, Neuroimage 2006
MEG combines:
• good spatial resolution (~5 mm in optimal cases)
• millisecond temporal resolution
Neuromag VV (LTL, Helsinki), 102 x 3 sensors
Vibrotactile stimuli to fingertips, ~20 dB above threshold, 500 ms duration
• subjects report weak percept of vibration in the fingertips, perception of sound
• "No Touch": sound from tube?
Functional localizers:
• auditory cortex: 60 dB, 1 kHz, 100 ms tones
• somatosensory cx: median nerve stimulation

Responses to vibrotactile stimuli: MEG topography
• transient and sustained responses
• traces shown are from gradiometers (maximum amplitude above source)
• traces for 2 subsets of identical stimuli show reproducibility

Responses to vibrotactile stimuli (Caetano and Jousmäki, Neuroimage 2006)
1. MEG time series
2. MEG sources over time (two example subjects): upper bank vs lower bank of Sylvian fissure
3. Sources in group analysis (N=10): sustained response (300-500 ms) to 500 ms stimuli, only in lower bank (p = 0.008, right hemisphere)

Conclusion: audiotactile interaction – brain basis?
• activation of auditory belt areas through vibrotactile stimuli
• shared neural resources might be needed for "binding" across modalities or for detecting temporal and spatial coincidence
>> correlate of facilitatory interaction in "hands help hearing"?
>> analysis of temporal patterns required in both modalities?
Caveat: activation of inner ear via bone conduction?
• additional control in 8 subjects (Caetano and Jousmäki 2006): vibrotactile stimuli applied to wrist or elbow
• no perception of sound
• bone conduction does not explain MEG results

Competing (?) explanations of the brain basis of multisensory processing
A – "all multisensory": direct feedforward influences between modalities (arising at thalamic level or through cortico-cortical connections)
B – "new bimodal areas": audiotactile and audiovisual areas immediately adjacent to auditory cortex
C – "critical role of feedback circuitry": activation of STS and feedback to primary visual and auditory cortex ... test how ... ?
Driver and Noesselt, Neuron 2008

Crossmodal plasticity example B1: tactile input to visual cortex
• repetitive TMS for virtual lesions
• errors during Braille reading for early blind vs sighted subjects
• error rates depend on site of virtual lesions
• cortical plasticity in early blind subjects
Cohen LG et al, Nature 1997; 389: 180-183

Crossmodal plasticity example B2: chronometry
Hamilton R, Pascual-Leone A, Trends Cogn Sci 1998; 2: 168-174
• crossmodal input to visual cortex
• single-pulse TMS
• early blind subjects
• real and nonsensical Braille stimuli presented via tactile stimulator
• subjects' task: (a) detect stimuli, (b) identify stimuli as real vs nonsensical
• TMS @ 20 ms post tactile stimulus interferes with detection; TMS @ 60 ms post tactile stimulus interferes with perception
• conclusion: visual cortex contributes to tactile information processing in early blind subjects
[Figure: symbols (square, triangle, circle) for three subjects; open symbols: detected stimuli; filled symbols: correctly identified stimuli]

Crossmodal plasticity example B3: vibrotactile activation of auditory cortex in a congenitally deaf adult
Levänen, Jousmäki and Hari, Curr Biol 1998 (MEG data)
(a) field patterns (deviant vibrotactile stimuli) and (b) responses for all subjects: channels with maximum S1 and ST (supratemporal) responses
Cortical sites activated in the deaf subject: (a, b) source areas projected onto individual MRI; (c) time course of source strengths in the right and left ST cortices during vibrotactile stimuli, and when the subject did not hold his hand around the tube (control)

Summary: crossmodal studies

Multisensory and crossmodal processing in the normal brain:
• audiovisual: speech and non-speech stimuli in fMRI, MEG: activation of STS, IPS, SC [A1-A5]; sight-reading in musicians (MEG): visual notes activate auditory belt areas [A6]
• audiotactile: vibrotactile stimuli in fMRI, MEG: activation of auditory belt areas [A7]
Crossmodal plasticity after sensory deprivation:
• visuotactile: Braille reading affected by TMS to visual cortex in blind subjects [B1, B2]
• audiotactile: vibrotactile activation of auditory cortex in a deaf subject (MEG) [B3]
NB additional interaction with motor processes