Modulation of Somatosensory Perception and Multisensory Integration as a Function of Habitual Patterns of Action

Harriet Dempsey-Jones
Bachelor of Psychological Science (Hons)

A thesis submitted for the degree of Doctor of Philosophy at The University of Queensland in 2015
School of Psychology

Abstract

Somatosensation refers to an integrated percept consisting of information from the skin and muscles, e.g., touch and proprioception. Together these disparate inputs provide information about the state of the body, with respect to itself and to the external world. Due to the partially unconscious nature of the somatic senses and the complexity of this distributed system, much remains unknown about its supporting neurocognitive mechanisms. In the current thesis I aimed to explore the modulation of somatosensory perception and integration as a function of habitual patterns of action; that is, how patterns of sensory input resulting from activity can shape the organisation of the somatosensory system and multisensory integration processes. Distinct experimental paradigms and somatosensory sub-modalities were explored to provide converging evidence for this investigation.

The first series of experiments examined how habitual tactile stimulation might affect the representation of touch in the somatosensory system. Touch perception of the fingers can be improved by tactile stimulation (i.e., tactile perceptual learning), and learning transfers from trained to untrained fingers in a pattern reflecting the underlying relationships between fingers in the somatosensory system. I predicted that repetitive patterns of touch resulting from daily use between fingers should affect this representation and, therefore, be reflected in learning transfer. A trained group underwent seven sessions of testing and training over four days and was compared with an untrained control group. A divergence was identified in the transfer of learning from a trained middle finger to the two fingers that are physically and cortically adjacent to the trained finger. I suggest this divergence may have resulted from documented differences in cooperative finger use with the middle finger. These results demonstrate that repetitive patterns of action are a potential organising force in the human somatosensory system.
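The transfer measure at the heart of these experiments can be made concrete with a small numerical sketch. The code below is illustrative only (it is not taken from the thesis, and the thresholds and index definitions are assumptions): learning is expressed as the proportional reduction in a finger's tactile discrimination threshold after training, and transfer as the untrained finger's improvement relative to that of the trained finger.

```python
def learning_index(pre_threshold, post_threshold):
    """Improvement in tactile acuity: proportional reduction in the
    discrimination threshold from pre- to post-training (positive = better)."""
    return (pre_threshold - post_threshold) / pre_threshold

def transfer_index(trained_learning, untrained_learning):
    """Transfer of learning to an untrained finger, expressed relative to
    the improvement observed on the trained finger."""
    return untrained_learning / trained_learning

# Hypothetical thresholds (mm) before and after the training sessions.
trained = learning_index(pre_threshold=1.8, post_threshold=1.2)   # ~0.33
adjacent = learning_index(pre_threshold=1.9, post_threshold=1.6)  # ~0.16
print(round(transfer_index(trained, adjacent), 2))                # 0.47
```

On an index of this kind, a divergence in transfer between two equally adjacent untrained fingers is what points towards use-dependent, rather than purely anatomical, organisation.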
In a second line of research, I expanded my investigation into multisensory contributions to somatosensory perception. I used a modification of the classic rubber hand illusion (RHI) to investigate the integration of proprioceptive and visual hand position information. In this paradigm, an illusory spatial disparity is created between the visual and proprioceptive hand positions, and the strength of multisensory integration is measured by the shift of the felt position towards the seen position (proprioceptive drift). Here I was interested in how habitual patterns of action can shape multisensory integration within peripersonal space. I predicted a difference in the integration of visual and proprioceptive hand information within the habitual action space of the arm versus beyond it. Unlike previous studies, I fixed the relative distance between the real and false hand while varying the absolute position of the two hands in space. By doing so I was able to examine the effect of proximity to the action space alone, without the confounding influence of the relative distance between sensory inputs (which is also known to affect integration). In Experiment One, I demonstrated a spatial modulation of drift across the workspace of the left hand: visuo-proprioceptive integration was greatest in the habitual action space of the hand. In a follow-up experiment, I compared this effect for the left and right hands to show that these integration differences are indeed due to hand position, and not a simple processing bias within the left or right hemispace. Overall, my results suggest that multisensory integration is anchored to action space, rather than to the hand itself.

In a final experiment I investigated the multicomponent model of self-representation, asking whether self-localisation (i.e., the sense of where your body is located in space) is a related but dissociable phenomenon from subjective self-representation (i.e., embodiment and ownership). I predicted that these phenomena are distinct processes, supported by different mechanisms of multisensory integration. To test this prediction I ran the RHI under a condition that was expected to alter self-localisation (measured using 'drift') but not subjective self-representation (measured using a questionnaire). In this condition, I placed the participant's hand within the habitual action space of the arm and shifted the felt position laterally out from this location, towards extracorporeal space (RHI Out condition). I compared this with a control condition that was not expected to modulate either drift or the subjective questionnaire outcomes (RHI In condition). It was predicted that drift would be greater in the Out condition than in the In condition, but that there would be no significant difference in subjective questionnaire reports. This dissociation should be possible because, we suggest, the subjective illusion is based on a different mechanism of multisensory integration that is unrelated to functional interaction with space; it would, therefore, be unaffected by the proximity of the hand to action space. Results revealed that drift was indeed greater in the Out condition than in the In condition, but that the subjective illusion did not differ between conditions. My results support the dissociation of the subjective and drift-related outcomes of the RHI. I discuss the related, but distinct, multisensory mechanisms supporting these two outcomes and their likely neural locus.
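Both lines of RHI experiments use proprioceptive drift as the index of visuo-proprioceptive integration. The sketch below is a minimal illustration (not code or analysis from the thesis; the positions, variances and function names are assumptions): drift is the shift of the judged hand position towards the seen hand, and a standard precision-weighted (maximum-likelihood) cue-combination estimate shows why a reliable visual signal pulls the felt position in that direction.

```python
def proprioceptive_drift(pre_judgements, post_judgements, towards_seen=+1.0):
    """Mean shift of the judged hand position from before to after stimulation,
    signed so that positive values indicate a shift towards the seen (false) hand.
    Positions are in cm along the lateral axis."""
    mean = lambda xs: sum(xs) / len(xs)
    return towards_seen * (mean(post_judgements) - mean(pre_judgements))

def combined_position(x_vision, var_vision, x_proprio, var_proprio):
    """Precision-weighted (maximum-likelihood) combination of visual and
    proprioceptive position estimates: the more reliable cue gets more weight."""
    w_vision = (1.0 / var_vision) / (1.0 / var_vision + 1.0 / var_proprio)
    return w_vision * x_vision + (1.0 - w_vision) * x_proprio

# Hypothetical example: real hand at 0 cm, false hand seen 15 cm away.
# With vision more precise than proprioception, the combined estimate is
# pulled most of the way towards the seen hand.
print(combined_position(x_vision=15.0, var_vision=1.0, x_proprio=0.0, var_proprio=4.0))  # 12.0
print(proprioceptive_drift(pre_judgements=[0.5, -0.5, 0.0], post_judgements=[3.0, 2.0, 2.5]))  # 2.5
```

On this view, a modulation of drift across the workspace would correspond to a change in the relative weighting of the two cues with proximity to the habitual action space.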
In conclusion, the current thesis provides significant evidence that patterns of habitual action affect somatosensory perception and integration processes. Specifically, functional interaction with the world shapes somatosensory processing through schedules of sensory input. Manipulation of somatosensory perception is a useful tool for studying the neurocognitive mechanisms supporting such processes.

Declaration by author

This thesis is composed of my original work, and contains no material previously published or written by another person except where due reference has been made in the text. I have clearly stated the contribution by others to jointly authored works that I have included in my thesis.

I have clearly stated the contribution of others to my thesis as a whole, including statistical assistance, survey design, data analysis, significant technical procedures, professional editorial advice, and any other original research work used or reported in my thesis. The content of my thesis is the result of work I have carried out since the commencement of my research higher degree candidature and does not include a substantial part of work that has been submitted to qualify for the award of any other degree or diploma in any university or other tertiary institution. I have clearly stated which parts of my thesis, if any, have been submitted to qualify for another award.

I acknowledge that an electronic copy of my thesis must be lodged with the University Library and, subject to the policy and procedures of The University of Queensland, the thesis be made available for research and study in accordance with the Copyright Act 1968 unless a period of embargo has been approved by the Dean of the Graduate School. I acknowledge that copyright of all material contained in my thesis resides with the copyright holder(s) of that material. Where appropriate I have obtained copyright permission from the copyright holder to reproduce material in this thesis.

Harriet Dempsey-Jones
September 2015

Publications during candidature

Journal articles

Dempsey-Jones, H., & Kritikos, A. (2014). Higher-order cognitive factors affect subjective but not proprioceptive aspects of self-representation in the rubber hand illusion. Consciousness and Cognition, 26, 74-89.

Dempsey-Jones, H., Harrar, V., Oliver, J., Johansen-Berg, H., Spence, C., & Makin, T. (2015). Transfer of tactile perceptual learning reflects neighborhood use relationships. Journal of Neurophysiology. doi: 10.1152/jn.00181.2015.

Conference abstracts

Dempsey-Jones, H., & Kritikos, A. (2011). Like the back of my hand: Greater shifts in self-location away from the body than towards the body in the Rubber Hand Illusion. Poster presented at the 2nd Australian Cognitive Neuroscience Conference.

Dempsey-Jones, H., & Kritikos, A. (2012). Proprioceptive drift towards the body is associated with changes in self-ownership and embodiment in the Rubber Hand Illusion (RHI) – but drift away from the body is not. Poster presented at the 39th Experimental Psychology Conference.

Dempsey-Jones, H., & Kritikos, A. (2012). Handedness and proprioceptive position estimation: Are left handed people more accurate in self-representation and is this representation resistant to manipulation by the Rubber Hand Illusion? Front. Hum. Neurosci. Conference Abstract: ACNS-2012 Australasian Cognitive Neuroscience Conference. doi: 10.3389/conf.fnhum.2012.208.00138.

Dempsey-Jones, H., & Kritikos, A. (2013). Handedness and the weighting of visual-proprioceptive information in position estimation: the effect of illusory visual position information. Journal of Vision, 13(9), 877. doi: 10.1167/13.9.877.

Dempsey-Jones, H., & Kritikos, A. (2013). Visual biases override proprioceptive biases in the estimation of limb position following a visual
Recommended publications
  • Prefrontal and Posterior Parietal Contributions to the Perceptual Awareness of Touch
    Prefrontal and posterior parietal contributions to the perceptual awareness of touch. M. Rullmann, S. Preusser & B. Pleger. Which brain regions contribute to the perceptual awareness of touch remains largely unclear. We collected structural magnetic resonance imaging scans and neurological examination reports of 70 patients with brain injuries or stroke in S1 extending into adjacent parietal, temporal or pre-/frontal regions. We applied voxel-based lesion-symptom mapping to identify brain areas that overlap with impaired touch perception (i.e., hypoesthesia). As expected, patients with hypoesthesia (n = 43) presented lesions in all Brodmann areas in S1 on the postcentral gyrus (BA 1, 2, 3a, 3b). At the anterior border to BA 3b, we additionally identified motor area BA 4p in association with hypoesthesia, as well as, further ventrally, the ventral premotor cortex (BA 6, BA 44), assumed to be involved in whole-body perception. At the posterior border to S1, we found hypoesthesia-associated effects in attention-related areas such as the inferior parietal lobe and intraparietal sulcus. Downstream to S1, we replicated previously reported lesion-hypoesthesia associations in the parietal operculum and insular cortex (i.e., the ventral pathway of somatosensory processing). The present findings extend this pathway from S1 to the insular cortex by prefrontal and posterior parietal areas involved in multisensory integration and attention processes. The primary somatosensory cortex (S1) in monkeys can be divided into four Brodmann areas: (BA) 1, 2, 3a, and 3b. Each BA consists of a somatotopically organized map that subserves distinct somatosensory functions.
  • Somatosensory Function in Speech Perception
    Somatosensory function in speech perception. Takayuki Ito, Mark Tiede, and David J. Ostry. Haskins Laboratories, 300 George Street, New Haven, CT 06511; Research Laboratory of Electronics, Massachusetts Institute of Technology, 50 Vassar Street, Cambridge, MA 02139; and Department of Psychology, McGill University, 1205 Dr. Penfield Avenue, Montreal, QC, Canada H3A 1B1. Edited by Fernando Nottebohm, The Rockefeller University, Millbrook, NY, and approved December 4, 2008 (received for review October 7, 2008). Somatosensory signals from the facial skin and muscles of the vocal tract provide a rich source of sensory input in speech production. We show here that the somatosensory system is also involved in the perception of speech. We use a robotic device to create patterns of facial skin deformation that would normally accompany speech production. We find that when we stretch the facial skin while people listen to words, it alters the sounds they hear. The systematic perceptual variation we observe in conjunction with speech-like patterns of skin stretch indicates that somatosensory inputs affect the neural processing of speech sounds and shows the involvement of the somatosensory system in the per- […] taken from a computer-generated continuum between the words head and had. We found that perception of these speech sounds varied in a systematic manner depending on the direction of skin stretch. The presence of any perceptual change at all depended on the temporal pattern of the stretch, such that perceptual change was present only when the timing of skin stretch was comparable to that which occurs during speech production. The findings are consistent with the hypothesis that the somatosensory system is involved in the perceptual processing of speech. The findings underscore the idea that there is a broad nonauditory basis to speech perception (17–21).
  • Using Multisensory Integration to Understand the Human Auditory Cortex
    Chapter 8. Using Multisensory Integration to Understand the Human Auditory Cortex. Michael S. Beauchamp. Abstract: Accurate and meaningful parcellation of the human cortex is an essential endeavor to facilitate the collective understanding of brain functions across sensory and cognitive domains. Unlike in the visual cortex, the details of anatomical and functional mapping associated with the earliest stages of auditory processing in the cortex are still a topic of active debate. Interestingly, aspects of multisensory processing may provide a unique window to meaningfully subdivide the auditory sensory areas by exploring different functional properties other than the traditional tonotopic approach. In this chapter, a tour of the auditory cortical areas is first provided, starting from its core area, Heschl's gyrus, then moving onto surrounding areas. Evidence from different sources, including postmortem studies of the human auditory cortex, resting-state functional connectivity derived from the Human Connectome Project, and electrocorticographic studies, is presented to better understand how different subdivisions of the human auditory cortex and its surrounding areas are involved in auditory and multisensory processing. The chapter concludes with the remaining challenges to account for individual variability in functional anatomy, particularly pertaining to multisensory processing. Keywords: Auditory cortex · Cross-modal · Electrocorticography · Functional anatomy · Functional connectivity · Heschl's gyrus · Human Connectome Project · Sensory integration · Superior temporal sulcus · Temporal cortex. 8.1 Introduction: The human cerebrum is divided into sensory cortices specialized for the processing of a specific sensory modality, with the visual cortex located in the occipital lobe and the auditory cortex centered on Heschl's gyrus on the plane of the superior
  • Visual Experience Is Necessary for the Development of Multisensory Integration
    The Journal of Neuroscience, October 27, 2004, 24(43):9580–9584. Brief Communication. Visual Experience Is Necessary for the Development of Multisensory Integration. Mark T. Wallace, Thomas J. Perrault Jr, W. David Hairston, and Barry E. Stein. Department of Neurobiology and Anatomy, Wake Forest University School of Medicine, Winston-Salem, North Carolina 27157. Multisensory neurons and their ability to integrate multisensory cues develop gradually in the midbrain [i.e., superior colliculus (SC)]. To examine the possibility that early sensory experiences might play a critical role in these maturational processes, animals were raised in the absence of visual cues. As adults, the SC of these animals were found to contain many multisensory neurons, the large majority of which were visually responsive. Although these neurons responded robustly to each of their cross-modal inputs when presented individually, they were incapable of synthesizing this information. These observations suggest that visual experiences are critical for the SC to develop the ability to integrate multisensory information and lead to the prediction that, in the absence of such experience, animals will be compromised in their sensitivity to cross-modal events. Key words: cross-modal; dark rearing; multimodal; plasticity; subcortical; superior colliculus. Introduction: Neurons in the superior colliculus (SC) have the remarkable capacity to integrate information from the different senses, a process that substantially enhances the salience of the initiating event (Meredith and Stein 1983, 1985, 1986; King and Palmer, 1985; Wallace et al., 1996, 1998; Frens and Van Opstal 1998; Bell et al., 2001; Perrault et al., 2003). […] et al., 2000). What is not evident from these studies is whether these experiences, so critical for the normal development of individual sensory systems, as well as for the normal cross-modal topography seen in multisensory structures (King et al., 1988; Knudsen and Brainard, 1991; Benedetti and Ferro, 1995), play any role in the development of the ability of the brain to synthesize
  • Performing a Task Jointly Enhances the Sound-Induced Flash Illusion
    Performing a task jointly enhances the sound-induced flash illusion. Basil Wahn, Tim Rohe, Anika Gearhart, Alan Kingstone, and Scott Sinnett. Abstract: Our different senses are stimulated continuously. Through multisensory integration, different sensory inputs may or may not be combined into a unitary percept. Simultaneous with this stimulation, people are frequently engaged in social interactions, but how multisensory integration and social processing combine is largely unknown. The present study investigated if, and how, the multisensory sound-induced flash illusion is affected by a social manipulation. In the sound-induced flash illusion, a participant typically receives one visual flash and two auditory beeps and she/he is required to indicate the number of flashes that were perceived. Often, the auditory beeps alter the perception of the flashes such that a participant tends to perceive two flashes instead of one flash. We tested whether performing a flash counting task with a partner (confederate), who was required to indicate the number of presented beeps, would modulate this illusion. We found that the sound-induced flash illusion was perceived significantly more often when the flash counting task was performed with the confederate compared to performing it alone. Yet, we no longer find this effect if visual access between the two individuals is prevented. These findings, combined with previous results, suggest that performing a multisensory task jointly – in this case an audiovisual task – lowers the extent to which an individual attends to visual information, which in turn affects the multisensory integration process. Keywords: multisensory processing, social cognition, joint action, sound-induced flash illusion, double flash illusion. Introduction: Humans constantly receive input from multiple sensory […] In addition, a large body of research in the field of joint action has shown that individual task performance is affected by performing visual perceptual tasks jointly (Böckler et al.
  • Pain Processing in Multisensory Environments
    Review article, e-Neuroforum 2010, 1:23–28. DOI 10.1007/s13295-010-0004-z. Marion Höfle, M. Hauck, A.K. Engel, D. Senkowski. Department of Neurophysiology and Pathophysiology and Department of Neurology, University Medical Center Hamburg-Eppendorf, Hamburg. © Springer-Verlag 2010. Pain processing in multisensory environments. Introduction: Imagine yourself cutting peppers. While you watch the pepper getting smaller, your mind might wander and the knife accidentally carves the skin of your finger and you feel a sharp pain. Painful events in real-world situations do not occur in isolation but often comprise input from additional sensory modalities, i.e., the visual percept of the knife penetrating the finger. The successful integration and recall of painful stimuli in combination with highly informative input of other sensory modalities […] It is well known that different environmental and cognitive factors can modulate activity in the so-called pain matrix, i.e., the network of brain regions involved in pain processing [28]. Moreover, there is elaborate knowledge of the neurophysiological mechanisms underlying pain processing, as well as of the principle mechanisms of multisensory integration of non-painful information (Box 1). Surprisingly, however, only few studies have systematically addressed how stimuli from other sensory modalities affect nociceptive processing. […] influences between visual and pain stimuli. These studies support the notion that visual inputs can strongly affect the processing of painful stimuli. Finally, we give an outlook for future research on pain processing in multisensory environments. Dimensions of pain perception: The application of different pain models (Box 2) revealed that pain is a complex experience that can be described along two major dimensions, which themselves
  • Hemodynamic Changes in Cortical Sensorimotor Systems Following Hand and Orofacial Motor Tasks and Pulsed Cutaneous Stimulation
    University of Nebraska - Lincoln, DigitalCommons@University of Nebraska - Lincoln: Public Access Theses and Dissertations from the College of Education and Human Sciences, Spring 2016. Rosner, Austin Oder, "Hemodynamic Changes in Cortical Sensorimotor Systems Following Hand and Orofacial Motor Tasks and Pulsed Cutaneous Stimulation" (2016). http://digitalcommons.unl.edu/cehsdiss/256. A dissertation presented to the Faculty of The Graduate College at the University of Nebraska in partial fulfillment of requirements for the degree of Doctor of Philosophy, Major: Interdepartmental Area of Human Sciences (Communication Disorders), under the supervision of Professor Steven M. Barlow. Lincoln, Nebraska, January 2016.
  • Development of Multisensory Integration Approach Model
    International Journal of Applied Research 2016; 2(4): 629-633. Development of multisensory integration approach model. Prasannakumar S (Assistant Professor, North East Regional Institute of Education, Shillong, India) and Dr. Saminathan B (Assistant Professor, Department of Education, Bharathidasan University, Trichy, India). Abstract: Every teacher expects an optimum level of processing in the minds of their students. The level of processing mainly depends upon memory processes, and most students have retrieval difficulties with past learning. Memory difficulties are directly related to sensory integration. In these circumstances the investigator made an attempt to construct a Multisensory Integration Approach Model based on sensory and perceptual processes. The Multisensory Integration Approach Model contains three parts: an instructional part, a processing part and a learning outcome part. The instructional part includes seven steps: relating new information to prior knowledge, focusing attention on the information, developing sensory connections, organizing the information, expanding sensory images, structuring the information and practicing recall. These instructional strategies develop stimulation, sensation, attention, perception, imagery, conceptualization and memory. The learning outcome part ranges from low-level sensory integration (visual-auditory) to high-level sensory integration (visual-auditory-tactile-olfactory-gustatory). The investigator regulates the information processing of the students in the brain through the Multisensory Integration Approach Model. Keywords: Development, Multisensory Integration, Approach, Model. 1. Introduction: Development of models of teaching is a recent innovation in teaching.
  • Insights and Perspectives on Sensory-Motor Integration and Rehabilitation Rochelle Ackerley, Michael Borich, Calogero Maria Oddo, Silvio Ionta
    Insights and Perspectives on Sensory-Motor Integration and Rehabilitation. Rochelle Ackerley, Michael Borich, Calogero Maria Oddo, Silvio Ionta. Multisensory Research, Brill, 2016, 29 (6-7), pp. 607-633. doi: 10.1163/22134808-00002530. HAL Id: hal-01599609, https://hal.archives-ouvertes.fr/hal-01599609, submitted on 2 Oct 2017. Multisensory Research 29 (2016) 607–633, brill.com/msr. Rochelle Ackerley (Department of Physiology, University of Gothenburg, Göteborg, Sweden; Laboratoire Neurosciences Intégratives et Adaptatives (UMR 7260), CNRS — Aix-Marseille Université, Marseille, France), Michael Borich (Neural Plasticity Research Laboratory, Division of Physical Therapy, Dept of Rehabilitation Medicine, Emory
  • Multisensory Processing and Perceptual Consciousness: Part I
    Philosophy Compass 11/2 (2016): 121–133, 10.1111/phc3.12227. Multisensory Processing and Perceptual Consciousness: Part I. Robert Eamon Briscoe, Ohio University. Abstract: Multisensory processing encompasses all of the various ways in which the presence of information in one sensory modality can adaptively influence the processing of information in a different modality. In Part I of this survey article, I begin by presenting a cartography of some of the more extensively investigated forms of multisensory processing, with a special focus on two distinct types of multisensory integration. I briefly discuss the conditions under which these different forms of multisensory processing occur as well as their important perceptual consequences and interrelations. In Part II, I then turn to examining some of the different possible ways in which the structure of conscious perceptual experience might also be characterized as multisensory. In addition, I discuss the significance of research on multisensory processing and multisensory consciousness for philosophical attempts to individuate the senses. 1. Introduction: Multisensory processing, as understood here, encompasses all of the various ways in which the presence of information in one sensory modality can adaptively influence the way information is processed in a different modality. Multisensory processes are adaptive in the sense that they function to enhance the accuracy of perception and the control of perceptually guided actions. Some forms of multisensory processing, for example, deal with the challenge of tracking or identifying objects and events crossmodally. When sensory signals in different modalities have been identified as having a common origin in the external environment (or the perceiver's body), other forms of multisensory processing address the problem of integrating them and, importantly, reconciling them when they conflict.
  • 'Miracle Berry' Demo: What Are Miracle Berries?
    Applied Neuroscience, Columbia Science Honors Program, Fall 2016. Sensory Systems and Neural Circuits, Part Two. Last week's class covered sensory systems and neural circuits: 1. Introduction to sensory systems, 2. Visual system, 3. Auditory system, 4. Introduction to neural circuits. How do we take cues from our environment and make sense of the world around us? How do sensory systems work? Neurons in sensory regions of the brain respond to stimuli by firing action potentials after receiving a stimulus. Each sensory system follows a specific plan: 1. Reception, 2. Transduction, 3. Coding. Today's class objective: Sensory Systems and Neural Circuits, Part Two. Agenda: gustatory system (taste), olfactory system (odor), Demo 1: jellybean test, somatosensory system (touch and pain), Demo 2: flavor tripping with 'Miracle Berries'. What are the different types of senses? Hearing, vision, smell (olfaction), taste (gustation), touch, pain, temperature, vestibular senses, autonomic senses and proprioception. Chemical senses: the sensory systems associated with the nose and mouth (olfaction and taste) detect chemicals in the environment. The gustatory system detects ingested, primarily water-soluble, molecules called tastants; the olfactory system detects airborne molecules called odors. These systems rely on receptors in the nasal cavity, mouth or face that interact with the relevant molecules and generate action potentials. Gustatory system. Objective: understand how the gustatory system works. Agenda: 1. Morphology of cells, 2. Gustatory pathway, plus flavor tripping with 'Miracle Berries' (later in the class). Morphology of taste buds and cell types: taste buds are located on papillae and are distributed across the tongue; they are also found on the oral mucosa of the palate and epiglottis; taste buds contain 80 cells arranged around a central taste pore.
  • Sensory Training: Translating Research to Clinical Practice & Home Program
    Sensory Training: translating research to clinical practice & home program. Nel Ledesma, MHS, OTR/L, Rehabilitation Services, Beaumont Health System – Troy Campus, October 10, 2019. Course Objectives • Discuss the prevalence of sensory impairment following a stroke • Describe stroke survivors' experiences of sensory impairment in the upper limb and its impact on daily life • Review sensory assessments and outcome measures • Review the adaptive / compensatory strategies for sensory impairment • Discuss passive and active sensory training – Discuss the recommended parameters, electrode placements and dosage for passive sensory training using TENS – Discuss thermal stimulation to improve thermal awareness – Identify sample objects for active sensory training • Describe the sensory training home program • Present case studies. Stroke Statistics • Stroke kills about 140,000 Americans each year—that's 1 out of every 20 deaths. • Someone in the United States has a stroke every 40 seconds. Every 4 minutes, someone dies of stroke. • Every year, more than 795,000 people in the United States have a stroke. About 610,000 of these are first or new strokes. • About 185,000 strokes—nearly 1 of 4—are in people who have had a previous stroke. • About 87% of all strokes are ischemic strokes, in which blood flow to the brain is blocked. • Economic impact was estimated at $34 billion each year. This total includes the cost of health care services, medicines to treat stroke, and missed days of work. Post-stroke dysfunction • Stroke is a leading cause