Psych 102 Chapter 9 Presentation


Hearing and Language — Chapter 9 (7/23/19)
Garrett: Brain & Behavior 4e

Topics:
• Stimulus for Hearing
• The Auditory Mechanism
• Frequency Analysis
• Locating Sounds with Binaural Cues
• Language, Aphasias, and Antecedents

Hearing
• A *receptor* is a cell or specialized neuron that:
  • responds to a particular form of energy (its adequate stimulus)
  • converts one form of energy to another
• The intensity and pattern of the information are what make it meaningful.
• "Hearing" vs. "listening"
• *Sensation*: the acquisition of information
• *Perception*: the interpretation of information

Hearing: Where Sensation Occurs and What Is Encoded
• Hearing (Ch 9): adequate stimulus = vibration in a conducting medium (air, water, bone); converted in the cochlea (hair cells); physical modalities = frequency and intensity (amplitude); psychological modalities = pitch and loudness
• Vision (Ch 10): adequate stimulus = electromagnetic radiation in the visual spectrum; converted in the retina (rods, cones); physical modalities = wavelength and intensity (amplitude); psychological modalities = color and brightness

Physical vs. Psychological
• *Amplitude (or intensity)* corresponds to *loudness*
  • measured in dB (decibels) or mV (millivolts), peak-to-peak
  • 6 dB corresponds to a doubling of amplitude
• *Frequency* corresponds to *pitch*
  • measured in Hz (hertz): waves per second
  • human hearing range: *20 Hz – 20,000 Hz*

Hearing
• Fig 9.2: Alternating Compression and Decompression of Air by a Sound Source
• Fig 9.3: Examples of Pure Tones (a–d) and Complex Sounds (e–f)
SOURCE: (Left) From Sensation and Perception, 5th ed., by Goldstein, 1999. Reprinted with permission of Wadsworth, a division of Thomson Learning.
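The 6-dB-per-doubling rule in the Physical vs. Psychological slide follows from the amplitude form of the decibel scale, dB = 20·log10(A/A₀). A minimal sketch in Python (illustrative; the function name is ours, not the textbook's):

```python
import math

def amplitude_ratio_to_db(ratio):
    """Convert an amplitude (pressure) ratio to decibels: dB = 20 * log10(ratio)."""
    return 20 * math.log10(ratio)

# Doubling the amplitude adds ~6 dB, as the slide states.
print(round(amplitude_ratio_to_db(2), 2))   # → 6.02
# A tenfold amplitude increase adds 20 dB.
print(round(amplitude_ratio_to_db(10), 2))  # → 20.0
```

Because the scale is logarithmic, each further doubling adds another ~6 dB rather than doubling the dB value.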
Hearing: The Auditory Mechanism
Figure 9.4: The Outer, Middle, and Inner Ear (oval window, round window, Eustachian tube)
• Outer ear
  • Pinna
  • External auditory canal
• Middle ear (Figure 9.5: Structures of the Middle and Inner Ear)
  • Tympanic membrane
  • Tensor tympani
  • *Ossicles* (hammer, anvil, stirrup)
    • amplify sound from the tympanic membrane to the oval window
  • Eustachian tube
• Inner ear
  • Oval window
  • Cochlea
    • Vestibular canal
    • Cochlear canal (contains the organ of Corti)
    • Tympanic canal
  • Round window
  • Auditory nerve

Hearing: The Organ of Corti
See Figure 9.6: Electron-Microscope View Showing the Hair Cells Attached to the Tectorial Membrane
• *Organ of Corti*: the receptive organ of the ear
  • Tectorial membrane
  • *Hair cells*: the actual auditory receptors
    • *Outer hair cells*: amplification and sound sharpening
    • Inner hair cells: encode sound into neural impulses
  • Basilar membrane

Hearing: The Auditory Cortex
Figure 9.7: The Auditory Pathway and Auditory Cortex
• Auditory nerve (8th cranial nerve) →
• Inferior colliculi (sound from both ears converges) →
• Medial geniculate nucleus (mostly opposite-ear information) →
• Primary auditory cortex (*temporal lobe*)
• Topographic organization is maintained throughout the system

Hearing: "What" and "Where" Streams
Fig 9.8: The Dorsal "Where" and Ventral "What" Streams of Auditory Processing
• What (green): *ventral stream* into the temporal lobes
  • secondary auditory cortical areas
• Where (red): *dorsal stream* to the parietal lobes, then the frontal lobes

Hearing: Frequency Theories
Figure 9.9: Illustration of Volleying in Neurons
• Frequency theory (Rutherford, 1886)
• Telephone theory (Wever & Bray, 1930)
• Both of the above are limited to sounds below 500 Hz. Why?
• Volley theory (Wever, 1949)
  • Volleying fails to follow sounds beyond about 5200 Hz
  • *groups* of neurons can follow frequency up to that point, though

Hearing: Why Frequency Theories Are Inadequate
• Why can't frequency theories encode high-frequency signals?
  • 1000 Hz is 1000 waves per second
  • that leaves 1 millisecond between action potentials
  • this is at or below the absolute refractory period of an individual neuron
[Figure: membrane potential (mV) vs. time between action potentials (ms), showing the threshold curve during the absolute and relative refractory periods]

Hearing: Place Theory
Figure 9.10: Frequency Sensitivity on the Human Basilar Membrane
• *Place theory*
  • tonotopic map of the basilar membrane
  • *frequencies are encoded by place on the membrane*
SOURCE: Animation © 2006 Howard Hughes Medical Institute

Figure 9.11: Tonotopic Map; Figure 9.12: "Tuning Curves" of Three Cat Auditory Neurons
• The auditory cortex is also *tonotopically organized*
• Cells are tightly tuned to specific frequencies
SOURCE: (Bottom) Figure 11.31 from Sensation and Perception (5th ed.; p. 331) by E. Bruce Goldstein, 1999, Pacific Grove, CA: Brooks-Cole. © 1999. Reprinted by permission of Wadsworth, a division of Thomson Learning: www.thomsonrights.com. Fax 800-730-2215.

Hearing: Summary of Frequency vs. Place Theory
Frequency theory:
• Hair cells fire at the same frequency as the sound wave
• Volley: combining several hair cells
• Handles low frequencies
• Pitch: impulse rate of the hair cells
• Loudness: number of firing hair cells
Place theory:
• Basilar membrane is sensitive to different frequencies
• Tonotopic frequency organization
• Handles high frequencies
• Pitch: location on the membrane
• Loudness: firing rate of the hair cells

Hearing
• However, place theory alone is inadequate.
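The refractory-period argument above is simple arithmetic, and the same arithmetic shows why volleying helps: neurons firing in rotation each only need to keep up with a fraction of the sound's frequency. A sketch (the 1 ms absolute refractory period and the neuron counts are round-number assumptions for illustration, not figures from the slides):

```python
ABS_REFRACTORY_MS = 1.0  # assumed absolute refractory period of a single neuron

def interval_ms(freq_hz):
    """Time between successive sound-wave peaks, in milliseconds."""
    return 1000.0 / freq_hz

def can_follow(freq_hz, n_neurons=1):
    """Can n neurons, taking turns (volleying), mark every wave peak?
    Each neuron then only needs to fire once per n_neurons peaks."""
    per_neuron_interval = interval_ms(freq_hz) * n_neurons
    return per_neuron_interval >= ABS_REFRACTORY_MS

print(can_follow(500))                # True: 2 ms between peaks
print(can_follow(1000))               # True, but borderline: exactly 1 ms
print(can_follow(4000))               # False for a single neuron
print(can_follow(4000, n_neurons=5))  # True: each neuron fires at only 800 Hz
```

In practice the slides put the single-neuron limit near 500 Hz and the volleying limit near 5200 Hz; the point of the sketch is only that dividing the work across a group of neurons relaxes the refractory constraint.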
• The basilar membrane vibrates equally throughout the low range of hearing
• Frequency-specific neurons have not been found below 200 Hz
• *Frequency-place theory*
  • frequency encoding at low frequencies (mostly < 500 Hz)
  • place encoding for everything else
[Figure: a 0–5+ kHz frequency axis, with frequency theories covering the low end and place theory covering the rest]

Hearing: Analyzing Complex Sounds
Figure 9.14: Fourier Analysis of a Clarinet Note
• *Fourier analysis*
  • breaking a complex sound into its component frequencies
  • performed mechanically by the basilar membrane
SOURCE: From "How Much Distraction Can You Hear?" by P. Milner, Stereo Review, June 1977, pp. 64–68. © 1977. Reprinted by permission of Stereo Review.

Figure 9.15: Areas Involved in Identifying Environmental Sounds
• Cocktail party effect
  • following one auditory object within a complex sound background
  • selective attention requires directional information as well as the "what" pathway (yellow areas in the figure)
SOURCE: From "Human Brain Regions Involved in Recognizing Environmental Sounds," by J. W. Lewis et al., 2004, Cerebral Cortex, 14, 1008–1021. Reprinted with permission.

Hearing: Locating Sounds with Binaural Cues
Figure 9.16: Sound-Localizing Device Used by 19th-Century Sailors
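One of the binaural cues this section introduces, the difference in time of arrival, can be estimated with a simple geometric model: a sound from angle θ travels an extra path of roughly d·sin θ to the far ear (d = distance between the ears, c = speed of sound), giving an interaural time difference of ITD ≈ (d/c)·sin θ. A sketch under those simplifying assumptions (0.2 m ear separation and 343 m/s are assumed round numbers, and the head's curvature is ignored):

```python
import math

HEAD_WIDTH_M = 0.2      # assumed distance between the ears
SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature

def itd_microseconds(angle_deg):
    """Interaural time difference for a sound at angle_deg from straight ahead
    (0 deg = front, 90 deg = directly to one side), straight-path approximation."""
    return (HEAD_WIDTH_M / SPEED_OF_SOUND) * math.sin(math.radians(angle_deg)) * 1e6

print(round(itd_microseconds(0)))   # → 0
print(round(itd_microseconds(90)))  # → 583
```

The cue vanishes for sounds dead ahead and is maximal for sounds directly to the left or right, which is why binaural differences are greatest off to the sides.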
• Binaural: using both ears
• Differences are *greatest for sounds to our left and right*
  • difference in intensity
  • difference in time of arrival
  • phase difference

Figure 9.17: Differential Intensity and Time-of-Arrival Cues
• Difference in intensity
  • high frequencies
  • sound shadow cast by the head
• Difference in time of arrival
  • low frequencies
  • sound delay between the ears

Figure 9.18: Phase Differences at the Two Ears
• Phase difference between the ears
  • the wave at the far ear lags behind the wave at the nearer ear
  • only useful below about 1500 Hz

Application: Restoring Hearing
• *Hearing aids* are useful for losses associated with middle-ear conduction problems
• *A cochlear implant* is a device for problems with the hair cells
  • stimulates the auditory nerve directly through an implanted receiver/stimulator, bypassing the damaged hair cells
  • sound quality is about the same as a telephone call

Hearing: A Brain Circuit for Detecting Time Differences
Figure 9.19: Difference-in-Time-of-Arrival Circuit
• Coincidence detectors and delay lines
  • a longer pathway from one ear compensates for the sound's delay in reaching the other ear
  • a cell fires most when the inputs from both ears arrive at the same time
  • each detector is therefore specialized for a particular angle of sound
SOURCE: Based on the results of Carr and Konishi (1990).

Application: I Can Hear a Tree Over There
• Echolocation
  • emit a sound; the echoes coming off objects are "seen"
  • bats, dolphins, whales, some birds, submarines
  • occasionally a blind human!
SOURCE: From Figure 3, Thaler, L., Arnott, S. R., & Goodale, M. A. (2011). Neural correlates of natural human echolocation in early and late blind echolocation experts. PLoS ONE, 6, e20162, DOI: 10.1371/journal.pone.0020162.
(The figure was edited to highlight only the later views of Participant EB and control Participant C1.)

Language Defined
• Language: the generation and understanding of written, spoken, and gestural communication
• Acquired through learning
• Specific brain areas are responsible for language
• Impairment: *aphasia*

Language: Broca's Area
Figure 9.20: Language-Related Areas of the Cortex
• Broca (1861): a stroke patient with *frontal lobe* damage anterior to the motor cortex (Broca's area)
• Broca's (expressive) aphasia
  • *Non-fluency* (trouble selecting the right word)
  • Anomia
  • *Inarticulate* speech (pronunciation)
  • Agrammatic speech

Language: Wernicke's Area
Figure 9.20: Language-Related Areas of the Cortex
• Wernicke (1874): left *posterior superior temporal lobe* damage