The Auditory System

The Auditory System
L105/205 – Phonetics   Handout 13
Scarborough   Feb. 28, 2005

reading: Johnson Ch. 3 (today); Johnson Ch. 4 (Wed)
paper: By the end of this week, you should have a concrete plan for your term project.

The Auditory System

1. Remember the speech chain: (planning) → articulation → acoustics → audition → perception (from Denes & Pinson, 1993)

• audition = hearing: registering sounds in the brain
  an acoustic signal in the air is converted into an electrochemical signal in the brain
• perception: decoding the message
  an electrochemical signal in the brain is decoded into segments, words, and ultimately, meanings

2. The auditory system (figure from Borden et al., 2003)

• Outer ear
  ° pinna – funnels the sound (especially from in front of the head)
    the small projection of the pinna over the opening to the ear canal: the tragus
  ↓
  ° external auditory meatus = ear canal – protects the very delicate parts of the middle ear
    cerumen (wax) and cilia (hairs) line the canal and filter out dust, etc.
    The ear canal also filters sound, since it is a resonator tube (open at one end): it amplifies frequencies around 3500–4000 Hz.

• Middle ear
  ° tympanic membrane = eardrum – a thin, stretched membrane separating the outer ear from the middle ear; vibrates with fluctuations of air pressure (sound)
  ↓
  ° ossicular chain – a chain of 3 bones attached at the inside of the eardrum that propagate and amplify sound vibrations
    malleus (hammer) → incus (anvil) → stapes (stirrup) (figure from Denes & Pinson, 1993)
    The signal must next be propagated to the fluid-filled inner ear. But liquid offers a higher impedance (resistance) to sound pressure than air does, so the sound pressure must be increased so that energy transmission isn't simply blocked (reflected). The ossicular chain converts small vibrations on a large surface (the tympanic membrane) into large vibrations on a small surface (the oval window). Sound pressure is increased by approximately 30 dB (see the short dB-to-pressure-ratio sketch after item 3 below).
  ↓
  ° oval window – a membrane leading to the inner ear

• Inner ear
  ° cochlea – a fluid-filled, coil-shaped duct in the temporal bone of the skull; contains the basilar membrane and the organ of Corti
    Rocking of the stapes in the oval window is translated into pressure variations in the cochlear fluids.
  ↓
  ° basilar membrane – coiled membrane stimulated by movements in the cochlear fluid
    "tonotopic" organization: each piece of the membrane responds to a different frequency
      • where the membrane moves – frequency
      • how big the movement is – loudness
      → gives a spectrum of the sound (figure from Borden et al., 2003)
    The membrane is narrow and stiff at the base and wider and less stiff at the apex (the opposite of what one might expect).
  ↓
  ° organ of Corti – the actual sense organ of hearing (auditory receptor); holds rows of hair cells against the basilar membrane and releases an electrochemical signal to the auditory nerve

• Auditory nerve (= 8th cranial nerve) – a bundle of nerve fibers coming from the hair cells
  ↓
  exits the temporal bone via the internal auditory meatus; passes through the brainstem
  ↓
  auditory cortex – area in the temporal lobe (Heschl's gyrus)
  Frequency and intensity information are represented directly (topographically) in the temporal lobe.

3. Sound begins as acoustic energy
   → mechanical energy as it hits the tympanic membrane (and is transmitted through the ossicular chain)
   → hydrodynamic energy in the cochlea
   → electrochemical energy as hair cells are activated via the basilar membrane
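As a quick sanity check on the "approximately 30 dB" middle-ear gain mentioned in item 2, we can use the decibel definition introduced in item 5 below (dB = 20 log10 of a pressure ratio): a 30 dB boost corresponds to roughly a 31.6-fold increase in sound pressure. The minimal Python sketch below is only illustrative; the function names are mine, not the handout's.

```python
import math

def pressure_ratio_from_db(gain_db):
    """Invert dB = 20 * log10(p_out / p_in) to recover the pressure ratio."""
    return 10 ** (gain_db / 20.0)

def db_from_pressure_ratio(ratio):
    """Convert a pressure ratio back into a gain in dB."""
    return 20 * math.log10(ratio)

# The middle ear's ~30 dB gain corresponds to a ~31.6x increase in pressure.
print(round(pressure_ratio_from_db(30.0), 1))   # 31.6
print(round(db_from_pressure_ratio(31.6), 1))   # ~30.0
```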
Loudness

4. The amplitude of a sound wave = the displacement of air pressure from atmospheric pressure.
   - usually measured as root-mean-square (RMS) amplitude, i.e., the square root of the mean of the squared amplitude
     Why? A sound wave involves both negative and positive displacements from atmospheric pressure. Amplitude is the magnitude of displacement, either positive or negative. Squaring makes all the displacements positive so we can take the mean, and then we undo the squaring by taking the square root.

5. The auditory sensation of loudness
   - The perceived loudness of a sound depends on its amplitude. But subjective auditory impressions of loudness differences do not match sound pressure differences.
     e.g., For soft sounds, small changes in pressure yield large changes in perceived loudness; for loud sounds, large changes in pressure yield small changes in loudness.
     e.g., Loudness is sensitive to frequency: low and high frequencies are perceived as quieter than mid frequencies.
   - sones: units of perceived loudness based on subjective judgments (figure from Johnson, 2003)
   - decibel (dB): unit of relative loudness that provides an approximation of the nonlinearity of human loudness sensation; measured in terms of intensity
     • Intensity is proportional to the square of the amplitude.
     • We can measure the intensity of one signal relative to another: the intensity of a sound x relative to a reference sound r is x²/r².
     • A bel is the base-10 log of this ratio: log10(x²/r²)
       A decibel (dB) is one tenth of a bel: 10 log10(x²/r²) = 20 log10(x/r)
     • Reference level:
       - 20 µPa: the lowest audible pressure fluctuation of a 1000 Hz tone (dB SPL)
       - the listener's lowest audible pressure fluctuation at that frequency (dB SL)
       - or any other specified reference
     (Note that the log scaling of the decibel scale approximates human perception of loudness by representing enhanced sensitivity to differences in sound pressure at low pressure levels. However, the decibel scale actually exaggerates this non-linearity.)

Frequency response

6. The response of the auditory system to frequency is also non-linear.
   e.g., For low-frequency sounds, changes in frequency are perceptually greater than acoustically equivalent changes for higher-frequency sounds. i.e., The auditory system is more sensitive to frequency changes at the low end of the audible frequency range than at the high end.
   - This effect is due to the physical structure of the basilar membrane. A relatively large proportion of the membrane responds to low-frequency sounds, so frequency resolution is much better in this range. (figure from Johnson, 2003)
   - Bark: units of auditory frequency (figure from Johnson, 2003)

7. Auditory representations
   Since acoustic scales of loudness and frequency do not match auditory scales, acoustic analyses of speech may not accurately represent what a listener experiences.
   - To avoid this mismatch, we can implement a functional model of the auditory system in our analysis.
   - Bark and sone scaling are examples of one-dimensional models.
   - Spectra can also be calculated according to an auditory model.
     acoustic spectrum vs. auditory spectrum (figure from Johnson, 2003)
   • Auditory models allow us to look at the speech signal from the listener's point of view.
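For concreteness, here is a minimal Python sketch of three quantities discussed in items 4–6: RMS amplitude, sound pressure level in dB re 20 µPa (dB SPL), and an approximate Hz-to-Bark conversion. The RMS and dB formulas follow the handout directly; the Bark formula used here (Traunmüller's analytic approximation) is an assumption on my part, since the handout presents the Bark scale only graphically (from Johnson, 2003). All function names and example values are illustrative.

```python
import math

def rms(samples):
    """Root-mean-square amplitude: square root of the mean of the squared samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def db_spl(pressure_pa, reference_pa=20e-6):
    """Sound pressure level in dB re 20 uPa (dB SPL).
    10*log10 of the intensity ratio = 20*log10 of the pressure ratio."""
    return 20 * math.log10(pressure_pa / reference_pa)

def hz_to_bark(f_hz):
    """Approximate auditory frequency in Bark (Traunmuller's approximation;
    an assumption -- the handout itself does not give a formula)."""
    return 26.81 * f_hz / (1960.0 + f_hz) - 0.53

if __name__ == "__main__":
    # One cycle of a unit-amplitude 1000 Hz tone sampled at 16 kHz (hypothetical values).
    fs = 16000
    tone = [math.sin(2 * math.pi * 1000 * n / fs) for n in range(fs // 1000)]
    print(round(rms(tone), 3))           # ~0.707 for a unit-amplitude sinusoid
    print(round(db_spl(1.0), 1))         # a 1 Pa pressure fluctuation is ~94 dB SPL
    print(round(hz_to_bark(1000.0), 2))  # 1000 Hz is roughly 8.5 Bark
```

Note that dB SL would simply swap in the listener's own threshold pressure at that frequency as the reference value in db_spl.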