Chapter 11: Sound, the Auditory System, and Pitch Perception
Overview of Questions
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
Guide to Sensory Processing.pdf
Guide to Sensory Processing. Prepared by Allison Travnik, MSOTS, Level II Fieldwork Student Project; Kavitha N. Krishnan, MS OTR/L, Fieldwork Instructor.

Sensory Processing: In order to understand what is going on around us, we need to organize all of the incoming sensory information (Ayres, 2005). The sensory information involves what we see, smell, taste, hear, feel on our body, where our body is in relation to others, and how well we are balanced. This is a lot of information that our brains need to process in order to engage in productive behavior, learn, and form accurate perceptions.

The guide's diagram groups the senses around sensory processing: proprioceptive (where our body is in space), tactile (what we feel on our skin), auditory (the noise around us), smell (the different scents around us), sight (what we see around us), oral (the sensations that food gives us in our mouth), and vestibular (our sense of balance).

Jean Ayres developed the sensory integration (SI) theory. SI gives meaning to what our senses are recognizing. When the sensations are not being organized properly in the brain, Ayres compared it to a traffic jam. The traffic jam of sensory information can lead to learning difficulties and problem behavior (Ayres, 2005). Children with Sensory Processing Disorder (SPD) are struggling with this traffic jam. You may notice some of the same qualities about yourself. It is important to remember that everyone has some quirks about their sensory processing, whether it be a sensitivity to loud noises or a dislike of light touch. However, the identification of SPD is reserved for individuals whose sensory quirks are outside of the typical range and affect their daily life. Sensory processing is a dynamic and complex theory. -
Underwater Acoustics: Webinar Series for the International Regulatory Community
Webinar Outline: Marine Animal Sound Production and Reception. Thursday, December 3, 2015 at 12:00pm (US East Coast Time).
Sound production and reception in teleost fish (M. Clara P. Amorim, Ispa – Instituto Universitário)
• Teleost fish are likely the largest vocal vertebrate group. Sounds made by fish can be an important part of marine soundscapes.
• Fish possess the most diversified sonic mechanisms among vertebrates, which include the vibration of the swim bladder through intrinsic or extrinsic sonic muscles, as well as the rubbing of bony elements.
• Fish sounds are usually pulsed (each sonic muscle contraction corresponds to a sound pulse), short (typically shorter than 1 s) and broadband (with most energy below 1 kHz), although some fish produce tonal sounds. Sounds generated by bony elements are often higher in frequency (up to a few kHz).
• In contrast with terrestrial vertebrates, fish have no external or middle ear. Fish detect sounds with the inner ear, which comprises three semicircular canals and three otolithic end organs: the utricle, the saccule and the lagena. Fish mostly detect particle motion and hear up to 1 kHz. Some species have evolved accessory auditory structures that serve as pressure transducers and show enhanced hearing sensitivity and increased frequency detection up to several kHz. Fish hearing seems to have evolved independently of sound production and is important for detecting the 'auditory scene'.
• Acoustic signals are produced during social interactions or during distress situations, as in insects or other vertebrates. Sounds are important in mate attraction, courtship and spawning, or to defend a territory and gain access to food. -
Understanding Sensory Processing: Looking at Children's Behavior Through the Lens of Sensory Processing
Communities of Practice in Autism, September 24, 2009, Charlottesville, VA. Dianne Koontz Lowman, Ed.D., Early Childhood Coordinator, Region 5 T/TAC, James Madison University, MSC 9002, Harrisonburg, VA 22807, [email protected]

Looking at Children's Behavior Through the Lens of Sensory Processing
Do you know a child like this? Travis is constantly moving, pushing, or chewing on things. The collar of his shirt and coat are always wet from chewing. When talking to people, he tends to push up against you. Or do you know another child? Sierra does not like to be hugged or kissed by anyone. She gets upset when other children bump up against her. She doesn't like socks with a heel or toe seam, or any tags on clothes. Why is Travis always chewing? Why doesn't Sierra like to be touched? Why do children react differently to things around them? These children have different ways of reacting to the things around them, to sensations. Over the years, different terms (such as sensory integration) have been used to describe how children deal with the information they receive through their senses. Currently, the term being used to describe children who have difficulty dealing with input from their senses is sensory processing disorder.

Sensory Processing Disorder -
Electromagnetic Field and TGF-β Enhance the Compensatory Plasticity after Sensory Nerve Injury in Cockroach Periplaneta americana
www.nature.com/scientificreports OPEN
Electromagnetic field and TGF-β enhance the compensatory plasticity after sensory nerve injury in cockroach Periplaneta americana
Milena Jankowska, Angelika Klimek, Chiara Valsecchi, Maria Stankiewicz, Joanna Wyszkowska* & Justyna Rogalska
Recovery of function after sensory nerve injury involves compensatory plasticity, which can be observed in invertebrates. The aim of the study was the evaluation of compensatory plasticity in the cockroach (Periplaneta americana) nervous system after sensory nerve injury, and assessment of the effect of electromagnetic field exposure (EMF, 50 Hz, 7 mT) and TGF-β on this process. The bioelectrical activities of nerves (pre- and post-synaptic parts of the sensory path) were recorded under wind stimulation of the cerci before and after right cercus ablation and in insects exposed to EMF and treated with TGF-β. Ablation of the right cercus caused an increase of activity of the left presynaptic part of the sensory path. Exposure to EMF and TGF-β induced an increase of activity in both parts of the sensory path. This suggests strengthening effects of EMF and TGF-β on the insect's ability to recognize stimuli after one cercus ablation. Data from locomotor tests confirmed the electrophysiological results. The takeover of the function of one cercus by the second one proves the existence of compensatory plasticity in the cockroach escape system, which makes it a good model for studying compensatory plasticity. We recommend further research on EMF as a useful factor in neurorehabilitation. Injuries in the nervous system caused by acute trauma, neurodegenerative diseases or even old age are hard to reverse and represent an enormous challenge for modern medicine. -
PHYS 1240 - Sound and Music, Lecture 11: Timbre, Fourier, Sound Spectra, Sampling Size
This is PHYS 1240 - Sound and Music, Lecture 11. Professor Patricia Rankin. Cell phones silent, clickers on. Remember: Learning Team – you can email/skype/facetime/zoom in virtual office hours. Graduate student Teaching Assistant: Tyler C Mcmaken, online 10-11am Friday. Undergraduate Learning Assistants: Madeline Karr, online 6-7pm Monday; Rishi Mayekar, online 11am-12noon Thursday; Miles Warnke, online 3-4pm Wednesday. Professor Patricia Rankin, online 2-3pm Wednesday.
Today: Timbre, Fourier, Sound Spectra, Sampling Size. Next time: Sound Intensity, Intervals, Scales. physicscourses.colorado.edu/phys1240 Canvas site: assignments, administration, grades. Homework: HW5 due Wed February 19th 5pm. Homelabs: Hlab3 due Monday Feb 24th 5pm.
Debrief – last class(es): Waves in pipes. Open-open: n = 1, 2, 3, 4, ...; closed-open: n = 1, 3, 5, ... Pressure nodes at open ends, pressure antinodes at closed ends; displacement nodes/antinodes are opposite to the pressure ones. Open-open: f_n = n·v_s/(2L), so the fundamental doesn't depend just on the length of the pipe (it also depends on the speed of sound). Closed-open: f_n = n·v_s/(4L).
Homework hints: Check your units. You can use the y^x key to find 2^n. Be aware that you may need to do some manipulation, e.g. suppose a question asks you to find the mass per unit length of a string given velocity and tension. You know v_t = √(F/μ), so μ = F/v_t². (A short numerical sketch of these pipe and string formulas follows below.)
Timbre – the tone color/voice of an instrument. Timbre is why we can recognize different instruments from steady tones. Perception vs. physics:
• Pitch: frequency
• Consonance: frequencies are ratios of small integers
• Loudness: overall amplitude
• Timbre: amplitudes of a harmonic series of notes
Superposition: We can add/superimpose sine waves to get a more complex wave profile. The overall shape (timbre) depends on the frequency spectrum (the frequencies of the waves added together) and the amplitudes of the waves. Ohm's acoustic law, sometimes called the acoustic phase law or simply Ohm's law (not to be confused with Ohm's law in electricity and magnetism), states that a musical sound is perceived by the ear as the fundamental note of a set of a number of constituent pure harmonic tones.
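To make the standing-wave formulas above concrete, here is a minimal sketch in Python (the pipe length, tension, and wave speed are illustrative values, not taken from the lecture) that lists the first few resonances of open-open and closed-open pipes and recovers a string's mass per unit length from its tension and transverse wave speed.

```python
# Sketch: resonances of air columns and mass per unit length of a string.
# Assumes v_s = 343 m/s for sound in air; all inputs are illustrative.

def open_open_modes(length_m, n_max=4, v_s=343.0):
    """f_n = n * v_s / (2L) for n = 1, 2, 3, ..."""
    return [n * v_s / (2 * length_m) for n in range(1, n_max + 1)]

def closed_open_modes(length_m, n_max=7, v_s=343.0):
    """f_n = n * v_s / (4L) for odd n = 1, 3, 5, ..."""
    return [n * v_s / (4 * length_m) for n in range(1, n_max + 1, 2)]

def mass_per_unit_length(tension_n, wave_speed_m_s):
    """From v_t = sqrt(F/mu): mu = F / v_t**2."""
    return tension_n / wave_speed_m_s ** 2

L = 0.5  # pipe length in metres
print("open-open (Hz):", [round(f, 1) for f in open_open_modes(L)])
print("closed-open (Hz):", [round(f, 1) for f in closed_open_modes(L)])
print("mu (kg/m):", mass_per_unit_length(100.0, 200.0))  # 100 N, 200 m/s
```

Note how the closed-open pipe of the same length has a fundamental one octave lower and only odd harmonics, exactly as the two formulas predict.
-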
Psychoacoustics: Perception of Normal and Impaired Hearing with Audiology Applications
PSYCHOACOUSTICS: Perception of Normal and Impaired Hearing with Audiology Applications
Editor-in-Chief for Audiology: Brad A. Stach, PhD
Jennifer J. Lentz, PhD
5521 Ruffin Road, San Diego, CA 92123
e-mail: [email protected]
Website: http://www.pluralpublishing.com
Copyright © 2020 by Plural Publishing, Inc. Typeset in 11/13 Adobe Garamond by Flanagan's Publishing Services, Inc. Printed in the United States of America by McNaughton & Gunn, Inc.
All rights, including that of translation, reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, electronic, mechanical, recording, or otherwise, including photocopying, recording, taping, Web distribution, or information storage and retrieval systems, without the prior written consent of the publisher. For permission to use material from this text, contact us by Telephone: (866) 758-7251, Fax: (888) 758-7255, e-mail: [email protected]. Every attempt has been made to contact the copyright holders for material originally printed in another source. If any have been inadvertently overlooked, the publishers will gladly make the necessary arrangements at the first opportunity.
Library of Congress Cataloging-in-Publication Data
Names: Lentz, Jennifer J., author.
Title: Psychoacoustics : perception of normal and impaired hearing with audiology applications / Jennifer J. Lentz.
Description: San Diego, CA : Plural Publishing, -
Auditory System & Hearing
Auditory System & Hearing. Chapter 9, part II. Lecture 17, Jonathan Pillow. Sensation & Perception (PSY 345 / NEU 325), Fall 2017.
Cochlea: a physical device tuned to frequency. • Place code: tuning of different parts of the cochlea to different frequencies.
The auditory nerve (AN): fibers stimulated by inner hair cells. • Frequency selectivity: clearest when sounds are very faint.
Threshold tuning curves for 6 neurons (threshold = the lowest intensity that will give rise to a response; figure axes: threshold in dB versus frequency in kHz). Characteristic frequency: the frequency to which the neuron is most sensitive. (A small sketch of locating the characteristic frequency on a tuning curve follows below.)
Information flow in the auditory pathway: • Cochlear nucleus: first brainstem nucleus at which afferent auditory nerve fibers synapse. • Superior olive: brainstem region in the auditory pathway where inputs from both ears converge. • Inferior colliculus: midbrain nucleus in the auditory pathway. • Medial geniculate nucleus (MGN): part of the thalamus that relays auditory signals to the cortex. • Primary auditory cortex (A1): first cortical area for processing audition (in the temporal lobe). • Belt & parabelt areas: areas beyond A1, where neurons respond to more complex characteristics of sounds.
Basic structure of the mammalian auditory system. Comparing the overall structure of the auditory and visual systems: • Auditory system: a large proportion of processing occurs before A1. • Visual system: a large proportion of processing occurs after V1.
Tonotopic organization: neurons are organized spatially in order of preferred frequency.
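To make the tuning-curve definition above concrete, here is a minimal sketch in Python (the threshold values are invented for illustration, not taken from the lecture) that finds a fiber's characteristic frequency as the frequency at which its threshold is lowest.

```python
# Sketch: characteristic frequency from a threshold tuning curve.
# Maps test frequency (kHz) to the lowest intensity (dB) that evokes a
# response; the numbers are made up for illustration.
tuning_curve = {
    0.5: 62.0,
    1.0: 48.0,
    2.0: 25.0,
    4.0: 12.0,   # lowest threshold -> characteristic frequency
    8.0: 40.0,
    16.0: 70.0,
}

# Characteristic frequency = frequency with the minimum threshold.
cf = min(tuning_curve, key=tuning_curve.get)
print(f"characteristic frequency ~ {cf} kHz (threshold {tuning_curve[cf]} dB)")
```
-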
The Physics of Sound
Sound lies at the very center of speech communication. A sound wave is both the end product of the speech production mechanism and the primary source of raw material used by the listener to recover the speaker's message. Because of the central role played by sound in speech communication, it is important to have a good understanding of how sound is produced, modified, and measured. The purpose of this chapter will be to review some basic principles underlying the physics of sound, with a particular focus on two ideas that play an especially important role in both speech and hearing: the concept of the spectrum and acoustic filtering. The speech production mechanism is a kind of assembly line that operates by generating some relatively simple sounds consisting of various combinations of buzzes, hisses, and pops, and then filtering those sounds by making a number of fine adjustments to the tongue, lips, jaw, soft palate, and other articulators. We will also see that a crucial step at the receiving end occurs when the ear breaks this complex sound into its individual frequency components in much the same way that a prism breaks white light into components of different optical frequencies. Before getting into these ideas it is first necessary to cover the basic principles of vibration and sound propagation.

Sound and Vibration
A sound wave is an air pressure disturbance that results from vibration. The vibration can come from a tuning fork, a guitar string, the column of air in an organ pipe, the head (or rim) of a snare drum, steam escaping from a radiator, the reed on a clarinet, the diaphragm of a loudspeaker, the vocal cords, or virtually anything that vibrates in a frequency range that is audible to a listener (roughly 20 to 20,000 cycles per second for humans). A short numerical illustration of the spectrum idea follows below.
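The spectrum idea in the passage above can be illustrated with a short sketch (Python with NumPy; the component frequencies and amplitudes are invented for illustration): a complex periodic sound is built from a few harmonics, and a Fourier transform recovers the strength of each frequency component, much as the ear separates a complex sound into its parts.

```python
# Sketch: building a complex tone from harmonics and recovering its spectrum.
import numpy as np

fs = 8000                      # sample rate (Hz)
t = np.arange(0, 0.5, 1 / fs)  # half a second of samples
f0 = 200                       # fundamental frequency (Hz)

# A buzz-like waveform: fundamental plus two weaker harmonics.
signal = (1.00 * np.sin(2 * np.pi * f0 * t)
          + 0.50 * np.sin(2 * np.pi * 2 * f0 * t)
          + 0.25 * np.sin(2 * np.pi * 3 * f0 * t))

# Magnitude spectrum: how much of each frequency the waveform contains.
spectrum = np.abs(np.fft.rfft(signal)) / (len(signal) / 2)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

# Report the components that carry appreciable energy.
for f, amp in zip(freqs, spectrum):
    if amp > 0.1:
        print(f"{f:6.1f} Hz  amplitude ~ {amp:.2f}")
```

Running the sketch prints the 200, 400 and 600 Hz components with amplitudes close to 1.0, 0.5 and 0.25, i.e. the spectrum recovers exactly the ingredients the waveform was built from.
-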
Sensory Change Following Motor Learning
A. M. Green, C. E. Chapman, J. F. Kalaska and F. Lepore (Eds.), Progress in Brain Research, Vol. 191. ISSN: 0079-6123. Copyright © 2011 Elsevier B.V. All rights reserved.
CHAPTER 2. Sensory change following motor learning
Andrew A. G. Mattar, Sazzad M. Nasir, Mohammad Darainy, and David J. Ostry
Department of Psychology, McGill University, Montréal, Québec, Canada; Shahed University, Tehran, Iran; Haskins Laboratories, New Haven, Connecticut, USA; The Roxelyn and Richard Pepper Department of Communication Sciences and Disorders, Northwestern University, Evanston, Illinois, USA
Abstract: Here we describe two studies linking perceptual change with motor learning. In the first, we document persistent changes in somatosensory perception that occur following force field learning. Subjects learned to control a robotic device that applied forces to the hand during arm movements. This led to a change in the sensed position of the limb that lasted at least 24 h. Control experiments revealed that the sensory change depended on motor learning. In the second study, we describe changes in the perception of speech sounds that occur following speech motor learning. Subjects adapted control of speech movements to compensate for loads applied to the jaw by a robot. Perception of speech sounds was measured before and after motor learning. Adapted subjects showed a consistent shift in perception. In contrast, no consistent shift was seen in control subjects and subjects that did not adapt to the load. These studies suggest that motor learning changes both sensory and motor function.
Keywords: motor learning; sensory plasticity; arm movements; proprioception; speech motor control; auditory perception.
Introduction: … the human motor system and, likewise, to skill acquisition in the adult nervous system. -
Loudness Standards in Broadcasting. Case Study of EBU R-128 Implementation at SWR
Loudness standards in broadcasting. Case study of EBU R-128 implementation at SWR. Damià Carbonell Tena, academic year 2015-2016. Director: Enric Giné Guix. Bachelor's degree in Audiovisual Systems Engineering (Grau en Enginyeria de Sistemes Audiovisuals), final degree project (Treball de Fi de Grau), Escola Superior Politècnica, UPF, 2016.
Dedication: For the Schaupp family. With you I feel at home, and I know I will always have a second family in Germany. Without you this work would not have been possible. Thank you very much!
Thanks: I would like to thank the SWR for being so understanding with me and for letting me have this wonderful experience with them, and for all the help, experience and time given to me. Thanks to all the engineers and technicians in the house, Jürgen Schwarz, Armin Büchele, Reiner Liebrecht, Katrin Koners, Oliver Seiler, Frauke von Mueller-Rick, Patrick Kirsammer, Christian Eickhoff, Detlef Büttner, Andreas Lemke, Klaus Nowacki and Jochen Reß, who helped and advised me, and a special thanks to Manfred Schwegler, who was always ready to help me, and to Dieter Gehrlicher for his understanding. Also to my teacher and adviser Enric Giné for his patience and dedication, and to the team of the Secretaria ESUP, who answered all the questions asked during the process. Of course to my Catalan and German families for the moral (and economic) support, and to Ema Madeira for all the corrections, revisions and love given during my stay far from home. -
Advance and Unedited Reporting Material (English Only)
13 March 2018 (corr.). Advance and unedited reporting material (English only). Seventy-third session. Oceans and the law of the sea. Report of the Secretary-General.
Summary: In paragraph 339 of its resolution 71/257, as reiterated in paragraph 354 of resolution 72/73, the General Assembly decided that the United Nations Open-ended Informal Consultative Process on Oceans and the Law of the Sea would focus its discussions at its nineteenth meeting on the topic "Anthropogenic underwater noise". The present report was prepared pursuant to paragraph 366 of General Assembly resolution 72/73 with a view to facilitating discussions on the topic of focus. It is being submitted for consideration by the General Assembly and also to the States parties to the United Nations Convention on the Law of the Sea, pursuant to article 319 of the Convention.
Contents: I. Introduction. II. Nature and sources of anthropogenic underwater noise. III. Environmental and socioeconomic aspects. IV. Current activities and further needs with regard to cooperation and coordination in addressing anthropogenic underwater noise. V. Conclusions.
I. Introduction. 1. The marine environment is subject to a wide array of human-made noise. Many human activities with socioeconomic significance introduce sound into the marine environment either intentionally for a specific purpose (e.g., seismic surveys) or unintentionally as a by-product of their activities (e.g., shipping). In addition, there is a range of natural sound sources from physical and biological origins such as wind, waves, swell patterns, currents, earthquakes, precipitation and ice, as well as the sounds produced by marine animals for communication, orientation, navigation and foraging. -
EBU R 128 – the EBU Loudness Recommendation
10 things you need to know about... EBU R 128 – the EBU loudness recommendation

1. EBU R 128 is at the core of a true audio revolution: audio levelling based on loudness, not peaks.
Audio signal normalisation based on peaks has led to more and more dynamic compression and a phenomenon called the 'Loudness War'. The problem arose because of misuse of the traditional metering method (Quasi Peak Programme Meter – QPPM) and the extended headroom due to digital delivery. Loudness normalisation ends the 'Loudness War' and brings audio peace to the audience.

2. ITU-R BS.1770-2 defines the basic measurement; EBU R 128 builds on it and extends it.
BS.1770-2 is an international standard that describes a method to measure loudness, an inherently subjective impression. It introduces 'K-weighting', a simple weighting curve that leads to a good match between subjective impression and objective measurement. EBU R 128 takes BS.1770-2 and extends it with the descriptor Loudness Range and the Target Level: -23 LUFS (Loudness Units referenced to Full Scale). A tolerance of ±1 LU is generally acceptable.

3. Gating of the measurement is used to achieve better loudness matching of programmes that contain longer periods of silence.
Longer periods of silence in programmes lead to a lower measured loudness level; after subsequent loudness normalisation such programmes would end up too loud. In BS.1770-2 a relative gate of 10 LU (Loudness Units; 1 LU is equivalent to 1 dB) below the ungated loudness level is used to eliminate these low-level periods from the measurement. A simplified numerical sketch of this gated measurement and normalisation follows below.
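As a rough illustration of how the -23 LUFS target and the gating described in points 2 and 3 can be applied, here is a simplified sketch in Python (the block loudness values are invented, and the -70 LUFS absolute gate and block-energy averaging are assumptions based on the general BS.1770-2 gating scheme; the per-channel K-weighted measurement itself is omitted). It gates a series of block loudness values, computes an integrated loudness, and derives the gain needed to reach the EBU R 128 target.

```python
# Simplified sketch of BS.1770-style gating and EBU R 128 normalisation.
# Input: loudness of consecutive measurement blocks, in LUFS (values invented).
import math

TARGET_LUFS = -23.0        # EBU R 128 Target Level
ABSOLUTE_GATE = -70.0      # assumed absolute gate (LUFS)
RELATIVE_GATE_LU = 10.0    # relative gate: 10 LU below the ungated level

def mean_loudness(blocks):
    """Energy-average a list of block loudness values given in LUFS."""
    energies = [10 ** (lufs / 10) for lufs in blocks]
    return 10 * math.log10(sum(energies) / len(energies))

def integrated_loudness(blocks):
    # 1) discard blocks below the absolute gate (near-silence)
    audible = [b for b in blocks if b >= ABSOLUTE_GATE]
    # 2) relative gate: 10 LU below the loudness of the remaining blocks
    threshold = mean_loudness(audible) - RELATIVE_GATE_LU
    gated = [b for b in audible if b >= threshold]
    return mean_loudness(gated)

# Example: a programme with long quiet passages (illustrative values).
blocks = [-24.0, -23.5, -60.0, -80.0, -22.5, -55.0, -23.0]
loudness = integrated_loudness(blocks)
gain_db = TARGET_LUFS - loudness
print(f"integrated loudness: {loudness:.1f} LUFS, "
      f"gain to target: {gain_db:+.1f} dB")
```

Because the quiet blocks are gated out, the measured loudness reflects the foreground programme, so normalising to -23 LUFS does not make programmes with long silences end up too loud.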