CHAPTER 15

Perception and Interactive Technology

MEIKE SCHELLER, KARIN PETRINI, AND MICHAEL J. PROULX

This work was supported by grant SG142127 from the British Academy/Leverhulme.

"What does it mean, to see?" was the question that David Marr used to motivate his computational approach to understanding vision (Marr, 1982). Marr's answer, building on Aristotle, was that "vision is the process of discovering from images what is present in the world, and where it is" (p. 3). Although we humans might have a preference for visual perception, we are endowed with other senses that provide us with a rich experience (Chapters 2, 3, 4, 5, and 14, this volume). Therefore, the broader question might be: What does it mean, to perceive? Although this might be seen as a philosophical question of sorts, it gets to the important issue of how we define perceptual experience scientifically so that we may study it. Defining it is crucial for research applications: If we aim to restore a sense such as vision in blindness or hearing in deafness, what does it mean to see or to hear, such that we will know when restoration has been successful? This chapter reviews the interaction between multisensory perception and interactive technological approaches to sensory rehabilitation. It builds on research in multisensory perception, sensory impairment, and the development of cognition to provide a foundation for understanding the psychological and neural basis for sensory rehabilitation. The interface between experimental psychology and technology provides challenges for basic and applied research and, as a result, great opportunities to explore psychology and cognitive neuroscience in novel ways.

We will first provide an outline of human sensory perception using single (unisensory) and multiple (multisensory) senses. This first section highlights the interplay between different sensory modalities in the construction of a precise and accurate representation of the environment, and the mechanisms our brains have developed to deal with physical uncertainty. We specifically focus on optimal multisensory integration and its development during ontogeny. We then look into the adaptation of human perception to sensory or motor deficits, that is, when one or multiple senses are impaired or the motor system is not functioning normally. We describe how sensory loss or impairment affects individuals in their everyday lives and how deficits in one sense affect development in the remaining, intact senses. The role that action and motor impairment play in this perceptual framework is also discussed. We then outline current sensory rehabilitation techniques, with a focus on auditory and visual rehabilitation, as these domains are the more extensively investigated, thereby drawing a clear distinction between sensory restoration and sensory substitution. The function and benefits of these different techniques for certain populations will be discussed, and the chapter closes with some remarks on the outlook of interactive technology in sensory rehabilitation research and application.

UNISENSORY AND MULTISENSORY PERCEPTION

Our sensory systems have been shaped by evolutionary processes in such a way that we are well adapted to the natural world we live in and respond accurately to biologically relevant events (Kaas, 1989; Machens, Gollisch, Kolesnikova, & Herz, 2005; Nummela et al., 2013). Humans have a number of senses that consist of arrays of different types of receptors: electromagnetic receptors, mechanoreceptors, chemoreceptors, thermoreceptors, and pain receptors. We take up information from the environment using these receptors by transforming the different forms of energy (e.g., electromagnetic radiation, pressure waves) into electrical signals. This process is called transduction, and it enables us to perceive the different forms of energy in one and the same format, namely as electrical impulses. These impulses, in turn, are sent to the central nervous system via neural pathways. Our central nervous system then processes and combines the information in a way that makes us perceive and recognize the world around us, eventually leading to ecologically relevant behavior.

The process of perception is strongly characterized by the combination of different, as well as redundant, information derived from our sensory organs. It is not a unidirectional process but stays in constant dynamic interaction with the actions we make. We actively use our body to facilitate perception by sampling our environment in the best way possible. For example, we need to actively explore or manipulate an object in order to gain enough information to recognize it (Chapter 5, this volume; Hollins & Risner, 2000). This clearly makes touch an inherently active sense. However, with the aim of controlling our actions appropriately, perception must be frequently updated via sensory feedback, which arises from our actions. In fact, not only touch but also other senses like vision, proprioception, and audition critically depend on the fine-tuned recalibration of action and perception (Cressman & Henriques, 2011; Proulx et al., 2015).

The environment we live in is not stable but complex and dynamic. Moreover, all stimuli in our environment can be differentiated along multiple features. For example, sounds vary in amplitude and pitch, while light varies in hue and luminance. This vast variation of environmental stimuli, which on the one hand helps our brain structure our complex lives, also emphasizes the need for our sensory systems to be flexible in how they process incoming information, regardless of whether it arises from the same or from different modalities (Chapter 14, this volume).

Here is an example: In order to judge the spatial distance of an object visually, our eyes provide us with a number of different visual cues. The perception of depth, which is crucial for estimating the distance and relative position of objects in space, arises from the combination of information from monocular cues like perspective, occlusion, shading, or relative size, as well as binocular cues like retinal disparity and convergence. Furthermore, extra-retinal cues, like signals from the eye muscles, also have to be taken into account by the brain to determine in which direction the eyes are looking. This already shows that vision is much more complex than we often think, and that even within one sensory system the amount of information our brain processes in order to compute a single object feature, like depth, is immense and not restricted to the visual sense alone.
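How such redundant cues are merged is taken up later in this chapter under the heading of optimal multisensory integration. As a purely illustrative sketch of the reliability-weighted (maximum-likelihood) averaging rule commonly used to formalize that kind of integration, the code below combines two noisy estimates of a single object's distance; the function name and the numerical values are our own and are not taken from the chapter.

```python
import numpy as np

def integrate_cues(estimates, sigmas):
    """Reliability-weighted (maximum-likelihood) combination of redundant cues.

    estimates : independent estimates of the same property (e.g., distance in meters)
    sigmas    : standard deviation (noise) of each cue; smaller sigma = more reliable cue
    Returns the combined estimate and its predicted standard deviation.
    """
    estimates = np.asarray(estimates, dtype=float)
    reliability = 1.0 / np.asarray(sigmas, dtype=float) ** 2   # inverse variance
    weights = reliability / reliability.sum()                   # weights sum to 1
    combined = float(np.dot(weights, estimates))                # reliability-weighted average
    combined_sd = float(np.sqrt(1.0 / reliability.sum()))       # never worse than the best cue
    return combined, combined_sd

# Purely illustrative numbers: a precise visual estimate and a noisier auditory one.
distance, sd = integrate_cues(estimates=[3.0, 3.6], sigmas=[0.2, 0.6])
print(f"combined distance = {distance:.2f} m, predicted sd = {sd:.2f} m")
```

The key property of this rule is that the combined estimate is drawn toward the more reliable cue, and its predicted variability is never worse than that of the best single cue.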
When we stick to the example of depth as a distance cue, we find that vision is the sense that is dominantly used for estimating spatial depth at distances that are out of physical reach (Battaglia, Jacobs, & Aslin, 2003). However, our sense of hearing can additionally extract spatial distance cues from the environment using the frequency spectrum, interaural loudness differences, and interaural time differences (Moore, 2003). This becomes particularly important when information in the environment is limited or ambiguous. Vision itself, for instance, is often ambiguous because a three-dimensional visual scene is projected onto a two-dimensional retinal image; mapping the two-dimensional image back onto a three-dimensional scene can result in many different possible outcomes. Similarly, reliance on self-motion cues can result in well-known perceptual misinterpretations, as the somatogravic illusion shows (see Figure 15.1). Here, the vestibular system, which provides us with rotational and translational movement cues, is tricked in such a way that the acceleration or deceleration of, for example, an airplane evokes the sensation of one's own body being tilted upward or downward, which is in turn misperceived as an upward or downward tilting of the airplane. This interpretation can result in dangerous maneuvers if not corrected.

Besides situations in which the information from one sense is ambiguous or missing, there are situations in which the presence of environmental or internal noise can drastically affect our perception. Noise is present in all the stimuli surrounding us and arises from their physical nature, like clutter affecting sound waves or quantum fluctuations of light. Internal noise, which results from variability in neural coding or fluctuations of attention, can also affect perception at many different stages of processing. For example, one may think that walking in a straight line while blindfolded is an easy task. As long as the distance to be traveled is only a couple of meters, it probably is. However, Souman and colleagues (2009) showed that during navigation it is much harder to maintain a straight walking route when the information input is limited to fewer senses and the level of sensory noise is increased in the absence of visual calibration. In one experiment in which participants were asked to walk a straight line, even participants who were not blindfolded could not walk straight on cloudy days. When the sun was not visible, they started to walk in circles, whereas the other participants, who walked on sunny days, followed almost perfectly straight routes. Souman and colleagues concluded that when no external reference such as the sun is available to recalibrate the perceived walking direction, small errors in the remaining sensory and motor signals accumulate and eventually lead walkers into circles.
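The walking-in-circles result can be given a rough intuition with a toy simulation: if the walker's heading picks up a small random error at every step and no external reference is available to correct it, the path gradually curls, whereas periodic recalibration keeps it close to straight. The sketch below is our own illustrative model, not the analysis used by Souman and colleagues, and all parameter values are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulated_walk(n_steps=2000, step_m=0.5, heading_noise_sd=0.03, recalibrate_every=None):
    """Toy walker whose heading accumulates random error at every step.

    If recalibrate_every is given, the heading is pulled back toward the
    intended direction (0 rad) at that interval, standing in for an external
    reference such as the sun or a visible landmark.
    """
    heading, x, y = 0.0, 0.0, 0.0
    for i in range(1, n_steps + 1):
        heading += rng.normal(0.0, heading_noise_sd)      # internal noise accumulates
        if recalibrate_every and i % recalibrate_every == 0:
            heading *= 0.1                                 # partial correction from the reference
        x += step_m * np.cos(heading)                      # advance one step along current heading
        y += step_m * np.sin(heading)
    return x, y

print("no external reference:   net forward progress %.0f m" % simulated_walk()[0])
print("with periodic reference: net forward progress %.0f m" % simulated_walk(recalibrate_every=20)[0])
```

The qualitative point is that accumulated noise alone, without any systematic bias, is enough to bend the walked path once calibration is removed.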