Abstract: The Effects of Presentation Modality on …
Total pages: 16; file type: PDF; size: 1,020 KB
Recommended publications
- Understanding the Role of the Modality Principle in Multimedia Learning Environments (Amy Marie Oberfoell, Iowa State University)
Graduate thesis (Master of Science), Iowa State University, Ames, Iowa, 2015. Major: Education (Curriculum and Technology Instruction). Program of Study Committee: Ana-Paula Correia (Major Professor), Denise Schmidt-Crawford, Volker Hegelheimer. Copyright © Amy Marie Oberfoell. Recommended citation: Oberfoell, Amy Marie, "Understanding the role of the modality principle in multimedia learning environments" (2015). Graduate Theses and Dissertations. 14602. https://lib.dr.iastate.edu/etd/14602. Subject areas: Curriculum and Instruction; Instructional Media Design.
- Crossmodal Attention (Jon Driver and Charles Spence)
Abstract: Most selective attention research has considered only a single sensory modality at a time, but in the real world, our attention must be coordinated crossmodally. Recent studies reveal extensive crossmodal links in attention across the various modalities (i.e. audition, vision, touch and proprioception). Attention typically shifts to a common location across the modalities, despite the vast differences in their initial coding of space. These spatial synergies in attention can be maintained even when receptors are realigned across the modalities by changes in posture. Some crossmodal integration can arise preattentively. The mechanisms underlying these crossmodal links can be examined in a convergent manner by integrating behavioural studies of normal subjects and brain-damaged patients with neuroimaging and neurophysiological studies. … multimodal representations of space, within which attention can be directed. Attending to one sensory modality versus another: the most basic crossmodal issue concerning attention is whether people can attend selectively to one modality at the expense of others, or whether the modalities are so independent that concentrating on one has no implications for the others. As long ago as 1893, Wundt [1] claimed that attention can speed up the processing of a relevant modality, while delaying irrelevant modalities; many subsequent authors have concurred. However, as we have pointed out elsewhere [2••], most studies on this issue have suffered from methodological flaws. For instance, stimuli in different modalities were typically presented from different spatial positions, so attention to a modality was confounded with attention to a location.
- Variance Adaptation in Navigational Decision Making (Ruben Gepner, Jason Wolk, Digvijay Shivaji Wadekar, Sophie Dvali, and Marc Gershow)
Research article. Department of Physics, Center for Neural Science, and Neuroscience Institute, New York University, New York, United States. Abstract: Sensory systems relay information about the world to the brain, which enacts behaviors through motor outputs. To maximize information transmission, sensory systems discard redundant information through adaptation to the mean and variance of the environment. The behavioral consequences of sensory adaptation to environmental variance have been largely unexplored. Here, we study how larval fruit flies adapt sensory-motor computations underlying navigation to changes in the variance of visual and olfactory inputs. We show that variance adaptation can be characterized by rescaling of the sensory input and that for both visual and olfactory inputs, the temporal dynamics of adaptation are consistent with optimal variance estimation. In multisensory contexts, larvae adapt independently to variance in each sense, and portions of the navigational pathway encoding mixed odor and light signals are also capable of variance adaptation. Our results suggest multiplication as a mechanism for odor-light integration. DOI: https://doi.org/10.7554/eLife.37945.001 Introduction: The world is not fixed but instead varies dramatically on multiple time scales, for example from cloudy to sunny, day to night, or summer to winter, and contexts: from sun to shade, or burrowing to walking. Nervous systems must respond appropriately in varied environments while obeying biophysical constraints and minimizing energy expenditure (Niven and Laughlin, 2008).
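The rescaling account summarized in the abstract above lends itself to a small numerical illustration. The sketch below is not the authors' model; the function name `adapt_variance`, the time constant `tau`, and the toy stimulus are all assumptions made here. It only shows how dividing a zero-mean input by a running estimate of its standard deviation keeps the output variance roughly constant when the input variance steps up.

```python
import numpy as np

def adapt_variance(signal, tau=50.0):
    """Rescale a zero-mean sensory input by a running estimate of its standard deviation.

    Toy illustration of variance adaptation: the adapted output is the raw input
    divided by an exponentially weighted estimate of its spread, so output variance
    stays roughly constant when input variance changes. `tau` (in samples) sets the
    adaptation time scale and is an arbitrary choice for this sketch.
    """
    var_est = np.var(signal[: int(tau)]) + 1e-9   # initial variance estimate
    alpha = 1.0 / tau                             # update weight per sample
    adapted = np.empty_like(signal, dtype=float)
    for t, x in enumerate(signal):
        var_est = (1 - alpha) * var_est + alpha * x**2   # running variance (zero-mean input)
        adapted[t] = x / np.sqrt(var_est)                # rescaled ("adapted") signal
    return adapted

# Example: a zero-mean stimulus whose standard deviation steps up tenfold halfway through.
rng = np.random.default_rng(0)
stim = np.concatenate([rng.normal(0, 1, 2000), rng.normal(0, 10, 2000)])
adapted = adapt_variance(stim)
print(np.std(adapted[:2000]).round(2), np.std(adapted[-500:]).round(2))  # both close to 1
```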
- The Split-Attention Principle in Multimedia Learning (Paul Ayres and John Sweller, University of New South Wales)
Chapter 8. Abstract: The split-attention principle states that when designing instruction, including multimedia instruction, it is important to avoid formats that require learners to split their attention between, and mentally integrate, multiple sources of information. Instead, materials should be formatted so that disparate sources of information are physically and temporally integrated, thus obviating the need for learners to engage in mental integration. By eliminating the need to mentally integrate multiple sources of information, extraneous working memory load is reduced, freeing resources for learning. This chapter provides the theoretical rationale, based on cognitive load theory, for the split-attention principle, describes the major experiments that establish the validity of the principle, and indicates the instructional design implications when dealing with multimedia materials. Definition of split-attention: Instructional split-attention occurs when learners are required to split their attention between, and mentally integrate, several sources of physically or temporally disparate information, where each source of information is essential for understanding the material. Cognitive load is increased by the need to mentally integrate the multiple sources of information. This increase in extraneous cognitive load (see chapter 2) is likely to have a negative impact on learning compared to conditions where the information has been restructured to eliminate the need to split attention. Restructuring occurs by physically or temporally integrating disparate sources of information to eliminate the need for mental integration. The split-attention effect occurs when learners studying integrated information outperform …
- Peri-Personal Space as a Prior in Coupling Visual and Proprioceptive Signals (Jean-Paul Noel, Majed Samad, Andrew Doxon, Justin Clark, Sean Keller, and Massimiliano Di Luca)
www.nature.com/scientificreports (open access). Received: 24 April 2018; accepted: 7 October 2018. Abstract: It has been suggested that the integration of multiple body-related sources of information within the peri-personal space (PPS) scaffolds body ownership. However, a normative computational framework detailing the functional role of PPS is still missing. Here we cast PPS as a visuo-proprioceptive Bayesian inference problem whereby objects we see in our environment are more likely to engender sensations as they come near to the body. We propose that PPS is the reflection of such an increased a priori probability of visuo-proprioceptive coupling that surrounds the body. To test this prediction, we immersed participants in a highly realistic virtual reality (VR) simulation of their right arm and surrounding environment. We asked participants to perform target-directed reaches toward visual, proprioceptive, and visuo-proprioceptive targets while visually displaying their reaching arm (body visible condition) or not (body invisible condition). Reach end-points are analyzed in light of the coupling prior framework, where the extension of PPS is taken to be represented by the spatial dispersion of the coupling prior between visual and proprioceptive estimates of arm location. Results demonstrate that if the body is not visible, the spatial dispersion of the visuo-proprioceptive coupling relaxes, whereas the strength of coupling remains stable. By demonstrating a distance-dependent alteration in visual and proprioceptive localization attractive pull toward one another (stronger pull at small spatial discrepancies) when the body is rendered invisible – an effect that is well accounted for by the visuo-proprioceptive coupling prior – the results suggest that the visible body grounds visuo-proprioceptive coupling preferentially in the near vs. …
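The coupling-prior framework mentioned in this abstract can be sketched with a standard Gaussian formulation. The example below is an illustrative assumption, not the paper's fitted model: it computes MAP estimates of visual and proprioceptive arm position under a zero-mean Gaussian prior on their discrepancy, and shows how widening that prior (as the abstract suggests happens when the body is invisible) weakens the attractive pull between the two estimates. The function name, the variances, and the 6 cm conflict are made up for illustration.

```python
import numpy as np

def coupled_map_estimates(m_vis, m_prop, var_vis, var_prop, var_coupling):
    """MAP estimates of arm position from one visual and one proprioceptive measurement
    under a zero-mean Gaussian coupling prior on their discrepancy.

    Minimizes (m_vis - x_v)^2/var_vis + (m_prop - x_p)^2/var_prop + (x_v - x_p)^2/var_coupling.
    A small var_coupling forces near-fusion of the two estimates; a large one
    (a "relaxed" prior) lets them stay apart.
    """
    d = m_vis - m_prop                        # measured visuo-proprioceptive conflict
    denom = var_vis + var_prop + var_coupling
    x_v = m_vis - var_vis / denom * d         # visual estimate pulled toward proprioception
    x_p = m_prop + var_prop / denom * d       # proprioceptive estimate pulled toward vision
    return x_v, x_p

# Example: a 6 cm conflict between equally reliable cues (variances in cm^2, assumed values).
for var_c in (1.0, 25.0, 400.0):              # tight vs. progressively relaxed coupling prior
    xv, xp = coupled_map_estimates(10.0, 4.0, 4.0, 4.0, var_c)
    print(f"var_coupling={var_c:6.1f}  visual={xv:5.2f}  proprio={xp:5.2f}  residual gap={xv - xp:4.2f}")
```

As the prior variance grows, the residual gap between the two estimates approaches the full 6 cm conflict, i.e. the attractive pull vanishes.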
- Control Systems and Homeostasis
About this chapter: general properties of sensory systems; somatic senses; chemoreception (smell and taste); the ear (hearing and equilibrium); the eye and vision. Sensory pathways: a stimulus arrives as physical energy at a sensory receptor, which acts as a transducer; the intracellular signal is usually a change in membrane potential; a stimulus above threshold triggers an action potential to the CNS; integration in the CNS occurs in the cerebral cortex or is acted on subconsciously. Receptors for particular forms of energy include naked ("free") nerve endings, complex neural receptors encased in connective tissue capsules, smell receptors (which are neurons), and non-neural receptors for the four special senses. Chemoreceptors respond to chemical ligands (taste, smell); mechanoreceptors respond to mechanical energy such as pressure and sound (hearing); thermoreceptors respond to temperature; photoreceptors for vision respond to light. Figure 10.1 (simple, complex, and nonneural sensory receptors): simple receptors are neurons with free nerve endings and may have myelinated or unmyelinated axons; complex neural receptors have nerve endings enclosed in connective tissue capsules (the illustration shows a Pacinian corpuscle, which senses touch); most special senses receptors are cells that release neurotransmitter onto sensory neurons, initiating an action potential (the cell illustrated is a hair cell, found in the ear). (© 2016 Pearson Education, Inc.)
- Modality Effects and the Structure of Short-Term Verbal Memory (Catherine G. Penney, Memorial University of Newfoundland)
Memory & Cognition, 1989, 17 (4), 398-422. Abstract: The effects of auditory and visual presentation upon short-term retention of verbal stimuli are reviewed, and a model of the structure of short-term memory is presented. The main assumption of the model is that verbal information presented to the auditory and visual modalities is processed in separate streams that have different properties and capabilities. Auditory items are automatically encoded in both the A (acoustic) code, which, in the absence of subsequent input, can be maintained for some time without deliberate allocation of attention, and a P (phonological) code. Visual items are retained in both the P code and a visual code. Within the auditory stream, successive items are strongly associated; in contrast, in the visual modality, it is simultaneously presented items that are strongly associated. These assumptions about the structure of short-term verbal memory are shown to account for many of the observed effects of presentation modality. Penney (1975) reviewed the literature on the effects of presentation modality on short-term retention of verbal material and concluded that there were separate stores for auditorily and visually presented information. To emphasize the idea that active processing underlies retention in short-term memory, and to discourage the conceptuali… … the distraction did not require a long period of vocalization. Overt vocalization of a visually presented list by the subject produced much the same effect as did auditory presentation on the recency part of the serial position curve, but subject vocalization tended to reduce recall in the nonrecency part of the serial position curve.
- Sensory Receptors (A17)
Last updated: April 20, 2019. Sensory receptors are transducers that convert various forms of energy in the environment into action potentials in neurons. Sensory receptors may be: (a) neurons (the distal tip of the peripheral axon of a sensory neuron), e.g. in skin receptors; or (b) specialized cells that release neurotransmitter and generate action potentials in neurons, e.g. in complex sense organs (vision, hearing, equilibrium, taste). A sensory receptor is often associated with nonneural cells that surround it, forming a sense organ. To stimulate a receptor, the stimulus must first pass through intervening tissues (stimulus accession). Each receptor is adapted to respond to one particular form of energy at a much lower threshold than other receptors respond to this form of energy. The adequate (or appropriate) stimulus is the form of energy to which a receptor is most sensitive; receptors can also respond to other energy forms, but at much higher thresholds (e.g. the adequate stimulus for the eye is light; rubbing the eyeball will stimulate rods and cones to produce a light sensation, but the threshold is much higher than in skin pressure receptors). When information about a stimulus reaches the CNS, it produces: (a) a reflex response, (b) a conscious sensation, or (c) a behavior alteration. Sensory modalities (conscious sensations):
Sensory modality | Receptor | Sense organ
Vision | Rods and cones | Eye
Hearing | Hair cells | Ear (organ of Corti)
Smell | Olfactory neurons | Olfactory mucous membrane
Taste | Taste receptor cells | Taste bud
Rotational acceleration | Hair cells | Ear (semicircular canals)
- An Examination of the Effects of Mathematics Anxiety, Modality, and Learner-Control on Teacher Candidates in Multimedia Learning Environments (Elena Elizabeth Ward, Queen's University)
Master of Education thesis, Faculty of Education, Queen's University, Kingston, Ontario, Canada, September 2008. Copyright © Elena Elizabeth Ward, 2008. ISBN: 978-0-494-42743-9. Abstract: This study examined mathematics anxiety among elementary teacher candidates, and to what extent it interacted with the modality principle under various degrees of learner-control. The experiment involved a sample of 186 elementary teacher candidates learning from eight versions of a computer program on division with fractions. The eight versions varied in modality of presentation (diagrams with narration, or diagrams with written text), control of pacing (pacing was controlled by either the learner or the system), and control of sequence (sequence was controlled by either the learner or the system). A pre-test, post-test, demographic questionnaire, subjective measure of mental effort, and the Abbreviated Math Anxiety Survey were also administered. This study revealed that mathematics anxiety was significantly positively correlated with mental effort, and significantly negatively correlated with engagement, pre-test, and post-test scores. Additionally, a modality × pacing interaction was observed for both high prior knowledge and low mathematics-anxious students. Under system pacing, the modality effect was observed, and these students achieved higher far-transfer scores when learning from the diagrams-and-narration condition. However, under learner pacing, the modality effect reversed, and high prior knowledge and low mathematics-anxious students performed better on far-transfer scores when learning from the diagrams-and-written-text condition.
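The modality × pacing interaction reported above is, at its core, a difference of differences in far-transfer scores. The sketch below uses invented scores (not the thesis data) and pandas only for the bookkeeping, to show how that contrast is computed; a full analysis would rely on inferential statistics rather than raw cell means.

```python
import pandas as pd

# Hypothetical far-transfer scores illustrating how a modality x pacing interaction
# is summarized: the narration advantage is computed separately per pacing condition.
df = pd.DataFrame({
    "pacing":   ["system"] * 4 + ["learner"] * 4,
    "modality": ["narration", "narration", "text", "text"] * 2,
    "score":    [8, 7, 5, 6,   5, 6, 8, 7],
})

cell_means = df.groupby(["pacing", "modality"])["score"].mean().unstack()
narration_advantage = cell_means["narration"] - cell_means["text"]     # one value per pacing condition
interaction = narration_advantage["system"] - narration_advantage["learner"]  # difference of differences

print(cell_means)
print("Narration advantage by pacing:\n", narration_advantage)
print("Interaction contrast:", interaction)
```

A positive advantage under system pacing and a negative one under learner pacing is the crossover pattern the abstract describes.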
- The Effects of Stimulus Modality and Task Integrality: Predicting Dual-Task Performance and Workload from Single-Task Levels (Sandra G. Hart, Robert J. Shively, Michael A. Vidulich, and Ronald C. Miller)
NASA Ames Research Center, Moffett Field, CA; Informatics General Corporation, Palo Alto, CA. Abstract: The influence of stimulus modality and task difficulty on workload and performance was investigated in the current study. The goal was to quantify the "cost" (in terms of response time and experienced workload) incurred when essentially serial task components shared common elements (e.g., the response to one initiated the other) which could be accomplished in parallel. The experimental tasks were based on the "Fittsberg" paradigm; the solution to a Sternberg-type memory task determines which of two identical Fitts targets is acquired. Previous research suggested that such functionally integrated "dual" tasks are performed with substantially less workload and faster response times than would be predicted by summing single-task components when both are presented in the same stimulus modality (visual). In the current study, the physical integration of task elements was varied (although their functional relationship remained the same) to determine whether dual-task facilitation would persist if task components were presented in different sensory modalities. Again, it was found that the cost of performing the two-stage task was considerably less than the sum of component single-task levels when both were presented visually. Less facilitation was found when task elements were presented in different sensory modalities. These results suggest the importance of distinguishing between concurrent tasks that compete for limited resources and those that beneficially share common resources when selecting the stimulus modalities for information displays.
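The "cost" computation described in this abstract amounts to comparing an observed dual-task response time against the sum of its single-task components. The numbers below are invented, not those reported by Hart et al.; the sketch only shows how the predicted serial time and the resulting facilitation would be derived.

```python
# Hypothetical reaction times (ms) illustrating the dual-task cost comparison.
rt_sternberg_single = 450.0   # memory-classification component performed alone
rt_fitts_single = 600.0       # target-acquisition component performed alone
rt_dual_observed = 820.0      # functionally integrated "Fittsberg" trial

rt_dual_predicted = rt_sternberg_single + rt_fitts_single   # strictly serial, no shared elements
facilitation = rt_dual_predicted - rt_dual_observed          # time saved by sharing common elements

print(f"predicted {rt_dual_predicted:.0f} ms, observed {rt_dual_observed:.0f} ms, "
      f"facilitation {facilitation:.0f} ms ({facilitation / rt_dual_predicted:.0%})")
```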
- Self-Prioritization Beyond Perception (Sarah Schäfer, Dirk Wentura, and Christian Frings; University of Trier and Saarland University)
Running head: Self-Prioritization in Perception. Abstract: Recently, Sui and colleagues (2012) introduced a new paradigm to measure perceptual self-prioritization processes. It seems that arbitrarily tagging shapes to self-relevant words (I, my, me, and so on) leads to speeded verification times when matching self-relevant word-shape pairings (e.g., me – triangle) as compared to non-self-relevant word-shape pairings (e.g., stranger – circle). In order to analyze the level at which self-prioritization takes place, we analyzed whether the self-prioritization effect is due to a tagging of the self-relevant label and the particular associated shape or due to a tagging of the self with an abstract concept. In two experiments, participants showed standard self-prioritization effects with varying stimulus features or different exemplars of a particular stimulus category, suggesting that self-prioritization also works at a conceptual level. Keywords: self-prioritization, feature variance, conceptual processing. A function of self-relevance is that it guides our attention and behavior, as self-relevant stimuli indicate important sensory inputs for our cognitive system. Accordingly, modulations of attentional processing due to self-relevance have been shown in different paradigms of selective attention (e.g., Alexopoulos, Muller, Ric, & Marendaz, 2012; Bargh, 1982; Giesbrecht, Sy, & Lewis, 2009; Shapiro, Caldwell, & Sorensen, 1997).
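The self-prioritization effect described above is typically scored as the verification-time advantage for self-associated matching pairs over other-associated matching pairs. The snippet below uses invented reaction times purely to show that computation; it is not the authors' analysis pipeline.

```python
import numpy as np

# Hypothetical verification RTs (ms) for matching trials in a Sui et al. (2012)-style task.
rt_match_self = np.array([620, 605, 640, 598, 615])       # "me - triangle" style trials
rt_match_stranger = np.array([705, 690, 720, 698, 710])   # "stranger - circle" style trials

self_prioritization_effect = rt_match_stranger.mean() - rt_match_self.mean()
print(f"Self-prioritization effect: {self_prioritization_effect:.0f} ms faster for self-associated pairs")
```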