Virtual Display of Tactile Graphics and Braille by Lateral Skin Deformation


Virtual Display of Tactile Graphics and Braille by Lateral Skin Deformation

Vincent Lévesque
Department of Electrical & Computer Engineering
McGill University
Montreal, Canada
August 2009

A thesis submitted to McGill University in partial fulfillment of the requirements for the degree of Doctor of Philosophy. © 2009 Vincent Lévesque

Abstract

Graphical content is increasingly pervasive in digital interfaces and documents, yet it remains accessible to visually impaired persons almost exclusively on media with limited flexibility such as embossed paper. Textual content is more accessible but nevertheless limited by the cost and functionality of refreshable Braille displays and voice synthesis. This thesis explores the use of a novel tactile stimulation approach that relies on lateral skin deformation for the computerized display of virtual Braille and tactile graphics.

Tactile synthesis by lateral skin deformation is initially explored in the context of Braille. The feasibility of producing virtual Braille by laterotactile stimulation is first demonstrated by creating the illusion of brushing against a line of Braille dots through the synchronization of a travelling wave of skin deformation with the displacement of a tactile array of eight actuators. This principle is then extended to complete 6-dot Braille cells by distributing lines of virtual dots onto the rows of actuators of a general-purpose STReSS2 tactile array. Reading the resulting virtual Braille is shown to generally be feasible but demanding, suggesting that a specialized laterotactile Braille display should be devised or that dots be rendered for contrast rather than realism.

Tactile rendering by lateral skin deformation is then further explored with the gradual development of a virtual tactile graphics framework that emulates conventional features such as raised lines and areal textures through a coherent set of patterns that includes grating textures, stroked and dotted shapes, bitmap-based masks, and composite patterns. Dynamic rendering is also exploited to produce novel effects such as tactile flow, reactive textures dependent on the exploration behaviour, and interactive content with alternate views. The usability of the framework is informally evaluated with visually impaired volunteers, and early tactile patterns are studied through formal experiments. The tactile patterns are presented on the Tactograph, a haptic interface redesigned specifically for the display of tactile graphics that combines a STReSS2 display with an instrumented planar carrier.

This thesis demonstrates the potential of lateral skin deformation for the display of Braille and tactile graphics, and explores in the process the ways in which this novel approach to tactile stimulation can be applied to produce meaningful tactile sensations.
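The travelling-wave rendering described in the abstract can be illustrated with a small numerical sketch. This is not code from the thesis; the deflection profile, actuator pitch, and parameter values below are assumptions chosen only to show how per-actuator commands can stay registered with virtual dots as the array moves.

```python
import numpy as np

# Illustrative sketch (not the thesis implementation): as an 8-actuator array
# slides over a line of virtual Braille dots, each actuator is driven by the
# local lateral-deformation profile of whatever dot lies under it, so the
# stimulation pattern travels across the skin in sync with the hand motion.

ACTUATOR_PITCH_MM = 1.2   # assumed spacing between adjacent actuators
N_ACTUATORS = 8

def actuator_commands(display_x_mm, dot_centers_mm, dot_width_mm=1.0):
    """Return normalized deflection commands for each actuator."""
    positions = display_x_mm + np.arange(N_ACTUATORS) * ACTUATOR_PITCH_MM
    commands = np.zeros(N_ACTUATORS)
    for center in dot_centers_mm:
        d = (positions - center) / dot_width_mm
        # derivative-of-Gaussian profile: skin is stretched on one side of the
        # virtual dot and compressed on the other, mimicking a raised bump
        commands += -d * np.exp(-0.5 * d ** 2)
    return np.clip(commands, -1.0, 1.0)

# Example: virtual Braille dots every 2.5 mm, display currently at x = 4.0 mm
print(actuator_commands(4.0, dot_centers_mm=[0.0, 2.5, 5.0, 7.5]))
```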
Sommaire

Graphical content is increasingly present in digital interfaces and documents, yet it is accessible to blind persons almost exclusively on media with limited flexibility such as embossed paper. Text is more accessible but nevertheless limited by the cost and functionality of Braille displays and voice synthesis. This thesis explores the use of a novel tactile stimulation approach operating by lateral deformation of the skin for the computerized display of virtual Braille and tactile graphics.

Tactile synthesis by lateral skin deformation is first explored in the context of Braille. The feasibility of producing virtual Braille by laterotactile stimulation is first demonstrated by creating an illusion of rubbing against a line of dots through the synchronization of a travelling wave of skin deformation with the displacement of an 8-actuator tactile array. This principle is then extended to 6-dot Braille by distributing lines of virtual dots over the rows of actuators of a STReSS2 tactile display. Reading this virtual Braille is generally feasible but demanding, suggesting that a laterotactile Braille display be designed or that dot rendering aim for contrast rather than realism.

Tactile rendering by lateral skin deformation is then explored further with the gradual development of a virtual tactile graphics system that emulates conventional elements such as raised lines and textured surfaces through a coherent set of patterns including grating textures, stroked or dotted shapes, image-based masks, and composite patterns. Dynamic rendering is then exploited to produce new effects such as tactile flow, reactive textures responding to exploration behaviour, and interactive content with alternate views. The usability of the system is evaluated informally with blind volunteers, and the first tactile patterns are studied through formal experiments. The tactile patterns are presented on the Tactograph, a haptic interface redesigned specifically for the display of tactile graphics that combines a STReSS2 display with an instrumented planar carriage.

This thesis demonstrates the potential of lateral skin deformation for the display of Braille and tactile graphics, and in doing so explores how this approach to tactile stimulation can be applied to the production of meaningful tactile sensations.

Acknowledgments

Foremost, I would like to thank my supervisor, Prof. Vincent Hayward, without whom this work would not have been possible. His support, encouragement and creativity have made this challenging experience both pleasant and rewarding. This thesis also rests on an innovative technology that was developed under his supervision by Christine Desmarais, Juan Manuel Cruz-Hernandez, Jérôme Pasquero, and Qi Wang. I am indebted to them all for their dedication and hard work, and wish to particularly thank Qi Wang for his help and support with the latest prototype.

I have had the chance to work alongside many inspiring and helpful colleagues during my years in the Haptics Laboratory. Jérôme Pasquero has been a valuable collaborator and friend, and has influenced this work through innumerable discussions. Andrew Gosline patiently shared his mechanical engineering skills and helped me with the design of the Tactograph. Mounia Ziat's expertise was invaluable for the supplementary statistical analysis presented in Chapters 4 and 6. I would also like to thank other past and present members of the Haptics Lab and Center for Intelligent Machines for their support and friendship (in alphabetical order): Omar Ayoub, Gianni Campion, Hanifa Dostmohamed, Diana Garroway, Prasun Lala, Akihiro Sato, Qi Wang, and Hsin-Yun Yao. I'm especially grateful to Sandra Skaff, with whom I've shared much of this experience and who has proved a supportive and valuable friend.
I would also like to thank Jean-Samuel Chenard for sharing his electronics expertise and saving in extremis a demonstration of the tactile graphics system, and Don Pavlasek of the Mechanical Workshop (ECE dept.) for his advice and meticulous work. This work also benefited from a collaboration with Gregory Petit (UdeM), Prof. Aude Dufresne (UdeM), Nicole Trudeau and Pierre Ferland (INLB), with whom I've had numerous interesting discussions about tactile graphics. I would also like to thank Maryse Legault (HumanWare), Chantal Nicole (INLB) and Marie-Chantal Wanet-Defalque (INLB) for their assistance. VisuAide, now HumanWare Canada, also provided the initial impetus for this research and donated experimental hardware used for the early work on Braille.

Last but not least, I wish to thank my parents, Suzanne and Viateur Lévesque, my sister, Danny Lévesque, as well as my family and friends for their support and encouragement throughout these years.

Contents

1 Introduction
2 Assistive Technologies for Visually Impaired Persons
  2.1 Introduction
  2.2 Haptics
  2.3 Blindness and Visual Impairment
    2.3.1 Definition, Prevalence and Causes
    2.3.2 Neurology and Psychology
  2.4 Technological Aids
    2.4.1 Orientation and Mobility
    2.4.2 Reading and Writing
    2.4.3 Graphics
    2.4.4 Human-Computer Interaction
    2.4.5 Vision Substitution
    2.4.6 Sight Recovery
  2.5 Discussion
    2.5.1 Assistive Technologies Market
    2.5.2 Recommendations
    2.5.3 Trends and Future Developments
    2.5.4 Applications of Haptics
  2.6 Conclusion
3 Display of Virtual Braille Dots by Lateral Skin Deformation: Feasibility Study
  3.1 Introduction
    3.1.1 Braille Displays
    3.1.2 Alternative Technologies
    3.1.3 Overview
  3.2 Virtual Braille Display
    3.2.1 Device
    3.2.2 Skin Deformation Patterns
  3.3 Parameter Tuning
    3.3.1 Method
    3.3.2 Procedure
    3.3.3 Results
    3.3.4 Discussion
  3.4 Virtual Braille Legibility
    3.4.1 Method
    3.4.2 Results and Discussion
  3.5 Control Experiment
    3.5.1 Method
    3.5.2 Results and Discussion
  3.6 Conclusion and Future Work
4 Braille Display by Lateral Skin Deformation with the STReSS2 Tactile Transducer
  4.1 Introduction
  4.2 Virtual Braille Rendering
    4.2.1 Hardware
    4.2.2 Skin Deformation Patterns
  4.3 Virtual Braille Legibility
    4.3.1 Method
    4.3.2 Results
  4.4 Discussion
  4.5 Conclusion
5 A Haptic Memory Game using the STReSS2 Tactile Display
  5.1 Introduction
  5.2 Technology
  5.3 Tactile Rendering
  5.4 Haptic Memory Game
  5.5 Conclusion
6 Tactile Graphics Rendering Using Three Laterotactile Drawing Primitives
  6.1 Introduction
  6.2 Tactile Display Prototype
  6.3 Tactile Rendering
    6.3.1 Dot Rendering
    6.3.2 Vibration Rendering
    6.3.3 Grating Rendering
    6.3.4 Composite Rendering
  6.4 Experiments
    6.4.1 Shape
    6.4.2 Grating Spatial Frequency
Recommended publications
  • Evaluating the Effect of Four Different Pointing Device Designs on Upper Extremity Posture and Muscle Activity During Mousing Tasks
    Applied Ergonomics 47 (2015) 259-264. Michael Y.C. Lin (Department of Environmental Health, Harvard School of Public Health, Boston, MA), Justin G. Young (Department of Industrial & Manufacturing Engineering, Kettering University, Flint, MI), Jack T. Dennerlein (Harvard School of Public Health; Department of Physical Therapy, Movements, and Rehabilitation Sciences, Bouve College of Health Sciences, Northeastern University, Boston, MA). Received 10 January 2014; accepted 3 October 2014.
    The goal of this study was to evaluate the effect of different types of computer pointing devices and placements on posture and muscle activity of the hand and arm. A repeated measures laboratory study with 12 adults (6 females, 6 males) was conducted. Participants completed two mouse-intensive tasks while using a conventional mouse, a trackball, a stand-alone touchpad, and a rollermouse. A motion analysis system and an electromyography system monitored right upper extremity postures and muscle activity, respectively. The rollermouse condition was associated with a more neutral hand posture (lower inter-fingertip spread and greater finger flexion) along with significantly lower forearm extensor muscle activity. The touchpad and rollermouse, which were centrally located, were associated with significantly more neutral shoulder postures, reduced ulnar deviation, and lower forearm extensor muscle activities than the other types of pointing devices.
    Keywords: pointing device, computer tasks, musculoskeletal disorders
  • Track Ball → It Is Pointing Device That Is Used to Control the Positions Of
    Department Of Computer Application (BBA), Dr. Rakesh Ranjan, BCA Sem 2: Input devices.
    Track ball: a pointing device used to control the position of the cursor on the screen. It is usually found on notebook computers, where it is placed near the keyboard. It is essentially an upside-down mouse in which the ball rotates in place within a socket. The user rolls the ball to position the cursor at the appropriate location on the screen and then clicks one of the buttons near the track ball either to select objects or to position the cursor. The working of a track ball is identical to that of a mouse.
    Touch pad: a small, flat, rectangular, stationary pointing device with a sensitive surface of 1.5 to 2 inches. The user slides his or her fingertips across the surface of the pad to point to specific objects on the screen. The surface translates the motion and position of the user's fingers to a relative position on the screen. Touch pads are widely used in laptops and other handheld devices. The working of the touchpad is similar to that of a mouse or a trackball: the pressure of the finger on the surface produces a capacitance effect, which is detected by the sensors; the sensors send appropriate signals to the CPU, which interprets them and displays the pointer on the screen.
    Joy stick: widely used in computer games and computer-aided design and manufacturing (CAD/CAM) applications. It has one or more push buttons called switches.
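To make the relative mapping used by touchpads and trackballs concrete, here is a minimal sketch. It is not taken from the source notes; the class name, gain value, and screen size are assumptions for illustration only.

```python
# Minimal sketch of relative (delta-based) pointing, as used by touchpads,
# trackballs and mice: successive sensor readings are turned into cursor
# displacements rather than absolute screen positions.

class RelativePointer:
    def __init__(self, screen_w=800, screen_h=600, gain=2.0):
        self.x, self.y = screen_w // 2, screen_h // 2
        self.w, self.h = screen_w, screen_h
        self.gain = gain            # assumed constant control-display gain
        self.last = None            # last finger/ball position, if any

    def update(self, sensor_x, sensor_y):
        """Feed one sensor sample; returns the new cursor position."""
        if self.last is not None:   # ignore the first sample after a lift-off
            dx = (sensor_x - self.last[0]) * self.gain
            dy = (sensor_y - self.last[1]) * self.gain
            self.x = min(max(self.x + dx, 0), self.w - 1)
            self.y = min(max(self.y + dy, 0), self.h - 1)
        self.last = (sensor_x, sensor_y)
        return self.x, self.y

    def lift(self):
        """Finger lifted (clutching): the next touch does not move the cursor."""
        self.last = None

pointer = RelativePointer()
print(pointer.update(10, 10))   # first touch: no motion
print(pointer.update(15, 12))   # moves cursor by (10, 4)
```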
  • Assessment of Options for Handling Full Unicode Character Encodings in MARC21 a Study for the Library of Congress
    Assessment of Options for Handling Full Unicode Character Encodings in MARC21: A Study for the Library of Congress. Part 1: New Scripts. Jack Cain, Senior Consultant, Trylus Computing, Toronto.
    1 Purpose. This assessment intends to study the issues and make recommendations on the possible expansion of the character set repertoire for bibliographic records in MARC21 format.
    1.1 "Encoding Scheme" vs. "Repertoire". An encoding scheme contains codes by which characters are represented in computer memory. These codes are organized according to a certain methodology called an encoding scheme. The list of all characters so encoded is referred to as the "repertoire" of characters in the given encoding scheme. For example, ASCII is one encoding scheme, perhaps the one best known to the average non-technical person in North America. "A", "B", and "C" are three characters in the repertoire of this encoding scheme. These three characters are assigned encodings 41, 42 and 43 in ASCII (expressed here in hexadecimal).
    1.2 MARC8. "MARC8" is the term commonly used to refer both to the encoding scheme and its repertoire as used in MARC records up to 1998. The '8' refers to the fact that, unlike Unicode, which is a multi-byte per character code set, the MARC8 encoding scheme is principally made up of multiple one-byte tables in which each character is encoded using a single 8-bit byte. (It also includes the EACC set, which actually uses a fixed length of 3 bytes per character.) (For details on MARC8 and its specifications see: http://www.loc.gov/marc/.) MARC8 was introduced around 1968 and was initially limited to essentially Latin script only.
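The single-byte versus multi-byte distinction drawn above can be seen directly in code. The following sketch is illustrative only and is not part of the study; it simply prints the ASCII code points mentioned and the UTF-8 byte sequences for characters outside the single-byte range.

```python
# ASCII: one byte per character; "A", "B", "C" are 0x41, 0x42, 0x43.
for ch in "ABC":
    print(ch, hex(ord(ch)))            # A 0x41, B 0x42, C 0x43

# A Unicode transformation format may need several bytes per character.
for ch in ["A", "é", "語"]:
    encoded = ch.encode("utf-8")
    print(ch, [hex(b) for b in encoded], f"({len(encoded)} byte(s))")
```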
  • An Isometric Joystick As a Pointing Device for Handheld Information Terminals
    An Isometric Joystick as a Pointing Device for Handheld Information Terminals. Miika Silfverberg (Usability Group, Nokia Research Center, Finland), I. Scott MacKenzie (Department of Computer Science, York University, Canada), Tatu Kauppinen (Usability Group, Nokia Research Center, Finland).
    Abstract. Meeting the increasing demand for desktop-like applications on mobile products requires powerful interaction techniques. One candidate is GUI-style point-and-click interaction using an integrated pointing device that supports handheld use. We tested an isometric joystick for this purpose. Two prototypes were built. They were designed for thumb operation and included a separate selection button. Twelve participants performed point-and-select tasks. We tested both one-handed and two-handed interaction, and selection using the separate selection button and the joystick's integrated press-to-select feature. A notebook configuration served as a reference. Results for the handheld conditions, both one-handed and two-handed, were just slightly off those for the notebook condition, suggesting that an isometric [...]
    [...] embedded pointing device that is suitable for handheld use. This work studies the applicability of the isometric joystick to handheld usage.
    1.1 Isometric Joystick. A joystick is a good candidate for handheld pointing. Since it is mounted in the device chassis, it cannot be lost, unlike a stylus. It is small and can potentially be manipulated with the same hand that holds the device. The device studied herein is an isometric joystick. The pointer is moved by applying force to the stick. The stick itself doesn't move, or moves very little, hence the name "isometric". The most common input-output mapping is known as "velocity-control", whereby the applied force controls the velocity of the pointer.
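The "velocity-control" mapping described above can be sketched in a few lines. This is only an illustration of the idea, not code from the paper; the gain, dead zone, exponent, and update rate are assumed values.

```python
# Illustrative rate-control (velocity-control) mapping for an isometric stick:
# the applied force sets the pointer's velocity, so holding a constant force
# moves the cursor at a constant speed.

DEAD_ZONE = 0.05     # assumed: forces below this fraction of max are ignored
GAIN = 600.0         # assumed: pixels per second at full force
EXPONENT = 2.0       # assumed: non-linear transfer for fine control near zero

def pointer_velocity(force_x, force_y):
    """force_x, force_y in [-1, 1] (fraction of maximum force) -> px/s."""
    def axis(f):
        if abs(f) < DEAD_ZONE:
            return 0.0
        sign = 1.0 if f > 0 else -1.0
        return sign * GAIN * (abs(f) ** EXPONENT)
    return axis(force_x), axis(force_y)

# Integrate the velocity over time to move the cursor:
x, y, dt = 400.0, 300.0, 1 / 125          # assumed 125 Hz update rate
vx, vy = pointer_velocity(0.4, -0.1)
x, y = x + vx * dt, y + vy * dt
print(round(x, 1), round(y, 1))
```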
  • Haptiread: Reading Braille As Mid-Air Haptic Information
    HaptiRead: Reading Braille as Mid-Air Haptic Information. Viktorija Paneva, Sofia Seinfeld, Michael Kraiczi, Jörg Müller (University of Bayreuth, Germany). {viktorija.paneva, sofia.seinfeld, michael.kraiczi, joerg.mueller}@uni-bayreuth.de
    Figure 1. With HaptiRead we evaluate for the first time the possibility of presenting Braille information as touchless haptic stimulation using ultrasonic mid-air haptic technology. We present three different methods of generating the haptic stimulation: Constant, Point-by-Point and Row-by-Row. (a) depicts the standard ordering of cells in a Braille character, and (b) shows how the character in (a) is displayed by the three proposed methods. HaptiRead delivers the information directly to the user, through their palm, in an unobtrusive manner. Thus the haptic display is particularly suitable for messages communicated in public, e.g. reading the departure time of the next bus at the bus stop (c).
    ABSTRACT. Mid-air haptic interfaces have several advantages: the haptic information is delivered directly to the user, in a manner that is unobtrusive to the immediate environment. They operate at a distance, and are thus easier to discover; they are more hygienic and allow interaction in 3D. We validate, for the first time, in a preliminary study with sighted and a user study with blind participants, the use of mid-air haptics for conveying Braille. We tested three haptic stimulation methods, where the haptic feedback was either a) aligned temporally, with haptic [...]
    Author Keywords: Mid-air Haptics, Ultrasound, Haptic Feedback, Public Displays, Braille, Reading by Blind People.
    INTRODUCTION. There are several challenges that blind people face when engaging with interactive systems in public spaces. Firstly, it is more difficult for the blind to maintain their personal privacy when engaging with public displays.
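As an illustration of the three presentation orders named in the caption (Constant, Point-by-Point, Row-by-Row), the sketch below lays out a standard 6-dot Braille cell and lists which dots would be driven in each frame. It is a schematic reconstruction, not the HaptiRead implementation; the dot coordinates and the frame grouping are assumptions.

```python
# Schematic sketch of three presentation orders for a 6-dot Braille cell.
# Dots are numbered in standard Braille order: 1-2-3 down the left column,
# 4-5-6 down the right column. Coordinates (mm) are illustrative only.

DOT_POSITIONS = {          # dot number -> (x, y) in the palm plane
    1: (0.0, 0.0), 2: (0.0, 2.5), 3: (0.0, 5.0),
    4: (2.5, 0.0), 5: (2.5, 2.5), 6: (2.5, 5.0),
}

def frames(active_dots, method):
    """Return the sequence of dot sets presented over time for one character."""
    dots = sorted(active_dots)
    if method == "constant":          # all dots of the character at once
        return [set(dots)]
    if method == "point_by_point":    # one dot per frame, in Braille order
        return [{d} for d in dots]
    if method == "row_by_row":        # dots grouped by row (1-4, 2-5, 3-6)
        rows = [{1, 4}, {2, 5}, {3, 6}]
        return [set(dots) & row for row in rows if set(dots) & row]
    raise ValueError(method)

# Example: the letter "d" uses dots 1, 4 and 5.
for method in ("constant", "point_by_point", "row_by_row"):
    print(method, frames({1, 4, 5}, method))
```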
  • ISO/IEC JTC1/SC2/WG2 N 2005 Date: 1999-05-29
    ISO, INTERNATIONAL ORGANIZATION FOR STANDARDIZATION / ORGANISATION INTERNATIONALE DE NORMALISATION. ISO/IEC JTC1/SC2/WG2, Universal Multiple-Octet Coded Character Set (UCS). ISO/IEC JTC1/SC2/WG2 N 2005. Date: 1999-05-29.
    TITLE: ISO/IEC 10646-1 Second Edition text, Draft 2. SOURCE: Bruce Paterson, project editor. STATUS: Working paper of JTC1/SC2/WG2. ACTION: For review and comment by WG2. DISTRIBUTION: Members of JTC1/SC2/WG2.
    1. Scope. This paper provides a second draft of the text sections of the Second Edition of ISO/IEC 10646-1. It replaces the previous paper WG2 N 1796 (1998-06-01). This draft text includes:
    - Clauses 1 to 27 (replacing the previous clauses 1 to 26),
    - Annexes A to R (replacing the previous Annexes A to T),
    and is attached here as "Draft 2 for ISO/IEC 10646-1 : 1999" (pages ii & 1 to 77). Published and Draft Amendments up to Amd. 31 (Tibetan extended), Technical Corrigenda nos. 1, 2, and 3, and editorial corrigenda approved by WG2 up to 1999-03-15 have been applied to the text. The draft does not include:
    - character glyph tables and name tables (these will be provided in a separate WG2 document from AFII),
    - the alphabetically sorted list of character names in Annex E (now Annex G),
    - markings to show the differences from the previous draft.
    A separate WG2 paper will give the editorial corrigenda applied to this text since N 1796. The editorial corrigenda are as agreed at WG2 meetings #34 to #36. Editorial corrigenda applicable to the character glyph tables and name tables, as listed in N 1796 pages 2 to 5, have already been applied to the draft character tables prepared by AFII.
  • Pointing Devices, Input-Output Mappings, CD Gain, Mid-Air Interaction, Problems of Direct Input and Solutions Input Devices Vs
    Input: pointing devices, input-output mappings, CD gain, mid-air interaction, problems of direct input and solutions.
    Input devices vs. finger-based input. Indirect vs. direct pointing: with indirect pointing, the position of the cursor is controlled by the device; with direct pointing, the fingers manipulate visual objects directly on the screen. Absolute vs. relative pointing: absolute pointing is a 1-to-1 mapping between input and output space; relative pointing means the input controls the relative position of the cursor (and is always indirect).
    Hovering mode: tracking the position of the pointing device (e.g., the pen) or the finger from a distance. Hover widgets: http://www.youtube.com/watch?v=KRXfaZ8nqZM
    Absolute pointing. With direct input, hovering feedback is not indispensable since there is a clear mapping between pen/fingers and the screen; the main drawback is occlusion. With indirect input (e.g., a Wacom Cintiq versus a regular graphics tablet), "hovering" is indispensable: users must know the position of the cursor before they start drawing.
    Relative pointing. Common devices: mouse and touchpad. "Clutching" replaces the "hovering" mode: lifting the mouse or finger "re-calibrates" the movement, allowing a smaller input space to traverse a larger output space. How would you map the input space of the tablet to the output space of the wall? Smarties: https://www.lri.fr/~chapuis/publications/CHI14-smartiestk.mp4
    Buxton's 3-state model (1990): A. two-state model for a mouse; B. two-state model for a touch tablet; C. three-state model for a graphics tablet with stylus.
    Relative pointing mappings. Position control maps human input to the position of the cursor (or object of interest); examples: mouse, touchpad. Rate (or velocity) control maps human input to the velocity of the cursor (or object of interest); examples: joystick, trackpoint. Isotonic vs. [...]
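The notes mention CD (control-display) gain without defining it numerically; the sketch below shows one common way to think about it, a velocity-dependent gain ("pointer acceleration") applied to relative device motion. The gain curve and constants are illustrative assumptions, not values from the notes.

```python
# Illustrative control-display (CD) gain: cursor displacement = gain * device
# displacement, where the gain grows with device speed so that slow motions
# give precision and fast motions cover a large display.

G_MIN, G_MAX = 1.0, 8.0     # assumed gain bounds
V_REF = 0.2                 # assumed device speed (m/s) at which gain saturates

def cd_gain(device_speed_m_s):
    t = min(device_speed_m_s / V_REF, 1.0)
    return G_MIN + (G_MAX - G_MIN) * t

def cursor_delta(dx_m, dy_m, dt_s, pixels_per_m=40000.0):
    """Map a device displacement (metres over dt_s) to a cursor delta (pixels)."""
    speed = (dx_m ** 2 + dy_m ** 2) ** 0.5 / dt_s
    g = cd_gain(speed)
    return g * dx_m * pixels_per_m, g * dy_m * pixels_per_m

# The same 1 mm motion performed slowly vs. quickly:
print(cursor_delta(0.001, 0.0, 0.05))   # slow: low gain, small cursor jump
print(cursor_delta(0.001, 0.0, 0.004))  # fast: high gain, large cursor jump
```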
  • A Computer Vision-Based Pointing and Gesture Input Device
    FlowMouse: A Computer Vision-Based Pointing and Gesture Input Device. Andrew D. Wilson and Edward Cutrell, Microsoft Research, One Microsoft Way, Redmond, WA. [email protected], [email protected]
    Abstract. We introduce FlowMouse, a computer vision-based pointing device and gesture input system. FlowMouse uses optical flow techniques to model the motion of the hand and a capacitive touch sensor to enable and disable interaction. By using optical flow rather than a more traditional tracking-based method, FlowMouse is exceptionally robust, simple in design, and offers opportunities for fluid gesture-based interaction that go well beyond merely emulating pointing devices such as the mouse. We present a Fitts law study examining pointing performance, and discuss applications of the optical flow field for gesture input.
    1 Introduction. Today's computing environments are strongly tied to the availability of a high resolution pointing device, and, more fundamentally, to the notion of a single, discrete two-dimensional cursor. Modern GUIs (graphical user interfaces) combined with devices such as mice and track pads are extremely effective at reducing the richness and variety of human communication down to a single point. While the utility of such devices in today's interfaces cannot be denied, there are opportunities to apply other kinds of sensors to enrich the user experience. For example, video cameras and computer vision techniques may be used to capture many details of human shape and movement [24]. The shape of the hand may be analyzed over time to manipulate an onscreen object in a way analogous to the hand's manipulation of paper on a desk.
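To give a concrete sense of how a flow field can drive a pointer, here is a minimal sketch using OpenCV's dense optical flow. It is not the FlowMouse implementation; the gain, camera index, and the choice of Farnebäck flow are assumptions for illustration, and the touch-sensor gating and gesture analysis are not reproduced.

```python
import cv2
import numpy as np

# Minimal sketch: estimate dense optical flow between consecutive camera
# frames and use the average flow vector as a relative pointer displacement.

GAIN = 4.0                      # assumed pixels of cursor motion per pixel of flow
cap = cv2.VideoCapture(0)       # assumed default camera
ok, frame = cap.read()
prev_gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
cursor = np.array([400.0, 300.0])

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    mean_flow = flow.reshape(-1, 2).mean(axis=0)   # average (dx, dy) over the image
    cursor += GAIN * mean_flow
    cursor = np.clip(cursor, [0, 0], [799, 599])   # keep the cursor on screen
    print("cursor:", cursor.round(1))
    prev_gray = gray
```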
  • People with Disabilities and Computer Technology Providing Access to Technology by Sheryl Burgstahler, Ph.D
    Working Together: People with Disabilities and Computer Technology. Providing access to technology. By Sheryl Burgstahler, Ph.D.
    People with disabilities meet barriers of all types. However, technology is helping to lower many of these barriers. By using computing technology for tasks such as reading and writing documents, communicating with others, and searching for information on the Internet, students and employees with disabilities are capable of handling a wider range of activities independently. Still, people with disabilities face a variety of barriers to computer use. These barriers can be grouped into three functional categories: barriers to providing computer input, interpreting output, and reading supporting documentation. Hardware and software tools (known as adaptive or assistive technologies) have been developed to provide functional alternatives to these standard operations. Specific products, and approaches to using them, are described below.
    [...] that normally require two or more keys to be pressed simultaneously. The key repeat function can be disabled for those who cannot release a key quickly enough to avoid multiple selections. Keyboard guards (solid templates with holes over each key to assist precise selection) can be used by those with limited fine motor control. Sometimes repositioning the keyboard and monitor can enhance accessibility. For example, mounting keyboards perpendicular to tables or wheelchair trays at head-height can assist individuals with limited mobility who use pointing devices to press keys. Other simple hardware modifications can assist individuals with mobility impairments. For instance, disk guides can assist with inserting and removing diskettes; a dedicated hard disk or computer network access can eliminate or reduce the necessity to do so.
  • Chapter 9. Input Devices
    Table of contents
    9 Input devices
      9.1 Keyboards
      9.2 Fixed-function keys
      9.3 Pointing devices
        9.3.1 General
        9.3.2 Mouse
        9.3.3 Joystick and trackball
          9.3.3.1 General
          9.3.3.2 Hand-operated displacement joysticks
          9.3.3.3 Finger-operated displacement joysticks
          9.3.3.4 Thumb tip and fingertip-operated displacement joysticks
          9.3.3.5 Hand-operated isometric joysticks
          9.3.3.6 Thumb tip and fingertip-operated isometric joysticks
          9.3.3.7 Ball controls
  • Braille in Mathematics Education
    Masters Thesis, Information Sciences, Radboud University Nijmegen. Braille in Mathematics Education. Marc Bitter, April 4, 2013. Supervisors (IK183): Prof. Dr. Ir. Theo van der Weide, Dr. Henny van der Meijden.
    Abstract. This research project aimed to make improvements to the way blind learners in the Netherlands use mathematics in an educational context. As part of this research, contextual research in the fields of cognition, braille, education, and mathematics was conducted. In order to compare representations of mathematics in braille, various braille codes were compared according to set criteria. Finally, four Dutch mathematics curricula were compared in terms of the cognitive complexity of the mathematical formulas required for the respective curriculum. For this research, two main research methods were used. A literature study was conducted for contextual information regarding cognitive aspects, historic information on braille, and the education system in the Netherlands. Interviews with experts in the field of mathematics education and braille were held to relate the contextual findings to practical issues, and to understand why certain decisions were made in the past. The main finding in terms of cognitive aspects involves the limitation of the tactile and auditory senses and the impact these limitations have on textual aspects of mathematics. Besides graphical content, the representation of mathematical formulas was found to be extremely difficult for blind learners. There are two main ways to express mathematics in braille: using a dedicated braille code containing braille-specific symbols, or using a linear translation of a pseudo-code into braille. Pseudo-codes allow for reading and producing by sighted users as well as blind users, and are the main approach for providing braille material to blind learners in the Netherlands.
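The idea of a linear, character-by-character transcription can be made concrete with Unicode's Braille Patterns block (U+2800 to U+28FF), where each of dots 1 to 8 sets one bit of the code point. The sketch below converts dot sets to braille characters; the tiny letter table covers only a few literary-braille letters for illustration and is not any of the mathematics codes discussed in the thesis.

```python
# Convert sets of raised dots (numbered 1-8) into Unicode Braille Pattern
# characters: dot n sets bit 2**(n-1) above the block base U+2800.

BRAILLE_BASE = 0x2800

def dots_to_char(dots):
    bits = 0
    for d in dots:
        bits |= 1 << (d - 1)
    return chr(BRAILLE_BASE + bits)

# A few literary-braille letters, given here only to illustrate the mapping.
LETTERS = {"a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5}}

def transcribe(word):
    """Linear, character-by-character transcription of a lowercase word."""
    return "".join(dots_to_char(LETTERS[ch]) for ch in word)

print(dots_to_char({1, 4, 5}))   # the cell with dots 1, 4, 5 (U+2819)
print(transcribe("bead"))
```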
  • Chapter 3 Input Devices
    CSCA0201 Fundamentals of Computing, Chapter 3: Input Devices.
    Topics: input devices, examples of input devices, keyboard, pointing devices, graphic and video input devices, audio input devices.
    Input devices: any peripheral (piece of computer hardware equipment) used to provide data and control signals to a computer. An input device allows the user to put data into the computer; without any input devices, a computer would only be a display device and would not allow users to interact with it.
    Examples of input devices: keyboard, mouse, touchscreen, graphic tablet, microphone, scanner.
    Keyboard: one of the primary input devices used with a computer. The keyboard looks very similar to the keyboards of electric typewriters, with some additional keys. Keyboards allow a computer user to input letters, numbers, and other symbols into a computer. A keyboard uses an arrangement of buttons or keys, and may require pressing and holding several keys simultaneously or in sequence.
    Types of keyboard: standard, laptop, gaming and multimedia, thumb-sized, virtual, foldable.
    Standard: desktop computer keyboards, such as the 101-key US traditional keyboards or the 104-key Windows keyboards, include alphabetic characters, punctuation symbols, numbers and a variety of function keys.
    Laptop keyboard: the laptop computer keyboard is a small version of the typical QWERTY keyboard. A typical laptop has the same keyboard type as a normal keyboard, except that most laptop keyboards condense the symbols into fewer buttons to accommodate less space.