Visuospatial Assistive Device Art: Expanding Human Echolocation Ability

March 2018

Aisen Carolina Chacin

School of Integrative and Global Majors
Ph.D. Program in Empowerment Informatics
University of Tsukuba

Abstract

This study investigates human echolocation through devices for the augmentation of visuospatial skills, replacing vision through acoustic and haptic interfaces. These concepts are tested through two case studies: the Echolocation Headphones and IrukaTact. The Echolocation Headphones are a pair of goggles that disable the participant's vision. They emit a focused sound beam which activates the space with directional acoustic reflection. The directional properties of this parametric sound provide the participant with a focal echo. This directionality is similar to the focal point of vision, giving the participant the ability to selectively gain an audible imprint of their environment. This case study analyzes the effectiveness of this wearable sensory extension for aiding auditory spatial location in three experiments: optimal sound type and distance for object location, perceptual resolution by just-noticeable difference, and goal-directed spatial navigation for open pathway detection. The second case study is the IrukaTact haptic module, a wearable device for underwater tactile stimulation of the fingertips with a jet stream of varying pressure. This haptic module utilizes an impeller pump which suctions surrounding water and propels it onto the volar pads of the finger, providing a hybrid feel of vibration and pressure stimulation. IrukaTact has an echo-haptic utility when paired with a sonar sensor, linearly transposing the sensor's distance data into tactile pressure feedback. This gives users a quasi-parallel sense of touch via a cross-modal transaction of visuospatial information that enhances their ability to gauge distances through touch. IrukaTact has assistive capabilities for vision in aquatic environments where it is difficult to see. Similarly, the Echolocation Headphones have assistive capabilities for participants who may already depend on audition for spatial navigation and object location.

These devices are examples of cross-modal Assistive Device Art (ADA), which expands the visuospatial skills and experience of human echolocation. This kind of art derives from the integration of Assistive Technology (AT) and Art, involving the mediation of sensorimotor functions and perception from both psychophysical methods and the conceptual mechanics of sensory embodiment. These cases are examples of deploying this new framework of Assistive Device Art: tools with an assistive utility to embody new perspectives and gain new skills.

Table of Contents

1. Introduction
   1.1. The Human-Computer Relationship
   1.2. The Plasticity of the Mind: Sensory Substitution
      1.2.1. Towards a Future of Electric Savants
2. Principles of Sensory Perception
   2.1.
   2.2. Visuospatial Perception
      2.2.1. Photon-less Vision and Blind Imagination
      2.2.2. Seeing Space with Sound
3. Assistive Device Art
   3.1. Perception Spectacles
   3.2. Device Art: Playful
   3.3. Cultural Aesthetics: Products of Desire
   3.4. Assistive Device Artworks: The Prosthetic Phenomena
4. Methodology
   4.1. Designing an Assistive Device Art Tool
   4.2. Interdisciplinary Approaches to Tool Design
   4.3. Public Presentation and Dissemination
   4.4. Open-Access Engineering
   4.5. Designing for Sensory Plasticity
      4.5.1. Adopted Assistive Displays
5. Implementation: Case Studies
   5.1. Echolocation Headphones: Seeing with Sound
      5.1.1. Device Design
         5.1.1.1. Mechanism: Technical Aspects
         5.1.1.2. Sound Characteristics
      5.1.2. Experiments
         5.1.2.1. Object Location: Optimal Sound Type and Distance
            5.1.2.1.1. Participants
            5.1.2.1.2. Apparatus
            5.1.2.1.3. Audio Characteristics
            5.1.2.1.4. Procedure
            5.1.2.1.5. Evaluation
         5.1.2.2. Just Noticeable Difference
            5.1.2.2.1. Participants
            5.1.2.2.2. Apparatus
            5.1.2.2.3. Procedure
            5.1.2.2.4. Evaluation
         5.1.2.3. Task Oriented Path Recognition
            5.1.2.3.1. Participants
            5.1.2.3.2. Apparatus
            5.1.2.3.3. Procedure
            5.1.2.3.4. Evaluation
         5.1.2.4. Social Evaluation: Art Exhibition Interactions and Feedback
            5.1.2.4.1. Novice Echolocation Behaviors
            5.1.2.4.2. Participant Feedback
         5.1.2.5. Echolocation Headphones Conclusion
   5.2. IrukaTact: A Parallel Sense of Touch Underwater
      5.2.1. Underwater Haptics Prior Art
      5.2.2. Device Design
         5.2.2.1. Aesthetic Design
         5.2.2.2. Mechanism: Technical Aspects
         5.2.2.3. Prototyping Feel
      5.2.3. Experiments
         5.2.3.1. Force Feedback (L/min)
         5.2.3.2. Absolute Threshold
         5.2.3.3. Haptic Resolution
            5.2.3.3.1. Participants
            5.2.3.3.2. Apparatus
            5.2.3.3.3. Results
         5.2.3.4. Distance Assessment
            5.2.3.4.1. Participants
            5.2.3.4.2. Apparatus
            5.2.3.4.3. Results
      5.2.4. IrukaTact Public Presentations: DemoTanks
         5.2.4.1. DemoTanks Associative Perception
         5.2.4.2. Observations of Participant Interactions During Demonstrations
      5.2.5. Applications
         5.2.5.1. IrukaTact Glove
            5.2.5.1.1. Interface Design
      5.2.6. IrukaTact Conclusion
6. General Discussion
7. Conclusion
Acknowledgements
References
Bibliography
Appendices i, ii, iii

1. Introduction

This thesis introduces a conceptual framework, Assistive Device Art (ADA), by which to define, produce, and evaluate tools that expand human sensorimotor abilities with the aim to facilitate experiential awareness about the diverse perceptual mechanisms that lie on the wide spectrum of human skill and ability. ADA derives from the integration of Assistive Technology (AT) and Art, involving the mediation of sensorimotor functions and perception from both psychophysical methods and the conceptual mechanics of sensory embodiment. ADA seeks to augment the functionality of electronic displays by adopting AT as a guide for designing new sensory interactions in computing interfaces. This concept engages a broader audience as users of AT with the aim to expand their perceptual acuity towards alternative methods of sensing information. AT is a rich reference model for designing innovative interfaces; this field harnesses a fruitful potential for the future of Human-Computer Interaction (HCI). ADA tools adopt AT functionalities and integrate them into desirable products, reframing this technology from aiding disability into adding super-ability, thus embracing the atypical as an inspirational source of diverse sensorimotor architectures. This empathetic approach towards tool design evolves the aesthetic and cultural associations of AT, giving rise to a new user base motivated by curiosity and a willingness to playfully engage in an alternative perceptual experience.

In this research study, we use two ADA sensory substitution devices, which translate vision into audition through the Echolocation Headphones and into somatoception via IrukaTact, to test the visuospatial abilities of participants with no visual impairments. Visuospatial ADA tools increase sensory skill development via substitute perceptual architectures that re-channel a user's attention from one sensory modality to another, in this case, from their visual scope into their audible or tactile reception. This sensory input substitution is possible because visuospatial awareness is a multimodal precept. It is an ability informed by a multitude of senses that include vision, audition, somatoception (touch), and most importantly, proprioception. The latter is the sense of knowing where we are in relation to our environment and ourselves. By devising two case studies that refocus a participant's spatial awareness to audible echolocation via the Echolocation Headphones or to tactile transposition via the IrukaTact searching tools, this research examines specific cross-modal connections within our visuospatial system. These case study interfaces have been designed and tested by applying the ADA tool design framework. This investigation has an interdisciplinary approach that involves both scientific and philosophical points of view, incorporating psychophysical and interaction design principles for tool design and evaluation.

The ADA framework bridges the gap between disciplines, borrowing from an aggregation of viewpoints to reflect on what constitutes skill and ability as it co-evolves with technological prosthetic extensions. This interdisciplinary approach involves theoretical models from HCI, sensation and perception science, and philosophical inquiry, incorporating principles such as psychophysical evaluation, experiential interface design, and artistic conceptualization and presentation. The HCI field has well-established methods that place the user at the center of the system. The ADA framework

dislocates the idea of a user by re-setting the purpose of assistive utility; instead, these tools serve as a means to engage in experiential performance. The user is a performer augmented by and co-adapting with the interface. In the scheme of an ADA tool, the principal motivation is to display a distinct sensory architecture that the participant slowly absorbs with practice and attenuation. This process of tool incorporation enhances their understanding of what constitutes skill and ability. ADA engages participants in technologically aided disability performances that form new constructs physiologically, intellectually, and socially through the means of relational empathy. ADA tool design centers on experience; it seeks to explore the limits of our perceptual abilities by adopting new models of observation. Furthermore, this research investigates how the rising interest in sensory-pliable wearable computing devices could earn a broader and more inclusive consumer demand base for AT. Current tools that provide alternative avenues for perceptual display are meant to fill a need in the sensory-impaired community, thus existing as accessibility devices. These devices provide a way for their users to supplement their abilities while performing daily tasks and correct their physiological disparity. ADA looks at AT as a model from which to derive alternative information displays that further the status quo of interactivity. The ADA framework seeks to sway attitudes toward ability via experiential models that fashionably facilitate assistive awareness.

This research broadly explores the extent to which humans are willing, and desire, to integrate with computing interfaces. It questions how AT could be adopted for conventional device design not only for socially integrative purposes but rather for the further development of richer digital experiences created for the multimodal palette of human perception. In the case studies section, this study assesses the functional viability of two specific ADA tools through a series of psychophysical experiments that evaluate the performance of visually able participants engaging in spatial tasks while deprived of vision and instead wearing visuospatially translating ADA tools. Given that the majority of device displays are centered on vision, the focus of this study is to learn how to supplement visual information, and for this particular study we focus on visuospatial tasks. Through these tests, we examine how successful the participating subjects were in accomplishing visuospatial cross-modality. We consider that non-visual spatial perceptual abilities may be more natural to visually impaired people who are able to process mental imagery. If their physiology is accustomed to using audio and somatic cues in order to perceive spatial information, they would therefore have a perceptual advantage over sighted subjects, who are instead accustomed to perceiving visual phenomena and so imagine spatial information with visual quality. This research takes into consideration the public dialogue caused by the portrayal of ADA as potential or real commercial products through the media. It also analyzes the public reception of ADA tools via artistic dissemination by presenting these prototypes as artworks and collecting feedback from exhibitions via interviews and observation. This research concludes by extracting the essential aspects of ADA and how it relates to the future of wearable computing, honing in on the importance of perceptual diversity and cross-disciplinary approaches to tool design.
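The echo-haptic mapping described for IrukaTact, a sonar distance reading linearly transposed into tactile pressure, can be sketched in a few lines of code. This is a hypothetical illustration rather than the device's actual firmware; the function name, sensor working range, and 8-bit drive scale are assumptions made for the sketch.

```python
def distance_to_pressure(distance_cm, near_cm=5.0, far_cm=60.0, max_duty=255):
    """Linearly transpose a sonar distance reading into a pump drive
    level (0-255 PWM duty), so that nearer objects produce a stronger jet."""
    # Clamp the reading to the assumed working range of the sensor.
    d = min(max(distance_cm, near_cm), far_cm)
    # Invert the linear scale: near_cm maps to max_duty, far_cm maps to 0.
    return round((far_cm - d) / (far_cm - near_cm) * max_duty)
```

In a microcontroller loop, the returned duty cycle would drive the impeller pump; clamping keeps the haptic output within the sensor's reliable range, and inverting the scale makes pressure grow as an object approaches the hand.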

1.1 The Human-Computer Relationship

"With every tool man is perfecting his own organs, whether motor or sensory, or is removing the limits to their functioning"

-Sigmund Freud, Civilization and Its Discontents

All technology is inherently assistive, from a hammer to the mini digital computers we carry in our pockets. The smartphone is a device which has almost become another cognitive human organ, equipped with extra memory and beyond. In the mid 1980s and early 1990s, when the cellphone became a popular commodity, it was a brick-like device with a long antenna providing expensive talk services. In 1992 the first Global System for Mobile Communications (GSM) device became available (Temple 2017). It was able to receive Short Message Service (SMS) messages and looked more like a suitcase than a pocket phone. These devices had limited input and display capacities, most of them only featuring a microphone, a speaker, and a passive-matrix LCD display. From an automobile accessory, the mobile cellphone transformed year by year into a smaller and more compact pocket device. The features of today's smartphones are improvements on the same interface design of three decades ago. Their scale has massively changed, but their form still remains a rounded-corner, rectangular interface. Along the way of the mobile cellphone's evolution, only a few new interactive features have been implemented beyond improvements in screen resolution and portability. The now lost flip-phone design feature possibly came from the interactive link of simulating the action of hanging up a land-line telephone, or perhaps simply to reduce the size. This feature was discontinued through the years, and it was never truly adopted as a user need; since then, more people have come to relate to mobile phones rather than landlines, and the scale of electronic components has shrunk significantly. One novel interaction design feature that still remains in cellphone designs today was introduced in 1992 by Nokia when they filed the patent "Mobile Phone with a Vibrating Alarm" (A.M. 1992). Adding a weighted motor that vibrates instead of, or in conjunction with, an audible signal was the first attempt to expand the mobile's sensory display methods. This was the first time digital haptics became incorporated into these consumer electronic devices. Haptics and other sensory display methods can provide more realistic renderings of digital display experiences; they can also reinforce signaling and more universal interface designs. Adding novel sensory-transducing features to consumer electronic devices demands more attention, as these are the forefronts of HCI development.

Today's typical smartphone displays audible, visual, and haptic information, equipped with a full-sized touch-screen, a flashlight, and a speaker. This efficient machine has a myriad of sensors such as a microphone, accelerometer, gyroscope, GPS module, dual front and back cameras, Bluetooth, WiFi, and other network radio transceivers. The smartphone helps us orient ourselves beyond the spatial constructs of geolocation. It is a compass that guides our place within society, as these devices connect us to the rest of the world and are our portals to the internet. The smartphone is a prosthetic command center from which we transmit

audio-visual messages and record our memories. They are adopted as a multimodal part of our body, giving us extra abilities to connect and orient ourselves. There is no question that smartphones are popular, desirable, and assistive. Perhaps the rectangular form and screen display interface of the smartphone is another passing feature, such as the flip-phone. This device has evolved to become a powerful, efficient prosthetic extension.

In these times, when we are constantly wondering if technology has sufficiently engulfed us, we ask ourselves: how far are we from a machine? After all, we are electric beings. The story of the discovery of the body's electric potentials is closely intertwined with the story of the battery. When Luigi A. Galvani discovered galvanic stimulation, he thought that bodies had an "animal electric fluid" which was different from metal-induced electricity. His associate, Volta, did not believe this fluid was particular to animals. Instead, Volta used his friend's name to coin a new term, "galvanism", meaning a direct current of electricity produced by chemical action, an idea that shortly led him to invent the battery (Houston 1905). Without the electric circuit that encompasses our nerves, we would not have the batteries that power the vital computing extensions we harbor in our pockets.

Technology is the human way of coping with the compulsion to manipulate nature; it is an extension of our agency and motives; it is a survival mechanism. Computing interfaces are so omnipresent in our daily lives that they have become virtually synonymous with the term technology. Through the course of our human-computer integration, the form of interfaces has significantly changed, and the fundamental contributing factor to this phenomenon is scale. As computing devices become smaller and more efficient, they further integrate with our physiology. This pervasive incorporation has increased the electronics consumer's appetite for interfaces that provide novel sensory experiences. Electric devices have reached another level of functional plasticity as electronic components and modules are readily available, and programmatic knowledge is easily accessible. The technical functionality of wearable computing media is fluid and interchangeable; however, interaction and experiential design are the critical defining guides to the extent at which we dip within the virtual.

The miniaturization and wearability of computing devices have also become an unexpected catalyst between industries, for example, in the convergence of medicine and entertainment. Medical devices that were once costly and only available to doctors as diagnostic tools are now being worn by the average person on their wrist as a fashionable accessory. Wearable devices that track bio-signals such as heart rate, perspiration, and sleep quality give users a new intimate scope by which to learn more about themselves. Consumers embrace the commodification of a quantified self, eager to adopt new sensory avenues by which to enhance their physiological mechanisms. Whether to collect innumerable amounts of data or expand their motor abilities, these consumers seek tools that afford them a fuller experience and an ever more encompassing perceptual awareness. The growing interest in biometrics-oriented devices is an exemplary phenomenon of the motives that drive innovative experience and interaction design. This eager adoption provides an outlook on the direction in which

humans are co-evolving with their machines, and reveals the collective, yet intimate, demand for experiential intensification.

This inward shift towards human-aware technology has also influenced the drive for innovation in the area of information displays. The visual representation of information is the primary focus of digital display technology because human perception is highly visual. Scientists estimate that the brain dedicates 50% of its processing to visual information (Hagen 2012). At this specific point in time, we can see Virtual Reality (VR) headsets flooding the markets, absorbing teenagers in a virtual environment. Ordinary people glued to their palm-held devices are squinting over a small screen that contains an entire toolkit. It is the Swiss Army knife of digital experience, designed to assist with a myriad of tasks, whether navigational, social, or biometric. Wrist or pocket screens and stereoscopic wearable displays are all examples of the central role that vision plays in digital display design. While the brain is a multimodal organ and its visual mechanisms process many other sensory modalities, very few interfaces have ventured outside of screens and taken advantage of this pan-sensory aspect of perception.

1.2 The Plasticity of the Mind: Sensory Substitution

Sensory substitution devices and theories suggest that the brain is moldable and adaptable based on conditional circumstances such as change in behavior or environment (Pascual-Leone et al. 2011). Also known as brain plasticity, this area of study aims to answer one of the most intriguing questions of neuroscience: how does the brain adapt its functionality based on sensory input, experience, learning, and injury? (Kaas 1991; Björkman et al. 2011). Plasticity can occur rapidly with an intervention based on decreased inhibition, meaning that the receptive field occupied by a past signal will expand to enable other neurons to be excited by a new stimulus (Sanes et al. 2000). The perceptual mapping of sensory substitution is more complicated than extending a signal from one neuron to another; it takes practice. Take for example a person who is congenitally blind; they were born without the ability to see. Suppose they only knew objects based on their tactile information. They know what a ball is, as they have touched and played with one before. Imagine that this person suddenly gains the ability to see. They open their eyes and finally see a ball in front of them for the first time. Could they distinguish the ball based on sight alone? This is Molyneux's philosophical problem, and it has provoked sensory mapping research since 1688 (Degenaar et al. 2005). His question was seemingly simple but complicated to answer by postulation alone, because in his time there were not many treatments or cures for any kind of congenital blindness. There were a few attempts at experimentation to answer this problem, but they were deemed inadmissible given the lack of controlled circumstances. However, Pawan Sinha responded to the question with proper experimentation in 2003 (Ostrovsky, Sinha, et al. 2006). He determined that the answer to Molyneux's problem is that, in the case of the previous example, the person would not be able to distinguish the ball by sight alone. This is because congenitally blind individuals who suddenly gain their sight cannot immediately relate the visual phenomenological information from the same object they only knew by tact alone (Degenaar et al. 2005). Instead, the connection between the senses is learned by experience. Pawan

Sinha stated in his research that his subjects were able to create this tactile-to-visual link within a few days (Ostrovsky, Sinha, et al. 2006). This means that sensory substitution is a learning process, and not an immediate outcome. The ways in which the brain works to capture and adapt to experience have made researchers and most people alike incredibly curious.

Paul Bach-y-Rita was a pioneer in the field of neuroplasticity. He learned about the brain's plasticity as he assisted his father to regain control and recover some of his lost sensorimotor abilities after a stroke left part of his body unresponsive (UW 2006). Throughout his career, he created successful devices that enabled congenitally blind individuals to gain a new perspective on the world via tactile-vision. He did this by creating a series of devices that transposed visual information to a matrix of haptic transducers. His most famous device was the BrainPort, which consisted of a pair of opaque glasses with a small camera attached to a tongue display unit that induced small electric currents in correlation to the image taken from the camera (BrainPort 2017). The designer of this masterful neurological experience device shared some of his views on the future paths for sensory substitution prototype development, as quoted below:

"Research up to the present has led to the demonstration of reliable prototypical devices that use sensory substitution to restore lost sensory function. There are three envisioned paths of future development. First, and most urgent, is the development of robust and relatively inexpensive implementations of the technology to make it accessible to a wide range of patients suffering sensory loss. Second, moving beyond the restoration of lost senses, it should be possible to use the same technology to expand human sensibilities, for example, enabling the use of night vision apparatus without interfering with normal vision. Finally, the technology enables a whole range of non-invasive low-risk experiments with human subjects to gain a deeper understanding of brain plasticity and cognitive processes."

-Bach-y-Rita and Kercel (2003) in their article "Sensory Substitution and the Human–Machine Interface"

Sensory substitution is not a function of cancelling the properties of one organ and replacing them with another. Instead, the substitution occurs by a shift of focus from one sensory pathway to another, and it can be achieved for certain multimodal precepts. Experiences that are rich in phenomenological information can be perceived by multiple types of sensory processes. The combination of HCI and brain plasticity methodologies has provided new types of tools useful for achieving sensory substitution via computing interfaces. These sensory-mixing techniques are applied to inquire on the adaptability of the brain and to translate its perceptive functions. As participants engage with these sensory remodeling devices, they re-channel their

senses. Sometimes they retain perceptual agency over the device, and other times the device overrides their physiological perception of certain stimuli. Some interactions of sensory interfaces are already hardwired, such as bone conduction hearing. Others require training and practice in order to achieve a level of brain plasticity and bio-cooperation. For example, in tactile-sight, the perception of visual information from somatoception requires tactile discrimination, which is a cognitively related task (Bach-y-Rita 2003). Sensory substitution devices function as training devices until the experiential connection is mapped in the brain.

1.2.1 Towards a Future of Electric Savants

One can assume humans walk upright, talk, and have the possibility to sense light in the visible spectrum of around 430–770 THz and sound in the audible range of 20–20,000 Hz. These are all normal human abilities that can be supported by utilitarian technology, but what about talents or disabilities? Are we striving to remain within these capabilities? How will these traits inform the path of our adaptability and HCI co-evolution?

Disabilities are often seen as the disadvantages that a person has in comparison to the average abilities of normal individuals, rather than regarding the specialization of their perceptual abilities. Why should we not refer to deaf individuals as "visually enhanced" or the blind as "sono-somatic" humans? The plasticity of the mind allots processing currency depending on the necessity of specific functions based on sensory ability. Disabled individuals are often seen to compensate their perceptual processes with their abled senses more acutely than any individual with normal physiology, e.g. the heightened sense of hearing in a blind person. This adaptability of the brain to devote more processing power to any of the senses is also present in cases of the savant syndrome.

Although extremely rare, this syndrome occurs in some disabled individuals who, beyond their disability, display "islands of genius", compensating their 'incapability' with strokes of genius in mathematical calculation, memorization, and lingual and artistic abilities, among others (Treffert 2005). This syndrome, occurring in one of ten autistic individuals, has been previously described as an "autistic savant" syndrome even though only 50% of people displaying these characteristics are actually autistic. The other 50% of people displaying the savant syndrome are disabled individuals who have sustained damage to their central nervous system either by an injury or a disease (Treffert 2009). This condition can be acquired during a lifetime, accidentally or congenitally. According to Treffert (2009), these extraordinary abilities can be acquired and lost in inexplicable instances. Damage to the central nervous system can force the brain to redistribute its processing mechanisms, sometimes resulting in this savant effect. Often, this syndrome is associated with some form of dysfunction in the left hemisphere of the brain with a concurrent compensation from the right hemisphere, leading to an affinity for non-symbolic skills (Sacks 2007).

Intellectual capacity is probably the most sought-after ability that humans would wish to enhance. What type of device could safely and temporarily induce enhanced intellectual abilities similar to the savant syndrome without inhibiting other brain

functions? Can the savant syndrome be electrically induced by inhibiting the left hemisphere of the brain? Research suggests that it is possible to artificially induce this type of savant state. In his studies, Allan Snyder (2009) argues that these savant skills are latent in all humans. He hypothesizes that people who present these abilities have privileged access to raw, lower-level, less-processed information before it is labeled into holistic concepts. And while all of us have access to this information, it is somewhere beyond the grip of conscious awareness and irretrievable by introspection (Snyder 2009).

There are various types of transcranial stimulation; these are methods by which to artificially induce or inhibit cortical excitability. Most commonly used are transcranial direct current stimulation (tDCS) and repetitive transcranial magnetic stimulation (rTMS). While these methods of cortical excitability require applying magnetic fields or electric currents to the brain, they are relatively low-frequency fields and weak currents. They only virtually affect normal brain functions for the limited time in which they are applied, sometimes lasting only a short period afterwards. It has been found that, in the case of tDCS, anodal stimulation enhances cortical excitability, while cathodal stimulation inhibits it (Nitsche et al. 2003). Transcranial stimulation has been used to study many of the brain's functions, allowing researchers to selectively stimulate cortices and gain new insight on how specific mechanisms operate. For example, this method has helped patients who have damage to their motor cortex to regain some motor control functions (Nitsche et al. 2003). Naturally, both tDCS and rTMS methods have been used to artificially induce cortical activity networks similar to those of people who exhibit the savant syndrome. The USA Air Force has applied the tDCS method on some of their soldiers as they engaged in drone guiding simulations. The application of 2 mA direct current for 30 minutes to the left anterior temporal lobe has been shown to shorten the training time of these soldiers by half (Nelson et al. 2016).

Beyond the applications of electric savant-inducing technology for research purposes, a group of Michigan hackers created the GoFlow (2012), a simple tDCS instrument with two electrodes connected to a 1.5 V battery. This technology is so simple that, if it were found to be completely benign and augmentative, it could one day become a required school supply. It is open-source and available to anyone. However, there is not enough conclusive information about the long-term effects induced by tDCS. It should be noted that interfaces that apply electricity to the body are considered non-invasive because the device itself is not implanted within the body. This type of AT is assistive to humans with normal abilities by disabling their normal functions, inhibiting some abilities to induce others. Systems such as the GoFlow have the potential to further natural human ability by presenting a curious dilemma: while we gain new abilities, it is often at the expense of another. The phenomena of electric enhancement to the body may make some forms of disability desirable. To what extent are we willing to modify our bodies to incorporate technology that, while enhancing us, temporarily disables us? How can we gain new sensibilities without the danger of loss? Human adaptability is guided by a technological co-evolution. HCI guides the balance between the switches that add and/or subtract human abilities, and as it tailors our bodily extensions, it should be crafted with

careful consideration of to what extent it is assistive or disabling.

2. Principles of Sensory Perception

The performance of being and becoming is drawn by intuition, which goes beyond the synthetic a priori and the spectacle view of reality. The fabric of our experience is a malleable control station that manages spiraling inputs and outputs in between the synthesis of our reality. These signals are interpreted as a map, a direct imprint of our interactions with the surrounding environment, of where we are and how we are. This mass of knowledge and structural material, consisting of directions and pathways that direct us in a loop of empirical information, memory, and deduction, forms the lenses of perception. The a priori structure of organizing thought and sensory data is spatiotemporal in nature (Brook 2016). The a priori structure of the mind is an imprint of the jewel of our environment, the schemata by which we draw our intuition (Stearns 2014, LAO 2017).

Traditionally we learn about the five senses, vision, hearing, touch, smell, and taste, as unimodal experiences. However, increasing evidence in the fields of psychology, psychophysics, and neuroscience suggests that the brain is a multisensory organ. Organisms have specially calibrated anatomical receptors extending to the periphery of their neural network. The fundamental bio-receptors classify into three main categories: photoreceptors, which perceive light; chemoreceptors, which sense chemicals; and mechanoreceptors, which are sensitive to vibrations. These anatomical mechanisms are the essential structures that give rise to the complex systems that we understand as any given sensory modality. Each receptor has evolved to receive specific energy variations from its environment. Organs such as the eardrum slowly developed as a small part of the larger auditory nervous system through a lengthy process of adaptation and fine-tuning to the specific needs of human hearing. Mechanoreception is the ability to detect and respond to touch, sound, and changes in pressure or posture. The primary physical need to detect movement or any other environmental signal arises from the same evolutionary purpose of developing a physiological strategy for survival, an ability to respond to the signals of the environment. Most organisms have some form of mechanoreception; in humans, hearing and touch are both forms of detecting force, movement, and vibration. Some organisms co-adapt their sensory system to the same signals emitted from the environment, as humans are not the only organisms that can hear sound. Other species such as elephants have more sophisticated infrasonic detection mechanisms, able to sense a much lower frequency range than that of humans. Generally, organisms have multiple receptors that are complementary to their particular physiological requirements and niche habitats. They sense a whole range of vibrations from their surroundings, absorbing a myriad of information streams in their periphery, while selectively synthesizing the convergence of these signals. As signals become coherent, either by spatial, temporal, or isomorphic direction, the sensorimotor mechanisms adjust their attention in synchrony to the relevant signals. There are no organisms in the animal kingdom that have complete separation of sensory processing (Stein et al. 1993). The coordination of the senses is the very mechanism that allows organisms to meaningfully interact with the environment as

well as attend to their own physiological needs. Extensive research has been done on the specific sensory receptors and their specific driving mechanisms; however, it is still not clear how the processing of these signals occurs.

2.1 Multisensory Integration

According to Driver and Spence (2000), most textbooks about perception are written by separating each sensory modality and regarding them in isolation. However, the senses receive information about the same objects or events, and the brain forms multimodally determined percepts from a variety of informational inputs. After the 1980's, perception researchers began acknowledging that studying the interplay between the components of the sensory system network is equally as important as understanding these modalities in isolation (Driver, Spence 2000). Multisensory integration is the process by which information from different sensory modalities is combined to form perception, decisions, and overt behavior (Stein et al. 2009). This cohesion of sensory modalities is what gives rise to meaningful perceptual experiences. The interaction of the primary cortices is well established in the field of neuroscience by observing the superior colliculus and its neuronal responses (Stein 2012). The three principles of multisensory processing are: the principle of inverse effectiveness, which states that multisensory integration is stronger for stimuli that are usually less effective in driving neuronal responses independently; the temporal principle, which states that the multisensory integration of separate modalities is stronger if the signals from these modalities arrive at the same time; and lastly, the spatial principle, which states that multisensory integration is stronger if the stimuli from various modalities arise from approximately the same location (Stein 2012).

The multisensory integration theories have not always been at the forefront of perception studies, as the unimodal approach to understanding the senses and perception has dominated the literature. The feasibility of neural mapping of the brain has ignited a renewed interest in the perceptual mechanisms of the brain, allowing scientists to question the reductionist approach of understanding perceptual experience in isolated instances and rather to embrace a gestalt approach towards defining these mechanisms. Gestalt is a school of psychology that was established in the 20th century, and from this movement derived a new foundation for the contemporary study of perception (Britannica 2017). The word “gestalt” derives from the German language and means placed or put together; it is often translated to mean “form”, while in psychology the term refers to a “pattern” or “configuration” (Britannica 2017). This movement was founded with the purpose of adding a humanistic dimension to the sterile approach to the study of mental processes (Britannica 2017). This school of thought defined two fundamental principles: the first was the principle of totality, which stated that conscious experience should be regarded holistically, as a totality of the dynamic interactions within the brain; the second was the psychophysical isomorphism principle, which states that perceptual phenomena correspond with activity in the brain (DM 2017).

In accordance with the gestalt psychology movement, Gonzalo Justo also defined psychophysical isomorphism as a

principle of perception, deriving from his research on brain dynamics in 1947. During his investigations, he realized that there was an inverted perception related to both vision and somatoception which extended to all the sensory systems with a spatial character (Fonrodona et al. 2015). In his studies on the mechanisms and structures of spatial direction, Justo concluded that there is a cerebral recruitment representing a “spiral trajectory” that seeks a successive balance between the projection area of the brain and central action, meaning that in the perception of magnitude and direction there is a “sensory-cerebral correspondence”, also defining this phenomenon as psychophysical isomorphism (Fonrodona et al. 2015).

2.3 Visuospatial Perception

The perception of space is a visuospatial skill that includes navigation, depth, distance, mental imagery, and construction. These functions occur in the parietal cortex at the highest level of visual cognitive processes (Brain Center 2008). This multimodal ability helps us to process, identify, and imagine visual information from spatial relationships between objects in space. The visuospatial constructs are preconscious, and they play an important role in guiding the visual flows of locomotion and actions such as pointing (Mountcastle, Steinmetz 1990). This ability is involved in accurately reaching for objects in our visual field. This skill underlies our ability to move around in an environment and orient ourselves appropriately. It is so vital to our survival that basic functions such as eating or walking would not be possible without it.

Spatial cognition is considered a supramodal function constructed from converging inputs to the posterior parietal cortex from all sensory modalities, with a significant contribution from vision (Ungerleider & Mishkin 1982). The parietal lobe integrates information from multiple sensory receptors, enabling complex tasks such as the manipulation of objects (Blakemore and Frith 2005). The somatosensory cortex and the visual dorsal stream are both parts of the parietal lobe. The ‘‘where’’ of visuospatial processing occurs in the dorsal stream, which also serves the ‘‘how’’ of vision for action, while the ventral stream supports the ‘‘what’’ of object identification and imagination (Goodale and Milner 1992; Mishkin 1982). The parietal-occipital region sits between the parietal, temporal, and occipital fields, creating the junction of the tactile-kinesthetic, auditory-vestibular, and visual cortical centers. The principal function of these fields is to provide orientation within extrapersonal and intrapersonal space (Glezerman et al. 2002). These areas are also known as the extended Wernicke's area, which is known for language associations (Ardila et al. 2016).

2.3.1 Photon-less Vision and Blind Imagination

“We see with the brain, not the eyes” -Bach-y-Rita, 1972

Studying the strategies and abilities of the sensory impaired, particularly blind subjects, is the most informative method by which to understand the mechanisms underlying perception. Many studies have concluded that the visual cortex is activated at a higher level in blind individuals when performing tactile-spatial tasks such as braille reading or auditory processing than in visually abled individuals (Sadato et al. 1996, Merabet 2016). In his study, Sadato (1996) used positron emission tomography (PET) scanning to

measure activity in the visual cortex of his participants, both blind braille readers and visually abled, while they engaged in somatoceptory discrimination tasks. His findings suggested that both the primary and secondary visual cortices were activated in blind participants during the discriminatory task, while the visually abled subjects showed no activation in the region, and both groups showed no activation when there was a tactile stimulus that required no discrimination (Sadato et al. 1996).

Visuospatial perception and visuospatial imagery are two different skills; however, they share some of the same driving neurological and psychological mechanisms. Mental imagery forms from multimodal sensory memories that relate to a particular object or event. This process of imagination is not exclusive to the visually abled; instead, blind subjects employ alternative strategies for recalling or manipulating information. However, the role that visual experience plays in mental imagination has not been definitively established. Researchers have differing opinions about the extent to which visual experience informs imagery. Some argue that phenomenological visual experience is imperative for operations involving imagery, while others suggest that blind individuals are as effective, or more so, in operating imagery (Szubielska 2014). The basis of this contradiction may be differing methodologies and conceptions of the process of mental imagery. For example, imagine an apple; it has multiple informational properties that characterize its color, shape, weight, smell, texture, etc. The multimodal experience of this object, apple, can be imagined as a chemical, tactile, or visual image. When a task requires amodal spatial images from any of the sensory modalities, blind individuals perform as successfully as sighted subjects (Szubielska 2014).

The spatial reference framework from which we derive information about the environment can be conceived of by either an egocentric or an allocentric model. The allocentric representation considers information from an object-to-object reference framework, for example imagining or measuring distances by calculating the difference between objects, while an egocentric model is concerned with a self-to-object reference model, where the body at the center of experience gathers information from its movements and signals. Some studies have found that some congenitally blind individuals may have increased difficulty in allocentric spatial representation (Pasqualotto et al. 2012). Pasqualotto et al. (2013) suggested that visual experience may be necessary in order to develop a preference for an allocentric framework. In this study, they found that blindfolded and late blind individuals prefer an allocentric reference frame, while congenitally blind participants prefer an egocentric reference frame (Pasqualotto et al. 2013). This preference for egocentric representation in blind individuals is also apparent while building spatial models from verbal descriptions of routes, unlike sighted individuals, who were more efficient at forming these spatial constructs from survey descriptions (Noordzij et al. 2006).

In the case of blindness, the lack of visual input enables new neural structures to emerge, organizing the perceptual system to prioritize non-visual modalities. What if someone's visual perception was intact but their visual imagery was gone? They would have suddenly suffered a case of blind imagination. This was the case of MX, a man who lost his ability to see his internal visual imagery while retaining

intact performance in visuospatial tasks, and the subject of the study conducted by Zeman, Sala, Torrens, et al. (2010). Their findings suggested that MX adopted new cognitive strategies in order to perform the imagery tasks. While performing a mental rotation task, MX managed a normal accuracy range by using a perceptual mapping strategy. He also managed normal accuracy when attempting tasks of visual imagery by verbally encoding them (Zeman et al. 2010). There have been other cases of blind imagination, usually in individuals that have sustained a head injury. In his research studies, Brain (1954) found that some subjects were still able to draw maps of their houses and their route from home to work; however, they reported great difficulty as they had to reimagine spatial cues without the visual imagery to which they were formerly accustomed. One of his subjects reported: “When I dream, I seem to know what is happening, but I don't seem to see a picture. I can dream about a person without seeing them, and I can remember the person, but not having seen them.” (Brain 1954). These examples suggest that there is a dissociation between the phenomenal experience of visual imagery and the performance of visual imagery tasks.

2.3.2 Seeing Space with Sound

Audition also informs visuospatial processes by the sound's sensed time of arrival, which is processed by the parietal cortex. Manipulating the speed at which audio reaches each ear can result in realistic, artificially generated effects. The 3D3A Lab at Princeton University (2010), led by Professor Edgar Choueiri, studies the fundamental aspects of spatial hearing in humans. They have developed a technique for sound playback that results in the realistic rendering of three-dimensional output, making it possible to simulate the sound of a flying insect circling one's head with laptop speakers and without headphones. Their research hones in on the few cues by which humans perceive the provenience of sound. One signal is the interval of time by which audio reaches one ear before the other. The second cue is the amplitude or volume differential of the sound arriving at the two ears (Wood 2010).

Human echolocation uses the same principles as 3D-audio recreations, accounting for time intervals and volume differentials. Echolocation is a means for navigation, object location, and even material differentiation. Mammals such as bats and dolphins have highly sophisticated hearing systems that allow them to differentiate the shape, size, and material of their targets. Their auditory systems are specialized based on their different target needs. Dolphins tend to use broadband, short-duration, narrow beams that are tolerant to the Doppler effect, while bats use more extended echolocation signals that seek the detection of the Doppler shift (Whitlow 1997). In humans, echolocation is a learned process, and some humans depend on this skill for navigation, target location, and material differentiation. There are no organs in the human body specialized for echolocation. Instead, our brain is adaptable, and sensory substitution can achieve this technique.

The World Access for the Blind is an organization that helps individuals echolocate, their motto being ‘‘Our Vision is Sound’’ (2010). Daniel Kish is a blind, self-taught echolocation expert who has been performing this skill since he was a year old. He is the president of this organization and describes himself as the ‘‘Real Batman’’ (Kish 2011).
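The binaural cues described above (the arrival-time interval and the level difference between the two ears) and the click-to-echo timing that echolocators exploit can be sketched numerically. The following is an illustrative sketch, not taken from the thesis: the spherical-head (Woodworth) approximation and the head-radius value are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 degrees C
HEAD_RADIUS = 0.0875     # m; an assumed average head radius

def interaural_time_difference(azimuth_deg):
    """Approximate the interaural time difference (ITD) with Woodworth's
    spherical-head model: path difference = r * (theta + sin(theta))
    for a distant source at angle theta from straight ahead."""
    theta = math.radians(azimuth_deg)
    return HEAD_RADIUS * (theta + math.sin(theta)) / SPEED_OF_SOUND

def echo_distance(round_trip_s):
    """Distance to a reflecting surface from the delay between an emitted
    click and its returning echo; the sound travels out and back."""
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A source at 90 degrees (straight toward one ear) gives the largest ITD,
# on the order of 0.6-0.7 ms; a source dead ahead gives zero.
print(interaural_time_difference(90))   # ~0.00066 s
print(interaural_time_difference(0))    # 0.0 s
# An echo arriving 10 ms after a mouth click implies a surface ~1.7 m away.
print(echo_distance(0.010))             # ~1.715 m
```

The volume cue follows the same geometry: the far ear receives a weaker, head-shadowed signal, which is why only these two differentials need to be manipulated to place a virtual sound source around a listener's head.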

Kish is a master of echolocation; he interprets the return of his mouth clicks, gaining a rich understanding of his spatial surroundings. His and some other congenitally blind individuals' early adoption of echolocation techniques seems to develop naturally. Some studies show that blind individuals demonstrate exceptional auditory spatial processing. These results suggest that sensory substitution is taking place, and the functions of the occipital lobe are appropriated for audio processing stemming from their deprived visual inputs (Collignon et al. 2008).

Kish's (2011) dissemination of his echolocation technique has helped hundreds of blind people to regain ‘‘freedom,’’ as he describes it. Those who have learned to exercise mouth-clicking echolocation can navigate space, ride bikes, skateboards, and ice skate. They may also gain the ability to locate buildings hundreds of yards away with a single loud clap. Their clicking is a language that asks the environment ‘‘Where are you?’’ with mouth sounds, cane tapping, and card clips on bicycle wheels (2010). These clicks return imprints of their physical encounters with their environment, as if taking a sonic mold of space.

The skill of echolocation is a learned behavior that sometimes occurs naturally in congenitally blind individuals. It seems to be a survival mechanism by sensory substitution, where areas of the brain re-wire processing power from one absent signal to another. Some blind individuals have learned to process the auditory feedback as a sonic mold of their environment by instruction, for example, the pupils of Daniel Kish. Since human echolocation skills do not always develop naturally, it is possible that sighted individuals can also achieve this ability when presented with the ideal circumstances. In this case, sighted individuals should perhaps restrict their vision to allow the auditory feedback to be processed by the occipital lobe, similar to the perceptual conditions of a blind individual.

3. Assistive Device Art

This chapter brings forward the concept of Assistive Device Art (ADA) as a way of defining the phenomena that surround prosthetic-related artworks where Assistive Technology (AT) becomes a product of desire for abled bodies. Assistive Technology is a gateway to human enhancement, inspiring the limits of extra-sensory computing interfaces that, together with Art as a means of aesthetic and experiential study, result in ADA tools. These interfaces inquire about the plasticity of our physiology; they facilitate alternative sensory viewpoints, opening empathic channels by activating curiosity. They seek to enhance perspectives, to display new ways of seeing and sensing our environment, our cities, our landscapes. ADA goes beyond replacing or restoring function; these tools are not seamless utilitarian AT, they are playfully designed to reveal the multimodal percepts that inform our sensory awareness. They give participants new abilities by refocusing their attention to alternative senses, from which they rediscover their surroundings, leading them to reflect upon what constitutes skill. ADA tools borrow from a mixed methodology that includes psychophysical and user experience evaluations, and rapid prototyping design and development within an artistic context. These artworks reimagine technological displays through the lens of unique physiological architectures that inspire the smell of time, the sound of space. The foundation of these ADA interfaces and functionalities comes from an interactively

assistive solution.

This thesis brings forward the concept of Assistive Device Art (ADA) as coined by the author. It is a definition for the emerging cultural phenomena that integrate Assistive Technology and Art. These works of art involve the mediation of sensorimotor functions and perception, as they relate to psychophysical methods and conceptual mechanics of sensory embodiment. Psychophysics is the methodological quantification of sensory awareness in the sciences, while the theoretical aspects of embodiment have a more exploratory nature: they seek to investigate perception open-endedly. The philosophical view of embodied cognition depends on having sensorimotor capacities embedded in a biological, psychological, and cultural context (Rosch et al. 1991). The importance of this concept is that it defines all experience as not merely an informational exchange between environment and mind, but rather an interactive transformation. This interaction of the sensorimotor functions of an organism and its environment is what mutually creates an experience (Thompson 2010).

3.1 Perception Spectacles

Assistive Device Art sits at this transformational crossroads of sensory reactions in the form of tools for rediscovering the environment and mediating the plasticity of human experience. This art form interfaces electromechanically and biologically with a participant. It is functional and assistive, and its purpose is, furthermore, a means for engaging the plasticity of their perceptual constructs. Assistive Technology (AT) is an umbrella term that encompasses rehabilitation and prosthetic devices, such as systems which replace or support missing or impaired body parts. This terminology implies the demand for a shared experience, as the scope of restoration drives it to normalcy, rather than the pure pursuit of sensorimotor expansion. Regardless of its medium, technology is inherently assistive. However, AT refers explicitly to systems designed for aiding disabilities. Organisms and their biological mechanisms are a product of the world in which they live. All bodies have evolved as extensions of the sensory interactions with our environment. In our quest as human organisms, the need to reach, manipulate, feel, and hold manifested as a hand. Hutto (2013) defines the hand as ‘‘an organ of cognition’’ and not as subordinate to the central nervous system. It acts as the brain's counterpart in a ‘‘bi-directional interplay between manual and brain activity’’.

The prosthesis is an answer to conserve the universal standard of the human body, as well as the impulse to extend beyond our sensorimotor capacities. The device, as separate or as a part of us, is mechanized as an extension of our procedural tool, which is the agency that activates the boundaries of our experience. This intentionality transposed through a machine or device belongs to the body with a semi-transparent relation. In this context, transparency is the ease of bodily incorporation, the adoption of a tool that through interaction becomes a seamless part of the user. Ihde (1979) conceptualizes the meaning of transparency as it relates to the quality of fusion between a human and machine. Interface transparency is the level of integration between the agent's sensorimotor cortex and its tool.

ADA tools serve the participant as introspective portals to reflect on this process of incorporation. These cognitive prosthetic

devices become ever more incorporated with practice, more transparent, enhancing our ability to re-wire our perception. The transformation of our awareness does not only happen by an additive and reductive process of incorporation, but also by the accumulative experience gained through the shift these devices/filters pose on the participant's reality. Assistive Device artworks are performances that reveal the plasticity of the mind while it is actively incorporating, replacing, and enhancing sensorimotor functions. These performances are interactive and experiential, aided by an interface worn between the sensory datum and the architecture of our physiology. Art is about communication and the perception of ideas. These performances transpose specific sensations to convey designed experiential models. As participants perform for themselves by engaging these tools, they explore a new construct of their mediated environment and the blurring of incorporation between body agency and instrument. ADA centralizes and empowers the urge to transcend bodily limitations, both from necessity as Assistive Technology and as a means to playfully engage perceptual expansion. The desire to escape the body is what activates our state as evolving beings impelled by the intentionality of our experience.

3.2 Device Art: Playful Understanding of the Mind

The term ‘‘Device Art’’ is a concept within Media Art which encapsulates art, science, and technology with interaction as a medium for playful exploration. Originating in Japan in 2004 with the purpose of founding a new movement for expressive science and technology, it created a framework for producing, exhibiting, distributing, and theorizing this new type of media art (Iwata 2004). This model takes into consideration the digital-age reproduction methods which make possible the commercial attainability of artworks to a broader audience. As a result, this movement expanded the art experience beyond galleries and museums and returned to the traditional Japanese view of art as being part of everyday life (Kusahara 2010). This consumable art framework is common in Japanese art, as it frequently exists between the lines of function, entertainment, and product.

This art movement is part of the encompassing phenomenon of Interactive Art, which is concerned with many of the same guidelines as Device Art, but envelops a greater scope of work. Interaction in art involves participants rather than passive spectators, where their participation completes the work itself. The concept of activation of artworks by an audience has been part of the discussion of art movements since the Futurists' Participatory Theater, and was further explored by others, such as Dada and the Fluxus Happenings. These artists were concerned with bringing artworks closer to ‘‘life’’ (Dinkla 1996). Interactive Art usually takes the form of an immersive environment, mainly installations, where the audience activates the work by becoming part of it. The ubiquitous nature of computing media boosted this movement, making Interactive Art almost synonymous with its contemporary technology.

Device Art, albeit interactive, technologically driven, and with the aim of bringing art back to daily life, has some crucial differences from Interactive Art. Device Art challenges the value model that surrounds the art market by embracing a commercial mass

production approach to the value of art. Artworks of this form embody utility by their encapsulation as a device, which is more closely related to a product. The device is mobile or detachable from a specific space or participant; it only requires interaction. Device Art frees itself from the need of an exhibition space; it is available to the public with the intent of providing playful entertainment (Kusahara 2010).

3.3 Cultural Aesthetics: Products of Desire

Assistive Device Art stems from the merging of AT and Device Art, and it is a form of understanding and expanding human abilities. This concept is closely related to the concurrent movements of Cyborg Art and Body Hacking, which include medical and ancient traditions of body modification for aesthetic, interactive, and assistive purposes. They explore the incorporation of contemporary technology such as the implantation of antennas and sensor-transducer devices. For instance, in 2013, Anthony Antonellis performed the artwork Net Art Implant, where he implanted an RFID chip in his left hand. This chip loads re-writable Web media content once scanned by a cell phone, typically a 10-frame favicon animation (Antonellis 2013). Cybernetics pioneer Dr. Kevin Warwick (2002) conducted neuro-surgical experiments where he implanted the UtahArray/BrainGate into the median nerve of his arm, linking his nervous system directly to a computer with the purpose of advancing Assistive Technology. Works within the Cyborg and Body Hacking movements often include, if not almost require, the perforation or modification of the body to incorporate technology within it.

Assistive Device Art is non-invasive or semi-invasive. It becomes incorporated into our physiology while remaining a separate, yet adopted, interactive device. The appeal behind these wearable prosthetic artworks is that users are not compromising or permanently modifying their body; instead, they are temporarily augmenting their experience. Cyborgs, people who use ADA, and those who practice meditation are all engaging neuroplasticity. These engagements fall under various levels of bodily embeddedness, but they all show a willingness to interface with their environments in a new type of transactional event.

The prosthesis is becoming an object of desire, propelled by the appeal of human enhancement. Not only does it provide a degree of ‘‘normalcy’’ to a disabled participant, it also enables a kind of superhuman ability. Pullin (2009) describes eyeglasses as a type of AT that has become a fashion accessory in his book, ‘‘Design Meets Disability’’. Glasses are an example of AT that has pierced through the veil of disability by becoming a decorative statement and an object prone to irrationally positive attribution. Somehow, what seems to be a sign of mild disability is a symbol of sophistication. ADA strives to empower people by popularizing concepts of AT and making them desirable. Art is a powerful tool for molding attitudes and proliferating channels of visibility by communicating aesthetic style and behavior. Through the means of playful interaction, phenomenological psychophysics, biomedical robotics, and sensory substitution, ADA is made to appeal aesthetically, become desirable, and be useful inclusively to everyone.

3.4 Assistive Device Artworks: The Prosthetic Phenomena

Auger and Loizeau (2002) designed the Audio Tooth Implant artwork, which consists of a prototype and a conceptual functional model. Their prototype is a transparent resin tooth with a microchip cast within it, floating in the middle. Conceptually, this device has a Bluetooth connection that syncs with personal phones and other peripheral devices. It uses a vibrating motor to transduce audio signals directly to the jawbone, providing bone-conduction hearing. This implant was created with the premise that it would be used voluntarily to augment natural human faculties. The intended purpose of this artwork was to engage the public in a dialog about in-body technology and its impact on society (Schwartzman 2011). This implant device does not function, but it is skillfully executed as a ‘‘look and feel’’ prototype. Because of the artists' careful product design consideration and their intriguing concept, which lay on the fringes of possible technology, this artwork became a viral internet product. Time Magazine selected this artwork as part of their 2002 ‘‘Best Inventions’’ (Schwartzman 2011). In the public eye, this artwork became regarded as a hoax, as the media framed it out of context from its conceptual provenience and confused it with a consumer product. However, this meant that their work fulfilled its purpose as a catalyst for public dialog about in-body technology and, more importantly, it expressed its eager acceptance.

Another work that resonates with this productized aesthetic is Beta Tank's Eye-Candy-Can (2007). Through their art, they were searching for consumer attention, a way of validating the desire for sensory-mixing curiosity. This artwork is also a ‘‘look-and-feel’’ prototype that blurred the lines of product, artwork, and assistive technology. They drew inspiration from Paul Bach-y-Rita's previously mentioned BrainPort, the AT device that could substitute a blind person's sight through tactile stimulation of the tongue. This device includes a pair of glasses equipped with a camera connected to a tongue display unit. The tongue display unit consists of a stamp-sized array of electrodes, a matrix of electric pixels, sitting on the user's tongue. This display reproduces the camera's image by transposing the picture's light and dark information to high or low electric currents (Bach-y-Rita 1998). Beta Tank used this idea and fabricated a ‘‘look-and-feel’’ prototype, which consisted of a lollipop fashioned with some seeming electrodes and a USB attachment as its handle. They also created a company that would supposedly bring this technology to the general consumer market as an entertainment device that delivered visuo-electric flavors (Beta Tank 2007). However, these electrode pops were once again not functional; instead, they conceptually framed the use of AT as a new form of entertainment. In contrast, the GoFlow device, previously mentioned in the introduction, is an ADA example of completely functional commodified AT (2012). Once an open-source DIY device promising intellectual enhancement, the GoFlow project is now a company called foc.us delivering consumer-grade devices (foc.us 2017). From hacking project to consumer electronics manufacturing, this novel idea to productize AT lab technology is an exemplary case for the framework of ADA.

Not all artworks that could be considered ADA have to be desirable products for a consumer market. Most of these artworks are capable of becoming real products, but

their primary objective is to provide an experience. They serve as a platform to expand our sensorimotor capacities as a form of feeling and understanding our transcendental embodiment. One of the earliest artworks that could be an example of AT is Stelarc's (1980) Third Arm, a robotic third limb attached to his right arm. It has actuators and sensors that incorporate the prosthesis into his bodily awareness and agency (Stelarc 1980). This prosthetic artwork does not restore human ability; instead, it expands on the participant's sensorimotor functions, conferring superhuman skills by allowing them to gain control of a third arm. This artwork questions the boundaries of human–computer integration, appropriating Assistive Technology for art as a means of expression beyond medical or bodily restoration. This work is not concerned with the optimal and efficient function of a third hand; it physically expresses the question of this concept's utility, fulfilling its fantasy. Professor Iwata's (2000) Floating Eye is another work that expresses this transcendence of embodiment. This artwork includes a pair of wearable goggles that display the real-time, wide-angle image received from a floating blimp tethered above the participant (Iwata 2000). This device shifts the wearer's perception from their usual scope to a broader, outer-body visual perspective, where the participant can perceive their entire body as part of their newly acquired visual angle. This experience has no specific assistive capabilities regarding restoring human functions, but it widens the perspective of the participant beyond their body's architecture. It becomes a perceptual prosthesis that extends human awareness.

Some psychophysical experimental devices that are not art have the potential to be considered ADA. Take, for example, the work of Lundborg and his group (1999), "Hearing as Substitution for Sensation: A New Principle for Artificial Sensibility," where they investigate how sonic perception can inform the somatosensory cortex about a physical object based on the acoustic deflection of a surface. Their system consists of contact microphones applied to the fingers which provide audible cues, translating tactile to acoustic information by sensory substitution (Lundborg et al. 1999). While the presentation of this work is not in the context of art, its idea of playing with our sensory awareness resonates with the essence of ADA's purpose of exploratory cross-modal introspection.

4. Methodology: Designing an Assistive Device Art Tool

The Sensory Pathways for the Plastic Mind is a series of ADA that explores consumer product aesthetics as well as delivering new sensory experiences (Chacin 2013). These wearable extensions aim to activate alternative perceptual pathways as they mimic, borrow from, and may be useful as Assistive Technology. This series extends AT to the masses as a means to expand perceptual awareness and the malleability of our perceptual architecture. They are created by appropriating the functionality of everyday devices and re-inventing their interfaces to be newly experienced by cross-modal methods. This series includes the two case studies in this research, Echolocation Headphones and IrukaTact Glove, as well as works not included here, such as ScentRhythm and Play-A-Grill. ScentRhythm is a watch that maps olfactory stimuli to the body's circadian cycle. This device provides synthetic chemical signals that may induce time-related sensations, such as small doses of caffeine paired with the scent of coffee to awaken the user. This work is a

fully functional integrated prototype. Play-A-Grill is an MP3 player that plays music through the wearer's teeth using bone conduction, similar to the way cochlear implants transduce sound, bypassing the eardrum (Chacin 2013). Play-A-Grill is not just a "look and feel" prototype; its functionality has been implemented and tested, unlike Auger and Loizeau's Audio Tooth Implant. This series exhibits functional AT that is desirable to the public. The combination of product design aesthetics and the promise of a novel cognitive experience has made some of the works in this series become part of viral Internet media. In this way, they have also become validated as products of consumer desire.

4.1 Interdisciplinary Approaches to Tool Design

Assistive Device artworks are created with an interdisciplinary approach that considers functionality with practical applications, as well as fulfilling an aesthetic, experiential, and awareness-driven purpose. While these strategies are not mutually exclusive, there is a delicate balance in achieving a satisfactory evaluation method, where one approach is not more important than the other. For instance, in the design and concept formulation of the Echolocation Headphones, the technical functionality of the device is crucial to achieving its purpose as a perceptual introspection tool. In this specific case, the Echolocation Headphones first need to aid in human echolocation to provide the experiential awareness of spatial perception. In the case of IrukaTact, the first identifiable need is to provide an effective way of displaying haptic information underwater. As for the echolocative applications of both interfaces, they must be designed to display a signal with congruent directionality. This means that the artificial echo-signal must originate from an egocentric location, matching the participant's original modalities through which they usually intake spatial information. These visuospatial ADA tools must be able to enhance the perception of space through other modalities, training participants to gain a new relationship to their surroundings. While AT is designed to specifically aid and, perhaps, restore functions, an Assistive Device artwork is concerned with the playful means by which an experience forms perceptual awareness. Assistive Technology aims to develop more efficient systems by evaluating and comparing devices to previously existing products, whereas Assistive Device artworks are less concerned with seamless utility or ability restoration. Instead, these artworks are for a diverse participant base, designed with an experiential basis by which all participants can expand their perception, regardless of their need for echolocation as a means of daily navigation.

The interaction design of ADA pieces has a foundational basis in appropriating the wearable functions, uses, and interactions of common wearable devices, such as goggles, watches, spoons, and mouthpieces. They are often reconstructions of these devices with an exchange in sensory functionality. Their interaction has been reinvented not solely for accessibility purposes, but rather as experiential artwork. ADA applies AT methods, such as cross-modal psychology applications, to common devices that in their form and utility allow participants to question how we learn and how our senses adapt to new channels of perception. The aim of ADA development is to create a device and performance composition for the translation, adaptation, and learning of new sensory

processes. This framework sits within the fields of cross-modal psychology, assistive technology, and interaction design. Many of the applications for sensory substitution experiments emerge from accessibility needs, that is, the quest for making technologies and information available to anyone. ADA is a type of testing hardware that provides a fresh outlook on new human–computer interfaces centered on sensory translation and adaptation. Though not all ADA tools prove to be entirely efficient, AT efficiency is not the main goal of these interfaces. The goal is to create interfaces that expand experiential awareness and to provide insight into the limits at which normally abled participants are willing to engage with perceptually-shifting technology.

In the world of consumer electronics, there are well-defined aesthetics that drive product appeal. ADA often uses the aesthetics of product design, resonating with consumer culture in a way similar to Speculative Design. Dunne and Raby (2014) brought forward the concept of Speculative Design as a means of exploring and questioning ideas, speculating rather than solving problems through the design of everyday things, a way to imagine possible futures. The critical aspect of this concept lies in the imagination and possibility that a model represents, rather than its actual technical implementation. Houde and Hill (1997) defined a method for prototyping where each prototype would be designed to fulfill three categories for full integration: "look-and-feel," "role," and "implementation." In their design process, each prototype would lie somewhere between these three categories and slowly become integrated as each new iteration borrowed from the previous one. The "look-and-feel" prototype actualizes an idea only with respect to the sensory experience of using an artifact (Houde and Hill 1997). This type of prototype has no technical implementation; it just looks and feels like a real product. It is closely related to "vaporware," the term used in the product design industry for a product that is merely representational. It is announced to the public as a product, but is not available for purchase. Similarly, in vaporware, the aesthetic design and role are fully implemented but void of technical functionality. Finished products of Speculative Design, vaporware, and "look-and-feel" prototypes rely on aesthetics to express an idea. This approach to design resonates with the traditional view of the object in art, where it is devoid of practical function. Its utility has been stripped, and its form reincorporated as part of a concept in the context of art. ADA takes into account the same aesthetic design principles that surround these products of design. However, depending on their purpose, most ADA artworks are entirely integrated prototypes, including their wholly functional technical implementation. Works in this realm are concerned with the experience of incorporation, rather than merely hinting at the idea.

4.2 Public Presentation and Dissemination

Prototype, artwork, vaporware, venture capital product, and off-the-shelf are ambiguous terms in the crossfire of defining an idea once it becomes public through the media. Consumers and journalists identify and publicize these artworks as real products on their own, driven by their desire for new technological advancements. Often, however, Assistive Device artworks are intentionally designed to become viral internet products with the purpose of provoking public reactions

that measure consumer desire for body extensions and technological upgrades to our bodies. This "viralization" validates what these artworks represent: the human transcendental desire to augment and enhance our embodiment. The popularization of wearable technology through the Internet has become a method for artistic exhibition and public dialogue outreach. While art often seems stagnant in art galleries, the boards of social media are open channels for discussion. The click-economy has opened the doors to a new way for electronic media artists to engage the public. Prototypes that never made it to the production stage were once dusty models in the corner of a lab; now they have access to the same center stage as the big-player consumer market. While they may not profit from sales at all, as oftentimes an ADA is a single original prototype, they profit from publicity. This outreach method of testing consumer interest gauges the pulse for novel interactions. The dialogue created via an internet "product" is recorded and retrievable. This way of presenting ADA keeps a public account of the dialogue that these ideas generate.

Beyond the scope of digital outreach, conference demonstrations and art exhibitions are two different models for engaging public participants in a physical environment to test ADAs beyond lab experimentation. Taking prototypes out of the clinical environment of research facilities allows for a critical assessment of the work in a real-world environment with a wider range of the public. While the prototypes may suffer from technical instability, as they are often delicate original mechanisms, public presentation is a test of resilience physically, experientially, and conceptually. It is also a test of aesthetic appeal; even if an ADA prototype is being presented at an academic conference demonstration and not in an art gallery or exhibition, its aesthetic appeal will change the reception of the device. Design considerations will define the opportunity to engage more public attention, and thus receive more critical feedback for improvement. ADAs should strive for flexibility, creating interfaces that are inquisitive on a metaphorical and conceptual basis, while simultaneously demonstrating new applications in both design and technical implementation.

4.3 Open-Access Engineering

ADAs should strive to be open-source tools, made by borrowing from and contributing to scientific research while exploring the artistic capabilities of interfacing the body via atypical venues of physiology. These alternative displays are often created using open-hardware modules and hacking techniques. Technology that helps people with limited senses is a small industry of privatized technology; therefore, attaining these devices is difficult and expensive. Open hardware can facilitate making these technologies available to the people who need them the most, for as little as the cost of their materials. Also, sharing methods and techniques in this area of HCI development provokes more thinkers and tinkerers of open-source hardware to contribute to existing interfaces for information display, or to create new ones.

The availability of accessible hardware toolkits that promote open-source engineering has revolutionized device design. Platforms such as YouTube, Arduino, and Processing offer free and open access to electronics prototyping methods, appealing to non-engineers and spawning new methods and motives for tool design. Electrical and computer science reverse-engineers are

emerging from creative fields, interaction design among other artistic practices. Their inventive approach is more concerned with the humanistic aspects of tool design. They are focused on concept, meaning, public reception, and igniting dialogue. This strategy is concerned with the relational matters of the tool to the user, society, and its cultural underpinnings. These developers take a top-down approach, beginning with an idea and gaining insight by deconstructing its components into sub-systems as a way of reverse engineering. This technique poses more complex challenges at the beginning of the process, as the ideation phase will examine global questions and, in a way, create a new problem to be solved. The top-down approach is iterative; it generates numerous prototypes using qualitative and instinctual strategies, focusing on the essential aspects of an idea and addressing the necessary technical challenges as they arise. Typically, computer scientists and electrical engineers are trained to develop interfaces using a bottom-up approach. They have a strong foundation of scientific principles and highly data-driven testing methods, where the quantifiable aspects of interface inception, production, testing, and deployment are the most important. This approach is flexible at the beginning of the process, starting with smaller parts of a system and building sub-sections to be joined into a complete model. A bottom-up approach may be more appealing to engineers who are interested in solving problems, rather than planning new problems to solve.

Both approaches to tool design have their particular benefits; top-down can provide preliminary observations on usability and gauge the interest in a novel interaction, while bottom-up yields stable, dependable, and efficient devices. ADA tools can be most effective when designed with a top-down approach, as there are now copious amounts of accessible information detailing methods and techniques for implementing microprocessor-driven device prototypes. The focus of development for ADA tools lies in questioning how to expand sensory awareness, concepts that are difficult to test with a bottom-up approach. The holistic view of multimodal perception encourages interface design to consider a unified vision that first proposes a new mode of information reception or sensorimotor adaptation.

4.4 Designing for Sensory Plasticity

We can arrive at the same understanding of an experience based on a few signals. Depending on the different goals of perception, various inputs can collect useful information; for example, to learn about our spatial surroundings one can use proprioception, audition, sight, and kinesthetic senses. Taste is not a very efficient way of learning about our spatial surroundings. Understanding the senses and how they are interpreted in the mind is crucial for defining new models of interaction and for provoking meaningful inquiry with ADA perceptual expansion devices. The human senses are divided into three main categories: mechanoreceptors (audition and somatoception), chemoreceptors (olfaction and gustation), and photoreceptors (vision). Most of the senses fall under the mechanoreceptor category, more specifically under the somatic sense, through which we feel thermoception (temperature), nociception (pain), and proprioception. This also includes the vestibular sense, our sense of balance located in the inner ear, one of the most finely tuned mechanoreceptor systems of the body. The perception of the

world around us is a delicate and balanced dance between our sensory and motor cortices. One interprets information, and the other mandates new functions, allowing the flux of our perception.

As established in the introduction, neuroscientists have begun to understand the brain as a plastic organ that allows for reorientation and re-establishment of processes. Marina Bedny, an MIT postdoctoral associate in the Department of Brain and Cognitive Sciences, says: "Your brain is not a prepackaged kind of thing. It doesn't develop along a fixed trajectory, rather, it's a self-building toolkit. The building process is profoundly influenced by the experiences you have during your development" (2011). Our brain is so elastic that we can add other senses to it. For example, surgically implanting magnets into the fingertips has become a new body modification trend because, with these implants, one can sense the intensity and shape of magnetic fields in close proximity. The brain cannot be compartmentalized, since there is not a specific part of the brain dedicated to processing magnetic information. Our brains are capable of interpreting new sets of available information from any kind of phenomenological stimulus, given the right tools. It is possible to plug and play with one's sensory field and achieve alternative pathways of perception. Participants engaged with sensory substitution or augmentative devices, while re-channeling or extending their senses, achieve a level of brain plasticity and machine-bio-cooperation. ADA interfaces aim to open new modes of perception for all users, striving to understand sentient physiological and psychological phenomena that expand the perceptual potential of new interfaces for computing media.

Interfacing the body is the most delicate aspect of this study because it intimately examines the cultural and biological implications of physiology. Taking safety precautions is imperative; as this research borrows from a multitude of conceptual perspectives and scientific methods, incorporating precautions during experimental design is the only way to engage in productive creative inquiry. Taking a playful view of performance allows for a certain malleability between willingness to participate and curiosity. This egocentric approach to experiential art engages the viewer at a very personal physiological level; beyond vulnerability, it simultaneously provokes an urge of curiosity to experience something new. These devices are conceptualized or developed in accordance with perception and sensory substitution principles. Although not all ADA has complete functionality, sometimes existing only as vaporware, the implementation of the technical aspects of experiential design gives more insight into the quality of an experience. Data-oriented research gives insight into the perceptual performance of participants while wearing an ADA interface. When the functionality of the device is fully, or at least partially, implemented, more information can be gathered through psychophysical experimentation. There are well-established methods in this area for finding how a stimulus may be perceived by a human user. These types of experiments usually measure the perceptual accuracy of the participant while a stimulus is being transduced: just noticeable difference (JND) and absolute threshold, among others. A common example of an absolute threshold psychophysical experiment is the typical hearing test. In this experiment, a participant is presented with selected frequency tones within the audible spectrum in an increasing or

decreasing order, and asked to report each tone they discern, thereby arriving at their own particular range of audition.

The ADA is a model for experimentation with non-invasive devices that can supplement or substitute our general and current perceptual abilities. Many times, the functionality of these devices derives from existing scientific research on perception. It incorporates AT with a top-down approach that deconstructs the design of common devices and appropriates some of their functions to be transduced to other sensory modalities. It explores how AT and consumer technology can merge, and to what extent they can be adopted by common users via commodification. ADA inquires about future cross-modal applications for wearable devices. In understanding the advantages and disadvantages of the normal brain and the disabled brain lie many unexplored traits that can be useful for survival. Designing devices that facilitate extraordinary ability, or that serve as training apparatuses for further engraving alternate paths for neuronal operations, such as seeing with tactile information, is one path to discovering what non-obvious evolutionary traits our species might benefit from.

4.4.1 Adopted Assistive Displays: Touch and Listen

Tactile displays are an example of accessible technology that has made a leap into popular applications. Braille display devices and vibrotactile stimulation, also used in event simulations, have been used to help blind people interface with computational media. Now we see mobile devices becoming reactive to touch by emitting a vibration through a motor previously used solely for incoming calls in silent mode, as a display method to show the user that an icon has been selected. Soon we will begin to see haptic simulation effects in mobile consumer electronics that create textures and different somatic stimuli with localized, matrix-generated vibrations and enhanced braille displays, rather than the single motor in current devices. It will be greater than the difference between black-and-white and color television; it will be, perhaps, the difference between TV and a flashlight.

Voice command and recognition is another emerging form of AT that is permeating popular applications. Previously, the main users of audio computer UIs such as screen readers were the visually impaired. Now, we can see regular people voice-commanding their phones through a Bluetooth earpiece, calling their virtual assistants, whether it is Siri for Apple devices, Cortana from Windows, or Alexa, the home voice command personal assistant from Amazon, asking them to "Please, turn off the air-conditioning." What other perceptual methods are useful for interface design?

5. Implementation: Case Studies

This section describes the design process, psychophysical evaluation, and future applications of the two case studies: Echolocation Headphones and IrukaTact Glove. These case studies are examples of ADA tools created specifically to increase the plasticity of spatial perception. In both cases, we inhibited the vision of experiment participants and members of the public when they engaged the translating interfaces, whether they were experiencing audible echolocation or the echo-haptic translation. This inhibition of the visual field is what allows the sensory substitution process to

begin to take place, allowing ADA participants to embody a new sensory architecture by which to engage in alternative visuospatial imagery.

The Echolocation Headphones are an ADA tool that aids human echolocation, namely the location of objects and the navigation of space through audition. The interface is designed to substitute the participant's vision with the reception of a focused sound beam. This directional signal serves as a scanning probe that reflects the sonic signature of a target object. Humans are not capable of emitting a parametric echo, but this quality of sound gives the participant an auditory focus, similar to the focal point of vision. In the Echolocation Headphones case study, we define the design parameters and technical aspects of the tool and analyze the perceptual performance of participants while wearing it. This tool is designed to aid human echolocation and facilitate the experience of sonic vision, as a way of reflecting on and learning about the construct of our spatial perception.

The IrukaTact haptic module is a wearable device for underwater tactile stimulation of the fingertips. It utilizes an impeller pump that suctions surrounding water and propels it onto the volar pads of the finger, providing a hybrid feel of vibration and pressure stimulation. In this case study, we evaluate the performance of this tool by conducting a series of experiments, such as testing the displayed force and flow rate, and the participants' absolute threshold and perceptual resolution while wearing the device. IrukaTact has an echo-haptic utility when paired with a sonar sensor, transposing distance data from an ultrasonic range-finder into haptic feedback and giving users a quasi-parallel sense of touch. The effectiveness of this translation has been tested in two ways: DemoTank units and an IrukaTact sonar directional torch for aqueous environments. This cross-modal transaction of visuospatial information has assistive capabilities for aquatic environments, where human vision is mostly hindered or disabled by those conditions. This study concludes with the design of the IrukaTact glove as an Assistive Device Art tool.

Both the Echolocation Headphones and the IrukaTact Glove are tools that emerge from assistive technology and go beyond restoring function, as a way of expanding the sensory array of computational displays. This section expands on each case study while relating its provenance and performance to the ADA conceptual and psychophysical approach to tool design.

5.1 Echolocation Headphones: Seeing Space with Sound

Echolocation Headphones are a pair of opaque goggles which disable the participant's vision. The device emits a focused sound beam which activates the space with directional acoustic reflection, giving the user the ability to navigate and perceive space through audition. The directional properties of parametric sound provide the participant a focal echo, similar to the focal point of vision. This study analyzes the effectiveness of this wearable sensory extension for aiding auditory spatial location in three experiments: optimal sound type and distance for object location, perceptual resolution by just noticeable difference, and goal-directed spatial navigation for open pathway detection, all conducted at the Virtual Reality Lab of the University of Tsukuba, Japan. The Echolocation Headphones have been designed for a diverse participant base.
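The linear echo-haptic transposition described above can be sketched as a simple mapping from range reading to pump intensity, so that nearer obstacles produce stronger jet pressure on the fingertip. The range limits and PWM scale below are assumed for illustration; they are not the IrukaTact's actual calibration values.

```python
# Assumed sensing and actuation constants (illustrative, not measured).
MIN_RANGE_CM = 2      # closest distance the sonar resolves
MAX_RANGE_CM = 60     # farthest distance, mapped to zero output
PWM_MAX = 255         # full duty cycle for the impeller pump driver

def distance_to_pwm(distance_cm):
    """Linearly transpose sonar distance into pump power: near -> strong."""
    # Clamp the reading into the mapped range.
    d = max(MIN_RANGE_CM, min(MAX_RANGE_CM, distance_cm))
    # Inverted linear mapping, so pressure rises as distance falls.
    fraction = (MAX_RANGE_CM - d) / (MAX_RANGE_CM - MIN_RANGE_CM)
    return round(PWM_MAX * fraction)

print(distance_to_pwm(2))    # 255: object at the sensor -> full pressure
print(distance_to_pwm(60))   # 0: object at the range limit -> pump off
```

On an actual microcontroller, the returned value would feed a PWM output driving the pump's motor; the clamp also guards against spurious out-of-range sonar readings.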

They have both the potential to aid auditory spatial perception for the visually impaired and to train sighted individuals in gaining human echolocation abilities. Furthermore, this Assistive Device artwork instigates participants to contemplate the plasticity of their sensorimotor architecture.

5.1.1 Device Design

The priority for this interface is to place the sense of hearing at the center of the tool design. The parametric speaker was attached to the single square lens of a pair of welding goggles. Because this device performs sensory substitution from vision to audition, appropriating the functional design of glasses is a natural choice that prevents sight and replaces it with a central audible signal. Furthermore, the signal for an echo usually originates from the frontal direction of navigation, such as from the mouth of a human echolocation expert. Welding goggles are designed to protect the wearer from harsh light, which makes them attractive for the acoustic spatial training necessary for sighted subjects. In informal settings, the goggles purposefully do not entirely deprive the wearer of sight, so that sighted participants can get accustomed to spatial mapping from vision to audition. However, when used for experimental purposes, the protective glass is covered with black tape to block the wearer's vision completely. Giving the participants the freedom to explore their surroundings is a crucial interaction for experiencing audio-based navigation and object location.

5.1.1.1 Mechanism: Technical Aspects

The main component of the Echolocation Headphones (Fig. 1) is a parametric speaker that uses a focused acoustic beam. The wavelength of an audible sound wave is omnidirectional and larger than that of an ultrasonic wave. The parametric sound is generated from ultrasonic carrier waves, which are shorter and of higher frequency, above 20 kHz. When two intermittent ultrasonic carrier waves collide in the air, they demodulate, resulting in a focused audible beam.

Figure 1: Echolocation Headphones front view (left) and back view (right).

This signal becomes perceivable in the audible spectrum while retaining the directional properties of ultrasonic sound. For example, one ultrasonic wave is 40 kHz; the other is 40 kHz plus the added audible sound wave. As shown in Fig. 2, the red wave is inaudible to the user, while the blue wave returns from the wall as an acoustic signal. It is possible to calculate and affect the spatial behavior of sound by accounting for the time intervals between sonic signatures (Haberken 2012). The original version of the parametric speaker is the Audio Spotlight by Holosonics, invented in the late 1990s by Joseph Pompei (1997).

The speaker used in the Echolocation Headphones device is the Japanese TriState (2014a) single directional speaker. It has 50 ultrasonic oscillators with a transmission pressure of 116 dB (MIN) and a continuous allowable voltage at 40 kHz of 10 Vrms. The transducer model is AT40-10PB3, made by Nippon Ceramic Co., and the transducers have a maximum rating of 20 Vrms (TriState 2014a). These transducers are mounted on a printed circuit board (PCB) and connected to the driver unit. The driver includes a PIC microcontroller and an LMC662 operational amplifier. The operating voltage for this circuit is 12 V, with an average current draw of approximately 300 mA and a peak of about 600 mA. It has a mono 1/8'' headphone input compatible with any sound equipment. The sound quality outputted from the TriState parametric speaker is in the audio frequency band between 400 Hz and 5 kHz. The sonic range of the speaker is

Figure 2: Echolocation Headphones: Sound Interaction
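The self-demodulation principle described in this section can be illustrated numerically. The sketch below sums two ultrasonic carriers 1 kHz apart and applies a squared nonlinearity, the simplest stand-in for the nonlinear behavior of air at high sound pressure; the audible difference tone then appears in the spectrum. The 1 kHz offset and the squared-field model are illustrative assumptions, not measurements of the device:

```python
import numpy as np

FS = 192_000          # sample rate high enough to represent a 40 kHz carrier
DUR = 0.05            # 50 ms excerpt
t = np.arange(int(FS * DUR)) / FS

f_carrier = 40_000.0  # ultrasonic carrier (Hz)
f_audio = 1_000.0     # audible offset (Hz); value chosen for illustration

# Two ultrasonic waves: the carrier, and the carrier offset by the audio frequency.
wave_a = np.sin(2 * np.pi * f_carrier * t)
wave_b = np.sin(2 * np.pi * (f_carrier + f_audio) * t)

# Air behaves slightly nonlinearly at high sound pressure; squaring the
# summed field is the simplest model of that nonlinearity.
demodulated = (wave_a + wave_b) ** 2

# Inspect the spectrum: a strong component appears at the 1 kHz difference.
spectrum = np.abs(np.fft.rfft(demodulated))
freqs = np.fft.rfftfreq(len(demodulated), 1 / FS)
audible = freqs < 20_000
peak_hz = freqs[audible][np.argmax(spectrum[audible][1:]) + 1]  # skip DC
print(round(peak_hz))  # 1000
```

All other products of the nonlinearity (the doubled and summed carriers near 80 kHz) stay outside the audible band, which is why the beam retains the carrier's directionality while the listener hears only the difference tone.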

Figure 3: First Iteration of the Echolocation Headphones with Augmented False Auricles.

detectable from tens of meters (TriState 2014b).

This device is battery operated with a lithium cell of 12 V and 1 A, kept in a plastic enclosure with an on/off switch attached to the back of the strap of the welding goggles. The battery enclosure and the driving circuit are connected to an MP3 player or another sound-outputting device, which could be a computer or a Bluetooth sound module. The electronics of the driving circuit are exposed; however, there is another version of these headphones that includes two rounded, concave plates placed behind the ears of the wearer, to which the driving circuit is attached and concealed (Fig. 3). This auricle extension had a sonic amplification effect and helped the recollection of sound. An echolocation expert mentioned that the plates prevented him from perceiving the sound from all directions, neglecting the reflections behind him, and therefore this feature was removed.

5.1.1.2 Sound Characteristics

These headphones provide the wearer with focal audition as they scan the surrounding environment. The differences in sound reflection inform a more detailed spatial image. This scanning method is crucial for perceiving the lateral topography of space. The constantly changing direction of the sound beam gives the user information about their spatial surroundings through hearing the contrast between acoustic signatures. The MP3 player connected to the Echolocation Headphones is loaded with a track of continuous clicks and another of white noise. Experimentation and prior art yielded white noise as the most effective sound for this echolocation purpose. When demonstrating his echolocation method to an audience, Kish (2011) created a ''shh'' noise to show the distance of a lunch tray from his face. White noise works well because it provides an extended range of tonality at random; this eases the detection of level and speed changes. Beyond its applicability to navigation, this tool is also useful for differentiating material properties such as sound dampening.

5.1.2 Experiments: Performance evaluation of Echolocation Headphones

The Echolocation Headphones were evaluated quantitatively through three experiments: optimal sound type and distance for object location, perceptual resolution by just noticeable difference, and goal-directed spatial navigation for open pathway detection. The device was also evaluated qualitatively in art exhibitions, where participants were able to use it freely and were asked about their experience in retrospect.

5.1.2.1 Experiment I: Object location: optimal sound type and distance

This first experiment aimed to find the optimal sound type and distance for novice echolocation in detecting an object while using the Echolocation Headphones. The participants were asked if they perceived an object placed at six distance points from 0 to 3 m in segments of 0.5 m. The three sound types compared were slow clicks, fast clicks, and white noise. Each position was questioned twice for each sound type, for a total of 12 trials per sound type and 36 trials per participant. This experiment took place at the VR Lab of

Empowerment Informatics at the University of Tsukuba, Japan, during July 10th, 12th, and 31st, 2016.

5.1.2.1.1 Participants

There were 12 participants between the ages of 22 and 31, with a median age of 23 years, of which two were female and ten were male. All participants reported having normal or corrected-to-normal vision, and no participants reported previous experience with practicing echolocation.

Figure 4: Object location, Experiment I set-up

5.1.2.1.2 Apparatus

This experiment took place in a room, without any sound dampening, of approximately 4.3 m × 3.5 m × 3.6 m. Participants were asked to sit on a chair with a height of 50 cm in the middle of the room and wear the Echolocation Headphones device (Fig. 4), which covers their eyes, thus disabling their visual input. The object which they were detecting was a rolling erase board with a height of 120 cm and a width of 60 cm. The sounds were created using a MacBook Air 2.2 GHz Intel Core i7 with an output audio sample rate of 44100 Hz and a block size of 64 samples, and the software used was Pure Data version 0.43.4-Extended. The computer was connected to the Echolocation Headphones by a 1/8-in. headphone jack in real time.

Figure 5: a Amplitude modulation graph of the 4-kHz sound wave decay used for emitting both fast click and slow click sound types (taken from our Pure Data patch). b Amplitude spectrum in decibels graph (generated with the ''pad.Spectogram.pd'' Pure Data patch)

5.1.2.1.3 Audio characteristics

The sonic signature for the noise used was a

standard noise function of Pure Data (noise~), generated by producing 44100 random values per second in the range of −1 to 1 of membrane positions (Kreidler 2009). White noise is often used in localization tasks, including devices that guide people to fire exits in case of emergency (Picard et al. 2009).

The sonic signature of the clicks follows some essential features from the samples used by Thaler and Castillo-Serrano (2016) in a study where they compared human echolocation mouth-made sound clicks to loudspeaker-made sound clicks. They used a frequency of 4 kHz with a decaying exponential, separated by 750 ms of silence. Both the fast and slow clicks of our experiment modulate a decaying 4-kHz sound wave with a total duration of 200 ms; 250 ms of silence separated the rapid clicks, and 750 ms the slow clicks, similar to the experiment mentioned above.

Figure 5a shows the decay of the 4-kHz sound wave created by an envelope generator using an object that produces sequences of ramps (vline~). The series used to generate the envelope was as follows: 10-0-0.1, 5-0.2-0.3, 2.5-0.4-0.6, 1.25-0.8-1.2, 0-2.4-2.4, and 0-0-(750 or 250, depending on fast or slow click). The first number is the amount by which the amplitude increases. The second is the time that it takes for that change to occur, in milliseconds. The third is the time that it takes to go to the next sequence from the start of the ramp, also in milliseconds; e.g., ''5-0.2-0.3'' means that the sound will ramp up to 5 in 0.2 ms and wait 0.3 ms from the start of the ramp (Holzer 2008). Figure 5b shows the amplitude spectrum of the click, both fast and slow, in
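The envelope triples above can be turned into a small synthesis sketch. The interpretation of the (target, ramp, onset) scheduling follows the description in the text; the exact vline~ semantics of the original patch are assumed rather than reproduced:

```python
import numpy as np

FS = 44_100  # audio sample rate used in the experiments

def click(gap_ms):
    """One experiment click: a 4 kHz tone shaped by the vline~-style
    envelope triples (target, ramp_ms, onset_ms) listed in the text,
    followed by silence until the next click onset at gap_ms."""
    ramps = [(10, 0.0, 0.1), (5, 0.2, 0.3), (2.5, 0.4, 0.6),
             (1.25, 0.8, 1.2), (0, 2.4, 2.4)]
    n = int(FS * gap_ms / 1000)          # samples until the next click
    t_ms = np.arange(n) * 1000.0 / FS
    env = np.zeros(n)
    level = 0.0
    for target, ramp_ms, onset_ms in ramps:
        start, end = onset_ms, onset_ms + ramp_ms
        seg = (t_ms >= start) & (t_ms < end)
        if ramp_ms > 0:  # linear ramp from the previous level to the target
            env[seg] = level + (target - level) * (t_ms[seg] - start) / ramp_ms
        env[t_ms >= end] = target        # hold the target after the ramp ends
        level = target
    tone = np.sin(2 * np.pi * 4000.0 * t_ms / 1000.0)
    return env / 10.0 * tone             # normalize the peak target (10) to 1.0

fast = click(250)   # rapid clicks: onsets 250 ms apart
slow = click(750)   # slow clicks: onsets 750 ms apart
```

Only the silence between onsets differs between the two click types; the decaying 4-kHz burst itself is identical, which is what makes the click rate the sole variable in the comparison.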

Figure 6: Object detection by distance and sound type graph depicting the object detection accuracy of participants

decibels.

5.1.2.1.4 Procedure

While wearing the Echolocation Headphones, the participants were shown the control stimulus, the sound of the room with no object, for each of the three sound types. Once the experiment began, they were presented with one of the sound types, selected at random, to begin the object location task. The participants were asked to reply ''yes'' if they sensed an object or ''no'' if they did not sense it.

The board was moved to a distance point chosen at random, and each of these distance points was questioned twice. Once all distance points had been completed twice, another sound type was chosen at random and the same object location task was repeated. The participants wore noise-canceling headphones with music playing at a high volume, adjusted for their comfort, every time the experimenter moved the object to another location, in order to prevent other sonic cues from informing their perception of the object's location. The total duration of the experiment was between 30 and 45 min, depending on the number of explanations which the participant required.

5.1.2.1.5 Evaluation

The performance accuracy of participants was measured by the number of total correct answers divided by the number of trials per distance point. In this experiment, it is clear that the optimal echolocation distance, based on the average performance of the 12 participants, is between 0 and 1.5 m, with 1 m yielding the best results for all sound types (Fig. 6). While the slow click sound type also reached a perceived accuracy average of 88%, it underperformed the other sound types with an overall average of 64.14%. The performance of participants while detecting the object with the fast click sound type had an accuracy average of 96% for both 0.5 and 1 m, with an overall performance accuracy average of 80.14%. The noise sound type outranked all other sound types with a performance accuracy average of 85.67%, with the best detectable range being 96% at 1 m, while 0.5 m and 1.5–2 m had a 92% performance accuracy average.

5.1.2.2 Experiment II: Just noticeable difference

The goal of this experiment was to find the perceptual resolution of the Echolocation Headphones by measuring the sensed difference of distance between two planes through a just noticeable difference (JND) standard procedure.

The experiment was conducted for two reference distances, or standard stimuli, of 1 and 2 m. For each reference distance, the object on the right remained in a static position (1 or 2 m) from the participant, and the second object was moved back and forth. The moving object (MO) started at either 20 cm less or 20 cm more than the static object (SO) (e.g., static object at 1 m; moving object at 1.20 m or 80 cm). The starting distance of the MO was selected at random from the above-mentioned starting points, and the MO was moved in increments or decrements of 2 cm. This experiment was conducted at the VR Lab of Empowerment Informatics at the University of Tsukuba, Japan, on April 11th, 20th, and 30th, 2017.

5.1.2.2.1 Participants

There were six participants between the ages of 22 and 35, with a median age of 31 years,

of which one was female and five were male. All participants reported having normal or corrected-to-normal vision. Only three of the participants reported previous experience with practicing echolocation, since they had participated once in a previous experiment wearing the Echolocation Headphones.

5.1.2.2.2 Apparatus

Figure 7: JND, Experiment II Set-up.

This experiment was conducted in a room without any sound dampening, approximately 4.3 m long, 3.5 m wide, and 3.6 m high. Participants were asked to sit on a chair with a height of 50 cm at one of the long ends of a table measuring 1 m × 3.4 m × 0.76 m, located in the middle of the room (Fig. 7). The sound was generated from a recorded sound file of white noise, played through an MP3 player connected to a short end-to-end headphone jack. The MP3 player was a FastTech mini device with a TransFlash (microSD) memory card, model DX 48426. The objects were two boards of 29.7 cm × 42 cm of medium-density fiberboard (MDF) in portrait orientation. They included an acrylic stand on the back that allowed them to stand perfectly perpendicular to the table surface.

5.1.2.2.3 Procedure

The participants were asked to wear the Echolocation Headphones device, which completely covers their vision. Before the experiment began, the participants, while wearing the device, were shown the control stimuli: the MO and SO at the same distance for each reference point of 1 and 2 m, the MO 20 cm further than each reference point, and the MO 20 cm closer. The experimenter explained the task to the participants: to reply ''yes'' if they noticed a difference between the two objects or ''no'' if they sensed no difference. They were also asked not to consider their previous answers, since the objects would be moved at random.

For each task, the participant was asked if they detected a difference in distance between the objects, and depending on their answer, the MO would be moved towards the SO or away from it. If the participant replied ''Yes'', meaning that they detected a difference, the MO would be adjusted by 2 cm in the direction of the SO, and in the direction away from the SO if they replied ''No''. The placement of the MO would change at random between tasks, picking up from the last answer, alternating between the further starting point and the closer starting point. Once the answers from the participants reached a sequence of five affirmative and negative answers in both directions, the experiment concluded. The total duration of the experiment varied from 30 to 60 min, depending on the ability of the participant to sense the difference in distance between the objects.
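The back-and-forth adjustment described above is essentially an adaptive (1-up/1-down) staircase. A minimal sketch of that logic, with a simulated observer and a simplified stopping rule (both are assumptions for illustration, not the exact experimental criterion):

```python
def simulate_staircase(threshold_cm, start_cm=20.0, step_cm=2.0, reversals=5):
    """Sketch of the up/down rule: a simulated observer answers ''yes''
    whenever the MO-SO separation exceeds its true threshold. The run
    stops after a fixed number of reversals and returns their average,
    a simplification of the thesis' five-answer criterion."""
    sep = start_cm          # current MO offset from the SO, in cm
    prev = None
    reversal_points = []
    while len(reversal_points) < reversals:
        yes = sep > threshold_cm             # ''difference detected''
        if prev is not None and yes != prev:
            reversal_points.append(sep)      # answer flipped: record a reversal
        sep += -step_cm if yes else step_cm  # ''yes'' moves the MO toward the SO
        prev = yes
    return sum(reversal_points) / len(reversal_points)

# An observer with a true 6 cm threshold converges near that value.
estimate = simulate_staircase(threshold_cm=6.0)
print(estimate)  # 6.8
```

With a 2 cm step, the staircase oscillates around the observer's threshold, which is why the step size bounds the finest resolution the procedure can report.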

5.1.2.2.4 Evaluation

Figure 8: Echolocation Headphones just noticeable difference experiment results' graph

The results of the experiment were analyzed by finding the average distance at which the participants sensed the difference between the two objects (MO and SO) (Fig. 8). The final distance of the MO further from the SO was calculated and averaged over all trials, and the final distance of the MO closer to the SO was averaged over all trials. These two averages were then combined, returning the final average of perceptual resolution for both constant stimuli of 1 and 2 m. The top standard deviation was selected from the closer MO average, and the bottom was selected from the further average, for each of the constant stimuli.

In Fig. 8, the average perceived distance resolution for both constant stimuli is around 6 cm from the target distance to the constant stimuli. While there is a slightly more acute resolution for the 2-m constant stimulus, one can observe that the lower standard deviation for both constant stimuli distance points is around 2 cm, which is the shortest distance selected for the experiment as the lowest perceived resolution. As for the top standard deviation, the average for the 1-m constant stimulus is around 13 cm, while for the 2-m constant stimulus it is around 11 cm. It can be concluded that there is a slight acuity gain when sensing the distance between objects at 2 m rather than 1 m. However, the difference is very small, almost negligible; therefore, it can be concluded that there is a 6-cm perceptual resolution for distances between 80 and 220 cm.
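The averaging described above can be made concrete with a short sketch over hypothetical final-offset values (the numbers below are invented for illustration; only the roughly 6 cm outcome mirrors the reported result):

```python
import statistics

# Hypothetical final-offset data (cm), for illustration only; the thesis
# reports a ~6 cm mean resolution, with top standard deviations of about
# 13 cm (1 m) and 11 cm (2 m).
further_final = [6, 8, 4, 10, 6, 8]  # final MO distances beyond the SO
closer_final = [4, 6, 8, 6, 4, 6]    # final MO distances short of the SO

# The analysis averages each direction, then combines the two averages
# into a single perceptual-resolution figure for the constant stimulus.
resolution = (statistics.mean(further_final) + statistics.mean(closer_final)) / 2
top_sd = statistics.stdev(closer_final)      # band above the mean (closer runs)
bottom_sd = statistics.stdev(further_final)  # band below the mean (further runs)
print(round(resolution, 1))  # 6.3
```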

5.1.2.3 Experiment III: Task-oriented path recognition

Figure 9: Open path recognition, Experiment III set-up

The aim of this experiment was to measure the ability of participants to find an open path while wearing the Echolocation Headphones. The experiment was set in a hallway where participants walked while scanning their surroundings, with the objective of finding an open door. Three doors were used in the experiment, located at various placements along the way, which could be open or closed in random combinations of one, two, and three open doors, for a total of three trials per participant. This experiment was conducted at the VR Lab of Empowerment Informatics at the University of Tsukuba, Japan, in May of 2017.

5.1.2.3.1 Participants

There was a total of six participants between the ages of 24 and 35, with a median age of 31 years, of which three were female and three were male. All participants reported having normal or corrected-to-normal vision. Of the six participants, only three had previous experience with echolocation while wearing the Echolocation Headphones.

5.1.2.3.2 Apparatus

The experiment was conducted in a hallway with concrete floors, MDF board walls, and a metal flashing ceiling with exposed fiber insulation bags. The hallway was not fitted with any kind of sound dampening materials, and its measurements were around 10 m × 1.4 m × 3.6 m (Fig. 9). The sound was generated from a recorded sound file of white noise, played through the same FastTech mini MP3 player, model DX 48426, as in the previous JND experiment. The doors to be located were around 1 m in width and 2.1 m in height.

5.1.2.3.3 Procedure

The participants were asked to wear the Echolocation Headphones, which completely cover their vision. The participants, while wearing the device, were shown the control stimuli and a sample of the task. They stood in front of a door which was open and were then guided towards the wall, to sense the audible difference between an open path and a wall obstruction. They were given the freedom to move from side to side towards the open path and the obstruction. Once they felt comfortable with their ability to determine the difference in stimuli, the experiment began and they received the instructions for the task. They were asked to walk through the hallway, slowly scanning their surroundings. They were given two options on how to scan the hallway. One way was to walk sideways, scanning one wall first and then the opposite wall on their way back from the end of the hallway. The second option was to scan the hallway by turning their head from left to right, back and forth, seeking the feedback of each side one step at a


Figure 10: Open path recognition, perceived number of open doors experiment results graph

time. The participants were given this freedom to slow their walking speed because, by walking forward too quickly, they missed a few cues. Once their walking style was determined, they were asked to find the open doors and to point in their direction, saying ''This is an open door.'' Each participant was tested through three trials, one for each combination of one, two, and three open doors, selected at random.

5.1.2.3.4 Evaluation

In the Open Path Recognition graph in Fig. 10, the average number of perceived open doors is very close to the ideal number of perceived open doors. This means that the participants were able to accurately assess the difference between a path and an obstacle. The top margin of error is the average number of false doors that were perceived for all trials, while the bottom standard deviation was retrieved from the number of doors that participants failed to recognize for each trial.

Of all participants, only two perceived phantom doors. From observing their behavior, it can be determined that those false positives were due to their scanning of the wall at an angle, meaning that the angle provided them with a decrease in volume from the echoing signal, which they had already associated with an open path. This phenomenon should be addressed, as it dampens the performance of participants wearing the tool, albeit only a few of them were unable to detect the difference between an open door and an angular reflection of the returning signal. Beyond this anomaly, the overall average performance of the participants was almost perfectly in line with the ideal performance. It can be concluded that

the Echolocation Headphones are a helpful tool for aiding navigation in circumstances where audition is required and vision is hindered.

5.1.3 Social evaluation: art exhibition interactions

The Echolocation Headphones have been shown in different kinds of settings, from museum installations to conference demonstrations. The device has also been tested by a wide range of participants, who through playful interaction gain a new perspective on their spatial surroundings.

5.1.3.1 Novice echolocation behaviors

During most of the exhibition settings, attendants were invited to try the Echolocation Headphones. Each time, the device was paired with an audio player, either a cell phone or an MP3 player. The sound was always white noise, either from an MP3 file or from a noise-generating application such as SGenerator by ScorpionZZZ's Lab (2013). These exhibiting opportunities allowed the Echolocation Headphones to be experienced in a less controlled environment than the lab experiments.

Participants were given simple guidelines and a basic explanation of how the device functions. In this informal setting, the behavior of participants was similar, and they were usually able to identify distance and spatial resonance within the first minute of wearing the device. They would often scan their hands, walk around, and, when finding a wall, approach it closely, touch it, and gauge the distance. After wearing the device for a while longer, they became more aware of the sonic signatures of the space around them, navigating without stumbling into objects in their direct line of ''sight.'' Their navigation improved when they realized that the side-to-side and back-and-forth scanning technique was necessary to gather feedback on distance differentials. This is similar to a movement performed by the eyes, also known as saccades. Without these movements, a static eye cannot observe, since the fovea, the part of the eye that captures detail and allows high-resolution visual information to be perceived, is very small (Iwasaki and Inomara 1986). The scanning behavior was helpful in most situations, but given that the field of ''sight'' of the device is narrower than that of actual sight, there were some collisions. When there were objects in the way of participants that did not reach the direct line of the sound beam, such as a table, which is usually at waist height, they would almost always collide with it.

Figure 11: Exhibition showing the material differentiation task at the EMP Open Studio, Tsukuba, Japan

In one of the art exhibitions, the device was shown alongside three rectangular boards: acrylic with a thickness of 2.5 mm, corrugated cardboard of 3 mm, and acoustic

foam with an egg crate formation of 30 mm (Fig. 11). From observation, participants were able to distinguish materials based on their reflection, using sound as the only informative stimulus. They were most successful in finding the acoustic foam, naturally, since this material is very porous and its sound absorption properties are specifically manufactured for this purpose. Participants were able to detect the difference between materials based on the intensity change of the sound when echolocation is performed; the intensity of the audible reflection is crucial for gathering the properties of the target in space.

5.1.3.2 Participant feedback

The participants' comments when experiencing the Echolocation Headphones were mostly positive, e.g., ''The device was efficient in helping navigate in the space around me without being able to actually see.'' and ''It indicates when an object is nearby that reflects the sound back to the sensor.'' Another participant had an observation pertinent to the theories that apply to sensory substitution and brain plasticity, where these phenomena happen through a learning process and practice. This participant stated: ''It was interesting to listen to, I think it would take time to be able to use it effectively and learn to recognize the subtle details of the changes in sound''. When participants were asked about the white noise generated by the device, one stated: ''It actually becomes comforting after a while, because you know it's your only way of really telling what's going on around you''. Another comment was more direct, stating that the sound was ''Pretty annoying''. While white noise is very effective for audio location purposes, it can be jarring because of its frequency randomness. However, one participant noted a wonderful analogy from noise to space. He said that ''In blindness, one can finally see it all when it rains''. He meant that the noise of the seemingly infinite drops of rain falling to the ground creates a detailed sonic map of the environment.

5.1.4 Echolocation Headphones Conclusion

The Echolocation Headphones device is a sensory substitution tool that enhances the human ability to perform echolocation, improving auditory links to visuospatial skills. It is an Assistive Device artwork, designed within the interdisciplinary lines of engineering and art. This device shines a new light on AT and the expansion of its user base. It brings interaction, playfulness, and perceptual awareness to its participants. This concept popularizes, and makes desirable, technology that could prove useful for the blind community; however, it is designed to be valuable and attainable to everyone.

From an engineering perspective, this device presents a novel application for parametric speakers and a useful solution for spatial navigation through the interpretation of audible information. Utilizing the parametric speaker as an echolocation transducer is a new application, because these speakers are typically used to contain and target sound towards crowds; they usually serve as audio information displays for targeted advertising. This prototype also takes into account a trend within the study of perception, which is considering a multimodal perspective in contrast to the reductionist approach of studying sensory modalities independently.

The Echolocation Headphones have been designed not to focus on the translation of visual feedback directly into sound, but

rather as a tool that enhances the multimodal percept of space. This tool re-centralizes the user's attention on gathering audible instead of visual information to determine their location. Thus, their spatial perception remains the same; however, it arrives through a different sonic perspective. The Echolocation Headphones have been evaluated with psychophysical methods to investigate their performance and perceptual cohesion. According to the data gathered through the multiple experiments and exhibition demonstrations, the feasibility of sonic visuospatial location through the Echolocation Headphones as a training device for sighted individuals is positive. Utilizing sound as a means of spatial navigation is not imperative for sighted subjects, but this tool shows that the experience of sensory substitution is possible regardless of skill. This prototype exemplifies the ability and plasticity of the brain's perceptual pathways to quickly adapt from processing spatial cues from one sense to another.

5.2 IrukaTact: A Parallel Sense of Touch Underwater

The motivation behind this case study is to develop a wearable device for underwater haptic stimulation and to understand the perceivable range of this tool as a tactile display, as well as to pair this device with a sonar sensor for haptic rendering of underwater topographies. The mechanoreceptors on the volar pads of our fingers are very sensitive to light touch, and by using the viscosity of aqueous environments to propel force feedback, these haptic modules display a detectable mixture of vibration and pressure. The IrukaTact haptic probe uses a motor to collect water from its surroundings and is digitally controlled by a Pulse Width Modulation (PWM) signal. This signal varies the speed of a stream that disembogues through a small channel, pressurizing water onto the finger pads as shown in Figure 12. The sensation of this pressure is similar to placing a finger under a tap of running water. This sensation is a desirable quality in the attempt to artificially render tactile information, much more so than vibration alone. IrukaTact has an inherent vibration due to its motorized mechanism. The aim of this device is to focus on selectively stimulating the mechanoreceptors of the skin while providing slight muscular and skeletal pressure. The design of this device takes into account the need for rendering new types of haptic feedback. Vibrotactile stimulators are widely used, and according to Dr. Iwata (2008), they are relatively easy to produce. Vibrating motors are the most commonly utilized method for wearable haptic displays, often serving as a means of signaling, such as alert tones. However, there are many other types of selective skin stimulation, such as electrode, ultrasonic, and air jet displays, with the purpose of recreating virtual texture and an object's surface (Iwata 2008).

This section discusses the design process of this new underwater haptic actuation technique, and the aesthetic sensibility and technical implementation for creating this tool, considering its wearability and modular design. Furthermore, it describes a series of experiments that test the module's performance. The first two experiments yielded that the haptic module propelled approximately 0.9 L/min at its maximum speed and a force of 4,21005.84 N. Its display performance was tested by searching for the absolute threshold and the haptic perceptual resolution of users while wearing the device. The experiments conclude with an adaptation

of the IrukaTact module that includes a sonar module, testing the user's perception of distances inside of a pool, rendered underwater. User feedback was collected from various demonstrations and exhibitions where the DemoTank units were presented. Lastly, it discusses the next steps for improving this tool as an echo-haptic glove, for future applications of this technology as an assistive aid.

5.2.1 Underwater Haptics Prior Art

Emerging techniques for haptic actuation with liquids have high potential for virtual reality, among other applications. While vibration devices can easily become waterproof, utilizing water pressure as a medium to be sculpted by reactive systems can result in richer experiences for touch. Researchers are working with liquids for haptic feedback in different capacities. Some are activating the user's sense of touch with liquids in a dry environment, such as LiquidTouch, a typical touchscreen interface with a mounted water jet that sprays water, similar to a water gun, onto the user's finger to reaffirm activation of the screen (Richter 2013). Jorro Beat is a similar haptic interface, which synchronizes music beats to water modulation from a shower head, using the water spray to convey and enhance other information such as sound modulation (Hoshino 2015). Another interface is Yoshimoto's (2010) Haptic Canvas, which similarly projects colors onto a tub of a starch and water mixture, a non-Newtonian fluid, as a control method to change the viscosity of this particular fluid with a suction device attached to the fingertip. While Yoshimoto's interface is wearable, it relies on the properties of a specific fluid mixture and thus can only be used in a vat of this specific liquid. Another tub-oriented interface, created by Nakamura (2014), utilizes electric polarity to simulate direction and stimulus orientation using electric plates attached on four quadrants of a cylindrical reservoir of water. The IrukaTact fingertip haptic modules are wearable and are not limited to a specific liquid substance. They are useful in any aquatic environment, from a vat to a pool of water, or any other liquid, provided that it has a similar viscosity to water.

Furthermore, translating visuospatial information to haptic feedback, similar to the IrukaTact glove application, can be seen in the work of Cassinelli (2006), Haptic Radar, where ultrasonic probes placed around a headband inform vibrotactile actuators placed beneath each sensor, giving the user a surround haptic display of range data. Haptic feedback has also been used to enhance visual bathymetry map data by mapping the elevation change of the ocean floor to color indicators and to the StickGrip haptic device (Evreinova et al. 2012). Using this visuo-tactile hybrid method, they were able to improve navigation accuracy by 14.25% to 23.5% over a range of bathymetric data of 40–140 m (Evreinova et al. 2012). These submersible haptic modules contribute to the advancement of underwater haptic actuators, and they are innovative in their wearable design, as well as in providing non-restrictive force feedback with a mixture of vibration stimulation.

5.2.1 Device Design

Beyond its utility as an underwater display tool and its utilitarian applications, this work is an artistic exploration, a piece of Assistive Device Art (ADA), which comes from the Japanese movement of Device Art led by Dr. Hiroo Iwata (2014), a form of art blended within the fields of engineering and informatics, taking into account the

45 commercial capacities of digital reproduction. This fingertip haptic module is designed to Device Art and ADA are not always created house a small motor placed on top of the with the purpose of commercialization, but finger (Figure 12), and it’s equipped with an rather these works function in transcendence, impeller tip that suctions water from the providing more questions and inspire new surrounding environment entering through possibilities beyond the pure utility of devices. port (a) as depicted on Figure 12. The water is Assistive Device Artworks are suspended propelled through a channel that disembogues between the utilitarian purpose of technology, on to an opening (b) directed to the volar pad the drive for augmenting human abilities, and of the fingertip. The motor has a voltage of DC augment creative perspectives to learn about 3.7V no-load quiescent current: 0.04A, the no- our own physiology. IrukaTact strives to load rotational speed: 46000 ± 10% rpm, stall cultivate the drive for the curiosity which helps torque: 8g * cm, stall current: 1A, noise: 78dB, us better understand the impact and unwitting below vibration: 0.2cm/s or less, and its total applications of technology that assists humans length including the axis is: 18.4mm. Its speed to further develop their senses underwater. is controlled with an ATmega 328PU Arduino UNO microcontroller and through a PNP 2222 5.2.1.1 Aesthetic Design transistor. The maximum voltage at which the motor can operate while submerged Aesthetic sensibility was carefully considered underwater is 5.39 V. The cable connections for the design of this tool as much as the design are run through a waterproof protective of its functionality and usability. The look and silicone tube that passes through a holding ring feel of the tip design bridges the interest of as shown in Figure 12 (c), which fastens the people who are not familiar with haptic cable comfortable over the finger. 
The technology, and thus promotes haptic fingertip haptic module has been designed technology to a broader public. The visual using a 3D CAD online platform and printed design of this tool was inspired by its name with a high quality PolyJet 3D printer, which ‘iruka’, (イルカ) dolphin in Japanese, and the produces waterproof models. Attached to the look of jetpacks and other missile style motor is a small impeller that suctions the devices. surrounding water. For the syphoning mechanism, the six-blade impeller in Figure 5.2.1.2 Mechanism 13 is of 1 cm2.

Figure 12: Cross-section diagram of the IrukaTact haptic module mechanism.

Figure 13: Six-Blade Impeller Attachment for the Evolution Motor.

An underwater camera captured the suction mechanism of the fingertip module by letting a denser liquid settle to the bottom of a water bowl. Figure 14 shows the plume of the darker soy sauce entering the haptic module through the impeller pump opening. The soy sauce was used for demonstration of the suction mechanism only; during normal use the device works with the surrounding water.

Figure 14: Impeller fingertip suctioning denser liquid on the bottom to illustrate the function of the pump.

5.2.1.3 Prototyping Feel

The process of ideating new ways to translate haptic actuation yielded various models that could apply pressure, vibration, and displacement underwater. Figure 15 demonstrates the various iterations that were generated while prototyping the size of the fingertip haptic modules; the leftmost design in this image is the first iteration of the thimble housings.

Figure 15: Impeller Model Haptic Actuator.

Three kinds of thimble finger housings were designed to distinguish between the usability and actuation effectiveness of the motor placement (Figures 16, 17, 18). Under limited, intuitive testing of these fingertip housings, the underbelly model (Figure 16) was the most effective in terms of sensory actuation, providing a strong signal of both vibration and force, but the placement of the motor under the finger would interfere with the user's grasping functions. The vertical motor placement model (Figure 17) was the least effective in terms of haptic actuation, and its design would also impede the user's grasping movements. Finally, the top motor placement (Figure 18) provided a compromise between maximum usability, comfort, and sufficient haptic actuation, even though the underbelly model had the most effective haptic actuation of the three models.

Figure 16: Underbelly impeller fingertip thimble housing.

Figure 17: Cricket, vertical motor impeller fingertip actuator housing.

Figure 18: IrukaTact original top impeller fingertip thimble housing.

5.2.2 Experiments

While the motor placement and thimble housing design was an intuitive process of usability and comfort, the accuracy of signal perception was tested through a series of experiments: one measuring the force display performance, and three perceptual studies of absolute threshold (AT), haptic resolution, and distance detection and approximation.

5.2.2.1 Experiment I: Liters/min Feedback

This experiment seeks to assess the force feedback of the IrukaTact haptic module. These experiments were conducted at the EMP VRLab of the University of Tsukuba, Japan. A new version of this module was created where the disemboguing channel of the device is connected to a silicone tube with an inner circumference of 3mm. The maximum motor speed of the module was divided into five intensity levels (0-4), yielding the PWM values 0, 64, 128, 192, and 255, and voltages of 0 V, 1.80 V, 2.94 V, 4.27 V, and 5.39 V.

First, the liters per minute were calculated by setting the tube to discharge into a cup measuring milliliters. Each speed level was tested ten times in order to record the average liters per minute. This was done by programming a timer of 10 seconds and recording the amount of water released in each trial. The averages for each level are found in the data table of Figure 19. The haptic module propels approximately 0.9 L/min at its maximum speed.

Figure 19: Liters per Minute Data Table.

5.2.2.2 Experiment II: Force Feedback

The force feedback of the module was calculated by fastening the tube to a wall at a height of 4m, with the module submerged in a tank of colored water below it. A measuring tape was then placed next to the tube. Each of the same levels was assessed five times, by letting the liquid reach its stable height within the tube, recording this height in cm, and averaging the pipe head results shown in Figure 20.

Figure 20: Pipe Head Data Table (averaged pipe head in cm at probe intensity levels 1-4: 60.5, 113.74, 172.42, and 215.92).

Once the pipe head for each of the five levels was averaged, we used the following formula to deduce the force applied to the fingertips:

Pfluid = ρ g h = (1.00 x 10³ kg/m³) (9.8 m/s²) h (m) = N/m²

In this formula, we calculate the pressure of the water by multiplying (ρ · g · h), where ρ is the fluid density of water, g is gravity, and h is the height in meters where the fluid stabilized at each of the five intensity levels of the module. This yields the applied force in N/m², with a maximum force of 21,005.84 N/m², as can be observed in Figure 21.

Figure 21: Pressure Force Feedback Data Table (force in N/m² at probe intensity levels 1-4: 5,933.02, 11,199.19, 17,014.54, and 21,005.84).

5.2.2.3 Experiment III: Absolute Threshold

The first haptic perception experiment conducted was the absolute threshold test. This test allowed us to understand the minimum stimulus for signal detection, more specifically at which PWM point 100% of the participants can perceive a stimulus at all. This experiment was conducted at the EMP VRLab at the University of Tsukuba, Japan.
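The two reductions used in Experiments I and II above, averaging the ten timed discharges into L/min and converting the stabilized pipe head into pressure via P = ρgh, can be sketched as follows. This is a minimal illustration, not code from the thesis: the trial volumes are hypothetical stand-ins, and the heads are the chart-read averages from Figure 20, so the computed pressures differ slightly from the tabulated forces in Figure 21 due to rounding.

```python
RHO_WATER = 1.00e3  # kg/m^3, fluid density of water
G = 9.8             # m/s^2, gravitational acceleration

def liters_per_minute(trial_volumes_ml, trial_seconds=10):
    """Average discharge rate in L/min over repeated timed trials."""
    avg_ml = sum(trial_volumes_ml) / len(trial_volumes_ml)
    return (avg_ml / 1000.0) * (60.0 / trial_seconds)

def fluid_pressure(head_m):
    """Hydrostatic pressure P = rho * g * h, in N/m^2."""
    return RHO_WATER * G * head_m

# Hypothetical 10-second discharges at maximum speed (level 4, 255 PWM):
# 150 mL per 10 s averages to about 0.9 L/min.
print(liters_per_minute([150] * 10))

# Averaged pipe heads (cm) per probe intensity level, read from Figure 20.
heads_cm = {1: 60.5, 2: 113.74, 3: 172.42, 4: 215.92}
for level, head in heads_cm.items():
    print(level, round(fluid_pressure(head / 100.0), 2))
```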

5.2.2.3.1 Participants

There were seven participants between the ages of 22-35, with a median age of 24 years old, two of whom were female and five of whom were male. All subjects participated in one trial each, and three of them participated in two of their own accord. All of the participants were new to this probe and were first-time users. They were free to undergo the experiment with the hand and finger of their choosing. There was a total of 10 trials for this experiment.

5.2.2.3.2 Apparatus

The experiment consisted of an array of randomly sequenced signals from 0 to 64 PWM, forces of 0-5,933 N/m². The maximum signal point of 5,933 N/m² was selected for being the highest force of the first quarter of the maximum intensity of the motor, which is 255 PWM (21,005.84 N/m²). The signals were produced by a testing module, which consisted of an Arduino UNO, one PNP 2222 transistor, the IrukaTact haptic module, and one button which, when pressed, would provide the next random PWM signal. The signal had a time limit of 1000ms, after which the signal

Figure 22: Absolute Threshold results (signal force in N/m²).

would revert to 0 until the button was pressed again. The data from this experiment was recorded through the serial port, inscribing the PWM point in question as well as a '1' if the participant claimed a sensation and a '0' otherwise.

Each user was asked to wear the submersible haptic module on their finger and insert it inside a bowl of water in front of them. The participant was asked to say 'yes' if they felt a signal and 'no' if they could not feel anything. Each signal point was tested twice for each trial.

5.2.2.3.3 Results

There is a 50% signal perception rate between 2873.79 N/m² (31 PWM) and 3615.42 N/m² (39 PWM). There is a 95% consensus on the perception of the signal at both 5005.96 N/m² (54 PWM) and 5469.48 N/m² (59 PWM). It is our conclusion that the absolute threshold is 5562.18 N/m² (60 PWM), with 100% participant affirmation for this stimulus (Fig. 22).

5.2.2.4 Experiment IV: Haptic Resolution

This experiment consisted of finding participants' haptic resolution while wearing the probe, with the aim of understanding how many pressure values a participant can differentiate at random. The fingertip haptic actuator module can provide a maximum force of 21,005.84 N/m², divisible into 256 values of PWM from 0-255, but unless the signals are sequentially increasing or decreasing, most users may have trouble distinguishing between intensities. Hypothetically, a participant can distinguish between three intensities: none, low, and high; however, we were interested in learning whether a participant could distinguish between five different signals at random. This experiment was conducted at the Art|Sci Center at the University of California, Los Angeles, U.S.A.

5.2.2.4.1 Participants

For this experiment, we had 11 participants from ages 16-37, with a median age of 20 years old. There were four females and seven males, and all of them used the hand and finger of their choice. All of the participants were new to this haptic module and were first-time users. The participants were volunteers from a summer program at the university.

5.2.2.4.2 Apparatus

The signals to be presented in random order were chosen from the same five levels used in the liters/minute and force feedback experiments. These five levels displayed forces of 0 N/m², 5,933.02 N/m², 11,199.19 N/m², 17,014.54 N/m², and 21,005.84 N/m². The different signals were produced by a testing module that contained five buttons, one for each signal. The signal was not timed; it was produced continuously until the conductor changed to a new signal.

Each participant was asked to wear the submersible haptic module on their finger and insert it inside a large tank of water sitting beside them at arm's length. They were first shown the five different levels in a sequence increasing from 0 to maximum, and then decreasing in reverse order, and asked to say 'OK' every time they felt the level change. Once each participant was comfortable distinguishing between the five different levels sequentially, we proceeded to administer the levels at random, twice for each intensity level in question, and asked them to say which level they were feeling, from 0 to 4.
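The analysis of the absolute-threshold log described earlier (a PWM point plus a 1/0 detection flag per presentation) reduces to a per-point detection rate and the lowest point felt on every presentation. A minimal sketch with made-up responses, not the recorded data; the function names are ours:

```python
def detection_rates(log):
    """log: iterable of (pwm, detected) pairs, with detected 1 or 0.
    Returns {pwm: fraction of presentations reported as felt}."""
    totals, hits = {}, {}
    for pwm, detected in log:
        totals[pwm] = totals.get(pwm, 0) + 1
        hits[pwm] = hits.get(pwm, 0) + detected
    return {pwm: hits[pwm] / totals[pwm] for pwm in totals}

def absolute_threshold(log):
    """Lowest PWM point detected on 100% of its presentations."""
    rates = detection_rates(log)
    return min(pwm for pwm, rate in rates.items() if rate == 1.0)

# Hypothetical log: 20 PWM never felt, 40 PWM felt half the time,
# 60 PWM felt on every presentation.
log = [(20, 0), (20, 0), (40, 1), (40, 0), (60, 1), (60, 1)]
print(absolute_threshold(log))  # -> 60
```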

Figure 23: Haptic Resolution Accuracy.

Figure 24: Haptic Resolution: Perceived Intensity Level (average perceived probe intensity level per actual level: 1 → 1.4, 2 → 2.4, 3 → 3.0, 4 → 3.7).
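The per-level accuracy and average perceived level reported in Figures 23 and 24 follow from simple tallies over (actual, reported) trial pairs. A minimal sketch with made-up trials, not the experiment's data; the function name is ours:

```python
def per_level_stats(trials):
    """trials: iterable of (actual_level, reported_level) pairs.
    Returns {actual_level: (accuracy, mean_reported_level)}."""
    by_level = {}
    for actual, reported in trials:
        by_level.setdefault(actual, []).append(reported)
    return {
        level: (sum(r == level for r in reports) / len(reports),
                sum(reports) / len(reports))
        for level, reports in by_level.items()
    }

# Hypothetical trials: level 0 always identified, levels 2 and 4 confused once.
trials = [(0, 0), (0, 0), (2, 2), (2, 3), (4, 4), (4, 3)]
print(per_level_stats(trials)[0])  # -> (1.0, 0.0)
```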


5.2.2.4.3 Results

There were 22 trials for each of the five levels in question. Figure 23 shows the average accuracy percentage of all trials for each of the intensity levels presented during the experiment. The highest accuracy of 100% was achieved at level 0; every participant was absolutely certain that there was no sensation at all. The second highest accuracy rate of 77% was achieved by the most intense level 4, of 21,005.84 N/m² (255 PWM). Levels 1 through 3 presented less accuracy, with the median level 2 (128 PWM) being the least likely to be detected accurately, at 50%.

Figure 24 demonstrates that participants tend to aim higher for levels 1 and 2, with an average of 1.4 for level 1 and 2.4 for level 2. Level 3 has the highest accuracy average, and for level 4 participants aim lower, with an average of 3.7. This experiment concludes that most participants could discern the difference between none, low, and high, while still having trouble telling the difference between levels 1-3.

At the end of each trial participants were asked to elaborate on their experience, and to describe whether it was easier to know the level of intensity when it was presented to them in a descending or ascending sequence. All participants, except for one, agreed that presenting the stimulus sequentially would be more helpful. One participant added, '[It would be] more dramatic if it was in all your fingers. The jolt between the levels is also helpful,' referring to the motor quickly changing levels without a smooth transition.

5.2.2.5 Experiment V: Distance Assessment

The aim of this experiment is to assess the performance of this haptic probe while displaying linearly mapped data from an ultrasonic range-finder. The goal was to find whether participants were able to distinguish between distances, and gauge the meters of length they were sensing. It evaluated whether the IrukaTact haptic display can be useful for an echo-haptic application, and give users a parallel sense of touch underwater. This experiment was conducted at the half-Olympic-size athletic pool at the University of Tsukuba.

5.2.2.5.1 Participants

There were seven participants of ages 24-35, with a median age of 26 years old. Two of the participants were female and the rest were male. Four of the participants were new to this probe and three had participated in previous experiments.

5.2.2.5.2 Apparatus

Figure 25: IrukaTact Sonar Tip.

There were two experimenters, one who assisted the participants inside of the pool, and another who recorded the participant


Figure 26: Pool Distance Experiment Set-up Diagram (pool of 20m by 14m; three testing points with surrounding distances of 3m/20m/11m, 7m/20m/7m, and 9m/20m/5m; the sample trajectory shows the back-and-forth disorientation spin of a led subject, with pool divider ropes and floats marked).

responses outside. For this investigation, a new model of the IrukaTact module was designed by directly attaching the sonar sensor as part of the thimble. Figure 25 shows this new iteration, where the probe looks similar to a flashlight, giving the user directional feedback based on gauging the boundary distances in the direction they are pointing towards.

There were three pre-selected locations at which three surrounding distances were tested. For each location, there was one tested distance that remained the same, which was the maximum length of the pool of 20m, as depicted in Figure 26. The coordinates of each location were: (3m right, 20m center, 11m left), (7m right, 20m center, 7m left), and (9m right, 20m center, 5m left), also shown in Figure 26. The motor of the haptic module was set to map the sonar data inversely: the closer the distance, the more pressure was applied to the fingertip. In this case, the 20m distance was set to 0 pressure, and 1m was set to maximum pressure. Below is the function used to map the ultrasonic range-finding data to the IrukaTact haptic module output:

Output = (x − Smin) ∗ (Omin − Omax) / (Smax − Smin) + Omax

The output is the PWM value of IrukaTact; x is the sensor value in the range of Smin (0) to Smax (1023), which is the analogue sensor input value between 0-5V converted into a digital signal; the input is converted to the range of Omin (0) to Omax (255), which are the ranges of PWM. The output minimum value is mapped to the input's maximum value in order to achieve a stronger haptic sensation if an object or boundary is closer to the subject; the further a target is, the less sensation the participant will feel.

Participants were asked to enter the pool and were helped to place the IrukaTact Sonar Tip (Fig. 25) on their finger. They were shown the relationship of the haptic sensation to the distance before beginning the experiment, and shown the controls at the shortest and longest distances. They were told the size of the pool, which was 20m in length by 14m in width, and related this information to their initial free-form experimentation.

Figure 27: Pool Distance Experiment Participant and Guide.

Once they were acquainted with the mechanism, they were asked to wear swimming goggles that had been painted black inside, inhibiting vision completely (Fig. 27). This allowed them to guide their distance assessment solely via the haptic stimulus. They were turned three times and passed under a few swimming dividing cables in order to disorient them, as shown in the experiment set-up diagram in Figure 26. Once they were at the right location, which would have been one of the predetermined distance points, they were instructed to face a specific direction and gauge the distance that they were pointing towards. As they turned to each direction in question, they were asked to say how far from them, in meters, the boundary of the pool they were pointing towards was.

5.2.2.5.3 Results

The results of this experiment can be found in the graph of Figure 28, which shows the statistical results generated by Tukey's multiple comparison test method, which tested the significance of each average result per distance and compared them to one another. These findings show that the performance of the participants was best at discerning the distances when comparing them to the maximum distance of 20m, as shown by the low p-value of <0.01. Participants were also successful in discerning the distances between 3m and 9m, as well as between 3m and 11m.

These results suggest that this echo-haptic translation is successful so long as the participant is able to compare distances, and the further these distances are from one another, the more accurate their responses. It can be determined that participants are able to distinguish between at least three different distance points accurately. However, there is much room for improvement in the design of the device. The ultrasonic range finder was not meant to function underwater, and thus there were glitches which sometimes provided an inaccurate distance display to the participant. Also, the pressure display of the IrukaTact module can be improved by providing more force, thus able to render more noticeable differences between sensed data points.
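The mapping function used in this experiment is the familiar Arduino-style map() with the output bounds swapped, so that nearer readings produce stronger pressure. A minimal Python sketch of the same arithmetic; the function and constant names are ours:

```python
S_MIN, S_MAX = 0, 1023  # analogue sonar reading range (0-5 V digitized)
O_MIN, O_MAX = 0, 255   # PWM output range

def sonar_to_pwm(x):
    """Inverse linear map: smaller sensor readings (nearer boundaries)
    yield higher PWM values, i.e. stronger fingertip pressure."""
    return (x - S_MIN) * (O_MIN - O_MAX) / (S_MAX - S_MIN) + O_MAX

print(sonar_to_pwm(0))     # nearest reading -> 255.0, maximum pressure
print(sonar_to_pwm(1023))  # farthest reading -> 0.0, no pressure
```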

Figure 28: Perceived distance with the IrukaTact Sonar Tip: averages with multiple comparison test data by Tukey's method.

5.2.3 IrukaTact Public Interactions: DemoTanks

IrukaTact has been exhibited worldwide in many public exhibitions1 under the context of new media art. More specifically, this work is Assistive Device Art, a new form of Device Art that is specifically created as an experiential and exploratory form of assistive technology, seeking to extend human sensory abilities beyond the restoration of normal human ability.

The concept of IrukaTact arose as a new technology able to assist underwater searching, and thus a flooding aid toolkit for locating sunken objects while treading submerged areas under limited visibility. The exhibition designs that showcase this work usually consisted of two or three DemoTanks (Figure 30), the IrukaTact glove (Figure 33), cases showing the iterative process of the tip designs, such as the IrukaTact tip evolution (Figure 29), and display cases that looked like a catalogue of archeological findings, including the prototypes of the enclosure of the electronics of the glove and the propeller tips that look like fish fins (Figure 31). Throughout these public demonstrations we collected comments from participants about their experience with the DemoTanks, which display the associative perception of spatial distance through the haptic intensity variation on the fingertip haptic module.

Figure 29: Design Process 3D Printed Tips and Parts.

Figure 30: DemoTank annotated image; 'A' is the sensor's range, '1' is the sensor, '2' is the tank, '3' is the haptic module, and '4' is the sensing unit plate.

1 This work has been exhibited in: Post Cities, Ars Electronica, Linz, Austria (2015); Speculum Artium, Trbovlje, Slovenia (2015); Tsukuba Media Arts Festival, Japan (2015); TEI, Eindhoven, Netherlands (2016); G7, Tsukuba, Japan (2016); D-nest Inventors Exhibition, Venice, Italy (2016); IEEE World Haptics, Fürstenfeldbruck, Munich, Germany (2017).

5.2.3.1 DemoTanks: Associative Perception

The fingertip haptic modules were presented in a DemoTank unit in order to demonstrate the association of distance to the haptic sensation. These units consist of a sonar sensor stand and a fingertip actuator insert, attached to a water tank (Figure 30). Three of these testing units were designed for large crowds in a festival, each displaying different fingertip actuator models and featuring new ideas for underwater haptic sensations, including impeller pump and propeller models. The two most exhibited actuators were the IrukaTact original tip (Fig. 12 and Fig. 18) and the propeller model (Figure 31), the two which display the most contrast in sensation. The sensation of the propeller model is an upward lifting motion mixed with vibration. This actuator works best when the thimble is hovering on the surface of the water. When the propeller is able to mix air and water, there is a significant perceivable haptic stimulation. However, when the propeller model is submerged underwater, the actuation of this type of fingertip haptic module is barely perceivable; many testers were unsure whether they felt any sensation at all. Because of this lack of sensation during full submersion, we wanted to show the difference between a pump actuation model and a propeller actuation model during our demo installations.

Each DemoTank has been equipped with a different fingertip haptic module, displaying the difference between the propeller (Figure 31) and impeller models. Each model was designed to display varying pressures on the user's finger in relation to the data gathered from the sonar sensor. The strength of the pressure felt by the user was dependent on the proximity of their hand to the sensor; the closer their hand was to the sensor plate, the more pressure they felt on their fingertip.

Figure 31: Propeller Haptic Actuator Model.

The circuit components for the DemoTank units are shown in Figure 30, with numbers associated with each part: a MaxBotix MB7066 sensor (1), a water tank (2), a small motor per haptic fingertip module (3), and an electronics enclosure (4) containing an Arduino Pro Mini, a 12 Volt power supply adaptor, a PN2222 transistor, and an LCD screen. The LCD screen was used to display the distance of the user's hand to the sensor in centimetres and the intensity of the fingertip haptic module's motor.

The DemoTank's sonar has a very broad sensing range, from 12cm-10m, which poses a problem for translating the sensed distance into haptic data. The constraints of the ultrasonic range finder are set from its minimum range of 12 cm to the top of the sensing unit plate, which is 56 cm, depicted in Figure 30 by the arrow labeled (A). For this device, we used the same mapping function as the Distance Assessment

experiment, but in this case the analogue sensor reading constraints were modified to reflect the distance in question, as aforementioned. Similar to that experiment, activating the sensor closer to the plate initiates the haptic module's maximum speed, and vice versa. During the demonstrations, the participants were guided to use one hand to activate the sensor, and to insert a finger from the other hand inside the fingertip haptic module of the tank (Fig. 32). The reason we flipped the direction of the sonar was to simulate the distance of the hand to the ground. This way we could better illustrate the relation between distance and haptic actuation as a parallel sense of touch. We asked participants to imagine they were scanning the ground with their hand and feeling the topography of the ground below.

5.2.3 Observations of Participant Interactions During Demonstrations

These modules have been displayed across various conference and art exhibition settings, and their function has been demonstrated to over a thousand participants from a variety of nationalities, backgrounds and ages. We have gathered comments and observations from these units while in exhibition settings, giving us a wide understanding of how the greater public reacts to the concept of underwater haptic stimulation. Some of our participants are fellow researchers who understand the concepts of haptic technology, but most of our participants have been curious attendants of Media Art exhibitions. We have had groups of children and groups of elderly tours try IrukaTact through the DemoTanks.

Figure 32: Participant feeling the distance and haptic actuation correlation on the DemoTank unit at the TEI'16 conference exhibition.

Most users have been pleasantly surprised by the haptic sensation that this actuator provides, sometimes linking it to the water jets of a Jacuzzi. Participants commented 'I get it! I understand the relationship between the distance of my hand to the ground and the sensation on my finger,' and 'This glove looks really cool, where can I buy one?' This comment about the commercial availability of the product is directly related to the look and feel of this probe, and the rarity of its function. This feature is an important aspect of Assistive Device Art, where works exist as prototype, artwork, and consumer product, making assistive technology more approachable and replacing disability with desirability. One participant from the D-nest exhibition, who was an archaeological scuba diver, suggested that it could be used to locate old coins under the ocean floor, in which case the sensor would have to be replaced with a metal detector.
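For the DemoTank the same inverse linear map is used, but with the analogue reading constrained to the 12-56 cm window between the sonar's minimum range and the sensing unit plate. A sketch under those assumptions; the explicit clamping step and all names are our reading of the described constraint, not code from the thesis:

```python
D_MIN_CM, D_MAX_CM = 12, 56  # sonar minimum range to top of the sensing plate
PWM_MIN, PWM_MAX = 0, 255

def demotank_pwm(distance_cm):
    """Clamp the sensed hand distance to the usable window, then map it
    inversely so that a hand close to the sensor runs the motor at full speed."""
    d = max(D_MIN_CM, min(D_MAX_CM, distance_cm))
    return (d - D_MIN_CM) * (PWM_MIN - PWM_MAX) / (D_MAX_CM - D_MIN_CM) + PWM_MAX

print(demotank_pwm(12))   # closest usable distance -> 255.0
print(demotank_pwm(56))   # edge of the window -> 0.0
print(demotank_pwm(300))  # out-of-window readings are clamped -> 0.0
```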


Figure 33: IrukaTact underwater haptic search glove.

Most people saw the opportunity for this tool to be extended as an assistive device for blind users, providing an echo-haptic utility that could enrich their scuba diving experience, giving them more information about their surroundings.

5.2.4 Applications

The IrukaTact fingertip haptic module is a display tool for underwater applications, which can be paired with a multitude of sensors, including the range-finding sonar that we used for the DemoTank. This haptic technology could also be implemented for virtual reality applications, moving away from probe sensing and instead utilizing gyroscopes and accelerometers to keep track of the user's hand position. This data could be used to superimpose virtual environments where the user can encounter and interact with 3D objects rendered by the varying pressures of the water jets onto the finger pads. While this application is not the primary focus of our research on underwater haptics, it poses a useful extension to the function and

application of the underwater haptic actuator module.

5.2.3.1 IrukaTact Glove

The IrukaTact glove (Figure 33) is an Assistive Device Art tool with the capacity for sensory augmentation, given that it provides an extension to the sense of touch in an environment where visibility is hindered. Ideally, people with limited or no visibility can benefit from this device while diving, or it can aid search teams in areas affected by flooding in emergency situations. The files for the glove are open source, and anyone can download the DIY and User Manual, where all instructions for assembly and the parts list are detailed (Chacin, 2016). The 3D model was created in an online platform which is accessible to anyone through a browser. This kit has been created with the purpose of digital distribution, with materials and processes that are readily available in most developed cities, such as Arduino microcontrollers, simple motors, hardware store supplies, and 3D printing. All programming and digital fabrication files can be freely downloaded and adapted, and links to parts available online are provided. The files can be easily edited and remixed by anyone who is interested in contributing to our progress, or in deploying the glove for preliminary use. The fingertip haptic modules can be resized for fingers of different sizes, which addresses a common problem we saw while demonstrating the technology to a wide range of people of all ages and sizes.

Figure 34: Underside view of the IrukaTact Glove.

5.2.3.2 Interface Design

The look and feel of IrukaTact is largely inspired by its name, dolphin; as we designed each part, we were thinking of waves and fins. Our main goal was to design a glove that would allow the user to continue using their hands naturally and feel their environment beyond the tool's feedback. The original IrukaTact impeller model (Figures 12 and 18) was used for the glove application because it provides comfort for wearability, while providing sufficient feedback when it is completely submerged underwater. This fingertip haptic module design only inhibits the outermost joint of the finger, in order to minimize grasp and movement inhibition.

The IrukaTact glove sends mapped signals from the sonar sensor, which is attached to the wrist, to three fingers, leaving the thumb and the little finger free and minimally covering the hand. Providing the same signal to three fingers allows the stimulation to be amplified, similar to the


Figure 35: Sensor in Parallel with the Hand Diagram

comment from one of the participants from the haptic resolution experiment. Silicone tubes which house the wires to the finger pumps run along the top of the hand towards the electronics enclosure, which rests on top of the wrist, like a watch or bracelet. The sonar sensor is attached to the bottom of the wrist on a hinge (Figures 33 and 34). The silicone tube that leads the sensor's wires to the enclosure is wrapped around a ring that is worn on the middle finger. The tension of the silicone tube holder on the ring allows the hinge to move in synchronicity with the user's palm, keeping the sensor in parallel. This way the sensor will pick up the information coming from their palm's direction at all times.

When the user's palm is stretched out without bending the wrist, the hinge (Figure 35, #1) is loose and falls pointing towards the ground. This diagram also shows the relationship (A) between the sensor and the fingertip stimulation: when the user's palm is turned upwards with a wrist movement (B), the face of the sensor points in the same direction (C), thus stimulating the fingers depending on the information detected by the range finder in that direction.

Our device is waterproof, since it was created using a high-quality PolyJet 3D printer. As for sealing the motors, we used grease, which is a common practice for waterproofing motors underwater, such as pumps and boat

Our device is waterproof, since it was created using a high-quality PolyJet 3D printer. As for sealing the motors, we used grease, which is a common practice for waterproofing motors that operate underwater, such as pumps and boat propellers. We also designed a twisting cap for the electronics enclosure with a silicone O-ring, which allows the user to open the housing and recharge the device.

5.2.5 IrukaTact Conclusions

The IrukaTact haptic module presents many other display capabilities beyond the echo-haptic utility of the glove; it could be expanded for metal-detecting purposes or for rendering virtual-reality objects and boundaries underwater. Tests suggest that the force applied by the module could be improved from its maximum of 21,005.8443 N/m², perhaps by selecting motors with a higher torque or by gearing the motors.

We realized that in order to create a more universal tool, the fingertip actuator housing needs to be more flexible, and not made of a rigid material such as 3D-printed plastic. One of the participants noted that 'the tightness of the device and the temperature of the water make a difference in sensation.' The reason for this is that there are many finger sizes, and the reliability of the sensation depends on the fitting of the tool. If the fingertip housing is too small, the user's circulation will be cut, and they will thus feel less. If the user's finger is entirely too big to fit the thimble housing, they cannot feel the difference between the lower force-intensity levels because the water jet cannot reach their finger pad. If the user's finger is too small, they are unable to reach the channel's disemboguing point, rendering hardly any haptic actuation, similar to the previous remark.

Furthermore, as an Assistive Device Art tool, the IrukaTact glove has gathered positive attention from the public in exhibitions and demonstrations, as well as through the media. This device bridges the gap between desirable and assistive technology, empowering the public's perception of disabilities in a playful way. Assistive technology can go beyond restoring normal human functions and present an opportunity to expand the sensory functionality of common devices. In this case study, we found the IrukaTact Glove to be popular and desirable, even though the utilitarian functionality of the device offers little alleviation of consumer needs, as it needs much technical improvement to reach any level of commodification. Beyond the application of this tool for underwater object searching, an equally important motivation that guided this research was designing new sensations for haptic displays.

6. General Discussion

Transposing an experience goes beyond orchestrating sensory data; it relies on understanding the mechanisms of perception from multiple angles. This research described the intimate relationship that humans have with computing devices and argued that all technology is assistive. The eager adoption of the smartphone, and our shift towards strong dependence on this device, have made it become a kind of cognitive prosthetic. Mobile, wearable devices arise from the closely entangled history of body potentials and the beginning of the battery. We considered the physical characteristics of these devices and proposed that, through the integration of AT and the miniaturization of their scale, they will become more integrated with our physiology. While the enhancement of visual displays has been dependably advancing, there has been little improvement in alternative-modality digital displays. Audio displays have also received important attention, and while there are incredible advancements in the realism that these devices render, users will always seek an ever more authentic digital experience. Expanding the experiences of digital displays has been the impetus of ADA, as it frames and observes the impulse towards our evolutionary future. This research proposes that HCI borrow from the study of cognitive and motor disabilities as a way of observing new frontiers for human perceptual and motor adaptation.

The tangibility of digital information is the next frontier for crafting and manipulating our own experience. Here we also posed questions about the lengths to which humans are willing to integrate new abilities into their physiological architectures, even when these enhancements may hinder some of their normal abilities. While there is a small following for the Cyborg Art movement and other sub-culture groups such as Grinders, DIY performers of surgical body modification, non-invasive means of human-computer integration and body augmentation are preferred and more likely to be adopted. Beyond the utility of bodily enhancement, we proposed that most people are interested in learning more about their physiology, and while the skills gained through ADA may not be applicable for utilitarian purposes, the mere curiosity to experience a new perspective is enticing enough for people to engage with AT. This experience is multimodal and integrative, and can be crafted by training apparatuses using sensory substitution methods. Haptic displays have been slowly incorporated into consumer electronics in their most basic function, the vibrating motor. While there are specific haptic display devices, these are usually application-specific. An area that has seen the most state-of-the-art haptic displays is surgical laparoscopy; these are cumbersome, large devices, similar in their bulky quality to the first computers. As we saw in the trajectory of screen-based mobile device incorporation, haptic displays will also become further integrated into our prosthetic toolkit. Mechanoreception is the basis of tactile experience; paired with sound, this perceptual capacity allows us to gain information from the vibrations of our surroundings.

We reviewed the mechanisms of perception from a neuroscientific perspective, where a new approach to studying sensory experience as a multimodal process is emerging. While the gestalt approach to the study of cognition is relatively new, the compartmentalization of the mind into separate modalities is becoming a concept of the past. The research we have analyzed suggests that the mind is a multimodal organ, and that while there are areas that process specific raw information, perception is an aggregation of a multitude of information. This multimodality also supports the theories of neuroplasticity, allowing the processing power of the brain to be distributed to the modalities available to a specific physiological architecture. This explains why blind individuals may have a heightened sense of hearing in comparison with sighted people. We considered the visuospatial abilities of various perceptual architectures, such as the case of blind imagination, where the physiological phenomena of visual perception are intact but the imagery has lost its visual qualities, or the case of visual cortex activation in braille-reading blind individuals performing discriminatory tasks. We learned that blind individuals prefer an egocentric perception of space, while sighted subjects have an allocentric preference. From all of these insights, we proposed that devices that pose a new visuospatial framework can facilitate the enhancement of trans-modal awareness. The perception of experience is the interpretation of our senses in a spatiotemporal, transformative interplay that aggregates information from multiple sources of input.

ADA design is an interdisciplinary undertaking where the theoretical models of engineering, psychophysics, design, and art entangle. This research proposed a framework by which to integrate HCI, sensory and perception science, and Art, borrowing from some of the methodologies that pertain to these fields. Through employing this framework, this research yielded two new interfaces, the Echolocation Headphones and the IrukaTact Glove. Furthermore, we described the design process, psychophysical evaluation, and future applications of the Echolocation Headphones and of the IrukaTact haptic module and its various iterations, relating their provenience as ADA tools. Through these interfaces we tested the feasibility of AT methods for abled individuals as a mode of investigating the perceptual ability for visuospatial multimodal integration.

AT addresses the necessity of a shared experience, where all bodies have the capability of engaging in the same everyday tasks. In contrast, ADA takes inspiration from atypical bodies with the aim of creating perceptual models that provide a new shared experience from a multimodal, mediated perspective. Assistive Device Art investigates the limits of a mediated embodiment by reframing the user base of AT and making it desirable to a broader public. It presents an avenue for exploring new sensory perspectives and for contemplating the transient nature of our condition as human bodies and our relationship to technology. This art reimagines our future bodies, proposing new avenues of HCI integration, often camouflaging itself as a consumer product indulging the illusion of escaping our experiential reality. It measures the public's willingness to engage with Assistive Technology and to further human-computer integration while they seek to enhance their experience.

7. Conclusion

This thesis described the concept of ADA, as brought forth by the author, and its origins, by observing the phenomena that surround the aesthetics of prosthesis-related art. ADA was presented as a framework by which to classify, design, and evaluate technology within the cross-section of AT and Art. This concept described a method by which to innovate on digital display technology while deeply investigating the structures of empirical contemplation and facilitating a way of reflecting on our perception through physical and sensory extensions. ADA is a platform for interactive transformation that is the sensorimotor dance of body and environment.

As for the Echolocation Headphones, this interface provided most subjects with a positive experience. They were able to distinguish between distances of objects, as well as to navigate their surroundings, on average finding an open pathway. Moreover, this interface, exhibited as an artwork, provoked attention and a willingness to interact from participants. There were some aesthetic considerations about the quality of the sound these devices used for the echolocation task, where the white noise was "jarring" but the most effective. In the case of IrukaTact, echo-haptic location underwater presented more challenges. A new haptic display mechanism was devised in order to actuate the fingertips underwater while retaining the mobile functions of the user's hands. This new underwater haptic actuator module was then paired with a

range-finding probe, and the distance detected was translated into water pressure. While the design of this haptic actuator is an engineering contribution, the IrukaTact Glove is an ADA tool. This prototype also received public attention through the 'viralization' phenomena previously described in the methodology of this research. Albeit the utilitarian aspects of this technology are questionable, these interfaces provide more than an engineering solution. They are physically proposed questions that ask, "How would you like to be physiologically enhanced?" ADA borrows from AT and artistically crafts new sensory performances that inquire about and propose new perceptual models towards a future of more integrated human-computer systems that are passive and augmentative, co-evolving and co-adapting.

Acknowledgements

The research conducted for this manuscript and prototype was supported by the Empowerment Informatics department at the University of Tsukuba, the University of California, Los Angeles, and in small part by the National Science Foundation. This research has been made possible by the careful consideration and advisement of Prof. Hiroo Iwata, Prof. Victoria Vesna, and Prof. Hiroaki Yano, and with the editing help of Tyson Urich and Nicholas Spencer. I would also like to extend my gratitude to Takeshi Oozu, who assisted me on many occasions and collaborated with me on the IrukaTact project.

References

A. M., et al. (1992). U.S. Patent No. EP 0688125 A1. Washington, DC: U.S. Patent and Trademark Office.

Antonellis A (ed) (2013) Net art implant. http://www.anthonyantonellis.com/news-post/item/670-net-art-implant. Accessed 12 Sept 2017

Anthony, Sebastian (2012) GoFlow: a DIY tDCS brain-boosting kit. In: ExtremeTech. https://www.extremetech.com/extreme/121861-goflow-a-diy-tdcs-brain-boosting-kit. Retrieved December 23, 2017

Ardila A, Bernal B, Rosselli M (2016) How Extended Is Wernicke's Area? Meta-Analytic Connectivity Study of BA20 and Integrative Proposal. Neuroscience Journal 2016:1–6. doi:10.1155/2016/4962562

Au, Whitlow WL (1997) Echolocation in dolphins with a dolphin–bat comparison. Bioacoustics 8:137–162. doi:10.1080/09524622.1997.9753357

Auger J, Loizeau J (2002) Audio tooth implant. http://www.auger-loizeau.com/projects/toothimplant. Accessed 12 Aug 2017

Bach-y-Rita, P. (1972). Brain mechanisms in sensory substitution. New York: Academic Press.

Bach-y-Rita P, Kaczmarek KA, Tyler ME, Garcia-Lara J (1998) Form perception with a 49-point electrotactile stimulus array on the tongue. J Rehabilit Res Dev 35:427–430

Bach-y-Rita, Paul, and Stephen W. Kercel. 2003. "Sensory substitution and the human–machine interface." Trends in Cognitive Sciences 7, no. 12, 541–546. doi:10.1016/j.tics.2003.10.013

Bau, O.; Poupyrev, I.; Israr, A.; Harrison, C. (2010) TeslaTouch. In Proceedings of UIST '10, pages 283–292.

Bedny, Marina (2011) Interview in: "Parts of brain can switch functions" - MIT News Office. 17 Dec. 2012

Beta Tank, Eyal B, Michele G (2007) Eye candy. In: Eyal B (ed) Studio. http://www.eyalburstein.com/eye-candy/1n5r92f24hltmysrkuxtiuepd6nokx. Accessed 12 Aug 2017

Björkman, Anders, Niels Thomsen and Lars Dahlin (2011). New Treatment Strategies in Diabetic Neuropathy. In: Recent Advances in the Pathogenesis, Prevention and Management of Type 2 Diabetes and its Complications, Mark B. Zimering, MD, PhD (Ed.). ISBN: 978-953-307-597-6

Blakemore S-J, Frith U (2005) The learning brain: lessons for education. Blackwell Publishing, Malden. ISBN 1-4051-2401-6

Brain Center America (2008) Brain functions. http://www.braincenteramerica.com/visuospa.php. Accessed May 2013

Brain, R. (1954). Loss of Visualization. Proceedings of the Royal Society of Medicine, 47(4), 288–290.

BrainPort V100 Vision Aid. (n.d.). Wicab. Retrieved December 1, 2017, from https://www.wicab.com/

Cassinelli, Alvaro, Carson Reynolds, and Masatoshi Ishikawa (2006) "Augmenting spatial awareness with haptic radar." Wearable Computers, 2006 10th IEEE International Symposium on. IEEE.

Cattaneo, Z., Vecchi, T. (2011) Blind vision: The neuroscience of visual impairment. Massachusetts: The MIT Press.

Chacin AC (2013) Sensory pathways for the plastic mind. http://aisencaro.com/thesis2.html. Accessed 25 May 2017

Chacin, Aisen C. (2015) IrukaTact. 24 Oct. 2015

Collignon O, Voss P, Lassonde M, Lepore F (2008) Cross-modal plasticity for the spatial processing of sounds in visually deprived subjects. Exp Brain Res 192:343–358. doi:10.1007/s00221-008-1553-z

3D3A Lab at Princeton University (2010) http://www.princeton.edu/3D3A. Accessed 20 May 2013

Degenaar, Marjolein and Lokhorst, Gert-Jan (2005) "Molyneux's Problem", The Stanford Encyclopedia of Philosophy, Winter 2017 Edition, Edward N. Zalta (ed.)

"Dexta Robotics." (2014) 24 Oct. 2015

Dinkla S (1996) From interaction to participation: toward the origins of interactive art. In: Hershman-Leeson (ed) Clicking in: hot links to a digital culture. Bay, Seattle, pp 279–290

D. M. (2017, December 12). Psychophysical Isomorphism. Lecture presented at the 2009 Cognitive Neuroscience Annual Spring Retreat and Kavli Institute for Brain and Mind Symposium. http://www.psychologyconcepts.com/psychophysical-isomorphism-don-macleod-may-2-2009/

Driver, Jon, and Charles Spence. 2000. "Multisensory Perception: Beyond Modularity and Convergence." Current Biology 10 (20): R731–R735. doi:10.1016/s0960-9822(00)00740-5

Dunne A, Raby F (2014) Speculative everything: design, fiction, and social dreaming. MIT Press, Cambridge. ISBN: 9780262019842

Encyclopædia Britannica. (2017). Gestalt psychology. Retrieved December 2, 2017, from https://www.britannica.com/science/Gestalt-psychology

Evreinova, T.V.; Evreinov, G.; Raisamo, R. (2012) Haptic visualization of bathymetric data. Haptics Symposium (HAPTICS), IEEE. 359–364.

Foc.us "Go Flow 1020 - TDCS Stimulator, Battery, Cable, Electrodes, Sponges & 1020 Hat." Foc.us TDCS Brain Stimulators with EEG. www.foc.us/go-flow-pro-1020. Retrieved December 1, 2017

Freud, Sigmund. 1961. Standard Edition of the Complete Psychological Works of Sigmund Freud, Volume XXI (1927-1931): The Future of an Illusion, Civilization and Its Discontents, and Other Works. London: Hogarth Press and the Institute of Psycho-Analysis.

Glezerman TB, Balkoski VI (2002) Parietal-Occipital Region: Spatial Perception and Word Form. In: Language, Thought, and the Brain. Cognition and Language: A Series in Psycholinguistics. Springer, Boston, MA. doi:10.1007/0-306-47165-5_4

"GoFlow - World's First tDCS Kit." 2012. 17 May 2013

Gonzalo Fonrodona, Isabel and Gonzalo Rodríguez-Leal, Justo (2015) The pioneering research of Justo Gonzalo (1910–1986) on brain dynamics. http://hdl.handle.net/10347/4341. Accessed December 2, 2017

Goodale MA, Milner A (1992) Separate visual pathways for perception and action. Trends Neurosci 15:20–25. doi:10.1016/0166-2236(92)90344-8

Guillot P (2016) Pad Library. In: Patch storage. http://patchstorage.com/pad-library/. Accessed 10 May 2017

Haberkern R (2012) How directional speakers work. Soundlazer. http://www.soundlazer.com/. Accessed 26 May 2017

Hagen, Susan (2012) The Mind's Eye. Rochester Review. Vol. 74, No. 4.

Holosonics Research Labs (1997) About the inventor: Dr Joseph Pompei. In: Audio spotlight by Holosonics. https://www.holosonics.com/about-us-1/. Accessed 26 May 2017

Holzner D (2008) The envelope generator. In: Hyde, Adam (ed 2009) Pure data. https://booki.flossmanuals.net/pure-data/audio-tutorials/envelope-generator. Accessed 26 May 2017

Hoshi, Takayuki, et al. (2010) Noncontact tactile display based on radiation pressure of airborne ultrasound. Haptics, IEEE Transactions on 3.3, 155–165

Hoshino, Keisuke, et al. (2015) "Jorro Beat: Shower Tactile Stimulation Device in the Bathroom." Proceedings of the 33rd Annual ACM Conference Extended Abstracts on Human Factors in Computing Systems. ACM.

Houde S, Hill C (1997) What do prototypes prototype? Handbook of Human-Computer Interaction. doi:10.1016/b978-044481862-1.50082-0

Houston, Edwin James (1905) "Electricity in Everyday Life", Chapter XXII. P. F. Collier & Son.

Hutto D, Myin E (2013) A helping hand. In: Radicalizing enactivism: minds without content. MIT Press, Cambridge. p 46. ISBN 9780262018548

Ihde D (1979) Technics and praxis, vol 24. D. Reidel, Dordrecht, pp 3–15. doi:10.1007/978-94-009-9900-8_1

Iwasaki M, Inomara H (1986) Relation between superficial capillaries and foveal structures in the human retina. Investig Ophthalmol Vis Sci 27:1698–1705. http://iovs.arvojournals.org/article.aspx?articleid=2177383

Iwata H (2000) Floating eye. http://www.iamas.ac.jp/interaction/i01/works/E/hiroo.html. Accessed 25 May 2017

Iwata H (2004) What is Device Art. http://www.deviceart.org/. Accessed 10 April 2017

Iwata, Hiroo (2008) "History of haptic interface." Human haptic perception: Basics and applications, 355–361.

Kaas, Jon H. (1991) "Plasticity of sensory and motor maps in adult mammals." Annual Review of Neuroscience 14.1: 137–167.

Kish D (2011) Blind vision. In: PopTech: PopCast. http://poptech.org/popcasts/daniel_kish_blind_vision. Accessed 20 May 2013

Koyanagi, Ken'ichi, Yuki Fujii, and Junji Furusho. 2005. Development of VR-STEF system with force display glove system. In Proceedings of the 2005 International Conference on Augmented Tele-existence (ICAT '05). ACM, New York, NY, USA, 91–97.

Kreidler J (2009) 3.3 Subtractive synthesis. In: Programming electronic music in Pd. http://www.pd-tutorial.com/english/ch03s03.html. Accessed 3 Mar 2017

Kusahara M (2010) Wearing media: technology, popular culture, and art in Japanese daily life. In: Niskanen, E (ed) Imaginary Japan: Japanese fantasy in contemporary popular culture. Turku: International Institute for Popular Culture. http://iipc.utu.fi/publications.html. Accessed 25 May 2017

LAO (Laboratory for Applied Ontology). 2017. "Relation Schematic-Relation in Theory Structuring-Concepts." Ontolingua Reference Manual. Accessed February 10 2017. http://www.loa.istc.cnr.it/old/medicine/structuring-concepts/relation66.html

Lundborg G, Rosén B, Lindberg S (1999) Hearing as substitution for sensation: a new principle for artificial sensibility. J Hand Surg 24(2):219–224

Merabet LB, Amedi A, Pascual-Leone A (2006) Activation of the visual cortex by Braille reading in blind subjects. In: Reprogramming the Cerebral Cortex, 377–394. doi:10.1093/acprof:oso/9780198528999.003.0022

Mishkin M, Ungerleider LG (1982) Contribution of striate inputs to the visuospatial functions of parieto-preoccipital cortex in monkeys. Behav Brain Res 6:57–77. doi:10.1016/0166-4328(82)90081-x

Nakamura, Taira, et al. (2014) "Localization Ability and Polarity Effect of Underwater Electro-Tactile Stimulation." Haptics: Neuroscience, Devices, Modeling, and Applications. Springer Berlin Heidelberg. 216–223.

Nelson, J., McKinley, R. A., Phillips, C., McIntire, L., Goodyear, C., Kreiner, A., & Monforton, L. (2016). The effects of transcranial direct current stimulation (tDCS) on multitasking throughput capacity. Frontiers in Human Neuroscience, 10.

Nitsche, Michael A., et al. "Facilitation of implicit motor learning by weak transcranial direct current stimulation of the primary motor cortex in the human." Journal of Cognitive Neuroscience 15.4 (2003): 619–626.

Noordzij, M.L., Zuidhoek, S., Postma, A. (2006) The influence of visual experience on the ability to form spatial mental models based on route and survey descriptions. Cognition, 100, 321–342.

Ostrovsky Y, Andalman A, Sinha P (2006) Vision Following Extended Congenital Blindness. Psychological Science 17:1009–1014. doi:10.1111/j.1467-9280.2006.01827.x

Pascual-Leone, Alvaro, et al. (2011) "Characterizing brain cortical plasticity and network dynamics across the age-span in health and disease with TMS-EEG and TMS-fMRI." Brain Topography 24.3: 302–315.

Pasqualotto, A., Proulx, M.J. (2012) The role of visual experience for the neural basis of spatial cognition. Neuroscience and Biobehavioral Reviews, 36, 1179–1187.

Pasqualotto, A., Spiller, M.J., Jansari, A.S., Proulx, M.J. (2013) Visual experience facilitates allocentric spatial representation.

Pfeiffer, Max; Schneegass, Stefan; Alt, Florian; Rohs, Michael (2014) Let me grab this: a comparison of EMS and vibration for haptic feedback in free-hand interaction. In Proceedings of the 5th Augmented Human International Conference (AH '14). ACM, New York, NY, USA, Article 48, 8 pages.

Picard DJ, Hulse J, Krajewski J (2009) Integrated fire exit alert system. US Patent 7528700. 5 May 2009

Pullin G (2009) Design meets disability. Massachusetts Institute of Technology Press, Massachusetts

Richter, Hendrik; Manke, Felix; Seror, Moriel (2013) LiquiTouch: liquid as a medium for versatile tactile feedback on touch surfaces. In Proceedings of the 7th International Conference on Tangible, Embedded and Embodied Interaction (TEI '13). ACM, New York, NY, USA, 315–318.

Rosch E, Thompson E, Varela FJ (1991) The embodied mind: cognitive science and human experience. MIT Press, Cambridge, MA

Sacks O. (2007) Musicophilia: tales of music and the brain. Knopf Publishing Group, New York, NY. pp. 314–315.

Sadato N, Pascual-Leone A, Grafman J, et al. (1996) Activation of the primary visual cortex by Braille reading in blind subjects. Nature 380:526–528. doi:10.1038/380526a0

Sanes, Jerome N., and John P. Donoghue (2000) "Plasticity and primary motor cortex." Annual Review of Neuroscience 23.1: 393–415.

Schwartzman M (2011) See yourself sensing: redefining human perception. Black Dog Publishing, London, pp 98–99. ISBN: 9781907317293

ScorpionZZZ's Lab (2013) SGenerator (Signal Generator). http://sgenerator.scorpionzzz.com/en/index.html. Accessed 11 May 2017

Snyder A. (2009) Explaining and inducing savant skills: privileged access to lower level, less-processed information. Philosophical Transactions of the Royal Society B: Biological Sciences 364(1522):1399–1405. doi:10.1098/rstb.2008.0290

Stearns, Paul. 2014. "Introduction to Kant's Critique of Pure Reason: Part 4." YouTube. https://youtu.be/UPJh6fRpDjc?list=PLFGHE1xQFhhzLba8GxgYtjAD6cfKoQJmH

Stein, B., Meredith, M. A., & Wallace, M. T. (1993). The visually responsive neuron and beyond: multisensory integration in cat and monkey. In The Visually Responsive Neuron: From Basic Neurophysiology to Behavior (Vol. 96). Amsterdam, The Netherlands: Elsevier.

Stein BE, Stanford TR, Rowland BA (2009) The neural basis of multisensory integration in the midbrain: Its organization and maturation. Hearing Research 258:4–15. doi:10.1016/j.heares.2009.03.012

Stein BE (2012) The new handbook of multisensory processes. The MIT Press.

Stelarc (1980) Third hand. http://stelarc.org/?catID=20265. Accessed 25 May 2017

Suzuki, Y.; Minoru, K. (2005) Air jet driven force feedback in virtual reality. IEEE Computer Graphics and Applications, 25(1): 44–47.

Szubielska M (2014) Strategies For Constructing Spatial Representations Used By Blind And Sighted Subjects. Studia Psychologica 56:273–285. doi:10.21909/sp.2014.04.666

Temple, Stephen. 2017. Vintage Mobiles. GSM History: History of GSM, Mobile Networks, Vintage Mobiles. Retrieved December 20, 2017, from http://www.gsmhistory.com/vintage-mobiles/#orbitel_901_1992

Thaler L, Castillo-Serrano J (2016) People's ability to detect objects using click-based echolocation: a direct comparison between mouth-clicks and clicks made by a loudspeaker. PLoS ONE. doi:10.1371/journal.pone.0154868

Thompson E (2010) Chapter 1: The enactive approach. In: Mind in life: biology, phenomenology, and the sciences of mind. Harvard University Press, Cambridge. ISBN 978-0674057517

Treffert D.A. (2005) The savant syndrome in autistic disorder. In: Casanova M.F. (ed) Recent developments in autism research. Nova Science Publishers, Inc., New York, NY. pp. 27–55.

Treffert, Darold A. (2009) "The savant syndrome: an extraordinary condition. A synopsis: past, present, future." Philosophical Transactions of the Royal Society B: Biological Sciences 364.1522: 1351–1357.

TriState (2014) Supplement for frequency modulation (FM). In: TriState. http://www.tristate.ne.jp/parame-2.htm. Accessed 26 May 2017

TriState (2014) World's first parametric speaker experiment kit. In: TriState. http://www.tristate.ne.jp/parame.htm. Accessed 26 May 2017

UW Dept. Biomedical Engineering. (2006). Founder. Retrieved December 11, 2017, from https://tcnl.bme.wisc.edu/laboratory/founder

WAFB (2010) Daniel Kish, Our President. In: World Access for the Blind. http://www.worldaccessfortheblind.org/node/105. Accessed 20 May 2013

Warwick K (2002) In: Kevin Warwick Biography. http://www.kevinwarwick.com/. Accessed 25 Sept 2017

Wood M (2010) Introduction to 3D audio with Professor Choueiri. Video. Princeton University. https://www.princeton.edu/3D3A/. Accessed 10 May 2013

Yoshimoto, S.; Hamada, Y.; Tokui, T.; Suetake, T.; Imura, M.; Kuroda, Y.; Oshiro, O. (2010) Haptic canvas: dilatant fluid based haptic interaction. In Proceedings of ACM SIGGRAPH 2010 Emerging Technologies. ACM, New York, NY, USA, 49–52.

Zeman AZ, Sala SD, Torrens LA, et al. (2010) Loss of imagery phenomenology with intact visuo-spatial task performance: A case of 'blind imagination.' Neuropsychologia 48:145–155. doi:10.1016/j.neuropsychologia.2009.08.024

Bibliography

Arias, Claudia, and Oscar A. Ramos. "Psychoacoustic Tests for the Study of Human Echolocation Ability." Applied Acoustics, vol. 51, no. 4, 1997, pp. 399–419. doi:10.1016/s0003-682x(97)00010-8

Becker, Robert O., and Gary Selden. The Body Electric: Electromagnetism and the Foundation of Life. NY: Morrow, 1985.

Blakeslee, Sandra. "BrainPort, Dr. Paul Bach-y-Rita, and Sensory Substitution". Mind States - Tribe.net. 23 November 2004. Web. 03 Dec. 2011.

Brucker-Cohen, Jonah. "Alerting Infrastructure!" Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems - CHI EA '16, 2016. doi:10.1145/2851581.2891082

Chen, Gang, et al. "An Adaptive Non-Parametric Short-Time Fourier Transform: Application to Echolocation." Applied Acoustics, vol. 87, 2015, pp. 131–141. doi:10.1016/j.apacoust.2014.06.018

"Click-Based Echolocation: A Direct Comparison between Mouth-Clicks and Clicks Made by a Loudspeaker." PLoS ONE, vol. 11, no. 5, Feb. 2016.

Fründt, Hans. "Echolocation: The Prelinguistic Acoustical System." Semiotica, vol. 125, no. 1-3, 1999. doi:10.1515/semi.1999.125.1-3.83

Gray, Carole, et al. Developing a Research Procedures Programme for Artists & Designers. Centre for Research in Art & Design, Robert Gordon University, 1995.

Gray, Carole, and Julian Malins. Visualizing Research: a Guide to the Research Process in Art and Design. Ashgate, 2012.

Haraway, Donna J. "A Cyborg Manifesto." Manifestly Haraway, Jan. 2016, pp. 3–90. doi:10.5749/minnesota/9780816650477.003.0001

Haraway, Donna J. "The Virtual Speculum in the New World Order." Feminist Review, vol. 55, no. 1, 1997, pp. 22–72. doi:10.1057/fr.1997.3

Hatzfeld, Christian, and Thorsten A. Kern. Engineering Haptic Devices: a Beginner's Guide. Springer London, 2014.

"History of the Synapse - Max R. Bennett - Google Books." 2012. 21 May 2013

Horowitz, Seth S. The Universal Sense: How Hearing Shapes the Mind. Bloomsbury Publishing USA, 2012.

Kendrick, Mandy. "Tasting the Light: Device Lets the Blind 'See' with Their Tongues." Scientific American, August 13, 2009. Web. 03 Dec. 2011.

Kozhevnikov, Maria, and Mary Hegarty. "A Dissociation between Object Manipulation Spatial Ability and Spatial Orientation Ability." Memory & Cognition, vol. 29, no. 5, 2001, pp. 745–756. doi:10.3758/bf03200477

Lowery, Percival Chelston. Art and Esthetics as Applied to Prosthetics. White Dental Manufacturing Co., 1921.

Mann, Steve. "Decon2 (Decon Squared): Deconstructing Decontamination." Leonardo, vol. 36, no. 4, 2003, pp. 285–290. doi:10.1162/002409403322258691

"Media and Prosthesis: The Vocoder, the Artificial Larynx, and the History of Signal Processing." Qui Parle, vol. 21, no. 1, 2012, pp. 107–149. doi:10.5250/quiparle.21.1.0107

Moore, Brian C. J., et al. Basic Aspects of Hearing: Physiology and Perception. Springer, 2013.

Papadopoulos, Timos, et al. "Identification of Auditory Cues Utilized in Human Echolocation - Objective Measurement Results." 2009 9th International Conference on Information Technology and Applications in Biomedicine, 2009. doi:10.1109/itab.2009.5394427

Pinder, Shane D., and T. Claire Davies. "Exploring Direct Downconversion of Ultrasound for Human Echolocation." Proceedings of the 8th ACM SIGCHI New Zealand Chapter's International Conference on Computer-Human Interaction: Design Centered HCI - CHINZ '07, 2007. doi:10.1145/1278960.1278967

Poissant, Louise. "New Media Dictionary." Leonardo, vol. 36, no. 3, 2003, pp. 233–236. doi:10.1162/002409403321921479

Rowan, Daniel, et al. "Identification of the Lateral Position of a Virtual Object Based on Echoes by Humans." Hearing Research, vol. 300, 2013, pp. 56–65. doi:10.1016/j.heares.2013.03.005

Schenkman, Bo N., et al. "Human Echolocation: Acoustic Gaze for Burst Trains and Continuous Noise." Applied Acoustics, vol. 106, 2016, pp. 77–86. doi:10.1016/j.apacoust.2015.12.008

Seago, Alex, and Anthony Dunne. "New Methodologies in Art and Design Research: The Object as Discourse." Design Issues, vol. 15, no. 2, 1999, p. 11. doi:10.2307/1511838

Simner, Julia. "Defining Synaesthesia." British Journal of Psychology, vol. 103, no. 1, Nov. 2011, pp. 1–15. doi:10.1348/000712610x528305

Smith, M., & Morra, J. (2007). The Prosthetic Impulse: From a Posthuman Present to a Biocultural Future. Cambridge: MIT Press.

Stetson, Chess, et al. "Motor-Sensory Recalibration Leads to an Illusory Reversal of Action and Sensation." Neuron, vol. 51, no. 5, 2006, pp. 651–659. doi:10.1016/j.neuron.2006.08.006

Stiles, William B., and Michael Barkham. "Practice-Based Evidence From the United Kingdom Using the CORE System." PsycEXTRA Dataset. doi:10.1037/e633922012-001

Tatur, Guillaume. "Scene Representation for Mobility of the Visually Impaired." Mobility of Visually Impaired People, 2017, pp. 283–310. doi:10.1007/978-3-319-54446-5_10

Thaler, Lore, et al. "Neural Correlates of Natural Human Echolocation in Early and Late Blind Echolocation Experts." PLoS ONE, vol. 6, no. 5, 2011. doi:10.1371/journal.pone.0020162

"Tongue Control Technology." Think-A-Move, Ltd., 2007. Web. 17 Dec. 2011.

Tonelli, Alessia, et al. "Depth Echolocation Learnt by Novice Sighted People." PLoS ONE, vol. 11, no. 6, 2016. doi:10.1371/journal.pone.0156654

Tsing, Anna. "Unruly Edges: Mushrooms as Companion Species." Environmental Humanities, vol. 1, no. 1, 2012, pp. 141–154. doi:10.1215/22011919-3610012

Warwick, Kevin, et al. "Prosthetics: A User Guide for Posthumans." The American Journal of Psychology, vol. 121, no. 1, Jan. 2008, p. 161. doi:10.2307/20445449

Wickens, Christopher D., et al. An Introduction to Human Factors Engineering: Pearson New International Edition. Pearson Prentice Hall, 2014.

Wiener, Norbert (1948). Cybernetics, or Communication and Control in the Animal and the Machine. Cambridge: MIT Press.
74

Appendix ii

Guiding Questions:

How can these modes of perception be adopted for popular applications?

Isn’t all technology assistive in some capacity, given its purpose as a tool?

Can ADA tools enhance people's understanding of what constitutes skill and ability?

Are ADA desirable tools? Can they create an alternative demand for prosthetics?

How necessary is it for ADA tools to be functional rather than vaporware?

What alternative perceptual methods are useful in computing?

Is there a way to achieve non-subtractive super-abilities with non-invasive technology?

Could the ability to better understand spatial information based on sonic proximity be helpful for more than aiding blindness?
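This question builds on the echo-haptic utility described earlier, where sonar distance data is linearly transposed into tactile pressure feedback. A minimal sketch of that cross-modal mapping is shown below; the function name and the sensor range limits are illustrative assumptions, not values from the device's firmware.

```python
def distance_to_pressure(distance_cm, d_min=2.0, d_max=200.0):
    """Linearly transpose a sonar distance reading into a
    normalized pressure level: closer objects yield a stronger jet.

    d_min and d_max are assumed limits of the sensor's usable range.
    """
    # Clamp the reading to the usable range before mapping.
    d = max(d_min, min(d_max, distance_cm))
    # Invert and normalize: d_min maps to 1.0, d_max maps to 0.0.
    return (d_max - d) / (d_max - d_min)

# An object at the near limit drives full pressure; one at the
# far limit produces none, so distance is felt as jet intensity.
print(distance_to_pressure(2.0))    # 1.0
print(distance_to_pressure(200.0))  # 0.0
```

The linear inversion mirrors how proximity is experienced: the nearer the reflecting surface, the stronger the stimulus, letting a wearer gauge distance through touch alone.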
