University of Nevada, Reno

Haptic Interface for Non-Visual Steering

A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science and Engineering

by

Burkay Sucu

Dr. Eelke Folmer/Thesis Advisor

May, 2013

THE GRADUATE SCHOOL

We recommend that the thesis prepared under our supervision by

BURKAY SUCU

entitled

Haptic Interface For Non-Visual Steering

be accepted in partial fulfillment of the requirements for the degree of

MASTER OF SCIENCE

Eelke Folmer, Ph. D., Advisor

Sergiu Dascalu, Ph. D., Committee Member

Cahit Evrensel, Ph. D., Graduate School Representative

Marsha H. Read, Ph. D., Dean, Graduate School

May, 2013


Abstract

Glare significantly diminishes visual perception and is a significant cause of traffic accidents. Existing haptic automotive interfaces typically indicate when and in which direction to steer, but they do not convey how much to steer, as a driver typically determines this using visual feedback. We present a novel haptic interface that relies on an intelligent vehicle positioning system to indicate when, in which direction, and how far to steer, so as to facilitate steering without any visual feedback. Our interface may improve driving safety when a driver is temporarily blinded, for example, due to glare or fog. Three user studies were performed: the first seeks to understand driving using visual feedback, the second evaluates two different haptic encoding mechanisms with no visual feedback present, and the third evaluates the supplemental effect of haptic feedback when used in conjunction with visual feedback. The studies show that this interface allows for blind steering through small curves and that it can improve a driver's lane-keeping ability when combined with visual feedback. We then adapted our interface to implement self-correction and conducted an additional user study with users who are blind, to identify whether it allows for steering with no visual feedback at all.

Acknowledgments

It is a great pleasure to acknowledge the support and help of the kind people around me; without them this thesis would have remained a dream. First of all, I would like to thank the friendliest advisor ever, Dr. Eelke Folmer, who is more like an older brother to me, as we talk about everything else more than we talk about work. I have always been very grateful to work with him, since he is always supportive, encouraging, and understanding. Without him, it would not have been possible to complete this work. Secondly, I want to express my deepest gratitude to Dr. Cahit Evrensel, not only because he accepted being on my committee, but also for everything he has done since we met. He is always available to help with anything, and hospitable. His work ethic is admirable, and he is a true role model for his students. I also need to thank another committee member, Dr. Sergiu Dascalu, for being very encouraging and having a very positive attitude all the time. Just a couple of weeks after we met, he asked me to take a road trip to Lake Tahoe. It was a pleasant welcome to Reno for me. I should thank my friends in our office, Vinitha, Miran, Ilias, Alex, and Cagri, who turn the working environment into somewhere fun. I would like to thank my friends Sehribani and Bugra. They are the ones who made me come to Reno in the first place. They are my sister and brother here, and they make Reno a much better place. I also want to thank my family and friends for their support; if I tried to list their names, the acknowledgments might get as long as the thesis itself. Finally and most importantly, I want to thank the three most important people in my life, my mother, my father, and my brother, who always believe in and support me, and I happily dedicate this work to them.

May, 2013

Contents

Abstract i

List of Tables v

List of Figures vi

1 Introduction 1
   1.1 Background and Related Work ...... 4
      1.1.1 Haptic Steering Interface ...... 4
      1.1.2 Haptic Racing Interface ...... 7

2 Design of a Haptic Steering Interface 10
   2.1 Setup of the Simulation Environment ...... 11
   2.2 Study 1: Understanding Driving ...... 17
      2.2.1 Participants ...... 17
      2.2.2 Instrumentation ...... 17
      2.2.3 Procedure ...... 18
      2.2.4 Results ...... 19
      2.2.5 Haptic Steering Interface ...... 21
   2.3 Study 2: Evaluating Haptic Encoding Schemes ...... 24
      2.3.1 Participants ...... 24
      2.3.2 Instrumentation ...... 24
      2.3.3 Procedure ...... 25
      2.3.4 Results ...... 26
   2.4 Study 3: Evaluate Multimodal Effect ...... 27
      2.4.1 Participants ...... 27
      2.4.2 Instrumentation & Procedure ...... 27
      2.4.3 Results ...... 27

3 A Racing Game with a Haptic Interface 30
   3.1 Setup for the Game ...... 30
   3.2 Developing a Dynamic Haptic Feedback System ...... 32
   3.3 User Study: Evaluating The Racing Game ...... 36
      3.3.1 Participants ...... 36
      3.3.2 Instrumentation ...... 36
      3.3.3 Procedure ...... 37
      3.3.4 Results ...... 38

4 Discussion and Future Work 40
   4.1 Haptic Steering Interface ...... 40
   4.2 Haptic Racing Interface ...... 43

5 Conclusion 45

Bibliography 46

List of Tables

2.1 Prelim. study results: average deviation (stdev) in meters ...... 20
2.2 Study 2 results: average deviation (stdev) in meters ...... 26
2.3 Study 1&3 results: average deviation (stdev) in meters ...... 28

3.1 User study results of sighted subjects: average deviation (stdev) in meters and total number of hits ...... 38
3.2 User study results of visually impaired subjects: average deviation (stdev) in meters and number of hits ...... 38

List of Figures

1.1 A driver blinded by the headlights of oncoming car ...... 2

2.1 Overall setup of the simulation environment ...... 11
2.2 Graph of the function of wheel position to digital values ...... 13
2.3 Radius of rotation circle calculation ...... 14
2.4 Error in position calculation ...... 15
2.5 Error function plotted using Desmos Graphing Calculator ...... 16
2.6 Screenshot of simulator we modified to analyze driving behavior using visual feedback ...... 18
2.7 Four different roads with curve angles: 180◦, 135◦, 90◦, 45◦ ...... 19
2.8 Average values for the right turns ...... 20
2.9 Bott's dots (left) and rumble strips (right) ...... 22
2.10 How the system works: steering cues are provided through a vibrotactor integrated in the left and right of the steering wheel. Drivers steer away from a cue felt in either hand, in order to find a dead-band window that indicates the target orientation of the wheel, which changes as the car drives through the curve ...... 23
2.11 Vibrotactors attached to the left and the right side of the steering wheel, with drivers placing their hands on top of them ...... 25
2.12 Correlation between steering time and index of difficulty for no magnitude, frequency magnitude, visual and haptic + visual feedback ...... 29

3.1 Top-down view of the racing game track ...... 31
3.2 Dynamic calculation of haptic feedback ...... 33
3.3 Dead-band window as a circular target ...... 35
3.4 Four-lap play with smallest standard deviation in Group A (clockwise) ...... 39
3.5 Four-lap play with smallest standard deviation in Group B (clockwise) ...... 39

4.1 Average steering wheel values for the 180◦ turn ...... 41

Chapter 1

Introduction

For this thesis, we developed two projects: (i) a haptic steering interface that can be used in real-life systems to increase driving safety when it is jeopardized by a temporary blindness arising from glare or fog, and (ii) a haptic racing interface for a blind-accessible racing game, which is novel in that it uses only haptic feedback to allow a player to drive a car around a race track successfully, without the need for any visual or audio feedback for steering. Glare, caused by sunlight or headlights, is a significant cause of vehicle accidents, as visibility is a basic requirement for safe driving [1]. Especially in winter, due to the lower elevation of the sun and the presence of snow and ice, there is a significant increase in traffic accidents due to glare. Depending on the type of exposure, it may take an eye between 1 and 7 seconds to adjust to glare [2, 3, 4]. As a person ages, the ability to focus and to recover from glare continues to diminish [5], and as a result nearly 40% of drivers involved in accidents due to glare are older than 45 years [1]. Though glare can occur in various contexts, one of the more dangerous situations is when a driver is steering through a curve, as there is a greater risk of being blinded due to the significant change in direction. Unlike driving on a straight road, steering through a curve requires continuous adjustment of the steering wheel, so a temporary blindness could have major consequences. In recent years there has been increasing interest in improving automotive safety using haptic interfaces. For example, lane keeping [6] and lane changing [7] systems are commercially available in which haptic cues warn the driver of impending danger. Other systems [8, 9, 10] aim to reduce overloaded modalities, with haptic feedback conveying high-level navigation instructions. Haptic feedback has some desirable properties over other modalities in that it is private and does not distract any passengers.

Figure 1.1: A driver blinded by the headlights of an oncoming car

Haptic feedback provided through the steering wheel allows for robust and efficient communication of rich tactile information [11], as the driver is always holding it. For steering, existing systems (see Section 1.1.1) have only explored the use of haptic feedback to indicate when to steer and in which direction, but not how far to steer, as the user typically evaluates this by looking at the curvature of the road. If a driver is temporarily blinded by glare, the information provided through haptic feedback in these systems is not sufficient for steering a vehicle through a curve. We present a haptic steering interface that relies on an intelligent vehicle positioning system in a computer simulation environment with 1 cm accuracy and a map of the road, to indicate not only when and in which direction to steer, but also how much to steer. This allows for steering without visual feedback in the contexts, such as steering through a curve at high speed, where being blinded is most dangerous.

The video game industry is growing continuously. In 2011, it reached more than $16 billion in revenue in the U.S. market, and this number goes up to $25 billion when combined with the revenue generated from gaming hardware sales, such as consoles, controllers, and other accessories [13, 14, 15]. In this growing industry, developers work hard on their game design, artwork, originality, replay value, story, etc., to attract a large share of the gaming community. However, there exists another audience which is usually neglected: most of these games are not playable by people with impairments or disabilities. Video game accessibility allows people with vision, hearing, speech, mobility, or cognitive impairments to play and enjoy a game. The current state of video game accessibility is not very promising, for various reasons. Developers may not be aware of these issues or, not being disabled themselves, may not know how to make a game more accessible. A cost-benefit analysis may suggest that improving a game's accessibility is not worth the effort, either because the revenue will not increase enough to compensate for the expense or because only a very small portion of the audience will make use of accessibility features [16]. Making a game accessible has social and financial benefits for both parties, players and developers [16]. Since video games are fun and many people love playing them, people with impairments feel more like part of a social group rather than outsiders when playing and enjoying these games with their friends. Moreover, developers can sell their products to a broader range of people by implementing accessibility features, and because of this positive attitude, they will most likely be advertised by the media for free. A couple of blind-accessible racing games existed some time ago, but unfortunately these games are either no longer available, no longer supported, or not compatible with today's mainstream hardware and operating systems, according to discussion boards [17, 18, 19]. Based on the information on their websites, these games convey audio cues to players to steer a car with a keyboard, mouse, or other input device. These control interfaces lack immersion, a key component of video games.
Utilizing the experience we gained while developing and evaluating our haptic steering interface, we decided to modify it into a haptic racing interface for a racing game, enabling people with visual impairments to enjoy a racing game in a more immersive way. Guiding a player with haptic instead of audio feedback has some other advantages: the player can listen to her favorite songs or have a conversation with friends without having to follow audio cues while playing the game.

1.1 Background and Related Work

1.1.1 Haptic Steering Interface

Within the Human-Computer Interaction community, a steering task is typically associated with using a pointing device to steer a mouse cursor within the confines of a 2D tunnel, for example, when navigating hierarchical menus. Human steering performance is modeled by the Steering Law [20]. Studies show that different modalities of error feedback affect the accuracy of steering but not movement time, with tactile feedback improving accuracy more than visual or audio feedback [21]. The problem addressed in this project also involves performing a steering task, with the difference that instead of a pointing device, a steering wheel is used that can turn left or right to control the direction in which a vehicle is moving. Another difference is that most steering tasks, such as navigating cascading menus, assume an overhead view, whereas driving a vehicle uses a first-person view. Bateman [22] has explored how different types of views affect driving performance in a simulator and found that driving with a first-person view yields significantly better performance than using an overhead view.

Haptic feedback is already available in certain car systems, for example, Ford's Lane Keeping System [6], or the Lane Change Assist systems used by Audi, Volkswagen, BMW, Porsche, and Mazda [7]. These systems use state-of-the-art technology to check whether the car stays in its lane, and warn the driver using haptic cues if the car crosses the lane line before the driver uses the turn signals or if changing lanes is not safe due to a car present in the blind spot. These systems build on nearly a decade of research in automotive user interfaces, which we summarize here, focusing specifically on steering.

Enriquez [23] was one of the first to implement a tactile display in a vehicle context. Different types of warnings, e.g., errors on various gauges, are conveyed through pulsations of varying frequencies on the driver's hands by embedding inflatable pads in the steering wheel. User studies indicate the usefulness of haptic feedback in case of excessive sensory load on the driver, show a significant decrease in response time, and demonstrate the feasibility of using frequency to convey different warnings. Van Erp et al. [24] implement a tactile display by embedding eight vibrotactors in two rows in the driver's seat. Turning left or right is indicated by activating the four vibrotactors under the driver's corresponding leg. The distance to a turn is conveyed using pulse length modulation. User studies show a significant increase in driving performance over using visual feedback alone, with a decrease in cognitive load and reaction time. Griffiths and Gillespie [25] developed a driving simulator where the steering wheel is both held by the driver and motorized for automatic control. The motion of the steering wheel is a response to the sum of the forces acting from the human grasp, from the automatic control motor, and from the steering linkage. Feeling the actions of the wheel, the driver can either comply with it or override it by applying more force. User studies show a significant increase in the user's lane-keeping ability while decreasing visual demand and reaction time. An autonomous vehicle developed for DARPA's Urban Challenge [26] was modified to allow a user who is blind to drive it [27], as part of the Blind Driver Challenge [28]. Audio provided through a headset is used for steering, e.g., sonification using frequency indicates the direction and magnitude of a turn.
A modified massage chair provides haptic feedback to indicate speeding up or slowing down using a series of vibrotactors. No user studies were conducted. Kern et al. [11] present a steering wheel with six integrated vibrotactors to convey navigation information to the driver. Spatial cues on the wheel indicate whether to turn left or right. User studies evaluated the effectiveness of supplemental directional information in different modalities (audio/tactile) and found that haptic feedback impedes driving performance, with no significant effect for the other modalities. They further explored using dynamic patterns to convey the steering direction, by sequentially activating vibrotactors in the direction the wheel needs to be turned. Qualitative results are reported, i.e., users preferred audio over the haptic interface. Hogema et al. [29] place a matrix of eight by eight vibrotactors in the driver's seat and conduct user studies in real-world driving conditions to see whether drivers can successfully distinguish the activated group of vibrotactors corresponding to the four cardinal and four ordinal directions. Participants are able to identify the right direction with 93.3% accuracy. Their study shows that the vibration arising from the non-smoothness of the road surface does not affect the success of the system, which performs consistently well on different types of road surfaces. The same tests also suggest that drivers can respond to unexpected feedback as well; the wrong localizations in such cases are not away from the right direction by more than one segment, i.e., 45 degrees. The "haptic steering wheel" [8] embeds 32 linear vibrotactors in a steering wheel, which allows for communicating information regardless of where drivers hold their hands, and further allows for displaying tactile illusions, such as sensory saltation. Spatial and temporal patterns (clockwise/counter-clockwise activation) are used to indicate whether to steer left or right, as well as to convey various types of alerts. Three different spatial encodings, three different stimulus times, and three different tactile illusions are evaluated. User studies evaluate the user's ability to successfully recognize these patterns and perform the corresponding actions, but not the effect on driving. The highest recognition rate is achieved for a 450 ms stimulus time, an overlapping spatial encoding, and the sensory saltation tactile illusion. Asif and Boll [9] propose a vibrotactile belt that conveys navigation information using eight vibrotactors, similar to the system explored by Hogema et al. [29]. Drivers wear this belt while driving; the location of a turn is indicated by the location of the active vibrotactor, and the distance through different temporal patterns. User studies show an increase in navigation performance but no change in cognitive workload. Kim et al. [10] present a haptic steering wheel with 20 vibrotactors. Turning directions are indicated using clockwise or counterclockwise activation of the vibrotactors. User studies evaluate multimodal feedback for younger and elderly drivers and find a significant improvement in performance when haptic feedback is provided. Most of these systems provide high-level navigation instructions, e.g., turn left or right, where the required steering task (how much to steer) is performed by the driver based on visual input. Only a single system [27] allows for non-visual steering, as it enables blind people to drive a vehicle.
This system relies on an autonomous vehicle capable of determining its position on the road using GPS and a map of the environment with drivable roads. How much to steer is indicated using audio provided through a headset. Because many states ban the use of headsets that cover both ears while driving [30], and audio provided through a speaker can be annoying for passengers to hear, we explore solutions that use haptic feedback to convey steering information.

1.1.2 Haptic Racing Interface

In most computer and video games, the primary stimulus is visual, and audio is used as a complementary second stimulus [31]. As a result, the most common way video games are made blind-accessible is by making audio the primary stimulus, since visual stimuli are not an option for the visually impaired. Communities or sites such as [32] or [33] list blind-accessible games for gamers with visual impairments. There are some existing blind-accessible racing games as well as games from other genres, and unsurprisingly they mostly rely on audio feedback. Audio Formula 1 [34] borrows from the Formula 1 2002 season; it models the tracks in 3D audio and also contains the teams and drivers of that season. It conveys information to the player with the help of different sounds, such as beeps or ticks, corresponding to the shortest path or the borders of the track. Lewis, the developer of the game, reports in an interview that qualitative results of tests with people between the ages of 8 and 38 indicate it was a fun game. Although it was a promising work containing many gaming elements, it is unfortunately no longer available. Mach 1 Car Racing [35] is designed after Pole Position, a classic racing arcade game that was highly popular back in the '80s. Mach 1 warns the driver about the direction of the turn by using audio cues. It makes use of stereo speakers to indicate the direction of the next turn, i.e., if the next turn is to the right, a sound such as a song is played in the right speaker. The game does not require text-to-speech software since it uses self-voicing, the rate of which is adjustable. are used for controlling the car.

1000 Miles [36] is a turn-by-turn, text-based racing game, where a player can hold up to 7 cards in her hand and pick and play a card when it is her turn. Attack cards such as red light, speed limit, out of gas, or accident are used to prevent the opponents from advancing on the race track. A player cannot move until the attack is overcome with the corresponding defense card; there are some immunity cards as well. To advance in the game, a player simply uses the distance cards. This is a fully functional online multiplayer game accessible to blind players through a screen reader. Although the theme is racing, there is no steering task in this game and its genre is closer to strategy. Top Speed 3 [37] is the latest version of the popular accessible racing game series Top Speed. It allows up to 7 more players (computer controlled or over the network) to play at the same time on the same track. This game conveys the information needed to play through audio cues as well as optional force feedback. It sometimes gets hard to hear the direction cues when they are suppressed by annotation feedback. Although haptic feedback is not used as the primary guidance for steering, but rather as an additional element for creating a more immersive environment in computer or video games, there are some research projects discussed in Section 1.1.1 which aim to enable blind people to drive a car independently with the help of haptic cues. Moreover, there are some other real-life projects related to the subject. Blind-accessible racing games through audio feedback come alive in a project named Blind Behind the Wheel, a car race for blind individuals [38]. Blind drivers are allowed to drive an actual race car on a 400-meter-long oval race track, following a NASCAR stock car racer co-pilot's instructions about steering and speed. Before starting, half an hour of training is given. The commands are simple single words, i.e., left, right, go, stop. Repeating a command emphasizes its strength, so if the co-pilot sitting in the passenger seat says "left left", the visually impaired driver simply turns a little more to the left. Based on the information on the project's website, Ryan Kucy, a legally blind and half-deaf person, reached 85 km/h. Another participant, Cory Martin, describes on his blog how exciting driving a race car was [39]. In five years, about 30 blind people have driven a race car with the assistance of a sighted person. This project shows that although vision is the primary stimulus sighted people use for steering, it is not a must if the needed information is conveyed to the driver in other forms. Google has developed a driverless car, which autonomously steers itself by sensing the environment, i.e., stop signs, and the surrounding traffic with the help of video cameras, a laser range finder, and radar sensors [40]. In an experiment, it successfully gave a blind Californian, Steve Mahan, a ride from his home to multiple destinations [41]. This project demonstrates that determining the action needed for steering can be achieved by a computer system. In theory, if this project were used to replace the co-pilot in [38], a visually impaired person could drive a car without another person's assistance.

Chapter 2

Design of a Haptic Steering Interface

Whereas most studies of haptic driving assistance have evaluated the ability of haptic feedback to reduce the driver's cognitive load (when provided in conjunction with visual feedback), we are specifically interested in performing a steering task using haptic feedback alone. In existing systems, haptic cues only indicate when to steer and in which direction, but not by how much, as the driver typically determines that using visual feedback, e.g., by looking at the curvature of the road. In particular contexts, for example, when the driver is temporarily blinded by glare or their view is limited by fog, knowing how much to steer is essential information for keeping the car in the lane. A steering task can be considered a form of choice response task, i.e., providing a particular input based on a certain stimulus. Various studies with subjects performing choice response tasks have shown that a stimulus represented in multiple modalities simultaneously can be detected at lower thresholds, faster, and more accurately than when the information is provided separately in each modality [42, 43]. Based on these studies, one can argue that a haptic driving assistance system that also indicates how far to steer, when used in conjunction with visual feedback, could allow a driver to follow the median of the lane more closely than using visual feedback alone. Before designing a haptic interface that indicates how far to steer, we conducted a study with the goal of better understanding how humans perform a steering task using visual feedback.

2.1 Setup of the Simulation Environment

The hardware used to develop the driving simulation is illustrated in Figure 2.1. A Logitech G27 Racing Wheel is connected to an iMac directly via a USB cable. The iMac and a PlayStation 3 console are networked over a Linksys router. Two PlayStation Move controllers are used for conveying haptic feedback; they communicate with the PS3 console via a Bluetooth connection and were used when expanding the system into a haptic interface.

Figure 2.1: Overall setup of the simulation environment

Considering its advantages, including easy implementation, fast prototyping, high performance, and ready-to-use libraries for the Logitech racing wheel and the PS3 Move controllers, the Microsoft XNA game development framework is used for all software development in this project. An open source driving simulator created in Microsoft XNA and playable only with a keyboard [44] was modified to suit our study.

The simulation runs at 60 frames per second, meaning the car's position and orientation are updated 60 times per second. The simulation's speed and rotation handling were modified to make it drivable with the racing wheel instead of the keyboard. In our tests, to make an objective comparison of different users' results, the speed of the car is fixed; otherwise, users driving at lower speeds would achieve better results, since they would have more time to correct mistakes. The gas pedal is used as a start button: when it is pressed once, the car starts moving immediately at the fixed speed. After events such as a crash or completing all the laps in one direction, the car stops and waits for the player to hit the gas pedal once again. How the fixed speed value is determined is discussed in the related sections.

The main modification to the simulation is integrating the steering wheel for turning the car. The wheel can be turned 5π/2 radians in each direction, so its physical range is [−5π/2, 5π/2] radians. However, needing to turn the wheel more than π/2 radians at a fairly high fixed speed would be impractical. Moreover, such a large turn would cause confusion in the proposed interface, because the vibrotactors would change sides, i.e., the left vibrotactor would end up closer to the right side of the driver. So, only the [−π/2, π/2] radian interval is used effectively in the simulation, and that interval is mapped to a three-decimal floating point number in the interval [−1.0, 1.0]. When the wheel is turned more than π/2 radians, −1.0 is generated if it is turned left and 1.0 if it is turned right. This mapping is shown in Figure 2.2. The maximum possible angle by which the front wheels of the car (which are always parallel to each other) can be turned, Φ, is predetermined to be π/12 radians in the simulation, which means the front tires can turn at most Φ radians in each direction. At any time, how much the front tires are turned is calculated by multiplying Φ with the value read from the driving wheel; that angle is shown as Θ in Figure 2.3, adapted from the explanations in [44]. Since the steering wheel never generates a value greater than 1 or less than −1, Θ lies in the interval [−Φ, Φ]. In the figure, C represents the center point around which the car travels in a circle as long as the front tires stay at that orientation.
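The mapping from the physical wheel angle to the normalized steering value, and from that value to the front-wheel angle Θ, can be summarized in a few lines. The sketch below is our own illustration in Python (the project itself was written in XNA); the function and variable names are ours.

    import math

    PHI = math.pi / 12               # maximum front-wheel angle Phi (radians), as set in the simulation
    EFFECTIVE_RANGE = math.pi / 2    # only [-pi/2, pi/2] of the wheel's travel is used

    def wheel_to_value(wheel_angle_rad):
        """Map the physical wheel angle (negative = left) to a value in [-1.0, 1.0],
        rounded to three decimals; angles beyond +/- pi/2 saturate at +/- 1.0."""
        clamped = max(-EFFECTIVE_RANGE, min(EFFECTIVE_RANGE, wheel_angle_rad))
        return round(clamped / EFFECTIVE_RANGE, 3)

    def front_wheel_angle(wheel_angle_rad):
        """Front-tire angle Theta = wheel value * Phi, so Theta always lies in [-Phi, Phi]."""
        return wheel_to_value(wheel_angle_rad) * PHI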

Figure 2.2: Graph of the function of wheel position to digital values

Note that when Θ is zero, C is at infinity and the radius r is infinite, so the car simply goes straight. When Θ is equal to Φ, its maximum possible value, r takes its minimum possible value and the car travels on the circumference of the smallest circle it can draw. Figure 2.3 illustrates how to calculate the radius r of the rotation circle using trigonometry and the dimensional properties of the car, namely the vertical length VD, the horizontal length HD, and the distance L between the car's center of mass O and the midpoint between the rear tires. L is not necessarily equal to VD/2. In the simulation environment, the values 2.66, 1.58, and 1.419 are assigned to VD, HD, and L, respectively. With these values, the minimum possible value for r is 10.8 meters. Once r is calculated, the new orientation and position of the car are easy to find. Because the speed is fixed, acceleration is always zero, so in the time t that has passed since the last frame the car travels a distance st, where s is the speed of the car. Since the length of an arc of π radians on a circle with radius r is πr, and the car advances a distance st on the circular path with radius r, the change in the car's rotation is equal to that arc's angle, st/r radians. That amount is added to or subtracted from the current rotation of the car depending on the direction in which the wheel is turned, left or right. Then the new position of the car is calculated by adding to the current position the vector obtained by multiplying st with the unit vector in the car's new heading direction.
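The per-frame update just described can be stated as a short sketch. This is an illustration in Python, not the simulator's XNA code; the turning radius r is assumed to have already been computed from Θ and the car's dimensions as in Figure 2.3, and the heading is expressed in standard mathematical convention (an assumption made for the sketch only).

    import math

    def update_car(x, y, heading, speed, dt, r, turning_left):
        """Advance the car by one frame using the simplified rotate-then-translate scheme."""
        dist = speed * dt                    # distance travelled this frame (st)
        if not math.isinf(r):                # wheel turned: rotate by the arc angle st/r first
            dtheta = dist / r
            heading += dtheta if turning_left else -dtheta
        # then advance a distance st along the new heading direction
        x += dist * math.cos(heading)
        y += dist * math.sin(heading)
        return x, y, heading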

Figure 2.3: Radius of rotation circle calculation

Repeating these calculations and updating the car accordingly 60 times per second, the transitions appear smooth to the human eye. Although in real life Θ sweeps through the range [−Φ, Φ] continuously, in our modified simulator it is assumed that Θ reaches the target angle immediately, as if the steering wheel were directly connected to the front wheels. The motivation behind this is reducing computational complexity. For the same reason, air and ground friction, which were not taken into consideration in the original simulator either, are also ignored. This simplified calculation of the new position and rotation values means that the car moves in very small straight segments instead of true curves, so some calculation error occurs. Figure 2.4 clarifies how the new rotation and position of the car are calculated and how the error in position emerges. Initially, the car is at point P, heading in the direction shown with the blue arrow. In every frame, Θ is updated by multiplying the value read from the steering wheel by Φ. Using Θ, first the radius is found to be r, and then the center of rotation is found to be point C. In the ideal case, the car would follow the dashed arc PQ, whose length is st, and reach point Q, heading west. However, the approach in the simulation first rotates the car at its initial position and then advances it in the new direction. So the car is rotated st/r radians counter-clockwise at point P and then moved a distance st, which means the car ends up at point R. There is no error in the rotation calculation other than floating-point rounding error, so the purple and green arrows are parallel to each other. The error in position can be found with the following steps:

Figure 2.4: Error in position calculation

(Because the green arrow is tangent to the circle with center C and radius r, and is parallel to PR, CQ is perpendicular to both the green arrow and PR.)

Θ = st/r
c + d = r,   a + b = st
a = st − r·sin Θ,   b = r·sin Θ
d = r·cos Θ,   c = r·(1 − cos Θ)
E = √(a² + c²)
E = st·√((1 − sin Θ/Θ)² + ((1 − cos Θ)/Θ)²)

Running at 60 fps, t is always 1/60 of a second, and with the maximum speed of the car set to 50 km/h, s is at most 50/3.6 m/s during the entire runtime of the simulation. The error function E is plotted using the Desmos Graphing Calculator [45], and the maximum possible error is found to be less than 4 cm, occurring when Θ is at its maximum or minimum possible value, ±Φ, as illustrated in Figure 2.5. The y-axis shows the error in meters.

Figure 2.5: Error function plotted using Desmos Graphing Calculator

These errors reach their maximum when Θ is at its maximum and r is at its minimum, i.e., when the arc of the same angle is most strongly curved. The maximum possible error is less than 0.04 m, which does not justify the additional computational complexity that would otherwise be required. These errors can be safely ignored without the risk of generating wrong data, since they are minimal and not significant. If the value of st/r were large, first calculating the new orientation and then adding a vector in that direction would cause major problems. Consider the most extreme case, where it equals π radians: a car initially facing north would first be turned to face south, and would then be translated to the south instead of to the west (or east, depending on the rotation direction). In such a case, the error would be √(4r² + (st)²). Figure 2.4 may be misleading in this respect, because the arc is drawn as π/4 radians to show the process clearly; the maximum possible value of st/r is actually 0.007π radians, for which the corresponding arc is almost straight.
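The error bound can also be checked numerically. The sketch below (Python, our own illustration) evaluates the closed-form expression for E derived above at the extreme value Θ = Φ, using the fixed frame time and the maximum speed; it gives roughly 0.03 m, consistent with the sub-4 cm bound quoted above.

    import math

    def position_error(theta, s=50 / 3.6, t=1 / 60):
        """E = st * sqrt((1 - sin(theta)/theta)^2 + ((1 - cos(theta))/theta)^2)."""
        st = s * t
        return st * math.sqrt((1 - math.sin(theta) / theta) ** 2
                              + ((1 - math.cos(theta)) / theta) ** 2)

    print(position_error(math.pi / 12))   # ~0.030 m, i.e. about 3 cm per frame at worst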

2.2 Study 1: Understanding Driving

Steering is required in various driving contexts, for example, parking, stopping at an intersection, and turning into another lane. However, in these situations the vehicle is moving at a low speed and, when blinded, the driver can simply stop the vehicle without jeopardizing the safety of other drivers. Steering at higher speeds typically happens when changing lanes or taking an exit, in which case the driver only has to steer for a very brief period (less than a few seconds). Steering for longer periods at a constant speed only occurs when there is curvature in the road, with the most extreme case being when a driver has to take a 270◦ loop ramp on a cloverleaf highway interchange. Haptic feedback would be most useful in this context, as drivers have a greater risk of being blinded due to the significant change in direction and a greater risk of an accident due to having to steer for a prolonged period. For this study, we therefore analyze driving behavior on a curved road.

2.2.1 Participants

We recruited eight computer science students (2 female, average age 27.3, SD=3.5) to participate in a user study. All subjects were right-handed and none reported any non-correctable impairments in perception or motor control. Subjects had an average of 5.5 years (SD=3.8) of driving experience.

2.2.2 Instrumentation

The modified open source driving simulator described in Section 2.1 is used in this study, without haptic feedback. We designed four different roads, where each road starts and ends with a 50-meter straight section connected by a circular curve with a radius of 50 meters. Using standard design dimensions recommended by the Federal Highway Administration, we designed a two-lane road with a lane width of 3.6 m [46] and used an average car width of 1.85 m. We used four different angles for the curves: 45◦, 90◦, 135◦ and 180◦ (see Figure 2.7).

Figure 2.6: Screenshot of the simulator we modified to analyze driving behavior using visual feedback.

The radius of the right curves' lane median is 48.2 m and the radius of the left curves' lane median is 51.8 m. Subjects drive each of these roads in both directions, to understand how the radius affects steering performance. For a curve radius of 50 m the Department of Transportation recommends a speed between 40 and 48 km/h [47], but since our simulator does not take friction into account, we set the speed of the car in the simulator to 50 km/h. These values were chosen so that approximately 3 to 12 seconds of steering input are required from the driver, which is long enough for the vehicle to leave the lane when the driver is temporarily blinded. Figure 2.6 shows a screenshot of the simulator used. For input, we used a simulator-grade racing wheel, the Logitech G27 Racing Wheel, with a 28 cm diameter (see Figure 2.11), attached to a desk. Sixty times per second, we log the position of the car and of the steering wheel, which ranges from −1.000 (left) to 1.000 (right) for a 180◦ turn of the wheel, with 0 being the neutral position.

2.2.3 Procedure

Participants were seated in front of the steering wheel facing the screen. Participants were instructed that they only needed to steer the vehicle to maintain its position in the lane while the vehicle drove through the curve. Participants were allowed to try a few roads for practice, to minimize any learning effects, and when they felt comfortable enough, the experiment would start.

Figure 2.7: Four different roads with curve angles: 180◦, 135◦, 90◦, 45◦

Participants would drive each of our four roads in both directions three times, in random order, for a total of 24 trials. The vehicle automatically starts driving at the beginning of each track and, upon completion of the track, is teleported to the beginning of the next track.

2.2.4 Results

For every trial, we calculate the standard deviation of the vehicle's position from the median of the lane. Since keeping the wheel in the neutral position is enough to keep the vehicle on the lane's median for the straight sections, these are excluded from our calculation, as they decrease the standard deviation significantly and do not involve any steering. Table 2.1 lists the results for each type of curve. A repeated measures ANOVA with a Greenhouse-Geisser correction found no statistically significant difference between left and right curves or between different curvatures (F(1.83, 12.78) = 1.51, p = .26).

Table 2.1: Prelim. study results: average deviation (stdev) in meters

CURVATURE   LEFT          RIGHT
45◦         .490 (.167)   .422 (.243)
90◦         .563 (.282)   .621 (.218)
135◦        .621 (.278)   .584 (.294)
180◦        .557 (.281)   .557 (.223)
All         .558 (.054)   .546 (.087)

The average deviation from the lane's median over all curves is .552 m (SD=.277). Figure 2.8 shows the steering wheel position averaged over all subjects for all right curves. The graphs for the different curvatures are the same because the radius is the same; they differ only in the amount of time that the wheel needs to be held in the maximum position (T_right ≈ 225). The graphs for the left curves are similar, with a maximum value of T_left ≈ −200. Our study confirms that there is a linear relationship between the radius (r) of the curve and the maximum value of the steering wheel (T). Each graph in Figure 2.8 can be approximated with a function t(φ), which depends on the position (φ) of the vehicle on the curve, the curve's radius (r) and curvature (θ), and the velocity (v) of the vehicle, and which yields the target position T′ of the steering wheel.

Figure 2.8: Average steering wheel values for the right turns
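For reference, t(φ) can be thought of as a simple piecewise profile: the wheel ramps up to the plateau value T when entering the curve, holds it while the car is in the curve, and ramps back to neutral when exiting. The sketch below (Python) is our own schematic approximation; the plateau value T comes from the measured graphs in Figure 2.8, while the ramp fraction used here is a placeholder parameter, not a value from the thesis.

    def target_wheel_position(phi, curve_angle, T, ramp_fraction=0.1):
        """Schematic t(phi): ramp up, hold T, ramp down, for phi in [0, curve_angle]."""
        ramp = ramp_fraction * curve_angle
        if phi < ramp:                       # entering the curve
            return T * phi / ramp
        if phi > curve_angle - ramp:         # leaving the curve
            return T * (curve_angle - phi) / ramp
        return T                             # holding the plateau through the curve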

2.2.5 Haptic Steering Interface

Existing haptic driving assistance solutions [8, 11, 10] are often overly complex in terms of the hardware used; for example, arrays of vibrotactors are required to provide sophisticated tactile illusions. Such solutions may be expensive to implement and more prone to breaking due to the large amount of hardware involved. It is debatable whether rich tactile effects, such as sensory saltation [8], are necessary for conveying the information that is required for steering. The design of our haptic interface is therefore largely motivated by identifying the smallest amount of haptic feedback required for steering. The information that needs to be conveyed to the driver using haptic feedback is the target position T′ of the steering wheel as the vehicle follows the curve. Haptic feedback can be provided using various technologies, including force feedback, skin stretch, or thermal feedback. Most commonly, haptic feedback is provided as vibrotactile feedback; for example, mobile devices typically feature a single rotary mass motor that provides on/off vibrotactile feedback at a fixed frequency. However, their latency limits the use of sophisticated drive signals [48]. For our steering interface we use vibrotactors due to their low cost and compact size. We designed a physical interface where vibrotactors are integrated into the steering wheel, which is motivated by the following reasons: haptic feedback provided through a belt [9] or seat [24, 27] may be impeded by the driver's clothes; receptors in a driver's hips and back are also not sensitive enough to distinguish complex stimuli [8]; the hands are most sensitive due to an abundance of tactile receptors in the fingertips [12]; and getting feedback from the steering wheel itself may be more intuitive, as it may allow the driver to control the wheel using an associated physical mapping [11]. Because a driver needs to steer either to the left or to the right, we integrate a vibrotactor into the left and the right side of the wheel. Our interface is inspired by how rumble strips or Bott's dots work (see Figure 2.9), which provide tactile feedback to drivers when they drift from their lane. When the current position of the steering wheel is to the left of the target position T′, the left vibrotactor will vibrate, and if it is to the right, the right vibrotactor will vibrate.

Figure 2.9: Bott's dots (left) and rumble strips (right)

The driver should keep turning the wheel as long as she feels a cue. The next subsection details what haptic cues can be provided. Upon feeling a cue on one side, the driver needs to steer away from the cue, e.g., a left cue requires the driver to turn the wheel to the right, until the haptic feedback stops. Figure 2.10 illustrates how our technique works for steering through a curve. Note that the driver feels no haptic feedback when the steering wheel is in the dead-band window, shown as a green rectangle, and she should hold that position until she feels a cue. No haptic feedback is felt when T′ is achieved, which makes sense, as prolonged exposure to vibrotactile feedback may temporarily decrease the haptic sensitivity of the driver's hands, which is undesirable. Study 1 showed that the wheel is turned at most 40◦, so this mapping avoids confusion about how to interpret the haptic cues; if the wheel needed to be turned more than 90◦, the left vibrotactor would end up closer to the right side of the driver, possibly leading to confusion. This interface is similar to the interface used by Hong et al. [27], which uses sonification to indicate steering information. As it is difficult to hold the wheel precisely at the target position T′, drivers will keep oscillating between steering left and right. Oscillation can be minimized by implementing a dead-band window of size w around the target position T′ in which no vibrotactile feedback is felt. For w, a tradeoff must be made between oscillation and reduced accuracy in being able to follow t(φ), with a risk of the vehicle possibly leaving the lane.

Figure 2.10: How the system works: steering cues are provided through a vibrotactor integrated in the left and right of the steering wheel. Drivers steer away from a cue felt in either hand, in order to find a dead-band window that indicates the target orientation of the wheel, which changes as the car drives through the curve
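The on/off cueing logic described above is simple enough to state in a few lines. The following sketch (Python, our own illustration of the scheme rather than the study software) decides which vibrotactor to activate from the current wheel position, the target position T′, and the dead-band width w:

    def haptic_cue(current, target, deadband):
        """Return which side to vibrate; the driver steers *away* from the felt cue.

        current, target: steering wheel positions on the same scale (e.g. -1.0 .. 1.0, left negative);
        deadband: width w of the window around the target in which no cue is given.
        """
        error = current - target
        if abs(error) <= deadband / 2:
            return None            # inside the dead-band window: no feedback
        if error < 0:
            return "left"          # wheel is left of the target: cue the left hand, steer right
        return "right"             # wheel is right of the target: cue the right hand, steer left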

Haptic Encoding Scheme: Our haptic steering interface is minimal not only in the number of vibrotactors required but also with regard to the haptic encoding scheme used, as we use a simple on/off cue with feedback provided at a frequency (275 Hz) to which human skin is most sensitive. A problem with this design is that the driver can only find the target orientation by steering, with a risk of steering too slowly, leading to understeering, or of missing the window and oversteering when steering too fast. This problem may be solved by using vibrotactors that allow for frequency modulation and using frequency to indicate how far to steer. The vibrotactors' perceivable range of frequencies can be linearly mapped to the interval [w/2, Tmax], since we know in advance what the maximum steering error can be for a given curvature using the available map information. Frequency modulation is better for conveying steering magnitude than pulse delay, since T′ is indicated by the absence of vibrotactile feedback, which would be difficult to sense if the pulse delay increased as the driver got closer to T′. A reverse encoding scheme (pulse increase) would require a transition from zero pulse delay to no feedback, which drivers may perceive as confusing. A related study with virtual object manipulation also finds better performance for amplitude than for pulse delay modulation [49]. Tactons [50] could be defined to indicate magnitude, but this requires drivers to memorize their meaning and requires discretization of the magnitude, whereas frequency modulation is more intuitive and allows for a linear mapping.

2.3 Study 2: Evaluating Haptic Encoding schemes

The second user study focuses on evaluating the effectiveness of the two proposed haptic steering encoding schemes, i.e., no magnitude (NM) modulation versus frequency magnitude (FM) modulation.

2.3.1 Participants

We recruited 12 computer science students (4 female, average age 28.6, SD=5.9) to participate in a user study. All subjects were right-handed and none reported any non-correctable impairments in perception or motor control. Subjects had on average 7.1 years (SD=6.2) of driving experience.

2.3.2 Instrumentation

We used the same simulator as in the first study, using it as a proxy for an intelligent vehicle positioning system with map information, as this suffices for evaluating the effectiveness of each haptic encoding scheme. For haptic feedback provision we used a commercially available wireless motion-sensing controller (Sony PlayStation Move), which has an integrated vibrotactor that allows for frequency modulation with a perceivable range of 91 to 275 Hz. Two controllers were used, each attached to the left or right front of the steering wheel using tape (see Figure 2.11). By having the drivers place their hands over the controllers, we maximize the surface area of the hand that touches each controller and achieve maximum sensitivity to vibrotactile feedback. The controller lights did not convey any information. A value for the dead-band width (w) was determined experimentally in preliminary trials, where a value of w = 50 minimized oscillations with an acceptable deviation from the lane's median. Because there is a small delay before each vibrotactor is activated, haptic cues are provided 380 ms ahead of when the user must steer in order to accommodate this. For indicating steering magnitude using frequency, we linearly map the perceivable range of the vibrotactor [91, 275 Hz] to [25, Tmax], with a higher frequency indicating a larger steering error.

Figure 2.11: Vibrotactors attached to the left and the right side of the steering wheel, with drivers placing their hands on top of them.

For this study, we adjust T′ as described by t(φ), but we do not adjust Tmax depending on the position of the car with respect to the median of the lane, i.e., there is no auto-correction. Preliminary experience with auto-correction found that it yields significant oscillations, which typically lead to the car leaving the road. Instead, we keep Tmax at the values found in our first study, and our study thus focuses on how well each haptic encoding scheme allows a driver to approximate t(φ).
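The frequency-magnitude encoding used by group B can be sketched in the same style as the on/off scheme. Under the constants reported above (dead-band w = 50, perceivable range 91 to 275 Hz, errors mapped from 25 up to Tmax on the steering-wheel scale), the mapping from steering error to vibration frequency is a linear interpolation; the sketch below (Python, our own illustration) assumes those values and clamps errors outside the mapped interval, a defensive choice that the text does not specify.

    F_MIN, F_MAX = 91.0, 275.0      # perceivable frequency range of the Move vibrotactor (Hz)

    def fm_frequency(current, target, t_max, deadband=50, err_min=25):
        """Map the absolute steering error onto [F_MIN, F_MAX]; None means no vibration."""
        error = abs(current - target)
        if error <= deadband / 2:
            return None                              # inside the dead-band window
        error = min(max(error, err_min), t_max)      # clamp to the mapped interval [25, Tmax]
        fraction = (error - err_min) / (t_max - err_min)
        return F_MIN + fraction * (F_MAX - F_MIN)    # larger error -> higher frequency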

2.3.3 Procedure

Participants were randomly assigned to two six-person groups (A, B), where group A used the simulator with no magnitude (NM) encoding and group B with frequency magnitude (FM) encoding. A between-subjects study design is justified to avoid interference effects, e.g., once participants have mastered one haptic encoding scheme it may confuse their ability to learn and use another. We used the same procedure as for the first study, except that subjects were unable to see a display. Because simulating a temporary blindness with varying durations at various points in the curve could lead to significantly different results, and would require a large number of studies to be performed, we limit our study to steering without any visual feedback. This provides us with a worst-case performance for the case in which a driver is completely blinded, but it allows for an accurate comparison between both haptic encoding mechanisms. Participants were instructed on how to interpret the haptic cues provided for each encoding scheme and were allowed to try a few roads for practice, to minimize any learning effects. When subjects felt comfortable enough, the experiment would start. Participants would drive each of our eight roads three times in random order. The vehicle would automatically start driving and, at the end of the second straight section, teleport to the next road.

Table 2.2: Study 2 results: average deviation (stdev) in meters

CURVATURE   NO MAGNITUDE    FREQUENCY MAGNITUDE
45◦         1.457 (.950)    2.073 (1.100)
90◦         2.778 (1.862)   4.036 (1.296)
135◦        3.433 (2.879)   6.719 (1.777)
180◦        2.972 (2.623)   8.007 (1.720)
All         2.660 (.847)    5.209 (2.667)

2.3.4 Results

Because our first study did not detect any significant differences between left and right turns, the data for both turn directions were combined for each curvature. Straight sections were excluded from the lane deviation calculation, as little deviation was observed there and they do not involve steering. For each set of data, a Grubbs' test was performed to detect significant outliers; three data points for FM were removed from the data collected for the 180◦ curves. Table 2.2 lists the results. We found an average error of 2.660 m for NM and 5.209 m for FM. A one-way MANOVA found a statistically significant difference between haptic encoding mechanisms (F(4, 7) = 5.226, p < .05, Wilks' λ = .251, partial ε² = .749). Pairwise comparisons using a Bonferroni correction revealed a statistically significant difference in lane deviation between the two encoding schemes for 180◦ curves (p < .05), but no statistically significant differences for the other curve angles.

2.4 Study 3: Evaluate Multimodal Effect

The second study shows significantly better performance for NM encoding. As multimodal feedback increases performance over unimodal feedback [42, 43], a third study investigates the effect of NM encoding on steering performance when used in conjunction with visual feedback.

2.4.1 Participants

We recruited 8 computer science students (2 female, average age 27.9, SD=4.4) to match the number of subjects in our initial study. Five participants had previously taken part in the second user study using NM encoding; none of the other subjects had participated in the second study. All subjects were right-handed and none reported any non-correctable impairments in perception or motor control. Subjects had an average of 6.2 years (SD=6.0) of driving experience.

2.4.2 Instrumentation & Procedure

We used the same setup and procedure as in the second study, with the difference that subjects would use a display. To minimize interference effects, the 3 subjects who had not participated in the second study received a significant amount of time to familiarize themselves with the haptic feedback and drove a number of roads without visual feedback.

2.4.3 Results

Table 2.3 lists the results of study 3 combined with the results of study 1. We found an average error of .552 m for visual feedback and .221 m for visual and haptic feedback. A one-way MANOVA found a statistically significant difference between the types of feedback (F(4, 11) = 3.710, p < .05, Wilks' λ = .426, partial ε² = .574). Pairwise comparisons using a Bonferroni correction revealed a statistically significant difference (p < .05) between visual and visual + haptic feedback for all angles.

Table 2.3: Study 1&3 results: average deviation (stdev) in meters

CURVATURE   VISUAL         VISUAL+HAPTIC
45◦         .456 (.184)    .207 (.089)
90◦         .592 (.216)    .232 (.102)
135◦        .603 (.239)    .215 (.100)
180◦        .557 (.215)    .231 (.088)
All         .552 (.067)    .221 (.012)

Modeling human movement. Using data collected from all studies, we analyze steering time (ST) as a function of the index of difficulty (ID) of the steering law [20], i.e.,

ST = a + b · ∫_C ds/W(s)    (2.1)

where ST is the average time to navigate the path, C is the path parameterized by s, W(s) is the width of the path at s, and a and b are experimentally fitted constants. Because the speed of the vehicle in the simulator and the width of the road are constant in our study, this function simplifies to:

ST = a + b · A    (2.2)

with the index of difficulty including only the length of the path (A) [51]. Analyzing driving time versus curve length (excluding the straight parts) yields the following values for R²: no magnitude (.9839), frequency magnitude (.9339), visual (.9965), and visual & haptic (.9999). Regression plots for each type of feedback can be found in Figure 2.12. Using the Fisher r-to-z transformation, a statistically significant difference was found between no magnitude and frequency magnitude (Z = −6.044, N = 144, p < .05) and between visual and visual & haptic feedback (Z = −16.503, N = 192, p < .05).
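The fit behind these R² values is an ordinary least-squares regression of steering time on curve length, per the simplified steering law above. A minimal sketch is shown below (Python, our own illustration); the per-trial steering times and curve lengths themselves are not reproduced here, so the function is shown without data.

    def fit_steering_law(lengths, times):
        """Least-squares fit of ST = a + b*A and the resulting R^2."""
        n = len(lengths)
        mean_a = sum(lengths) / n
        mean_t = sum(times) / n
        sxy = sum((x - mean_a) * (t - mean_t) for x, t in zip(lengths, times))
        sxx = sum((x - mean_a) ** 2 for x in lengths)
        b = sxy / sxx
        a = mean_t - b * mean_a
        ss_res = sum((t - (a + b * x)) ** 2 for x, t in zip(lengths, times))
        ss_tot = sum((t - mean_t) ** 2 for t in times)
        return a, b, 1 - ss_res / ss_tot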

Figure 2.12: Correlation between steering time and index of difficulty for no magnitude, frequency magnitude, visual and haptic + visual feedback.

Chapter 3

A Racing Game with a Haptic Interface

There exists a large variety of blind-accessible games from different genres, such as music/rhythm, RPG, arcade, racing, and even first-person shooters [18, 17, 52, 53, 54, 55, 56, 57, 58, 59]. These games mostly use audio cues to convey the needed information to the visually impaired player and enable her to play the game. We developed a blind-accessible racing game which is novel in that it is playable with the haptic steering interface proposed in Chapter 2.

3.1 Setup for the Game

The same hardware setup described in Section 2.1 is used for this game as well. However, the vibrotactors are unmounted from the steering wheel for several reasons: holding the wheel with the vibrotactors mounted does not feel natural and decreases the gamer's control over the wheel. Moreover, since this is a racing simulation that requires a lot of steering, and thus a large amount of continuous haptic feedback through the vibrotactors, the hands would become less sensitive after playing for a while due to prolonged exposure to tactile cues, and it would become easier to confuse which vibrotactor is vibrating. Because of this, the overall performance of a gamer would decrease, making it a less fun game. So, the vibrotactors are instead attached to the backs of the subjects' hands with fabric straps, and the player can use her palms solely for grabbing the wheel. On the software side, only the roads of the simulation are removed and a track is included in their place to turn the simulation into a game. Since the primary purpose of this project is measuring the proposed interface's effectiveness rather than developing a full game, the track is designed as a simple rounded-rectangle NASCAR track [60]. We tried to replicate the Bristol Motor Speedway track [61], as shown in Figure 3.1. However, we did not include any banking or speeding zones, and we used a fixed width for the whole track. Instead of elliptic arcs, we used circular arcs, avoiding the complex calculations of the length and angle of elliptic arcs. For convenience, the four different segments of the track are illustrated with four different colors in the figure; from this point forward, the name of a cardinal direction will be used to indicate the segment in that direction. The track needs just three parameters to be formed: the length of the straight sections L, the width of the road W, and the radius of the circular sections R. To make it as similar as possible to Bristol Motor Speedway, these three parameters, L, W, and R, are assigned the values 200 m, 12.2 m, and 75 m, respectively. The resulting track is 2(200 + 75π) = 871.23 m along its median, slightly longer than the 857.78 m long Bristol track.

Figure 3.1: Top-down view of the racing game track

The outer radius RO is equal to R + W/2, whereas the inner radius RI is equal to R − W/2. These are also important parameters, since the player's car is not allowed to go off the track; barriers are placed along both the inner and outer edges of the track for this purpose.
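As a quick sanity check of these values, the following snippet derives the median length and the two barrier radii from L, W, and R. It is only an illustrative Python calculation; the variable names are ours, not taken from the game's source.

```python
import math

L = 200.0   # length of each straight section (m)
W = 12.2    # road width (m)
R = 75.0    # radius of the circular sections, measured on the median (m)

median_length = 2 * (L + math.pi * R)   # two straights + two semicircles, about 871.24 m
outer_radius = R + W / 2                # R_O: barrier radius on the outside of the curves
inner_radius = R - W / 2                # R_I: barrier radius on the inside of the curves

print(f"median length: {median_length:.2f} m")   # slightly longer than Bristol's 857.78 m
print(f"R_O = {outer_radius:.1f} m, R_I = {inner_radius:.1f} m")
```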

3.2 Developing a Dynamic Haptic Feedback System

In the haptic steering interface discussed in Chapter 2, the amount of haptic feedback was calculated as a simple function of time, as shown in Figure 2.8. However, reacting too slowly or too quickly to the haptic feedback leads to understeering or oversteering, which in turn drives the car away from the median and sometimes off the road. Even when the car left the road, there were no corrective haptic cues to guide the driver back onto it. Since the car was teleported to the start position of the next road after every trial, this was not a problem in that project. In a racing game, however, where a player is asked to complete a number of laps on a track continuously without any visual feedback, this cannot be the right approach: the amount of haptic feedback is static and pre-determined, so errors would accumulate and the haptic cues would eventually become meaningless. We therefore developed a new method to overcome this problem. The new method senses the position and orientation of the car relative to the track and dynamically calculates the amount of haptic feedback needed to guide the driver along the median, correct her mistakes whenever she makes one, bring the car back onto the median, and ultimately reduce the standard deviation, i.e., improve overall performance.

Figure 3.2 illustrates how the amount of haptic feedback is calculated dynamically. In that figure, the position of the car is point C, and its orientation is the vector R. Point P is the projection of C on the median of the track. After finding P, the target point T is found by adding a distance D along the median. This calculation differs from the calculation of the car's new position explained in Section 2.1: when the car is on the curved part of the track, D is the length of the arc between P and T, not the straight-line distance between the two points. This also explains the motivation behind designing the track with circular rather than elliptic curves: finding point T on a circular arc is easy and reduces computational complexity. Then the angle α between R and TC is found. Using the parameters α and speed s, the desired steering wheel position that reduces α to zero is calculated by following the inverse of the procedure used to calculate the car's rotation, as shown in Figure 2.3. In that inverse procedure, the value of the time t plays a key role. To understand it, keep in mind that if the steering wheel is held in the position returned by this function and the speed is kept constant for t second(s), α will reach zero after t second(s). In other words, keeping the wheel in the position calculated by this function would make α zero after t second(s) if the simulation were running at 1/t frames per second. So t is not taken as the time elapsed since the last update; it is passed as a parameter to the function and tuned experimentally. This approach lets a driver avoid continuous oscillations of the steering wheel. Once the target position of the steering wheel is found, it is checked against the current position of the steering wheel, and the haptic feedback is provided accordingly. Also, because it is almost impossible to find the exact spot on the steering wheel (the sensitivity is π/2000 radian), a dead-band window, in which no haptic feedback is provided, is used as in the previous work to keep drivers from turning the wheel left and right continuously.
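The sketch below shows one way the geometry of Figure 3.2 can be computed for a car on one of the circular segments, assuming the arc is centred at the origin and driven counter-clockwise; the look-ahead distance D, its 2 m offset, and its 8.24 m floor are described in the paragraph following Figure 3.2. This is a simplified Python illustration with our own names, not the game's actual code.

```python
import math

def lookahead_distance(speed_mps, offset=2.0, car_front=1.24):
    """D = offset + speed * 1 s + 1.24 m, floored at 8.24 m (speeds below 5 m/s)."""
    return max(offset + speed_mps * 1.0 + car_front, 8.24)

def signed_angle(v_from, v_to):
    """Signed angle (radians) that rotates v_from onto v_to."""
    return math.atan2(v_from[0] * v_to[1] - v_from[1] * v_to[0],
                      v_from[0] * v_to[0] + v_from[1] * v_to[1])

def steering_error_on_arc(car_pos, heading, median_radius, d):
    """alpha: the angle between the car's heading and the line CT to the target
    point T, placed an arc length D further along the circular median."""
    # P: projection of the car position C onto the circular median.
    dist = math.hypot(car_pos[0], car_pos[1])
    p = (car_pos[0] * median_radius / dist, car_pos[1] * median_radius / dist)

    # T: advance an *arc length* D along the median, not a straight-line distance.
    phi = math.atan2(p[1], p[0]) + d / median_radius
    t = (median_radius * math.cos(phi), median_radius * math.sin(phi))

    # alpha between the heading vector and the car-to-target vector.
    to_target = (t[0] - car_pos[0], t[1] - car_pos[1])
    return signed_angle(heading, to_target)
```

On the straight segments the same idea applies, with P computed as an ordinary orthogonal projection onto the segment and T placed D metres further along it.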

Figure 3.2: Dynamic calculation of haptic feedback

The value of D is calculated using the formula offset + speed ∗ 1 sec + 1.24 m. Because the 1.24 m portion of it lies within the car (from its origin to its front edge), the distance between point T and the front bumper of the projected car is offset + speed ∗ 1 sec. Although the speed is fixed in the user tests, the game is designed to let drivers control the speed as well, so an offset is needed to make sure that the target position is still ahead of the car when the speed is very low at takeoff. These values were found experimentally, and the offset is set to 2 meters. When D is smaller, the driver cannot react in time, because rapid direction changes of the steering wheel would be needed. When it is larger, it increases the standard deviation by making the car reach the median at a more distant point. A larger value of D may even cause crashes while steering on the curved parts of the track, if the line connecting the car's position to the target position (CT in Figure 3.2) intersects the circumference of the inner circle. When the speed of the car is too low, experiments show poor results, so D is set to 8.24 m if the formula returns a smaller number, i.e., if the car is going slower than 5 m/s. In the real game, however, the speed is fixed to a high value, so that check is not needed.

We had two options for implementing the dead-band window. The first, the relative approach, checks the difference between the desired and current positions of the steering wheel and provides haptic feedback accordingly. This method is good for finding the right spot on the steering wheel, since the vibrotactors keep vibrating only while the wheel still needs to be turned further; this was the approach used in the project discussed in Chapter 2. The second option, the absolute approach, gives a left or right haptic cue whenever the wheel needs to be positioned to the left or right of the home position, without considering the current position of the steering wheel. We selected the second approach because it is an easier mapping and a driver would not get confused even after driving for a long time.

For successful gameplay, a player can follow different strategies. One of them is the "quick corrections" strategy: if a player turns the wheel π/2 radians toward the direction of the cue as soon as she feels it and turns it back within one second, she will mostly stay on the median, and if she somehow drifts off it, she will be led back to it. Experiments show very good results for this strategy. However, it is not necessary for successful gameplay, as some subjects achieved very good results using a "real-life steering" strategy during the experiments. As the first user study of the first project shows, keeping the wheel at a certain position Tmax through the curve keeps the car on the median. If a player can find that right spot, α will stay close to zero or mostly within the window and she will not feel any haptic cues; when she does, it signals the need for a small correction. So this interface is also applicable to real-life scenarios, without the risk of turning a vehicle over with sudden steering wheel turns as in the first strategy.

Figure 3.3: Dead-band window as a circular target

After some experiments, the dead-band window size was set to 20: if the absolute value of the desired steering wheel position is less than 20, no haptic feedback is felt. At the fixed speed used in the game, this means there is no feedback as long as α is less than 0.033π radian (6 degrees). The dead-band window can be thought of as a circular target around the target point T with a dynamically changing radius, as illustrated in Figure 3.3, in which two cases are pictured. When the car is at point C1, the player feels no haptic feedback if the car can hit the pink circle without changing its orientation. Similarly, when the car is at point C2, no haptic feedback is provided if the car can hit the blue circle without turning the steering wheel. If α is larger than β, the car cannot hit the target circle, so a left or right cue is provided accordingly. The radius increases as the car gets farther from the median and decreases as it gets closer; e.g., the blue circle's radius is bigger than the pink circle's. This is because T is found by adding the distance D not directly to the car's position, but to its projection on the median. Note that β itself is not fixed but depends on speed, so if the player is allowed to control the speed as well, β is updated accordingly; e.g., if the speed reaches 100 km/h, β becomes 0.066π radian. This dynamically updated window makes smooth driving possible.
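A minimal sketch of the cue logic described above is given below, showing both the relative check from Chapter 2 and the absolute check used in the game, together with the resulting speed dependence of β. The conversion from α to a desired wheel position is a deliberate simplification of the inverse car-rotation procedure (Figure 2.3), calibrated only so that the 20-unit window (wheel sensitivity π/2000 radian) corresponds to 0.033π at 50 km/h and 0.066π at 100 km/h, as quoted above; all names and the sign convention are ours.

```python
import math

DEAD_BAND = 20   # window size in wheel units

def desired_wheel_position(alpha, speed_kmh):
    """Wheel position (in wheel units) that nulls alpha within the tuned time t.
    Simplified linear mapping: 20 units correspond to alpha = 0.033*pi at 50 km/h,
    so the effective angular threshold beta grows linearly with speed."""
    units_per_radian = DEAD_BAND / (0.033 * math.pi) * (50.0 / speed_kmh)
    return alpha * units_per_radian          # sign convention: positive = steer right

def cue_absolute(alpha, speed_kmh):
    """Absolute approach used in the game: the cue depends only on where the wheel
    should be relative to the home position, not on its current position.
    (Under the Botts' dots theme, the vibrotactor on the side opposite to the
    returned steering direction is the one that fires.)"""
    target = desired_wheel_position(alpha, speed_kmh)
    if abs(target) < DEAD_BAND:
        return None                          # inside the dead band: no vibration
    return "right" if target > 0 else "left"

def cue_relative(alpha, speed_kmh, current_position):
    """Relative approach from Chapter 2: cue until the wheel reaches the target."""
    error = desired_wheel_position(alpha, speed_kmh) - current_position
    if abs(error) < DEAD_BAND:
        return None
    return "right" if error > 0 else "left"
```

In the actual game the desired wheel position follows the full inverse of the car-rotation model with the experimentally tuned time t; the linear mapping above only reproduces the thresholds quoted in the text.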

3.3 User Study: Evaluating The Racing Game

A user study is conducted to measure the performance of the proposed interface: a dynamically correcting, completely non-visual haptic steering interface.

3.3.1 Participants

This study is conducted with two groups of subjects. For Group A, we recruited 5 sighted computer science students (1 female, average age 26.2, SD=2.4). None reported any non-correctable impairments in perception or motor control. All subjects were right-handed, and they had an average of 5.6 years (SD=4.0) of driving experience. For Group B, we recruited 3 participants (1 female, average age 34.0, SD=2.2) through a local NFB chapter. All participants in Group B were visually impaired: 2 subjects were totally blind (TB) and 1 subject was legally blind (LB). All were right-handed and none reported any impairments in perception or motor control.

3.3.2 Instrumentation

As in the user studies of the system described in Chapter 2, two Sony Move controllers are used to convey the haptic feedback. The Botts' dots theme is kept: whenever the car starts drifting toward the left edge of the road, the left vibrotactor vibrates and a right turn is needed. The controllers' lights are initially set to red and blue to distinguish the unmounted left and right vibrotactors; they are not used to convey anything about the direction or amount of steering needed. Considering the poor performance of Frequency Modulation in the earlier studies, only the on/off mode is implemented. By combining the absolute positioning approach with a simple on/off encoding, we sacrifice the initial system's ability to convey how much to steer. The system still performs well, however, since making quick corrections is not dangerous in a game environment, and, as discussed in the previous section, it still allows finding the target position and keeping the steering wheel at that position through the curve by not supplying any feedback as long as the car follows the right path.

3.3.3 Procedure

The start line of the track is located in the middle of the north section. The car is positioned in the middle of the start line, facing East. Initially the car is stopped; as soon as the driver hits the gas pedal once, the speed is fixed to 50 km/h. The game consists of 8 laps: 4 clockwise and 4 counter-clockwise. After the 4 clockwise laps, the car is stopped and teleported to the start position again, this time facing West. When the car hits the barriers, it stops and is turned toward the target point calculated in the last frame before the crash. Whenever the car is stopped explicitly, i.e., just before starting each 4-lap half-game and after crashes, the driver is required to hit the gas pedal once more to get the car going again at the fixed speed of 50 km/h.

The game would last 502 seconds if the car never left the median ((872 m/lap ∗ 8 laps)/(50/3.6) m/s). However, this is almost impossible even when visual feedback is provided to the driver, and the game stops at least once, when the direction of the laps is changed; in practice it takes about 10 minutes on average to complete the game. To indicate whether the car is moving or waiting for the subject to hit the gas pedal after a crash or after completing all laps in one direction, an engine sound is used. It is a short stationary audio file played in a loop, so it does not convey any information about steering.

Subjects are seated in front of the steering wheel, and the pedals of the racing wheel kit are placed under their feet. One Sony Move controller is attached to the back of each hand using its fabric strap. The system and how to play the game are explained to the subjects before the game is run. Participants in Group A are allowed a short one-lap trial with visual feedback. After the explanations, subjects are allowed to play the game and are given immediate oral feedback about their performance. Once they feel comfortable with how to play the game, the test is started. During the test, the screen is turned black, so sighted subjects play the game without any visual feedback.

3.3.4 Results

Table 3.1 and Table 3.2 show the average performance results of subjects in Group A and Group B, respectively.

Table 3.1: User study results of sighted subjects: average deviation (stdev) in meters and total number of hits

                               St. Dev.       # of Hits
Clockwise          Curves      .928 (.339)    2
                   Straights   .866 (.135)    0
                   Combined    .906 (.245)    2
Counter-Clockwise  Curves      .916 (.292)    3
                   Straights   .996 (.177)    0
                   Combined    .962 (.212)    3
All                Curves      .924 (.309)    5
                   Straights   .934 (.153)    0
                   Combined    .935 (.225)    5

Table 3.2: User study results of visually impaired subjects: average deviation (stdev) in meters and number of hits

                               St. Dev.       # of Hits
Clockwise          Curves      .802 (.050)    0
                   Straights   .835 (.067)    1
                   Combined    .819 (.027)    1
Counter-Clockwise  Curves      .943 (.323)    2
                   Straights   1.045 (.112)   0
                   Combined    1.008 (.171)   2
All                Curves      .886 (.187)    2
                   Straights   .947 (.082)    1
                   Combined    .923 (.083)    3

In Group A, two subjects crashed twice, one subject crashed once, and two subjects did not crash. Three of the five crashes occurred in the first circular part of the first lap: two in clockwise laps (east segment) and one in a counter-clockwise lap (west segment). The other two occurred in the last counter-clockwise lap, one in each circular segment. This suggests that participants crash either just after starting the game or just before it ends; we think this is due to being surprised by the very first haptic cue and getting tired toward the end. In Group B, only one subject crashed during the clockwise laps, in the second lap in the south segment, which is straight. The other two subjects each crashed once in the east segment during the counter-clockwise laps, one in the first lap and the other in the last lap. No significant difference was found between clockwise and counter-clockwise performances in terms of standard deviation from the median of the track, nor between the performances of Group A and Group B.

Figure 3.4: Four-lap play with smallest standard deviation in Group A (clockwise)

Figure 3.5: Four-lap play with smallest standard deviation in Group B (clockwise)

Figure 3.4 and Figure 3.5 show the four-lap plays with the smallest standard deviations for Group A (.687 m) and Group B (.786 m), respectively. The four different colors correspond to the four laps in each direction, and the colored areas show the gap between the median of the track and the path the subject drove.

Chapter 4

Discussion and Future Work

4.1 Haptic Steering Interface

The results from our second study in Section 2.3.4 show that conveying how far to steer using frequency has a detrimental effect on steering performance compared to using NM encoding. We anticipated that FM would reduce oversteering, as it allows drivers to sense how far to steer, but it actually leads to understeering. An analysis of the average steering wheel values for all drivers and all curvatures (see Figure 4.1 for the 180◦ turn) reveals that with FM, drivers generally respond more slowly. When the controller vibrates at a lower frequency, drivers turn the steering wheel only slightly, with the result that T′ hovers around the edge of the window, generally leading to understeering and a significantly larger deviation from the lane's median. Using NM, drivers do oversteer, but this is corrected much faster and allows T′ to remain much closer to Tmax than with FM. It seems that a higher frequency alerts drivers more quickly than a lower frequency. Possibly a combination of both mechanisms, e.g., a high-frequency "wake up" followed by FM encoding, may increase performance.

The second study yields an average median lane deviation of 2.660 m (SD=.847) for NM and 5.209 m (SD=2.667) for FM over all curves. Blind steering can only be safely achieved for curves of 45◦, as NM has an average deviation of 1.457 m (SD=1.100), which is small enough for a car to stay in its lane on a two-lane road. To contextualize these results, we evaluated a driver's ability to steer using only haptic feedback, which is the most extreme case, in which a driver is blinded completely for the whole duration of driving through the curve.

Figure 4.1: Average steering wheel values for the 180◦ turn

Typically, glare from headlights takes the eye between 1 and 3 seconds to recover from [4], but recovering from sunlight (flash blindness) may take longer. In these specific contexts, as the driver is only temporarily blinded and some residual visual feedback may be available, we anticipate better performance for our interface than when no visual feedback is available at all. As simulating temporary blindness is subject to many variables, we are confident that studies 2 and 3 accurately identify an upper and a lower bound on the performance of our interface.

Our third study confirms that multimodal feedback yields significantly better performance than unimodal feedback [42, 43]. This result could be slightly misleading: using visual feedback only, a few subjects were observed to deviate slightly from the lane's median toward the apex of the curve, as this is faster. This was observed only for right turns and not for left turns, as the latter would involve veering into the other lane. These differences were not statistically significant. Using haptic feedback, drivers are forced to follow the lane's median more closely, which improves safety.

A practical consideration of using our interface is that it may not be very comfortable for drivers to be continuously exposed to haptic feedback, especially when multiple consecutive curves are involved, as this could numb the driver's hands and temporarily reduce their sensitivity to haptic feedback. Alternatively, an intelligent vehicle system could turn the haptic interface on automatically when it senses a greater chance of the driver being blinded, e.g., with a low sun, with icy roads, or at night due to headlights.

A number of problems need to be addressed before our interface can be implemented in a real vehicle. Our system relies on an intelligent vehicle positioning system for determining the car's position and the curvature of the road. A driving simulator was used as a proxy in our study, but using a real vehicle positioning system may pose significant constraints on our approach, e.g., GPS may be used to determine the car's position, but most existing systems use cameras to track markings on the road. Though road curvature is typically known ahead of time from map information, the car's exact position may only be known hundreds of milliseconds in advance, which may limit the ability to provide haptic cues slightly ahead of time to accommodate their startup delay. Providing haptic feedback through a steering wheel is efficient because drivers always hold it [11], but drivers may not always hold it with their hands in the recommended position [8]. A possible solution would be to use multiple vibrotactors and sense the position of the hands [8], but an easier solution would be a butterfly steering wheel, i.e., a wheel that can only be held in one particular way. Our system can be integrated into existing lane keeping systems [6, 7] so as to create a single system for improving driving safety.

Future work will focus on the following issues. The interface in this project does not implement auto correction, as preliminary studies found it to lead to significant oscillations; similar findings were reported by Hong [27] using audio. Problems with oscillation are challenging to solve, but a possible solution could be to dynamically adjust the window along the curve, e.g., the window size could increase while the user steers from 0 to Tmax and then decrease while the user holds T′ at Tmax, which could improve accuracy and decrease oscillations. Depending on the type of localization system used, there may be some constraints on implementing auto correction. The dynamic haptic feedback in the second project is calculated differently, as explained in Section 3.2, but a changing window size comes as a side effect of it. Our interface currently only addresses steering, but maintaining and adjusting speed are just as important for driving a vehicle; we plan to investigate low-cost tactile solutions for conveying the desired driving speed. On/off encoding without visual feedback has an average lane deviation of 2.660 m, which is too large for a car to stay within a single lane, but small enough to drive a car on a racetrack, which is generally much wider; this is also the motivation for our second project. We further plan to investigate the usefulness of our interface with regard to the blind driver challenge [28] by conducting user studies with blind users.

4.2 Haptic Racing Interface

After the first project was implemented and its user studies were conducted, the number one item of future work was integrating auto correction into the system and investigating how much it could improve steering performance. We planned to achieve this by adjusting the window size depending on other parameters, such as the distance of the car from the median; we initially implemented a window whose size was dynamically updated based on that distance, but our experiments gave superior results for the approach explained in Chapter 3.

The user study gives an average standard deviation of 0.935 m for Group A and 0.923 m for Group B, with 1 crash per subject in each group. The study shows that our proposed interface, when empowered with a dynamically correcting algorithm, can achieve very good results. The calculated standard deviations from the median and the average number of crashes per subject also indicate that people with visual impairments can perform as well as, or better than, sighted people, which suggests that visual feedback is not a prerequisite for learning how to use our interface. The standard deviation in the circular segments of this study can be compared to the 180◦ turn results of the preliminary user study, in which only visual feedback was provided to the driver. Comparing the standard deviations of right and left 180◦ turns shown in Table 2.1 with the clockwise and counter-clockwise circular segment standard deviations in Table 3.1 and Table 3.2, respectively, it can be said that haptic feedback is not as good as visual feedback: with visual feedback, the average deviation from the median is .557 m, and with only auto-correcting haptic feedback it goes up to .924 m for Group A and .886 m for Group B. However, one should keep in mind that in the racing game user studies a subject steers the car for four full laps continuously without being teleported to a better position, which leads to an accumulation of errors. The achieved results are therefore valuable and promising.

As future work, a more challenging racing game containing curves with arbitrary radii will be designed. By adding an option to let the driver control the speed as well, a more engaging version of the game can be released for visually impaired people. To make it even better, a multiplayer version of this game with bots or other networked players can be designed by integrating obstacle avoidance algorithms. Since this game is a simulation in a computer environment, we can precisely calculate the position and orientation of the car; i.e., our system still relies on an intelligent vehicle positioning system that can sense its environment. The Google driverless car [41] shows that this is also possible in real life using video cameras, laser range finders, and radar sensors. However, a middle layer for conveying that sensed information to the driver through haptic cues is needed before this interface can be implemented in a real car.

Chapter 5

Conclusion

Temporary blindness caused by glare or fog while driving can lead to serious traffic accidents. We present a novel haptic interface that improves over existing haptic steering systems by conveying not only when and in which direction to steer, but also how much, which facilitates steering with limited or no visual feedback. Steering cues are provided through a vibrotactor integrated in the left and right sides of the steering wheel, similar to how rumble strips work. Drivers steer away from a cue felt in either hand in order to find a dead-band window that indicates the target orientation of the wheel, which is adjusted as the car drives through the curve. User studies evaluate two different haptic encoding mechanisms and assess the effect of haptic feedback on steering performance when used in conjunction with visual feedback. Results show that our steering interface allows for blind steering through small (45◦) curves and that it improves a driver's lane keeping ability when used in conjunction with visual feedback.

The successful results of the racing game user studies show that a blind or blindfolded player can be guided through a racing track with only haptic cues. This study also indicates that a player can be guided to steer the car along any path, as long as the path does not require the car to turn more tightly than it can, i.e., the path does not include any curvature with a smaller radius than r in Figure 2.3. Such paths can be used to design new and more challenging tracks. Moreover, obstacle avoidance can be integrated into this racing game by introducing dynamic sensing of the environment, so that a multiplayer mode of the game can be created. This line of work could eventually be extended toward real-life racing vehicles controlled entirely through haptic feedback by drivers who are blind.

Bibliography

[1] Choi, E.-H., and Singh, S. Statistical assessment of the glare issue – human and natural elements. In proceedings of the Federal Committee on Statistical Methodology Workshop (2005).

[2] Schieber, F. Age and glare recovery time for low-contrast stimuli. proceedings of the Human Factors and Ergonomics Society Annual Meeting 38, 9 (1994), 496–499.

[3] Xiong, K., Xiang, Z., and Ge, J. Evaluation of the human eye glare after strong exposure. In Proceedings of the 2008 International Conference on BioMedical Engineering and Informatics, Vol. 1, IEEE Computer Society (2008), 660–663.

[4] Lighting Research Center, recovery time of visual acuity after exposure to a glare source, http://www.lrc.rpi.edu/resources/newsroom/pdf/2004/readaptation.pdf, access date: 10-17-2012.

[5] Sanford, L. Why lighting is important for the aging eye, http://www.lighthouse.org/eye-health/the-basics-of-the-eye/the-aging-eye/lighting/. In Lighthouse International's Aging & Vision newsletter, 1999.

[6] Ford lane keeping system, http://media.ford.com/article_display.cfm?article_id=35776, access date: 5-6-2012.

[7] Smartmicro's lane change assist system, http://www.smartmicro.de/index.php?option=com_content&view=article&id=59&Itemid=67, access date: 7-4-2012.

[8] Hwang, S., and Ryu, J.-H. The haptic steering wheel: Vibro-tactile based navigation for the driving environment. In Proceedings of PERCOM'10, IEEE Computer Society (2010), 660–665.

[9] Asif, A., and Boll, S. Where to turn my car?: comparison of a tactile display and a conventional car navigation system under high load condition. In proceedings of the 2nd international conference on automotive user interfaces and interactive vehicular applications (automotiveUI ’10), ACM (2010), 64–71.

[10] Kim, S., Hong, J.-H., Li, K., Forlizzi, J., and Dey, A. Route guidance modality for elder driver navigation. In Proceedings of the 10th international conference on Pervasive Computing (Pervasive’12), Springer Berlin / Heidelberg, (2012), 179–196.

[11] Kern, D., Marshall, P., Hornecker, E., Rogers, Y., and Schmidt, A. Enhancing navigation information with tactile output embedded into the steering wheel. In Proceedings of the 7th International Conference on Pervasive Computing (Pervasive '09), Springer Berlin / Heidelberg (2009), 42–58.

[12] Bach-y-Rita, P., and Kercel, S. W. Sensory substitution and the human-machine interface. Trends in Cognitive Sciences 7, 12 (2003), 541–546.

[13] 2011 total consumer spend on all games content in the U.S. estimated between $16.3 to $16.6 billion, https://www.npd.com/wps/portal/npd/us/news/press-releases/pr_120116/, access date: 04-01-2013.

[14] The entertainment software association http://www.theesa.com/, access date: 04-01-2013.

[15] Siwek, S. E. Video games in the 21st century: the 2010 report, http://www.theesa.com/facts/pdfs/VideoGames21stCentury_2010.pdf, access date: 04-01-2013.

[16] Making video games accessible: business justifications and design considerations, http://msdn.microsoft.com/en-us/library/windows/desktop/ee415219%28v=vs.85%29.aspx, access date: 04-01-2013.

[17] Rail racer http://www.blindadrenaline.com/railRacer/, access date: 03-17-2013.

[18] Drive, a racing game for the blind, featuring only sound and no visuals, http://audiogames.net/drive/, access date: 03-17-2013.

[19] Audiogames.net forum, http://forum.audiogames.net/, access date: 03-17-2013.

[20] Accot, J., and Zhai, S. Beyond Fitts' law: models for trajectory-based HCI tasks. In Proceedings of the ACM SIGCHI Conference on Human Factors in Computing Systems (CHI '97), ACM (1997), 295–302.

[21] Sun, M., Ren, X., and Cao, X. Effects of multimodal error feedback on human performance in steering tasks. Information Processing Society of Japan 51, 12 (2010), 284–292.

[22] Bateman, S., Doucette, A., Xiao, R., Gutwin, C., Mandryk, R. L., and Cockburn, A. Effects of view, input device, and track width on video game driving. In Proceedings of Graphics Interface 2011 (GI '11) (2011), 207–214.

[23] Enriquez, M., Afonin, O., Yager, B., and Maclean, K. A pneumatic tactile alerting system for the driving environment. In Proceedings of the 2001 Workshop on Perceptive User Interfaces (PUI '01), ACM (2001), 1–7.

[24] van Erp, J. B., and van Veen, H. A. Vibro-tactile information presentation in automo- biles. In proceedings of Eurohaptics’01 University of Birmingham (2001), 99–104.

[25] Griffiths, P., and Gillespie, R. Shared control between human and machine: haptic display of automation during manual control of vehicle heading. In Proceedings of the 12th international conference on Haptic interfaces for virtual environment and teleoperator systems (HAPTICS’04) (2004), 358 – 366.

[26] DARPA Urban Challenge, http://archive.darpa.mil/grandchallenge/, access date: 10-10-2011.

[27] Hong, D., Kimmel, S., Boehling, R., Camoriano, N., Cardwell, W., Jannaman, G., Purcell, A., Ross, D., and Russel, E. Development of a semi-autonomous vehicle operable by the visually-impaired. In Proceedings of Multisensor Fusion and Integration for Intelligent Systems (MFI 2008), IEEE Computer Society (2008), 539–544.

[28] The blind driver challenge, http://www.blinddriverchallenge.org/, access date: 12-1-2011.

[29] Hogema, J. H., De Vries, S. C., van Erp, J. B., and Kiefer, R. J. A tactile seat for direction coding in car driving: field evaluation. IEEE Transactions on Haptics 2, 4 (October 2009), 181–188.

[30] Department of motor vehicle vc section 27400 wearing of headsets or earplugs, http://www.dmv.ca.gov/pubs/vctop/d12/vc27400.htm, access date: 10-10-2012.

[31] Yuan, B., Folmer, E., and Harris, F. C. Game accessibility: a survey. Universal Access in the Information Society 10, 1 (2010), 88–111.

[32] AudioGames, your resource for audiogames, games for the blind, games for the visually impaired, http://www.audiogames.net/, access date: 03-17-2013.

[33] Fantastic accessible games and where to find them, www.pcsgames.net/game-co.htm, access date: 03-17-2013.

[34] Christopher Lewis, The Formula 1 Audio Racing Game, http://audiogames.net/pics/upload/chrislewisinterview01.htm, access date: 03-17-2013.

[35] Jim Kitchen, Kitchen's Inc, Mach 1 Car Racing, http://www.kitchensinc.net/, access date: 03-17-2013.

[36] 1000 miles, http://www.qcsalon.net/en/1000miles, access date: 03-17-2013.

[37] Top Speed 3 http://www.playinginthedark.net/topspeed3_e.php, access date: 03-17-2013.

[38] Blind Behind the Wheel, a car race for blind individuals, http://www.edmontonblindrace.com/, access date: 04-04-2013.

[39] Cory Martin, Can you believe that as a blind person I actually raced a car?, http://www.nerdball.net/2012/08/26/blind-behind-the-wheel-of-a-racecar-2/, access date: 04-13-2013.

[40] Sebastian Thrun, What we're driving at, http://googleblog.blogspot.com/2010/10/what-were-driving-at.html, access date: 01-20-2013.

[41] Google's self-driving car takes blind man for a ride, http://www.pcmag.com/article2/0,2817,2402340,00.asp, access date: 03-29-2013.

[42] Miller, J. Divided attention: evidence for coactivation with redundant signals. Cognitive Psychology 14, 2 (1982), 247–279.

[43] Hershenson, M. Reaction time as a measure of intersensory facilitation. Journal of Experimental Psychology 63 (1962), 289–293.

[44] Amit Dey, Driving simulation in XNA, http://www.codeproject.com/Articles/29323/Driving-Simulation-in-XNA, access date: 4-6-2012.

[45] Desmos - Graph functions, plot tables of data, evaluate equations, explore transformations, and more, https://www.desmos.com/, access date: 03-17-2013.

[46] Federal Highway Administration, recommended lane width, http://safety.fhwa.dot.gov/geometric/pubs/mitigationstrategies/chapter3/3_lanewidth.htm, access date: 9-12-2012.

[47] Department of Transportation, roadway design manual, http://www.state.nj.us/transportation/eng/documents/RDM/sec4.shtm, access date: 8-8-2012.

[48] Morelli, T., and Folmer, E. Twuist: A discrete tactile-proprioceptive display for eye and ear free output on mobile devices. In proceedings of Haptics Symposium ’12, IEEE Computer Society (2012), 443–450.

[49] Stepp, C. E., and Matsuoka, Y. Vibrotactile sensory substitution for object manipulation: amplitude versus pulse train frequency modulation. IEEE Transactions on Neural Systems and Rehabilitation Engineering 20, 1 (2012), 31–37.

[50] Brewster, S., and Brown, L. M. Tactons: structured tactile messages for non-visual information display. In Proceedings of the Fifth Conference on Australasian User Interface, Vol. 28 (AUIC '04), Australian Computer Society (2004), 15–23.

[51] Zhai, S., and Woltjer, R. Human movement performance in relation to path constraint - the law of steering in locomotion. In Proceedings of the IEEE Virtual Reality 2003 (VR ’03), IEEE Computer Society (2003), 149–156.

[52] Yuan, B., and Folmer, E. Blind hero: enabling guitar hero for the visually impaired. In Proceedings of the 10th International ACM SIGACCESS Conference on Computers and Accessibility (Assets '08), ACM (2008), 169–176.

[53] Zork, 80's style text adventure game accessible to the blind, http://www.infocom-if.org/downloads/downloads.html, access date: 03-17-2013.

[54] Shades of Doom, a revolutionary Windows-based game for the visually impaired. http://www.gmagames.com/sod.html, access date: 03-17-2013.

[55] Troopanum 2, a fast action packed arcade game, http://www.blindsoftware.com/SoftwareBlindDetail.aspx?id=22&swreg=1967troop2, access date: 03-17-2013.

[56] Accessible Battleship SV, http://www.gamesforblind.com/webpages/sv/BattleshipSVDemo.html, access date: 03-17-2013.

[57] Accessible BlackJack SV, http://www.gamesforblind.com/webpages/sv/BlackJackSVDemo.html, access date: 03-17-2013.

[58] Rhythm Heaven Fever for Wii, http://rhythmheavenfever.nintendo.com/, access date: 03-17-2013.

[59] Luigi's Mansion: Dark Moon for Nintendo 3DS, http://luigismansion.nintendo.com/, access date: 03-17-2013.

[60] NASCAR, http://www.nascar.com/, access date: 03-17-2013.

[61] Bristol Motor Speedway, http://www.bristolmotorspeedway.com/, access date: 03-17-2013.