Spatial Tactile Feedback Support for Mobile Touch-Screen Devices
Spatial Tactile Feedback Support for Mobile Touch-screen Devices

by

Koji Yatani

A thesis submitted in conformity with the requirements for the degree of Doctor of Philosophy, Department of Computer Science, University of Toronto.

Copyright © 2011 by Koji Yatani

Abstract

Spatial Tactile Feedback Support for Mobile Touch-screen Devices
Koji Yatani
Doctor of Philosophy
Graduate Department of Computer Science
University of Toronto
2011

Mobile touch-screen devices can accept flexible touch input and can provide a larger screen than mobile devices with physical buttons. However, many of the user interfaces found on mobile touch-screen devices require visual feedback, which raises a number of user interface challenges. For instance, visually demanding user interfaces make it difficult for the user to interact with mobile touch-screen devices without looking at the screen, something the user sometimes wishes to do, particularly in a mobile setting. In addition, user interfaces on mobile touch-screen devices are generally not accessible to visually impaired users.

Basic tactile feedback (e.g., feedback produced by a single vibration source) can be used to enhance the user experience on mobile touch-screen devices. Unfortunately, this basic tactile feedback often lacks the expressiveness to generate vibration patterns that convey specific information about the application to the user. Richer information accessible through the tactile channel, however, would reduce the visual demand of an application. For example, if the user could perceive which button she is touching on the screen through tactile feedback, she would not need to view the screen, and could instead focus her visual attention on the primary task (e.g., walking).

In this dissertation, I address the high visual demand of existing user interfaces on mobile touch-screen devices by using spatial tactile feedback.
Spatial tactile feedback means tactile feedback patterns generated at different points on the user's body (the user's fingers and palm in this work). I developed tactile feedback hardware employing multiple vibration motors on the backside of a mobile touch-screen device. These vibration motors can produce various spatial vibration patterns on the user's fingers and palm. I then validated the effects of spatial tactile feedback through three different applications: eyes-free interaction, a map application for visually impaired users, and collaboration support. Findings gained through this series of application-oriented investigations indicate that spatial tactile feedback is a beneficial output modality on mobile touch-screen devices, and can mitigate some visual demand issues.

Acknowledgements

I would like to extend my utmost thanks to my advisor, Khai N. Truong, for his guidance and support over the past five years. This dissertation would not have been possible without his deep involvement and encouragement. I also would like to thank my thesis committee, Ravin Balakrishnan, Mark Chignell, and Daniel Wigdor, and the external examiner, Stephen Brewster. Their feedback was simply invaluable.

This dissertation benefited greatly from the support of other researchers. In particular, I would like to acknowledge Darren Gergle's support in designing the experiment and performing the analyses presented in Chapter 6. I also would like to thank Nikola Banovic, who helped me design the experiment and perform the analyses of the results described in Chapter 5. He also shared with me his deep knowledge of interfaces for visually impaired users, which was very helpful.

Besides the dissertation work, I have been very fortunate to work with many bright professors, researchers, and students. I learned much from all of them, and they have always been a source of inspiration to me.
Special thanks to Hrvoje Benko, Marshall Bern, Bill Buxton, Eunyoung Chung, Carlos Jensen, Nicole Coddington, David Dearman, Richard Guy, Ken Hinckley, Steve Hodges, Elaine M. Huang, Julie A. Kientz, Victor Kuechler, Frank Li, Mark W. Newman, Michael Novati, Michel Pahud, Kurt Partridge, Shwetak N. Patel, Jenny Rodenhouse, Jeremy Scott, Andrew Trusty, Nicolas Villar, and Andy Wilson.

I want to acknowledge the faculty members, postdocs, visitors, and students I had the pleasure of meeting in school, including Anand Agarawala, Seok-Hyung Bae, Xiaojun Bi, Nilton Bila, Simon Breslav, Xiang Cao, Fanny Chevalier, Gerry Chu, Mike Daum, Pierre Dragicevic, Carrie Demmans Epp, Dustin Freeman, Clifton Forlines, Tovi Grossman, John Hancock, Sam Hasinoff, Aaron Hertzmann, Justin Ho, Akitoshi Kawamura, Alex Kolliopoulos, Martin de Lasa, Shahzad Malik, Mike Massimi, Karyn Moffatt, Igor Mordatch, Nigel Morris, Tomer Moschovich, Peter O'Donovan, Matthew O'Toole, Gonzalo Ramos, Abhishek Ranjan, Mike Reimer, Alyssa Rosenzweig, Ryan Schmidt, Eron Steger, Huixian Tang, Daniel Vogel, Jack Wang, Mike Wu, and Shengdong Zhao. I also thank the DAG members: Abhishek, Alex, Eron, Jack, Jean-Nicolas McGee, and Jenny Wang. It has always been a very special place to me.

Finally, I am deeply indebted to my parents, Kenichi Yatani and Masae Yatani. I would not have been able to complete my degree without their full support and confidence in me over the years.

Copyright Notice and Disclaimer

Sections of this document have appeared in a previous publication or had been conditionally accepted at a conference at the time of writing. In all cases, permission has been granted by the publisher for the work to appear here. Below, the publisher's copyright notice and disclaimer are given, with the thesis chapter and corresponding publication noted.

Association for Computing Machinery

Copyright © 2011 by the Association for Computing Machinery, Inc. (ACM).
Permission to make digital or hard copies of portions of this work for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page in print or the first screen in digital media. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Send written requests for republication to ACM Publications, Copyright & Permissions at the address above or fax +1 (212) 869-0481 or email [email protected].

Portions of Chapter 4

Koji Yatani and Khai N. Truong. 2009. SemFeel: a user interface with semantic tactile feedback for mobile touch-screen devices. In Proceedings of the 22nd Annual ACM Symposium on User Interface Software and Technology (UIST 2009), ACM, New York, NY, USA, 111-120. DOI=10.1145/1622176.1622198

Portions of Chapter 6

Koji Yatani, Darren Gergle, and Khai N. Truong. 2012. Investigating effects of visual and tactile feedback on spatial coordination in collaborative handheld systems. To appear in the Proceedings of the ACM Conference on Computer Supported Cooperative Work (CSCW 2012), ACM, New York, NY, USA.

Contents

1 Introduction
  1.1 Research Objective and Overview
  1.2 Contributions
    1.2.1 Spatial Tactile Feedback Hardware
    1.2.2 Distinguishability of Spatial Tactile Feedback
    1.2.3 Eyes-free Mobile Interaction with Spatial Tactile Feedback
    1.2.4 Spatial Relationship Learning Support for Visually Impaired Users
    1.2.5 Spatial Tactile Feedback Support in Remote Collaboration
  1.3 Dissertation Outline
2 Background Literature
  2.1 The Hand
    2.1.1 The Skin Senses
  2.2 The Hardware
    2.2.1 DC Motors
    2.2.2 Voice Coil Actuator
    2.2.3 Piezoelectric Bending Actuators
    2.2.4 MR Fluid
    2.2.5 Mechanical Pin Arrays
    2.2.6 Ultrasound Transducers
    2.2.7 Electrovibration
    2.2.8 Force Feedback Hardware
    2.2.9 Summary
  2.3 Vibrotactile Feedback Parameters
    2.3.1 Intensity
    2.3.2 Frequency
    2.3.3 Duration
    2.3.4 Waveform
    2.3.5 Rhythm
    2.3.6 Locus (Spatial Patterns)
    2.3.7 Spatio-temporal Patterns
    2.3.8 Summary
  2.4 User Interfaces with Tactile Feedback
    2.4.1 Touch Screens and Touch-sensitive Surfaces
    2.4.2 Handheld and Mobile Devices
    2.4.3 Perception of Vibrotactile Feedback
    2.4.4 Effects of Vibrotactile Feedback
  2.5 Summary
3 Spatial Tactile Feedback Hardware
  3.1 Design Requirements
  3.2 First Prototype
  3.3 Second Prototype
4 Eyes-free Interaction
  4.1 SemFeel Concept
  4.2 Related Work
  4.3 SemFeel System
    4.3.1 Hardware Configuration
    4.3.2 Vibration Patterns
  4.4 Experiment 1: Distinguishability of Patterns
    4.4.1 Tasks and Stimuli
    4.4.2 Variables
