
ABSTRACT

Title of Dissertation: HANDSIGHT: A TOUCH-BASED WEARABLE SYSTEM TO INCREASE INFORMATION ACCESSIBILITY FOR PEOPLE WITH VISUAL IMPAIRMENTS

Lee Stearns, Doctor of Philosophy, 2018

Dissertation directed by: Professor Jon E. Froehlich, Department of Computer Science

Many activities of daily living, such as getting dressed, preparing food, wayfinding, or shopping, rely heavily on visual information, and the inability to access that information can negatively impact the quality of life for people with vision impairments. While numerous researchers have explored solutions for assisting with visual tasks that can be performed at a distance, such as identifying landmarks for navigation or recognizing people and objects, few have attempted to provide access to nearby visual information through touch. Touch is a highly attuned means of acquiring tactile and spatial information, especially for people with vision impairments. By supporting touch-based access to information, we may help users to better understand how a surface appears (e.g., document layout, clothing patterns), thereby improving their quality of life.

To address this gap in research, this dissertation explores methods to augment a visually impaired user's sense of touch with interactive, real-time computer vision to access information about the physical world. These explorations span three application areas: reading and exploring printed documents, controlling mobile devices, and identifying colors and visual textures. At the core of each application is a system called HandSight that uses wearable cameras and other sensors to detect touch events and identify surface content beneath the user's finger. To create HandSight, we designed and implemented the physical hardware, developed signal processing and computer vision algorithms, and designed real-time feedback that enables users to interpret visual or digital content.
We involved visually impaired users throughout the design and development process, conducting several user studies to assess usability and robustness and to improve our prototype designs. The contributions of this dissertation include: (i) developing and iteratively refining HandSight, a novel wearable system to assist visually impaired users in their daily lives; (ii) evaluating HandSight across a diverse set of tasks, and identifying tradeoffs of a finger-worn approach in terms of physical design, algorithmic complexity and robustness, and usability; and (iii) identifying broader design implications for future wearable systems and for the fields of accessibility, computer vision, augmented and virtual reality, and human-computer interaction.

HANDSIGHT: A TOUCH-BASED WEARABLE SYSTEM TO INCREASE INFORMATION ACCESSIBILITY FOR PEOPLE WITH VISUAL IMPAIRMENTS

by Lee Stearns

Dissertation submitted to the Faculty of the Graduate School of the University of Maryland, College Park, in partial fulfillment of the requirements for the degree of Doctor of Philosophy, 2018

Advisory Committee:
Professor Jon E. Froehlich, Chair / Advisor
Professor Rama Chellappa, Co-Advisor
Professor Leah Findlater
Professor Ramani Duraiswami
Professor Gregg Vanderheiden, Dean's Representative

© Copyright by Lee Stearns 2018

Dedication

To my family, who supported and encouraged me throughout this long process.

Acknowledgements

Above all else, I would like to thank the three professors with whom I have worked closely throughout this process. First, I thank my advisor, Jon Froehlich, for his guidance, support, and infectious enthusiasm. Thank you to my co-advisor, Rama Chellappa, for his patience and sense of humor. And thank you to Leah Findlater for her valuable advice and encouragement. I have learned a great deal from each of you about how to become a better researcher and would not have been able to complete this dissertation without you.
Thank you also to the two additional members of my dissertation committee, Gregg Vanderheiden and Ramani Duraiswami, for your time and valuable advice. Your input made this dissertation stronger, and your recommendations will help to improve the quality of my research and presentations in the future.

I use the pronouns "we" and "our" throughout this dissertation to acknowledge the contributions of other students, professors, and associates throughout nearly every stage of this research. I especially want to thank Uran Oh for working with me throughout much of the project. Thank you also to Ruofei Du, Liang He, Jonggi Hong, Alisha Pradhan, Anis Abboud, Victor De Souza, Alex Medeiros, Meena Sengottuvelu, Chuan Chen, Jessica Yin, Harry Vancao, Eric Lancaster, Catherine Jou, Victor Chen, Mandy Wang, Ji Bae, David Ross, Darren Smith, and Cha-Min Tang. And thank you to my fellow lab mates for your advice, encouragement, and feedback: Matthew Mauriello, Kotaro Hara, Seokbin Kang, Dhruv Jain, Manaswi Saha, Majeed Kazemitabaar, Ladan Najafizadeh, and Brenna McNally.

Table of Contents

Dedication
Acknowledgements
Table of Contents
List of Tables
List of Figures
List of Abbreviations

Chapter 1: Introduction
1.1 Research Approach and Overview
1.1.1 Reading and Exploring Printed Documents
1.1.2 Controlling Mobile Devices with On-Body Input
1.1.3 Identifying Colors and Visual Patterns
1.2 Summary of Contributions
1.3 Dissertation Outline

Chapter 2: Background and Related Work
2.1 Portable Assistive Camera Systems
2.1.1 Smartphone Applications
2.1.2 Cameras Worn on the Upper Body
2.1.3 Head-Worn Vision Enhancement Systems
2.1.4 Cameras Worn on the Finger
2.2 Access to Visual Surface Information
2.2.1 Reading Text using Optical Character Recognition (OCR)
2.2.2 Identifying Colors and Patterns
2.3 Access to Digital Information
2.3.1 Smartphone and Smartwatch Accessibility
2.3.2 Touch Gestures on Arbitrary Surfaces
2.3.3 On-Body Input
2.4 Summary

Chapter 3: Reading Printed Materials by Touch: Initial Exploration
3.1 System Design
3.1.1 Design Goals
3.1.2 Hardware
3.1.3 Image Processing Algorithms and Offline Evaluation
3.2 User Study to Assess Audio and Haptic Feedback
3.2.1 Method
3.2.2 Analysis and Findings
3.3 Discussion
3.4 Summary

Chapter 4: Evaluating Haptic and Auditory Directional Finger Guidance
4.1 Study I: Audio vs. Haptic Guidance for Finger-Based Reading
4.1.1 Method
4.1.2 Findings
4.2 Study II: Preliminary Use of a Proof-of-Concept Prototype
4.2.1 Method
4.2.2 Findings
4.3