Robot-Based Haptic Perception and Telepresence for the Visually Impaired
A Thesis Presented to The Academic Faculty

by

Chung Hyuk Park

In Partial Fulfillment of the Requirements for the Degree Doctor of Philosophy in the School of Electrical and Computer Engineering

Georgia Institute of Technology
August 2012

Date Approved: June 13, 2012

Approved by:

Professor Ayanna M. Howard, Advisor
School of Electrical and Computer Engineering
Georgia Institute of Technology

Professor Anthony Yezzi
School of Electrical and Computer Engineering
Georgia Institute of Technology

Professor Maysam Ghovanloo
School of Electrical and Computer Engineering
Georgia Institute of Technology

Professor Magnus Egerstedt
School of Electrical and Computer Engineering
Georgia Institute of Technology

Professor Wayne Book
School of Mechanical Engineering
Georgia Institute of Technology

DEDICATION

For my parents, my wife, and my son, who have filled my life with love, happiness, and inspiration.

ACKNOWLEDGEMENTS

This work was supported in part by the National Science Foundation under Grant No. 0940146. I would like to express my gratitude to my advisor, Dr. Ayanna Howard, for her guidance and advisement; to the National Federation of the Blind and the Center for the Visually Impaired in Atlanta, for their participation in and support of my experiments; and to Rayshun Dorsey of the WizKids Science and Technology Center for managing the camp and logistics. I would also like to acknowledge each member of the Human Automation Systems (HumAnS) Lab for their continued support and for sharing their insights regarding the approaches presented here.

TABLE OF CONTENTS

I INTRODUCTION
  1.1 Visual Impairments
  1.2 Assistive Technology for the Visually Impaired
  1.3 Assistive Robots for the Visually Impaired
  1.4 Haptic Modality for Assisting the Visually Impaired
    1.4.1 Haptic Interfaces
    1.4.2 Haptic-Based Research for Assisting the Visually Impaired

II HAPTICS FOR NON-VISUAL ENVIRONMENTAL FEEDBACK
  2.1 System Overview
    2.1.1 Feedback Primitives
    2.1.2 Communication and System Architecture
  2.2 Multi-Modal Feedback Design
    2.2.1 Feedback Description
  2.3 Application of Multi-modal Feedback for Students with Visual Impairments
  2.4 Experiment Design
  2.5 Results
    2.5.1 User Responses
    2.5.2 Multi-Modal Perception Results
    2.5.3 Learning Performance Results
  2.6 Conclusion

III HAPTIC EXPLORATION FROM ROBOTIC VISION
  3.1 System Overview
    3.1.1 Human Interface
    3.1.2 Robotic System
    3.1.3 System Controller
  3.2 Stereo Vision Processes
    3.2.1 Stereo Vision for Large-scale Perception
    3.2.2 Mono-stereo for Proximity Perception
    3.2.3 Post-processing of Disparity Images
      3.2.3.1 Ground Detection
      3.2.3.2 Histogram-based Mixed-scale Disparity Map
      3.2.3.3 Separation of Foreground Objects and Background
  3.3 3D Map Building
  3.4 Haptic Rendering from Vision-based Robotic Perception
    3.4.1 Haptic 3D-Rendering Algorithm
    3.4.2 Haptic Interaction with Robotic Sensor Data
  3.5 Results
    3.5.1 Stereo Process Results
    3.5.2 Mono-Stereo Process Results
    3.5.3 Human Subject Results with Haptic Rendering
  3.6 Conclusions

IV HAPTIC TELEPRESENCE AND MULTI-MODAL INTERACTION
  4.1 System Architecture for a Telepresence Robotic System
  4.2 System Blocks and Functional Modules
    4.2.1 Handling of the Information Flow from the RGB-D Sensor
    4.2.2 Handling of the Sensory Limitations of the RGB-D Sensor
    4.2.3 Multi-modal Feedback for Tele-perception
      4.2.3.1 Haptic Feedback
      4.2.3.2 Auditory Feedback
    4.2.4 Finite State Machine
  4.3 Experimental Setup
  4.4 Results
    4.4.1 User Performance for Task Completion
    4.4.2 Control Load for the User
    4.4.3 Robot Trajectory
    4.4.4 Questionnaire Results
  4.5 Conclusion

V HAPTIC SKILL-TRANSFER
  5.1 Background
  5.2 Algorithms
    5.2.1 Neural Network Learning Module
    5.2.2 SVM Learning Module
    5.2.3 Haptic Guidance
  5.3 Experimental Setup
  5.4 Results
  5.5 Analysis of Haptic Skill Transfer
  5.6 Conclusion

VI CONCLUSION
  6.1 Key Contributions
    6.1.1 Haptics for Non-visual Environmental Feedback
    6.1.2 Haptic Exploration from Robotic Vision
    6.1.3 Haptic Telepresence and Multi-modal Interaction
    6.1.4 Haptic Skill-Transfer
  6.2 Future Work

REFERENCES

LIST OF TABLES

1. Population of VI among age groups in the U.S.A.
2. Post-session survey results for Questions 1 and 2
3. Processing time and RMSE value of depth error for image pairs in Figure 45
4. Processing time and RMSE value of depth error for image pairs in Figure 46
5. Results of haptic object-recognition experiment with subjects without VI after the mono-stereo process
6. Results of haptic object-recognition experiment with subjects with VI after the mono-stereo process
7. Confusion matrix of subjects without VI in identifying objects with haptic exploration
8. Confusion matrix of subjects with VI in identifying objects with haptic exploration
9. List of color names differentiable with verbal feedback
10. List of states for our telepresence system
11. List of events for our telepresence system
12. List of sub-states (robot) for our telepresence system
13. List of sub-states (haptic source) for our telepresence system
14. Success rates of human subjects in Scenarios 1 & 3
15. List of color names differentiable with verbal feedback
16. Data compression ratio of ANN and SVM over 3 datasets (unit: KB)
17. Training time for ANN and SVM over 3 datasets (unit: sec)
18. LCS results for ANN and SVM

LIST OF FIGURES

1. Snellen chart
2. Population of VI among age groups in the U.S.A.
3. Traditional aids for VI
4. Examples of modern assistive devices
5. GuideCane: a mobile robotic system for navigational aid for VI [85]
6. RoboCart [42]
7. Haptic vest for assisting a blind driver [30]
8. V-Go telepresence robot
9. Haptic force primitives
10. Common haptic interfaces
11. Advanced haptic interfaces
12. LEGO Mindstorm(TM) NXT robot
13. System architecture and communication flow
14. Feedback signals for the NXT robot traveling forward: (a) haptic feedback and (b) auditory feedback
15. Feedback signals for the NXT robot turning left/right: (a, b) haptic feedback and (c, d) auditory feedback
16. Feedback signals for the NXT robot sensing an obstacle in front: (a) haptic feedback and (b) auditory feedback
17. Feedback signals when the NXT robot bumped into a wall or an obstacle: (a) haptic feedback and (b) auditory feedback
18. Feedback signals when the NXT robot reached a goal: (a) haptic feedback and (b) auditory feedback
19. Programming and testing a robot in camps for students with visual impairments: (a) testing with direct tactile sensing, and (b) testing with multi-modal feedback (Wiimotes are marked with circles)
20. Two mazes used throughout the camp to test the students' programming skills and the multi-modal feedback system
21. DTW result of Group 1 in Maze #2 with both haptic and auditory feedback
22. DTW result of Group 2 in Maze #1 with only haptic feedback
23. Boxplot of DTW for Maze #1
24. Boxplot of DTW for Maze #2
25. Learning curve for Maze #1
26. Learning curve for Maze #2
27. Learning rate
28. Basic architecture of the HEMM system
29. Human interface for the HEMM system
30. Haptic exploratory mobile manipulation (HEMM) system
31. Workspace configurations in a HEMM system
32. Joint configuration of the Pioneer Arm [41]
33. D-H parameters for the Pioneer Arm
34. Several results of stereo disparity-generation algorithms
35. Ground estimation process on hue/saturation/value (HSV) channels using a statistical-region-merging algorithm: (a) hue channel image of an indoor scene, (b) saturation channel image of the scene, (c) value channel image of the scene, (d)-(f) results of the SRM process on the corresponding images (a)-(c), (g) final ground estimation from (d)-(f)
36. Stereo perception process
37. Flow-chart of 3D perception through the mono-stereo process with foreground/background separation
38. Example of the mono-stereo process
