Vision Based Natural Assistive Technologies with Gesture Recognition Using Kinect
Vision based Natural Assistive Technologies with gesture recognition using Kinect

Jae Pyo Son
School of Computer Science and Engineering
University of New South Wales

A thesis in fulfilment of the requirements for the degree of Master of Science (Computer Science Engineering)

Abstract

Assistive technologies for disabled persons are widely studied, and demand for them is growing rapidly. They enhance independence by enabling disabled persons to perform tasks that they otherwise could not accomplish, or could accomplish only with great difficulty, by providing the necessary assistive or rehabilitative devices. However, despite their strong performance, assistive technologies that require disabled persons to wear or mount devices place undue restrictions on users. This thesis addresses that problem by proposing two novel approaches to vision-based Human-Computer Interaction (HCI) that assist persons with different disabilities to live more independently in a more natural way, that is, without additional equipment or wearable devices. The first approach builds a system that helps visually impaired persons adjust to a new indoor environment: an assistant registers important objects at fixed positions in the environment using pointing gestures, and the visually impaired person can then use pointing gestures as a virtual cane to search for those objects. The second approach builds a system for disabled persons who have difficulty moving their arms or legs and cannot drive vehicles normally; in the proposed system, the user needs only one hand to steer, switch gears, accelerate, brake, and hold neutral (no acceleration, no brake).
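The two gesture primitives described above can be illustrated with a minimal sketch. Assuming Kinect-style 3D joint positions are available, pointing direction can be taken as the elbow-to-hand ray matched against recorded object positions, and a single-hand steering angle as the angle of the hand about a fixed pivot. The joint values, object table, distance threshold, and pivot below are illustrative assumptions, not the thesis's actual implementation:

```python
import math

def pointing_ray(elbow, hand):
    """Return (origin, unit direction) of the pointing ray from two 3D joints."""
    d = [h - e for e, h in zip(elbow, hand)]
    n = math.sqrt(sum(c * c for c in d))
    return elbow, [c / n for c in d]

def pointed_object(elbow, hand, objects, max_dist=0.3):
    """Pick the recorded object closest to the pointing ray (within max_dist metres)."""
    origin, u = pointing_ray(elbow, hand)
    best, best_d = None, max_dist
    for name, p in objects.items():
        v = [pi - oi for pi, oi in zip(p, origin)]
        t = sum(vi * ui for vi, ui in zip(v, u))      # projection onto the ray
        if t < 0:                                     # object is behind the user
            continue
        closest = [oi + t * ui for oi, ui in zip(origin, u)]
        d = math.dist(p, closest)                     # perpendicular distance to ray
        if d < best_d:
            best, best_d = name, d
    return best

def steering_angle(hand_xy, pivot_xy=(0.0, 0.0)):
    """Steering angle in degrees of the hand about a fixed pivot; 0 = straight up."""
    dx = hand_xy[0] - pivot_xy[0]
    dy = hand_xy[1] - pivot_xy[1]
    return math.degrees(math.atan2(dx, dy))

# Example: point from elbow (0,0,0) through hand (0,0,1) toward a recorded door
objs = {"door": (0.1, 0.0, 3.0), "desk": (2.0, 0.0, 1.0)}
print(pointed_object((0, 0, 0), (0, 0, 1), objs))   # door lies near the +z ray
print(round(steering_angle((0.5, 0.5)), 1))         # hand up and to the right
```

A real system would smooth the joint positions over several frames before classifying a pointing gesture, since raw skeleton data is noisy.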
The technical contributions include a new method for estimating pointing direction from the forearm and a new algorithm for measuring the steering angle from the movement of a single hand. The proposed approaches were evaluated experimentally, and both quantitative and qualitative analyses are presented. The results show that the general performance of the implemented systems is satisfactory, although their effectiveness for disabled persons in real-life situations remains untested. Nevertheless, it is believed that these approaches can be applied in real life in future and will encourage the development of more natural assistive technologies.

Contents

1. Introduction
   1.1. Research Overview
   1.2. Scope
   1.3. Contribution
   1.4. Organization
2. Gesture Recognition for Assistive Technologies
   2.1. Assistive Technologies
      2.1.1. Pointing Gesture
      2.1.2. Hand Gestures
      2.1.3. Kinect
   2.2. Gestures in HCI
      2.2.1. Pointing Gesture
      2.2.2. Hand Segmentation and Tracking
      2.2.3. Pattern and Hand Trajectory Recognition
   2.3. Discussion
   2.4. Tools, Engines and Frameworks
      2.4.1. OpenNI
      2.4.2. OpenCV
      2.4.3. Unity3D
   2.5. Summary
3. Object-Based Navigation
   3.1. Pointing Gesture Recognition
      3.1.1. Body Joints Detection
      3.1.2. Pointing Direction Estimation
      3.1.3. Pointing Gesture Detection
   3.2. System Description
      3.2.1. Offline Training
      3.2.2. Object Navigation
   3.3. Experimental Results
      3.3.1. Pointing Gesture Accuracy Comparison for Different Body Joints
      3.3.2. System Performance
   3.4. Limitations
   3.5. Remarks
4. Single-handed Driving System
   4.1. Hand Segmentation
   4.2. Fingertip Detection
   4.3. System Description
      4.3.1. Nonholonomic Steering
      4.3.2. Differential Steering
   4.4. Experimental Results
   4.5. Simulation
      4.5.1. Testing of Nonholonomic Steering
      4.5.2. Differential Steering
   4.6. Limitations
   4.7. Remarks
5. Conclusion
   5.1. Thesis Overview
   5.2. Contributions
   5.3. Limitations and Future Work
   5.4. Concluding Remarks
A. Publications Arising from Thesis
B. Acronyms and Abbreviations
Bibliography

List of Figures

1.1: Research Overview
2.1: Examples of Pointing Gesture
2.2: Examples of Hand Gesture
2.3: Microsoft Kinect
2.4: Kinect-based HCI Applications
3.1: Three Steps for Pointing Gesture Recognition
3.2: Skeleton Tracking Example
3.3: Calibration Pose
3.4: 3D Space Created by User's Arm and a Point Cloud