
Optical Hand Tracking in Virtual Reality to Teach American Sign Language

Christian O'Brien BSc

Submitted in partial fulfilment for the degree of Master of Science in Serious Games and Virtual Reality

August 12, 2020

Supervised by Dr. Daniel Livingstone MSc BEng FHEA
School of Simulation and Visualisation
The Glasgow School of Art

Abstract

Optical hand tracking technologies in Virtual Reality (VR) offer an exciting new method of human-computer interaction. Among the applications being explored, their use for education holds great promise. Gesture recognition systems are in development, but none have yet proven able to support existing gesture collections such as American Sign Language (ASL). Fingerspelling is central to ASL and is often a challenge for ASL learners, so digital tools could assist them in acquiring this skill. This paper explores the use of a virtual reality serious game that uses optical hand tracking to teach the ASL alphabet, asking whether playing such a game improves a player's ability to fingerspell. The new "ASL Fingerspeller" game was developed for the Oculus Quest using state-of-the-art software and hardware, the latest of which came to market in February 2020. The game uses a simple sign recognition system that defines a sign as a set of finger positions and a wrist orientation, and is customizable to each player. It challenges the player to fingerspell names and places using the ASL alphabet. Evaluation showed that the game successfully taught the ASL alphabet to research participants and improved their fingerspelling ability over a short period of time. Online users and in-person research participants enjoyed playing the game and considered it a useful learning tool. Extended testing saw an average improvement in sign production rate of 60 percent within five uses of the game.
Testing also identified areas for improvement, including the ability to detect signs with crossed or hidden fingers. These results are illustrative, with the caveat that Covid-19 restrictions limited testing, so the sample sizes were too small to be statistically significant. The research demonstrates the potential of hand tracking for educational purposes. The technology cannot yet support the full ASL vocabulary, but it can support most signs of the alphabet. The game may not develop fingerspelling skills in the same manner that native signers learn them, but it can serve as a starting point for new learners. Further research could explore more advanced gesture recognition systems, especially those that recognize dynamic gestures. This is some of the first research conducted using the Oculus Quest's hand tracking technology and its application to education. The methods in this study could serve as an example for further research in the fields of hand tracking games, games about ASL education, and simple gesture recognition systems. As the technology develops, becoming even more user-friendly and affordable, this project illustrates that the opportunities for new applications are extensive and exciting.
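The abstract describes the recognition approach at a high level: a sign is defined as a set of finger positions plus a wrist orientation, calibrated per player and later compared against the live hand pose. The game's actual implementation appears in Appendix B; the following Python sketch only illustrates that matching idea, and all names, data shapes, and tolerance values here are invented for illustration.

```python
import math

# Illustrative sketch (not the dissertation's code): a "sign" is a set of
# per-finger bend values (0 = open, 1 = closed) plus a unit wrist-forward
# vector, captured during calibration and compared against the live pose.

FINGERS = ["thumb", "index", "middle", "ring", "pinky"]

def capture_sign(finger_bends, wrist_dir):
    """Store a calibrated sign: one bend value per finger and a wrist vector."""
    return {"bends": dict(zip(FINGERS, finger_bends)), "wrist": wrist_dir}

def matches(sign, finger_bends, wrist_dir, bend_tol=0.15, angle_tol_deg=25.0):
    """Return True if the live pose is within tolerance of the saved sign."""
    live = dict(zip(FINGERS, finger_bends))
    # Every finger's bend must be close to the calibrated value.
    if any(abs(live[f] - sign["bends"][f]) > bend_tol for f in FINGERS):
        return False
    # Angle between saved and live wrist directions (assumed near unit length).
    dot = max(-1.0, min(1.0, sum(a * b for a, b in zip(sign["wrist"], wrist_dir))))
    return math.degrees(math.acos(dot)) <= angle_tol_deg

# Example: calibrate a fist-like sign pointing forward, then test a nearby pose.
fist = capture_sign([0.9, 1.0, 1.0, 1.0, 1.0], (0.0, 0.0, 1.0))
close_pose = matches(fist, [0.85, 0.95, 1.0, 0.9, 1.0], (0.0, 0.1, 0.995))
open_hand = matches(fist, [0.0, 0.0, 0.0, 0.0, 0.0], (0.0, 0.0, 1.0))
```

Per-player calibration simply amounts to calling `capture_sign` with that player's own pose for each letter, which is one plausible reading of how the game adapts to different hand shapes.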
Table of Contents

Abstract i
Table of Contents iii
List of Tables vi
List of Figures vii
Declaration of Originality viii
Definitions and Abbreviations ix
1 Introduction 1
1.1 Application 1
1.2 Hand Tracking Technology 1
1.3 Groups of Interest 2
1.4 Structure of this Dissertation 3
2 Literature Review 5
2.1 Digital Game Based Language Learning (DGBLL) 5
2.2 ASL Instruction and Fingerspelling Comprehension 9
2.3 Gesture Recognition Technology 11
2.4 Summary 12
3 Materials 14
3.1 Virtual Reality Headsets and Hand Tracking Solutions 14
3.1.1 Oculus Quest 14
3.1.2 Hand Tracking 14
3.1.3 Alternative Solutions 15
3.2 Development Tools 16
3.2.1 PC to VR Workflow 16
3.2.2 Game Engine 16
3.2.3 Hand Tracking Compatibility 17
3.2.4 Modelling and Animation 18
4 Methods 19
4.1 Planning 19
4.2 Sign Detection 20
4.2.1 Defining a "Sign" 20
4.2.2 Sign Saving and Detection 20
4.2.3 Dynamic Signs 21
4.2.4 Sign Detection Implementation 21
4.3 Sign Functionality 22
4.4 Gameplay 22
4.5 User Interface Design 23
4.6 Finishing Touches 25
5 Implementation Results 26
5.1 Menu 27
5.2 Sign Calibration 29
5.3 Practice Mode 30
5.4 Fingerspelling Challenge 31
5.5 Issues 33
6 Evaluation 35
6.1 Approaches 35
6.1.1 In-person Medium Scale Testing 35
6.1.2 Online Testers 35
6.1.3 Extended User Testing 36
6.2 Limitations 36
6.3 Feedback from Online Users 37
6.3.1 SideQuest 37
6.3.2 Reddit 37
6.3.3 Twitter 37
6.3.4 Online Survey 38
6.4 In-person Testing 41
6.5 Conclusion 43
7 Conclusion 45
7.1 Discussion 45
7.1.1 Achievements 45
7.1.2 Limitations 46
7.1.3 ASL Fingerspeller 46
7.1.4 Other Uses 47
7.2 Conclusion 47
References 49
Referenced Software and Games 51
List of Third-Party Materials Used 52
Appendix A. User Evaluation Results 53
Appendix B. Code 58

List of Tables

Table 6.1: Resulting Statistics of a Play Session from a Native ASL User 38
Table 6.2: SUS Survey Responses 39
Table R.1: Third-Party Materials Used 52
Table A.1: Open-Ended User Responses from Online Survey 53
Table A.2: In-person User Testing Results 55

List of Graphs

Graph 6.1: Production Rate Improvement Over Time 42
Graph 6.2: Average Ratio of Incorrect Letters Detected per Correct Letter Produced 43

List of Figures

Figure 1.1: Oculus Quest VR Headset 2
Figure 2.1: The ASL Fingerspelling Alphabet 8
Figure 3.1: HTC Vive Headset with LMC Attached 15
Figure 3.2: Hand Tracking Under Occlusion 17
Figure 3.3: Oculus Integration Bone ID Legend 18
Figure 4.1: Games Design Document 19
Figure 4.2: Interaction Range 23
Figure 4.3: Implementation of a "Physical" Button in Early Stages of Development 24
Figure 4.4: Tap-to-Click System Developed by Oculus 24
Figure 5.1: Main Menu 26
Figure 5.2: Set Active Hand Menu 27
Figure 5.3: Player Setting the Right Hand as Active 28
Figure 5.4: Player Using the Main Menu 28
Figure 5.5: Sign Calibrator 29
Figure 5.6: Player Using the Sign Calibrator 30
Figure 5.7: Practice Mode 30
Figure 5.8: Player Using the Practice Mode 31
Figure 5.9: Fingerspelling Challenge 32
Figure 5.10: Player Playing the Fingerspelling Challenge 33
Figure B.1: Spellchecker Code 58
Figure B.2: Code Implementation of a "Sign" 59
Figure B.3: Overwrite Function Used in Calibration 59
Figure B.4: Sign Detection/Recognition System 60

Declaration of Originality

STUDENT ID No.: 19129939

Own Work Declaration

You must complete this declaration (each box ticked to show that the condition has been met), sign and date it, and include it with each piece of work submitted for assessment (practical and written). Please note that work will not be accepted unless this form is attached. As GSA has now obtained Turnitin plagiarism software, a cross-section of student submissions will be evaluated.

Name: Christian O'Brien
Course/Programme: MSc Serious Games and Virtual Reality
Title of Work: Optical Hand Tracking in Virtual Reality to Teach American Sign Language