CALIFORNIA STATE UNIVERSITY, NORTHRIDGE

Utilizing Smart Watch Motion Sensors in Human Computer Interaction via Pattern Detection

A thesis submitted in partial fulfillment of the requirements for the degree of Master of Science in Computer Science

By Danial Moazen

May 2015

SIGNATURE PAGE

The thesis of Danial Moazen is approved:

___________________________________ ____________
Professor Vahab Pournaghshband Date

___________________________________ ____________
Professor Gloria Melara Date

___________________________________ ____________
Professor Ani Nahapetian, Chair Date

California State University, Northridge

Contents

SIGNATURE PAGE
LIST OF TABLES
LIST OF FIGURES
ABSTRACT
1. INTRODUCTION
2. RELATED WORK
2.1. Wearable Computing
2.1.1. Smart Watch
2.2. Gesture Recognition
2.2.1. Letter Recognition
2.2.2. Challenges of Gesture Recognition
3. SYSTEM OVERVIEW
3.1. Hardware Overview
3.2. Software Overview
3.3. Weighted Moving Average Algorithm
3.4. Angle Calculation
3.5. Session Detection Algorithm
3.6. Letter Detection with Machine Learning
4. APPROACH
4.1. Inertial Navigation
4.2. Pattern Detection
4.2.1. HMM
4.2.2. DTW
5. RESULTS
6. CONCLUSION
7. REFERENCES
LIST OF TABLES

Table 3.1: Comparison of the specifications of the devices used in this project: the Nexus 7 as the handheld device and the Samsung Gear Live as the wearable device.
Table 3.2: Comparison of the amount of data (averaged over 5 measurements) updated and transferred between the devices when the connection is continuous versus when the connection is maintained only during sessions. The words are the 5 most-searched food-related words in the U.S. in 2014.
Table 4.1: Gyroscope data averaged over 10,000 samples.
Table 4.2: Linear accelerometer data averaged over 10,000 samples.
Table 5.1: Detection accuracy of the HMM algorithm for 5 dissimilar letters. The letters are 12 inches by 12 inches. The test is repeated 100 times for each letter.
Table 5.2: Detection accuracy of the DTW algorithm for 5 dissimilar letters. The letters are 12 inches by 12 inches. The test is repeated 100 times for each letter.
Table 5.3: Detection accuracy of the HMM algorithm for 5 dissimilar letters. The letters are 6 inches by 6 inches. The test is repeated 100 times for each letter.
Table 5.4: Detection accuracy of the DTW algorithm for 5 dissimilar letters. The letters are 6 inches by 6 inches. The test is repeated 100 times for each letter.
Table 5.5: Detection accuracy of the HMM algorithm for 5 similar letters. The letters are 12 inches by 12 inches. The test is repeated 100 times for each letter.
Table 5.6: Detection accuracy of the DTW algorithm for 5 similar letters. The letters are 12 inches by 12 inches. The test is repeated 100 times for each letter.
Table 5.7: Detection accuracy of the HMM algorithm for the lowercase English alphabet. The test is repeated 20 times for each letter. The letters are 12 inches by 12 inches.

LIST OF FIGURES

Figure 3.1: Hardware schema: handheld device, wearable device, and the Bluetooth connection.
Figure 3.2: Software overview: Wear application and Mobile application.
Figure 3.3: Comparison of the signal with and without the weighted moving average filter. The blue line shows the z-axis acceleration signal while writing the letter "a" without any filter applied; the red line shows the same signal for the same movement with the filter applied.
Figure 3.4: The smart watch's fixed frame of reference.
Figure 3.5: Comparison of the acceleration data before and after applying the rotation of the frame of reference while the device moves up and down repeatedly and rotates 90 degrees around the z-axis. Figure 3.5.a (top) shows the acceleration without the rotation of the frame of reference applied; Figure 3.5.b (bottom) shows the acceleration after applying the rotation.
Figure 4.1: Comparison of the velocity signal before and after applying the velocity filter. Figure 4.1.a (right) shows the velocity signal before applying the velocity filter; Figure 4.1.b shows the velocity signal after applying it.
ABSTRACT

Utilizing Smart Watch Motion Sensors in Human Computer Interaction via Pattern Detection

By Danial Moazen

Master of Science in Computer Science

Wearable computing is one of the fastest growing technologies today. Smart watches are poised to capture at least half of the wearable device market in the near future. Smart watch screen size, however, is a limiting factor for growth, as it restricts practical text input. On the other hand, wearable devices have features, such as consistent user interaction and hands-free, heads-up operation, that pave the way for gesture recognition methods of text entry. This thesis proposes a new text input method for smart watches which utilizes motion sensor data and machine learning approaches to detect letters written in the air by the user. This method is less computationally intensive and less expensive than computer vision approaches. It is also unaffected by the lighting conditions that limit computer vision solutions. The AirDraw system prototype developed to test this approach is presented, along with experimental results showing close to 71% accuracy.

1. INTRODUCTION

Wearable computing is one of the fastest growing areas among today's technologies [23]. According to a Yahoo Finance article published in April 2015 [24], based on a Business Insider article [25], the global wearable market will grow at the dramatic rate of 35% annually for the next 5 years. Among the different types of wearable devices, such as glasses, vests, and helmets, wrist-based wearable devices make up a large proportion of today's wearable devices. Forecasts suggest that smart watches, one type of wrist-based wearable device, will alone capture half of the wearable computing market by 2018 [11][24][25]. In the area of gesture detection and recognition, most of the existing systems