Putting Artificial Intelligence into Wearable Human-Machine Interfaces – Towards a Generic, Self-Improving Controller
Marcel Admiraal
Imperial College London
Department of Mechanical Engineering
Ph.D. Dissertation

Declaration of Originality

I declare that the content of this thesis is original and my own work; that all assistance received and sources of materials (data, theoretical analysis, figures, and text) have been acknowledged; and that the work has not been submitted to any other institution for any degree or diploma.

Declaration of Copyright

The copyright of this thesis remains with the author and is made available under the Creative Commons Attribution and Share-alike licence version 4. This work may be copied, distributed, displayed and derivative works created, provided the author is attributed and derivative works are made available under an identical or less restrictive licence. The full Creative Commons Attribution and Share-alike licence is available at https://creativecommons.org/licenses/by-sa/4.0/.

The copyright of the software created also remains with the author and is made available under the GNU General Public License version 2, and specifically not any later version. The libraries and programmes are free software: they can be redistributed and modified under the terms of the licence. They are shared in the hope that they will be useful, but without any warranty; without even the implied warranty of merchantability or fitness for a particular purpose. The full GNU General Public License version 2 is available at https://www.gnu.org/licenses/old-licenses/gpl-2.0.html.

Acknowledgements

I would like to thank my supervisor Ravi Vaidyanathan and my fellow research group members for all their contributions, and my friends and family for their continual support and encouragement.

Abstract

The standard approach to creating a machine-learning-based controller is to provide users with a number of gestures that they need to make; record multiple instances of each gesture using specific sensors; extract the relevant sensor data and pass it through a supervised learning algorithm until the algorithm can successfully identify the gestures; and map each gesture to a control signal that performs a desired outcome. This approach is both inflexible and time-consuming.
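To make this conventional pipeline concrete, the sketch below shows one minimal, hypothetical implementation: labelled sensor windows are reduced to feature vectors, a supervised classifier is trained on them, and each recognised gesture is mapped to a fixed control signal. The window format, the hand-crafted features, the scikit-learn random forest and the gesture-to-command table are illustrative assumptions only; they are not the sensors, features or classifier used in this research.

# Hypothetical sketch of the conventional supervised gesture-recognition pipeline.
# The features, classifier and gesture-to-command mapping are illustrative only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def extract_features(window):
    # window: (n_samples, n_channels) array of raw sensor readings for one gesture instance.
    return np.concatenate([window.mean(axis=0),          # per-channel mean
                           window.std(axis=0),           # per-channel variability
                           np.abs(window).max(axis=0)])  # per-channel peak magnitude

def train_gesture_classifier(recordings, labels):
    # recordings: list of sensor windows; labels: the gesture performed in each window.
    X = np.array([extract_features(w) for w in recordings])
    classifier = RandomForestClassifier(n_estimators=100)
    classifier.fit(X, labels)
    return classifier

# Each recognised gesture is mapped to a fixed control signal.
GESTURE_TO_COMMAND = {"fist": "grip_close", "open_hand": "grip_open"}

def control_signal(classifier, window):
    gesture = classifier.predict(extract_features(window).reshape(1, -1))[0]
    return GESTURE_TO_COMMAND.get(gesture, "no_op")

Every new gesture or sensor requires this collect-label-retrain cycle to be repeated, which is the inflexibility and time cost referred to above.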
The primary contribution of this research was to investigate a new approach to putting artificial intelligence into wearable human-machine interfaces by creating a Generic, Self-Improving Controller. It was shown to learn two user-defined static gestures with an accuracy of 100% in less than 10 samples per gesture; three in less than 20 samples per gesture; and four in less than 35 samples per gesture. Pre-defined dynamic gestures were more difficult to learn: it learnt two with an accuracy of 90% in less than 6,000 samples per gesture, and four with an accuracy of 70% after 50,000 samples per gesture.

The research has resulted in a number of additional contributions:

• The creation of a source-independent hardware data capture, processing, fusion and storage tool for standardising the capture and storage of historical copies of data captured from multiple different sensors.
• An improved Attitude and Heading Reference System (AHRS) algorithm for calculating orientation quaternions that is five orders of magnitude more precise.
• The reformulation of the regularised TD learning algorithm; the reformulation of the TD learning algorithm applied to the artificial neural network back-propagation algorithm; and the combination of the two reformulations into a new, regularised TD learning algorithm applied to the artificial neural network back-propagation algorithm (the standard TD update that these reformulations start from is sketched after this list).
• The creation of a Generic, Self-Improving Predictor that can use different learning algorithms and a Flexible Artificial Neural Network.
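For reference, the standard, unregularised TD(λ) update for a value function V(s; θ) represented by a differentiable approximator such as a neural network, with the gradient obtained by back-propagation, can be written in LaTeX as below. This textbook form is only the starting point that the reformulations above build on (a regularised variant typically adds a weight-decay term to the update); it is not the reformulated algorithm developed in this thesis. Here r_{t+1} is the reward, γ the discount factor, λ the trace-decay parameter, α the learning rate and e_t the eligibility trace.

% Textbook semi-gradient TD(lambda) update with a neural-network value function;
% not the thesis's reformulated, regularised algorithm.
\begin{align*}
  \delta_t     &= r_{t+1} + \gamma\,V(s_{t+1};\theta_t) - V(s_t;\theta_t) \\
  e_t          &= \gamma\lambda\,e_{t-1} + \nabla_{\theta} V(s_t;\theta_t) \\
  \theta_{t+1} &= \theta_t + \alpha\,\delta_t\,e_t
\end{align*}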
Table of Contents

Abstract
Table of Figures
Table of Tables
Table of Acronyms

Chapter 1  Introduction
1.1  The Need for a Generic, Self-Improving Controller
1.2  Research Goals
1.2.1  Research Objectives
1.2.2  Research Scope
1.2.3  Research Constraints
1.3  Original Contributions
1.4  Publications
1.5  Document Structure

Chapter 2  Literature Review
2.1  Introduction
2.2  Learning Human Hand Gestures
2.3  Requirements for Real-Time Device Control
2.4  Capturing Muscle Activation Data
2.5  Recent Successes and Ongoing Challenges with Using Machine Learning and Artificial Neural Networks
2.6  Using Machine Learning to Play Games
2.7  Using Machine Learning in Device Control
2.8  Sensor Data Fusion
2.9  Summary

Chapter 3  Source-Independent Hardware Data Capture, Processing, Fusion and Storage
3.1  Introduction
3.2  IMU Type Independent Data Capture Software Interface
3.3  IMU Device Interfaces
3.3.1  x-IMU
3.3.2  nuIMU
3.3.3  nuIMU Recorded Data
3.4  IMU Specific Software
3.4.1  x-IMU GUI
3.4.2  nuIMU GUI
3.4.3  nuIMU Firmware Updates
3.5  IMU Type Independent Control of a Virtual Robot
3.6  Real-Time Digital Signal Processing
3.6.1  Simple Moving Average Filter
3.6.2  Median Filter
3.6.3  Short-Term Energy Filter
3.6.4  Simple Low-Pass Filter
3.6.5  Simple High-Pass Filter
3.6.6  Butterworth Difference Filters
3.6.7  Envelope Filters