
Controlling a 3D Printed Hand by Using Brain Waves

by

BOYAN LI, B.S.E.E.

A THESIS

Submitted to

The Department of Engineering

Colorado State University-Pueblo

In partial fulfillment of the requirements for the degree of Master of Science

CERTIFICATION OF ACCEPTANCE

This thesis

presented in partial fulfillment of

the requirements for the degree of

Master of Science

has been accepted by the

Program with emphasis in Mechatronics Engineering

Colorado State University – Pueblo

APPROVED:

Dr. Nebojsa I. Jaksic, Ph.D., P.E., Committee Chair Date

Dr. Jude L. DePalma, Ph.D., Committee Member Date

Dr. Bahaa I. Kazem, Ansaf, Ph.D., Committee Member Date

Master’s Candidate BOYAN LI, B.S.E.E.

Date of Thesis Presentation August 15th, 2018

BOYAN LI for the degree of Master of Science in Engineering with emphasis in Mechatronics, presented on August 15th, 2018 at Colorado State University-Pueblo

Title:

Controlling a 3D Printed Bionic Hand by Using Brain Waves

Abstract approved:

Professor Nebojsa I. Jaksic, Ph.D., P.E., Advisor

Abstract

High-quality bionic hands are expensive, while low-cost bionic hands have many operational problems in everyday use. Hence, this thesis aims to improve the stability of low-cost bionic hands. The Hackberry bionic hand has been selected for characterization and improvement. A new control system combining an Electroencephalogram (EEG) with an Inertial Measurement Unit (IMU) sensor is proposed and implemented for the Hackberry bionic hand in this thesis. The purpose of the IMU sensor is to decide, based on hand movements, when the bionic hand should close. The EEG-based headset (Mindwave Mobile) designed by Neurosky is used to acquire signals from the user's brain. Several validation tests have been conducted to verify the stability of the new control system. Conclusions and future work are also addressed.

Table of Contents

Abstract ...... III

Table of Contents ...... iii

List of Figures ...... v

List of Tables ...... ix

Acknowledgements ...... x

Abbreviations ...... xi

Chapter I Introduction ...... 1

1.1 Motivation ...... 1

1.2 Research Goals ...... 2

1.3 Structure of Thesis ...... 5

Chapter II Background and Literature Review ...... 7

2.1 History of prosthetics ...... 7

2.2 Technologies ...... 11

2.2.1 Bionic Hands ...... 11

2.2.2 Brain-Computer Interface (BCI) ...... 15

2.2.3 Electroencephalogram ...... 18

2.3 Previous Research ...... 21

2.4 Conclusion ...... 25

Chapter III Hardware ...... 26

3.1 The Methodology of Choosing Hardware ...... 26

3.2 MakerBot® ...... 27

3.2.1 3D-Printing ...... 28


3.2.2 MakerBot® Replicator 2 ...... 29

3.3 Hackberry ...... 31

3.4 Arduino...... 34

3.5 Neurosky Mindwave Mobile ...... 37

3.6 IMU Sensor: MPU6050 ...... 40

3.7 Bluetooth Module: HC-05 ...... 41

Chapter IV Bionic Hand Control System ...... 43

4.1 System Design ...... 43

4.2 Hardware Design ...... 48

4.3 Software Design ...... 52

Chapter V Testing of Bionic Hand ...... 59

5.1 Test Design ...... 59

5.2 Testing ...... 63

5.2.1 Hardware/Software Verification ...... 63

5.2.2 Standardized Assessment Procedure ...... 74

5.2.3 Physical Application Tests ...... 77

5.3 Conclusions ...... 102

Chapter VI Conclusions and Future Work ...... 105

References ...... 108

Appendix A Bionic Hand Program ...... 113


List of Figures

Figure 1. An Artificial Toe Found in Egypt ...... 8

Figure 2. An Iron Hand of a German Mercenary...... 9

Figure 3. An Artificial Hand Designed by Thomas Openshaw ...... 10

Figure 4. Oscar Pistorius with Running Blades ...... 11

Figure 5. Open Bionics ...... 13

Figure 6. Brain-Computer Interface [19] ...... 16

Figure 7. The International 10-20 System Standardized by the American Electroencephalographic Society [29] ...... 20

Figure 8. A 3D Printer - MakerBot Z18 ...... 28

Figure 9. MakerBot Replicator 2 ...... 30

Figure 10. MakerBot 2 Extruder ...... 31

Figure 11. Hackberry Bionic Hand [50] ...... 32

Figure 12. Arduino Nano Microcontroller ...... 35

Figure 13. Neurosky Mindwave Mobile Headset ...... 38

Figure 14. High Attention Meter Value ...... 39

Figure 15. Low Attention Meter Value ...... 39

Figure 16. 3-axis Accelerometer/Gyro MPU6050 ...... 41

Figure 17. Bluetooth Communication Board HC-05 ...... 42

Figure 18. Bionic Hand Control System Flowchart ...... 45

Figure 19. The New Control System Flowchart ...... 47

Figure 20. Pinout of Bluetooth Communication Board HC-05 ...... 48

Figure 21. Pinout of 3-axis Accelerometer/Gyro MPU6050 ...... 48

Figure 22. Pinout of Arduino Nano Microcontroller ...... 48

Figure 23. Control System Hardware Implementation ...... 49

Figure 24. The PCB Board of the Control System ...... 50


Figure 25. Initial Setup Flowchart ...... 53

Figure 26. The Initial Pitch (a) and Current Pitch (b) of the Bionic Hand ...... 54

Figure 27. The Initial Roll (a) and Current Roll (b) of the Bionic Hand ...... 54

Figure 28. DMP Output Rate Reduction ...... 56

Figure 29. Attention Collection Flowchart ...... 56

Figure 30. Main Loop State Diagram ...... 58

Figure 31. The Modified Hackberry Bionic Hand with NMM Headset ...... 60

Figure 32. Six Prehensile Patterns ...... 62

Figure 33. The Test Environment...... 65

Figure 34. The Standard for Wearing the Sensor Tip ...... 65

Figure 35. The Standard for Wearing the Sensor Clip ...... 65

Figure 36. The Initial Position for Testing the Modified Bionic Hand ...... 66

Figure 37. The Feedback for System Initialization ...... 67

Figure 38. The Hand is in the Original Position ...... 68

Figure 39. The Hand is Swinging Slightly ...... 68

Figure 40. The Feedback When the Hand is Resting or Swinging Slightly ...... 68

Figure 41. The Feedback of Changing the Operating Mode to the Sleep Mode: a) Hand Opening, b) Hand Stopped, and c) Hand Closing ...... 69

Figure 42. The Modified Bionic Hand in the Gripping Mode ...... 70

Figure 43. The Feedback of Changing the Sleep mode to the Gripping Mode ...... 70

Figure 44. The Process of Closing the Hand in the Gripping Mode ...... 71

Figure 45. The Process of Opening the Hand in the Gripping Mode...... 71

Figure 46. The Feedback of Opening (a) and Closing (b) the Bionic Hand ...... 72

Figure 47. The Feedback of Changing the Sleep mode to the Pinching Mode ...... 73

Figure 48. The Modified Bionic Hand in the Pinching Mode ...... 73

Figure 49. The Process of Closing the Hand in Pinching Mode ...... 73


Figure 50. The Process of Opening the Hand in Pinching Mode...... 73

Figure 51. Task Times (a) and Index of Functionality (b) of the SHAP ...... 76

Figure 52. The Researcher is Standing ...... 79

Figure 53. The Researcher is Walking ...... 79

Figure 54. The Fingertips Covered by Mounting Tape ...... 82

Figure 55. The Process of Gripping the Glass Cup and Pouring Water into the Bucket ...... 83

Figure 56. Opening a Soda Can with Both Hands ...... 84

Figure 57. The Process of Gripping the Beverage Can and Pouring Soda into the Bucket ...... 84

Figure 58. Opening a Plastic Bottle with Both Hands ...... 85

Figure 59. The Process of Gripping the Plastic Bottle and Pouring Water into the Bucket ...... 85

Figure 60. The Process of Gripping the Paper Cup and Pouring Water into the Bucket ...... 86

Figure 61. The Process of Gripping the Mug and Pouring Water into the Bucket ...... 87

Figure 62. The Modified Fork...... 89

Figure 63. Holding the Modified Fork by Using the Bionic Hand ...... 89

Figure 64. The Process of Gripping the Mobile Phone and Plugging the Cable into the Phone ...... 91

Figure 65. Holding the First Book with the Bionic Hand ...... 93

Figure 66. Holding the Second Book with the Bionic Hand ...... 93

Figure 67. Holding the Third Book with Both Hands ...... 94

Figure 68. The Process of Turning a Magazine Page ...... 96

Figure 69. The Process of Plugging the USB Cable into the Port in the Laptop ...... 97

Figure 70. The Process of Zipping up the Zipper of a Jacket ...... 98


Figure 71. The Process of Writing the User’s Name...... 99

Figure 72. The Signature Signed by Using the Modified Bionic Hand ...... 100

Figure 73. The Process of Tying a Shoelace ...... 101


List of Tables

Table 1. The Bionic Hands’ Prices and Functionality ...... 15

Table 2. Neuroimaging Methods Used in BCI ...... 17

Table 3. The Comparison of Brainwave Bands ...... 19

Table 4. Arduino Micro’s Specifications ...... 36

Table 5. Arduino Nano's Specifications ...... 36

Table 6. The Commercial BCI Devices’ Channels and Prices ...... 37

Table 7. AT Commands for Configuring Bluetooth Module HC-05...... 51

Table 8. 12 abstract tasks and 14 activities of daily living ...... 75

Table 9. The Liquid Containers Gripping Test Result ...... 80

Table 10. Gripping Other Objects ...... 88

Table 11. Objects and Activities of the Pinching Test ...... 95


Acknowledgements

This thesis would not have been possible without the help of all the following people. Their guidance and assistance helped me overcome numerous challenges.

First and foremost, I would like to offer my utmost gratitude to my academic advisor and committee chair, Dr. Neb Jaksic, for his professional help and guidance throughout my graduate student years. He not only taught me knowledge but also supported me when I felt helpless. I express my appreciation to my committee members, Dr. DePalma and Dr. Ansaf. Their help was an essential part of completing this thesis. I also want to thank Dr. Yuan. She helped me adapt quickly to being a student in America.

Second, I am grateful to my friends Joanne, Laila, Curtis, and Chichih. Joanne, as my psychological counselor, gave me a lot of useful advice to help with my depression. Laila, who is at the International Programs (CIP) office, offered her professional writing skills when I had problems writing this thesis. Curtis helped me build the prototype of the bionic hand. Chichih, as my roommate, encouraged me through the tough times and supported me when I was troubled.

Third, I would like to give thanks to all my family. I could not have had a chance to study in America without their support. Their patience and encouragement led me to where I am today. I dedicate this work to them.

Finally, I wish to thank everyone else who has given me help, advice, and encouragement.


Abbreviations

3D Three-dimensional

BCI Brain-Computer Interface

DMP Digital Motion Processor

ECoG Electrocorticography

EEG Electroencephalogram

EMG Electromyography

fMRI Functional Magnetic Resonance Imaging

FFF Fused Filament Fabrication

FIFO First In, First Out

IDE Integrated Development Environment

IMU Inertial Measurement Unit

MEG Magnetoencephalography

NIRS Near Infrared Spectroscopy

PET Positron Emission Tomography

PLA Polylactic Acid

PWM Pulse Width Modulation

PCB Printed Circuit Board

R2 MakerBot® Replicator 2 Desktop 3D printer


GUI Graphical User Interface

SHAP Southampton Hand Assessment Procedure

ADLs Activities of Daily Living

IOF Index of Functionality


Chapter I Introduction

The motivation for this thesis is to help disabled people without hands to lead healthy lives by using bionic hands. Although many researchers developed bionic hands to solve this problem, most of these developments resulted in bionic hands with some disadvantages, such as prohibitive cost and instability. Hence, the goal of this thesis is to improve a Hackberry bionic hand to make it inexpensive and reliable. Finally, the structure of this thesis will be discussed.

1.1 Motivation

Hands, as one of the essential organs, are of high importance for human beings. Because of their excellent mobility and flexibility, humans can complete multiple complex actions with their hands. The thumb provides humans with a higher level of dexterity, which enables humans to accomplish tasks such as grasping, holding, picking, and lifting more efficiently than other creatures can. Moreover, with the enormous number of nerve endings in the fingertips, hands make it easier to acquire information about the surrounding environment, including pain, tactile information, and temperature. Additionally, it is worth noting that using hands is a typical way to express body language and sign language, which helps us communicate with others.

Therefore, hands are vital to humans as a way of interacting with the environment and building the world.


According to the World Health Organization report on disability, about ten percent of the world's total population consists of people with disabilities [1]. Of this percentage, some people are missing hands. Considering the crucial functions of hands, the life of people who are missing hands is complicated. It inevitably takes them much longer to accomplish something that is typically considered natural. For example, it is difficult for them to apply toothpaste onto a toothbrush, tie their shoes, or open a bottle. Furthermore, a person missing both hands usually cannot lead a healthy life. They cannot even eat by themselves easily. They have to learn to use their feet in place of their hands. Otherwise, they must be taken care of by their families.

Moreover, disabled people have a harder time finding a job. Most jobs require people to have hands, which limits the categories of work available to such disabled people. They cannot work in heavy manual labor or assembly-line jobs. More than that, disabled people are sometimes discriminated against just because they are missing hands.

A technology which can replace missing hands could help thousands of disabled people improve their lives, enhance their self-confidence, and participate in social activities. We, as engineers, feel obligated to use our knowledge to help disabled people. Therefore, the motivation of this thesis is to provide better bionic hands for the disabled.

1.2 Research Goals

Humans have been using artificial hands for over a thousand years. Such hands were used to help people who were missing hands to take care of their daily routines to a limited extent, but the traditional artificial hands are only cosmetic. The design of the artificial hand did not change for a thousand years until the invention of bionics.

During the third revolution of science and technology, automatic control devices significantly improved human productivity. However, traditional automatic control devices always use an open-loop system whose control action is not affected by the output [2]. Through constant exploration, humans have drawn inspiration from biological systems. Thus, cybernetics, which was proposed by Norbert Wiener in 1948, is now applied to various fields of engineering [3]. The basic idea of cybernetics is that machines communicate with organisms. In other words, the control systems of machines and organisms follow similar laws. Therefore, an organism can be considered a particular kind of machine. In bionics, scientists imitate the structure and specific skills of organisms to invent new technology.

Bionic hands were invented by understanding and imitating the structure and functions of human hands. The technology of artificial hands made a qualitative leap forward because of bionics. A bionic hand can typically move like a real hand. Now, there are many companies trying to create a better bionic hand. For example, a British company, RSLSteeper, invented a bionic hand named Bebionic which can tie shoes [4]. However, a whole Bebionic costs about $11,000. The price of such a bionic hand is too high for most disabled people. There is a need for a bionic hand with a better benefit-cost ratio.

Using three-dimensional (3D) printing can reduce the cost of a bionic hand. The price of a 3D printed bionic hand is usually from $200 to $1,500. There exists a low-cost bionic hand named Hackberry. The entire Hackberry hand costs $264, and its principal parts are produced by a 3D printer. The Hackberry bionic hand uses electromyography (EMG), so the control system relies on the portion of the arm that remains. If the user does not have a forearm, or the muscles in the forearm are not strong enough, the Hackberry unit will not work. Therefore, this thesis proposes an innovative method as a replacement for the current EMG control system in the Hackberry bionic hand.

Instead of EMG signals, speech commands, facial muscle movements, or EEG could be used as signals to control a bionic hand. However, the user may not be able to speak or move a facial muscle, depending on his/her degree of disability. Moreover, a user with a bionic hand may not want to attract the attention of the people around him/her while operating the hand. A bionic hand controlled by an EEG signal collects data directly from the brain, an approach that can be applied to most prosthetic hand users. Commanding an EEG-controlled bionic hand cannot be easily observed, which makes the user comfortable. Currently, EEG devices are commercially available and affordable, and EEG does not rely on the remaining muscles in the forearm to communicate with the bionic hand. Therefore, the Hackberry can be improved if EEG is used in its control system. The research goal of this thesis is to improve the 3D printed Hackberry hand by using EEG.


1.3 Structure of Thesis

This thesis has six chapters including an Introduction, Background and Literature Review, Hardware, Bionic Hand Control System, Testing of Bionic Hand, and Conclusions and Future Work.

In Chapter Two, Background and Literature Review, the history of artificial hands and the development of bionic hands are introduced. Then, the basic principles and applications of technologies that are used in this thesis are addressed. At the end of the chapter, the current research on the bionic hands controlled by brain waves is discussed. Differences between this work and other research projects are analyzed.

In Chapter Three, Hardware, all hardware technologies used in this thesis are explained. The methodology for choosing hardware, including the standards for choosing the bionic hand, sensors, and the microcontroller, is described. Then, hardware details are provided.

In Chapter Four, Bionic Hand Control System, the process of designing the control system based on EEG is detailed. Shortcomings of the original Hackberry are analyzed, and solutions are provided. The scheme for changing the original control system is addressed. Finally, the design of the circuit and software is described.

In Chapter Five, Testing of Bionic Hand, the testing process with associated results is presented. Designing the tests is explained at the beginning. The tests are divided into two parts. Firstly, multiple tests of the new control system that verify the reliability of the new control system are discussed. Secondly, multiple tests of the bionic hand that verify the performance of the modified bionic hand with respect to real-life applications are described.

In Chapter Six, Conclusions and Future Work, the improved bionic hand characteristics and recommended further work are summarized.


Chapter II Background and Literature Review

In this chapter, the history of prosthetics is described from the early recorded usage of prostheses to modern times. The history is followed by an introduction of bionic hands, the EEG, and the brain-computer interface (BCI). These technologies have a deep connection with this work. At the end of this chapter, an overview of the current research on 3D printed bionic hands controlled by EEG is provided.

2.1 History of prosthetics

Humans can lose their limbs because of congenital disease or trauma. Missing limbs disadvantage the amputee physically and mentally. Prostheses have been designed to help amputees perform everyday functions. Currently, prostheses include legs, arms, hands, eyes, and knee joints. In this section, upper extremity and lower extremity prostheses are mainly discussed.

The history of prosthetics can be traced back 4000 years. The first record of prosthesis use appears in the Vedas, a book written in Sanskrit in India between 3500 and 1800 B.C. In the book, Queen Vishpla lost a leg in battle, and the leg was replaced by an iron and wooden leg [5]. In 2000 AD, an artificial toe (Figure 1) made of wood and leather was found with an Egyptian mummy [6]. These are the oldest prostheses which have been discovered so far.


Figure 1. An Artificial Toe Found in Egypt

The first artificial hands were used to help soldiers hold weapons. When soldiers were using swords and other bladed weapons, they could quickly lose a hand. Some fortunate individuals that survived could not hold their weapons with two hands anymore. Since the world's population was small, it was important for soldiers to be able to return to battle quickly; therefore, prostheses could allow soldiers to fight again. For example, there was a Roman general who lost his right arm in the Second Punic War (218 – 201 B.C.). An iron arm was developed to help him hold his shield, and he was able to fight again [6].

During the Early Middle Ages, prostheses were mainly used by soldiers. A simple device with complex internal functions was included in the prosthesis to be used for fighting in battle. In this period, the typical appearance of a prosthesis was a peg leg or a hand hook. Prostheses underwent significant development from the Early Middle Ages to the Late Middle Ages. During this period, new shapes resembling actual body parts began to appear, and the materials used for prostheses were wood, copper, iron, and steel. Moreover, the functions of prostheses became more versatile for daily utilization [7]. Later, in 1508 AD, a German mercenary had a pair of iron hands (Figure 2) which could be manipulated by other people. Around 1512 AD, the usage of a multifunctional artificial hand in Asia was recorded by an Italian surgeon. This hand could remove hats, open purses, and sign the user's name. Ambroise Paré, the father of modern amputation surgery and prosthetic design, invented lower extremity prostheses with engineering features that are still in use [7].

Figure 2. An Iron Hand of a German Mercenary

From the 16th century to modern times, the main improvements to prostheses were in their mechanical structures. Using new mechanical structures decreased the weight of the prosthesis.


Also, the shape and appearance of the prosthesis became more like the real limb. Notably, the joints of the prosthesis became more elaborate, which made the prosthesis more functional. The artificial hand in Figure 3 was designed by Thomas Openshaw around 1916 [8]. This artificial hand had a metal hook connected to two fingers, which helped with daily use.

Figure 3. An Artificial Hand Designed by Thomas Openshaw

After World War I, military surgeons began to realize the importance of prosthetics, and this facilitated the formation of the American Orthotic & Prosthetic Association [7]. Thus, the development of artificial hands became more scientific. Currently, performance innovation has brought new opportunities for prosthetics. These innovations contributed to the development of Running Blades, a new type of leg prosthesis. Running Blades are made of fiber reinforced polymer. Because of the high performance, this prosthesis was accused of giving an unfair advantage to athletes like Oscar Pistorius (Figure 4). In the future, it is possible that the continuing development of prostheses will result in artificial limbs which are better than real limbs.

Figure 4. Oscar Pistorius with Running Blades

2.2 Technologies

2.2.1 Bionic Hands

Humans use hands to manufacture different tools, and as a result, they ascended to the top of the food chain. Because hands perform more complex functions, developing a prosthetic hand is more difficult than developing a simple prosthetic leg. However, from the hand hook to the bionic hand, humans have never given up seeking to develop a better prosthetic hand for amputees.


The bionic hand is not only a milestone of medical engineering excellence but also a source of hope for the disabled with missing hands. The bionic hand has a better aesthetic form and more powerful functions than the traditional prosthetic hand. Current bionic hands have been designed to have five movable fingers and a wrist. To reduce the weight, metal alloys, plastics, and biomaterials are often applied to the bionic hand. Surface materials of the bionic hand approximate the color of natural skin tone. Therefore, using the bionic hand becomes more acceptable [9].

The bionic hand can typically be divided into two parts: the mechanical part and the electronic part. The mechanical part is composed of the fingers, the palm, the wrist, and the forearm. The number of parts is not fixed; it usually depends on how much of the hand remains. The electronic system is composed of motors, the microcontroller, the power supply, and sensors. To achieve complex movements of fingers, the bionic hand integrates a digital control method. The myoelectric controller is most widely applied in commercially available bionic hands. The main controller can analyze the electrical signal from the neuromuscular unit of the hand and activate the motion of the bionic hand. An example of a bionic hand is shown in Figure 5. The user can control the fingers of the bionic hand to complete work through the control system.


Figure 5. Open Bionics

Myoelectric control, the most popular control method for bionic hands, has several directions of development. Firstly, targeted motor reinnervation can use residual connectivity and function of peripheral nerve stumps to assist myoelectric control and increase its precision [10]. Secondly, there is a technique that implants bipolar differential electromyographic electrodes into the muscle in the stump; this makes the system more capable because the number of control sources for controlling the hand is increased. Therefore, the accuracy of the myoelectric control system is improved as it is reading intramuscular electromyographic signals [11]. Thirdly, sonomyography, which uses ultrasonics, can be applied to detect the change in the size of contracting muscles in the stump [12].

Compared to myoelectric control, EEG control must work with the most complex organ – the brain. However, EEG control has the advantage that this type of control does not require a limb stump. EEG control acquires the signal from the brain and sends commands to an external device directly. Therefore, EEG control can be used in different situations of missing hands. Currently, researchers are focusing on enhancing the quality and the precision of the EEG signal for controlling a bionic hand. Fifer et al. [13] analyzed high-frequency neural population activity to derive the command signal by using intracranial EEG, which can improve the accuracy of controlling the bionic hand. McMullen [14] used a multi-control system for controlling the bionic hand. The system combined intracranial EEG, eye tracking, and computer vision. The new system attempted to prevent baseline false positives at system initialization. Zhang [15] tested two methods of EEG pattern recognition: C-support vector classifiers and the Back Propagation neural network. The authors claimed that the neural prosthesis system with non-invasive EEG is feasible and reliable. In this thesis, the 3D printed bionic hand controlled by EEG is discussed in greater detail in Section 2.3.

The price of a bionic hand varies over a relatively wide range. A bionic hand with more realistic functions costs $20,000 to $30,000 [16]. A very advanced bionic hand may cost as much as $100,000. Correspondingly, a bionic hand with limited function has a low cost. The price of a low-cost bionic hand is between $200 and $1,500. Table 1 shows the prices and functionality of bionic hands. Moreover, the price of a bionic hand is related to the level of limb loss [17]. The price of a bionic hand with a myoelectric arm up to the shoulder is three times as much as that of a bionic hand for replacing a partial loss of a hand.


Table 1. The Bionic Hands’ Prices and Functionality

Name of a Bionic Hand Price ($) Functionality

i-limb 60,000 Advanced

Bebionic 11,000 Advanced

Luke arm 100,000 Advanced

Dextrus 1,100 Single Function

Open Bionic 700 Single Function

Youbionic 226 Single Function

2.2.2 Brain-Computer Interface (BCI)

A brain-computer interface (BCI) is a connecting pathway between the brain of a human or animal and an external device [18]. In other words, people can control an external device without moving a muscle and get information about the environment without visual, auditory, olfactory, gustatory, or tactile feedback. In 1969, Eberhard Fetz and his research team at the University of Washington School of Medicine showed that monkeys could learn to use neural activity to control the deflection of a biofeedback meter arm (Figure 6) [19]. From then on, a remarkable technology - the BCI - entered people's awareness.


Figure 6. Brain-Computer Interface [19]

Currently, the BCI, as a technology in development, has various applications. For example, there is a new way of gaming called Neurogaming [20]. Instead of using traditional controllers, Neurogaming uses a wearable BCI to interact with a game console. For the Smart Home, the BCI will be combined with the Internet of Things [21]. People will use their minds to control household appliances, lights, doors, and home service robots. In medical science, the BCI can help people regain visual functions or recover from conditions like spinal cord injury, stroke, and epilepsy [22]. Moreover, there is an application called the neuromotor prosthesis [23, 24]. This application can help paralyzed humans replace lost motor functions and recover movement. In the future, the BCI may intensify the brain's functions by sending certain brain rhythms to brain regions, which can improve memory, perception, and the processing speed of the brain. For example, the BCI can help the brain create, consolidate, and recall memories.

There are several neuroimaging methods used in BCI: EEG, magnetoencephalography (MEG), electrocorticography (ECoG), positron emission tomography (PET), functional magnetic resonance imaging (fMRI), and near-infrared spectroscopy (NIRS) [25]. Among them, EEG is the most widespread neuroimaging method because an EEG device is relatively inexpensive, convenient, and simple [25]. The summary of neuroimaging methods is shown in Table 2.

Table 2. Neuroimaging Methods Used in BCI

Method | Activity Measured | Measurement | Temporal Resolution | Risk | Portability

EEG | Electrical | Direct | ~0.05 s | Non-invasive | Portable

MEG | Magnetic | Direct | ~0.05 s | Non-invasive | Non-portable

ECoG | Electrical | Direct | ~0.003 s | Invasive | Portable

PET | Metabolic | Direct | ~0.01 s | Non-invasive | Non-portable

fMRI | Metabolic | Indirect | ~1 s | Non-invasive | Non-portable

NIRS | Metabolic | Indirect | ~1 s | Non-invasive | Portable


A BCI consists of five consecutive stages: signal acquisition, signal enhancement, feature extraction, classification, and the control interface [25]. Based on the position of the sensors, BCIs can be divided into two groups: invasive and noninvasive [25]. Invasive BCIs are directly implanted into the brain's gray matter; they can acquire high-quality neural signals. Noninvasive BCIs do not need to be implanted into the skull; they are usually headset devices. The recorded neural signal cannot have a high resolution due to the skull barrier, but this type of device can be worn conveniently and has a low probability of triggering an immune reaction.

2.2.3 Electroencephalogram

At the highest level of the nervous system, the brain leads the advanced neural activities in the body, such as learning, language, memory, and intelligence. It is commonly known that nerve cells in the brain create bioelectricity while the brain is functioning. Bioelectricity generates voltage fluctuations. With the development of electronics, voltage fluctuations can be amplified and recorded using an EEG [26]. Currently, EEG is usually described in terms of rhythmic activity and is differentiated into bands by frequency: Delta, Theta, Alpha, and Beta [27]. Their comparison is shown in Table 3.


Table 3. The Comparison of Brainwave Bands

Brainwave Band | Frequency (Hz) | Mental State and Condition

Delta | < 4 | Adult deep sleep; in babies

Theta | 4-7 | Drowsiness in teens and adults; idling

Alpha | 8-15 | Relaxed / reflecting; closing the eyes

Beta | 16-31 | Active or busy thinking, focus, high alert, anxious

Gamma | > 32 | Higher mental activity; tactile sensations

The first record of an EEG was reported in the British Medical Journal. A physician named Richard Caton (1842-1926) published his finding of electrical phenomena from the brains of rabbits and monkeys. Hans Berger (1873-1941), a German physiologist and psychiatrist, was the person who recorded the first human EEG in 1924 [28]. Due to the developments of the EEG in the last hundred years, scientists have learned much more about our brain activities. The American Electroencephalographic Society standardized the International 10-20 System to ensure reproducibility and replicability of EEG research. Based on this standard, the electrodes are placed over the scalp as shown in Figure 7 [29]. In the figure, the Nasion and Inion are two reference points on the head used to locate the electrodes. Each letter corresponds to an individual brain region: "A", "C", "P", "Pg", "F", "Fp", and "O" represent the earlobe, the central region, the parietal, the nasopharyngeal, the frontal, the frontal polar, and the occipital area, respectively. The "10%" and "20%" refer to 10% and 20% of the total distance across the skull.

Figure 7. The International 10-20 System Standardized by the American Electroencephalographic Society [29]

EEG is applied widely to different areas of science and technology, especially in medicine. EEG is most often used to diagnose conditions such as epilepsy, sleep disorders, encephalopathy, and brain death. Moreover, mental and psychological problems could be treated by changing the characteristics of different EEG frequency bands. There is a new treatment named EEG neurofeedback which can help a person who has Attention Deficit Hyperactivity Disorder by enhancing or decreasing different frequency bands of the EEG [30, 31]. Currently, researchers are trying to promote the convergence between various disciplines and EEG. Therefore, the bionic hand has become a new direction for EEG research.

2.3 Previous Research

In recent years, bionic hands controlled by EEG have attracted significant attention. With the development of the EEG, people can control bionic hands via a BCI effectively. Currently, commercial EEG headsets are of decent quality and are reasonably priced. Therefore, the EEG control is beginning to be used in bionic hand designs.

Some studies [32-34] stress that EEG is the only signal needed for controlling 3D printed bionic hands. Bright [32] used the Neurosky headset as the EEG sensor to control a bionic hand. In this system, the raw data from the Neurosky headset is transmitted to a laptop. MATLAB on the laptop analyzes the data and transmits the command signal to the microcontroller in the bionic hand using a ZigBee module. Kasim [33] designed a LabVIEW graphical user interface (GUI) integrated with the Emotiv EEG headset. Through this GUI, the user can control the UiTM bionic hand by changing facial expressions. Elstob [34] proposed two software frameworks to control a bionic hand via the Emotiv EEG headset. In the first framework, the EEG signal is analyzed by the Emotiv Cognitive Suite, and the bionic hand is controlled through character input that results from the taught actions of the suite. In the second framework, the EEG signal is analyzed by OpenViBE for motor imagery tasks. Elstob claimed that the second framework was more desirable than the first framework. Although the methods [32-34] are reliable and low-priced, a personal computer is necessary to complete the system. Thus, the range of movements is rather limited. A bionic arm with an Emotiv headset was proposed by Beyrouthy [35]. The Raspberry Pi III encodes mind states from the Emotiv headset to represent different hand patterns and transmits them to an Arduino Mega in the bionic hand for controlling mechanical servo units. Additionally, this bionic hand is combined with a smart sensor network which can provide feedback of environmental information to improve the performance of the bionic hand.

Compared with the bionic hands in [32-34], the bionic hand in [35] abandoned the laptop and used the Raspberry Pi III for analyzing the EEG signal. This solved the problem of portability. However, the cost of the hand is still over $1000, which means it is a financial burden on the user. Nathan [36] and Saint-Elme [37] developed Arduino-based bionic hands using the Neurosky Mindwave Mobile (NMM) headset. In those two projects, the Arduino talks to the NMM headset directly and analyzes the EEG signal to obtain two valid data values: Attention and Meditation. The difference between the two studies is that the attention and meditation meter values are both used as control signals in [36], while only the attention meter value is used in [37]. Compared with the bionic hand in [35], the Raspberry Pi III is not required in these studies [36, 37]. Thus, the cost of these bionic hands has been reduced. However, these bionic hands have two disadvantages: first, the hand will move unintentionally while the user is paying attention to something else; second, the bionic hand cannot control the degree of the flexed movement, which means the bionic hand cannot grip objects with different shapes, sizes, and materials.

Other studies stress low-cost bionic hands that are controlled by an integration of EEG and various other technologies. Oppus's study [38] offered a control system for a low-cost bionic hand which uses the NMM headset and a voice recognition module. In this study, ten hand gestures were represented by a particular binary counter which counts from 0 to 31. The EEG signal from the NMM headset is processed and subsequently mapped into the ten hand gestures. Unfortunately, while the Arduino reads sensor data from both control modules, the two control modules do not interact. Therefore, the Arduino cannot enhance the control accuracy by using two control modules. Cheng [39] described a hybrid BCI system integrating EEG and eye-gaze tracking technology to control a low-cost bionic hand. In this system, EEG controls the bionic hand to grasp an item, and the eye-gaze tracker identifies the exact item wanted by the user. Cheng claims that using the hybrid BCI system can enhance the robustness and accuracy of hand actions.

However, the sensor of the system needs to be worn near the eyes of the user, which can cause a blind spot. Moreover, the system requires a computer to process the signal, which is inconvenient for the user. Taken from the studies of Wu [40] and Khan [41], the integration of EEG and EMG is realizable for controlling low-cost bionic hands. In Wu's study [40], the attention signal from the NMM headset and the Willison amplitude of the EMG signals are used to identify the beginning and end of the user's action intent. The speed and the force for opening and closing the bionic hand are determined by estimating the integrated EMG. In Khan's study [41], the subject's concentration level from the NMM headset is used to activate the bionic hand, and the muscle sensor voltage output is used to determine the number of active fingers and the amount of force. Although the two control systems can enhance the precision of control efficiently, the disadvantage of EMG remains. Disabled people who do not have the required muscle structure cannot use the Wu and Khan bionic hands.

Typically, the EEG control system is an open-loop control. A few studies [42, 43] have added feedback to the control system of the bionic hand. Owen [42] addressed a closed-loop control system for a bionic hand controlled by EEG, which used the signal from force sensors placed in the tips of the fingers as the feedback. The motor is triggered by the EEG signal from the NMM headset and stopped when a force sensor encounters an object. The advantage of this control system is that objects with different shapes and sizes can be grasped. However, the object in the bionic hand may slip because the grasp force cannot produce enough friction to hold the object. Li [43] designed a bionic hand controlled by the Emotiv headset with feedback. The control system recognizes the user's action intent by analyzing the EEG signal and drives the finger motors autonomously with perception feedback. This method of control is feasible because it is based on the Hidden Markov model. The feedback signals come from tactile and slip sensors in the fingers and the IMU sensor in the palm. Even though this bionic hand has advanced functions, it still relies on a computer to analyze the signals, which is inconvenient.


2.4 Conclusion

The history of artificial limbs has been introduced in this chapter. One can claim that the invention of the bionic hand might be of crucial importance for disabled people. Then, the related technologies have been addressed, including the bionic hand, the BCI, and EEG. In previous research, the low-cost bionic hand controlled by EEG has been highlighted. The three directions of the relevant research have been discussed: EEG as the sole signal source, EEG integrated with various technologies, and EEG with feedback. Although the low-cost bionic hand controlled by EEG has been improved, there are still several problems that need to be addressed.

In this thesis, a low-cost bionic hand controlled by the NMM headset will be designed.

Therefore, the following two problems will be addressed:

1. The hand moves unintentionally while the user is paying attention to something else.

2. The bionic hand cannot control the degree of the flexed movement, which means the bionic hand cannot grip objects of different shapes, sizes, and materials.


Chapter III Hardware

The hardware used in this thesis is discussed in this chapter. The hardware standards derived from the research goal's requirements are explained. Then, the hardware chosen for this thesis is introduced in detail, including a MakerBot® printer, the Hackberry hand, an Arduino, the MPU6050, and the HC-05.

3.1 The Methodology of Choosing Hardware

In this thesis, the hardware is the foundation of the control system. The high-quality operation of a bionic hand relies on appropriate hardware. There are many possible selections for hardware in the market. A good standard can help the researcher find appropriate hardware and save time. In other words, a good standard for hardware is the first step of building a reliable bionic hand. Therefore, it is necessary to discuss the details of the standard.

The research goal should be emphasized at the beginning because it is the basis for setting the standard. In this thesis, the research goal is to improve the Hackberry 3D printed bionic hand and make it more reliable and less expensive by using a BCI. The requirements based on the research goal are:

1. Hackberry printed by a 3D printer is required.

2. The BCI device is the first hardware device in this thesis.

3. Each hardware part used in this thesis must be compatible with the Hackberry hand.


4. The primary objective of this thesis is to improve the control system of a Hackberry bionic hand, so it is not necessary to change every mechanical part of the Hackberry hand.

5. High reliability, low energy consumption, and low cost are indispensable conditions for the hardware.

A Hackberry hand and a BCI are the essential hardware components. While the structure of Hackberry is fascinating, it is not the emphasis of this thesis. The control system circuit is changed to fit the BCI and Hackberry hand. This enhancement makes the control circuit the central part that is changed. The cost of hardware must be low, but the system reliability must be high enough for everyday use. Moreover, hardware must be energy efficient because the operating time requirement of the Hackberry hand is over 8 hours. Therefore, the hardware requirements can be summarized as follows: a Hackberry hand, a BCI device, and a new circuit controlled by EEG for Hackberry, where all the hardware is highly reliable and inexpensive.

3.2 MakerBot®

3D printing is regarded as one of the important inventions of the last century. Anyone can build a workshop on a tabletop instead of in a factory. Many great mechanical ideas became a reality because of the power of 3D printers. The revolutionary transformation of manufacturing is owed to this brilliant technology. When the bionic hand was combined with 3D printing, that contributed to the development of innovative ideas. Currently, there are many open-source bionic hands which can be found on the internet, and most of them are created by 3D printing [44].

Using 3D printing not only deals with complex parts in the bionic hand but also reduces the cost of a product. In this thesis, all parts of a Hackberry have been printed by a 3D printer – MakerBot® Replicator 2.

3.2.1 3D-Printing

3D printing is an additive manufacturing technology. The 3D printer (Figure 8) is a machine which uses rapid prototyping to synthesize a 3D object based on the digital model in which successive layers of material are formed. A 3D printer controlled by a computer can print an object in almost any shape [45].

Figure 8. A 3D Printer - MakerBot Z18

The idea of 3D printing originated in the U.S. in the late 1980s. The 3D printer was developed and promoted in the 1980s [46]. In the 1990s, the modern technology of material deposition was invented, while sacrificial material and support material became more and more common. Therefore, the process of 3D printing was improved tremendously [47]. After 2000, 3D printed objects became larger and more complex. As an example of this trend, the first car and plane have been printed during the last five years. Moreover, the technology of 3D printing is applied to other fields, such as art, food, military, and medicine [48]. There are several types of 3D printers: Extrusion, Light Polymerized, Powder Bed, Laminated, Powder Fed, and Wire. Fused Filament Fabrication (FFF) is the most popular technology for 3D printing. The FFF 3D printer, which was invented by S. Scott Crump in the late 1980s [48], has many open-source designs and costs less, but it needs to use support material because it has a few shape limitations.

Some of 3D printing's many advantages are:

1. 3D printing can reduce the waste of materials. Because it uses additive manufacturing, fewer scraps are produced during the process.

2. A 3D printer can easily print an object which has a complicated shape or internal structure.

3. A 3D printer does not need traditional tools to manufacture a part. It can transform digital models into real objects directly.

4. A 3D printer can print a product which is completely assembled.

3.2.2 MakerBot® Replicator 2

A MakerBot® Replicator 2 Desktop 3D printer (R2) (Figure 9) is used in this thesis for printing all parts of the Hackberry. The company, MakerBot®, was founded in January 2009 and mainly engages in the development and research of desktop 3D printing. MakerBot® provides desktop, compact, large, and experimental 3D printers.

Figure 9. MakerBot Replicator 2

The R2 was released on September 19th, 2012. It is the fastest, easiest to use, and most affordable device made by MakerBot® at this time [49]. The R2, as a desktop FFF 3D printer, can create high-quality 3D printed objects. For the R2, an extruder is the core of the 3D printer (Figure 10), and polylactic acid (PLA) is its filament. The 3D design file is sent to the machine on an SD card. Then the extruder heats and melts the filament and squeezes it out through a nozzle and onto the build plate. Finally, a solid object is made layer by layer.


Figure 10. MakerBot 2 Extruder

Although the R2 is a desktop 3D printer, it is still a powerful machine for research. The intelligent software analyzes the 3D file and controls the printing process accurately after a few steps of preparation. The R2 has a 100-micron layer resolution, so the objects printed by an R2 are of professional quality. Moreover, the dimensions of the build volume are 28.4 cm × 15.2 cm × 15.5 cm. This allows researchers to print multiple objects and large parts. Therefore, the R2 is an appropriate tool for this thesis.

3.3 Hackberry

Worldwide, the development of bionic hands has advanced in past decades. Bionic hands are accessible to people who need them because the technology can provide more functional and more reliable bionic hands. The current bionic hand works like the natural hand and can even perform better than the native hand. However, the high-quality bionic hands are expensive.


Although the low-cost bionic hands are in the development stage, researchers are showing great interest. Various inexpensive bionic hands can be found easily on the internet [44]. For example, Tack-Hand, Open Bionic, Galileo hand, and Hackberry are available. Many types of bionic hands are open source and use 3D printing.

Figure 11. Hackberry Bionic Hand [50]

The body of the Hackberry hand (Figure 11) is designed by a Japanese company called exiii [50]. Based on the production method, the Hackberry is divided into two parts: the 3D printed part and the non-3D printed part. The 3D printed part includes the fingers, palm, wrist, and socket. This part, which has many components, is made of PLA and created by a 3D printer. The non-3D printed part includes the controller circuit, wires, screws, springs, battery, sensor board, and joint axles. Both the circuit and sensor boards are Printed Circuit Boards (PCBs), while the rest are metal parts.

The Hackberry hand is a forearm prosthesis with EMG control. The Hackberry uses a photo reflector as the sensor and three servo motors as actuators. Moreover, the controller is an Arduino Micro. The hand uses an external battery to power the three servo motors, thereby moving the fingers. The three servo motors control the thumb, the index finger, and the rest of the fingers (middle). When the user activates the muscles of the forearm, the photo reflector detects the movement of the muscles, and the measured value is mapped to the rotational angle of a servo motor. The controller then makes the servo motor rotate to the corresponding angle.
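To make this sensor-to-servo mapping concrete, the following minimal Arduino sketch illustrates the general idea of mapping an analog sensor reading to a finger servo angle. It is only an illustrative sketch, not the actual Hackberry firmware; the pin numbers and the 0-120 degree angle range are assumptions chosen for the example.

#include <Servo.h>

Servo indexFinger;                 // one of the three finger servos
const int SENSOR_PIN = A0;         // hypothetical analog input from the photo reflector
const int SERVO_PIN = 9;           // hypothetical servo signal pin

void setup() {
  indexFinger.attach(SERVO_PIN);
}

void loop() {
  int muscle = analogRead(SENSOR_PIN);        // 0-1023 reading from the sensor
  int angle = map(muscle, 0, 1023, 0, 120);   // map the reading to a finger angle (assumed range)
  indexFinger.write(angle);                   // rotate the servo to the corresponding angle
  delay(20);                                  // small delay to limit the update rate
}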

The Hackberry looks like a regular hand and helps people with one hand to complete gripping and picking movements. Therefore, the Hackberry can satisfy some necessary life requirements. In this thesis, the total cost of the Hackberry hand is about $372, which includes $245 for the non-3D printed part, $9 for materials, $10 for an IMU sensor, $9 for a Bluetooth module, and $100 for an EEG headset. Moreover, the Hackberry bionic hand can be produced by a Digital Light Processing 3D printer which uses photopolymers. Using photopolymers increases the material cost from $9 to $68, and the total cost becomes $431. The mechanical design of the Hackberry is advanced and in accord with general aesthetic standards compared with other low-cost bionic hands. In conclusion, the Hackberry is selected for this thesis.


3.4 Arduino

As the brain of the automatic control equipment, a microcontroller is critical for the operation of any mechatronics device. A microcontroller is a single integrated circuit with a microprocessor, memory and programmable input/output peripherals. It has a robust computational instruction set in a small package with low-energy and low-cost. This made applications of microcontrollers reach throughout every field, such as mobile phones, automobiles, industrial control, home appliances, and electric machine control. Therefore, a microcontroller is necessary for the control system of a bionic hand.

Arduino is a user-friendly, popular microcontroller platform. It has open-source hardware with several types of boards and open-source software with hundreds of libraries. Therefore, Arduino can be used to complete numerous control functions, mainly because thousands of projects are published on the website and by the worldwide community. People can use and modify code freely from another project because Arduino is licensed under the GNU Lesser General Public License or the GNU General Public License [51]. Therefore, it is favored not only by academics but also by hobbyists. Moreover, the price of an Arduino board is lower than that of other platforms. The highest cost of an Arduino board is under $50. The Arduino Integrated Development Environment (IDE) software runs on many operating systems, like Windows, Macintosh OSX, and Linux. In conclusion, Arduino matches the hardware standard of this thesis.

Additionally, the original Hackberry hand is already using an Arduino board. Hence, the Arduino board is kept in this thesis, but the Arduino Nano replaces the Arduino Micro. Figure 12 shows both faces of an Arduino Nano board.

Figure 12. Arduino Nano Microcontroller

The original Hackberry hand uses the Arduino Micro. The specifications of the Arduino Micro are shown in Table 4 [52]. However, the Arduino Nano replaces the Arduino Micro in this thesis. The specifications of the Arduino Nano are shown in Table 5 [53]. Compared to the Arduino Micro, the Arduino Nano uses a different microcontroller (ATmega328) but has the same size of flash memory. The most crucial difference is the size of the PCB. The PCB size of the Arduino Nano is 18 mm x 45 mm, while the PCB of the Arduino Micro is 18 mm x 48 mm. The Arduino Nano is smaller than the Arduino Micro used in the original Hackberry hand, so the Arduino Micro is replaced by the Arduino Nano as the microcontroller of the Hackberry bionic hand.


Table 4. Arduino Micro’s Specifications

Microcontroller ATmega32u4

Digital I/O Pins 20

PWM Channels 7

Analog Input Channels 12

Flash Memory 32KB of which 4KB used by bootloader

SRAM 2.5KB

Clock Speed 16MHz

Table 5. Arduino Nano's Specifications

Microcontroller ATmega328

Digital I/O Pins 20

PWM Channels 7

Analog Input Channels 12

Flash Memory 32KB of which 2KB used by bootloader

SRAM 2KB

Clock Speed 16MHz


3.5 Neurosky Mindwave Mobile

With the development of BCIs, commercial devices using BCIs became available. The field of application involves medical treatment, education, and computer games. Commercial BCI devices mainly adopt non-invasive detection technology. Furthermore, commercial BCI devices tend to be portable, inexpensive, and user-friendly. Currently, various commercial BCI devices are available [54]: Muse (the brain sensing headband), Emotiv Epoc+ (the 14-node EEG headset), Emotiv Insight (the 5-node EEG headset), Ultracortex Mark IV (the open source EEG headset), and Neurosky Mindwave Mobile (the single-channel EEG headset). Channels and prices of these devices are shown in Table 6.

Table 6. The Commercial BCI Devices’ Channels and Prices

Name of the Headset Number of Channels Price ($)

Muse 4 249

Emotiv Epoc+ 14 799

Emotiv Insight 5 299

Ultracortex Mark IV 8 349.99

Neurosky Mindwave Mobile 1 99.99


Based on the comparison of devices in Table 6, the NMM headset is the least expensive. Although the NMM headset has only one channel, the signal provided by the NMM headset is sufficient for this thesis. The NMM is a BCI headset with a dry sensor system. Neurosky claims that this NMM headset (Figure 13) has a single channel touching the forehead and a reference point as an ear clip, and uses the ThinkGear™ technology to acquire and analyze the data from brainwaves [55].

Figure 13. Neurosky Mindwave Mobile Headset

The ThinkGear™ chip can measure analog brainwave signals and convert them to digital signals onboard. It provides the raw sampled brainwave data, a poor signal quality metric, EEG band power, and the eSense values. The eSense (i.e., Attention eSense and Meditation eSense) measures and quantifies attention and meditation on a scale from 1 to 100, which is an intuitive way to state the user's mental condition. For example, when the user sustains attention on a single thought or a specific subject, the attention meter value of the eSense is elevated (Figure 14). Otherwise, the attention meter value of the eSense is decreased (Figure 15).

Figure 14. High Attention Meter Value

Figure 15. Low Attention Meter Value
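As a rough illustration of how an Arduino can extract the Attention eSense value from this data stream, the sketch below parses ThinkGear packets (two 0xAA sync bytes, a payload length, data rows, and a checksum) arriving on a SoftwareSerial port. It is only a simplified sketch; the wiring pins and the 57600 baud rate are assumptions, and the complete firmware used in this thesis is given in Appendix A.

#include <SoftwareSerial.h>

SoftwareSerial headset(10, 11);   // RX, TX pins to the Bluetooth module (assumed wiring)

// Read one byte from the headset stream, blocking until it arrives.
int readByteBlocking() {
  while (!headset.available()) {}
  return headset.read();
}

// Parse one ThinkGear packet and return the Attention value (0-100), or -1 if none was found.
int readAttention() {
  if (readByteBlocking() != 0xAA) return -1;      // first sync byte
  if (readByteBlocking() != 0xAA) return -1;      // second sync byte
  int len = readByteBlocking();                   // payload length
  if (len > 169) return -1;
  byte payload[170];
  int checksum = 0;
  for (int i = 0; i < len; i++) {
    payload[i] = readByteBlocking();
    checksum += payload[i];
  }
  checksum = (~checksum) & 0xFF;                  // inverted low byte of the payload sum
  if (checksum != readByteBlocking()) return -1;  // bad checksum, discard the packet
  int attention = -1;
  for (int i = 0; i < len; i++) {
    if (payload[i] == 0x04) {                     // 0x04 = ATTENTION eSense row
      attention = payload[i + 1];
      i++;
    } else if (payload[i] == 0x02 || payload[i] == 0x05) {
      i++;                                        // POOR_SIGNAL / MEDITATION rows: skip one value byte
    } else if (payload[i] == 0x80 || payload[i] == 0x83) {
      i += payload[i + 1] + 1;                    // multi-byte rows: skip their length byte and data
    }
  }
  return attention;
}

void setup() {
  Serial.begin(9600);
  headset.begin(57600);                           // assumed baud rate of the Bluetooth link
}

void loop() {
  int a = readAttention();
  if (a >= 0) Serial.println(a);                  // print the latest attention meter value
}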

In this thesis, the NMM headset is chosen because it has the following advantages:

1. It is the least expensive option.

2. It provides excellent signal quality.

3. It has an unobtrusive nature, which can reduce the perceived stigma from the user.

4. The weight of the headset is about 90g.

5. The user feels comfortable while wearing it.


3.6 IMU Sensor: MPU6050

An inertial measurement unit (IMU) sensor is used in the control system. An IMU is an electronic input device combining gyroscopes and accelerometers, which measure the subject's specific force and angular rate [56]. Thus, an IMU can provide the position, orientation, and kinematics of a subject, and the subject's movement can be detected relative to a fixed coordinate system. Moreover, many IMUs also include a magnetometer, or an expansion interface for one, which can measure the magnetic field around the subject. Typically, IMUs are applied to aircraft, spacecraft, industrial equipment, and smartphones as motion sensors or compasses [57].

The MPU6050 is the first 6-axis motion processing unit in the world. The 3-axis gyroscope and the 3-axis accelerometer are contained in a single small package (30.5 mm x 20.3 mm x 5.1 mm), which not only reduces the size of the sensor but also provides the gyroscope and accelerometer outputs at the same time. The MPU6050 also has a Digital Motion Processor (DMP) and a 1024-byte First In First Out (FIFO) buffer. The DMP can enhance the efficiency of calculations, and the 1024-byte FIFO buffer can store the data in bursts, which reduces power consumption. Moreover, the sensor provides an I2C port for communicating with the Arduino. In the MPU6050, the full-scale range of the gyroscope is ±250, ±500, ±1000, or ±2000°/sec (dps), and the full-scale range of the accelerometer is ±2g, ±4g, ±8g, or ±16g. These performance measures show that the MPU6050 has high precision for tracking both fast and slow motions. Furthermore, the cost of the MPU6050 is under $6, while comparable sensors cost about $10. In short, the MPU6050 can be considered an Arduino-friendly IMU sensor featuring low power consumption, low cost, and high precision. Therefore, the MPU6050 is used in this thesis. Both sides of the MPU6050 are shown in Figure 16.

Figure 16. 3-axis Accelerometer/Gyro MPU6050
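As an illustration of the I2C communication mentioned above, the following minimal sketch (not part of the thesis program in Appendix A) wakes the MPU6050 and reads its raw accelerometer and gyroscope registers. It assumes the standard MPU6050 register map and the default I2C address 0x68 (AD0 pin low).

#include <Wire.h>

const uint8_t MPU_ADDR = 0x68;        // default MPU6050 I2C address (AD0 low)

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x6B);                   // PWR_MGMT_1 register
  Wire.write(0);                      // clear the sleep bit to wake the sensor
  Wire.endTransmission(true);
}

void loop() {
  Wire.beginTransmission(MPU_ADDR);
  Wire.write(0x3B);                   // ACCEL_XOUT_H: start of the output registers
  Wire.endTransmission(false);
  Wire.requestFrom((int)MPU_ADDR, 14, 1);   // accel (6), temperature (2), gyro (6)

  int16_t ax = (Wire.read() << 8) | Wire.read();
  int16_t ay = (Wire.read() << 8) | Wire.read();
  int16_t az = (Wire.read() << 8) | Wire.read();
  Wire.read(); Wire.read();           // skip the temperature registers
  int16_t gx = (Wire.read() << 8) | Wire.read();
  int16_t gy = (Wire.read() << 8) | Wire.read();
  int16_t gz = (Wire.read() << 8) | Wire.read();

  Serial.print(ax); Serial.print('\t'); Serial.print(ay); Serial.print('\t'); Serial.println(az);
  Serial.print(gx); Serial.print('\t'); Serial.print(gy); Serial.print('\t'); Serial.println(gz);
  delay(100);
}

In the thesis itself the raw registers are not read directly; the Digital Motion Processor output is used instead, as described in Section 4.3.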

3.7 Bluetooth Module: HC-05

Bluetooth is a widely used wireless technology standard for exchanging data over short distances between devices. Because the NMM headset transmits over Bluetooth, a Bluetooth module had to be chosen as the receiver of the control system. In this thesis, the HC-05 is the receiver of the control system, because connecting the NMM headset to the Arduino through the HC-05 had already been demonstrated in a previous project [58].

According to its user manual [59], the HC-05 is a Bluetooth serial interface module built around the mainstream CSR BC417 Bluetooth chip and supporting the Bluetooth V2.0 SPP protocol. The dimensions of the HC-05 board are 15 mm x 28 mm x 2.5 mm, and the price is about $9; hence, the HC-05 is small and inexpensive. The HC-05 supports both master and slave modes and various baud rates, which can meet different users' demands. Moreover, it is a user-friendly module that can be configured easily with AT commands. Both sides of the HC-05 used in this thesis are shown in Figure 17.

Figure 17. Bluetooth Communication Board HC-05


Chapter IV Bionic Hand Control System

In this chapter, the process of designing a new bionic hand control system is presented.

Firstly, user requirements are defined based on the goal of this thesis. Then the control system, comprising hardware and software parts, is introduced in detail.

4.1 System Design

Currently, EMG control is applied in most low-cost bionic hands. However, EMG control suffers from a serious limitation. Because the EMG signal is generated by the muscle, at least one sensor must be placed on the stump. Such a bionic hand cannot help users who have injuries to the muscles or nerves of the limb. Therefore, the bionic hand requires a new technology for its control system.

A BCI that detects EEG signals does not have the same limitation as EMG-based control. The user is only required to wear an EEG headset, and the BCI acquires the signal from the brain to control the bionic hand. In the previous research, several applications of commercial EEG headsets have been discussed. It must be noted that few low-cost bionic hands are controlled by an EEG headset; thus, this research area has enormous potential for development. Most current inexpensive bionic hands controlled by EEG require a computer to analyze the EEG signals. Although the physical dimensions of current personal computers can be small, the total cost is still prohibitively high for most disabled persons. The ThinkGear™ technology used in the NMM headset can send the EEG serial data stream directly to the Arduino through Bluetooth, so the bionic hand can be controlled by EEG signals without a computer. Therefore, the cost of an EEG-controlled bionic hand is further reduced by using the NMM headset.

This work aims to modify the Hackberry bionic hand with the NMM headset. As stated previously, the Hackberry bionic hand with the NMM headset has two disadvantages. The first disadvantage is that the hand may move unintentionally while the user is paying attention to something else. To deal with this problem, the control system must determine whether the user's concentration is intended for operating the hand. In reality, the forearm moves before the fingers flex. Hence, a hand's position usually changes when a person intends to use it. The MPU6050 is an IMU sensor which can detect the orientation of a hand. Based on the changes in orientation data from the MPU6050, the position/orientation of the bionic hand can be calculated. Therefore, based on the information from the IMU sensor, it can be determined whether the user wants to use the bionic hand.

The second disadvantage of the Hackberry bionic hand is that, when controlled by the NMM headset, it cannot be stopped at a specific position, which further limits its use. In the previous research [36], the servo motors in the bionic hand are driven when the eSense meter value from the NMM headset becomes higher than the threshold value. However, this system is not a continuous real-time control system. Namely, the bionic hand can only be fully opened or fully closed; partial extension of the hand is not possible. The control system in this study monitors the change in the eSense meter value in real time. While the servo motors are rotating, the eSense meter value is compared to the threshold value, and the system judges whether to keep driving the servo motors. Therefore, the user can decide the fingers' position by controlling his/her attention.

As previously mentioned, a new control system based on the Hackberry bionic hand, the

NMM headset, the Bluetooth module, and the IMU sensor has been developed. The block diagram of the new control system is depicted in Figure 18.

Figure 18. Bionic Hand Control System Flowchart


In the beginning, the entire system is initialized, and the initial orientation (i.e., pitch, yaw, and roll) is calculated from the MPU6050. Then, the orientation of the hand is monitored continuously. When the difference between the initial orientation and the current orientation exceeds the threshold value, the control system determines that the user wants to use the bionic hand. If so, the attention collection algorithm starts to analyze the signal from the headset and obtains the attention meter value through the eSense. The attention meter value is divided into three levels: the low attention level, the normal attention level, and the high attention level. If the user wants to close the bionic hand for gripping or pinching an object, he/she needs to concentrate and raise the attention meter value. When the system detects that the attention meter value indicates the high attention level, the servo motors are driven forward, and the bionic hand starts closing. If the user wants to open the bionic hand for releasing an object, he/she needs to relax and reduce the attention meter value. When the system detects that the attention meter value indicates the low attention level, the servo motors move backward, and the bionic hand starts opening. When the system detects that the attention meter value indicates the normal attention level, the user wants to hold an object, and the servo motors are paused at the current position.

While the servo motors are running, the user's attention meter value and the hand's orientation are monitored in real time. If the user changes the attention level, the operation of the servo motors changes accordingly. If the difference in the orientation data falls below the threshold value, the system determines that the user has finished using the bionic hand, and all servo motors return to their initial positions. Figure 19 shows the operating process of the new control system.

Figure 19. The New Control System Flowchart


4.2 Hardware Design

To achieve the required functions of the new control system, an appropriate hardware system is designed on a solderless breadboard. The size of the breadboard is 5.5 cm × 8.2 cm ×

0.85 cm, which is almost the same size as the palm. The new control system includes an Arduino

Nano microcontroller, a Bluetooth communications module HC-05, a 3-axis accelerometer/gyro

MPU6050, and a power supply. The pinout of the HC-05 is shown in Figure 20, the pinout of the MPU6050 is shown in Figure 21, and the pinout of the Arduino Nano is shown in Figure 22.

Figure 20. Pinout of Bluetooth Communication Board HC-05    Figure 21. Pinout of 3-axis Accelerometer/Gyro MPU6050

Figure 22. Pinout of Arduino Nano Microcontroller


The HC-05 requires four pins of the Arduino Nano: +5V, ground, TX, and RX, while the MPU6050 requires two analog pins, a digital pin, and the power supply to connect to the Arduino Nano. In this project, A4 and A5 are chosen as the two analog pins (the Nano's I2C pins), and D2 is selected as the digital pin. Moreover, D3, D5, and D6 are selected for controlling the servo motors because the output signals of these pins are Pulse Width Modulation (PWM) enabled. An LED connects to the digital pin D11, and a slide switch connects to D12 and ground. The circuit shown in Figure 23 is designed using EasyEDA.

Figure 23. Control System Hardware Implementation
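For reference, the wiring described above can be summarized as a set of pin constants. This is only a sketch; in particular, the assignment of the three PWM pins to individual fingers is an assumption rather than something stated in the text.

// Wiring summary of the new control system on the Arduino Nano.
// A4/A5 are the Nano's hardware I2C pins and are used automatically by the Wire
// library; the HC-05 uses the hardware serial pins TX/RX.
const int MPU_INT_PIN = 2;    // digital pin connected to the MPU6050
const int THUMB_PIN   = 3;    // servo motor (PWM) - finger assignment assumed
const int INDEX_PIN   = 5;    // servo motor (PWM) - finger assignment assumed
const int MIDDLE_PIN  = 6;    // servo motor (PWM) - finger assignment assumed
const int LED_PIN     = 11;   // status LED
const int SWITCH_PIN  = 12;   // slide switch to ground (read with an internal pull-up)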


For stability and usability, a PCB was made later. Figure 24 shows the PCB mounted on the Hackberry bionic hand. The size of the PCB is 80 mm × 60 mm. The Arduino Nano and a DC converter are on the other side of this PCB. This board is used in the standardized assessment procedure, while the solderless breadboard is used in the physical application tests.

Figure 24. The PCB Board of the Control System

To allow the NMM headset to talk to the Arduino, the HC-05 should be paired with the NMM headset. Therefore, the HC-05 needs to be configured before use. The configuration settings of the HC-05 are:

1. The HC-05 is in the master role.

2. The passkey is 0000.

3. The baud rate is 57600.


4. The HC-05 works in Auto-Connect mode.

5. The HC-05 is bound to the unique Bluetooth address of the headset.

Table 7. AT Commands for Configuring Bluetooth Module HC-05

Command Response Command explanation

AT OK Test

AT+UART=57600,0,0 OK Set the baud rate

AT+PSWD=0000 OK Set the passkey

AT+ROLE=1 OK Set the master role

AT+CMODE=0 OK Set the connection mode so that the module connects only to the specified Bluetooth address

AT+BIND=2471,89,743597 OK Bind Bluetooth address of the NMM headset

The HC-05 is configured by using AT commands. Terminal software on a computer connects to the HC-05 via a TTL-to-USB interface. To put the HC-05 into command mode, the button on the HC-05 is held down while the module is powered on. The AT commands used to configure the HC-05 are shown in Table 7. After the HC-05 is configured, the Hackberry hand can automatically pair with that specific NMM headset via the HC-05.
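As an alternative to a TTL-to-USB adapter, an Arduino can act as a simple serial bridge so that the AT commands in Table 7 can be typed directly into the Arduino IDE's serial monitor. The sketch below is only an illustration of this common approach: the pin choices are hypothetical, and it assumes the HC-05 has been powered up with its button held so that it is in command mode at 38400 baud.

#include <SoftwareSerial.h>

// Hypothetical wiring: HC-05 TXD -> D10, HC-05 RXD -> D11 (through a voltage divider).
SoftwareSerial btSerial(10, 11);   // RX, TX

void setup() {
  Serial.begin(9600);      // serial monitor on the computer
  btSerial.begin(38400);   // HC-05 command-mode baud rate
}

void loop() {
  // Forward every character between the serial monitor and the HC-05,
  // so AT commands can be typed and the "OK" responses observed.
  if (Serial.available())   btSerial.write(Serial.read());
  if (btSerial.available()) Serial.write(btSerial.read());
}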


4.3 Software Design

In this thesis, an Arduino Nano is chosen as the microcontroller. Therefore, the Arduino

IDE is used to develop the algorithm for the control system. Based on the system design in

Section 4.1, the program for this thesis has four main parts, which include the Initial Setup,

Orientation Judgment, Attention Collection, and the Main Loop. The complete program is presented in Appendix A.

Initial Setup is the initialization of the bionic hand. It runs only once at the beginning of the program execution. Figure 25 shows the operating process of Initial Setup. In this part, the digital pin D11 is initialized as an output for the LED, and the LED is turned on. The servo motors are attached to pins of the Arduino Nano, and they rotate to the initial position, which is the fully-open position of the bionic hand. Then, the MPU6050 is connected to the Arduino and initialized. After this, Orientation Judgment starts to calculate the orientation of the hand. Since the orientation data contain some errors at the beginning of the program execution, the program is delayed and must wait for the orientation to be corrected. The initial orientation is saved once the Orientation Judgment output is stable. The hand resting naturally at the side of the body is considered the initial orientation. Finally, the LED is turned off, which informs the user that the bionic hand is ready for use.


Figure 25. Initial Setup Flowchart
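The following sketch outlines the Initial Setup sequence described above. It is a simplified illustration rather than the program in Appendix A: the servo angle for the fully-open position, the servo speed values, and the settling delay are assumptions, and the actual reading of the initial orientation (done by Orientation Judgment) is only indicated by a comment.

#include <Wire.h>
#include "I2Cdev.h"
#include "MPU6050_6Axis_MotionApps20.h"
#include <VarSpeedServo.h>

const int LED_PIN    = 11;
const int THUMB_PIN  = 3;     // finger-to-pin assignment assumed
const int INDEX_PIN  = 5;
const int MIDDLE_PIN = 6;
const int OPEN_POS   = 0;     // assumed fully-open servo angle

VarSpeedServo thumb, indexFinger, middleFinger;
MPU6050 mpu;
float initialPitch, initialRoll;          // saved once the DMP output settles

void setup() {
  pinMode(LED_PIN, OUTPUT);
  digitalWrite(LED_PIN, HIGH);            // LED on: initialization in progress

  thumb.attach(THUMB_PIN);
  indexFinger.attach(INDEX_PIN);
  middleFinger.attach(MIDDLE_PIN);
  thumb.write(OPEN_POS, 30, false);       // rotate to the fully-open position
  indexFinger.write(OPEN_POS, 30, false);
  middleFinger.write(OPEN_POS, 30, false);

  Wire.begin();
  mpu.initialize();                       // connect and initialize the MPU6050
  mpu.dmpInitialize();
  mpu.setDMPEnabled(true);

  delay(20000);                           // assumed settling delay for the DMP output
  // ... read the stable pitch and roll here (see the FIFO sketch in this section)
  // and store them in initialPitch and initialRoll ...

  digitalWrite(LED_PIN, LOW);             // LED off: the hand is ready for use
}

void loop() { /* Orientation Judgment, Attention Collection, and the Main Loop */ }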

Orientation Judgment is based on an MPU6050 library created by Rowberg [60]. This algorithm can calculate the yaw, pitch, and roll of the MPU6050, corresponding to the orientation of the bionic hand. This work uses only the pitch and roll, because the yaw can be changed simply by turning the user's body around. Usually, the user does not want to use the bionic hand when it rests naturally or swings slightly alongside the body; i.e., the user wants to use the bionic hand when a longer-range movement is made, which appears as a large absolute difference between the two orientations. Equations (1) and (2) give the absolute differences between the current orientation and the initial orientation.


θ = |θ_n − θ_1|    (1)

φ = |φ_n − φ_1|    (2)

where θ_1 and φ_1 are the angular values representing the initial pitch (Figure 26.a) and the initial roll (Figure 27.a), θ_n and φ_n are the angular values representing the current pitch (Figure 26.b) and the current roll (Figure 27.b), and θ and φ are the absolute differences of the pitch and the roll (Figures 26.b and 27.b).


Figure 26. The Initial Pitch (a) and Current Pitch (b) of the Bionic Hand


Figure 27. The Initial Roll (a) and Current Roll (b) of the Bionic Hand


In this part of the program, there is a variable called "hand mode". The hand mode determines the system state of the bionic hand (i.e., operating mode or sleep mode). When θ or φ is higher than its threshold value, the hand mode is "1". When both θ and φ are below their threshold values, the hand mode is "0". The initial system state is the sleep mode. The system state changes from the sleep mode to the operating mode if the hand mode is "1". Conversely, the system state stays in or switches back to the sleep mode if the hand mode is "0". Therefore, the modified Hackberry bionic hand has a workspace. The workspace is entered when the user swings the prosthetic arm and the bionic hand rotates more than ±30° about the pitch axis or ±25° about the roll axis. The user can change the workspace by adjusting the initial orientation of the bionic hand, which can be done easily by restarting the hand while pointing it in the desired direction.
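A minimal sketch of this decision is shown below, assuming the current pitch and roll (in degrees) have already been obtained from the DMP; the function and variable names are illustrative, not those of the program in Appendix A.

const float PITCH_THRESHOLD = 30.0;   // degrees, from the workspace definition above
const float ROLL_THRESHOLD  = 25.0;

int handMode = 0;                     // 0 = sleep mode, 1 = operating mode

void updateHandMode(float pitch, float roll,
                    float initialPitch, float initialRoll) {
  float theta = fabs(pitch - initialPitch);   // Equation (1)
  float phi   = fabs(roll  - initialRoll);    // Equation (2)

  if (theta > PITCH_THRESHOLD || phi > ROLL_THRESHOLD) {
    handMode = 1;   // a longer-range movement: the user wants to use the hand
  } else {
    handMode = 0;   // resting or swinging slightly: stay in (or return to) sleep mode
  }
}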

It is worth mentioning that the Digital Motion Processor (DMP) output rate in this library is high. The FIFO buffer can overflow easily while other parts of the program are running; in such a case, an error is reported. To deal with this problem, a two-part solution is implemented. First, the DMP output rate is reduced; details are provided in Figure 28. Second, a library created by Oudert [61] is used to allocate the processor resources of the Arduino Nano. This library is a cooperative scheduler which implements multithreading on the Arduino. When one thread cannot use all of the computing resources, the idle resources are used to execute another thread. Therefore, the data in the FIFO buffer are processed as soon as possible, and the possibility of the FIFO buffer overflowing is reduced.


Figure 28. DMP Output Rate Reduction
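Even with the reduced output rate, the FIFO must be read defensively. The fragment below condenses the usage pattern of Rowberg's DMP example and continues from the setup sketch earlier in this section (it reuses the mpu object); the variable names are illustrative.

uint16_t packetSize;        // set to mpu.dmpGetFIFOPacketSize() after dmpInitialize()
uint8_t  fifoBuffer[64];
Quaternion q;
VectorFloat gravity;
float ypr[3];               // yaw, pitch, roll in radians

void readOrientation() {
  uint16_t fifoCount = mpu.getFIFOCount();
  if (fifoCount == 1024) {                 // the buffer overflowed: discard stale data
    mpu.resetFIFO();
    return;
  }
  while (fifoCount < packetSize) {         // wait until a full packet is available
    fifoCount = mpu.getFIFOCount();
  }
  mpu.getFIFOBytes(fifoBuffer, packetSize);

  mpu.dmpGetQuaternion(&q, fifoBuffer);
  mpu.dmpGetGravity(&gravity, &q);
  mpu.dmpGetYawPitchRoll(ypr, &q, &gravity);   // ypr[1] = pitch, ypr[2] = roll
}

The returned angles are in radians and would need to be converted to degrees (multiplying by 180/π) before being compared with the thresholds in the previous sketch.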

Attention Collection is based on a project published by Bouchti [62]. The attention meter value of the eSense, specifying the intensity of the user's concentration, is one of the outputs of the ThinkGear™ chip in the NMM headset. This part of the program collects the stream of bytes from the NMM headset and outputs the attention meter value when the stream contains valid ThinkGear packets. Figure 29 shows the Attention Collection flowchart.

Figure 29. Attention Collection Flowchart

When Attention Collection is invoked, the Arduino starts to parse the stream of bytes from the NMM headset. If the current data stream has a packet header, which includes two synchronization bytes and a payload length byte, it is a ThinkGear packet with a valid payload. In the payload, the attention meter value is an unsigned one-byte value which ranges from 0 to 100.

When the attention meter value is higher than 70, the intensity of the user’s concentration is considered to be at the high attention level, and the move mode (a global variable of the program) is “1”. When the attention meter value is lower than 35, the intensity of the user’s concentration is considered to be at the low attention level, and the move mode is “2”. When the attention meter value is lower than 70 but higher than 35, the intensity of the user’s concentration is considered to be at the normal attention level, and the move mode is “3”.
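The mapping from the attention meter value to the move mode can be sketched as follows. This is only an illustration of the thresholds described above, with hypothetical names, and it assumes the attention value has already been extracted from a valid ThinkGear packet.

int moveMode = 3;                       // 1 = close, 2 = open, 3 = pause

void updateMoveMode(int attention) {    // attention: 0-100 from the eSense meter
  if (attention > 70) {
    moveMode = 1;                       // high attention level: close the hand
  } else if (attention < 35) {
    moveMode = 2;                       // low attention level: open the hand
  } else {
    moveMode = 3;                       // normal attention level: hold the position
  }
}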

The Main Loop controls the fingers' positions. This portion of the program is based on a library called VarSpeedServo created by the NETLab Toolkit Group [63]. The state diagram in Figure 30 shows the behavior of the Main Loop. When Orientation Judgment changes the state of the new control system to the sleep mode (hand mode = 0), all three servo motors move back to their initial positions. When Orientation Judgment changes the state of the bionic hand to the operating mode (hand mode = 1), the thumb's motor rotates and causes the thumb to close fully. The digital pin D12 is an input for the slide switch. If the state of digital pin D12 is HIGH (gripping mode), the index and middle fingers' motors rotate based on the move mode from Attention Collection. If the state of digital pin D12 is LOW (pinching mode), the middle fingers' motor rotates and causes the middle fingers to close fully, and only the index finger's motor rotates based on the move mode from Attention Collection. When the current attention meter value indicates the high attention level (move mode = 1), the fingers' servo motors rotate, closing the bionic hand. When the current attention meter value indicates the low attention level (move mode = 2), the fingers' servo motors rotate in the opposite direction, which means the bionic hand opens. When the current attention meter value indicates the normal attention level (move mode = 3), the fingers' servo motors pause at the current position, which means the bionic hand stops. During the rotation of the index and middle fingers' motors, the hand mode and move mode are monitored, so the operating state of the bionic hand is controlled in real time.

Figure 30. Main Loop State Diagram
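A condensed sketch of this behavior is shown below. It reuses the servo objects, hand mode, and move mode from the earlier sketches in this section; the servo angles and speed values are assumptions, and VarSpeedServo's stop() call is used here to pause a finger at its current position.

const int SWITCH_PIN = 12;    // slide switch; configured with an internal pull-up in setup()
const int OPEN_POS   = 0;     // assumed fully-open servo angle
const int CLOSED_POS = 180;   // assumed fully-closed servo angle

void mainLoopStep() {
  if (handMode == 0) {                            // sleep mode: reset the hand
    thumb.write(OPEN_POS, 30, false);
    indexFinger.write(OPEN_POS, 30, false);
    middleFinger.write(OPEN_POS, 30, false);
    return;
  }

  thumb.write(CLOSED_POS, 30, false);             // operating mode: close the thumb first
  bool gripping = (digitalRead(SWITCH_PIN) == HIGH);
  if (!gripping) {
    middleFinger.write(CLOSED_POS, 30, false);    // pinching mode: middle fingers close fully
  }

  switch (moveMode) {
    case 1:                                       // high attention: close
      indexFinger.write(CLOSED_POS, 30, false);
      if (gripping) middleFinger.write(CLOSED_POS, 30, false);
      break;
    case 2:                                       // low attention: open
      indexFinger.write(OPEN_POS, 30, false);
      if (gripping) middleFinger.write(OPEN_POS, 30, false);
      break;
    default:                                      // normal attention: pause in place
      indexFinger.stop();
      if (gripping) middleFinger.stop();
      break;
  }
}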


Chapter V Testing of Bionic Hand

Several bionic hand-specific tests are designed and implemented. The test designs include the test goal, the test method, the test subject, and the experimental procedure. Test implementations are described in detail. Each test in this thesis consists of two parts: the hardware/software verification, and the application. The test results are discussed at the end of the chapter.

5.1 Test Design

The research goal of this thesis is to improve the Hackberry 3D printed hand by using

EEG. A new control system described in Chapter 4 has been designed for this purpose. Instead of using photo-reflective sensors, this bionic hand system uses the EEG headset to acquire the signals from the brain and control the Hackberry bionic hand. The new control system needs to function at least as well as the one operating the original Hackberry hand wirelessly. Moreover, applications for the modified bionic hand should be based on some functions of real human hands. Therefore, the test goal is to prove that the modified bionic hand operates as intended and that it can be useful as a prosthesis. The method to carry out this thesis involves experimentation.

The subject of the tests is the modified Hackberry bionic hand controlled by the NMM headset

(Figure 31).


Figure 31. The Modified Hackberry Bionic Hand with NMM Headset

Again, to verify that the modified bionic hand performs satisfactorily, off-line verification tests as well as application tests are performed. The hardware/software verification assesses how closely the new control system meets the set expectations. Within the expected working environment, the new control system allows the user to control the modified bionic hand by using EEG signals without exhibiting the two disadvantages of the original Hackberry hand discussed in Chapter 4. Again, the first disadvantage is that the hand moves unintentionally while the user is paying attention to something else. The second disadvantage is that the bionic hand cannot control the degree of finger flexion. To observe the control program while it is running, the Arduino in the modified bionic hand is connected to a laptop, and the feedback is displayed on the serial monitor of the Arduino IDE while the modified bionic hand is operating.

Therefore, the procedure for the verification experiments is:


1. Connect the Arduino Nano in the modified bionic hand to a laptop and open the

Arduino IDE.

2. Wear and power the NMM headset.

3. Power the modified bionic hand when its fingers are perpendicular to the ground.

4. Open a serial monitor in the Arduino IDE and observe the feedback from the bionic

hand.

5. Control the modified bionic hand by using the brain and verify that the fingers’

movement and the feedback shown on the serial monitor are as expected.

6. Repeat the test several times and analyze the results.

The results are the feedback from the Arduino and the current position of the fingers.

When the result of the verification programs is satisfactory, this indicates that the new control system complies with expectations. Consequently, the modified Hackberry bionic hand is ready for physical application testing.

A standardized procedure is required to evaluate the functionality of the modified bionic hand and ensure the reliability and validity of the study. The Southampton Hand Assessment

Procedure (SHAP) [62] is an assessment tool which can be used to measure the prosthetic hand function in a clinical environment. In the SHAP, the hand gestures are classified into six prehensile patterns (Figure 32): tripod pinch, tip pinch, lateral pinch, power grip, spherical grip, and extension grip (different from the original). A scoring system is created to reflect each of the prehensile patterns.


Figure 32. Six Prehensile Patterns: Lateral, Power, Tripod, Tip, Extension, and Spherical

The physical application tests evaluate the performance of the modified bionic hand. In this thesis, the performance has been defined as the ability of the modified bionic hand to use items or tools. The test objects are chosen from objects that are often used in humans' daily routines. The mechanical part of the modified bionic hand is the Hackberry bionic hand. The Hackberry bionic hand has two operating modes: gripping and pinching. Therefore, in this part of the test, the modified bionic hand is used to grip and pinch test objects of different shapes, sizes, weights, and materials. The materials of the test objects include glass, plastic, metal, paper, ceramics, and fabric; the shapes of the test objects are mainly cylinders, cuboids, and polyhedrons; the sizes and weights of the test objects are chosen not to exceed the mechanical abilities of the Hackberry bionic hand. Because the modified bionic hand is a right hand, researchers hold the wrist of the modified bionic hand with their right hands during physical application testing. The test procedure is:

1. Place the test object on the table and initialize the modified bionic hand.

2. Move the modified bionic hand close to the test object and wait for it to be in the

gripping mode or pinching mode.

3. Grip or pinch the test object and lift it from the table.

4. Use the test object as usual and observe the process of using it.

5. Place the test object back on the table and release it.

The results show what kinds of test objects can be gripped or pinched by the modified bionic hand. By analyzing the results, the performance of the modified bionic hand is evaluated, and the application range is determined. Therefore, one can learn how the modified bionic hand behaves in an unstructured everyday environment.

5.2 Testing

5.2.1 Hardware/Software Verification

This section describes verification tests that show how closely the new control system complies with expectations. Based on the system design in Chapter 4, the expectations of the new control system are:


1. The Arduino Nano can analyze the EEG signal acquired by the NMM headset and

obtain the user’s attention meter value.

2. The MPU6050 can calculate orientation of the bionic hand.

3. The servo motors, the LED, and the slide switch work properly.

4. The modified bionic hand does not move when it is resting naturally or swinging

slightly.

5. The modified bionic hand can be controlled to stop at the position wanted by the user.

For obtaining the working conditions of the new control system, the verification tests are divided into three tasks:

1. Testing system initialization

2. Testing the sleep mode

3. Testing the operating mode

Following the test design, the test environment is built as shown in Figure 33. The modified bionic hand is connected to the laptop via a USB cable. The Arduino IDE is started, and the following settings are selected: the board is "Arduino Nano," the processor is

“ATmega328”, and the Port is the one connected to the Arduino Nano.

After the user makes sure that the NMM headset is worn correctly, he/she powers the headset on. The sensor tip must touch the user’s forehead (Figure 34), and the sensor clip must be attached to the user's left earlobe (Figure 35).


Figure 33. The Test Environment

Figure 34. The Standard for Wearing the Sensor Tip    Figure 35. The Standard for Wearing the Sensor Clip

5.2.1.1 System Initialization Test

In system initialization, the servo motors, MPU6050, and HC-05 are initialized. The researcher can observe the LED on the modified bionic hand and the feedback in the serial monitor to confirm that system initialization is completed. At the beginning of system initialization, the modified bionic hand is held as in Figure 36.


Figure 36. The Initial Position for Testing the Modified Bionic Hand

Although the Arduino Nano is powered by the USB cable, the servo motors require extra power. Hence, the servo motors are powered by the battery in the modified bionic hand. When the serial monitor is opened, the LED is turned on and the system initialization begins. Figure 37 shows feedback displayed on the serial monitor. The first line in Figure 37 means that the NMM headset is connected to HC-05, and the Arduino can receive the data from the EEG signal. The second line informs the user that the servo motors are attached, and the hand is fully-opened. The third line indicates that the MPU6050 is ready to run. The data in the fifth line is the initial orientation of the modified bionic hand. The sixth and the seventh line show that the control system initialization is completed, and the modified bionic hand is in the sleep mode. At that time, the LED turns off.


Figure 37. The Feedback for System Initialization

In conclusion, the Bluetooth module and the LED worked adequately during the system initialization test. The MPU6050 calculated the orientation of the bionic hand. Therefore, the control system complied with expectations in the system initialization test.

5.2.1.2 Sleep mode Test

In the sleep mode, the servo motors cannot be controlled by the user. Based on the system design, the control system does not change the sleep mode to the operating mode when the modified bionic hand is in the original position (Figure 38) or swinging slightly (Figure 39). In

Figure 40, the feedback displays that the control system remains in the sleep mode. Moreover, the control system can change the operating mode back to the sleep mode once the user moves the modified bionic hand back to the original position. The feedback displays that the control system changes the operating mode to the sleep mode while the bionic hand is opening (Figure 41.a), stopped (Figure 41.b), and closing (Figure 41.c). The message "reset hand…" means that all three servo motors rotate to their initial positions.

Figure 38. The Hand is in the Original Position    Figure 39. The Hand is Swinging Slightly

Figure 40. The Feedback When the Hand is Resting or Swinging Slightly



Figure 41. The Feedback of Changing the Operating mode to the Sleep mode: a) Hand Opening b) Hand Stopped and c) Hand Closing

In conclusion, the modified bionic hand is in the sleep mode when it is resting naturally or swinging slightly. Therefore, the modified bionic hand does not move unintentionally. The control system complied with expectations in the sleep mode test.

5.2.1.3 Operating mode Test

When the user moves the modified bionic hand over a long distance, the thumb moves to the preset position (Figure 42), and the feedback message in the serial monitor shows that the mode is changed to the gripping mode (Figure 43). In the control system, the gripping sub-mode is the default operating mode. After the thumb has stopped at the position in Figure 42, the user can raise his/her attention level to control the rest of the fingers.


Figure 42. The Modified Bionic Hand in the Gripping Mode

Figure 43. The Feedback of Changing the Sleep mode to the Gripping Mode

In the operating mode, there are two sub-modes which are the gripping mode and the pinching mode. The two sub-modes are determined by the state of the digital pin D12 which can be changed by the slide switch. If the state of D12 is HIGH, the sub-mode will become the gripping mode (Figure 43), and the index and middle fingers can be controlled by the user. When the intensity of the user’s concentration is at the high attention level, the fingers start to close

(Figure 44.a) and finally make a fist (Figure 44.b). When the intensity of the user’s concentration is at the low attention level, the fingers start opening (Figure 45.a) and spread out (Figure 45.b).


When the intensity of the user’s concentration is at the normal attention level, the fingers stop at the current position. Feedback messages “finger opens” (Figure 46.a) and “finger closes” (Figure

46.b) are displayed while the bionic hand is opening and closing.


Figure 44. The Process of Closing the Hand in the Gripping Mode


Figure 45. The Process of Opening the Hand in the Gripping Mode



Figure 46. The Feedback of Opening (a) and Closing (b) the Bionic Hand

If the state of D12 pin is LOW, the sub-mode is the pinching mode (Figure 47). The middle fingers move to the presupposed position, and only the index finger can be controlled by the user (Figure 48). The work pattern of the pinching mode is the same as that of the gripping mode. In the pinching mode, the user can control the index finger to close the thumb (Figure

49.a), and finally touch the thumb (Figure 49.b). The user also can control the index finger to move away from the thumb (Figure 50.a) and go back to the original position (Figure 50.b). The feedback is the same as in the gripping mode.


Figure 47. The Feedback of Changing the Sleep Figure 48. The Modified Bionic Hand in the mode to the Pinching Mode Pinching Mode


Figure 49. The Process of Closing the Hand in Pinching Mode


Figure 50. The Process of Opening the Hand in Pinching Mode


In conclusion, the servo motors and the slide switch worked as intended. The Arduino

Nano analyzed the EEG signal acquired by the NMM headset and obtained the user's attention meter value. As a result, the modified bionic hand stopped at the position commanded by the user. Therefore, the control system complied with expectations in the operating mode test.

5.2.2 Standardized Assessment Procedure

Based on the distinct prehensile patterns, the SHAP comprises 26 timed tasks, including 12 abstract tasks and 14 activities of daily living (ADLs) (Table 8). The 14 ADLs belong to the six prehensile patterns. However, because of mechanical limitations, the modified Hackberry hand cannot perform the lateral pinch and extension grip defined in the assessment procedure. Therefore, some of the tasks are canceled or performed with a similar prehensile pattern. Indeed, only 10 abstract tasks and 11 ADLs are used to evaluate the performance of the modified Hackberry bionic hand. The lightweight and heavyweight lateral abstract tasks and the button board, key, and screw ADLs are not included.

The scoring system of the assessment procedure, the Index of Functionality (IOF), uses the Euclidean squared distance to score all prehensile patterns. Normative data for a subject with a natural hand are given by the SHAP, and the time for a prosthesis wearer to complete a task is typically 6-8 times as long as the norm. For EEG prosthetic users, 8 times the normative data is taken as the standardized time. Moreover, if the user takes more than 100 seconds to complete a task, or cannot complete it successfully, the task time is recorded as 100 seconds. Ten users were assessed, and the results are shown in Figure 51. IOF scores in preliminary studies [62] range from 37 to 48.
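As a brief illustration of this scaling, and assuming a hypothetical normative time of 2.5 seconds for a task, the standardized time for an EEG prosthetic user would be 8 × 2.5 s = 20 s; an attempt taking longer than 100 s, or an unsuccessful attempt, would still be recorded as 100 s.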

Table 8. 12 abstract tasks and 14 activities of daily living

Abstract Tasks Activities of Daily Living

Lightweight Spherical Simulated Food Cutting

Lightweight Tripod Page Turning

Lightweight Power Jar Lid

Lightweight Tip Jug Pour

Lightweight Extension Empty Tin

Lightweight Lateral Full Jar

Heavyweight Spherical Tray

Heavyweight Tripod Key

Heavyweight Power Door Handle

Heavyweight Tip Carton Pour

Heavyweight Extension Coins

Heavyweight Lateral Carton Pour

Undo button

Screw


[Figure 51 contains two bar charts: (a) task time records, comparing the time standard [62] with the experimental time average (0-100 s) for each task, and (b) the Index of Functionality scores of Users 1 through 10.]

Figure 51. Task Times (a) and Index of Functionality (b) of the SHAP


In conclusion, the outcome shows that experimental task times using the modified bionic hand are generally longer than the mean normative task times. None of the IOF scores are in the range of 37 to 48. The reason for the long task times is that some users have a hard time relaxing after concentrating, or concentrating after relaxing. The task time can be affected by the following factors:

1. The position of the forehead sensor.

2. The level of proficiency for using the NMM headset.

3. The level of the user’s brain fatigue.

5.2.3 Physical Application Tests

The goal of the physical application tests is to determine possible applications of the modified bionic hand. Based on the test design, the modified bionic hand is used to grip and pinch everyday test objects. Hence, each application test consists of three modes: sleep mode, gripping mode, and pinching mode. In the sleep mode, the state of the modified bionic hand is recorded while the user is standing or walking naturally. In the gripping mode, the modified bionic hand is used to grip objects and use them. The test objects for gripping include liquid containers, a modified fork, an iPhone, and books. In the pinching mode, the modified bionic hand is used to pinch test objects and use them. Test objects for pinching include a magazine, a USB cable, the zipper of a jacket, and a shoelace. These everyday objects are made of varied materials and are of various shapes, sizes, and weights. The process of using the test objects is recorded. The results determine whether the user can use those test objects satisfactorily with the modified bionic hand.

5.2.3.1 Sleep mode Tests

The goal of the sleep mode test is to prove that the modified bionic hand does not move unintentionally. In this thesis, the definition of unintentionally moving is that the modified bionic hand moves even though it is not being controlled by the user while the user is walking or standing naturally. At the beginning of the test, the new system requires four steps to set up the modified bionic hand.

1. The user wears the NMM headset correctly and powers it on.

2. The user powers the modified bionic hand and rests it naturally. The LED is turned

on.

3. The user is required to keep resting the modified bionic hand in the same position.

4. The user checks if the LED is turned off. When the LED turns off, the modified

bionic hand is ready for use.

When the hand is ready for use, the user stands like the researcher in Figure 52 with the thumb held in the initial position. While the user is walking naturally, the modified bionic hand swings within a small range. In Figure 53, the thumb is held in the initial position. When the thumb is in the initial position, the control system maintains the sleep mode. In the sleep mode, the servo motors will not be controlled by any signal from the brain. Therefore, the modified bionic hand will not move unintentionally when the user is walking or standing.

Figure 52. The Researcher is Standing Figure 53. The Researcher is Walking

5.2.3.2 Gripping Mode Tests

Gripping is a preferred functional motion in human interactions with the environment.

Usually, gripping is the primary function of low-cost bionic hands. Therefore, the performance of gripping is an essential requirement for the inexpensive bionic hand developed here. In the gripping test, the modified bionic hand is used to grip items and use them. During this test, the orientation of the modified bionic hand is recorded when the user grips and uses the items. By analyzing the results of the gripping test, one can learn which types of objects can be gripped.

For humans, eating and drinking are two necessary activities. The modified bionic hand must be able to help in this regard. Therefore, a liquid container and a modified fork were chosen for testing. The liquid container mainly refers to a cup, a can, a mug, or a bottle. The shape of a cup or a liquid container is usually a cylinder, which is a typical shape in many designs. The material of a liquid container can be plastic, paper, metal, ceramics, or glass. Moreover, the weight of a liquid container can be adjusted by filling it with water. Therefore, the liquid container is discussed mainly in the physical application tests. Table 9 suggests that the bionic hand is capable of lifting containers within a weight range consistent with that of a glass of water or a can of soda.

Table 9. The Liquid Containers Gripping Test Result

Liquid container   Surface   Weight when empty (g)   Weight when filled (g)

glass Smooth 225 401

beverage can Smooth 15 383

plastic bottle Smooth 27 540

paper cup Smooth 8 216

mug Smooth 260 550

The liquid containers tested in this study are a glass cup, a beverage can, a plastic bottle, a paper cup, and a mug. After several trials, the results indicated that the modified bionic hand could only carry an item weighing about 250 g. Such a bionic hand could not be used to lift a full bottle of soda, which limits its usefulness. Insufficient friction between the bionic hand and the item was considered the primary cause of this limitation. Therefore, the friction had to be increased to improve the performance of the bionic hand. Covering the fingertips and the palm with a high-friction material is the best way to increase the friction. This material must be common, and it should not affect the movement of the fingers. Rubber and foam were considered; however, it was hard to cover the fingertips with rubber or foam without affecting the movement of the fingers. Another way to increase the friction is to use a material with some tackiness. Therefore, tape is an excellent choice, because tape has suitable tackiness and does not affect the movement of the fingers when it covers the fingertips. Of course, the tackiness of the tape cannot be excessive; otherwise, objects stick to the fingers.

In this thesis, a type of mounting tape was chosen to increase the friction of the fingers. This mounting tape is made of foam, and it is easy to apply to the fingertips and the palm of the modified bionic hand. The mounting tape has sufficient tackiness, yet it does not cling to objects. Hence, the mounting tape increases the friction between the fingertips and the object.

Figure 54 shows the fingertips of the modified bionic hand when they are covered with the mounting tape. The double-sided mounting tape is placed on the fingertips while the appearance of the fingers is virtually unchanged. Thus, the movement of the fingers is not affected. To test the performance of the modified bionic hand with the mounting tape, the user grips liquid containers filled with water. Moreover, the user uses the modified bionic hand to pour water from the liquid container into a bucket, which simulates the action of drinking water.


Figure 54. The Fingertips Covered by Mounting Tape

The first test object is a glass. The weight of the empty glass is 225 g, and the weight of the filled glass is 401 g. The surface has ridges. The user grips the glass, which is placed on the table, lifts it, and pours water into the bucket. Figure 55 shows the process of gripping the glass and pouring water into the bucket. The glass is placed between the index finger and the thumb, and the user concentrates and increases the attention meter value. When the attention meter value becomes higher than the threshold value, the index finger and middle fingers start to move to grip the glass cup. The user continues with the high attention level while the fingers hold the glass, and then he/she lifts the glass and checks if the glass is held well enough. The user then switches to the normal attention level, which results in the servo motors stopping the fingers from applying more pressure to the cup. Then, the glass cup is moved to the top of the bucket. The bionic hand is rotated to pour water out of the glass. After pouring some water out of the glass, the user puts the cup back on the table and relaxes. When the attention meter value becomes lower than the threshold value, the index finger and middle fingers move back. The user keeps the low attention level until the modified bionic hand releases the glass cup.



Figure 55. The Process of Gripping the Glass Cup and Pouring Water into the Bucket

The second test object is a beverage can from Pepsi. It is a 355 ml aluminum can having a smooth surface. The weight of the empty beverage can is 15 g, and the weight of the filled beverage can is 383 g. The modified bionic hand can help in opening a new can of soda. Figure

56 shows the user opening a new aluminum can with the help of the modified bionic hand. When the real hand pulls the ring of the can, the bionic hand makes sure the can does not move. Figure


57 shows the process of gripping the beverage can on the table and pouring water into the bucket. The process for gripping the beverage can is the same as that for gripping the glass.


Figure 56. Opening a Soda with Both Hands



Figure 57. The Process of Gripping the Beverage Can and Pouring Soda into the Bucket

The third test object is a plastic bottle from Coca-Cola. This bottle is a 500 ml plastic bottle that also has a smooth surface. The weight of the empty plastic bottle is 27 g, and the

weight of the filled plastic bottle is 540 g. Opening the plastic bottle requires cooperation of both hands. Figure 58 shows the user unscrewing the bottle cap. The bottle is clutched by the bionic hand to prevent the bottle from rotating with the bottle cap. Figure 59 shows the process of gripping the plastic bottle and pouring water into the bucket. There is no difference between the gripping process of the plastic bottle and the glass.


Figure 58. Open a Plastic Bottle with Both Hands



Figure 59. The Process of Gripping the Plastic Bottle and Pouring Water into the Bucket


The fourth item is a paper cup. This paper cup is a disposable cup with a smooth surface.

The weight of the empty paper cup is 8 g, and the weight of the filled paper cup is 216 g. The paper cup is easy to distort thus spilling the water. The force should be controlled more carefully when the user tries to grip the filled paper cup. Figure 60 shows the process of gripping the paper cup on the table and pouring water into the bucket. As shown in Figure 60, the shape of the paper cup has been slightly distorted. However, this is still acceptable since no water was spilled.



Figure 60. The Process of Gripping the Paper Cup and Pouring Water into the Bucket

The fifth item is a mug. This ceramic mug has a handle on the side. The surface of the mug is smooth. The weight of the empty mug is 260 g, and the weight of the full mug is 550 g.


There are two ways to grip a mug: first, grasp the handle of the mug; second, grip the body of the mug. However, the first way could not be used by the modified bionic hand because the fingers of the modified bionic hand are somewhat inflexible, and the mug could not be held level when it was held by the handle. Therefore, the second way was chosen for the test. Figure 61 shows the process of gripping the mug and pouring water into the bucket. In Figure 61, the bionic hand holds the body of the mug.



Figure 61. The Process of Gripping the Mug and Pouring Water into the Bucket

The test results for using different liquid containers have been recorded in Table 9. Based on the table, the modified bionic hand shows a reliable performance when it is used to grip the

liquid containers made of various materials. Thus, the bionic hand allows the user to drink from a variety of everyday liquid containers successfully.

Table 10 shows details of the gripped objects tested in this work. The weights, dimensions and gripping modifications of all objects are described in the table.

Table 10. Gripping Other Objects

Gripping Objects Weight (g) Dimensions (cm) Modifications

Fork 98 12.4 Bubblewrap

Cell Phone 164 6.9 x 1.5 x 13.5 Cell phone is raised

USB Cable 20 100 No

Book 1 142 14.0 x 0.5 x 21.6 No

Book 2 258 13.0 x 2.0 x 19.6 No

Book 3 1176 17.8 x 4.1 x 23.6 Help with another hand

Eating is another essential activity for humans. The modified bionic hand must enable humans in using utensils. Forks are the most commonly used utensils. Hence, a fork is tested.

After a set of preliminary tests, it was determined that the fork could not be held by using the modified bionic hand because the diameter of the fork is too small for the fingers. However, there is an effortless way to solve this problem by wrapping the handle of the fork, e.g., with

bubble wrap to increase its diameter. Figure 62 shows the modified fork, where a rubber band is used to affix the bubble wrap to the handle of the fork.

Figure 62. The Modified Fork

Figure 63. Holding the Modified Fork by Using the Bionic Hand

When the user tries to hold the fork by using the modified bionic hand, the palm faces the bubble wrap on the handle of the fork. The bubble wrap ensures that the bionic hand can grip the fork. Figure 63 shows that the fork can be held by using the modified bionic hand. The bubble wrap increases the diameter of the fork’s handle and allows the user to hold the fork by the modified bionic hand. Moreover, the space between the fingers is filled with the bubble wrap,

which helps the modified bionic hand to hold the fork more tightly. This result proved that a modified fork could be held firmly by the modified bionic hand.

Today, the mobile phone is a necessity. The modified bionic hand is expected to be able to help in operating a mobile phone. Therefore, an iPhone SE and its cable are tested in this section. The shape of a mobile phone is mostly cuboid. The mobile phone generally has a plastic outer surface. The weight of the mobile phone is 164 g. The dimensions of the mobile phone are

6.9 cm x 1.5 cm x 13.5 cm. Preliminary tests determined that the iPhone SE is too thin to grip.

Therefore, the mobile phone is raised to make it easier to grasp. Figure 64 shows the process of gripping the mobile phone and plugging the cable into the phone using the modified bionic hand.

The user:

1. Lifts the iPhone SE from the table.

2. Inserts the cable into the phone.

3. Removes the cable from the phone.

4. Places the phone on the table.

5. Releases the phone.




Figure 64. The Process of Gripping the Mobile Phone and Plugging the Cable into the Phone

In the age of electronics, information is usually stored in an electronic device, but newspapers, books, and other documents, which are made of paper, are still used. Among these items, books are usually the heaviest. Hence, the user grips and lifts books for testing the bionic hand. For evaluating the performance of the hand, three books of varied sizes and weights were chosen for this test.


The first book has dimensions 14.0 cm x 0.5 cm x 21.6 cm and weighs 142 g. The cover of the book is soft. While holding the book, the fingers and the palm are parallel to the ground, and the book is perpendicular to the palm and the ground. Figure 65 shows the user holding the first book. The dimensions of the second book are 13.0 cm x 2.0 cm x 19.6 cm. This book weighs 258 g. The cover of the second book is soft as well. The user is holding the second book the same way as the first. Figure 66 shows the researcher holding the second book. The dimensions of the third hard-covered book are 17.8 cm x 4.1 cm x 23.6 cm. This book weighs

1176 g. However, the third book could not be lifted by only using the modified bionic hand because the book was too heavy. The modified bionic hand can create enough force to grip the third book, but the fingers of the modified bionic hand could be damaged by the weight of the book. Therefore, the modified bionic hand could be used as an auxiliary hand for carrying the heavy book. Figure 67 shows the third book being held with two hands. The real hand is holding one side of the book as the main hand, and the bionic hand is holding another side of the book as the auxiliary hand.


Figure 65. Holding the First Book with the Bionic Hand

Figure 66. Holding the Second Book with the Bionic Hand


Figure 67. Holding the Third Book with Both Hands

5.2.3.3 Pinching Tests

Picking things up is another useful hand function used in daily activities. This hand operation is used primarily for pinching light and small objects. The weight of the test objects does not affect the result. Therefore, the shape of the item is the primary variable in this section.

In the pinching test, a magazine, a USB cable, a zipper of a sweatshirt, a pen, and a shoelace were chosen as the test objects. Table 11 shows all testing objects and activities in the pinching test. The user assesses the pinching mode by completing the activities in the table.

When the control system changes the gripping mode to the pinching mode, middle fingers move to the preset position, and only the index finger can be controlled by the user.

Through the pinching test, additional applications for the modified bionic hand are investigated.


Table 11. Objects and Activities of the Pinching Test

Objects Activities

Magazine Turn a Page

USB Cable Plug into a Port of a Laptop

Zipper Zip a Zipper on a Sweatshirt

Pen Write the User’s Name

Shoelace Tie a Shoe

The first test object is a magazine. When people are reading, they should be able to turn pages. In this test, a magazine is used to represent all books, magazines, and documents which need to have pages turned. Before the test, the magazine is placed on the table, and the researcher tries to turn a page of the magazine. Figure 68 shows the process of turning a magazine page by using the modified bionic hand. The researcher concentrates and controls the modified bionic hand to pinch a corner of one page, then maintains the normal attention level and turns it over.

In the end, the researcher mentally relaxes and releases the edge of the page.




Figure 68. The Process of Turning a Magazine Page

The second test object is a USB cable. The USB cable and the laptop are placed on the table. Figure 69 shows the process of plugging the USB cable into a USB port on the laptop. The user maintains the high attention level and makes the modified bionic hand pinch the cable, then aims for the port in the laptop and plugs the USB connector into it. Finally, the user changes to the low attention level and releases the USB cable.




Figure 69. The Process of Plugging the USB Cable into the Port in the Laptop

The third test object is the zipper of a jacket. During the process of putting on clothes, one of the most complicated actions is to zip the zipper of a jacket. Therefore, a zipper is chosen as a test object. Before the test, the initial orientation of the bionic hand must be adjusted because the bionic hand is not in its workspace; the bionic hand is restarted and pointed nearly straight ahead while it is running the Initial Setup. Figure 70 shows the process of zipping up a jacket. The user concentrates and controls the modified bionic hand to pinch the box of the zipper, then uses the real hand to join the pin and the box together. The last step is to pull up the slider while pinching the box of the zipper with the modified bionic hand.



Figure 70. The Process of Zipping up the Zipper of a Jacket

The fourth test object is a pen. Although writing is not as common as it used to be, people still need to be able to sign their names. Before the test, the user places the pen between the index finger and the thumb of the modified bionic hand. Figure 71 shows the process of writing the researcher's name by using the modified bionic hand to hold a pen. The user concentrates and controls the modified bionic hand to pinch the pen, then maintains the normal attention level and writes the researcher's name. During the tests, the pen was sometimes dropped from the hand because the grip on the pen cannot be adjusted easily. The user must write carefully with the modified bionic hand. Based on the result, the handwriting is considered acceptable but childlike (Figure 72). Therefore, the user can sign his/her name with the help of the modified bionic hand.



Figure 71. The Process of Writing the User’s Name


Figure 72. The Signature Signed by Using the Modified Bionic Hand

The fifth test object is a shoelace. As with zipping up a jacket, the workspace must be adjusted before this test. The user wears an untied left shoe and squats down. Figure 73 shows the process of tying a shoelace by using the modified bionic hand. There are four steps to tie the shoelace of the left shoe.

1. The user concentrates and controls the modified bionic hand to pinch the right side of the shoelace, then uses the real hand to make the first cross of the shoelace while keeping a high attention level.

2. The user makes the first loop on the left side of the shoelace and uses the modified bionic hand to pass the right side of the shoelace over the first loop.

3. The user uses the real hand to push the right side of the shoelace through the hole under the first loop to form the second loop, then relaxes mentally until the modified bionic hand releases its side of the shoelace.

4. The user concentrates and controls the modified bionic hand to pinch one of the loops while holding the other loop with the real hand, and pulls both loops to make the second cross of the shoelace.


Figure 73. The Process of Tying a Shoelace

In this part of the test, the fingers of the modified bionic hand need to move often. Tying the shoelace takes longer than with a real hand, and the user must hold a normal attention level for an extended time while pinching the shoelace.


5.3 Conclusions

In this chapter, a set of experiments was designed and performed to show that the modified bionic hand can work as well as or better than the original Hackberry hand. The testing was divided into two parts: verification tests and physical application tests. The verification tests confirmed the functions of the modified bionic hand, while the physical application tests showed how it performs in a realistic environment. Moreover, the SHAP was used as an assessment criterion to evaluate the performance of the modified bionic hand.

Verification tests proved that each module of the new control system worked correctly.

The new control system solved the two disadvantages of the original Hackberry hand discussed in Chapter IV. The first disadvantage is that the hand moves unintentionally while the user is paying attention to something else; the second is that the hand cannot control the degree of finger flexion. The performance of the new control system met expectations: the modified bionic hand was controlled by EEG signals, did not move unintentionally when the user was standing or walking, and could stop in the desired position. Hence, the modified bionic hand worked as expected.

The SHAP is usually used to evaluate the functionality of myoelectrically controlled bionic hands; this research is a first attempt to apply it to an EEG-controlled bionic hand. Compared with a myoelectrically controlled hand, an EEG-controlled hand leaves the user more susceptible to environmental and emotional influences; in particular, people become nervous once the stopwatch starts ticking. Moreover, the user group consisted of people with two healthy hands who had only a short training time to complete all tasks with the modified bionic hand. The lack of familiarity with the experimental procedures and with the modified bionic hand made it difficult for the users to control their attention.

In the physical application tests, the modified bionic hand was tested for gripping and picking up some everyday items. In the gripping test, liquid containers, a modified fork, a USB cable, an iPhone, and books were chosen as test objects. Five types of liquid containers were tested. The modified bionic hand performed reliably. Therefore, the modified bionic hand can help users drink water. Moreover, it can also help users deal with small items having a cylindrical shape, such as medicine bottles and jars. The modified bionic hand allowed a user to use a fork to eat. Although the traditional fork must be adjusted, the method to adjust the fork is quite simple and can be expanded to other utensils with handles, like spoons and knives. Testing a USB cable and an iPhone proved that the modified bionic hand could enable the user to charge a mobile phone. At the end of the gripping test, three books of various sizes were tested. The tests successfully demonstrated the hand's ability to hold books as well as the cooperation between the real and the bionic hand.

For the pinching test, a magazine, a USB cable, the zipper of a jacket, a pen, and a shoelace were chosen as test objects. Turning a page of a magazine proved that the modified bionic hand can help the user read. Plugging a USB cable into the laptop indicates that the modified bionic hand can allow the user to charge an electronic device and use a flash drive. Testing the zipper and the shoelace confirmed that the user can fasten clothes and tie shoes independently. Holding a pen can also be considered a success: although the handwriting is childish, the modified bionic hand still allows the user to sign his/her name.

Finally, a direct correlation between the user’s mental state and the ease of use of the hand was observed.


Chapter VI Conclusions and Future Work

In this thesis, challenges related to hand prostheses have been presented, and the need for the development of suitable bionic hands was justified. Modern, well-functioning bionic hands are cost prohibitive, while low-cost bionic hands usually do not meet users' basic needs, so bionic hands with better benefit-cost ratios are required. Therefore, improving the functionality of low-cost bionic hands was the focus of this thesis. Two technologies have been widely used in low-cost bionic hand applications: EMG and EEG. EMG is used in most low-cost bionic hands, but this control method depends on a healthy stump; hence, a user without a healthy stump cannot use an EMG-controlled bionic hand.

Although EEG is not used as often, the technology has improved in recent years, and commercial BCIs have become more affordable. Therefore, EEG control has potential for use in low-cost bionic hands. However, current EEG-controlled bionic hands usually rely on a computer to analyze the data from the BCI device, which increases the cost of the hand and limits the user's range of movement. The NMM headset, an affordable BCI device, uses EEG to communicate with the controller of a bionic hand directly. However, using the NMM headset created two problems: first, the user could not adjust the position of the fingers, and second, the bionic hand could move unintentionally.


To deal with these two problems, an improved control system was developed. The control system was implemented on mechanical hand parts based mainly on the Hackberry bionic hand. Instead of a miniature reflective object sensor, the NMM headset is used as the sensor that acquires the EEG signal for the control system. The EEG signal is analyzed by the Arduino Nano and converted into commands for the servo motors. Additionally, the MPU6050 IMU sensor assists the control system by providing accelerometer and gyroscope data used to detect the orientation of the modified bionic hand. Experimental results showed that the user can control the modified Hackberry bionic hand with the NMM headset and does not need a healthy stump to do so. In addition, the position of the fingers can be adjusted for holding or gripping different objects. Moreover, the control system can identify when the user is resting or swinging the bionic hand naturally, which indicates that the user does not want to use the hand at that time.
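A condensed sketch of this decision logic is given below. It is distilled from the full program listed in the appendix: the numeric thresholds are the #define values from that listing, while the helper function names (handActive, fingerCommand) are introduced here only for illustration and do not appear in the actual program.

#include <cstdio>

/* Condensed illustration of the decision logic (hypothetical helper functions;
   thresholds taken from the #define values in the appendix listing). */
const int HIGH_ATTENTION = 60;   // attention above this value closes the fingers
const int LOW_ATTENTION  = 35;   // attention below this value opens the fingers

/* Orientation gate supplied by the IMU task: the hand operates only after a
   deliberate change in pitch or roll relative to the stored initial orientation. */
bool handActive(float pitchChange, float rollChange) {
  return (pitchChange > 25.0f) || (rollChange > 30.0f);   // degrees
}

/* Map the NMM attention value (1-100) to a finger command:
   1 = close, 2 = open, 3 = hold the current position. */
int fingerCommand(int attention) {
  if (attention > HIGH_ATTENTION) return 1;
  if (attention < LOW_ATTENTION)  return 2;
  return 3;
}

int main() {
  printf("hand active: %d\n", handActive(30.0f, 5.0f));   // 1: pitch change exceeds 25 degrees
  printf("command:     %d\n", fingerCommand(72));         // 1: close, since attention > 60
  return 0;
}

In the program itself, command values 1, 2, and 3 select the close, open, and stop branches that drive the VarSpeedServo objects for the index and middle fingers.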

After some experimentation, the modified Hackberry bionic hand controlled by the NMM headset worked as intended. Amputees can use this bionic hand to meet their essential needs, including putting clothes on, reading books, answering the phone, eating and drinking. In conclusion, the research goal, which was to improve the Hackberry by using EEG, has been achieved.

This work also has some limitations. The control system's response time is somewhat long because each person has his/her own definition of "concentrate" and "relax"; the user has to find a reliable way to increase and decrease the intensity of concentration to control the fingers of the bionic hand. Moreover, the program that decides whether the user wants to use the bionic hand is not intelligent enough: besides standing and walking, there are many other situations in which the user does not want to use the bionic hand.

As far as the hardware is concerned, the bionic hand shown in this work is only a prototype. The electronic circuit is built on a handmade board with wires and electronics exposed to the environment.

In the future, a better algorithm for calculating attention levels may be developed and used in the control system. This algorithm should reduce the response time and make the decisions that drive the servo motors more precise. The IMU sensor can also be programmed specifically for the bionic hand: since the sensor accurately detects the acceleration of the bionic hand, the control system could use the acceleration data to decide more precisely when the user wants to use the hand.
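As a minimal sketch of one such direction, and purely as an illustration that is not part of the thesis implementation, the raw attention readings could be smoothed with an exponential moving average before the existing thresholds are applied; the function name and the smoothing factor below are assumptions and would need tuning per user.

#include <cstdio>

/* Hypothetical exponential moving average over the raw attention readings
   (not part of the thesis program); ALPHA is an assumed value. */
const float ALPHA = 0.3f;          // smoothing factor, would need tuning per user
float smoothedAttention = 0.0f;    // filter state kept between readings

int smoothAttention(int rawAttention) {
  smoothedAttention = ALPHA * rawAttention + (1.0f - ALPHA) * smoothedAttention;
  return (int)(smoothedAttention + 0.5f);   // rounded value fed to the existing thresholds
}

int main() {
  int samples[] = {40, 42, 95, 41, 43};      // a single spike at the third reading
  for (int s : samples) {
    printf("raw %3d -> smoothed %3d\n", s, smoothAttention(s));
  }
  return 0;
}

Whether such a filter actually shortens the response time without adding lag would have to be verified with the same kind of tests used in Chapter V.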




/****************************************************************************

****************************************************************************

Project :- Controlling a 3D Printed Bionic Hand Using Brain Waves

Class :- Master Thesis Research Application

Advisors :- Dr. N. Jaksic, Dr. J. DePalma, and Dr. B. Ansaf

Creator :- Boyan Li

File :- Thesis.h

About :- This is the main program developed for this thesis

****************************************************************************

****************************************************************************/

#include <SCoop.h>            // SCoop cooperative scheduler (header name restored; angle brackets were lost in the listing)

#include "MPU6050_6Axis_MotionApps20.h"

#include "I2Cdev.h"

#include "Wire.h"

#include <VarSpeedServo.h>    // variable-speed servo library (header name restored; angle brackets were lost in the listing)

/****************************************************************************

* Global Definitions

***************************************************************************/

#if defined(SCoopANDROIDMODE) && (SCoopANDROIDMODE == 1)


#else

#error "Scoop error"

#endif

#define baudrate 57600

#define indexpin 3

#define middlepin 5

#define thumbpin 6

#define switchpin 12

#define LEDpin 11

#define indexmin 140

#define indexmax 32

#define middlemin 83

#define middlemax 32

#define thumbpinch 56

#define thumbopen 153

#define highattention 60

#define lowattention 35

#define motormax 7

#define middlespeed 8

#define indexspeed 12


MPU6050 mpu;

/* MPU6050 to Arduino Nano

VCC - 5V

GND - GND

SDA - A4

SCL - A5

*/

/* create servo objects to control three servo motors */

VarSpeedServo index;

VarSpeedServo middle;

VarSpeedServo thumb;

/****************************************************************************

* Global Variables

***************************************************************************/
int movemode = 1;
int handmode = 0;

bool dmpReady = false;

uint8_t mpuIntStatus;
uint8_t devStatus;
uint16_t packetSize;
float yaw = 0;
float pitch = 0;
float roll = 0;
float newyaw;
float newpitch;
float newroll;
float yawchange;
float pitchchange;
float rollchange;

uint16_t fifoCount;
uint8_t fifoBuffer[64];

Quaternion q;
VectorInt16 aa;
VectorInt16 aaReal;
VectorInt16 aaWorld;
VectorFloat gravity;
float ypr[3];

boolean fingerstate = true;
volatile bool mpuInterrupt = false;

void dmpDataReady() {

mpuInterrupt = true;

}

/**************************************************

Initial Setup

*************************************************/
void setup() {

pinMode(LEDpin, OUTPUT);

digitalWrite(LEDpin, HIGH);

pinMode(switchpin, INPUT_PULLUP);

Serial.begin(baudrate);

Serial.flush();

while(!Serial.available());   // wait until the first byte arrives from the NMM headset

Serial.println("the headset is ready");

index.attach(indexpin);
middle.attach(middlepin);
thumb.attach(thumbpin);

index.write(indexmin);
middle.write(middlemin);
thumb.write(thumbopen);

Serial.println("servo motors are ready");

#if I2CDEV_IMPLEMENTATION == I2CDEV_ARDUINO_WIRE

Wire.begin();

TWBR = 12; // 400kHz I2C clock (200kHz if CPU is 8MHz)

#elif I2CDEV_IMPLEMENTATION == I2CDEV_BUILTIN_FASTWIRE

Fastwire::setup(400, true);

#endif

mpu.initialize();


Serial.println(mpu.testConnection() ? F("MPU6050 connection successful") : F("MPU6050 connection failed"));

devStatus = mpu.dmpInitialize();

mpu.setXGyroOffset(7);

mpu.setYGyroOffset(6);

mpu.setZGyroOffset(27);

mpu.setXAccelOffset(-2322);

mpu.setYAccelOffset(760);

mpu.setZAccelOffset(730);

if (devStatus == 0) {

mpu.setDMPEnabled(true);

attachInterrupt(0, dmpDataReady, RISING);

mpuIntStatus = mpu.getIntStatus();

dmpReady = true;

packetSize = mpu.dmpGetFIFOPacketSize();

} else {


Serial.print(F("DMP Initialization failed (code "));

Serial.print(devStatus);

Serial.println(F(")"));

while(1){

digitalWrite(LEDpin, LOW);

delay(500);

digitalWrite(LEDpin, HIGH);

delay(500);

}

}

mySCoop.start();

int i = 0;

Serial.println("calculating initial orientation");

/* run the DMP for a while so the yaw/pitch/roll readings settle, then store
   them as the reference orientation */
while (i < 5000 ){
  getposition();
  i++;
}
getposition();


yaw = newyaw;

pitch = newpitch;

roll = newroll;

Serial.print(yaw);

Serial.print(" ");

Serial.print(pitch);

Serial.print(" ");

Serial.println(roll);

Serial.println("setup is completed");

digitalWrite(LEDpin, LOW);

}

/**************************************************

Main Loop

*************************************************/

void loop() {

/* Decide the mode of the bionic hand*/

if (handmode == 0){

if (thumb.read() != thumbopen){

Serial.println("reset hand...");

index.write(indexmin);

middle.write(middlemin);

thumb.write(thumbopen);

}

Serial.println("Sleep mode");

}else if (handmode == 1){

if (thumb.read() != thumbpinch) {

thumb.write(thumbpinch);

Serial.println("Operating mode");

if (digitalRead(switchpin) == LOW){

middle.write(middlemax);

fingerstate = false;

Serial.println("Pinching Mode");

} else{

fingerstate = true;


Serial.println("Gripping Mode");

}

}

/**************************************************

Servo Motors Driver

*************************************************/

switch (movemode) {

case 1:

Serial.println("finger closes");

index.write(indexmax,indexspeed,false);

if(fingerstate){

middle.write(middlemax,middlespeed,false);

}

break;

case 2:

Serial.println("finger opens");

index.write(indexmin,indexspeed,false);

if(fingerstate){


middle.write(middlemin,middlespeed,false);

}

break;

case 3:

index.stop();

if(fingerstate){

middle.stop();

}

Serial.println("Keeping current position");

break;

}

}

mySCoop.delay(500);

}

/**************************************************

Orientation Judgment

*************************************************/

void getposition() {

if (dmpReady) {

mpuInterrupt = false;

mpuIntStatus = mpu.getIntStatus();

fifoCount = mpu.getFIFOCount();

if ((mpuIntStatus & 0x10) || fifoCount == 1024) {

mpu.resetFIFO();

Serial.println(F("FIFO overflow!"));

} else if (mpuIntStatus & 0x02) {

while (fifoCount < packetSize) fifoCount = mpu.getFIFOCount();

mpu.getFIFOBytes(fifoBuffer, packetSize);

fifoCount -= packetSize;

mpu.dmpGetQuaternion(&q, fifoBuffer);

mpu.dmpGetGravity(&gravity, &q);

mpu.dmpGetYawPitchRoll(ypr, &q, &gravity);

newyaw = ypr[0] * 180/M_PI;


newpitch = ypr[1] * 180/M_PI;

newroll = ypr[2] * 180/M_PI;

}

}

}
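/* SCoop task: compares the current orientation with the stored initial orientation;
   a sufficiently large change in pitch or roll switches the hand into operating
   mode (handmode = 1), otherwise the hand stays in sleep mode (handmode = 0). */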

defineTaskLoop(loop2){

getposition();

yawchange = fabs(newyaw - yaw);

pitchchange = fabs(newpitch - pitch);

rollchange = fabs(newroll - roll);

if (pitchchange > 25 || rollchange > 30){

handmode = 1;

} else {

handmode = 0;

}


}

/**************************************************

Attention Collection

*************************************************/

/* collect data from the headset */
byte ReadOneByte() {

int ByteRead;

while(!Serial.available());

ByteRead = Serial.read();

return ByteRead;

}
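/* ThinkGear serial packet layout parsed by the task below: two sync bytes (0xAA,
   i.e. 170), a payload-length byte, the payload itself, and a checksum byte equal
   to 255 minus the low byte of the payload sum. Within the payload, code 0x02
   carries the poor-signal value and code 0x04 the attention value. */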

/* analyze signals and get the value of the user's attention */
defineTaskLoop(loop3) {

byte generatedChecksum = 0;
byte checksum = 0;
int payloadLength = 0;
byte payloadData[64] = {0};
byte attention = 0;
byte poorQuality = 0;
boolean bigPacket = false;

if(ReadOneByte() == 170) {

if(ReadOneByte() == 170) {

payloadLength = ReadOneByte();

if(payloadLength > 169)

return;

generatedChecksum = 0;

for(int i = 0; i < payloadLength; i++) {

payloadData[i] = ReadOneByte();

generatedChecksum += payloadData[i];

}


checksum = ReadOneByte();
generatedChecksum = 255 - generatedChecksum;

if(checksum == generatedChecksum) {

poorQuality = 200;

attention = 0;

for(int i = 0; i < payloadLength; i++) {

switch (payloadData[i]) {

case 2:

i++;

poorQuality = payloadData[i];

bigPacket = true;

break;

case 4:

i++;

attention = payloadData[i];

break;


case 0x80:

i = i + 3;

break;

case 0x83:

i = i + 25;

break;

default:

break;

}

}

if(bigPacket) {

if(poorQuality == 0){

if(attention > 0 && attention <= 100){

if (attention > highattention){

movemode = 1;

}

else if (attention < lowattention){

movemode = 2;


}

else {

movemode = 3;

}

}

}

else {

movemode = 3;

}

}

}

}

}

}
