
A VERSATILE HUMAN MACHINE INTERFACE FOR GESTURE RECOGNITION WITH APPLICATIONS IN VIRTUAL REALITY

A Thesis submitted to the faculty of San Francisco State University in partial fulfillment of the requirements for the degree

Master of Science

In

Engineering: Embedded Electrical and Computer Systems

by

Kartik Bholla

San Francisco, California

May 2018

Copyright by Kartik Bholla 2018

CERTIFICATION OF APPROVAL

I certify that I have read A Versatile Human Machine Interface for Gesture Recognition with Applications in Virtual Reality by Kartik Bholla, and that in my opinion this work meets the criteria for approving a thesis submitted in partial fulfillment of the requirement for the degree Master of Science in Engineering: Embedded Electrical and

Computer Systems at San Francisco State University.

Xiaorong Zhang, Ph.D. Assistant Professor

Hao Jiang, Ph.D. Associate Professor

A VERSATILE HUMAN MACHINE INTERFACE FOR GESTURE RECOGNITION WITH APPLICATIONS IN VIRTUAL REALITY

Kartik Bholla San Francisco, California 2018

In recent years, technological advancements have led to the development of ultra-high definition immersive video technology and virtual reality (VR) systems from companies such as Oculus and HTC. VR has applications in many fields such as gaming, surgery training, robotics, and rehabilitation. To enable intuitive control of VR systems, this research aimed to evaluate a user-oriented and versatile gesture control interface for VR systems. Specifically, a first-person shooter (FPS) VR game was developed using Unity 3D and interfaced with an electromyogram (EMG)-based gesture control interface developed by the SFSU Intelligent Computing and Embedded Systems Laboratory (ICE Lab). A comparative study was conducted on able-bodied human subjects to compare the gesture-based control scheme with a keyboard-and-mouse-based control scheme.

I certify that the Abstract is a correct representation of the content of this thesis.

Date

ACKNOWLEDGEMENTS

I would first like to thank my thesis advisor Dr. Xiaorong Zhang. The door to Dr. Zhang's office was always open whenever I ran into a trouble spot or had a question about my research or writing. She consistently allowed this paper to be my own work but steered me in the right direction whenever she thought I needed it.

Besides my advisor, I would like to thank my friend and colleague Ian Donovan for his wonderful collaboration. He supported me greatly and was always willing to help me.

Finally, I must express my very profound gratitude to my parents for providing me with unfailing support and continuous encouragement throughout my years of study and supporting me spiritually throughout my life. This accomplishment would not have been possible without them. Thank you.

TABLE OF CONTENTS

List of Tables

List of Figures

List of Appendices

1. Introduction
1.1 Current Approach of EMG Gesture Recognition
1.2 ICE HMI: Introduction
2. Virtual Reality: Introduction
2.1 Background of Virtual Reality
2.2 Current State of Virtual Reality
2.3 VR Branching Out from Gaming
2.4 Uses for Virtual-Reality Tech
2.5 Virtual Reality Game Accessories
3. Virtual Reality Zombie Game
3.1 Human Machine Interface for VR Game
4. Experiment & Results
4.1 Experiment Protocol
4.2 Evaluation & Results
4.3 Results Summary
4.4 Drawbacks of Current Approach
4.4.1 Drawbacks of Virtual Reality Game
4.4.2 Drawbacks of Custom Interface Pipe
5. Features Added to Interface
5.1 Implementing Third Byte in the HMI
5.2 Designing an HMI Simulator
6. Conclusion & Discussion
6.1 Results Concluded
6.2 Future Work
References
Appendices

LIST OF TABLES

1. Age Distribution of VR Test Subjects
2. FPS Game Experience of VR Test Subjects
3. VR Experience of VR Test Subjects
4. Average Results with Standard Deviation of VR Implementation Tests
5. Post-Survey Results for Ease of Play with Different Control Schemes
6. Post-Survey Results for Feel of the Game with Different Control Schemes
7. Post-Survey Results Depicting Engagement of Players with Different Control Schemes
8. Post-Survey Results Depicting Associated Level of Fatigue with Different Control Schemes

LIST OF FIGURES

1. Myo Armband
2. HMI Structure
3. ICE HMI Console
4. Nintendo Virtual Boy
5. Nintendo Virtual Boy 3D Graphics
6. Oculus Rift CV1
7. HTC VIVE
8. VR in Military Training
9. VR Training for Dental Students
10. VR CAD Simulation for Automotive Industry
11. VR Travel Experience for the Disabled
12. Physical Therapy Through VR
13. VR Helping to Fight Phobias in a Safe Controlled Environment
14. PrioVR
15. Cyberith Virtualizer
16. Dexmo Exoskeleton Glove
17. VR Zombie FPS Gameplay
18. VR Zombie Game Control Schemes
19. Duplex Named Pipe Implementation on the Local Machine
20. HMI Interface Flow
21. Intensity Bar Implementation
22. HMI Simulator Flow
23. HMI Simulator API Working

LIST OF APPENDICES

1. Engagement Questionnaire
2. Usability Questionnaire


Chapter 1

Introduction

Gestures have always been an integral part of human expression. In recent years, gesture recognition has gained popularity in several fields, as it provides a potential solution to a variety of problems. Research has been done on hand gesture recognition for a defined sign language library, in order to help the deaf communicate more conveniently. Research surrounding electromyographic (EMG) signals has shown that gestures can be quantified and classified using feature extraction and pattern recognition methods. EMG-based human machine interfaces (HMIs) have shown signs of being a promising solution for limb control. There exists a wide range of applications outside the current ones for which an accurate, real-time gesture recognition interface could prove critically beneficial. This research aims at processing EMG and Inertial Measurement Unit (IMU) data to develop such an interface. The goal was for this human machine interface to be modular, so that it can adapt to different client applications, whether hardware or software, and leave room to add more features as required. For this particular project, a Virtual Reality (VR) application was developed.

This aforementioned VR application served as a client for the HMI and was used as an evaluation platform for the system. Virtual Reality was selected in particular because VR seemed to be the next big technological breakthrough of this decade. Millions have been poured into the development of VR technology to bring it to the mainstream as an alternative way to consume media, play games, teach interactively in universities, and deliver medical treatments for rehabilitation, Post-Traumatic Stress Disorder (PTSD), phobias, etc.

Studies involving virtual reality environments have made use of gesture-based control schemes [10], as they allow for a natural, immersive user experience.

1.1 Current Approach of EMG Gesture Recognition

In the past, gesture recognition was done by tracking the user's movement and position in 3D space. Typically, this tracking was done with cameras and computer vision. The results were very impressive under certain conditions, such as proper and adequate lighting and a good quality camera within the line of sight of the movements. However, this approach is limited by the capabilities of the camera.

Another type of tracking can be done with motion-sensing hand gloves. The results with motion-sensing gloves are quite accurate but, on the downside, setting up these gloves can be cumbersome, as the devices are quite bulky. They may also inconvenience the user, since the sheer bulk and any dangling interface connections from the device may restrict the user's movement.

The third and most recently emerging approach to determining a user's movement is through biosignals, that is, reading and interpreting the electrical signals from a user's body. These electrical signals are electromyographic (EMG) signals. EMG signals can be captured by placing a pair of electrodes on the user's muscle and reading the slight variations in the electrical signals passing through while making different gestures.

1.2 ICE HMI: Introduction

The ICE MyoHMI project aimed to develop an intelligent EMG-based gesture control interface (GCI) which identifies the user's hand and arm gestures from EMG signals collected from forearm muscles and provides interfaces to external gesture-controlled applications. It is a flexible, low-cost, open-source software-based human-machine interface (HMI) for gesture recognition, developed in the ICE Lab (Intelligent Computing & Embedded Systems Laboratory). It translates biosignals into predicted user intentions in real time to control machines (VR, prosthetics, exoskeletons).

MyoHMI uses an off-the-shelf, commercially available armband called the Myo armband from Thalmic Labs (Figure 1).

Figure 1. Myo Armband (source: myo.com)

This armband enables MyoHMI to record EMG signals from the forearm of the user. The data is collected by eight prefiltered EMG sensors at 200 Hz. It also collects kinematic data at 50 Hz via an inertial measurement unit (IMU). The band interfaces with the system via Bluetooth 4.0 Low Energy. The software platform integrates a user-friendly graphical user interface (GUI) and a sequence of signal processing modules for EMG feature extraction and pattern classification. Figure 2 details the foundational structure of MyoHMI. The MyoData module provides the connection to the Myo armband and collects multiple channels of EMG signals as the system input.

Figure 2. Human Machine Interface Structure (source: MyoHMI: A Low-Cost and Flexible Platform for Developing Real-Time HMI for Myoelectric Controlled Applications, IEEE Trans. SMC, 2016, 4495-4500.)

The input signals are segmented by overlapped sliding analysis windows. For each analysis window, the FeatureCalculator module extracts EMG features which characterize the individual EMG signals. To recognize the user's gesture, the EMG features of the individual channels are concatenated into one feature vector and then sent to the Classifier module for gesture classification. The gesture classification algorithm consists of two phases: training and testing. In the training phase, a set of EMG data is collected for each investigated gesture so that the ClassifierTrainer module can create a classification model that maximally distinguishes the EMG patterns of different gestures.


Figure 3. ICE HMI Console

A pipeline was developed in order to give the GCI the capability to output gesture classifications and IMU data to a client application. The client application can then use this data for its desired purposes. A virtual reality application was developed in conjunction with this project that implemented this pipeline to use gesture output from the GCI for control purposes inside the game, as opposed to standard keyboard/mouse control. More design and development details of the virtual reality application are discussed further ahead in this report.

Chapter 2

Virtual Reality: Introduction

Virtual Reality (VR), also known as immersive multimedia or computer-simulated reality, is a computer technology that replicates an environment, real or imagined, and simulates a user's physical presence and environment in a way that allows the user to interact with it. Virtual reality artificially creates sensory experiences, which can include sight, touch, hearing, and even smell. While VR is still in its earliest stages, it is rapidly developing into a suitable medium for connecting people, the internet, and the real world. The pace of development in VR is unquestionably quickening, and the cost of VR devices is expected to drop quickly as more buyers adopt the technology.

2.1 Background of Virtual Reality

Figure 4. Nintendo Virtual Boy (source: Wikipedia.com)

Virtual Reality isn’t a very new technology. The earliest traces of virtual reality dates back to early 1990’s. In 1995, Nintendo released the Virtual Boy (Figure 4), a monochromatic headset that promised users to offer true 3D graphics in gaming for the first time (Figure 5). The hardware was a clunky, desk-mounted device that sold for $180 9

($280 in 2016 dollars) and gave users splitting headaches. It sold barely a tenth of what

Nintendo had hoped and was discontinued in less than a year later.

Figure 5. Virtual Boy 3D Graphics (source: digitalspy.com/retrogaming)

Nintendo's Virtual Boy failed because in the early 1990s we lacked the technology to develop a viable virtual reality headset for the consumer market. The microprocessors weren't powerful enough, and Liquid Crystal Display (LCD) and Organic Light-Emitting Diode (OLED) displays weren't available back then to provide smooth, high-definition, high frame-rate viewing. That's why the Virtual Boy was ahead of its time, but unfortunately wasn't ahead in its technology. Luckily, the research and development of virtual reality didn't stop there. In 2012, Palmer Luckey (founder, Oculus VR)

launched a Kickstarter campaign to develop a virtual reality headset solely for gaming.

The original goal was to raise US$250,000 for the campaign, but with strong community interest the campaign raised US$2.4 million, almost ten times the original goal [2][3][4]. In 2014,

Facebook acquired Oculus VR for US$2.3 billion in cash and stock. The company also partnered with Samsung to develop the Samsung Gear VR for Samsung Galaxy

smartphones [5]. All the hype and mergers led other big players like HTC and Valve (Steam) into the race to develop the first and best virtual reality headset. Many development studios also started creating content for VR before the launch of the hardware itself. All this competition drove the market toward a wider variety of virtual reality headsets at reduced prices and much better quality.

2.2 Current State of Virtual Reality

Virtual Reality is set for a breakout. Since 2010, investors have poured nearly $4 billion into startups working on virtual reality [6]. Corporations invested a record $2 billion into AR/VR startups in 2017, despite the market still being in its earliest stages. The year 2016 saw the release of the first serious consumer virtual reality headsets.

Figure 6. Oculus Rift CV1 (source: oculus.com)

In 2016, Oculus was the first to release the Oculus Rift CV-1 (Consumer Version 1) virtual reality headset (Figure 6). The CV-1 incorporated specialized dual VR displays with a combined resolution of 2160x1200 pixels, positional audio, and infrared tracking. Shortly afterwards, HTC released the competitor of the Oculus Rift, the HTC VIVE (Figure 7), with similar specifications but with a pair of motion controllers that tracked the movements of the player. Both VR headsets are targeted towards general consumers.

Figure 7. HTC Vive (source: vive.com)

During the research, it was decided that a third-party Virtual Reality (VR) application would be developed as a client for the HMI. The inclination towards virtual reality was partly because there was already an Oculus Rift DK-2 (Development Kit 2) in the ICE lab. The Oculus Rift DK-2, further referred to as the HMD (Head Mounted Display), was the latest headset Oculus offered its VR developers at the time. It wasn't available for the masses; instead, Oculus introduced the Rift DK-1 and Rift DK-2 solely for developers ahead of the launch of the Oculus Rift CV-1 (Consumer Version 1) in 2016. Oculus's main business strategy for doing this was to have content created for virtual reality before releasing the actual virtual reality device for the masses. Oculus did this to boost its sales, as the content would drive people to buy the actual virtual reality hardware.

2.3 VR Branching Out from Gaming

Virtual Reality as a technology is being marketed as a revolution in the gaming industry. VR promises to provide the most immersive 3D gaming yet, tricking a user's mind into believing what it's seeing in a simulated 3D virtual environment. Much like today's PC gaming, a virtual reality application will be more demanding than other current-generation PC games. But the standard for satisfactory VR gaming is going to be much, much higher. Currently, users are satisfied with 1080p, 60 frames per second performance. But that's not going to be enough for virtual reality.

At the architectural level, consumer versions of virtual reality headsets use a pair of displays with a combined resolution of 2160 x 1200 pixels, constantly refreshed at 90 Hz. This means that a PC running a virtual reality game has to render graphics for two displays at a much higher refresh rate, 90 frames per second, for a smooth experience. Rendering such intense graphics is no easy task for any PC. Special high-end multi-core central processing units (CPUs) and graphics processing units (GPUs) are required. This high-end hardware comes at a steep cost, and gamers, whether competitive or casual, are usually the main consumers of such hardware. A VR-ready PC could cost multiple times what a regular PC would. So, with this class of hardware largely reserved for the gaming industry, the gaming platform made a perfect breeding ground for virtual reality: people owning high-end gaming PCs would just have to upgrade their graphics cards (GPUs) to be compatible with the virtual reality hardware, instead of building a new multi-thousand-dollar gaming PC from scratch.

For this research, the Oculus Rift Development Kit 2 (DK-2) was used. None of the PCs in the lab were powerful enough to run the Oculus or any game on it. The first plan of action was to get a new high-end graphics card and try it in the current PC. An Nvidia GTX 980-Ti GPU was bought solely for this purpose. This GPU was among the latest and greatest at the time. It comprised 2,816 CUDA (Compute Unified Device Architecture) cores with a base clock of 1000 MHz and 6 GB of GDDR5 memory. Unfortunately, this GPU didn't work with the existing PC in the lab. The existing PC lacked the processing power, the memory, an extra HDMI port for VR, an extra high-speed USB 3.0 interface, and most importantly a suitable PCI-E x16 slot for the GPU.

Soon all of the above problems were resolved when a new VR-compatible PC was assembled in the lab. The PC comprised the latest and most powerful hardware of the time. It was powered by a Core i7 quad-core CPU with multi-threading enabled, along with 16 GB of DDR4 memory clocked at 2666 MHz on an Asus Z170A SLI Plus motherboard. The peripherals used with this system were a CoolerMaster Storm Octane Gaming Gear mouse and keyboard combo; for audio, the Logitech G230 Stereo Gaming Headset; the HMD of choice, the Oculus Rift DK2 (Development Kit 2); and for the gesture controller, the Myo armband. The same Nvidia GTX 980-Ti GPU was used. On the new VR PC, the Oculus runtime was set up and a demo virtual reality application was played to test the whole system and the Oculus Rift DK-2.

With a successful build and testing, it was time to brainstorm the idea for the virtual reality application that would give a proof of concept of the proposed ICE Human Machine Interface with a third-party application. More about the application idea, workflow, and development is explained further in this report.

2.4 Uses for Virtual-Reality Tech

The innovation in virtual reality technology holds tremendous potential to impact and alter the future of various fields, from pharmaceuticals, medical healthcare, business, training simulation, the military, rehabilitation, and architecture to manufacturing.

Here is a list of many applications of virtual reality.

• Virtual Reality in the Military

• Virtual Reality in Healthcare

• Virtual Reality in Education

• Virtual Reality in Engineering

• Virtual Reality in Sport

• Virtual Reality in Telecommunications

• Virtual Reality and Scientific Visualization

• Virtual Reality in Construction

There are many more uses of virtual reality technology than first realized, ranging from academic research through to engineering, design, business, the arts, and entertainment. Yet, regardless of the use, virtual reality delivers a set of information which is used to establish new models, training methods, communication, and interaction.

• Virtual Reality in the Military

The military has adopted virtual reality technology in its three services (army, navy, and air force), where it is used for warfare simulation and training purposes. Virtual reality simulation enables soldiers to re-enact a particular scenario, such as an engagement with an enemy, in a virtually simulated environment which they have no experience of. It reduces the risk of death and serious injury while proving safer and lower cost than traditional training methods. Virtual reality is also used to treat post-traumatic stress disorder. Soldiers suffering from battlefield trauma and other psychological conditions can learn how to deal with their symptoms in a 'safe' environment. The idea is for them to be exposed to the triggers for their condition, which they gradually adjust to.

Figure 8. Use of VR for Military Training (source: vrs.org.uk)

• Virtual Reality in Healthcare

Healthcare is one of the greatest embracers of virtual reality, which includes surgery reenactment, phobia treatment, simulated robotic surgery, and skills training. One of the upsides of virtual reality technology is that it enables doctors and other medical professionals to learn new skills, and also refresh existing ones, in a very protected and safe environment. Additionally, it enables this without endangering patients.

A great example of virtual reality tech in healthcare is the human simulation system, which enables doctors, nurses, and other medical professionals to interact with patients and engage in 3D interactive and immersive training scenarios. Virtual robotic surgery is also a popular use of virtual reality tech in the healthcare domain. In virtual robotic surgery, the surgery is performed by a robotic arm on an actual patient or in a virtual training environment. The robotic arm is controlled by a professional human surgeon. This is very useful in the field of remote telesurgery, where the patient is in a separate location from the doctor, for example in an emergency situation on a battlefield or a boot camp set up in rural areas.

More examples of virtual reality in healthcare:

• Virtual reality in dentistry

Nobody wants to be a guinea pig for dental students in training. Thanks to virtual reality, dental students can make all their mistakes on a virtual patient (Figure 9). One program, called the Virtual Dental Implant Training Simulation Program, walks students through an entire procedure, from administering anesthesia to choosing the right drill size. Virtual patients even come with different personalities and medical histories.

Figure 9. VR for Training Dental Students (source: evolving-science.com)

• Virtual reality in medicine

• Virtual reality in nursing

• Virtual reality in surgery

• Surgery simulation

• Virtual reality in Automotive Manufacturing

From the design process to virtual prototypes, car manufacturers have been using high-tech simulations for decades. With the Oculus Rift, Ford Motor Company has made virtual reality central to its automotive development. In Ford's Immersion Labs [8], employees can wear a virtual-reality headset and inspect the interior and exterior of a car, as well as have a seat inside an automobile before it is manufactured (Figure 10). The prototype in the virtual environment allows designers and engineers from various departments to closely inspect different elements, such as the engine or upholstery, and spot potential problems before they arise.

Figure 10. VR CAD Simulator for Automotive Industry (source: roadtovr.com)

• Virtual reality for the disabled

In 1994, The New York Times [7] ran a story describing multiple uses, like a VR experience that let a 5-year-old boy with cerebral palsy take his wheelchair through a grassy field, or another that let 50 children with cancer spend some time "swimming" around an animated fish tank.


Figure 11. VR Travel Experience for the Disabled (source: npr.org)

• Virtual reality therapies

• Virtual reality treatment for autism

Professors at the University of Texas at Dallas created a training program to help kids with autism work on social skills. It uses brain imaging and brain wave monitoring, and essentially puts kids in situations like job interviews or blind dates using avatars. The study found that after completing the program, participants' brain scans showed increased activity in areas of the brain tied to social understanding.

• Virtual reality rehabilitation

For patients who survived a stroke or traumatic brain injury, time is of the essence. The earlier they start rehabilitation, the better their chances of successfully regaining lost functions.

Figure 12. Physical Therapy through VR (source: rehabalternatives.com)

• Virtual reality in phobia treatment

One treatment for patients with phobias is exposure therapy. The VR experience provides a controlled environment in which patients can face their fears and even practice coping strategies, as well as break patterns of avoidance, all while in a setting that's private, safe, and easily stopped or repeated, depending on the circumstances.

Figure 13. VR helping to fight phobias in a safe controlled environment (source: ibtimes.sg)

• Virtual reality treatment for PTSD

Nearly 8 million adults suffer from PTSD (Post-Traumatic Stress Disorder) during a given year, according to the National Center for PTSD. The condition can occur after someone has been exposed to a significant stressor and often includes symptoms such as avoidance, hyper-vigilance, anger issues, and mood swings. One common method of treatment is called "exposure therapy." A therapist can take advantage of virtual reality by observing patients while they navigate the virtual environment. The virtual environment can be controlled in real time by the therapist, who increases or decreases the difficulty so that the patient doesn't get overwhelmed by particular emotions.

2.5 Virtual Reality Game Accessories

Virtual Reality is shedding its niche status and emerging as a technology to look forward to in the coming years. With the currently expanding industry, and with big names such as Facebook and HTC mass-producing their Oculus Rift CV1 and VIVE headsets respectively, the hardware required to drive them is getting cheaper, and with that, more and more people are able to experience the immersive nature of VR. With that in mind, a big question for VR is how immersive the whole experience can get. The visual experience of VR is half the story; for a fully immersive experience, the control scheme within the VR environment must be as immersive as the VR experience itself. Currently, many developers are designing new immersive controllers for the VR experience. These designs come in two broad types: devices with button input control, such as remotes and joysticks, and devices that track the user's body motion. In my research, I found some devices that claim to be superior to the traditional control scheme of mouse and keyboard. Some of the interesting ones were:

• PrioVR

PrioVR uses high-performance inertial sensors to provide 360 degrees of low-latency, real-time motion tracking without the need for cameras, optics, line-of-sight, or large, awkward equipment (Figure 14). The downside is that the user requires a large space and must wear a mesh of sensors on the body. All the wires and sensors dangling from head to toe can actually restrict the user's movement.

Figure 14. PrioVR by YEI Technology (source: roadtovr.com)

• Cyberith Virtualizer

The Virtualizer's (Figure 15) flat base plate has a low-friction surface that enables the user to walk, run, and strafe freely in every direction. As it's flat, movement feels realistic, dramatically enhancing immersion. On the downside, the system requires a large space and encages the user within it. It might feel too tight or claustrophobic to some.

Figure 15. Cyberith Virtualizer (source: cyberith.com)

• Dexmo

Dexmo (Figure 16) is a wearable mechanical exoskeleton that captures your hand motion as well as providing you with force feedback. It breaks the barrier between the digital and real worlds and gives you a sense of touch.

Figure 16. Dexmo Exoskeleton Glove (source: roadtovr.com)

What the industry currently lacks is a benchmark to compare the immersive factor of VR technology against traditional displays and controllers. Many studies have been conducted comparing the VR experience against the traditional virtual 3D experience, but the goal of this research was to compare and contrast the different control methods. With this in mind, we wanted to be able to test the two VR experiences and a standard virtual experience within a first-person shooter (FPS). For assessment, we used in-game metrics to evaluate performance within the game, and also used surveys that gave us a description of the player's past experience with VR and gaming in general, as well as their immersion and awareness while in the game itself.

Chapter 3

Virtual Reality Zombie Game

The idea was to design a 3D, 360-degree virtual reality game and to develop an interface between ICE MyoHMI and the game. The game would act as a third-party client application that accepts its control input from MyoHMI. The game would also be a proof of concept for this research, and the interface would be a stepping stone for several VR or non-VR applications in the fields of gaming, simulation, and, most importantly, healthcare, for treatment of post-traumatic stress disorder (PTSD), anxiety, rehabilitation, surgery simulation, etc.

After brainstorming game ideas, a first-person shooter (FPS) game was selected (Figure 17). This particular genre was selected because FPS game controls could efficiently utilize and test MyoHMI and all of its functionalities. The objective of the game is for the player to survive the transition from one location to a rescue point while trying to obtain a high score based on eliminating AI enemies, which in this case are zombies trying to attack the player and take down the player's health. Many small yet significant control inputs were added to the game, such as inputs to propel the player forward, switch weapons, reload weapons, switch the vehicle's headlights on and off, and shoot the weapon. Additionally, enemies were spawned randomly at specific locations within the game, with a maximum number of enemies allowed within the game at any given time. To simplify the inputs, there is no free range of movement: the player is in a car for the majority of the game, and at the last checkpoint the player gets out of the car and proceeds towards a helicopter at the finishing point of the game.

The game lasts around 2.5 to 3 minutes. The difficulty of the game can be adjusted by controlling the number of enemies spawning in the game. To start, we experimented with 7 recurring enemies, meaning that as soon as the player kills an enemy, another one re-spawns at a random spawn point, but there will be a maximum of 7 enemies at any given point in time in the game.

Figure 17. VR Zombie FPS Screenshot

The research also aims at comparing the proposed gesture-based control scheme and traditional control schemes using peripheral devices such as a keyboard and/or a mouse, to evaluate which gives a better sense of immersion for the user. Specifically, the video game was developed with three different control methods. The first method is the traditional one, based on standard input with a keyboard and a mouse while the output is displayed on a monitor; the second method uses the same input scheme with keyboard and mouse, but the output is displayed on the Oculus Rift DK2 HMD; and the third and last method is based on input from the Myo armband running the custom ICE HMI software while the output is displayed on the Oculus Rift DK2 HMD.

For the keyboard and mouse implementation, the mouse controls aiming and shooting while keys on the keyboard handle the other actions performed within the game, such as switching weapons, reloading weapons, player movement, and interaction with objects within the game. For both VR implementations (Figure 18), the HMD controls the aiming and the sight while the inputs are handled by whichever input device is being used (mouse or Myo).

Figure 18. Different Control Schemes to Play the VR Zombie Game

3.1 Human Machine Interface for VR Game

An interface pipeline was developed in order to give the ICE HMI the capability to output gesture classifications and IMU data to a client application. The client application can then use this data for its desired purposes. The virtual reality video game explained previously was developed in conjunction with the ICE HMI and implemented this pipeline to use gesture output from the HMI for control purposes inside the game, as opposed to standard keyboard/mouse control.

The game naturally accepts discrete keystroke commands as input. One issue in interfacing the HMI with the game was that the HMI was designed to output a continuous stream of gesture classifications. The challenge was in finding a means by which to pick the correct classifications out of this continuous stream to feed to the game as keystroke commands. The method implemented used the windowed standard deviation of the MMAV of the EMG signals, sending the latest classification only after this value had dropped below a user-defined threshold, as sketched below.
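As a rough sketch of this selection rule, the fragment below tracks a short history of MMAV values and reports when their standard deviation has fallen below the threshold, i.e., when the muscle activity has settled enough to forward the latest classification. The history length and threshold here are assumed placeholder values, not those used in the ICE HMI.

using System;
using System.Collections.Generic;
using System.Linq;

class SteadyStateGate
{
    private readonly Queue<double> _mmavHistory = new Queue<double>();
    private readonly int _windowSize;    // number of recent MMAV values kept (assumed)
    private readonly double _threshold;  // user-defined steadiness threshold

    public SteadyStateGate(int windowSize = 10, double threshold = 0.05)
    {
        _windowSize = windowSize;
        _threshold = threshold;
    }

    // Returns true when activity is steady enough that the latest
    // gesture classification should be sent to the client.
    public bool Update(double mmav)
    {
        _mmavHistory.Enqueue(mmav);
        if (_mmavHistory.Count > _windowSize) _mmavHistory.Dequeue();
        if (_mmavHistory.Count < _windowSize) return false;

        double mean = _mmavHistory.Average();
        double variance = _mmavHistory.Average(v => (v - mean) * (v - mean));
        return Math.Sqrt(variance) < _threshold;
    }
}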

To carry these classifications between processes, named pipes were used, providing interprocess communication between a pipe server (the HMI) and a pipe client (the VR Zombie game) on a local machine. Named pipes were selected because they provide full-duplex communication, a feature that anonymous pipes lack. The server creates a thread that can accept a client connection; the connected client process then sends an acknowledgment bit to the server. The server process then opens a buffer and sends the gesture bits to the client application. For debugging purposes, the data received from the server is also displayed on the console in Unity3D. In this research project, the client and server processes are intended to run on the same machine, so the server name provided to the

NamedPipeClientStream object is "." as shown in Figure 19. If the client application and the server application were on different systems, the "." would be replaced by the network name or the global IP of the server.

using (NamedPipeClientStream pipeClient = new NamedPipeClientStream(".", pipeName, PipeDirection.InOut))

Figure 19. Duplex Named Pipe Implementation on the Local Machine
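For context, the server side of the exchange described above might look like the sketch below: one worker thread accepts the client connection, reads the client's acknowledgment, and writes gesture bytes back. The pipe name, payload values, and single-exchange structure are placeholders for illustration; the actual ICE HMI server runs continuously and differs in detail.

using System;
using System.IO.Pipes;
using System.Threading;

class PipeServerSketch
{
    static void Main()
    {
        var worker = new Thread(() =>
        {
            // "gesturePipe" is a placeholder pipe name, not the HMI's actual one.
            using (var server = new NamedPipeServerStream("gesturePipe", PipeDirection.InOut))
            {
                server.WaitForConnection();

                // The connected client announces itself with an acknowledgment byte.
                int hello = server.ReadByte();

                // Send one example 3-byte gesture package (index, magnitude, flags,
                // as described in the text below).
                byte[] package = { 2, 117, 0b0000_0001 };
                server.Write(package, 0, package.Length);

                // Read the client's per-package acknowledgment, which the HMI
                // can use, for example, to vibrate the Myo armband.
                int ack = server.ReadByte();
                Console.WriteLine($"client hello: {hello}, package ack: {ack}");
            }
        });
        worker.Start();
        worker.Join();
    }
}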

Currently, the HMI pipeline has the ability to output a 3-byte package to a client application, with each byte representing a specific value: the index of the most recent gesture classification, the average magnitude of that gesture (MMAV), and an 8-bit byte with each bit representing either a gesture command or information from the IMU (i.e., arm swings). Each bit can be toggled to represent the current state of the gesture assigned to that bit. Three control schemes to set/reset a bit have been implemented: continuously (the bit is set for as long as the gesture is being classified), when entering a steady state of the gesture, or when an MMAV peak of the gesture has occurred (i.e., the user has not changed the gesture, only tensed and relaxed).
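A hedged sketch of how such a 3-byte package might be assembled is shown below; the flag names, bit assignments, and the scaling of the magnitude byte are illustrative assumptions rather than the HMI's actual encoding.

using System;

[Flags]
enum GestureBits : byte
{
    None         = 0,
    Fire         = 1 << 0,  // fist, hand down (assumed assignment)
    ChangeWeapon = 1 << 1,  // sweep right, hand down
    Reload       = 1 << 2,  // fist, hand up
    Headlights   = 1 << 3,  // sweep right, hand up
    ArmSwing     = 1 << 4   // IMU-derived step
}

static class GesturePackage
{
    // Pack the three bytes described above: classification index,
    // gesture magnitude (MMAV scaled to 0-255), and the toggle bits.
    public static byte[] Build(byte gestureIndex, byte magnitude, GestureBits bits)
    {
        return new byte[] { gestureIndex, magnitude, (byte)bits };
    }
}

For example, GesturePackage.Build(2, 117, GestureBits.Fire | GestureBits.ArmSwing) would describe a hypothetical fist gesture made mid-swing.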

Figure 20. HMI Interface Flow

Upon successfully receiving the package, the client application sends back a single byte as an acknowledgment to the server (HMI); this way the client can give feedback to the server. The HMI can receive a number of different acknowledgments, which can be set to vibrate the Myo armband, providing haptic feedback to the user.

Chapter 4

Experiment & Results

4.1 Experiment Protocol

After testing the ICE HMI and the VR Zombie game individually, it was time to design an experiment protocol that would test the entire system (the ICE HMI, the interface pipe, and the VR Zombie game) as one. The VR game was used as a usability assessment platform to test the feasibility of using gesture classifications and IMU data from the HMI to control the game, as opposed to a standard keyboard/mouse setup. Two different survey questionnaires were created (included in the Appendices), along with a pre-survey that was given to each subject to collect information about his or her age range (Table 1), gender, and whether or not the subject had prior experience with VR (Table 3) or first-person shooters (Table 2).

Age (years)     Subjects (percentage)
18-20           33.3
20-24           22.2
24+             44.4

Table 1. Age Distribution of VR Test Subjects

Four gestures were used to control the game, which included inputs from both EMG and IMU: fist with hand down (fire weapon), sweep right with hand down (change weapon), fist with hand up (reload weapon), and sweep right with hand up (toggle vehicle headlights). In addition to the four gestures, arm swings were used to control a walking portion of the game; each arm swing equates to one step inside the game. (A sketch of how these classifications might map to in-game actions follows below.)
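As an illustration of how the game side might map these classifications to actions, the dictionary below pairs each classification index with an in-game action; the indices and action names are assumed for the example and are not taken from the actual game code.

using System;
using System.Collections.Generic;

enum GameAction { FireWeapon, ChangeWeapon, ReloadWeapon, ToggleHeadlights }

static class ControlMap
{
    // Hypothetical classification indices for the four trained gestures.
    static readonly Dictionary<int, GameAction> Map = new Dictionary<int, GameAction>
    {
        { 1, GameAction.FireWeapon },       // fist, hand down
        { 2, GameAction.ChangeWeapon },     // sweep right, hand down
        { 3, GameAction.ReloadWeapon },     // fist, hand up
        { 4, GameAction.ToggleHeadlights }  // sweep right, hand up
    };

    public static bool TryGetAction(int gestureIndex, out GameAction action) =>
        Map.TryGetValue(gestureIndex, out action);
}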

Experience (years)     Subjects (percentage)
Never                  11.1
1-3                    33.3
4-10                   22.2
11+                    33.3

Table 2. FPS Game Experience of VR Test Subjects

Eleven subjects participated in the experiment, of which 9 were male and 2 were female. Prior to each trial, the subject was given a training session on how to play the game. Each subject conducted two trials, one for each control scheme: using the mouse only and using the Myo armband only. Before the Myo armband trial, each subject had to train each of the four gestures in the GCI to ensure accuracy of gesture classification. The subject then put on an Oculus Rift VR headset and played each trial of the game using

one of the control schemes. Each subject then filled out a post-survey meant to gauge the subject’s experience in using the two control methods.

The metrics used in the usability test included firing accuracy (the ratio of hits to shots fired), the final score of the match (based on the number of zombies killed), and the remaining health of the player.

VR Experience        Subjects (percentage)
First Time           33.3
1-5                  44.4
6-20                 22.2
Owns VR Headset      0

Table 3. VR Experience of VR Test Subjects

4.2 Evaluation & Results

After the subject experiments, the average results with mouse control were somewhat higher than with gesture control (Table 4). The score was calculated on the basis of the number of zombies killed:

Score = (number of zombies killed) × 10

Also, only four out of the eleven subjects were able to complete the game without their character dying when using gesture controls, whereas all but two of the subjects completed the game with health remaining using the mouse.

Accuracy was the ratio of the number of shots that actually hit the zombies to the total number of shots fired from the gun.

                 Mouse (Avg.)   Mouse (S.D.)   Myo (Avg.)   Myo (S.D.)
Score            199.09         51.46          104.46       64.08
Final health     46.81          28.83          16.36        13.66
Accuracy         65.56          19.39          59.5         28.17

Table 4. Average Results with Standard Deviation of VR Implementation Tests

The game starts with 100 health points allotted to the player. These health points are decremented by 5 points whenever a zombie attacks the player. The total health points remaining at the end of the game were considered the final health. The game ends when the player reaches zero health.

The post-surveys suggest that subjects found gesture controls difficult to use. Averages of the post-survey responses were taken to compute the final results. 66.66% of the respondents answered that the gesture controls were either hard or very hard to use. Responses were taken on a scale of 1-5, where 1 = very easy and 5 = very hard (Table 5).

                               Response
Ease to see zombies in VR      1.5
Shoot with Myo                 3.5
Shoot with mouse               1
Reload with Myo                3.5
Reload with mouse              1
Ease of play with Myo          3
Ease of play with mouse        1.5

Table 5. Post-Survey Results for Ease of Play with Different Control Schemes

The post-surveys also revealed that 90% of the subjects found it more natural to look at the zombies in VR, but only 40% found it natural to play with the Myo, in contrast to 80% who felt it more natural with the mouse. Responses were taken on a scale of 1-5, where 1 = very natural and 5 = not natural at all (Table 6).

                          Response
Look of zombies in VR     1.5
Shoot with Myo            3
Shoot with mouse          2
Playing with Myo          3
Playing with mouse        2

Table 6. Post-Survey Results for Feel of the Game with Different Control Schemes

The engagement of the player with the different control schemes was measured next: 70% of the subjects found the game very fun to play with the Myo, in contrast to 100% of subjects who found it very fun to play with the mouse. Responses were taken on a scale of 1-5, where 1 = very fun and 5 = not fun at all (Table 7).

                      Response
Playing with Myo      2
Playing with mouse    1.5

Table 7. Post-Survey Results Depicting Engagement of Players with Different Control Schemes

The post-surveys also revealed that 60% of the subjects felt muscle fatigue after playing the game with the Myo, although the fatigue was temporary and was caused by the excessive use of muscles in making repetitive gestures, such as making a fist to shoot zombies. It was found that, on average, a subject would make the gesture used for shooting (a fist, in this case) approximately 40 times over the full course of the game. Responses were taken on a scale of 1-5, where 1 = not tiring and 5 = very tiring (Table 8).

                      Response
Playing with Myo      3
Playing with mouse    1

Table 8. Post-Survey Results Depicting Associated Level of Fatigue with Different Control Schemes

4.3 Results Summary

On the engagement front, 66% of subjects believed they lost the sense of time while playing with the Myo, in comparison to 55% with the mouse. 44% of the subjects felt unaware of their environment while playing with the Myo and VR, in contrast to 55% who played with the mouse and VR. 66% felt that the gameplay was fairly automatic with the Myo, while 55% felt the same while playing with a mouse. 33% of subjects forgot that they were playing a game while using the Myo, in comparison to 11% who felt the same while playing with a mouse. 55% of the subjects didn't want to stop playing the game with the Myo, while 66% didn't want to stop playing with a mouse.

4.4 Drawbacks of Current Approach

4.4.1 Drawbacks of Virtual Reality Game

The goal of this project was to develop an interface from the HMI to an external application (the Virtual Reality Zombie game in this case). Due to time constraints, we were limited to a set of features for the game. The initially proposed approach was to build a free-roaming 3D environment within the game so that the player could move anywhere on the map, while making the game more interactive by placing ammunition and weapon chests across the map. This would have given us a chance to try a wider range of gestures for different movements within the game, as well as making it more interactive.

4.4.2 Drawbacks of Custom Interface Pipe

Due to time constraints, we were unable to make the interface pipeline as customizable as we initially desired. A three-byte package was sent to the client application; the bytes comprised the index of the most recent gesture, its duration, and the average magnitude of the gesture. The feature to transmit the average magnitude of the gesture wasn't implemented in the initial trials of the testing; it was skipped and left over for the next iteration of the game. The way the pipe handles the third byte of the package was written specifically for the use of the virtual reality application, and much of its functionality was hardcoded into the HMI, making it not very flexible for other client applications. Ideally, inside the GUI, the user would have an option in the output tab to assign a desired gesture to any of the 8 bits of the value and decide how that bit is set before the package is sent (either continuously set while a certain gesture is being made, upon entering a steady state of the current gesture, or upon a tense of the current gesture).

Chapter 5

Features Added to Interface

5.1 Implementing Third Byte in the HMI

Due to time constraints, the implementation of the magnitude of the gesture via the third byte in the pipe wasn't completed in the aforementioned project. Determining the magnitude of the gesture and making it available to the client application, i.e. the VR Zombie game, was one of the important features of the ICE HMI. To implement this feature, some changes had to be made in the ICE HMI itself. The changes were made by putting thresholds on the gesture magnitude at the time of recording the gestures. Two thresholds were kept, at 25% and 60% of the windowed standard deviation of the MMAV of the EMG signals. The HMI was reconfigured to output two different identifiers, one for magnitudes between 25% and 60% and another for magnitudes above 60%. The game was further updated to reflect the changes made in the HMI. In the game's user interface, an intensity bar was implemented. The purpose of this intensity bar was to display the average intensity of the gesture made. The intensity bar was made to change its color depending on the intensity magnitude of the gesture: it turns red for an intensity magnitude between 25% and 60% and turns blue when the magnitude exceeds 60% (Figure 21). This intensity bar was placed in the Heads-Up Display (HUD) of the game. The intensity bar was implemented in such a way that it becomes visible only when the game receives the third-byte data from the pipe, i.e. only when the game is played using gestures with the Myo armband. If the player switches to more traditional inputs like keyboard and mouse, the intensity bar isn't displayed. This implementation also served as a debugging tool for testing the data in the third byte.
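A minimal Unity-style sketch of that color rule, assuming the intensity arrives already normalized to a 0-100 percentage, is shown below; the actual component wiring in the game differs, and the sub-25% color is an assumption.

using UnityEngine;
using UnityEngine.UI;

public class IntensityBar : MonoBehaviour
{
    public Image bar;  // the HUD bar whose fill and color are driven

    // Called whenever a third-byte intensity value (0-100) arrives from the pipe.
    public void SetIntensity(float percent)
    {
        bar.gameObject.SetActive(true);    // only shown while gesture data arrives
        bar.fillAmount = percent / 100f;

        if (percent > 60f)
            bar.color = Color.blue;        // above the 60% threshold
        else if (percent >= 25f)
            bar.color = Color.red;         // between the 25% and 60% thresholds
        else
            bar.color = Color.gray;        // below 25%: neutral (assumed)
    }

    // Hide the bar when playing with keyboard/mouse (no third byte received).
    public void Hide() => bar.gameObject.SetActive(false);
}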

Figure 21. Implementation of the Intensity Bar

5.2 Designing an HMI Simulator

To ease the development and testing process of the interface and the game (the client application), an HMI simulator was developed that simulated the output of the ICE HMI. The simulator produced the three-byte package data as an output for the game. This approach also saved time, as more than 60% of the time in each trial was consumed in setting up the HMI and recording trials. With the simulator, we were able to send a definite number of data packets and then compare on the client side for successful delivery of the same packets. This helped to analyze any packet loss during transmission in the interface pipe or in the game.

The simulation software was basically an API (Application Programming Interface) developed in JavaScript, a scripting language similar to C# (the language used for the VR Zombie game development). The simulator was set to send a response every 3 seconds. The responses were an idle bit, a weapon fire gesture bit, a weapon reload gesture bit, a toggle car headlights gesture bit, and a change weapon gesture bit.

Figure 22. HMI Simulator Flow

The simulator was set to send a fire bullet gesture every 15 seconds. When the bullet count reached 20, it would send a reload gesture. Also, when the bullet count was a multiple of 5, the simulator would trigger the vehicle headlight toggle gesture. Apart from the above-mentioned responses, the simulator would send an idle response when no gesture was made. Each package comprises the index of the gesture made, the duration of the gesture, and the magnitude of the intensity of the gesture. A sample of the simulator's console output is reproduced below (Figure 23):

gesture: idle
bulletCount: 5  gesture: light gesture
bulletCount: 5  gesture: idle
bulletCount: 5  gesture: idle
bulletCount: 5  gesture: idle
bulletCount: 5  gesture: idle
bulletCount: 5  gesture: idle
bulletCount: 6  gesture: bullet fire

Figure 23. HMI Simulator API Working
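The simulator itself was written in JavaScript; purely for illustration, the trigger logic described above is sketched here in C# (the game's language). The 3-second pacing, the reset of the bullet count after a reload, and the one-shot headlight trigger are assumptions made for the sketch.

using System;
using System.Threading;

class HmiSimulatorSketch
{
    static void Main()
    {
        int bulletCount = 0;
        bool lightPending = false;

        for (int elapsed = 3; elapsed <= 60; elapsed += 3)  // one response every 3 seconds
        {
            string gesture = "idle";

            if (elapsed % 15 == 0)                // a fire gesture every 15 seconds
            {
                bulletCount++;
                if (bulletCount == 20)
                {
                    gesture = "reload";
                    bulletCount = 0;              // assumed: the count resets on reload
                }
                else
                {
                    gesture = "bullet fire";
                    if (bulletCount % 5 == 0)     // headlight toggle at multiples of 5
                        lightPending = true;
                }
            }
            else if (lightPending)
            {
                gesture = "light gesture";
                lightPending = false;
            }

            Console.WriteLine($"bulletCount: {bulletCount}  gesture: {gesture}");
            Thread.Sleep(3000);                   // simulated 3-second response spacing
        }
    }
}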

Chapter 6

Conclusion & Discussion

This research work was carried out for the classification of different hand gestures based on EMG signals. The system combines data from an IMU accelerometer and multi-channel EMG sensors to achieve real-time hand gesture recognition; the recognized gestures are then streamed to an external client application (the Zombie game in virtual reality) via the custom-developed Human Machine Interface pipe. Experiments were conducted to evaluate the usability and the overall user engagement of the system. The additions that have been made to the interface have improved usability as well as functionality, especially the implementation of the third byte to depict the intensity magnitude of the gesture. This feature has a special use case in rehabilitation therapies and can also be used productively in future interactive applications.

6.1 Results Concluded

From the research results, it can be concluded that playing the game with gesture controls over long test sessions induces stress on the limb muscles, which further induces fatigue and reduces overall performance in the game, as 60% of the subjects felt muscle fatigue after playing the game with the Myo. However, this may only be the case with this particular game, as the game is designed to be a fast-paced action shooter. The results with other applications, especially training and rehabilitation, might vary.

From the results it can also be concluded that 90% of the subjects found the game more immersive in VR in comparison to playing on a monitor, but only 40% found it natural to play with the Myo, in contrast to 80% who felt it more natural with the mouse. This may be due to the short adaptation time given to the subjects, as there were only three trials, which lasted approximately 15 minutes in total.

6.2 Future Work

A large part of the work done during this project has been laying the groundwork for future improvements to be made to the interface. The future work will focus on enhancing the robustness of the system and extending the methods to other types of applications, for example, to gesture-based mobile interfaces.

The data used for the presented experiments was not collected by experienced professionals, and some level of error is possible. With only eleven subjects and three trials, additional evaluations of the interface and the pipe-client architecture are suggested. However, the findings from this research are promising.

Lastly, we would like to propose the full execution and real-world, real-time testing of the interface with the state-of-the-art ICE HMI system. From the findings of this project, we suggest the development of a modular and flexible application that can utilize the full potential of the system and can use the 8-bit byte of the interface in an interesting and useful way.

References

1. Myo armband wiki: https://en.wikipedia.org/wiki/Myo_armband

2. "Facebook to Acquire Oculus". Facebook Newsroom. Facebook. March 25, 2014. Retrieved March 26, 2014.

3. Plunkett, Luke (March 25, 2014). "Facebook Buys Oculus Rift For $2 Billion". Kotaku.com. Retrieved March 25, 2014.

4. Welch, Chris (March 25, 2014). "Facebook buying Oculus VR for $2 billion". The Verge. Retrieved February 2, 2017.

5. Oculus VR wiki: https://en.wikipedia.org/wiki/Oculus_VR

6. http://fortune.com/2015/11/30/investment-hot-virtual-reality/

7. http://www.nytimes.com/1994/04/13/garden/in-virtual-reality-tools-for-the-disabled.html

8. https://media.ford.com/content/fordmedia/fna/us/en/news/2013/12/12/new-virtual-lab-improves-ford-global-vehicle-quality--engineers-.html

9. Zhang, Xu, Xiang Chen, Yun Li, V. Lantz, Kongqiao Wang, and Jihai Yang. "A Framework for Hand Gesture Recognition Based on Accelerometer and EMG Sensors." IEEE Transactions on Systems, Man, and Cybernetics - Part A: Systems and Humans. IEEE (2011).

10. Phelan, Ivan, Madelynne Arden, Carol Garcia, and Chris Roast. "Exploring Virtual Reality and Prosthetic Training." IEEE Virtual Reality (VR). IEEE (2015).

11. Donovan, I., Valenzuela, K., Ortiz, A., Dusheyko, S., Jiang, H., Okada, K., and Zhang, X. "MyoHMI: A Low-Cost and Flexible Platform for Developing Real-Time Human Machine Interface for Myoelectric Controlled Applications." IEEE Trans. SMC, 2016, 4495-4500.

12. Silver, Curtis (August 17, 2015). "Gift This, Not That: Myo Armband vs. This Toaster". www.forbes.com. Retrieved 2015-12-25.

13. Wilson, P. T., Nguyen, K., Harris, A., and Williams, B. 2014. "Walking in place using the Microsoft Kinect to explore a large VE." In Proceedings of the 13th ACM SIGGRAPH International Conference on Virtual-Reality Continuum and Its Applications in Industry, ACM, New York, NY, USA, VRCAI '14, 27-33.

14. McMahan, R.P., Gorton, D., Gresock, J., McConnell, W., and Bowman, D.A. "Separating the Effects of Level of Immersion and 3D Interaction Techniques." Proceedings of the ACM Symposium on Virtual Reality Software and Technology (VRST), pp. 108-111, 2006.

15. Usoh, M., Arthur, K.W., Whitton, M.C., Bastos, R., Steed, A., Slater, M., and Brooks, F.P. "Walking > Walking-in-Place > Flying, in Virtual Environments." Proceedings of the Annual Conference on Computer Graphics and Interactive Techniques (SIGGRAPH), pp. 359-364, 1999.

16. Whitton, M. C., Cohn, J. V., Feasel, J., Zimmons, P., Razzaque, S., Poulton, S. J., Mcleod, B., and Brooks Jr., F. P. 2005. "Comparing VE locomotion interfaces." In Virtual Reality, 2005. Proceedings. VR 2005. IEEE, 123-130.

17. Van Wyk, E., and De Villiers, R. 2009. "Virtual reality training applications for the mining industry." In Proceedings of the 6th International Conference on Computer Graphics, Virtual Reality, Visualization and Interaction in Africa, ACM, 53-63.

18. Razi, M., Tsvirkunova, L., Chow, J., Reus, R., Donovan, I., Enriquez, A., Pong, W., and Zhang, X. "Engaging Community College Students in Engineering Research through Design and Implementation of a Human-Machine Interface for Gesture Recognition." Proceedings: 2016 American Society of Engineering Education Conference, Pomona, CA, April 21-23, 2016.

APPENDIX A: ENGAGEMENT QUESTIONNAIRE

Directions: For each of the following, please rate your experience of the sensation while playing the game, on the following scale from 1 (did not experience) to 5 (definitely experienced).

1. I lost the sense of time (Myo)
2. I lost the sense of time (Mouse)
3. I felt scared (Myo)
4. I felt scared (Mouse)
5. I felt unaware of my surroundings (Myo)
6. I felt unaware of my surroundings (Mouse)
7. Playing seemed automatic (Myo)
8. Playing seemed automatic (Mouse)
9. I forgot this was a game (Myo)
10. I forgot this was a game (Mouse)
11. I played without thinking of my actions (Myo)
12. I played without thinking of my actions (Mouse)
13. I did not want to stop playing (Myo)
14. I did not want to stop playing (Mouse)

APPENDIX B: USABILITY QUESTIONNAIRE

1. Rate how easy it was to view zombies within the game (1 = very easy, 5 = very hard)
2. How easy was it for you to shoot and hit the zombies with the Myo
3. How easy was it for you to shoot and hit the zombies with the mouse
4. How easy was it for you to reload your weapons with the Myo
5. How easy was it for you to reload your weapons with the mouse
6. How easy was the game for you with the Myo
7. How easy was the game for you with the mouse
8. How natural was it for you to look at the zombies (1 = very natural, 5 = not natural at all)
9. How natural was it for you to shoot and hit the zombies with the Myo
10. How natural was it for you to shoot and hit the zombies with the mouse
11. How natural was the game in general with the Myo
12. How natural was the game in general with the mouse
13. How fun was the game with the Myo (1 = very fun, 5 = not fun at all)
14. How fun was the game with the mouse
15. How tiring was the game with the Myo (1 = not tiring at all, 5 = very tiring)
16. How tiring was the game with the mouse
17. How accurate was the gesture control with the Myo (1 = very accurate, 5 = not accurate at all)