MHEALTH TRACKER TO TRACK POSTURAL STABILITY AND PAIN HISTORY

by

JIA CHEN

Submitted in partial fulfillment of the requirements

For the degree of Master of Science

Electrical Engineering and Computer Science

CASE WESTERN RESERVE UNIVERSITY

May, 2019

CASE WESTERN RESERVE UNIVERSITY

SCHOOL OF GRADUATE STUDIES

We hereby approve the thesis/dissertation of

Jia Chen

candidate for the degree of Master of Science *.

Committee Chair

Ming-Chun Huang

Committee Member

Jing Li

Committee Member

Andy Podgurski

Date of Defense

January, 2019

*We also certify that written approval has been obtained

for any proprietary material contained therein.

Table of Contents

List of Tables v

List of Figures vi

Acknowledgements vii

Abstract viii

Chapter 1. Introduction 1

Chapter 2. Literature Review 7

Qualitative Balance Test 7

Quantitative Balance Test 8

Pain Measurements 9

Augmented Reality 11

Summary 13

Chapter 3. mHealth System 15

Subsystem: Wearable Gait Lab 16

Subsystem: Pain Marker 23

Chapter 4. Experiments 32

Experiments for Subsystem: WGL 32

Experiments of Subsystem: Pain Marker 36

Chapter 5. Results and Case Study 37

Subsystem: WGL System 37

Subsystem: Pain Marker 43

Chapter 6. Discussion 48

Compromise of Efficiency and Accuracy 48

Segmentation Rules 49

Chapter 7. Conclusion 51

Chapter 8. Suggested Future Research 53

mHealth Tracker 53

WGL: Dynamic Balance Tests and Daily Activities 53

WGL: Balance Instructions with HoloLens 54

Pain Marker: Distinguish Pain In Different Anatomy System 55

Appendix. Complete References 56

List of Tables

4.1 Summary of Experiments and Calculated Parameters 33

5.1 WGL: Linear Correlation Between COP and COG 40

5.2 WGL: Sway Velocities of Subject #5 in LOS Experiments 40

5.3 WGL: Statistical Data in Sit-To-Stand Tests 41

5.4 WGL: On-Axis Velocities in RWS Tests 42

5.5 Pain Marker: Comparison of Pain Marker Records and Traditional Self-Evaluation Questionnaires 44

5.6 Pain Marker: Pain Information Overview 47

List of Figures

3.1 mHealth System Overview 15

3.2 Wearable Gait Lab Subsystem implementation flowchart 16

3.3 WGL: Wearable Underfoot Force Sensing Unit and Joint Angular and EMG Sensing Unit 17

3.4 WGL: App Overview 19

3.5 WGL: PC User Interface Overview 20

3.6 Pain Marker Subsystem Overview 24

3.7 Microsoft HoloLens Hardware 25

3.8 Pain Marker: Self-Report panel 27

3.9 Pain Marker: Mark Function Example. 28

3.10 Pain Marker: Show History Panel 29

3.11 Pain Marker: Cursor Design 31

4.1 WGL: LOS Experiment Setup 34

4.2 WGL: STS Experiment Setup 35

4.3 WGL: RWS Experiment Setup 35

4.4 Pain Marker: Questionnaire Example 36

5.1 WGL: Linear Regression Analysis 38

5.2 WGL: An Example of Center of Gravity (COG) in LOS experiments 39

5.3 Pain Marker: Correlation of App Pain Severity Word Scale and Questionnaire Numeric Scale 45

Acknowledgements

This research was approved under CWRU IRB Protocols IRB-2016-1419 and IRB-2016-1504.


mHealth Tracker to Track Postural Stability and Pain History

Abstract

by

JIA CHEN

Balance ability and pain are two factors often monitored at the same time in actual treatment and medical research. Balance has been an essential indicator of human health and is realized through the coordination and support of several body systems, including the vestibular, visual, auditory, motor, and higher-level premotor systems1. Pain itself can be a central feature, and it can also be regarded as a symptom of some underlying process2. Studies suggest that pain and balance ability are related; however, further study of their relationship requires a system for monitoring balance and pain conditions over the long term, collecting relatively objective data, and providing sufficient visualization during and after collection. Therefore, the mHealth Tracker system is proposed in this thesis to fill this gap and to support balance- and pain-related research and regular treatment. The system consists of two subsystems: the Wearable Gait Lab (WGL) subsystem, for monitoring foot activity during balance tests, and Pain Marker, for self-reporting and reviewing pain information. The reliability of each subsystem of mHealth Tracker is tested and evaluated separately against standard tools in current use. The WGL subsystem is evaluated with standard balance tests (Limits of Stability, Sit-To-Stand, and Rhythmic Weight Shift), and the collected data are analyzed with data mining techniques to verify the reliability of the designated process. Parameters such as Center of Gravity (COG), weight transfer time, and sway velocities are computed. The reliability of the Pain Marker subsystem is demonstrated by comparison with traditional pain-reporting questionnaires. Pain Marker is also highly recommended by users according to user-experience questionnaires. The results show that mHealth Tracker is informative and reliable in determining balance status and collecting pain information, with the additional advantages of high portability and efficient review communication.


1 Introduction

Balance ability and pain are two factors often monitored at the same time in treatment and medical research. Balance is the ability to control muscular energy in the body and maintain an even distribution of weight so as to hold a stable posture. Balance has been an essential indicator of human health and is realized through the coordination and support of several body systems, including the vestibular, visual, auditory, motor, and higher-level premotor systems.1 Pain is an unpleasant sensory and emotional experience associated with actual or potential tissue damage. Pain, especially chronic pain, can be a key source of reference for diagnosis. In many cases, pain and balance difficulties appear at the same time and affect people's quality of life. For example, the research of Menz3 discussed the relation between pain, balance, and knee strength in middle-aged females, and Poole's study4 discussed the relation between pain and gait parameters in the elderly population. Some studies have also shown that pain can perturb one's balance ability and postural stability. Mientjes's research5 found that chronic low back pain influences one's balance and increases body sway in an upright standing posture when vision is reduced and task complexity is increased. Hamaoui's paper6 showed that respiration perturbs chronic lower back pain subjects more than healthy subjects. The relation between balance and pain has been studied in several populations and for different pain areas, but more detailed and specific relationships still have not been investigated. There has been research aimed at finding the relation between pain and balance; however, the current tools for balance evaluation and for pain recording and visualization are not satisfactory, not to mention that no system focuses on collecting and visualizing both kinds of information together.

To evaluate patients' postural stability, researchers and clinicians use different functional test measures to help identify the sources of balance problems. Lab-based tests normally require the patients to be present in a clinical setting and perform balance tests under the supervision of clinicians. After the tests, a score of the patient's performance is calculated based on certain institution-developed standards, such as the Motor Assessment Scale, the Berg Balance Scale,7 and the Rivermead Mobility Index.8 Due to the requirement of clinicians' supervision, most of the tests use qualitative standards to evaluate the patients' balance and generate an overall score based on the clinicians' opinion of how well the test subjects complete the tests. For instance, the Berg Balance Scale is a 14-item scale designed to measure a patient's balance ability in a lab setting. Each item has five levels of performance, resulting in a score ranging from 0 to 4, and the scores from all fourteen items add up to a total score out of 56. One example from the Berg Balance Scale asks the subject to pick up an object from the floor from a standing position. A 4 is scored when the patient is able to pick up the object safely and easily; a 3 is recorded when the patient can pick up the object but needs supervision; and a 2 is recorded when the person is unable to pick the object up but reaches within 2–5 cm of it while keeping balance independently. However, it is relatively hard and non-objective for clinicians to judge the extent of difficulty the subject has in picking the object up. Since these classic balance tests use partially qualitative, semi-subjective standards based on clinicians' personal judgments of the patients' performances, there may be inaccuracies. In addition, these tests, confined to a traditional gait lab environment, may raise concerns of environmental bias and cannot reflect patients' natural behaviors in their usual environment.9
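As a concrete illustration of the arithmetic behind this scale, the sketch below sums fourteen items scored 0–4 into the 56-point total. The function name and example scores are our own illustrative choices, not part of any clinical software:

```python
def berg_total(item_scores):
    """Sum a Berg Balance Scale assessment.

    item_scores: list of 14 integers, each 0 (unable) to 4 (independent).
    Returns the total score out of a maximum of 56.
    """
    if len(item_scores) != 14:
        raise ValueError("Berg Balance Scale has exactly 14 items")
    if any(not 0 <= s <= 4 for s in item_scores):
        raise ValueError("each item is scored on a 0-4 scale")
    return sum(item_scores)

# A hypothetical subject scoring 3 on every item totals 42 out of 56.
scores = [3] * 14
total = berg_total(scores)
```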

To provide an accurate description of pain during patients' diagnosis, treatment, and recovery, methods such as the 4-point scale (FPS), the verbal rating scale (VRS), and the visual analogue scale (VAS) are currently applied in medical record systems. Compared to the FPS, the VAS shows more satisfaction in the patient self-rated pain process. It is judged to be more sensitive, accurate, and less subject to bias, and the VAS method has been shown to be no more difficult for patients to understand.10 Moreover, the VAS method has been found, by comparing the F-ratios of the results from the two methods, to reflect how the patient feels more closely.11 Currently, the VAS method is widely used in medical systems; however, it mainly focuses only on scales, which show the intensity of pain. In many situations, clinicians need to know much more than intensity, such as pain quality, location, radiation pattern, duration, timing, etc. Further descriptions of pain history are acquired through forms12 and questionnaires13 adopted by current medical record systems. However, for different kinds of pain, the evaluation process can differ.

Acute pain,14,15 often caused by injury, surgery, trauma, or tissue inflammation, should be assessed both at rest and during movement. Dynamic pain treatment is important for functional recovery and the elimination of complications. For example, immobilization caused by surgery is often related to dynamic pain. During the recovery process, the mobilization of patients is limited, and thus the risk of immobilization could be raised.16 Patients' pain is even more complex because it often relates to multiple symptoms, and this specific information should be tracked.14 With such a variety of pain evaluation needs, the current pain recording methods either do not cover enough information or become too complicated, with many forms and questionnaires. Moreover, for chronic pain, a comprehensive assessment requires detailed documentation of pain history (including pathology and etiology), physical examination, and diagnostic tests. Since the VAS method records pain information only at a single time point, establishing a pain history may require multiple evaluations. Because most current pain evaluation scales and tools are documented on paper, it may be hard to review the pain records in a smooth, continuous way when they are retrieved. Some studies have proposed more detailed chronic pain recording methods17 with both pain intensity and persistence information; however, the persistence is not specified, and no continuous pain record or dynamic information tracking method is discussed in these studies.

Wearable sensory accessories and hardware designs have the potential to extend the impact of wearable devices in the health field, especially by applying sensory techniques to quantify human health indicators for a better understanding of body structure and condition. Digital systems offer a more direct and convenient way of keeping records, enable easy access for physicians and patients to review progress, and prevent the loss of information and dynamic history. mHealth Tracker is a system that uses wearable, portable devices and a digital system for human postural stability and pain studies. To solve the problems of the classical balance tests, especially their poor accessibility, the team implements a solution in which the mHealth Tracker system supports data recording and upload for human body balance tests and pain information at any place convenient for the users. The system only requires the user to place a wearable gait lab insole under each foot, connected to an Android device over Bluetooth Low Energy (BLE).

An Android application has been implemented for the users to control the data collection process. The wearable gait lab insoles provide foot motion data and plantar pressure data. The users are able to view plots of the collected data in real time and upload them to the cloud server with the Android app. Once the data are uploaded to the cloud server, researchers or clinicians can review them remotely with a PC user interface the team has built, and furthermore provide medical suggestions based on the results reflected in the data. Building upon AR technology, our system also solves the problems of current pain recording systems by using dynamic colors to represent pain changes on 3D human models, together with user comments. With a "next generation, reality-based interface",18 the user may "touch" the pain area on the body to record more accurate pain information.

Thus, the contributions of the mHealth Tracker system include: 1) for postural stability, the mHealth Tracker system eliminates the negative effects of semi-subjective standards and environmental bias by providing relatively objective, quantitative data and by giving users the option to perform balance tests at their preferred places; 2) for pain tracking, the system supplies useful information for diagnosis and pain visualization that is currently missing; and 3) mHealth Tracker provides a reliable and quantitative tool for research and treatment related to pain and balance ability.

The remainder of this paper is structured as follows. Chapter 2 summarizes related work in the areas of balance–pain relation studies, balance tests, sensory applications in those balance tests, pain measurement tools, and the augmented reality techniques used in Pain Marker. In Chapter 3, the mHealth Tracker system is explained, including the sensory hardware, the software implemented, the applications of the system, and the applied algorithms. In Chapter 4, the mHealth Tracker system is evaluated with chosen tests, and demonstrations of the experimental procedures are shown; in Chapter 5, the experimental results are studied and analyzed. In Chapter 6, several compromises and design obstacles are discussed. Chapters 7 and 8 present the conclusion and suggested future work.

2 Literature Review

In this chapter, current solutions and tests for balance and pain evaluation are introduced and discussed, along with a brief summary of the related technologies and hardware used in the mHealth Tracker system.

2.1 Qualitative Balance Test

The classic balance tests operated by labs and clinics are mostly based on semi-subjective standards, in which clinicians are responsible for scoring the quality of a test subject's balance status during test activities. The methods that clinicians implement include the Berg Balance Scale and the Mini-BESTest, which normally require the clinicians to evaluate the test subjects' balance ability according to their personal preferences and judgments on a numeric scale19. Due to balance test differences among clinicians, progress in new technologies has given rise to sensory hardware able to evaluate balance test parameters on a more objective scale. The devices being developed can be classified as non-wearable sensors and wearable sensors20. Non-wearable sensors commonly require the use of controlled research facilities where the sensors are located statically in the test station, such as image processing sensors and floor sensors. Compared to non-wearable sensors, wearable sensors are more diverse and are required to be located on parts of the body, such as the feet, knees, and thighs; such sensors include IMU sensors, extensometers, goniometers, active markers, electromyography sensors, etc.

2.2 Quantitative Balance Test

Currently, non-wearable measuring systems such as Vicon and GAITRite are widely used in human movement measurement, including balance tests. However, these devices are highly expensive and lack mobility. Another option used by researchers is visual technology such as the Microsoft Kinect. Researchers use this system to acquire human body movement data at a macro-graphic, visual level. To prove its validity, Ross A. Clark compared the estimated anatomical landmarks obtained by the Microsoft Kinect against a 3D motion analysis system21. The analysis illustrates that the Microsoft Kinect can offer reliable data with lower cost and an easier setup than the aforementioned systems by providing real-time anatomical landmark position data in three dimensions. On the other hand, the research also shows proportional biases and an inability to assess joint rotations. According to the research of Phillip A. Gribble, ankle movement in different directions plays an important part in the balance test22. As a result, measurements on joints are essential in the balance test data recording process.

To further record joint information during balance tests, many research reports show the wearable sensors' capability of measuring 3-D data to assist in balance tests. Lugade V. proved the validity of a tri-axial accelerometer in movement detection by identifying postural orientation and movement from accelerometer data against movement video recordings in his experiments23, and Barth et al. validated a system using gyroscopes and accelerometers to measure gait functions24. In addition, Moore ST. et al. explored the use of wearable sensors on the ankle to monitor gait activities25. However, accelerometer and gyroscope data alone would not provide sufficient information for a reliable balance test system. In Saunders' paper, he mentions that body sway, another important indicator of balance status, is difficult to measure with accelerometers alone, a drawback of accelerometer-based measurement. In order to accurately predict balance patterns with body sway data, the approximate center of mass (COM) of the subject needs to be computed; however, one may not be able to find the COM position accurately when the subject performs different actions. Many investigators estimate the change of COM position over time by measuring changes in the center of applied pressure (COP) on a force plate26. Although this estimation is valid only when the subject's body behaves as a rigid structure rotating only about the ankle in the sagittal and frontal planes27, one may use accelerometer, gyroscope, and magnetometer sensors to recognize the orientation of the ankle and calculate COM in real time. Due to the unreliability of determining COM changes solely from changes in COP in order to calculate body sway, the team uses leg orientation data to indicate leg sway, which improves the reliability of evaluating the effects of body sway on balance states.
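The COP estimate described above is, in essence, a force-weighted average of sensor positions, and body sway is often summarized as COP path length per unit time. A minimal sketch under assumed sensor positions and sample data (not the thesis's actual hardware layout):

```python
import math

def cop(positions, forces):
    """Center of pressure as the force-weighted mean of sensor positions.

    positions: list of (x, y) sensor coordinates in cm.
    forces: list of force readings of the same length (arbitrary units).
    """
    total = sum(forces)
    if total <= 0:
        raise ValueError("no load detected")
    x = sum(p[0] * f for p, f in zip(positions, forces)) / total
    y = sum(p[1] * f for p, f in zip(positions, forces)) / total
    return x, y

def mean_sway_velocity(cop_track, dt):
    """COP path length divided by total elapsed time (cm/s)."""
    path = sum(math.dist(a, b) for a, b in zip(cop_track, cop_track[1:]))
    return path / (dt * (len(cop_track) - 1))

# Hypothetical force plate: four sensors at the corners of a 10 cm square,
# sampled every 10 ms; the middle frame shifts load toward the left edge.
sensors = [(0, 0), (10, 0), (0, 10), (10, 10)]
frames = [[1, 1, 1, 1], [2, 1, 2, 1], [1, 1, 1, 1]]
track = [cop(sensors, f) for f in frames]
v = mean_sway_velocity(track, dt=0.01)
```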

2.3 Pain Measurements

For over half a century, methods to evaluate pain have been sought. The state of the art in pain measurement comprises several methods, which can be grouped into four categories: Numerical Rating Scales (NRSs), Visual Analogue Scales (VASs), Verbal Rating Scales (VRSs), and graphic scales. Numerical Rating Scales, which require the patient to indicate their pain intensity by marking on a scaled line, usually range from 0 to 10. Visual Analogue Scales instruct patients to indicate their pain by drawing a mark on an unscaled line that ranges from no pain to worst pain; clinicians then measure the distance from the no-pain end to the patient's mark. Verbal Rating Scales provide a table of short descriptions of pain for the patient to choose from. Graphic scales, such as the Wong-Baker FACES Pain Rating Scale28, require patients to select one of six faces that describe pain and its effect on behavior. Although accepted as reliable in many situations, these traditional one-dimensional pain measurement scales record only pain severity; the locations and types of pain are neglected. Such information can be documented with the multi-dimensional scales proposed in some research, including the multi-dimensional scale of Faleiros Sousa et al.12, which provides a numerical rating scale for users to express their pain intensity, a table of numbered pain types to choose from, and two labeled human body figures (front and back) on which patients can mark the numbered body parts where pain occurs. The West Haven-Yale Multidimensional Pain Inventory (WHYMPI) is another method of multi-dimensional pain analysis, which examines how patients perceive pain and its effects, including others' perceptions and reactions as well as the impact of pain on daily functioning13. Additionally, the Multidimensional Pain Inventory measures pain with respect to its cognitive, behavioral, and affective dimensions and has been found to be an effective tool for determining treatment29. We propose taking this a step further by using augmented reality to perform multi-dimensional pain analysis.
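The VAS measurement just described — measuring the distance from the no-pain end of the line to the patient's mark — reduces to a simple normalization. A minimal sketch, assuming the common 100 mm line convention (the function name is illustrative):

```python
def vas_score(mark_mm, line_length_mm=100.0):
    """Convert a patient's mark on a Visual Analogue Scale line
    into a 0-100 pain score.

    mark_mm: measured distance from the 'no pain' end to the mark, in mm.
    line_length_mm: total line length (100 mm is a common convention).
    """
    if not 0 <= mark_mm <= line_length_mm:
        raise ValueError("mark must lie on the line")
    return 100.0 * mark_mm / line_length_mm

# A mark 63 mm along a 100 mm line yields a score of 63.0.
score = vas_score(63)
```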

In 2018, Lin and her team tried to measure pain objectively by monitoring physiological signals. In Lin's research, the team claimed that skin conductance, pupillary unrest under ambient light, facial expression, and electroencephalography signals can accurately detect whether an individual is experiencing pain at the moment.30 But since pain is a very subjective, private feeling, self-reports are still the most commonly used and most accessible methods of pain assessment.

2.4 Augmented Reality

Unlike virtual reality, in which no real-world element appears, augmented reality is the combination of and interaction between the real world and virtual content. It provides a new experience for humans, one in which they interact with computers by blending computer-generated information and models into the real-world environment while interacting with real physical objects. The term "mixed reality" is also used in some research and commercial products. More specifically, mixed reality covers two ideas: augmented reality and augmented virtuality. Augmented reality means putting computer-generated information into the user's view of the real-world scene; augmented virtuality, on the other hand, means inserting real-world objects into a virtual environment. However, the difference between mixed reality and augmented reality is quite vague. These features have sparked the interest of researchers and developers, who have conducted diverse studies examining AR's potential benefits to humanity. This paper is not concerned with the theory of augmented reality but simply uses the idea of displaying virtual objects in reality to improve the usability and convenience of our system; thus we use the term AR for both AR and mixed reality.

In 1998, a study described augmented reality as a way to enhance real-world views through techniques such as labeling, rendering 3D models, and shading modifications.31 It was later shown that augmented reality works in both indoor and outdoor environments; advances are being made to make displays less cumbersome, such as embedding them within conventional eyeglasses.32 Furthermore, steps are being taken toward making it hand-held using conventional hardware that most people have access to.33 This innovative technique has been particularly interesting in the medical field, especially in 3D image visualization, which is increasingly used in medical areas,34,35 because it can provide better views of complex situations and allow doctors to review pathology with more vivid descriptions. Visualization is applied in multiple stages of treatment: diagnosis, treatment planning, and the treatment itself. Augmented reality aids in surgical procedures by localizing body contents, such as tumors to be removed, organs adjacent to their location, and blood vessels in that part of the body, and by making correct dissection planes easier to determine.36 Another study found that augmented reality can be used in neurosurgery to guide the surgeon regarding organ and tumor structure by superimposing three-dimensional images onto the patient's head.37 Even general surgery has begun to incorporate augmented reality.38

However, augmented reality is being used not just in medical practice but also in medical education. Case Western Reserve University and Cleveland Clinic have pioneered an anatomy class where students study the body using a 3D virtual model via the Microsoft HoloLens.39 The rotatability of this model enables study of organs in the back of the body; additionally, it is possible to isolate individual organs (e.g., the heart from the cardiovascular system) for study, or to view the system with blood pumping; furthermore, fibers can be color-coded, to show tumors for example. These improvements represent significant achievements, as in traditional anatomy classes students learn by dissecting cadavers, a practice that does not have the aforementioned benefits.

In our system, we also use the Microsoft HoloLens, or HoloLens for short. It is a commercially available wireless, hands-free, head-mounted, high-resolution augmented reality display with a gesture interface, gaze and voice recognition control mechanisms, and the ability to map objects spatially. Regarding the development of an application for the Microsoft HoloLens, both research findings and Microsoft guidelines promote the use of Unity 3D; the benefits of this engine include thorough documentation and guidelines, an already-established framework and tools, and quick development iterations. Evans et al. found that the Microsoft HoloLens is a capable augmented reality platform and that a user interface created with Unity 3D is user friendly.40

2.5 Summary

Nevertheless, the quantitative balance test systems, which mostly concentrate on acquiring macro-level quantitative data, lack the mobility and portability that are essential to accurate, ubiquitous daily balance tests. In addition, quantitative data based solely on macro body statistics such as COM and weight distribution will not fully represent one's balance status. Therefore, electromyography data are introduced to fill this gap and reflect balance status at certain parts of the human body, such as the lower limbs. On the pain measurement side, most of the evaluated methods are still paper-based, with limited visualization and a lack of consistency. Augmented reality has shown the possibility of simplifying pain measurement with good visualization for both patients and their doctors.

In this paper, the team proposes the mHealth Tracker system to perform balance tests, collect pain information, and carry out data analysis. The Wearable Gait Lab subsystem utilizes accelerometer, gyroscope, magnetometer, pressure, and electromyography sensors on the lower limbs to analyze information essential to balance status and to look for unstable patterns. Electromyography sensors are widely used in the medical field to assess muscle health and nerve cell information by measuring the strength and speed of nerve signals with electrodes taped to the skin surface.41 Leg electromyography data are essential in the process of analyzing muscle activities,42 and studies have shown that related muscles in the legs are activated during stance. In order to improve the portability and ease of use of the system, wireless Bluetooth connections are the only communication method among devices in the data collection and analysis process. The Pain Marker subsystem uses an augmented reality device to improve the visualization and consistency of pain conditions. With the system, patients will not need to take balance exams or pain evaluations in clinics periodically, and health care professionals will not need to spend time monitoring the balance test process and can easily review pain treatment progress, while relatively objective data are still recorded for analysis.

3 mHealth System

mHealth Tracker is developed to assist pain and balance studies by quantifying the collection process and improving visualization, using portable, wireless sensors and devices. As shown in Figure 3.1, mHealth Tracker includes two subsystems, one for postural stability tracking and one for pain history tracking. Sections 3.1 and 3.2 introduce each subsystem in detail.

Figure 3.1. System overview and experiments flowchart. The mHealth Tracker system is a portable and wireless system that can support pain and balance research and studies both within labs and clinics and in daily life. The left side shows the WGL subsystem for postural stability detection; the right side shows the Pain Marker subsystem for pain information collection and review.

Figure 3.2. Wearable Gait Lab Subsystem implementation flowchart

3.1 Subsystem: Wearable Gait Lab

In this section, the hardware choices and software implementations of the Wearable Gait Lab are discussed. The section consists of five parts: the Wearable Underfoot Force Sensing Unit, the Joint Angular and EMG Sensing Unit, the Wearable Gait Lab Android application, the JavaFX PC user interface, and the Dynamic Time Warping algorithm. A system implementation flowchart is shown in Figure 3.2.
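The Dynamic Time Warping algorithm listed among these parts is not spelled out at this point in the text; for reference, the classic dynamic-programming recurrence it is based on can be sketched as follows (a textbook version, not the thesis's specific implementation):

```python
def dtw_distance(a, b):
    """Classic Dynamic Time Warping distance between two 1-D sequences.

    Recurrence: D[i][j] = |a[i-1] - b[j-1]|
                          + min(D[i-1][j], D[i][j-1], D[i-1][j-1])
    """
    n, m = len(a), len(b)
    inf = float("inf")
    d = [[inf] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j],      # insertion
                                 d[i][j - 1],      # deletion
                                 d[i - 1][j - 1])  # match
    return d[n][m]

# A sequence aligns with a time-stretched copy of itself at zero cost.
zero = dtw_distance([1, 2, 3], [1, 1, 2, 2, 3, 3])  # 0.0
```

This property — ignoring timing differences while comparing shapes — is what makes DTW useful for comparing repeated balance-test movements recorded at slightly different speeds.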

The complete hardware set in the system includes the Wearable Underfoot Force Sensing Unit, which records foot IMU and pressure data, and the Joint Angular and EMG Sensing Unit, which records leg EMG and IMU data. Once the test subject wears the system, the two units are connected to an Android device via Bluetooth Low Energy. An Android application developed by the team controls the data recording process, including initializing and terminating data collection, plotting the collected data in real time, and uploading the data to the cloud server when the data collection process is completed. Once the data files are uploaded to the cloud server, researchers or clinicians can view and analyze the data through the PC user interface, which includes a 2D plantar pressure map and four line charts for displaying CSV data files stored on the cloud server or a local disk. The clinicians can then provide professional suggestions for the patients after data analysis.

Figure 3.3. Left: Wearable Underfoot Force Sensing Unit. Right: Joint Angular and EMG Sensing Unit.

3.1.1 Wearable Underfoot Force Sensing Unit

The key component of the system hardware is the Bluetooth Low Energy-controlled Wearable Underfoot Force Sensing Unit, shown in Figure 3.3. Each smart insole in the unit contains a textile pressure array, an inertial motion sensor, a micro control unit (MCU) with a Bluetooth Low Energy module, and a battery module. The textile pressure array records data from forty-eight pressure sensors, used to obtain a high-resolution plantar pressure map. The 9-axis inertial motion sensor records accelerometer, gyroscope, and magnetometer data, where the accelerometer and the gyroscope measure the movement of the insole and the magnetometer aids data calibration. The X, Y, and Z axes of all three sensors are sampled during the data collection process. The MCU with the Bluetooth Low Energy module runs the system and provides a wireless channel connecting the insole to a smart electronic device. The battery module contains a battery and a micro-USB battery connector, allowing the user to recharge the battery when the insole is out of power. In the system, the Wearable Underfoot Force Sensing Unit is placed under the feet and records the plantar pressure data and foot movement data during the balance tests.

3.1.2 Joint Angular and EMG Sensing Unit

Another component of the system hardware is the Joint Angular and EMG Sensing Unit, shown in Figure 3.3, containing a pair of Myo limb bands. Manufactured by Thalmic Labs, the Myo limb bands are designed to record EMG data by reading the electrical activity of human arm muscles and to send control commands to devices based on motion and gesture information. The team adopted the Joint Angular and EMG Sensing Unit in the Wearable Gait Lab system because it records both IMU and EMG data and, when worn on the legs, can better capture lower limb activities during the balance tests. The X, Y, and Z axes of the accelerometer and gyroscope, along with eight EMG channels, are recorded at each timestamp. Studies have shown that distal (leg and thigh) muscle activities are important indicators of balance adjustment behaviors.43 The leg EMG data fill a gap in current sensory balance tests, making the test results more reliable by adding muscle activity indicators. The leg IMU and EMG data are recorded concurrently with the data from the Wearable Underfoot Force Sensing Unit in a balance test for later data analysis.

Myo Dual Data Collector on PC. Since the official Myo SDK for Windows does not provide an option to record both IMU and EMG data from the two Myo limb bands in the Joint Angular and EMG Sensing Unit simultaneously, the team implemented a solution that generates a CSV data file for each Myo, recording sensory data in the order of timestamp, gyroscope data, accelerometer data, orientation data, and EMG data. Since EMG data are generated at a higher frequency than IMU data, empty IMU fields are filled with the IMU values from the previous timestamp, making the generated data a complete matrix for ease of analysis. The Myo Dual Data Collector program is used in the data collection process only as a supplement to the Android application and is the second choice for Myo data recording because it is less portable.
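The forward-fill step described above can be sketched as follows. The row layout (timestamp, a block of IMU columns, then EMG values) and the column indices are illustrative assumptions, not the team's exact file format:

```python
def forward_fill_imu(rows, imu_cols):
    """Replace missing IMU values (None) in each row with the last seen
    IMU values, so EMG-rate rows form a complete matrix.

    rows: list of lists, e.g. [timestamp, imu..., emg...]
    imu_cols: indices of the IMU columns within each row (hypothetical).
    """
    last_imu = None
    filled = []
    for row in rows:
        row = list(row)
        if all(row[i] is None for i in imu_cols):
            # EMG-only timestamp: reuse the previous IMU sample, if any
            if last_imu is not None:
                for i, v in zip(imu_cols, last_imu):
                    row[i] = v
        else:
            last_imu = [row[i] for i in imu_cols]
        filled.append(row)
    return filled
```

The same idea extends to however many gyroscope, accelerometer, and orientation columns the real file contains.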

Figure 3.4. Wearable Gait Lab Android Application Control Panel and Plotting Interface

3.1.3 Wearable Gait Lab Android Application

The Wearable Gait Lab Android application is implemented to display and record the sensor data from the left, the right, or both lower limbs, and to bridge the data to the PC host-target system through BLE. As shown in Figure 3.4, the Android application uses a single-pane structure in which all functionalities are reached by simple scrolling actions. The recorded data types are feet accelerometer, feet gyroscope, feet magnetometer, feet pressure, leg accelerometer, leg gyroscope, leg orientation, and leg electromyography. The top of the application interface holds the control panel, which shows device information including timestamp, RSSI, connectivity, and battery level. In the control panel, the user initializes the data collection process and uploads recorded data to the cloud server; the team has acquired and manages a secured cloud storage space of sufficient capacity from the Case School of Engineering Information Technology Department. Under the control panel, the interface contains line charts that plot IMU (accelerometer, gyroscope, and magnetometer) and electromyography data in real time, and a plantar pressure map that visualizes the pressure under each foot.

Figure 3.5. JavaFX PC User Interface for Data Review, showing the plantar pressure map, insole data line charts, and Myo data line charts.

3.1.4 JavaFX PC User Interface

The JavaFX program is a PC user interface for displaying and analyzing the data collected from the Wearable Gait Lab system. The software is designed for researchers or clinicians to read the CSV data generated by the Wearable Gait Lab Android application and to display the same sensor information as the Android application. Correspondingly, the software shares a similar interface style and arranges charts and graphs in a similar section structure, so that operation feels familiar when switching between the PC software and the Android application. In the Wearable Gait Lab system, both Wearable Gait Lab units are calibrated, and the smartphone running the Android application receives the corresponding real-time movement and pressure sensor data and processes the data into CSV files. The user interface of the PC software is built with JavaFX: the control functions are written in Java controller classes, and the interface is written in FXML and styled with CSS, as shown in Figure 3.5.

The JavaFX PC user interface reads CSV data exported from the Android application from the cloud server or the local disk. At every timestamp, the CSV data contain the foot side, the timestamp, nine IMU sensor values (three each from the accelerometer, gyroscope, and magnetometer), and forty-eight pressure sensor values from the Wearable Gait Lab. Similar to the Android application, the PC user interface can be set to display underfoot force sensing data from the left insole, the right insole, both, or neither. The PC user interface displays the data in four line charts and one 2D plantar pressure map. The interface reads in the file and plays the data back in chronological order as a continuous video: it displays the first line of data when the user imports the file and the following lines in sequence until the last line. Clinicians are also able to adjust the playback speed, pause the data display, manually choose a display time, and save screenshots for future records.

3.1.5 Dynamic Time Warping Algorithm

The two halves of the Wearable Gait Lab system may collect data at slightly different frequencies. To obtain a more accurate common time series, the Dynamic Time Warping (DTW) algorithm is implemented to map the right and left Wearable Gait Lab data onto each other.44 Known as a technique for performing an optimal alignment and discovering connections between two time-dependent sequences of different lengths, Dynamic Time Warping supports building traceable correspondences between the right and left insoles' sensor data even when their timestamps do not match.45 In addition, the DTW algorithm is used to synchronize data between the Wearable Underfoot Force Sensing Unit and the Joint Angular and EMG Sensing Unit to realize data fusion, so that the timestamps of the different sensors match accordingly.

During the computing process, a warping path is defined as an alignment between the two sequences that assigns elements of one sequence to elements of the other. The warping path is constructed as a sequence of elements W_k, where

W_k = (i, j)    (3.1)

and

W_{k+1} = (i', j'),  with  i <= i' <= i + 1  and  j <= j' <= j + 1,    (3.2)

where i and j are timestamps in the left and right Wearable Gait Lab data. The distance of the warping path can be calculated by

Dist(W) = sum_{k=1}^{K} Dist(w_{ki}, w_{kj})    (3.3)

where Dist(W) is the distance of the warping path W, and Dist(w_{ki}, w_{kj}) is the distance between the two data timestamps of the k-th element of the warping path.
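As a concrete illustration of Eqs. (3.1)–(3.3), a minimal dynamic-programming DTW is sketched below. The absolute-difference cost and 1-D input sequences are simplifying assumptions; the actual system aligns multi-channel sensor streams:

```python
import math

def dtw_distance(seq_a, seq_b):
    """Accumulated distance of the optimal warping path (Eq. 3.3)
    between two 1-D sequences of possibly different lengths."""
    n, m = len(seq_a), len(seq_b)
    # D[i][j] = cost of the best warping path aligning seq_a[:i] with seq_b[:j]
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            # step pattern enforcing the Eq. (3.2) continuity constraints
            D[i][j] = cost + min(D[i - 1][j - 1],  # match
                                 D[i - 1][j],      # step in seq_a only
                                 D[i][j - 1])      # step in seq_b only
    return D[n][m]
```

For example, two identical sequences yield a distance of zero, while sequences of different lengths are stretched against each other along the cheapest path.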

3.2 Subsystem: Pain Marker

Pain Marker is an augmented reality application running on a Head-Mounted Device (HMD) to simplify the recording of self-reported pain and to display recorded data for further review and analysis. Pain Marker runs on Microsoft HoloLens, whose hardware is introduced in Section 3.2.1. The development process is discussed in Section 3.2.2. Section 3.2.3 describes the application, including its functions and data storage. The data collection process is introduced in Section 4.2.

3.2.1 Microsoft HoloLens Hardware

Microsoft HoloLens, or HoloLens, is a cordless, self-contained head-mounted augmented reality device. It generates holograms, computer-generated objects made of light and sound, with two HD 16:9 light engines and built-in speakers. The holographic images are displayed at 30 to 60 frames per second. Because the Pain Marker app involves fast hologram movements, we set the display to 30 frames per second to reduce the computational load. The central processing unit (CPU) is an Atom x5-Z8100 running the 32-bit Windows Mixed Reality platform, which is part of the Windows 10 operating system.

Figure 3.6. The Pain Marker subsystem includes three parts: the development process, the application description, and data collection. During the development process, the 3D human model is constructed and the AR application is built. The application is displayed on, and interacts with users through, Microsoft HoloLens. Pain information is collected by the application and a questionnaire, and a second questionnaire about the application user experience is also collected.

also contains a graphics processing unit (GPU) and two light engines displaying two video streams (one per eye), and a holographic processing unit (HPU) for rendering holographic images. To enable interactions between holograms and the real-world environment, several cameras and sensors are built into the visor of the HoloLens. As shown in Figure 3.7, four environment understanding cameras detect the environment around the user, and one depth camera measures the distance between the user and the real-world object at which the user is looking. The HoloLens can also take photos and videos of what the user sees, both real and holographic content, with an MP photo / HD video camera. One inertial measurement unit (IMU) detects the movement and rotation of the user's head.46 We chose an AR platform for our system because, unlike cellphones and other devices, there is no limitation on screen size: we can show a larger model with more details, giving the user a complete view without zooming in and out. Depending on one's standing position, one can view the full scene or take a closer look at a specific area.

Figure 3.7. Camera and sensor placement on the HoloLens visor.47

3.2.2 Development Process

As shown in Figure 3.6, the development process has two steps: 3D human model con- struction and application development.

3D Human Model Construction. The 3D model used in the app is generated with Autodesk Maya, a powerful software package for creating 3D models and animations. The 3D human body model is designed so that body parts can be selected easily with few hand movements: the model keeps a standing posture, hands down, with both palms turned to face front. The model is a single object, but in the Pain Marker app different areas of the model must be identifiable and able to interact with the user separately. To distinguish different areas, a "skeleton", made of a large number of connected spheres and cones as shown in the "3D Model Construction" part of Figure 3.6, is added. In Autodesk Maya, the usual purpose of a "skeleton", a set of connected "joints" (spheres), is to change model postures without ruining the shape of the model, so "joints" are placed and oriented approximately where real joints are, based on the bone structure of the model. In the Pain Marker app, however, not only joints but also some other areas must be distinguishable; for example, one might feel pain in the left lower quadrant, which does not correspond to a specific joint or bone. Therefore, the "joints" in our human model are placed based on both the human bone structure and the muscular system.

For example, the joint "neck" is placed at the neck area of the model and the joint labeled "LLQ" is placed at the model's left lower quadrant. In the current version of Pain Marker, the human model contains 84 well-labeled "joints". Each vertex of the model has been manually assigned a weight determining which joint it belongs to, using the "Paint Skin Weights Tool" in Maya.

Manually matching joints and their corresponding areas by painting can be time-consuming, depending on how detailed the model is and how many joints it contains. However, by using this skin weight approach rather than building each body part as a distinct model and binding them together, the number of joints can be extended conveniently to achieve better pain area accuracy in the future.

Application Development. After the human body model construction, Unity3D is used to develop the application, including the user interface setup and function definitions. Unity3D is a well-known game engine and is commonly used for HoloLens application development. Pain Marker is not a game, but it fits the Unity3D life cycle completely: each time a user starts to enter his personal information and mark his pain can be viewed as a new round of a game, and saving his records can be viewed as the end of that round. Besides, Unity3D allows cross-platform programming, meaning applications can be deployed on multiple platforms with little or no code modification, and it is compatible with the most commonly used mobile, tablet, and VR platforms. Using this feature, our application can easily be extended to other devices. We also developed an Android version with the same functions as the AR version, to cover occasions when an AR device is not accessible.

3.2.3 Application Description

The Pain Marker application has three main panels: self-report, spine chart display, and record review. The UI layout and functions of each panel are introduced separately in this section. Other design details of the HoloLens application, including data storage and user input methods, are also discussed.

Figure 3.8. Self-Report panel. (A) shows the default setup. (B) shows the expanded menu.

Self-report section and data storage. Once the app starts, the self-report section is shown (Figure 3.8 (A)). The user can place the body model at a comfortable location by clicking the "Set it here" button; a 3D human model then stands in the middle of the user's view. A user starts recording his pain by entering his name and ID number through a Bluetooth-connected keyboard. After that, the user selects the pain type and intensity from the provided options (shown in Figure 3.8 (B)) that describe his pain most accurately. He can then mark the pain directly on the model by tapping the corresponding body part. Sixteen options covering the most common pain types are provided, distinguished by marks of different colors. The color selection follows the results of a study on using color to describe pain.48 Five intensity options, from Minor to Unbearable, are reflected by the depth of the color: the lighter the pain, the lighter the color of the mark. As shown in Figure 3.9, the left leg is marked in solid blue, meaning the user feels unbearable throbbing pain in the left leg. Different pain types and intensities can be shown on the same model. If different pains are felt in the same area, the user can mark the area with different colors; all pain information for that area is recorded in the files, but only the last mark is shown on the model. All records are automatically stored in .csv files in real time. The user can also leave comments for the clinician using the Bluetooth-connected keyboard. An undo function is available so the user can correct his record easily: tapping the "Undo" icon removes the previous mark, and the saved record is revised automatically. To start a new record, the user simply taps the "Restart" icon.

Figure 3.9. The model's left leg is marked in solid blue, meaning the user feels unbearable throbbing pain in the left leg.

Spine Chart Section. In the spine chart panel, based on the markers painted on the human model, the corresponding parts are highlighted on spine chart graphs (front and back), which clinicians can use as a reference in further studies.

Record Review Section. Figure 3.10 shows a demonstration of the record review section. The application can display previously saved records. When the "Show History" button is tapped, the model displays all saved files one by one. For each file, the model shows all pain marks and turns 360 degrees slowly, as shown in Figure 3.10, so that the marks on both the front and the back can be seen. A "pause" icon pops up at the end of the menu list; once it is tapped, the model stops spinning so the user or clinician can take a closer look at the pain information. All recorded file names are listed as blue buttons, with the currently displayed one highlighted in yellow. A specific file can also be reviewed by tapping its name. This function is designed for clinicians and physicians to review the change in a patient's pain over time, for example to monitor checkups and treatment progress.

Figure 3.10. The model is displaying previous records while slowly spinning. All file names are listed, and the currently displayed one is in yellow font.

HoloLens Inputs. To interact with users, HoloLens provides gaze, gesture, and voice inputs. In this project, gaze and gesture inputs are used, and a cursor that transforms between 3D and 2D shapes is designed.

Gaze Input. To select UI objects and the 3D model, "Gaze" is used to target these elements. However, HoloLens only targets the right point when the user's eyes are looking straight ahead, so a cursor is needed to show the user the precise direction HoloLens is targeting. The cursor is shown in the center of the user's sight and moves along with the head at the same orientation and speed. If the cursor is not pointing at any element, it takes the shape of a 3D ball; once it reaches the human model, it transforms into a 2D ring attached to the surface of the model. The 3D cursor is chosen because, when the user is not looking at the model and is walking around with the HoloLens, there is, unlike with a 2D cursor, no need to spin it to face the user. The 2D cursor is also necessary. The maximum distance between the user and the cursor is predefined; otherwise, when the cursor is not blocked by any object, it would be placed infinitely far from the user and could not be seen. This maximum-distance setting introduces another problem: when the user is looking at the model, a 3D cursor cannot show whether it has reached the model or is floating somewhere between the user and the model. Adding the 2D cursor solves this, since the transformation between the 2D and 3D cursors gives users a clear signal of whether they are targeting the model. Another solution would be to highlight selected elements when targeted, but any change to the model's colors must be made cautiously, because colors and lightness carry important meanings in this app.

Gesture Input. Three default gestures are recognized by HoloLens: Bloom, Air Tap, and Hold. Bloom is used to exit the app or to open the main menu; to perform it, the user turns the palm up, holds all fingertips together, then opens the hand. Air Tap, analogous to a mouse click, is used to confirm the selection of buttons or the position of marks the user wants to paint on the 3D body figure. It is used together with Gaze, which selects the object: first, the user performs Gaze by turning his head to select an object; next, he raises his index finger, bends it down, and brings it back up. The Hold gesture is not used in this app; similar to Air Tap, it requires the user to hold the index finger down for a longer time before returning to the start position.

Figure 3.11. Self-Report panel. (A) The blue glowing sphere in the middle is the 3D cursor. (B) Once the cursor is pointed at an object, it becomes a blue 2D ring attached to the object surface.

4 Experiments

No commercial system emphasizes both postural stability and pain information, so the two subsystems were evaluated separately by comparison with current methods used in clinics and research. In Section 4.1, to verify the reliability of the WGL subsystem, three postural stability experiments using WGL and the NeuroCom SMART Balance Master (SBM)/Balance Master (BM) are introduced. In Section 4.2, volunteers with chronic pain were asked to report their pain information using both Pain Marker and self-reporting questionnaires, and user reviews were also collected to evaluate the efficiency of Pain Marker. All these experiments were supported by clinicians from Ohio Living Breckenridge Village, and all subjects were volunteers selected from among Ohio Living Breckenridge Village residents.

4.1 Experiments for Subsystem: WGL

To evaluate the reliability of the WGL system, existing balance systems on the market, the NeuroCom SMART Balance Master (SBM) and Balance Master (BM), are used as references. Three standard balance tests on the Balance Master systems are applied: 1) Limits of Stability (LOS); 2) Sit-To-Stand (STS); and 3) Rhythmic Weight Shift (RWS). These are semi-static balance tests, meaning that in every posture the subject is asked to adopt, the monitored foot or feet never leave the ground. Table 4.1 lists all parameters calculated in each experiment, including center of gravity (COG), sway velocity (SV), left/right sway velocity differences (L/R SV difference), weight transfer time (Wt time), left/right weight transfer symmetry (L/R Wt symmetry), and On-Axis Velocity (On-Axis V). Five subjects participated in this test.

Exp 1   LOS   COG, Sway Velocity
Exp 2   STS   Wt Time, Sway Velocity, L/R Wt Symmetry
Exp 3   RWS   On-Axis Velocity

Table 4.1. Summary of Experiments and Calculated Parameters

4.1.1 Limits of Stability (LOS)

The Limits of Stability test analyzes a person's ability to maintain balance at the maximum distance they can displace their Center Of Gravity (COG)49. The participant was required to stand on the SMART Balance Master while wearing the Wearable Gait Lab system. Following instructions shown on a screen in front of the test subject, after hearing a tone the test subject shifts his/her center of gravity toward one of the eight cardinal and diagonal directions without lifting the heels or toes. During the process, both feet of the test subject must stay on the ground. The left part of Figure 4.1 shows two conditions of the LOS experiment: moving the COG from the standing position to the front (arrow A) and from the standing position to the back (arrow B); the right part of Figure 4.1 shows the equipment setup, including the test subject. Five volunteers participated in this test.

Figure 4.1. Left: A demonstration of displacing the COG from center to front and from center to back. Right: Equipment setup; the test subject is wearing the Wearable Gait Lab while standing on the SMART Balance Master.

4.1.2 Sit-To-Stand (STS)

The Sit-To-Stand test quantifies the test subject's balance when standing up from a seated posture49. Key parameters measured during the test include weight transfer time, sway velocity during the rising phase, and left/right symmetry of the rising force. During the test, the test subject wears the Wearable Gait Lab and sits on a wooden stool placed on the Balance Master (BM), as shown in the right image of Figure 4.2. The participant is requested to stand up as soon as possible after hearing an alert tone, as shown in the left two figures of Figure 4.2. This procedure is repeated three times; Trial 3 is the fastest and Trial 1 the slowest. The harness set was not required during this test. Eight volunteers participated in this test.

Figure 4.2. Left: Sit-To-Stand illustration. Right: Sit-To-Stand experiment setup.

4.1.3 Rhythmic Weight Shift (RWS)

Rhythmic Weight Shift (RWS) evaluates one's ability to shift weight distribution rhythmically between left and right or between backward and forward. Two parallel bars are shown on the screen of the SMART Balance Master system, as in Figure 4.3, either vertically or horizontally at different times. The participant is required to shift their COG following a cursor moving between the bars on the screen. The cursor shifts at three speeds in each direction. The harness set was not required during this test.

Figure 4.3. Examples of the SBM screen setup: blue sun-shaped cursors instruct the user, and gray cursors show the user's COG in real time. Left: horizontal; Right: vertical.

4.2 Experiments of Subsystem: Pain Marker

To test and evaluate Pain Marker subsystem, eight eligible volunteers with chronic pain were recruited from Ohio Living Breckenridge Village.

First, volunteers were asked to fill in a traditional paper-based questionnaire about their pain and to mark their pain on two human figures (back and front), as shown in Figure 4.4. Then, volunteers put on the HoloLens, marked their pain, and entered personal information following the instructions of Pain Marker. Both Pain Marker and the first questionnaire were used only for gathering pain information, including pain intensities and types. Afterwards, the volunteers filled in a second paper questionnaire covering their user experience with Pain Marker, comments, and a comparison of the two systems.

Figure 4.4. The traditional questionnaire uses human figures to identify the pain area and a numerical scale to describe pain intensity50.

Among our eight volunteers, 75% were female and 25% were male. The average age was 83.5 with a standard deviation of 5.29; the oldest was 92 and the youngest 76. All of the volunteers were experiencing at least one type of chronic pain at the time. Ages and pain conditions are listed in Table 5.5 (Section 5.2).

5 Results and Case Study

In this chapter, the detailed results of each mHealth experiment are discussed. In Section 5.1, WGL data are calculated and compared with the SBM/BM. In Section 5.2, data collected by Pain Marker are compared with data collected by paper-based questionnaires, and reviews collected by the user-experience questionnaires are summarized.

5.1 Subsystem: WGL System

5.1.1 Limits of Stability (LOS)

The team found linear relations between the pressure data collected by the sensors in the Wearable Gait Lab system and the COG given by the SBM. For each trial, multiple linear regression between the pressure sensor data and the computed COG was performed, from which a linear correlation between the two was concluded. As an example, Figure 5.1 shows one participant's residual plots under Condition 2, during which the test subject was requested to move the COG in the front-right direction. The upper plot in Figure 5.1, "residual case order plot (x axis)", refers to the test subject's instability statistics in the x-direction on the Balance Master (subject's center to left and to right); accordingly, the "residual case order plot (y axis)" refers to the subject's center to front and to back. The x-axes of the two plots give the case number, i.e., the data samples collected from the Wearable Gait Lab the subject wore, and the y-axes give the residuals of each case. According to the plots, the majority of cases, shown in green, fit the linear relation well; the outliers, shown in red, are relatively sparse compared with the data fitted to the relation. The coefficients of determination R2 and p-values of the five subjects are listed in Table 5.1. All p-values are much smaller than 0.05, so statistically strong linear correlations hold.
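The per-trial regression described above can be sketched as follows. The use of NumPy least squares and the synthetic check are illustrative only, not the team's actual analysis code:

```python
import numpy as np

def fit_cog_from_pressure(P, cog):
    """Least-squares fit cog ~ P @ w (one weight per pressure sensor,
    plus an intercept column); returns the weights and R^2."""
    X = np.column_stack([P, np.ones(len(P))])   # add intercept column
    w, *_ = np.linalg.lstsq(X, cog, rcond=None)
    resid = cog - X @ w
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((cog - cog.mean()) ** 2)
    return w, 1.0 - ss_res / ss_tot
```

If the COG really is (close to) a linear function of the pressure readings, R^2 approaches 1, which is the pattern Table 5.1 reports for all five subjects.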

Figure 5.1. Linear Regression Analysis: Moving COG to Front-right in LOS Experiments

To compute the COG, each pressure sensor reading is multiplied by its weight. Figure 5.2 intuitively illustrates the trends of COG transfer under each condition when the test subject is requested to move the COG in different directions. The traces in the figure indicate that the test subject moves his/her COG gradually from the original stance to the destination. The fluctuations represent the test subject's self-adjustments to keep balance while shifting the COG to the left or to the right.
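A minimal sketch of this pressure-weighted computation, assuming hypothetical (x, y) sensor coordinates for the insole layout (the real 48-sensor geometry is not given in the text):

```python
def center_of_gravity(pressures, positions):
    """Pressure-weighted centroid over the insole's sensors.

    pressures: list of sensor readings
    positions: matching (x, y) sensor coordinates (hypothetical layout)
    """
    total = sum(pressures)
    if total == 0:
        return (0.0, 0.0)  # no load: no defined COG, return origin
    x = sum(p * pos[0] for p, pos in zip(pressures, positions)) / total
    y = sum(p * pos[1] for p, pos in zip(pressures, positions)) / total
    return (x, y)
```

With equal pressure on two sensors the COG sits midway between them; heavier loading on one sensor pulls the COG toward it.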

Figure 5.2. An example of Center of Gravity (COG) in LOS experiment.

The sway velocity in each direction is also calculated from the COG to illustrate how fast one can shift the COG in the test environment. A larger sway velocity means a faster speed (a shorter time) with which one can react to keep balance. The sway velocities are calculated by

Sway Velocity = (mu_i - mu_{i-1}) / t    (5.1)

where

mu_i = arcsin( abs(COG_i - COG_{i-1}) / (height x 55%) )    (5.2)

in which mu is the angle between the human body and the vertical direction orthogonal to the ground. As an example, the results for Subject 5 are shown in Table 5.2.
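Equations (5.1)–(5.2) can be sketched in code as follows; the uniform sampling interval `dt` and COG values expressed in meters are simplifying assumptions:

```python
import math

def sway_velocities(cog, height_m, dt):
    """Per-sample sway velocity following Eqs. (5.1)-(5.2): the lever
    from the pivot to the COG is taken as 55% of body height.

    cog: COG displacements along one axis, in meters (assumed)
    height_m: body height in meters
    dt: sampling interval in seconds (assumed uniform)
    Returns velocities in deg/sec.
    """
    lever = 0.55 * height_m
    # Eq. (5.2): sway angle corresponding to each COG displacement
    theta = [math.asin(abs(cog[i] - cog[i - 1]) / lever)
             for i in range(1, len(cog))]
    # Eq. (5.1): angular change per time step, converted to degrees
    return [math.degrees(theta[i] - theta[i - 1]) / dt
            for i in range(1, len(theta))]
```

A COG drifting at constant speed yields a constant sway angle increment and thus near-zero sway velocity, while abrupt corrections show up as velocity spikes.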

Subject #   x-axis R2   x-axis F   x-axis p-value   x-axis error var
1           0.8714      1858       1.24e-58         2.67e-05
2           0.8567      12981      2.41e-75         7.26e-06
3           0.9229      3487       2.62e-282        1.76e-05
4           0.7817      1822       1.25e-83         5.61e-05
5           0.8045      1808       8.73e-32         3.81e-05

Subject #   y-axis R2   y-axis F   y-axis p-value   y-axis error var
1           0.8574      1164       5.08e-168        1.75e-05
2           0.9023      2950       3.51e-218        8.72e-06
3           0.9358      2860       2.87e-261        1.50e-05
4           0.7685      790        8.50e-49         1.59e-05
5           0.8349      449        1.05e-146        2.42e-05

Table 5.1. R2, F statistic, p-values, and error variances of the five subjects. All p-values are smaller than 0.05, supporting a linear correlation between the pressure data collected by our system and the COG collected by the SBM.

Condition #               LOS1   LOS2   LOS3   LOS4
Sway Velocity (deg/sec)   1.94   3.51   1.81   2.29

Condition #               LOS5   LOS6   LOS7   LOS8
Sway Velocity (deg/sec)   3.79   2.08   5.37   2.16

Table 5.2. Sway velocities of subject #5 in the LOS experiments.

Reaction Time (RT) is the time in seconds it takes the participant to begin adjusting the COG (toward the target) after the starting signal. However, since it is extremely difficult to start collection on both the Wearable Gait Lab system and the SMART Balance Master system at exactly the same time manually, a certain timing error could not be entirely eliminated; RT could be measured accurately if both systems were bound and initialized together. Nevertheless, the statistical analysis indicates a strong correlation between the data from the Wearable Gait Lab system and the SMART Balance Master system.

Sit-To-Stand
Subject #   Trial   WT Time (sec)   Sway Velocity (deg/sec)   % L/R Wt Symmetry
1           1       0.13            26.08                     -4.61
1           2       0.91            20.60                     3.25
1           3       0.67            26.07                     -6.12
1           mean    0.57            24.24                     -2.49
2           1       0.13            14.07                     -2.42
2           2       0.16            14.55                     -58.84
2           3       0.60            21.43                     -47.30
2           mean    0.30            16.68                     -36.21
3           1       0.13            25.03                     -26.95
3           2       0.73            14.73                     20.23
3           3       0.17            31.59                     20.70
3           mean    0.34            23.78                     4.66
4           1       0.31            24.53                     -38.33
4           2       0.46            17.29                     -35.31
4           3       0.40            21.25                     -25.24
4           mean    0.39            21.02                     -32.96
5           1       0.15            17.43                     11.94
5           2       0.25            15.20                     -0.06
5           3       0.35            7.00                      29.21
5           mean    0.25            13.21                     13.69
6           1       0.28            9.10                      5.93
6           2       0.09            14.89                     -6.09
6           3       0.18            13.81                     -5.49
6           mean    0.18            12.60                     -1.89
7           1       0.53            28.67                     22.02
7           2       0.12            27.74                     -62.19
7           3       0.35            14.92                     4.88
7           mean    0.33            23.78                     -11.76
8           1       0.36            2.06                      -10.38
8           2       0.51            2.06                      -25.96
8           3       0.53            1.93                      -24.93
8           mean    0.47            2.06                      -20.42

Table 5.3. Statistical data in the Sit-To-Stand tests.

On-Axis Velocity (deg/sec)
            Horizontal                  Vertical
Subject #   Trial1  Trial2  Trial3      Trial1  Trial2  Trial3
1           1.45    5.77    9.46        0.54    0.47    1.36
2           1.63    5.12    3.63        0.39    0.40    0.67
3           2.13    2.33    6.40        0.76    0.96    2.17
4           4.49    4.60    6.72        0.41    0.49    0.91
5           2.86    3.93    8.89        0.49    1.00    1.41
6           2.44    4.75    4.92        0.57    0.97    2.17
7           2.67    5.78    6.95        0.24    0.35    1.21
8           3.94    5.13    6.16        0.60    0.61    1.23

Table 5.4. On-axis velocities in the Rhythmic Weight Shift tests by the Wearable Gait Lab system.

5.1.2 Sit-To-Stand (STS)

STS experiment results are shown in Table 5.3. If the left/right weight symmetry is negative, the participant puts more weight on the left foot; if it is positive, the participant puts more weight on the right foot. It has also been confirmed that the results from the Wearable Gait Lab agree with those from the SMART Balance Master.
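The thesis does not spell out the symmetry formula itself. One plausible definition that reproduces the sign convention above (negative means left-loaded, positive means right-loaded) expresses the right-minus-left load difference as a percentage of total load; the function below is an illustrative assumption, not necessarily WGL's exact computation:

```python
def lr_weight_symmetry(left_load, right_load):
    """Percent left/right weight symmetry under one plausible definition:
    the right-minus-left load difference relative to total load.
    0 means perfectly symmetric; negative values mean more weight on the
    left foot and positive values more weight on the right, matching the
    sign convention used in Table 5.3."""
    total = left_load + right_load
    if total <= 0:
        raise ValueError("no load measured")
    return 100.0 * (right_load - left_load) / total
```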

5.1.3 Rhythmic Weight Shift (RWS)

For RWS, the on-axis velocity, i.e., the sway velocity along the direction in which the participant shifts, is calculated (Table 5.4). In Table 5.4, the on-axis velocity of each trial by each subject is included for both the horizontal and vertical directions. Due to policies at Breckenridge Village, the team is not allowed to release the unprocessed data from the SMART Balance Master system, even anonymously. However, the results confirm that the on-axis velocities collected from the Wearable Gait Lab system and the SMART Balance Master system agree closely with each other.
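The on-axis velocity computation can be sketched as projecting the COG sway trace onto the shift axis and dividing the total on-axis path length by the trial duration. This is an illustrative reconstruction under stated assumptions (equally spaced samples, sway expressed as 2-D angles), not necessarily the exact estimator used by WGL:

```python
import numpy as np

def on_axis_velocity(cog_xy, axis, fs):
    """Mean sway speed along the shift direction, in the trace's angle
    units per second.  `cog_xy` is an (N, 2) array of COG sway samples,
    `axis` a 2-vector giving the shift direction (e.g. [1, 0] for
    left/right), and `fs` the sampling rate in Hz."""
    cog = np.asarray(cog_xy, float)
    u = np.asarray(axis, float)
    u = u / np.linalg.norm(u)               # unit vector along the shift axis
    proj = cog @ u                          # signed position along the axis
    path = np.sum(np.abs(np.diff(proj)))    # total on-axis path length
    duration = (len(proj) - 1) / fs
    return path / duration
```

Projecting first means motion perpendicular to the shift direction (off-axis sway) does not inflate the estimate.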

5.2 Subsystem: Pain Marker

In this section, the accuracy of the different pain information collected by the app-based Pain Marker is verified by comparison with paper-based questionnaires. User comments and satisfaction reviews of Pain Marker are also summarized from the questionnaires.

5.2.1 Case 1: App vs Paper Questionnaire

Each volunteer marked where they felt pain on both Pain Marker and a traditional pain questionnaire. To verify the reliability of Pain Marker, we compared the subsystem records with the traditional paper questionnaire on two values: the number of pain locations and the pain severity. The number of locations is the total number of body parts marked by the volunteer. Our app allows different types of pain to be marked on the same area; therefore, the number of locations is not necessarily equal to the number of records collected by the app. For the questionnaire, because volunteers marked their pain with very different kinds of marks, the number of locations was counted manually by professionals.

As shown in Table 5.5, the numbers of pain locations recorded by app and paper are not always equal. Two possible reasons might cause this inconsistency: volunteers either 1) failed in the app marking process, or 2) failed in the paper questionnaire marking process. We found two types of app marking failures, both caused by inconvenient UI design. Volunteer 7 had only one pain location but marked sixteen locations on the app model. As shown in Table 5.6, all of them are marked as the same type, and the severity of all these areas is minor, which is the second-lightest color of that pain. Later, in the app review questionnaire, he mentioned

id   Age   Gender   Amt of locations   Amt of locations   Pain Severity       Pain Severity
                    (app)              (paper)            (app)               (paper)
1    85    female   1                  2                  severe              9
2    84    female   11                 10                 severe, moderate    8
3    86    female   5                  4                  moderate            3
4    76    male     23                 13                 mild, moderate      3
5    76    female   5                  2                  moderate            5
6    85    female   14                 9                  severe              8
7    92    male     16                 1                  minor               2
8    84    female   2                  2                  minor               3

Table 5.5. Comparison of Pain Marker records and traditional self-evaluation questionnaires.

that the color of the pain was hard to recognize. It is possible that volunteer 7 accidentally marked most of these locations without noticing, and thus never corrected the records. Another inconvenient design appears in the records of volunteer 1. The volunteer was experiencing pain in only two distinct locations, which were marked accurately on the traditional questionnaire. However, the second pain location was too small for the volunteer to select on the app model, so the volunteer gave up on marking it after a few attempts. The volunteer also confirmed this in the app review questionnaire. Most of the paper questionnaire failures were caused by careless marking. Some volunteers left only very tiny marks on the questionnaire, yet when recording their pain in the Pain Marker app they deliberately marked all body parts around that location. The marks chosen by these volunteers are in different styles, which may lead to misunderstanding when professionals summarize their pain information.

For pain severity records, Pain Marker uses five words to describe pain intensity: mild, minor, moderate, severe, and unbearable. The questionnaire it is compared against uses a numeric scale ranging from 1 to 10. Distinct levels of pain on different body parts can easily be marked in the Pain Marker app; however, only one level is provided on the

questionnaire. We found that these volunteers were more inclined to mark their most severe pain level on the questionnaires. The correlation between the app scale and the numeric scale is shown in Figure 5.3: mild corresponds to pain level 1; minor ranges from 2 to 3; moderate from 4 to 5; severe from 6 to 9; and unbearable corresponds to level 10. None of our volunteers reported unbearable pain (level 10). The correlation between the Pain Marker severity scale and the numeric scale is not linear.

Figure 5.3. Correlation of app pain severity word scale and questionnaire numeric scale.
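The observed word-to-number correspondence can be captured in a small lookup table. The helper below is hypothetical (the names are ours); only the ranges come from the text above:

```python
# Word-to-numeric severity correspondence as observed in Section 5.2.1.
SEVERITY_RANGES = {
    "mild": (1, 1),
    "minor": (2, 3),
    "moderate": (4, 5),
    "severe": (6, 9),
    "unbearable": (10, 10),
}

def severity_consistent(app_word, paper_level):
    """True if a 1-10 questionnaire rating falls inside the numeric range
    observed for the app's severity word."""
    lo, hi = SEVERITY_RANGES[app_word.lower()]
    return lo <= paper_level <= hi
```

Applied to Table 5.5, such a check makes it easy to flag records (e.g., volunteer 3's "moderate" against a paper rating of 3) where the two instruments disagree.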

There is no record of pain type provided by the questionnaire data. Pain type and other detailed pain information are discussed in Section 5.2.2.

5.2.2 Case 2: Pain Information From App

Based on the results of the Pain Marker subsystem, 7 out of 8 volunteers were experiencing a single type of pain, and only one was experiencing three distinct pain types at the same time.

The most common pain type is aching, experienced by 5 out of 8 volunteers.

75% of the volunteers reported a single pain intensity, while the other two volunteers felt different levels of pain severity in different body areas. Most volunteers described their pain as moderate or severe. Lower and middle back pain, on the right side and in the middle, was mentioned most often. Among all volunteers, the most common pain area is the middle of the upper back. The most common pain locations in women are the knee and "LB_lowerleg", and in men "Spine02_M_Back". 3 out of 8 volunteers were suffering 2 levels of pain, 1 out of 8 was suffering 3 levels of pain, and the rest were suffering only 1 level of pain.

5.2.3 Case 3: Pain Marker App Evaluation

Based on the Pain Marker questionnaire and comments from volunteers, 62.5% of the participants think the system is easy to use; 87.5% are willing to use it for reporting pain; 87.5% believe the system can help them communicate better with clinicians; and 50% think the system is simpler than filling out a traditional pain questionnaire, while 37.5% comment that it simplified the questionnaire a little, and the majority say they need some time to get used to the app.

Table 5.6. Pain information of each volunteer recorded by Pain Marker App

id  Age  Gender  Num of Pain   Pain Type         Pain Severity    Pain Location
                 Locations
1   85   female   1            aching            severe           R_Knee
2   84   female  11            aching            severe,          Spine03_M_Back, LUQ, Spine01_L_Back,
                                                 moderate         Spine01_R_Back, RLQ, LLQ, joint16,
                                                                  L_gluteus, R_gluteus, R_Knee, L_Knee
3   86   female   5            aching            moderate         LUQ, Spine02_M_Back, Spine01_L_Back,
                                                                  Spine01_R_Back, R_Forearm
4   76   male    23            tingling, numb,   mild,            R_Forearm, R_Elbow, L_lowerleg,
                               aching            moderate         Spine01_L_Front, RLQ, LLQ, joint16,
                                                                  Spine01_R_Front, RB_lowerleg, R_Ball,
                                                                  L_Ball, L_Ankle, R_lowerleg, L_Knee,
                                                                  L_Hip, R_Knee, R_Hip, RB_ankle,
                                                                  RB_knee, RB_Hip, LB_Hip, LB_knee,
                                                                  LB_lowerleg
5   76   female   5            aching            moderate         Spine01_L_Back, RUQ, Spine02_M_Back,
                                                                  RLQ, Spine02_L_Back
6   85   female  14            burning           severe           R_Knee, L_Knee, L_Elbow, R_Elbow,
                                                                  L_Clavicle, R_Clavicle, Neck, joint16,
                                                                  RLQ, Spine01_L_Back, LUQ,
                                                                  Spine01_R_Back, R_Forearm, L_Forearm
7   92   male    16            dull              minor            L_Hip, L_gluteus, R_Forearm,
                                                                  Spine03_L_Back, Spine02_L_Back,
                                                                  Spine02_L_Front, RUQ, Spine02_M_Back,
                                                                  R_Backshoulder, Neck, L_Clavicle, LUQ,
                                                                  Spine01_L_Front, Spine01_L_Back,
                                                                  R_Clavicle, L_Backshoulder
8   84   female   2            aching            mild             L_gluteus, R_gluteus

6 Discussion

6.1 Compromise of Efficiency and Accuracy

Compared with traditional paper questionnaires with 2D human figures, the 3D AR application is more precise and convenient for pain recording, not to mention the benefits of digitized pain records for data storage and analysis. With predefined pain areas, it also eliminates misjudgment of users' handwritten marks and notes. Although every spot on the model can be marked, one question worth further study is how many pain areas should be labeled, considering both the efficiency of the user experience and the precision of the information. The larger the number of areas, the smaller each area becomes and the more precisely pain can be described. Imagine that one person feels pain only at the tip of the index finger, while another is suffering from pain in the whole hand. If the whole hand is treated as a single area, the model cannot describe the first person's pain accurately, and additional comments must be left to describe it. However, if the model is divided into a large number of small segments, the second user has to air tap a dozen pain areas to cover all affected areas. Neither situation is the most efficient way to define pain areas.

HoloLens also requires considerable head movement and rotation to target the object one intends to air tap. Two commonly suggested solutions are changing the size and default position of the 3D human model; however, these may not be efficient solutions because they conflict with other aspects of the user experience and other functions.

The body model should be large enough that each body area can be easily targeted by the cursor. To offer accurate pain area options, the 3D human model requires a large number of joints, and because of human body structure, some of these joints are placed tightly adjacent to each other; for example, nine joints are placed on the head to cover all parts from the top of the head to the chin. The number of joints might even be extended in future work to describe pain areas accurately enough to match each muscle and human joint.

For example, in our current version, one can only mark a whole finger even if only the fingertip is suffering from pain. With such a large number of options, some of them crowded together by the structure of the human body, if the model is not large enough the user can hardly place the cursor on the right area and hold it there until the air tap gesture is completed. As for the position of the body model: if it is placed too far from the user, it is hard to see where the cursor is pointing; if it is placed too near, larger head movements are needed to reach body parts such as the top of the head or the heels. Thus, the precision of pain area definition should be studied further. Considering the limitations of HoloLens itself and differences among users, a compromise between the precision and the efficiency of pain recording is inevitable.

6.2 Segmentation Rules

Besides the size and number of pain areas, the shapes of pain areas also need to be adjusted and discussed for different conditions. There are many ways to segment one body part. For example, the forearm can be divided based on the muscle shapes in that area, or split evenly into upper, middle, and lower thirds. For some pain, such as soreness, the affected muscle may be easy for the user to identify, but for burning pain it is more likely that an area, rather than a specific muscle, is affected. Clinicians and researchers from different departments may also have their own preferred segmentation rules for different purposes and research interests.

7 Conclusion

In this paper, an mHealth system is proposed for assisting balance- and pain-related research, and the system is evaluated as two subsystems. The Wearable Gait Lab is designed for balance test data collection and remote data analysis in order to simplify balance test procedures and improve result accuracy. The WGL system allows the user to connect all the wearable sensors wirelessly through a smart device, and the data collection procedure can be controlled simply from an Android application. Accordingly, the data collected during the process can be conveniently reviewed remotely by researchers or clinicians. In the experiments, the system has been verified with standard balance tests, including the Limits of Stability test, the Sit-To-Stand test, and the Rhythmic Weight Shift test. Status indicators including COG, sway velocity, weight transfer time, and reaction time were computed and compared to verify the proposed system against existing balance systems on the market. It has been confirmed that the proposed Wearable Gait Lab system simplifies data collection and review and offers high portability compared with existing balance systems, while keeping test results accurate.

The other subsystem, Pain Marker, is a reliable pain self-recording system for collecting patients' pain information and providing digital records for clinicians. Based on our experiments and case studies, volunteers showed both willingness to use the app and good adaptation to the system. They described the subsystem as user-friendly and easy to follow, and they believe the app helps them communicate their pain to clinicians efficiently. Compared with traditional paper-based questionnaires, these records are more informative. For clinicians and other professionals interested in pain studies among specific populations, it provides a tool to collect a large amount of well-labeled data. Spinal chart and history review functions are also available for studies on the relation between the spinal cord and changes in an individual's pain. Both subsystems are reliable and welcomed by the elderly community, but further research is required to analyze the whole mHealth system completely and thoroughly.

8 Suggested Future Research

8.1 mHealth Tracker

In the future, more data samples can be collected by the team so that test results can be compared among different groups, better serving balance- and pain-related research. The system would also benefit from added functionality for real-time monitoring and statistical analysis. Currently, both subsystems were evaluated separately with different users, and for the balance tests, pain conditions were not considered during the volunteer recruitment process. Further balance tests with healthy volunteers and with those who suffer from pain should be conducted and compared with related research. Given the vigorous development of the wireless sensor industry today, additional components can also be added to the system to better acquire the sensory data desired for pain and balance experiments.

8.2 WGL: Dynamic Balance Tests and Daily Activities

As mentioned in Section 4.1, all three experiments (LOS, STS, and RWS) are static balance tests. These experiments have proven that the WGL system is accurate and reliable; however, one leading advantage of the WGL system over the Balance Master and SMART Balance Master is its portability and mobility. Dynamic balance tests and daily activity detection would be ideal experiments for verifying the mobility of the WGL subsystem. The team has conducted several studies on using WGL to monitor and detect work-related activities, such as lifting and carrying heavy loads, which have been shown to be related to several kinds of pain, such as lower-back and waist pain. However, more daily activities across various population spectra are also worth studying.

8.3 WGL: Balance Instructions with HoloLens

Pain Marker is a superior and convenient tool for those with limited anatomy knowledge to present their pain information precisely through a 3D human model displayed on an AR device. It is also a reliable application that automatically generates spinal chart information and provides history review functions for all kinds of pain studies and management progress evaluation. Pain area, type, and intensity are shown directly on the model, and more information, such as rest/moving pain and the consistency of pain, is also recorded in files. Beyond this, the visualization capability of HoloLens can be used to instruct users in balance tests and other balance-related physical education, especially those who suffer from chronic pain and those who need regular balance training. Nowadays, most balance-related tests and rehabilitation require the user to be present in a clinic; the mHealth system might allow them to exercise and communicate with their clinicians from home.

8.4 Pain Marker: Distinguish Pain In Different Anatomy System

Although the Pain Marker subsystem has proved informative, even more detailed and precise descriptions of pain could be recorded without increasing the effort required of users. Case Western Reserve University and Cleveland Clinic [39] designed an anatomy education application with which the user can combine or separate different 3D anatomy systems using HoloLens. Inspired by their study, we found it possible that the next generation of the Pain Marker app could support marking pain in different anatomy systems. For example, at the current stage, a person can only mark the skin of the forearm, and clinicians cannot tell whether it is muscle or . With different anatomy systems, pain could simply be marked in the corresponding system. To achieve this feature, new models of the different systems, such as muscles, nerves, and skeleton, need to be constructed.

Complete References

[1] Horak FB. Clinical assessment of balance disorders. Gait Posture, 6:76–84, 1997.

[2] Wilbert E Fordyce, Roy S Fowler, Justus F Lehmann, and Barbara J Delateur. Some implications of learning in problems of chronic pain. Journal of chronic diseases, 21(3):179–190, 1968.

[3] Hylton B Menz and Stephen R Lord. Foot pain impairs balance and functional ability in community-dwelling older people. Journal of the American Podiatric Medical Association, 91(5):222–229, 2001.

[4] Eliza Poole, Julia Treleaven, and Gwendolen Jull. The influence of neck pain on balance and gait parameters in community-dwelling elders. Manual therapy, 13(4):317–324, 2008.

[5] MIV Mientjes and JS Frank. Balance in chronic low back pain patients compared to healthy people under various conditions in upright standing. Clinical Biomechanics, 14(10):710–716, 1999.

[6] A Hamaoui, MC Do, L Poupard, and S Bouisset. Does respiration perturb body balance more in chronic low back pain subjects than in healthy subjects? Clinical Biomechanics, 17(7):548–550, 2002.

[7] American Academy of Health and Fitness. Berg balance scale.

[8] Sarah F Tyson and Lorraine H DeSouza. Reliability and validity of functional balance tests post stroke. Clin Rehabil, 18(8):916–923, 2004.

[9] L. Lewejohann, C. Reinhard, A. Schrewe, J. Brandewiede, A. Haemisch, N. Görtz, M. Schachner, and N. Sachser. Environmental bias? Effects of housing conditions, laboratory environment and experimenter on behavioral tests. Genes, Brain, and Behavior, 5:64–72, 2015.

[10] CRB Joyce, DW Zutshi, V Hrubes, and RM Mason. Comparison of fixed interval and visual analogue scales for rating chronic pain. European journal of clinical pharmacology, 8(6):415–420, 1975.

[11] Edgar E Ohnhaus and Rolf Adler. Methodological problems in the measurement of pain: a comparison between the verbal rating scale and the visual analogue scale. Pain, 1(4):379–384, 1975.

[12] Fátima Aparecida Emm Faleiros Sousa, Lilian Varanda Pereira, Roberta Cardoso, and Priscilla Hortense. Multidimensional pain evaluation scale. Revista latino-americana de enfermagem, 18(1):03–10, 2010.

[13] Robert D Kerns, Dennis C Turk, and Thomas E Rudy. The west haven-yale multidi- mensional pain inventory (whympi). Pain, 23(4):345–356, 1985.

[14] Harald Breivik, PC Borchgrevink, SM Allen, LA Rosseland, L Romundstad, EK Breivik Hals, G Kvarstein, and A Stubhaug. Assessment of pain. BJA: British Journal of Anaesthesia, 101(1):17–24, 2008.

[15] Pamela Macintyre, David Rowbotham, and Suellen Walker. Clinical Pain Management Second Edition: Acute Pain. CRC Press, 2008.

[16] A Stubhaug and H Breivik. Prevention and treatment of and persistent pain after surgery. Pain best practice and research compendium. London: Elsevier, pages 281–8, 2007.

[17] Michael Von Korff, Johan Ormel, Francis J Keefe, and Samuel F Dworkin. Grading the severity of chronic pain. Pain, 50(2):133–149, 1992.

[18] Tony Jebara, Cyrus Eyster, Joshua Weaver, Thad Starner, and Alex Pentland. Stochasticks: Augmenting the billiards experience with probabilistic vision and wearable computers. In Wearable Computers, 1997. Digest of Papers., First International Symposium on, pages 138–145. IEEE, 1997.

[19] Elaine Ross, Helen Purtill, Marcin Uszynski, Sara Hayes, Blathin Casey, Catherine Browne, and Susan Coote. Cohort study comparing the Berg Balance Scale and the Mini-BESTest in ambulatory people with multiple sclerosis. , 2016.

[20] Alvaro Muro-de-la-Herran, Begonya Garcia-Zapirain, and Amaia Mendez-Zorrilla. Gait analysis methods: An overview of wearable and non-wearable systems, highlighting clinical applications. Sensors, 14(2):3362–3394, 2014.

[21] Ross A. Clark, Yong-Hao Pua, Karine Fortin, Callan Ritchie, Kate E. Webster, Linda Denehy, and Adam L. Bryant. Validity of the Microsoft Kinect for assessment of postural control. Gait Posture, 36:372–377, 2012.

[22] Philip A. Gribble, Jay Hertel, Craig R. Denegar, and William E. Buckley. The effects of fatigue and chronic ankle instability on dynamic postural control. Journal of Athletic Training, 39(4):324–329, 2004.

[23] Validity of using tri-axial accelerometers to measure human movement - Part I: Posture and movement detection.

[24] Jens Barth, Michael Sünkel, Katharina Bergner, Gerald Schickhuber, Jürgen Winkler, Jochen Klucken, and Björn Eskofier. Combined analysis of sensor data from hand and gait motor function improves automatic recognition of Parkinson's disease. IEEE, pages 5122–5125, 2012.

[25] Moore ST, MacDougall HG, Gracies JM, Cohen HS, and Ondo WG. Long-term monitoring of gait in Parkinson's disease. Gait Posture, 26(2), 2007.

[26] Nathan W. Saunders, Panagiotis Koutakis, Anne D. Kloos, Deborah A. Kegelmeyer, Jessica D. Dicke, and Steven T. Devor. Reliability and validity of a wireless accelerometer for the assessment of postural sway. Journal of Applied Biomechanics, 31:159–163, 2015.

[27] Hasan SS, Robin DW, Szurkus DC, Ashmead DH, Peterson SW, and Shiavi RG. Simultaneous measurement of body center of pressure and center of gravity during upright stance. Part I: Methods. Gait Posture, 1996.

[28] Donna Lee Wong, M Hockenberry-Eaton, D Wilson, ML Winkelstein, and P Schwartz. Wong-baker faces pain rating scale. Home Health Focus, 2(8):62, 1996.

[29] Laurie Walter and Linda Brannon. A cluster analysis of the multidimensional pain inventory. Headache: The Journal of Head and Face Pain, 31(7):476–479, 1991.

[30] Yingzi Lin, Li Wang, Yan Xiao, Richard D Urman, Richard Dutton, and Michael Ramsay. Objective pain measurement based on physiological signals. In Proceedings of the International Symposium on Human Factors and Ergonomics in Health Care, volume 7, pages 240–247. SAGE Publications Sage India: New Delhi, India, 2018.

[31] Son-Lik Tang, Chee-Keong Kwoh, Ming-Yeong Teo, Ng Wan Sing, and Keck-Voon Ling. Augmented reality systems for medical applications. IEEE engineering in medicine and biology magazine, 17(3):49–58, 1998.

[32] Ronald Azuma, Yohan Baillot, Reinhold Behringer, Steven Feiner, Simon Julier, and Blair MacIntyre. Recent advances in augmented reality. IEEE computer graphics and applications, 21(6):34–47, 2001.

[33] Daniel Wagner and Dieter Schmalstieg. First steps towards handheld augmented reality. IEEE, 2003.

[34] The role of 3d displays in medical imaging applications, May 2015.

[35] Alaric Hamacher, Su Jin Kim, Sung Tae Cho, Sunil Pardeshi, Seung Hyun Lee, Sung-Jong Eun, and Taeg Keun Whangbo. Application of virtual, augmented, and mixed reality to urology. International neurourology journal, 20(3):172, 2016.

[36] Jacques Marescaux, Francesco Rubino, Mara Arenas, Didier Mutter, and Luc Soler. Augmented-reality–assisted laparoscopic adrenalectomy. Jama, 292(18):2211– 2215, 2004.

[37] H Iseki, Y Masutani, M Iwahara, T Tanikawa, Y Muragaki, T Taira, T Dohi, and K Takakura. Volumegraph (overlaid three-dimensional image-guided navigation). Stereotactic and functional neurosurgery, 68(1-4):18–24, 1997.

[38] Jeffrey H Shuhaiber. Augmented reality in surgery. Archives of surgery, 139(2):170– 174, 2004.

[39] CWRU takes the stage at Microsoft's Build conference to show how HoloLens can transform learning, Mar 2016.

[40] Gabriel Evans, Jack Miller, Mariangely Iglesias Pena, Anastacia MacAllister, and Eliot Winer. Evaluating the Microsoft HoloLens through an augmented reality assembly application. In Degraded Environments: Sensing, Processing, and Display 2017, volume 10197, page 101970V. International Society for Optics and Photonics, 2017.

[41] Mayo Clinic Staff. Tests and procedures electromyography (emg), 2013.

[42] L. M. Nashner. Fixed patterns of rapid postural responses among leg muscles during stance. Experimental Brain Research, 30:13–24, 1977.

[43] Pei-Fang Tang and Marjorie H. Woollacott. Control of reactive balance adjustments in perturbed human walking: roles of proximal and distal postural muscle activity. Experimental Brain Research, 119:141–152, 1997.

[44] Eamonn Keogh and Chotirat Ann Ratanamahatana. Exact indexing of dynamic time warping. Knowledge and information systems, 7(3):358–386, 2005.

[45] Meinard Müller. Dynamic time warping. Information retrieval for music and motion, pages 69–84, 2007.

[46] Allen G. Taylor. HoloLens Hardware, pages 153–159. Apress, Berkeley, CA, 2016.

[47] Microsoft. A new way to see your world. https://www.microsoft.com/en-us/ hololens/hardware, 2018.

[48] Vikki Wylde, Victoria Wells, Samantha Dixon, and Rachael Gooberman-Hill. The colour of pain: can patients use colour to describe osteoarthritis pain? Musculoskeletal care, 12(1):34–46, 2014.

[49] Natus Balance and Mobility. NeuroCom test protocols, 2016.

[50] Houston Medical . Pain assessment form, 2015.