Responsive IoT: Using Biosignals to Connect Humans and Smart Devices

by

Alexandre Armengol Urpi

Submitted to the Department of Mechanical Engineering in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering at the

MASSACHUSETTS INSTITUTE OF TECHNOLOGY

June 2018

© Massachusetts Institute of Technology 2018. All rights reserved.

Author: Signature redacted
Department of Mechanical Engineering
May 21, 2018

Certified by: Signature redacted
Sanjay E. Sarma
Professor
Thesis Supervisor

Accepted by: Signature redacted
Rohan Abeyaratne
Chairman, Department Committee on Graduate Theses

Responsive IoT: Using Biosignals to Connect Humans and Smart Devices

by

Alexandre Armengol Urpi

Submitted to the Department of Mechanical Engineering on May 21, 2018, in partial fulfillment of the requirements for the degree of Master of Science in Mechanical Engineering

Abstract

The growing Internet of Things (IoT) ecosystem being built today is already affecting a great many daily objects, which may share information about their state, location and sensed data, among others. Today, humans communicate with IoT devices through visual, voice or tactile interfaces. More natural and organic interaction requires more sophisticated communication methods. This thesis explores seamless interfaces between humans and IoT devices. In particular, I focus on using biological signals as the interface to directly connect with the smart surroundings. The work is partitioned into two parts. First, I present a wearable sensing system to estimate the thermal comfort level of the user by monitoring skin temperature, blood volume pulse and skin conductivity. This effort is a first step towards connecting room occupants and smart A/C devices, which can enable real-time adjustments of indoor conditions. In the second part of the thesis, brain signals are used as the interface to navigate in a Virtual Reality (VR) environment. We develop Sublime, a new concept of Steady-State Visually Evoked Potentials (SSVEP) based Brain-Computer Interface (BCI). In this technology, brain-computer communication is triggered by imperceptible visual stimuli integrated in the virtual scene and subliminal information is seamlessly conveyed to a computer. By monitoring the elicited SSVEPs, the system is able to identify the gaze target of the user, thus enabling a hands-free menu navigation tool.

Thesis Supervisor: Sanjay E. Sarma
Title: Professor

Acknowledgments

First of all, I wish to express my sincere thanks to Professor Sanjay Sarma for his valuable directions and continuous encouragement. He has provided me total freedom and flexibility to satisfy my interests both inside and outside of school, which is an invaluable gift. Research is full of ups and downs, and I have been truly impressed by how he is able to transform frustration into excitement and motivation with just a 5-minute conversation. But above all, he is an excellent person who has always been open to talk both professionally and personally.

I also want to thank Dr. Stephen Ho for his guidance and support during these two years at the lab. His critical advice has been very valuable in my first two years as a graduate student.

A special mention should also go to all of Sanjay's lab members: Nithin, Debbie, Yongbin, Pranay, Pankhuri, Nidhi, Dajiang, Shane, Felipe, Esteve, Alex, Josh, Rahul, Brian and Laura. They provided everyday help and support but, most importantly, they allowed me to be part of this great working environment, which made me feel so welcome from the first day.

These two years here at MIT wouldn't have been the same without the prince's, tsaprs, swing-dancers, dumpling-makers, phantastics, catalans and spaniards; all bringing meaning to my everyday life.

Finally, I want to thank my mom, my dad and my sister for always staying by my side, wherever I am and wherever they are.

Contents

1 Introduction 15
1.1 Motivation ...... 15

1.2 Empathic buildings: sensing thermal comfort ...... 16

1.3 Brain-Computer Interfaces ...... 17

2 Background 19

2.1 Thermal Comfort Sensing ...... 19
2.1.1 PMV - PPD Model ...... 19

2.1.2 Adaptive Models ...... 23

2.1.3 Personalized Thermal Comfort Sensing ...... 23

2.2 SSVEP-based Brain-Computer Interfaces ...... 25

3 A Wearable-based Thermal Comfort Sensing System 27

3.1 Potential Biosignals for Thermal Comfort Estimation ...... 27

3.1.1 Wrist Skin Temperature ...... 27

3.1.2 Vasomotion ...... 27

3.1.3 Electrodermal Activity ...... 28

3.2 Wearable: Biosignal Monitoring Wristband ...... 30

3.2.1 Sensors ...... 30

3.2.2 Functioning Modes ...... 31

3.3 First Experiments ...... 32

3.3.1 Results ...... 33

3.4 Machine Learning Approach ...... 39

3.4.1 Feature Selection ...... 39
3.4.2 Time Windows ...... 40

3.4.3 Datapoint Extraction ...... 41

3.4.4 Single-User Model ...... 42
3.4.5 Multi-User Model ...... 44

4 A Brain-Computer Interface for Virtual Reality 49
4.1 Sublime ...... 49
4.2 Materials and Methods ...... 50
4.2.1 Virtual Reality Display Device ...... 50
4.2.2 Visual Stimuli Generation ...... 50

4.2.3 Beating effect ...... 51
4.3 Virtual Reality Application ...... 53
4.3.1 Main Menu ...... 53
4.3.2 Movie Playback Menu ...... 53
4.3.3 Real-time Feedback ...... 54
4.4 EEG Recording Equipment ...... 55
4.5 SSVEP Detection ...... 55
4.5.1 Canonical Correlation Analysis ...... 55
4.5.2 Logistic Regression ...... 56
4.6 Real-time Data Processing ...... 57
4.7 System Configuration ...... 58
4.8 Experiments ...... 58
4.8.1 Subjects ...... 58
4.8.2 Experiment 1: Navigation Time ...... 59
4.8.3 Experiment 2: Subjective Experience ...... 59
4.9 Results ...... 60
4.10 Discussion ...... 61

5 Conclusions and Future Work 63
5.1 Sensing Thermal Comfort ...... 63

5.2 Brain Waves in Virtual Reality ...... 64
5.3 Future Work ...... 65
5.3.1 Improvements in the comfort models ...... 65
5.3.2 Future work on Sublime ...... 65

List of Figures

2-1 Predicted Percentage of Dissatisfied versus Predicted Mean Vote. . 21

2-2 Body thermoregulation diagram. Extracted from [20] ...... 24

3-1 PPG signal. Extracted from [46]. Notice that vasomotion information can be obtained from the signal amplitude. ...... 28

3-2 Typical shape of a phasic skin conductance...... 29

3-3 Empatica E4 Wristband ...... 31

3-4 E4 recording mode ...... 31

3-5 E4 streaming mode ...... 32

3-6 Plot of data obtained during one of the experiments...... 33

3-7 Gradient of skin temperature vs. thermal sensation...... 34

3-8 PPG signal and its envelope during approximately 6 seconds of data. The blue line represents the PPG signal measured by the wristband. The yellow and orange lines define the envelope of the PPG signal computed in Matlab. ...... 35

3-9 PPG signal and its envelope of 10 minutes worth of data...... 36

3-10 Plot of skin temperature and vasomotion signal obtained from the PPG signal. ...... 36

3-11 Vasomotion signal during the experiment explained above. Orange and green lines are thermal sensation and room type respectively. ...... 37

3-12 Plot of EDA signal (blue). Vertical cyan lines mark the EDA peaks detected...... 38

3-13 Plot of EDA signal (blue) during the experiment. Vertical cyan lines mark the EDA peaks detected. ...... 38

3-14 Relative frequency of EDA peaks vs. thermal sensation. ...... 39
3-15 Mean IBI vs. temperature gradient point scatter for the three different classes. It shows how the distinct class datapoints cluster differently. ...... 43

3-16 Mean vaso vs. std vaso point scatter for the three different classes. Vasodilation and vasoconstriction in hot and cold states is clearly reflected. ...... 43

4-1 Blue line shows a 44Hz stimulating sine wave sampled at RR = 90Hz. The red dashed line shows the beating effect that will be perceived. ...... 52

4-2 Screenshot of the movie covers the user sees in the main menu. ...... 53
4-3 Star Wars Episode VIII playing in the movie menu. ...... 54

4-4 Loading bars for two different selectable objects. Figure 4-4b shows the flickering object that allows the user to return to the main menu. 54

4-5 System blocks configuration...... 58

List of Tables

2.1 Predicted Mean Vote scale ...... 20

4.1 Results for Experiment 1 ...... 60

4.2 Results for Experiment 2 ...... 60

Chapter 1

Introduction

1.1 Motivation

The Internet of Things (IoT) is a network of interrelated computing devices that can be wirelessly accessed, controlled and sensed. These devices can be integrated into everyday objects that will in turn become part of the network. Connected cars, home automation, health monitoring devices and smart retail are just a few examples of how IoT is taking root in our most quotidian items and activities, changing elementary aspects of industry and human life. Today IoT plays a significant role in not only helping people solve big problems facing the world such as global healthcare, climate change, food crisis, etc., but, most importantly, in fundamentally changing the way we think about designing and making the everyday objects surrounding us.

The growing connected ecosystem using IoT is already expanding to our most common daily objects, and we, as humans, will be affected. Just as our surrounding objects will share information about their state, location and sensed data, the system will also, under certain circumstances, seek the state of humans. This thesis explores methods to bridge the gap between humans and devices, and in particular focuses on taking advantage of biological signals to directly interact with our smart surroundings.

Existing wearables already allow us to keep track of our daily activities such as exercise, sleep, calories and more, from which valuable data can be extracted to improve our living standards. But can we advance this technology to bridge IoT hardware with our state of mind? This work seeks to show that by monitoring human biosignals we can build a responsive smart environment that adapts to our emotional or physical needs. This would allow us to directly interact with IoT devices through our physiological signals, creating an organic and more natural connection between people and their surroundings.

This thesis is divided into two main parts. The first focuses on the development of a wearable sensing system to detect the level of thermal comfort of the user through the monitoring of skin temperature, blood volume pulse and skin conductivity. The second consists of using brain signals as a tool to navigate a virtual reality environment.

1.2 Empathic buildings: sensing thermal comfort

It is well known that the right indoor climate increases health and workforce productivity [1]. Thinking from an IoT perspective, monitoring the thermal comfort level of room occupants could allow the building to adjust indoor conditions in real time. To do so, a method for assessing the comfort level of users needs to be developed. The ISO 7730 standard defines thermal comfort as "that condition of mind which expresses satisfaction with the thermal environment and is assessed by subjective evaluation" [2]. Because of its highly subjective nature, thermal comfort is not a straightforward variable to measure. Many authors show that two people under the same environmental and clothing conditions can feel thermally opposite, because their thermal perception can be influenced by differences in mood, social aspects, culture, past thermal experiences and other factors [3, 4, 5]. Therefore, in order to achieve a precise evaluation of people's comfort levels, we develop concepts related to personalized comfort sensing. The first part of this thesis focuses on the development of a wearable sensing system to estimate the thermal comfort level of a particular person by monitoring different physiological signals, which is a first step towards connecting humans and smart A/C devices.

1.3 Brain-Computer Interfaces

A brain-computer interface (BCI) is a system that enables communication between a human and a computer through brain activity in order to control an external device. BCIs have traditionally been used to enhance the ability of patients with motor disabilities to interact and communicate with their environment. Today, non-invasive modalities of these systems, such as electroencephalography (EEG) based BCIs, are already being utilized for non-clinical purposes.

Neural activity can contain information about our emotions [6, 7, 8], mental states [9, 10, 11, 12], or (imagined) movement [13, 14, 15]. Thanks to the increasing quality and affordability of wearable EEG equipment, BCIs are no longer restricted to laboratory settings, and brainwave monitoring is starting to be used for consumer applications such as neurofeedback, education or stress control. This data obtained from brain activity can potentially be exploited to build BCIs for IoT applications.

The last part of this work presents another approach to utilizing biosignals for human-machine interaction, where brain waves are used as the interface to connect and interact with a Virtual Reality environment.

There are several neural control signals that can be used for brain-computer communication, but the most commonly utilized are visually evoked potentials (VEP), slow cortical potentials (SCP), event related potentials (ERP) and sensorimotor rhythms. This work focuses on Steady State Visually Evoked Potentials (SSVEP), a subcategory of VEP, which are oscillatory signals elicited at the visual cortex in response to flickering eye stimulation. I develop a virtual environment with integrated flickering objects that detects the gaze of the user and enables hands-free virtual navigation. High-frequency flickering stimuli are used so that they are unobtrusively integrated in the virtual scene and become imperceptible to the user.

Chapter 2

Background

2.1 Thermal Comfort Sensing

Although the safest method for thermal comfort assessment is a direct survey, because of its strong dependency on subjective evaluation, there are many models that define the variables that have the greatest effect on thermal comfort and relate them to estimate the comfort of a group of people. To do so, these models are mainly based on the principal condition that must be met to maintain thermal comfort, which is the fulfillment of the body's energy balance: the heat produced by the metabolism should be equal to the amount of heat lost from the body to the environment. Heat balance equations are the basis of the PMV model explained in Section 2.1.1, the most widely used and accepted thermal comfort model.

2.1.1 PMV - PPD Model

Definition

The PMV model was developed by Fanger [16] and is derived from the physics of heat transfer combined with an empirical fit to sensation. Fanger formulated the so-called PMV index, which predicts the mean response of a large group of people voting on how comfortable they are in a given environment on a scale ranging from cold (-3) to hot (+3), with 0 being a neutral thermal sensation.

Fanger collected data from a large number of people, subjecting them to different conditions in a climate chamber and having them select a number on the scale that best described their comfort sensation. He then derived a mathematical model from the data that related the environmental and physiological factors considered through heat balance principles. The Predicted Mean Vote (PMV) sensation scale is shown in Table 2.1.

PMV | Thermal sensation
-3  | Cold
-2  | Cool
-1  | Slightly cool
 0  | Neutral
 1  | Slightly warm
 2  | Warm
 3  | Hot

Table 2.1: Predicted Mean Vote scale

The PMV model of thermal comfort has been a path-breaking contribution to the theory of thermal comfort and to the evaluation of indoor thermal environments in buildings. It is widely used and accepted for design and field assessment of thermal comfort, and the recommended PMV range for thermal comfort from ASHRAE 55 is between PMV values of -0.5 and 0.5 for indoor spaces.

From the PMV equation, Fanger derived the Predicted Percentage of Dissatisfied (PPD) for each PMV value. As PMV values move away from 0, PPD increases. A plot that relates PPD to PMV is shown in Figure 2-1.

Factors

The PMV model is based on 6 different factors, 2 personal and 4 environmental.

Personal factors:

* Metabolic rate [met]

* Clothing insulation [clo]


Figure 2-1: Predicted Percentage of Dissatisfied versus Predicted Mean Vote.

Environmental factors:

* Air temperature [°C]

* Mean radiant temperature [°C]

* Air speed [m/s]

* Ambient humidity [%]

Equations

The comfort equation was derived by Fanger [16] and it connects the measurable physical parameters with the thermally neutral sensation as experienced by an average person. It allows evaluating under which conditions thermal comfort may be offered in an indoor environment by measuring some physical parameters. Fanger first proposed the following heat balance equation for the human body:

M - W = H + E_c + C_res + E_res    (2.1)

where M is the metabolic rate, which is the transformation rate of chemical energy into heat and mechanical work by aerobic and anaerobic activities, W is the external work (usually = 0), H corresponds to the heat loss through convection, radiation and conduction from the body, E_c is the evaporative heat exchange at the skin, C_res is the respiratory convective heat exchange and E_res is the respiratory evaporative heat exchange. These variables can in turn be computed as a function of the six factors seen above. See [16] for the complete equations. The heat balance equation expresses that, on average, heat produced and received by the body must be balanced by heat outputs from the body.

Based on the heat balance equation, Fanger derived the PMV equation

PMV = (0.303 e^(-0.036 M) + 0.028) · [(M - W) - (H + E_c + C_res + E_res)]    (2.2)

As seen, if the heat balance equation is fulfilled, the right-hand term (also known as the thermal load) equals zero, so the PMV value is zero. The PPD equation is only a function of PMV:

PPD = 100 - 95 e^(-(0.03353 PMV^4 + 0.2179 PMV^2))    (2.3)
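As a quick numerical check of Equation 2.3, the following Python sketch (the function name is ours) evaluates the PPD for a couple of PMV values; note that even a neutral vote leaves 5% of occupants dissatisfied.

```python
import math

def ppd_from_pmv(pmv: float) -> float:
    """Predicted Percentage of Dissatisfied as a function of PMV (Eq. 2.3)."""
    return 100.0 - 95.0 * math.exp(-(0.03353 * pmv**4 + 0.2179 * pmv**2))

print(ppd_from_pmv(0.0))   # 5.0  -> even at PMV = 0, 5% remain dissatisfied
print(ppd_from_pmv(0.5))   # ~10.2 -> upper edge of the ASHRAE 55 comfort range
```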

Limitations

Although the PMV-PPD model is widely used, it is often too cumbersome for practical applications, and it is not a personalized model because it predicts an average vote over a large population. The model uses an averaging process and cannot represent unique or individual physiological characteristics such as age, gender, body mass index and mood, among others, which are variables that can affect thermal comfort [17]. Therefore, the PMV-PPD model only provides an estimated mean vote or an estimated percentage of dissatisfied people. This can clearly be seen in the PPD plot, where even when the PMV value is zero, the percentage of people dissatisfied never goes below 5%, so there will still be people uncomfortable under such conditions. This brings up the need for personalized thermal comfort detection systems.

2.1.2 Adaptive Models

Adaptive models state that indoor comfort is influenced by the outdoor climate and that human behavior can affect thermal sensation. These models assume that if building occupants feel thermally uncomfortable, they will react in order to restore their comfort by, for instance, taking off clothes, reducing their activity levels or opening a window, and that these actions will cause a thermal adaptation. As stated in [18], adaptive hypotheses predict that contextual factors, such as having access to environmental controls, and past thermal history influence building occupants' thermal expectations and preferences. After surveying a large number of people using different types of buildings, the study concluded that occupants of naturally ventilated buildings accept a wider range of temperatures than people in air-conditioned buildings, because their thermal preferences are tied to outdoor conditions. These results were incorporated in the ASHRAE 55 standard as the adaptive comfort model [19].

The adaptive charts relate the indoor comfort temperature to the predominant outdoor temperature and define zones of 80% and 90% satisfaction. Adaptive models can only be applied in naturally ventilated buildings (non-HVAC environments), occupants must have metabolic equivalent (MET) values between 1 and 1.3, and the prevailing outdoor temperature must be between 10°C and 33.5°C. In general, three different categories of thermal adaptation can be identified: behavioral, physiological and psychological. The latter corresponds to what has been mentioned above, where mood, thermal expectations or recent thermal experiences can alter thermal perception, one of the main reasons why personalized thermal comfort sensing is important to evaluate someone's thermal state.

2.1.3 Personalized Thermal Comfort Sensing

Physiological Signals

Several authors have studied the physiological signals that change to maintain a stable body temperature, which can provide information about an individual's thermal sensation [20, 21, 22, 23]. The most studied of these are skin temperature and its rate of change, which fluctuate to increase or decrease heat exchange with the environment in order to maintain a constant body temperature. As can be seen in Figure 2-2, skin temperature variation is the main thermoregulatory process to maintain the desired body temperature while the air temperature is in the range 18°C - 33.5°C. In more extreme conditions, shivering and sweating take place to generate internal heat and reduce body temperature respectively.


Figure 2-2: Body thermoregulation diagram. Extracted from [20].

As seen above, skin temperature is regulated through vasodilatation and vasoconstriction. The first increases blood flow to the skin in warm environments, so that heat transport and consequently heat dissipation are increased. With vasodilatation, the body seeks to reduce its temperature. In cold environments, through vasoconstriction the procedure is reversed, that is, blood flow to the skin is reduced, so that heat transport and heat dissipation decrease. There are studies that seek to relate skin temperatures to thermal sensation. For instance, Yao et al. [23] obtained regression equations of local and overall skin temperature vs. thermal sensation. Other authors not only used skin temperatures, but also rates of skin temperature variation as input variables to estimate the thermal sensation of the subject. For example, the authors in [20] use skin temperatures and rates of change in six body locations (head, trunk, arms, hands, legs and feet). Taniguchi et al. [24] developed a bio-sensing controller for HVAC in automobiles. They identified that thermal sensation in a vehicle could be obtained from the skin temperature on the face.

The studies described above require the monitoring of multiple body locations, which makes them too invasive for practical implementations. Based on the mentioned research experiments, which indicate a significant relationship between skin temperatures and thermal sensation, Choi [25] seeks to find the best body location to place a single skin temperature sensor for thermal sensation estimation. After several experiments, the gradient of wrist skin temperature is identified as the most robust biosignal for thermal sensation estimation.

Heart rate variability (HRV) has also been studied as an indicator of thermal comfort. Liu et al. [26] found that the ratio of absolute powers in the low and high frequency bands of HRV may be used as a physiological indicator for human thermal comfort levels. This ratio is considered an indicator of sympathetic-parasympathetic balance [27], and this study shows that the sympathetic nervous system plays an important role in thermal discomfort: when the sympathetic nervous system is excited, thermoregulation leads to thermal discomfort.

The combination of temperature and humidity was used in [28] to develop a personalized HVAC control system, which improved both energy usage and thermal comfort.

2.2 SSVEP-based Brain-Computer Interfaces

Steady-state visually evoked potentials (SSVEP) are brain signals generated at the visual cortex [29, 30], which occur in response to visual stimulation at specific frequencies [31, 32, 33]. When the retina is excited with flickering stimuli at a particular frequency within the 6-90Hz range [29], electrical potentials at the same frequency and its harmonics are generated in the occipital area of the brain. This phenomenon has been widely exploited to build brain-computer interfaces (BCIs) [34]. These BCIs use visual stimuli rendered on computer screens or light sources modulated at a specified frequency to elicit response signals at the visual cortex, which are then captured by EEG equipment and processed to identify which stimulus the subject is looking at. Using SSVEPs, BCIs can be created for a variety of actions including spelling words [35], controlling a robot [36] or playing video games [37]. As for the stimulating frequencies used, it has been widely shown that the brain does not respond uniformly across the whole spectrum, i.e. it resonates more strongly to some frequencies than others, giving the highest SSVEP amplitudes in the 6-16Hz band [38, 39]. This is one of the main reasons why most SSVEP-based BCI applications use flickering stimuli in the lowest part of the spectrum [34].

SSVEP-based BCIs have also been used within virtual environments (VE) due to their high information transfer rates and negligible user training requirements [40, 41].

Most of these applications superimpose basic geometric shapes (squares, circles or arrows) as the flickering stimuli [34], but this sometimes becomes too obtrusive for the virtual scene and makes it less immersive for the user. Hence, making the stimuli part of the scene has also been explored in VE [42] and VR [43]. However, these studies use stimulating frequencies in the low (1-12Hz) or medium (12-25Hz) part of the spectrum, both of which present two major disadvantages: first, low frequency flickering lights (5-25Hz) can be annoying and cause visual fatigue to the user [34], and second, flashing stimuli, especially in the 15-25Hz range, have the potential to induce photo-epileptic seizures [44]. In this work I not only overcome these downsides by using the high-frequency band (>40Hz), but I also introduce a novel concept for establishing a subconscious connection between user and computer thanks to the undetectable visual stimuli.

Chapter 3

A Wearable-based Thermal Comfort Sensing System

3.1 Potential Biosignals for Thermal Comfort Estimation

3.1.1 Wrist Skin Temperature

As seen in Section 2.1.3, within the air temperature range of approximately 18°C - 33°C, skin temperature variation is the main thermoregulatory process of the human body, increasing or reducing heat exchange with the environment. Some studies suggest that wrist skin temperature and its gradient may be a potential biological signal to monitor in order to estimate users' thermal comfort. Therefore, for the purposes of this study, we use wrist skin temperature as one of the variables monitored during experimentation.

3.1.2 Vasomotion

Skin temperature changes are caused by cutaneous blood flow variation, which is the main factor in human thermoregulation during mild thermal challenges [45]. It is accomplished by dilating or constricting skin blood vessels (cutaneous vasodilation or vasoconstriction) in the limbs, depending on whether the person is in a warm or cold environment respectively. This vasomotion changes the blood flow reaching the limb's skin.

Monitoring the skin blood flow could be an indirect way of measuring the cuta- neous vasomotion, which would add important information to estimate the thermal comfort of the subject. Photoplethysmography (PPG) is an optical technique that uses variations in reflected light from changes in blood flow during heart activity.

Therefore, one could derive vasomotion data from the light absorbance/reflectance information provided by a PPG sensor. Since the PPG signal is based on light absorbance by the oxygenated blood, the amplitude of the PPG wave provides information regarding the amount of blood flow in the vessels, which is directly related to vasoconstriction/vasodilatation, as seen in Figure 3-1.


Figure 3-1: PPG signal. Extracted from [46]. Notice that vasomotion information can be obtained from the signal amplitude.

3.1.3 Electrodermal Activity

Human thermoregulation is controlled by the sympathetic nervous system, and cutaneous sympathetic nerves activate the constriction or dilation of skin vessels [47, 48]. It is well known that skin conductivity or electrodermal activity (EDA) can be a measure of sympathetic arousal [49], hence we also use EDA for the thermal comfort estimation model.

Types of Measured EDA

* Phasic - Response to stimuli

Phasic skin conductance measurements are typically associated with short-term events and occur in the presence of discrete environmental stimuli: sight, sound, smell, or cognitive processes that precede an event such as anticipation, decision-making, etc. Phasic changes usually show up as abrupt increases in the skin conductance, or peaks in the skin conductance. These peaks are generally referred to as Skin Conductance Responses (SCRs) [50]. A typical phasic EDA peak can be seen in Figure 3-2.


Figure 3-2: Typical shape of a phasic skin conductance.

* Tonic - Baseline

Tonic skin conductance is generally considered to be the level of skin conductance in the absence of any particular discrete environmental event or external stimuli. The tonic skin conductance level can slowly vary over time in an individual depending upon his or her psychological state, hydration, skin dryness, and autonomic regulation. The baseline tonic skin conductance level may vary from day to day. Tonic changes in the skin conductance level typically occur over a period of tens of seconds to minutes. Tonic EDA is also known as Skin Conductance Level (SCL).

3.2 Wearable: Biosignal Monitoring Wristband

The Empatica E4 wristband [46] is a wearable research device that offers real-time physiological data acquisition and software for in-depth analysis and visualization. It is designed for research purposes, since it gathers high-quality data and gives open access to the raw data collected. It has different sensors that collect data from which we can derive the thermoregulation signals explained in the previous section.

3.2.1 Sensors

* Photoplethysmography (PPG) sensor: measures blood volume pulse (BVP), from which heart rate (HR), heart rate variability (HRV), and other cardiovascular features may be derived, such as an estimate of vasoconstriction and vasodilation.
  - Resolution: 0.9 nW/digit.
  - Sampling rate: 64 Hz.

* Electrodermal activity (EDA) sensor: measures skin conductivity, which is related to sympathetic nervous system arousal.
  - Resolution: ~900 picosiemens.
  - Range: 0.01 µS to 100 µS.
  - Sampling rate: 4 Hz.

* Infrared thermopile sensor: measures peripheral skin temperature.
  - Resolution: 0.02°C.
  - Accuracy: 0.2°C within the range 36°C - 39°C.
  - Sampling rate: 4 Hz.

* 3-axis accelerometer
  - Resolution: 8 bits.
  - Range: ±2g.
  - Sampling rate: 32 Hz.

Figure 3-3: Empatica E4 Wristband

3.2.2 Functioning Modes

This device can be used in two different modes, depending on the user's needs.

Recording mode

When the E4 is set to recording mode, it stores the data in its internal memory. It allows recording for up to 36 hours. We used this mode to collect data during the experimentation, since there was no need to analyze the data in real time. Once data has been recorded in this mode, the user can view and manage the data on a secure cloud platform, E4 Connect. Raw data can also be downloaded in CSV format for easy processing and analysis in third party applications. The data is secured with encryption and can be deleted after use. The data includes: Electrodermal Activity (EDA), also known as Galvanic Skin Response (GSR), Blood Volume Pulse (BVP), 3-axis Acceleration, Heart Rate (HR), and Skin Temperature.

Figure 3-4: E4 recording mode

Bluetooth Streaming mode

The streaming mode allows the user to view sensor data in real time, with a temporal resolution of 0.2 seconds, on the connected device. The data is automatically uploaded to E4 Connect after the session ends. This mode could be useful for eventual user monitoring and for real-time application of the thermal comfort sensing algorithm.

Figure 3-5: E4 streaming mode

3.3 First Experiments

We performed several tests to analyze how the biosignals described above relate to thermal sensations. The user was wearing the Empatica E4 Wristband during the experiments.

The tests were carried out at different room temperatures, which were classified into 3 different groups: hot room (> 30°C), neutral room (~ 25°C) or cold room (< 15°C). All possible room-switching combinations were considered in order to observe how the variable (wrist skin temperature) changes under different conditions. In general, the user remained in each room for 15 minutes, and answered the following thermal sensation survey every 3 minutes:

Choose one of the following numbers on the scale from +3 to -3 that best represents your thermal sensation right now:

+3 Hot
+2 Warm
+1 Slightly warm
0 Neutral
-1 Slightly cool
-2 Cool
-3 Cold

3.3.1 Results

Wrist Skin Temperature

Results showed that when the subject is hot, limb skin temperatures increase so as to increase the heat transfer with the environment, and when cold, limb skin temperatures decrease. This behavior can be seen in the data collected, and an example is shown below. In Figure 3-6 one can see the wrist skin temperature collected by the wristband (blue line), the thermal sensation reported by the subject in the survey (orange line) and the room type the subject is in (green line), as a function of time.


Figure 3-6: Plot of data obtained during one of the experiments.

Just with a quick glance at the plot, we can see that there is a correlation between thermal sensation and wrist skin temperature dynamics. In addition, we can see a hysteresis, or a delay in the effect of changing rooms, since the wrist skin temperature does not immediately respond to a change in environmental temperature.

Gradient of Wrist Skin Temperature

As explained above, what may be more relevant for thermal sensation estimation is the variation of the wrist skin temperature rather than the temperature itself, since skin temperature variation is related to active thermoregulation. Therefore, I computed the temperature gradient with a time difference of 3 minutes, and I plotted it as a function of the thermal sensation (see Figure 3-7).


Figure 3-7: Gradient of skin temperature vs. thermal sensation.

As seen, there is clearly a linear pattern that correlates both variables. It is important to note that it has a positive slope with an intercept of almost zero. This means that when there is a neutral thermal sensation, the gradient of wrist skin temperature is approximately zero, whereas when the thermal sensation is hot or cold, the gradient is positive or negative respectively. These results match our hypothesis of what we should expect based on the thermoregulation system, that is, when in a neutral thermal situation, there is no need to increase or decrease blood flow to redirect the situation, so skin temperature remains constant.

Vasomotion

Matlab code was written in order to obtain the amplitude of the PPG signal, which is indicative of the cutaneous vasoconstriction level of the limb. The code computes the smoothed envelope of the signal, as seen in Figure 3-8.
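The thesis code was written in Matlab; purely as an illustration, a comparable smoothed envelope could be obtained in Python as sketched below. The Hilbert-transform approach and the smoothing window length (about 5 seconds at the 64 Hz PPG sampling rate) are assumptions, not the author's exact method.

```python
import numpy as np
from scipy.signal import hilbert, savgol_filter

def ppg_envelope(ppg, smooth_window=321, polyorder=3):
    """Smoothed amplitude envelope of a PPG segment, used as a vasomotion proxy."""
    analytic = hilbert(ppg - np.mean(ppg))   # remove offset, build analytic signal
    envelope = np.abs(analytic)              # instantaneous amplitude
    return savgol_filter(envelope, smooth_window, polyorder)
```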


Figure 3-8: PPG signal and its envelope during 6 seconds of data approximately. The blue line represents the PPG signal measured by the wristband. The yellow and orange lines define the envelope of the PPG signal computed in Matlab.

Figure 3-9 shows the PPG signal and its envelope taking a much larger period of time, around 10 minutes.

As seen, the envelope goes through a noisy stage that should eventually be discarded. Figure 3-10 shows a non-noisy example of the dynamics we would expect to see regarding the relationship between limb cutaneous vasomotion and temperature. In orange we can see the vasomotion signal (PPG amplitude), to which skin temperature responds proportionally when it increases or decreases.


Figure 3-9: PPG signal and its envelope of 10 minutes worth of data.


Figure 3-10: Plot of skin temperature and vasomotion signal obtained from the PPG signal.

In Figure 3-11, we plot the vasomotion signal (PPG signal amplitude), the thermal sensation provided by the survey answers and the room type (hot, neutral or cold room), as a function of time. The vasomotion signal was smoothed and very noisy portions were removed. One can see a correlation between thermal sensation and vasoconstriction.


Figure 3-11: Vasomotion signal during the experiment explained above. Orange and green lines are thermal sensation and room type respectively.

Electrodermal Activity

EDA signals are also very noisy, because small arm movements can create artifacts that should be rejected. Therefore, a good artifact detection method is needed. An online tool for noise and EDA peak (SCR) detection was used [51, 52]. After removing some noise, smoothing the signal with a low-pass filter (1Hz) and using the online tool for peak (phasic EDA) detection, we obtain a plot as in Figure 3-12, where some SCR peaks can be seen.
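The exact tool used for artifact rejection and SCR detection is the one cited in [51, 52]; purely as an illustration, a comparable pipeline (1 Hz low-pass filter followed by peak picking) can be sketched as follows, where the prominence threshold is an assumed value.

```python
from scipy.signal import butter, filtfilt, find_peaks

def detect_scr_peaks(eda, fs=4.0, cutoff=1.0, min_prominence=0.01):
    """Low-pass filter an EDA trace (microsiemens) and mark candidate SCR peaks."""
    b, a = butter(2, cutoff / (fs / 2), btype="low")   # 1 Hz low-pass at fs = 4 Hz
    smoothed = filtfilt(b, a, eda)
    peaks, _ = find_peaks(smoothed, prominence=min_prominence)
    return smoothed, peaks
```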

In Figure 3-13 we plot the whole experiment, including the room types and the thermal sensations. We can see that many more EDA peaks took place during warmer thermal sensations than during cooler periods. This could be related to the fact that skin conductance responses (SCR) are generated by reflex activation of sweat glands via cholinergic sudomotor sympathetic fibers and that sympathetic vasodilator fibers also act via a mechanism that involves cholinergic cotransmission [47, 53].


Figure 3-12: Plot of EDA signal (blue). Vertical cyan lines mark the EDA peaks detected.


Figure 3-13: Plot of EDA signal (blue) during the experiment. Vertical cyan lines mark the EDA peaks detected.

In Figure 3-14 we plot the relative frequency (# peaks/time segment) of SCRs with respect to thermal sensation. As expected, we can see that the relative frequency of SCRs in warm/hot thermal sensations is much higher than in the other thermal conditions. Additionally, if we analyze the tonic part of the EDA signal (baseline) in Figure 3-13 we can see that the levels increase during warm thermal sensations and decrease during cooler thermal sensations.

38 0.8

0.7 0.6 0.5 0.4

0.3

0.2 0.1 0 Warm/Hot Neutral Cool/Cold

Figure 3-14: Relative frequency of EDA peaks vs. thermal sensation.

3.4 Machine Learning Approach

Due to the multi-dimensionality of this problem, it is easy to overlook possible variable relationships or data patterns that could provide useful information. Therefore, we implemented a simple machine-learning model to take advantage of these potential hidden patterns. Since our model needs to provide the thermal state/sensation of the user, it consists of a classification problem, where the algorithm output is a class (hot, cold or neutral) given an N-dimensional point, where N is the number of features considered for the problem, defined in the next section.

3.4.1 Feature Selection

From the 3 physiological signals provided by the wristband sensors, I needed to determine which features could be derived for the model. Based on the first experiments explained above, the six features considered are the following:

* Skin temperature.

* Gradient of skin temperature.

* Tonic EDA level.

* Phasic EDA (peaks).

* Vasomotion. As explained in the previous section, vasomotion is obtained from the amplitude of the PPG signal given by the wearable: a large PPG signal amplitude may imply vasodilation and a small amplitude suggests vasoconstriction. The first occurs when the user is warm and the latter when he or she is cold.

* Inter-Beat Interval (IBI). Since the PPG signal also provides information regarding heart rate, we can analyze how it varies depending on the thermal sensation. In particular, the inter-beat interval (IBI) is the time between two consecutive heartbeats. Therefore, a low IBI means a high heart rate, whereas a high IBI corresponds to a slow heart rate. The experiments performed showed that, in general, when a person is feeling hot or cold, his/her heart rate is higher or lower respectively. Consequently, a high or low IBI may imply a cold or hot state respectively (see the next section for graphical examples).

3.4.2 Time Windows

A relevant parameter to fix is the frequency at which the thermal comfort sensing system needs to output a result (the thermal state of the user), and how long the system needs to analyze the data in order to provide a result with confidence. The first depends on the mobility of the user (a system with a static or sedentary user will not need to update the thermal state very frequently) and/or on the variability of the environmental conditions. The second is related to the variability of the physiological signals described above, and to the span over which the system needs to analyze them in order to capture changes over time.

There is no strategy to know what the optimum time frames are, but based on our experimentation, time windows of 5 minutes with 30-second increments seemed to give the best results. Therefore, the system needs to analyze 5 minutes of data to extract a result, and it will give an output every 30 seconds. This scheme also provides enough datapoints for the machine learning approach (see the next section), since the nature of the project (human experimentation) does not allow us to have a vast amount of data.

3.4.3 Datapoint Extraction

From the 5 minutes' worth of data, we need to extract the information related to the 6 variables explained above. Therefore, a 6-dimensional datapoint will be obtained from each 5-minute window, and it will be output once every 30 seconds. The 6 dimensions of these points are derived as follows, always referring to the 5-minute window data:

1. Mean skin temperature:

   V1 = (1/N) Σ_{i=1}^{N} T_i    (3.1)

   where T_i is the ith temperature sample given by the wristband sensor in the current time window.

2. Gradient of skin temperature:

   V2 = T_N - T_1    (3.2)

3. Mean of vasomotion signal (PPG amplitude):

   V3 = (1/K) Σ_{i=1}^{K} A_i    (3.3)

   where A_i is the ith amplitude value of the PPG signal given by the wristband sensor in the current time window.

4. Mean of IBI:

   V4 = (1/M) Σ_{i=1}^{M} IBI_i    (3.4)

   where IBI_i is the ith inter-beat interval sample provided by the Empatica E4 algorithm in the current time window. The Empatica algorithm computes the IBI from the PPG signal and directly provides the values.

5. Mean of EDA signal (tonic):

   V5 = (1/L) Σ_{i=1}^{L} EDA_i    (3.5)

   where EDA_i is the ith sample of the EDA signal given by the wristband sensor in the current time window.

6. Number of EDA peaks in the time window:

   V6 = #EDA_peaks    (3.6)

   obtained using the tool in [52].
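Putting Equations 3.1-3.6 together, a hypothetical helper that turns one 5-minute window of wristband data into a 6-dimensional datapoint could look like the sketch below; the argument names and constants are ours, and the arrays are assumed to be already cut to the current window.

```python
import numpy as np

WINDOW_SECONDS, STEP_SECONDS = 300, 30   # 5-minute windows, 30-second increments

def extract_datapoint(temp, vaso, ibi, eda, n_scr_peaks):
    """6-D feature vector for one window (Eqs. 3.1-3.6)."""
    return np.array([
        np.mean(temp),          # V1: mean skin temperature
        temp[-1] - temp[0],     # V2: gradient of skin temperature over the window
        np.mean(vaso),          # V3: mean vasomotion (PPG amplitude)
        np.mean(ibi),           # V4: mean inter-beat interval
        np.mean(eda),           # V5: tonic EDA level
        float(n_scr_peaks),     # V6: number of SCR peaks in the window
    ])
```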

3.4.4 Single-User Model

I first considered the development of a single-user model in order to study the feasibility of using machine-learning techniques for this problem. A few hours' worth of data was collected in the three thermal sensation conditions: neutral, hot and cold. The only user was always in a resting state, usually sitting on an office chair.

Figure 3-15 shows how a machine-learning algorithm will be able to exploit data patterns, since the datapoints belonging to different classes cluster differently and are not spread randomly in the 2-D plane. This plot also shows how these variables change depending on the thermal sensation. One can see how the gradient of temperature is mostly positive (skin temperature increasing) in the hot class and mostly negative (skin temperature decreasing) in the cold class, whereas the datapoints are more spread in the neutral state. Regarding the mean IBI, we can clearly identify that in the cold state the IBI is generally larger (slower heart rate) than in the hot state (faster heart rate).

42 1.4


Figure 3-15: Mean IBI vs. temperature gradient point scatter for the three different classes. It shows how the distinct class datapoints cluster differently.

Figure 3-16 shows a 2-D plot of the vasomotion mean versus the vasomotion standard deviation. It clearly shows how blood flow is greater during the hot state.


Figure 3-16: Mean vaso vs. std vaso point scatter for the three different classes. Vasodilation and vasoconstriction in hot and cold states is clearly reflected.

All data was processed as explained above and the variables were derived to be used as the input of the machine-learning algorithm. Over 1000 6-dimensional datapoints were used. After analyzing different classification algorithms, it was determined that bagged trees was the method that showed the best results for this problem. In order to avoid overfitting, k-fold cross validation was used during training, with k=5. Results for the single-user model were very good, with an accuracy of 95%.
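A minimal scikit-learn sketch of this setup, bagged decision trees evaluated with 5-fold cross-validation; the number of trees and the placeholder data are assumptions, since the exact hyperparameters are not reported here.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import cross_val_score

# Placeholder data with the shape of the real feature set:
# ~1000 six-dimensional datapoints labelled cold/neutral/hot.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 6))
y = rng.choice(["cold", "neutral", "hot"], size=1000)

model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=30)
print(cross_val_score(model, X, y, cv=5).mean())   # mean 5-fold accuracy
```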

These results allow us to reach a very important conclusion: a model trained with data from a user will perform very well when used with that same person.

3.4.5 Multi-User Model

Having seen the good results for the single-user model, the new objective was to develop a multi-user model, which could be generalized to any user without needing to be trained on each user's data. Therefore, some features had to be redefined.

Features Redefinition

Based on experiments with different subjects, I discerned that there are natural differences in some features between users, even if they are in the same thermal state:

* Mean skin temperature: The mean can change considerably between people.

* Mean of vasomotion signal: PPG light absorbance changes with skin color: a darker skin will not absorb as much light as a light skin, and consequently the PPG signal will have a higher amplitude, with higher values of the vasomotion signal.

* Mean IBI: Heart rate in the resting state is different between people.

Therefore, I needed to redefine some variables to take into account these natural differences between people. I considered the values in the neutral thermal state to be each user's baseline, which is subtracted from the previous features to capture the difference from the neutral state:

* Instead of the mean skin temperature, I used the difference between the mean of the current temperature and the mean temperature in the neutral state:

  V1' = (1/N) Σ_{i=1}^{N} T_i - T_neutral = V1 - T_neutral    (3.7)

  where T_neutral is the mean skin temperature previously computed from neutral data of that user.

* Instead of the mean of the vasomotion signal, I used the difference between the mean of the current vasomotion signal and the mean vasomotion in the neutral state:

  V3' = (1/K) Σ_{i=1}^{K} A_i - A_neutral = V3 - A_neutral    (3.8)

  where A_neutral is the mean vasomotion previously computed from neutral data of that user.

* Instead of the mean IBI, I used the difference between the mean of the current IBI and the mean IBI in the neutral state:

  V4' = (1/M) Σ_{i=1}^{M} IBI_i - IBI_neutral = V4 - IBI_neutral    (3.9)

  where IBI_neutral is the mean IBI previously computed from neutral data of that user.
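A small sketch of this baseline normalization, assuming the feature ordering of Section 3.4.3 (indices 0, 2 and 3 hold the mean temperature, mean vasomotion and mean IBI); the function and argument names are ours.

```python
import numpy as np

def subtract_neutral_baseline(features, neutral_features):
    """Apply Eqs. 3.7-3.9: subtract the user's neutral-state means from the
    mean-temperature, mean-vasomotion and mean-IBI columns."""
    out = np.array(features, dtype=float)
    baseline = np.asarray(neutral_features, dtype=float).mean(axis=0)
    for i in (0, 2, 3):          # V1, V3 and V4 in the 6-D feature vector
        out[:, i] -= baseline[i]
    return out
```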

Experiments Description

For the multi-user model experiments I used 3 different types of rooms, where the user was asked to spend time sitting in an office chair and doing office work (reading, using a laptop or writing): a neutral room, a hot room and a cold room. The experiments were conducted as follows:

1. Start in the neutral room (15 min). The user needed to agree that the room temperature was comfortable for him/her. If not, the temperature was adjusted until comfort was achieved. During this period, baseline data was collected, which would be used as the neutral state data to compute the differences explained above.

2. Change to the hot or cold room. The subject was asked to notify us when he or she would like the A/C or heating to stop because of feeling uncomfortable. Once the user notified us, data collected from then on was considered to belong to the hot or cold class respectively. This also provided us with information regarding when a smart A/C or heating system would need to respond to the user's state.

3. The subject remained in the uncomfortable state for 45 minutes approximately.

Results

In order to prove the generalizability of the model, the algorithm was trained with the data from N-1 users and tested with the Nth, where N was the total number of users. The method used was once again bagged trees. The following are the accuracies of the model trained with N-1 users and tested with the one left out (3-class model):

0.6727, 0.9070, 0.7667, 0.8000, 0.9615, 1.0000, 1.0000, 0.9146, 0.4762, 0.9479, 0.9333, 1.0000, 1.0000, 0.9863, 0.6622, 0.9750, 0.9375, 1.0000, 1.0000, 1.0000

Average: 89.7%.

We can see that, with the exception of 3 users, the results are relatively good, proving that the multi-user model is feasible. The lowest accuracy (47%) corresponds to a user who took the experiment after a period of exercise, which may have affected the results, since heart rate and body temperature may not have settled yet.

Overall, the algorithm works well and, even for the 3 users mentioned, the system is always well above the 33% accuracy of a random decision (3-class problem).
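The evaluation described above corresponds to leave-one-subject-out cross-validation; a scikit-learn sketch is shown below, where the placeholder arrays and the groups label (our name) stand in for the real per-user data.

```python
import numpy as np
from sklearn.ensemble import BaggingClassifier
from sklearn.tree import DecisionTreeClassifier
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

# Placeholder arrays: X holds baseline-normalized features, y the thermal class,
# and groups[i] identifies which of the 20 users produced datapoint i.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 6))
y = rng.choice(["cold", "neutral", "hot"], size=2000)
groups = rng.integers(0, 20, size=2000)

model = BaggingClassifier(DecisionTreeClassifier(), n_estimators=30)
per_user_acc = cross_val_score(model, X, y, cv=LeaveOneGroupOut(), groups=groups)
print(per_user_acc)          # one accuracy per held-out user
print(per_user_acc.mean())
```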

Chapter 4

A Brain-Computer Interface for Virtual Reality

4.1 Sublime

Sublime is a new concept of Steady-State Visually Evoked Potentials (SSVEP) based BCI that allows the user to unconsciously communicate subliminal information to the computer in a virtual environment. This can happen under two conditions:

1. The visual stimuli frequencies must exceed the flicker fusion rate, i.e., the frequency at which a flashing light appears to be steady [54, 55]. SSVEPs can be generated up to 90Hz and the flicker fusion rate is around 40Hz, so there exists a frequency range that allows for the stimulation and consequent generation of SSVEPs even though the user perceives a steady light [56]. As explained above, the highest SSVEP amplitudes are obtained in the 6Hz-16Hz band, so I create these unobtrusive stimuli at the expense of lower SSVEP power and most likely longer detection times.

2. The flickering stimuli take the form of virtual objects in the scene rather than inserted fiducials such as rectangles or arrows. In other words, if the scene contains a chair, the chair itself must flicker. The full screen can be partitioned into several of these stimulating elements, which are integrated as part of the virtual scene. Then, different flickering frequencies are used to encode object IDs. This can be implemented thanks to the fact that even though multiple stimuli fall within the visual field of the user, only the one that receives the attention focus of the subject will elicit the corresponding SSVEP response [31, 57].

For Sublime to come alive, both conditions need to be fulfilled. I tested this in a VR movie-watching environment. The user can navigate between the application menus by looking at the interactive objects and choose the movie to watch just by gazing at the corresponding cover.

4.2 Materials and Methods

4.2.1 Virtual Reality Display Device

For the purposes of this work, a display with a high refresh rate was essential, since I needed to render stimuli above the flicker fusion rate. I used an Oculus Rift VR headset [58], which has a Refresh Rate (RR) of 90Hz and allows me to generate stimulating signals at frequencies up to 45Hz, i.e. half the RR.

4.2.2 Visual Stimuli Generation

Stimuli signals generated by the VR display will always be constrained by its Refresh Rate (RR), since rendered images can only be updated once every 1/RR seconds (1/90 s in this case). Therefore, an 'on/off' stimulation pattern (maximum/minimum screen luminance respectively) could only generate sub-multiples of the refresh rate. In that case, stimuli signal frequencies would quickly drop below the flicker fusion rate, which is an undesired scenario for the aims of this work. Hence, this study used a sampled sinusoidal stimulation method [59, 60], which allows me to realize stimuli signals at any frequency up to half of the refresh rate. These signals can be generated by modulating the luminance L(f_st, k) of the display screen using the following expression:

L(f_st, k) = 0.5 sin(2π f_st (k/RR)) + 0.5    (4.1)

where f_st is the flickering frequency of the visual stimulus, k is an integer that indicates the frame index in the sequence and RR corresponds to the refresh rate of the display. L(f_st, k) represents a sampled sine of frequency f_st with a sampling rate of RR Hz. The dynamic range of the signal is from 0 to 1, where 0 represents dark and 1 maximum luminance.

Since all stimuli frequencies must be higher than the flicker fusion rate and lower than half of the refresh rate, and considering the flicker fusion rate to be 40Hz [54], the targeted frequency band for this work is 40Hz - 45Hz.
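A short sketch of the per-frame luminance sequence defined by Equation 4.1 for a single stimulus; the function name, the frame count and the beat-frequency comment are ours.

```python
import numpy as np

def stimulus_luminance(f_st, n_frames, rr=90.0, amplitude=0.5, offset=0.5):
    """Per-frame luminance of a sampled sinusoidal stimulus (Eq. 4.1)."""
    k = np.arange(n_frames)
    return amplitude * np.sin(2 * np.pi * f_st * k / rr) + offset

# One second of frames for a 44 Hz stimulus on a 90 Hz display; because 44 Hz is
# close to the Nyquist rate, the sampled values show an apparent 2 Hz beat
# (see Section 4.2.3).
lum = stimulus_luminance(44, n_frames=90)
```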

4.2.3 Beating effect

Sampling a wave at a frequency very close to the Nyquist rate causes the emergence of apparent beats. These beats oscillate at a frequency f_beat:

f_beat = 2 (F_s/2 - f_st)    (4.2)

where F_s is the sampling frequency and f_st is the frequency of the sampled signal.

Since in this study F_s = 90Hz, the generated beats oscillate at f_beat = 90 - 2 f_st; for a 44Hz stimulus, for example, the beat appears at 2Hz. The apparent beating effect for a 44Hz sine is shown in Figure 4-1.

Figure 4-1: The blue line shows a 44Hz stimulating sine wave sampled at RR = 90Hz. The red dashed line shows the beating effect that will be perceived.

This sampling effect translates into an undesired perception of a slowly oscillating (f_beat) luminance. In order to minimize the beating effect, a stimulating sine signal with lower amplitude is also utilized:

L2(f_st, k) = 0.3 sin(2π f_st (k/RR)) + 0.7    (4.3)

In this case, the luminance L2(f_st, k) is modulated as a sine wave with an amplitude of 0.3, so the signal ranges from 0.4 to 1. This generates smaller-amplitude beats, and the fading effect is barely perceived, complying with the purpose of this work of creating perceptually steady stimuli. However, the beating effect disappears at the expense of a power reduction of the stimulus and consequently of the SSVEPs. This may cause the SSVEPs to be buried in noise and lead to longer SSVEP measurement times, as can be seen in Section 4.9. As a consequence, a compromise needs to be found between the signal power and the perceivable beating effect.

Without reducing the signal amplitude, the beating effect will be perceivable as long as f_beat < f_fusion. Solving for the refresh rate of the display and assuming f_fusion = 40Hz, this leads to RR < 2 f_st + 40Hz. Therefore, the minimum refresh rate that allows for maximized amplitudes of the visual stimuli while preserving their imperceptibility is given by

RR_min ≥ 2 f_st + 40Hz    (4.4)

The VR headset available for this study has a refresh rate of 90 Hz, which does not fulfill Equation 4.4 for stimulation frequencies f_st higher than f_fusion. Thus, amplitude reduction of the stimulus signals is required to prevent the user from noticing the beating effect. This affects SSVEP amplitudes and argues for higher refresh rate displays in the future.
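
To make Equations 4.2 and 4.4 concrete, the short computation below (an illustrative sketch; variable names are mine) evaluates the apparent beat frequency at RR = 90 Hz and the minimum refresh rate required for each stimulation frequency used in this work:

```python
refresh_rate = 90.0      # Hz, Oculus Rift display
f_fusion = 40.0          # Hz, assumed flicker fusion rate

for f_st in (42.0, 43.0, 44.0, 45.0):
    f_beat = 2 * (refresh_rate / 2 - f_st)   # Eq. 4.2: apparent beat frequency
    rr_min = 2 * f_st + f_fusion             # Eq. 4.4: refresh rate for full-amplitude stimuli
    print(f"f_st = {f_st:.0f} Hz -> f_beat = {f_beat:.0f} Hz, RR_min = {rr_min:.0f} Hz")

# The beats range from 0 to 6 Hz (well below the fusion rate, hence visible),
# while full-amplitude stimuli would require refresh rates of 124-130 Hz.
```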

4.3 Virtual Reality Application

4.3.1 Main Menu

The virtual environment I present to prove the concept of the system was developed in Unity3D [61] and consists of two different types of scenes. The starting one is a main menu with four different movie covers that the user can select. Each movie cover has a different stimulation signal associated with it, with frequencies f_st = {42, 43, 44, 45} Hz. The user can pick the movie to watch by simply gazing at the desired cover, without the need for any joystick or hand controller. The corresponding SSVEP signals will be detected by the EEG electrodes and the application will transition to the selected movie playback scene.

Figure 4-2: Screenshot of the movie covers the user sees in the main menu.

4.3.2 Movie Playback Menu

This scene plays the movie the user selected in the main menu. In the bottom-left corner there is a selectable object that allows the user to navigate back to the main menu. It is also an integrated visual stimulus, with an associated frequency f_st = 42 Hz, and the user may look at it to stop the movie and switch back to the starting menu.

Figure 4-3: Star Wars Episode VIII playing in the movie menu.

4.3.3 Real-time Feedback

In order to inform users that the system is responsive to their intentions, I included real-time feedback in the form of loading bars below each selectable object. When the system detects that the user is gazing at an object, the corresponding bar starts loading; it takes 4 seconds to fill completely and trigger the transition to the selected scene. If the user stops gazing at the loading object, the bar automatically resets and the scene transition is suspended.

Figure 4-4: Loading bars for two different selectable objects. Figure 4-4b shows the flickering object that allows the user to return to the main menu.

4.4 EEG Recording Equipment

Aligning with the goals of this work to build a non-invasive system and to keep the user unaware of the brain-computer communication taking place, I wanted to avoid the bulkiness of an EEG electrode cap. Instead, I directly embedded EEG electrodes in the VR headset. The Oculus Rift has a triangle-shaped back strap that cradles the back of the head and is equipped with part of the headset tracking system. This shape allowed us to locate three EEG electrodes at positions Pz, O1 and O2 according to the international 10-20 system [62]. Reference and ground electrodes were placed on the earlobes with ear clips. As for the EEG recording equipment, I used OpenBCI's Ganglion board, which allows for Bluetooth Low Energy data transmission and has a sampling rate of 200 Hz [63].

4.5 SSVEP Detection

4.5.1 Canonical Correlation Analysis

In order to pick up the elicited brain signals, I utilized canonical correlation analysis (CCA), a widely used method for SSVEP detection [64, 65, 66]. Given two variable sets X, Y and their linear combinations x = X W_x^T, y = Y W_y^T, CCA finds the weight vectors W_x and W_y which maximize the correlation ρ between x and y by solving the following:

\max_{W_x, W_y} \rho = \frac{E\left[W_x X^T Y W_y^T\right]}{\sqrt{E\left[W_x X^T X W_x^T\right]\, E\left[W_y Y^T Y W_y^T\right]}}    (4.5)

In this approach I define X ∈ R^{L×N} as the recorded EEG data, where L and N denote the number of samples of the EEG signal and the number of EEG channels, respectively. Since I use three electrodes (N = 3), the matrix X takes the form:

X = \left[X_1 \;\; X_2 \;\; X_3\right]    (4.6)

where X_i ∈ R^{L×1}. On the other hand, I define Y_f ∈ R^{L×2H} as the artificially generated sinusoidal signals of frequency f and its multiples used as the reference, where H represents the number of harmonics. Note that the duration of the signals X and Y_f must be the same. Each submatrix of Y_f contains the pair cos(2πhft) and sin(2πhft), where h = 1, 2, ..., H. Several studies have shown that the amplitude of the SSVEP (fundamental h = 1 and harmonics h > 1) resulting from high stimulation frequencies (> 40 Hz) is considerably lower than in low-frequency SSVEP [29, 39]. Therefore, in this approach I only considered the fundamental frequency (H = 1) for the reference signals Y_f, giving:

Y_f = \left[\cos(2\pi f t) \;\; \sin(2\pi f t)\right]    (4.7)

Applying CCA to X and Y_f, the correlation will be maximized by enhancing the part of the evoked response (present in Y_f) and by reducing the noise (not present in Y_f), thereby improving the signal-to-noise ratio of the filtered signal x.

Since I have four possible stimulation frequencies (f_st = {42, 43, 44, 45} Hz), I define four different reference signal matrices Y_f: Y_42, Y_43, Y_44, Y_45. Then, I obtain a correlation value ρ_f for each pair (Y_f, X) and identify the targeted frequency as the one that gives the highest ρ_f. For every data segment, I can define a correlation vector R ∈ R^4:

R = \left[\rho_{42} \;\; \rho_{43} \;\; \rho_{44} \;\; \rho_{45}\right]    (4.8)

which contains the four correlation values resulting from applying CCA four times.
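
A hedged sketch of this detection stage is shown below. The thesis implementation was written in Matlab; this Python version with scikit-learn's CCA is only an illustrative reconstruction under the stated choices (H = 1, three EEG channels, 200 Hz sampling rate), and all function names are mine.

```python
import numpy as np
from sklearn.cross_decomposition import CCA

FS = 200.0                          # EEG sampling rate (Hz)
STIM_FREQS = [42.0, 43.0, 44.0, 45.0]

def reference_signals(f, n_samples, fs=FS):
    """Build Y_f = [cos(2*pi*f*t)  sin(2*pi*f*t)] (Eq. 4.7, H = 1)."""
    t = np.arange(n_samples) / fs
    return np.column_stack([np.cos(2 * np.pi * f * t),
                            np.sin(2 * np.pi * f * t)])

def cca_correlation(X, Y):
    """Maximum canonical correlation between an EEG segment X (L x 3) and Y_f."""
    x_scores, y_scores = CCA(n_components=1).fit_transform(X, Y)
    return np.corrcoef(x_scores[:, 0], y_scores[:, 0])[0, 1]

def correlation_vector(X):
    """R = [rho_42, rho_43, rho_44, rho_45] for one filtered EEG segment."""
    n = X.shape[0]
    return np.array([cca_correlation(X, reference_signals(f, n))
                     for f in STIM_FREQS])
```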

4.5.2 Logistic Regression

In order to discriminate whether an elicited SSVEP is present in the recorded EEG signals, I need to define the condition that the maximum correlation ρ_{f_max} needs to fulfill with respect to the other three ρ_f. Since it is not simple to find this condition empirically, I used logistic regression as a simple binary classification model, which takes as input the following features derived from the correlation vector R:

• Standard deviation of the correlations in R.   (4.9)

• Difference between the maximum correlation ρ_{f_max} and the second-highest one.

• Difference between the maximum correlation ρ_{f_max} and the sum of the other three.

Taking these three features as input, the binary model outputs a positive or negative answer. If positive, the system considers that the recorded EEG signal contains an elicited SSVEP, and the targeted frequency will be the one corresponding to the maximum correlation ρ_f. In the case of a negative response, this approach assumes that the EEG contains no SSVEP and that the user is not looking at any stimulus.
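
The following sketch illustrates how the three features and the binary decision could be computed (illustrative Python with scikit-learn rather than the Matlab code actually used; the placeholder training data and all names are assumptions):

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def features_from_R(R):
    """Three features derived from the correlation vector R (Section 4.5.2)."""
    r = np.sort(R)[::-1]                     # correlations in descending order
    return np.array([
        np.std(R),                           # standard deviation of correlations
        r[0] - r[1],                         # max correlation minus second highest
        r[0] - np.sum(r[1:]),                # max correlation minus sum of the other three
    ])

# Placeholder training set: correlation vectors recorded offline and labels
# indicating whether an SSVEP was actually present (1) or not (0).
R_train = np.random.rand(20, 4)
labels_train = np.array([1, 0] * 10)
clf = LogisticRegression().fit(
    np.array([features_from_R(R) for R in R_train]), labels_train)

def detect(R):
    """Return the index of the attended stimulus, or None if no SSVEP is detected."""
    if clf.predict(features_from_R(R).reshape(1, -1))[0] == 1:
        return int(np.argmax(R))
    return None
```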

4.6 Real-time Data Processing

Recorded EEG signals are processed in 5-second time windows with 4 seconds of overlap, so that the system updates every second. Each 5-second segment of raw EEG is first band-pass filtered with a finite impulse response filter of order 10 and cutoff frequencies of 10 Hz and 50 Hz. Then, CCA is applied to the filtered data segment and each reference signal matrix Y_f, with which the vector of correlations R is obtained. Finally, the trained logistic regression classification model is applied to the features computed from R and a system output is given.
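
The pre-processing just described can be sketched as follows (illustrative Python using SciPy rather than the Matlab implementation; any filter design detail beyond "FIR, order 10, cutoffs 10 Hz and 50 Hz" is an assumption):

```python
import numpy as np
from scipy.signal import firwin, lfilter

FS = 200.0                                   # EEG sampling rate (Hz)
WINDOW = int(5 * FS)                         # 5-second analysis window
STEP = int(1 * FS)                           # 1-second hop, i.e. 4 s of overlap

# FIR band-pass filter of order 10 (11 taps) with 10 Hz and 50 Hz cutoffs.
b = firwin(11, [10.0, 50.0], pass_zero=False, fs=FS)

def sliding_windows(eeg):
    """Yield band-pass-filtered 5-second segments of an (n_samples x 3) EEG array."""
    filtered = lfilter(b, 1.0, eeg, axis=0)
    for start in range(0, filtered.shape[0] - WINDOW + 1, STEP):
        yield filtered[start:start + WINDOW]
```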

In this approach I prioritize minimizing false positives, since they can be annoying for users in an application where the human-computer communication happens in the "background". Thus, the state of the VR application is not changed until two equal consecutive outputs are given by the SSVEP detection algorithm. This applies both to starting and to suspending the loading process of a gazed object.
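
The "two equal consecutive outputs" rule amounts to a small piece of state, sketched here for illustration (the actual logic lives in the Matlab/Unity code; class and method names are mine):

```python
class Debouncer:
    """Forward a detector output only after it has been seen twice in a row."""

    def __init__(self):
        self.last = object()        # sentinel that never equals a real output

    def update(self, output):
        """output: stimulus index (0-3) or None for 'no SSVEP detected'.

        Returns (confirmed, output): confirmed is True once the same output has
        arrived twice consecutively, which is when the VR application is allowed
        to start or suspend the loading of a gazed object."""
        confirmed = (output == self.last)
        self.last = output
        return confirmed, output
```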

4.7 System Configuration

The different system blocks are connected as follows:

" The Oculus Rift VR headset is connected to a computer running the Unity3D application.

" EEG data recorded by OpenBCI's Ganglion board is transmitted through Blue- tooth Low Energy (BLE) to a computer running Matlab [671.

" Matlab code takes care of the real-time EEG signal processing and SSVEP detection.

* Matlab is connected to the Unity3D application via TCP/IP and sends updated information when applicable.

Figure 4-5 shows a scheme of the explained system configuration.

Figure 4-5: System blocks configuration.
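
For illustration only, a minimal Python stand-in for the server side of this link is sketched below; the actual system used Matlab's TCP/IP facilities, and the port number and message format here are assumptions.

```python
import socket

HOST, PORT = "127.0.0.1", 5005          # assumed address; Unity connects as a TCP client

def serve_decisions(decisions):
    """Send each SSVEP decision (e.g. 'SELECT 2' or 'NONE') as a newline-terminated string."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _ = srv.accept()          # wait for the VR application to connect
        with conn:
            for decision in decisions:
                conn.sendall((decision + "\n").encode("ascii"))
```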

4.8 Experiments

4.8.1 Subjects

Three participants aged between 20 and 30 years volunteered for the experiment. Even though the stimulation frequencies were much higher than those associated with seizures, I made sure that participants had not had episodes in the past. All participants signed an informed consent form, and the whole experimental procedure was approved by the MIT Committee on the Use of Humans as Experimental Subjects (COUHES). The subjects were seated in an armchair behind an office table and wore the VR headset during the whole experiment.

4.8.2 Experiment 1: Navigation Time

To assess the different aspects that characterize this work, the experiments were divided into two types. The first consisted of measuring the time it took each user to complete a predefined navigation task, so that the Information Transfer Rate (ITR) could be computed. The second allowed for a subjective evaluation of the technology, where the user could freely navigate within the application and report his or her experience.

In the first experiment, the user had to navigate to each of the four movie playback menus, returning to the main menu in between. In order to evaluate the effect of reducing the amplitudes of the stimuli to eliminate the beating perception, each user completed this task twice: first using stimuli with maximum amplitudes L(·), and second using reduced-amplitude stimuli L_2(·) (see Equation 4.1 and Equation 4.3, respectively).

4.8.3 Experiment 2: Subjective Experience

This experiment consisted of letting the subjects interact freely with the application and answer a survey at the end, which is a common way of measuring subjective system performance [68]. Again, this task was completed twice, one time with each type of stimulation signal. The questions asked in the survey were the following:

1. "How easy was it to navigate between menus?"

2. "How would you quantify the perception of flickering, if any"

3. "How would you rate the overall experience of using this system?"

All ratings had to be given on a scale from 0 to 5.

4.9 Results

Results for Experiments 1 and 2 can be found in Tables 4.1 and 4.2, respectively. Navigation time corresponds to the time the subjects spent completing the task in Experiment 1. The Information Transfer Rate (ITR) was measured in bits per minute [69]. Each navigation task consisted of detecting the correct stimulus 8 times (4 movies and 4 menu returns), so ITR = 8 × log_2(4) × 60/T, where T corresponds to the navigation time minus the loading time of all target objects, that is, 4 s × 8 = 32 seconds.

There were no misclassifications during the experiments, so the accuracy of the system was 100%. This high accuracy is partly due to the fact that I aimed to minimize false positives, at the expense of response time and ITR. As explained above, this was accomplished by requiring two equal classifications in a row to start loading an object.

Table 4.1: Results for Experiment 1

Stimulation signal    Metric                   S1     S2     S3     Average
L(·)                  Navigation time (sec)    57     63     64     61.3
L(·)                  ITR (bits/min)           16.8   15.2   15     15.7
L_2(·)                Navigation time (sec)    64     75     73     70.7
L_2(·)                ITR (bits/min)           15     12.8   13.1   13.6

Table 4.2: Results for Experiment 2

Stimulation signal    Question                         S1    S2    S3    Average
L(·)                  1) How easy to use?              5     4     4     4.3
L(·)                  2) Perception of flickering?     5     5     5     5
L(·)                  3) Overall experience?           3     3     4     3.3
L_2(·)                1) How easy to use?              4     2     3     3
L_2(·)                2) Perception of flickering?     1     1     1     1
L_2(·)                3) Overall experience?           4     3     5     4

As expected, we can see that navigation times using full-amplitude stimuli L(·) are shorter than using L_2(·), and therefore the ITR is higher. This is closely related to the fact that, on average, users found the first approach easier to use, since it gives faster responses.

Regarding the question on the perception of flickering, the highest values correspond to the approach with L(·), due to the beating effect. The fact that the lowest values of flickering perception are given to the L_2(·) stimuli shows that the proposed solution to minimize the beating effect is valid.

Finally, the ratings for the overall experience question are higher for L_2(·), that is, for the non-beating stimuli approach. This shows that even though subjects found the first approach easier to use, they still preferred the experience of an unobtrusive BCI with imperceptible flickering stimuli.

4.10 Discussion

Results show that the system was tested successfully, with 100% accuracy and a good overall experience for the non-beating stimulation signals. The ITRs obtained are in general lower than the average ITRs of SSVEP-based BCIs found in the literature [34]. This is due to the use of high-frequency stimuli (as opposed to the vast majority of other works, which use low-frequency stimuli), as well as to the fact that I prioritize accuracy over response time, waiting for two equal classifications in a row to activate a target and using long (5-second) detection windows. Despite this, the fact that subjects preferred the experience of the application with non-perceivable signals, even though the latency is greater, strongly endorses the potential of Sublime.

The amplitude reduction of stimulation signals to generate non-perceivable flickering objects reduces the power of the elicited SSVEP and therefore increases detection time, as shown. A higher refresh rate (see Equation 4.4) would allow for full-amplitude stimulation signals and an increased ITR. There already exist 120 Hz VR displays, but it is believed that much higher refresh rates will be needed in the future to achieve a fully comfortable Virtual Reality experience. That is where Sublime will have most applicability.

The fact that stimuli can be integrated into any part of the virtual scene and are relatively imperceptible to the user enables Sublime to be used not only as a navigation tool, but also as a way to interact with the virtual world, to change the gaming experience, or simply to detect where the user's attention is. As opposed to other BCIs, SSVEP-based BCIs require negligible training time for the user, which also increases the applicability of this system.

I have also placed emphasis on the importance of Sublime being unobtrusive in the virtual scene, but the loading bars added in this demo may be undesired in some scenarios. Therefore, instead of using loading bars, I believe that user feedback could be provided by changing the color or the shape of the virtual elements, for instance.

In this work I have assumed that the flicker fusion rate is 40 Hz, but there is extensive literature showing that this rate may be influenced by stimulus color, luminance, background, shape, the subject's age and even retinal eccentricity, among other factors [70, 71, 72, 73]. Therefore, I believe the flicker fusion rate should be addressed and further characterized in future work for eventual applications of this technology.

Chapter 5

Conclusions and Future Work

This thesis has presented two different approaches to take advantage of human biosignals so as to connect ourselves to external smart devices. It has been shown that a wearable wristband monitoring temperature, blood volume pressure and galvanic skin response is capable of providing information about the thermal comfort level of the person. This could eventually be used to provide valuable data to a smart A/C system to automatically adjust the room environment to user preferences. On the other hand, brain waves have been used to develop a non-obtrusive virtual reality controller that allows for hands-free menu navigation in the virtual world. Both projects demonstrate that a more natural connection between us and our smart surroundings is possible.

5.1 Sensing Thermal Comfort

In the first part of this thesis I developed a system to estimate the thermal comfort level of a person by monitoring his or her biosignals using a wearable device. Two different models were developed: a single-user one optimized for a particular person, and a more generalized one which could be used for an external subject that the model had not seen before. The first performed with high accuracy (95%) and used six features derived from the three physiological signals measured by the wearable: skin temperature, gradient of skin temperature, vasodilation, inter-beat interval times, skin conductivity and number of phasic peaks of electrodermal activity. In the second model, changes in the thermoregulation signals with respect to their baseline in the neutral state were used to capture comfort information, hence new features based on these differences were defined. The model was trained with N-1 users and tested with the one left apart, and it performed satisfactorily for most of the users, reaching an average accuracy of 89%. Both models were only developed for data obtained from experiments where users were in a resting state, sitting and doing office work.

5.2 Brain Waves in Virtual Reality

In the last part of this work I presented Sublime, an SSVEP-based brain-computer interface in which the user unconsciously conveys subliminal information to a computer thanks to high-frequency (and therefore non-perceivable) stimulation signals integrated in the virtual scene. The system has been successfully tested as a menu navigation tool in a Virtual Reality environment. The fixed 90 Hz refresh rate of the VR display causes undesired beats that were perceived by the user when maximum-amplitude signals were used, so I proposed to utilize signals with reduced amplitude to minimize this effect. Experiments with three subjects have shown that although the response time is longer with the non-beating stimulation signals, all users reported a better overall experience when using them. I conclude that this is not only because these signals do not cause visual fatigue and are not annoying after a while, but also because of the positive user experience when the system responds to the user's gaze at what appear to be steady virtual objects. The future of VR displays seems to be pointing towards higher refresh rates, which would allow the use of non-beating stimulation signals with maximum amplitudes and therefore reduce the system's response time. Since Sublime is unobtrusive to the virtual scenes (i.e. virtual scenes with or without active stimuli have the same appearance for the user), I believe that potential applications of this technology are very extensive: not only as a navigation tool, but in any other application where knowing where the user's visual focus is can be of value.

5.3 Future Work

5.3.1 Improvements in the comfort models

As far as the first part of the thesis is concerned, both the single-user and multi-user models developed have shown very good results, but they have only been tested with the user in a resting state. Developing a model for active subjects should be the next step. How the physiological signals in an active state differ from the ones measured in this work needs further study. It should also be determined whether changes in the biosignals caused by thermoregulation can still be captured, since these changes might be hidden behind the signal dynamics due to movement and physical activity. In the longer term, the output of the model would be connected to the smart A/C, which would use the information from all the occupants of the room to set the optimum environmental conditions to satisfy the maximum number of people as well as to improve the energy efficiency of the building.

5.3.2 Future work on Sublime

The work on Sublime can be improved from different perspectives. First, in order to have a practical implementation of this work, SSVEP detection times should be reduced, so exploring other methods for noise reduction could be a start. Second, the granularity of the flickering objects in the scene should also be studied, that is, how small and how close to each other the stimulating objects can be. Similarly, the minimum gap between stimulation frequencies that can still be reliably discriminated should also be analyzed. This would define the total number of frequencies available for stimulation in a given range. The combination of the conclusions of these two last suggested studies would determine the maximum number of flickering virtual objects that can be created in the field of view. Finally, as mentioned in Section 4.10, there is extensive literature studying the flicker fusion rate, and it has been demonstrated that it can be affected by several factors such as color, background and luminance, among others, so analyzing this phenomenon will also be needed to define the rendering limitations of the virtual scenes in Sublime.

Bibliography

[1] William J Fisk and Arthur H Rosenfeld. Estimates of improved productivity and health from better indoor environments. Indoor air, 7(3):158-172, 1997.

[2] International Organization for Standardization. Ergonomics of the Thermal Environment: Analytical Determination and Interpretation of Thermal Comfort Using Calculation of the PMV and PPD Indices and Local Thermal Comfort Criteria. International Organization for Standardization, 2005.

[3] Noël Djongyang, René Tchinda, and Donatien Njomo. Thermal comfort: A review paper. Renewable and Sustainable Energy Reviews, 14(9):2626-2640, 2010.

[4] George Havenith, Ingvar Holmér, and Ken Parsons. Personal factors in thermal comfort assessment: clothing properties and metabolic heat production. Energy and buildings, 34(6):581-591, 2002.

[5] Ahmad Manasrah, Rasim Guldiken, and Kyle Reed. Thermal comfort and perception inside air-conditioned areas. In Proceedings of the 2016 ASHRAE Annual Conference, 2016.

[6] Jingxin Liu, Hongying Meng, Asoke Nandi, and Maozhen Li. Emotion detection from eeg recordings. In Natural Computation, Fuzzy Systems and Knowledge Discovery (ICNC-FSKD), 2016 12th International Conference on, pages 1722- 1727. IEEE, 2016.

[7] Stefano Valenzi, Tanvir Islam, Peter Jurica, and Andrzej Cichocki. Individual classification of emotions using eeg. Journal of Biomedical Science and Engineering, 7(08):604, 2014.

[8] Robert Jenke, Angelika Peer, and Martin Buss. Feature extraction and selection for emotion recognition from eeg. IEEE Transactions on Affective Computing, 5(3):327-339, 2014.

[9] Klaus-Robert Muller, Michael Tangermann, Guido Dornhege, Matthias Krauledat, Gabriel Curio, and Benjamin Blankertz. Machine learning for real-time single-trial eeg-analysis: from brain-computer interfacing to mental state monitoring. Journal of neuroscience methods, 167(1):82-90, 2008.

[10] Kannathal Natarajan, Rajendra Acharya, Fadhilah Alias, Thelma Tiboleng, and Sadasivan K Puthusserypady. Nonlinear analysis of eeg signals at different mental states. BioMedical Engineering OnLine, 3(1):7, 2004.

[11] Charles W Anderson, Saikumar V Devulapalli, and Erik A Stolz. Determining mental state from eeg signals using parallel implementations of neural networks. Scientific programming, 4(3):171-183, 1995.

[12] Pouya Bashivan, Irina Rish, and Steve Heisig. Mental state recognition via wearable eeg. arXiv preprint arXiv:1602.00985, 2016.

[13] Gert Pfurtscheller and Christa Neuper. Motor imagery and direct brain-computer communication. Proceedings of the IEEE, 89(7):1123-1134, 2001.

[14] Marc Jeannerod. Mental imagery in the motor context. Neuropsychologia, 33(11):1419-1432, 1995.

[15] Carlo A Porro, Maria Pia Francescato, Valentina Cettolo, Mathew E Diamond, Patrizia Baraldi, Chiara Zuiani, Massimo Bazzocchi, and Pietro E Di Prampero. Primary motor and sensory cortex activation during motor performance and motor imagery: a functional magnetic resonance imaging study. Journal of Neuroscience, 16(23):7688-7698, 1996.

[16] Poul O Fanger et al. Thermal comfort. analysis and applications in environmental engineering. Thermal comfort. Analysis and applications in environmental engineering., 1970.

[17] L Schellen, MGLC Loomans, BRM Kingma, MH De Wit, AJH Frijns, and WD van Marken Lichtenbelt. The use of a thermophysiological model in the built environment to predict thermal sensation: coupling with the indoor environment and thermal sensation. Building and Environment, 59:10-22, 2013.

[18] Richard J De Dear, Gail Schiller Brager, James Reardon, Fergus Nicol, et al. Developing an adaptive model of thermal comfort and preference/discussion. ASHRAE transactions, 104:145, 1998.

[19] Standard ASHRAE. 55: Thermal environmental conditions for human occupancy, american society of heating refrigeration and air conditioning engineers. Inc., Atlanta, 2004.

[20] XL Wang and FK Peterson. Estimating thermal transient comfort. American Society of Heating, Refrigerating and Air-Conditioning Engineers, 1992.

[21] JD Hardy and JA Stolwijk. Partitional calorimetric studies of man during exposures to thermal transients. Journal of Applied Physiology, 21(6):1799-1806, 1966.

[22] J LeBlanc, B Blais, B Barabe, and J Cote. Effects of temperature and wind on facial temperature, heart rate, and sensation. Journal of applied physiology, 40(2):127-131, 1976.

[23] Ye Yao, Zhiwei Lian, Weiwei Liu, and Qi Shen. Experimental study on skin temperature and thermal comfort of the human body in a recumbent posture under uniform thermal environments. Indoor and Built Environment, 16(6):505-518, 2007.

[24] Yousuke Taniguchi, Hiroshi Aoki, Kenji Fujikake, Hisashi Tanaka, and Motohiro Kitada. Study on car air conditioning system controlled by car occupants' skin temperatures-part 1: Research on a method of quantitative evaluation of car occupants' thermal sensations by skin temperatures. Technical report, SAE Technical Paper, 1992.

[25] Joon Ho Choi. CoBi: bio-sensing building mechanical system controls for sustainably enhancing individual thermal comfort. PhD thesis, Carnegie Mellon University, 2010.

[26] Weiwei Liu, Zhiwei Lian, and Yuanmou Liu. Heart rate variability at different thermal comfort levels. European journal of applied physiology, 103(3):361-366, 2008.

[27] Alberto Malliani, Massimo Pagani, Federico Lombardi, and Sergio Cerutti. Cardiovascular neural regulation explored in the frequency domain. Circulation, 84(2):482-492, 1991.

[28] Mark Feldmeier and Joseph A Paradiso. Personalized hvac control system. In Internet of Things (IOT), 2010, pages 1-8. IEEE, 2010.

[29] Christoph S Herrmann. Human eeg responses to 1-100 hz flicker: resonance phenomena in visual cortex and their potential correlation to cognitive phenomena. Experimental brain research, 137(3-4):346-353, 2001.

[30] Steven A Hillyard, Hermann Hinrichs, Claus Tempelmann, Stephen T Morgan, Jonathan C Hansen, Henning Scheich, and Hans-Jochen Heinze. Combining steady-state visual evoked potentials and fmri to localize brain activity during selective attention. Human brain mapping, 5(4):287-292, 1997.

[31] ST Morgan, JC Hansen, and SA Hillyard. Selective attention to stimulus location modulates the steady-state visual evoked potential. Proceedings of the National Academy of Sciences, 93(10):4770-4774, 1996.

[32] Matthias M Muller, Terence W Picton, Pedro Valdes-Sosa, Jorge Riera, Wolfgang A Teder-Sälejärvi, and Steven A Hillyard. Effects of spatial selective attention on the steady-state visual evoked potential in the 20-28 hz range. Cognitive Brain Research, 6(4):249-261, 1998.

[33] Fabrizio Beverina, Giorgio Palmas, Stefano Silvoni, Francesco Piccione, Silvio Giove, et al. User adaptive bcis: Ssvep and p300 based interfaces. PsychNology Journal, 1(4):331-354, 2003.

[34] Danhua Zhu, Jordi Bieger, Gary Garcia Molina, and Ronald M Aarts. A survey of stimulation methods used in ssvep-based bcis. Computational intelligence and neuroscience, 2010:1, 2010.

[35] Xiaogang Chen, Yijun Wang, Masaki Nakanishi, Xiaorong Gao, Tzyy-Ping Jung, and Shangkai Gao. High-speed spelling with a noninvasive brain-computer interface. Proceedings of the national academy of sciences, 112(44):E6058-E6067, 2015.

[36] Robert Prueckl and Christoph Guger. A brain-computer interface based on steady state visual evoked potentials for controlling a robot. In International Work-Conference on Artificial Neural Networks, pages 690-697. Springer, 2009.

[37] Ignas Martisius and Robertas Damasevicius. A prototype ssvep based real time bci gaming system. Computational intelligence and neuroscience, 2016:18, 2016.

[38] Maria A Pastor, Julio Artieda, Javier Arbizu, Miguel Valencia, and Jose C Masdeu. Human cerebral activation during steady-state visual-evoked responses. Journal of neuroscience, 23(37):11621-11627, 2003.

[39] Gary Garcia. High frequency ssveps for bci applications. In Computer-Human Interaction. Citeseer, 2008.

[40] Josef Faller, Gernot Müller-Putz, Dieter Schmalstieg, and Gert Pfurtscheller. An application framework for controlling an avatar in a desktop-based virtual environment via a software ssvep brain-computer interface. Presence: teleoperators and virtual environments, 19(1):25-34, 2010.

[41] Anatole Lécuyer, Fabien Lotte, Richard B Reilly, Robert Leeb, Michitaka Hirose, and Mel Slater. Brain-computer interfaces, virtual reality, and videogames. Computer, 41(10), 2008.

[42] Jozef Legény, Raquel Viciana Abad, and Anatole Lécuyer. Navigating in virtual worlds using a self-paced ssvep-based brain-computer interface with integrated stimulation and real-time feedback. Presence: Teleoperators and Virtual Environments, 20(6):529-544, 2011.

[43] Bonkon Koo, Hwan-Gon Lee, Yunjun Nam, and Seungjin Choi. Immersive bci with ssvep in vr head-mounted display. In Engineering in Medicine and Biology Society (EMBC), 2015 37th Annual International Conference of the IEEE, pages 1103-1106. IEEE, 2015.

[44] Robert S Fisher, Graham Harding, Giuseppe Erba, Gregory L Barkley, and Arnold Wilkins. Photic-and pattern-induced seizures: a review for the epilepsy foundation of america working group. Epilepsia, 46(9):1426-1441, 2005.

[45] BRM Kingma. Human thermoregulation: a synergy between physiology and mathematical modelling. 2011.

[46] Empatica e4. https://www.empatica.com/e4-wristband. Accessed: 2018-03-05.

[47] B Gunnar Wallin and Nisha Charkoudian. Sympathetic neural control of integrated cardiovascular function: insights from measurement of human sympathetic nerve activity. Muscle & nerve, 36(5):595-614, 2007.

[48] Nisha Charkoudian. Skin blood flow in adult human thermoregulation: how it works, when it does not, and why. In Mayo Clinic Proceedings, volume 78, pages 603-612. Elsevier, 2003.

[49] Wolfram Boucsein. Electrodermal activity. Springer Science & Business Media, 2012.

[50] Eda data. https://support.empatica.com/hc/en-us/articles/203621955-What-should-I-know-to-use-EDA-data-in-my-experiment-. Accessed: 2018-03-05.

[51] Sara Taylor, Natasha Jaques, Weixuan Chen, Szymon Fedor, Akane Sano, and Rosalind Picard. Automatic identification of artifacts in electrodermal activity data. In Engineering in Medicine and Biology Society (EMBC), 2015 37th Annual International Conference of the IEEE, pages 1934-1937. IEEE, 2015.

[52] Eda explorer. http://eda-explorer.media.mit.edu/. Accessed: 2018-03-05.

[53] John M Johnson and Duane W Proppe. Cardiovascular adjustments to heat stress. Comprehensive Physiology, 2011.

[54] Auria Eisen-Enosh, Nairouz Farah, Zvia Burgansky-Eliash, Uri Polat, and Yossi Mandel. Evaluation of critical flicker-fusion frequency measurement methods for the investigation of visual temporal resolution. Scientific reports, 7(1):15621, 2017.

[55] Stanley W Davis. Auditory and visual flicker-fusion as measures of fatigue. The American journal of psychology, 68(4):654-657, 1955.

[56] Francis Crick and Christof Koch. Are we aware of neural activity in primary visual cortex? Nature, 375(6527):121-123, 1995.

[57] Matthias M Müller, P Malinowski, T Gruber, and SA Hillyard. Sustained division of the attentional spotlight. Nature, 424(6946):309, 2003.

[58] Oculus rift. https://www.oculus.com/rift/. Accessed: 2018-03-05.

[59] Nikolay V Manyakov, Nikolay Chumerin, Arne Robben, Adrien Combaz, Marijn van Vliet, and Marc M Van Hulle. Sampled sinusoidal stimulation profile and multichannel fuzzy logic classification for monitor-based phase-coded ssvep brain-computer interfacing. Journal of neural engineering, 10(3):036011, 2013.

[60] Xiaogang Chen, Zhikai Chen, Shangkai Gao, and Xiaorong Gao. A high-itr ssvep-based bci speller. Brain-Computer Interfaces, 1(3-4):181-191, 2014.

[61] Unity3d. https://unity3d.com/. Accessed: 2018-03-05.

[62] Frank Sharbrough. American electroencephalographic society guidelines for standard electrode position nomenclature. J Clin Neurophysiol, 8:200-202, 1991.

[63] Openbci. http://openbci.com/. Accessed: 2018-03-05.

[64] David Regan. Human brain electrophysiology: evoked potentials and evoked magnetic fields in science and medicine. 1989.

[65] Zhonglin Lin, Changshui Zhang, Wei Wu, and Xiaorong Gao. Frequency recognition based on canonical correlation analysis for ssvep-based bcis. IEEE transactions on biomedical engineering, 54(6):1172-1176, 2007.

[66] Guangyu Bin, Xiaorong Gao, Zheng Yan, Bo Hong, and Shangkai Gao. An online multi-channel ssvep-based brain-computer interface using a canonical correlation analysis method. Journal of neural engineering, 6(4):046002, 2009.

[67] Matlab. https://www.mathworks.com/products/matlab.html. Accessed: 2018-03-05.

[68] Sandra G Hart and Lowell E Staveland. Development of nasa-tlx (task load index): Results of empirical and theoretical research. In Advances in psychology, volume 52, pages 139-183. Elsevier, 1988.

[69] Jonathan R Wolpaw, Niels Birbaumer, Dennis J McFarland, Gert Pfurtscheller, and Theresa M Vaughan. Brain-computer interfaces for communication and control. Clinical neurophysiology, 113(6):767-791, 2002.

[70] Carney Landis. Determinants of the critical flicker-fusion threshold. Physiological Reviews, 34(2):259-286, 1954.

[71] Ernst Simonson and Josef Brozek. Flicker fusion frequency: background and applications. Physiological reviews, 32(3):349-378, 1952.

[72] Josef Brozek and Ancel Keys. Changes in flicker-fusion frequency with age. Journal of Consulting Psychology, 9(2):87, 1945.

[73] GS Brindley, JJ Du Croz, and WAH Rushton. The flicker fusion frequency of the blue-sensitive mechanism of colour vision. The Journal of physiology, 183(2):497-500, 1966.
