REGULATING DIRECT-TO-CONSUMER EEG NEURODATA – WHY LESSONS FROM GENETICS WILL NOT BE ENOUGH

By Iris Coates McCall

A thesis submitted to Johns Hopkins University in conformity with the requirements for the degree of Master of Bioethics

Baltimore, Maryland October 2017

© Iris Coates McCall. All Rights Reserved.

Abstract

Historically, most neurodata – information about the structure and function of the brain – has been obtained in clinical and research settings. However, neurotechnologies are now being sold direct to consumer (DTC) and marketed to the general public for a variety of purposes. Commercially sold DTC electroencephalogram (EEG) devices are a rapidly expanding enterprise. Widespread personal neuroimaging will mean a plethora of neurodata being generated in the public sphere. This raises concerns related to privacy, confidentiality, discrimination, and individual identity. These observations may seem familiar – it may seem as though we have had this conversation before in the DTC genetic testing (DTC-GT) debate. As such, it might look like the challenges of DTC-EEG devices could be solved easily within another framework, namely, that developed for the management and regulation of data from DTC-GT. However, there are distinctions between the two types of data that have important implications for their management and regulation. This paper will argue that in the era of big data, DTC commercially obtained EEG neurodata raises unique ethical, legal, social, and practical challenges beyond those of DTC genetic testing, and thus the frameworks developed for the management and protection of genetic data will be insufficient for the management and protection of neurodata. To this end, this paper outlines some of the most salient differences between neurodata and genetic information and highlights the associated challenges that arise relating to those differences. It concludes that special considerations for legislation arising from these challenges must be recognized by policymakers interested in regulating the production, use, and operationalization of neurodata in the public sector.

Primary Reader: Dr. Travis Rieder
Secondary Reader: Dr. Debra Mathews

Acknowledgements

I would like to thank Travis, Debra, and David Peña-Guzmán for all their input, help, and support with this project. I would also like to thank my MBE classmates for their discussion and encouragement throughout this process.


Table of Contents

Abstract
Acknowledgements
Introduction
Background
The Devices
The Threats: What Problems does DTC-EEG Pose?
Privacy
Confidentiality
Discrimination
Neurodata – The New Genetic Information Debate?
Differences Between Neurodata and Genetic Information
Differences in the nature of the information
Differences in beliefs about the information
Effects of the Differences Between Neurodata and Genetic Information
Privacy and confidentiality
The operationalizability of neurodata
Implications for monitoring
The Need for Regulation
Conclusion
Works Cited
Biography

In a recent article on The Neuroethics Blog,1 Jessie Ginsberg described the potential dangers related to mental privacy in the age of big data. From the story of the girl whose father found out she was pregnant due to Target’s impressive ad targeting analytics,2 to the example of cognitive functioning information collected by brain training programs such as Lumosity,3 Ginsberg describes a dystopian future (or perhaps even present) where our personal mental life and cognitive functioning is wide open for corporate consumption and tracking, along with all the other personal data third parties collect through our use of smartphones and other technology. Surprisingly absent from her depiction, however, is a different sort of cognitive functioning information – the information gleaned from neuroimaging devices, which can relay data on the structure and function of our brains. As direct to consumer (DTC) neurotechnologies that were once confined to the clinic and lab are increasingly marketed and sold to the public, we are faced with some troubling questions: What exactly does this data reveal about us? Who has access to this data? And what ought we to do, if anything, to protect this information? This paper will address the concerns raised by DTC neurotechnologies in the era of big data and argue that, while similar to the issues raised by DTC genetic testing, the problems are not identical and thus current regulations are an insufficient framework for the ethical management of neurodata.

1 Ginsberg, J. (2017). Mental Privacy in the Age of Big Data. The Neuroethics Blog. Retrieved on September 20, 2017, from http://www.theneuroethicsblog.com/2017/06/mental-privacy-in-age-of-big-data.html
2 Duhigg, Charles. “How Companies Learn Your Secrets.” The New York Times Magazine, 16 Feb. 2012, www.nytimes.com/2012/02/19/magazine/shopping-habits.html?_r=1&hp=&pagewanted=all. Accessed 20 Sept. 2017.
3 Purcell, Ryan H., and Karen S. Rommelfanger. "Internet-based brain training games, citizen scientists, and Big Data: ethical issues in unprecedented virtual territories." Neuron 86.2 (2015): 356-359.

Introduction

Neuroimaging is increasingly capable of revealing information about the contents of the mind previously thought to be the stuff of science fiction. Functional magnetic resonance imaging (fMRI) data can recreate an image of a previously perceived face from memory,4 generate a representation of a viewed image,5 and even generate images of a subject’s own cranio-facial features from their pattern of brain activity.6 The information generated by this form of technology can be referred to as neurodata – data about the structure or functioning of the human brain. Historically, most neurodata has been obtained in clinical and research settings. However, neurotechnologies are now being sold direct to consumer (DTC) and marketed to the general public for a variety of purposes. In the ethics literature, most of the attention paid to these DTC neurotechnologies has focused on transcranial magnetic stimulation (TMS) and transcranial direct-current stimulation (tDCS) devices and whether or not they require regulation as medical devices.7 Less attention, however, has been paid to electroencephalogram (EEG) devices that, rather than altering neural functioning, can provide information about that functioning, thereby generating neurodata. This lack of attention is concerning, because their low cost, ease of use, and portability increase the likelihood that DTC-EEG will bring widespread neuroimaging capabilities to the public. This raises concerns as to the management of the neurodata generated by these devices, and the ethical, legal, and social implications of the collection and use of information about people’s mental functioning.

4 Lee, Hongmi, and Brice A. Kuhl. "Reconstructing perceived and retrieved faces from activity patterns in lateral parietal cortex." Journal of Neuroscience 36.22 (2016): 6069-6082.
5 Du, Changde, Changying Du, and Huiguang He. "Sharing deep generative representation for perceived image reconstruction from human brain activity." arXiv preprint arXiv:1704.07575 (2017).
6 Toga, Arthur W. "Neuroimage databases: the good, the bad and the ugly." Nature Reviews. Neuroscience 3.4 (2002): 302.
7 Wexler, Anna. "A pragmatic analysis of the regulation of consumer transcranial direct current stimulation (TDCS) devices in the United States." Journal of Law and the Biosciences 2.3 (2016): 669-696.

Issues of neuroprivacy have been discussed extensively in the academic and legal literature,8 much of which has focused on fMRI data due to its high spatial resolution. fMRI machines, however, are large, expensive, and require expertise to operate. Thus, outside of private neuroimaging clinics, it is unlikely that fMRI will find widespread adoption for personal use in the DTC market. EEG devices, on the other hand, are small, portable, relatively inexpensive, and fairly easy to operate. As a result, DTC-EEG devices made for and marketed to the general public are a rapidly expanding enterprise, with over ten models currently available on the market.

However, widespread personal neuroimaging will mean a plethora of neurodata being generated in the public sphere. This raises concerns related to privacy, confidentiality, discrimination, and individual identity. Our mental contents are some of the most private things we hold, and have strong implications for our identity and sense of self. These observations may seem familiar – it may seem as though we have had this conversation before in the DTC genetic testing (DTC-GT) debate. As such, it might look like the challenges of DTC-EEG devices could be easily solved with another framework, namely that developed for the management and regulation of data from DTC-GT. However, there are distinctions between the two types of data that have important implications for their management and regulation. This paper will outline the concerns raised by DTC-EEG devices, explore the similarities and differences between these issues and those raised by DTC-GT, and argue that the lessons learned from genetics provide an insufficient framework for approaching DTC-EEG neurodata regulation.

8 For a good summary of the legal issues relating to neuroprivacy, see: Committee on Science and Law. "Are Your Thoughts Your Own? 'Neuroprivacy' and the Legal Implications of Brain Imaging." Record of the Association of the Bar of the City of New York 60.2 (2005): 407, and Shen, Francis X. "Neuroscience, mental privacy, and the law." Harv. JL & Pub. Pol'y 36 (2013): 653.

Background

EEG is an electrobiological monitoring system designed to record the electrical activity of the brain. The fluctuations recorded, often referred to as “brainwaves”, result from ionic currents within the brain cells (neurons). When neurons are activated, local currents are produced. These are recorded by a series of metal electrodes placed directly on the scalp, either with or without conductive gel, which are located based on an internationally recognized placement system – the 10-20 system. Devices can vary in the number of electrodes used, ranging from a single channel up to 256 channels. EEG is a completely non-invasive procedure that can be applied repeatedly to patients, normal adults, and children with virtually no risk or limitation.9

There are five universally recognized EEG patterns of activity that are found in all individuals.10,11 Gamma waves are in the frequency range of 31Hz and up and, while not well understood, are associated with arousal, excitement, and alertness. Beta waves are in the frequency range of 13-30Hz, are associated with action, concentration, conscious thought, and an external focus, and are dominant during normal states of wakefulness with open eyes. Alpha waves, the best understood rhythm, are in the frequency range of 8-12Hz and reflect relaxation, disengagement, and a “spacey” or dreamy state. Theta waves, ranging from 4 to 7Hz, are linked to inefficiency, daydreaming, and subconscious activity. Finally, delta waves, ranging from 0.5 to 4Hz, are the slowest waves and occur when a user is in hypnoidization or is deeply unconscious. These patterns of activation are automatic and universal.

9 Mantri, Shamla, et al. "A Survey: Fundamental of EEG." International Journal 1.4 (2013).
10 Li, QianQian, Ding Ding, and Mauro Conti. "Brain-computer interface applications: Security and privacy challenges." Communications and Network Security (CNS), 2015 IEEE Conference on. IEEE, 2015.
11 Mantri et al. (2013)
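To make the band definitions above concrete, the following is a minimal, illustrative sketch of how power in each of these frequency bands might be estimated from a single-channel recording. The sampling rate, band boundaries, and use of Welch's method are my own illustrative assumptions, not a description of any particular device's processing.

```python
import numpy as np
from scipy.signal import welch

# Illustrative band boundaries (Hz), following the ranges described above.
BANDS = {
    "delta": (0.5, 4),
    "theta": (4, 8),
    "alpha": (8, 12),
    "beta": (13, 30),
    "gamma": (31, 45),  # upper limit chosen arbitrarily for illustration
}

def band_powers(eeg: np.ndarray, fs: float = 256.0) -> dict:
    """Estimate relative power in each canonical EEG band for a 1-D signal.

    eeg: single-channel EEG samples; fs: sampling rate in Hz.
    """
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))  # 2-second windows
    powers = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        powers[name] = float(psd[mask].sum())  # sum of PSD bins within the band
    return powers

# Example with synthetic data: 60 s of noise plus a 10 Hz (alpha) oscillation.
fs = 256.0
t = np.arange(0, 60, 1 / fs)
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)
print(band_powers(signal, fs))  # alpha power should dominate
```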

In addition to these ongoing and universal neural patterns, EEG can capture individual responses to different events and stimuli. An event-related potential (ERP) is the measured brain response that is the direct result of a specific sensory, cognitive, or motor event.12 In other words, it is any stereotyped electrophysiological response to a stimulus.

ERPs allow researchers to associate certain brain activity patterns with specific events or stimuli. A subtype of ERP, the evoked potential (EP), involves averaging, over multiple recordings, the EEG activity time-locked to the presentation of a stimulus.
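As an illustration of the time-locked averaging just described, a minimal sketch follows. The array shapes, event indices, and epoch window are hypothetical; real ERP pipelines also involve filtering and artifact rejection.

```python
import numpy as np

def average_erp(eeg: np.ndarray, event_samples: list, fs: float,
                pre: float = 0.2, post: float = 0.8) -> np.ndarray:
    """Average stimulus-locked EEG epochs into an ERP.

    eeg: 1-D single-channel signal; event_samples: sample indices at which
    the stimulus appeared; pre/post: seconds before/after each event.
    """
    n_pre, n_post = int(pre * fs), int(post * fs)
    epochs = []
    for s in event_samples:
        if s - n_pre >= 0 and s + n_post <= len(eeg):
            epoch = eeg[s - n_pre: s + n_post]
            epoch = epoch - epoch[:n_pre].mean()  # simple baseline correction
            epochs.append(epoch)
    # Averaging cancels activity that is not time-locked to the event.
    return np.mean(epochs, axis=0)

# Hypothetical usage: roughly one stimulus presentation per second for ~110 s.
fs = 256.0
eeg = np.random.randn(int(fs * 120))
events = list(range(int(fs), int(fs * 110), int(fs)))
erp = average_erp(eeg, events, fs)
```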

Despite the simplicity of the device, researchers have been able to obtain a lot of information about individuals’ brain functioning through EEG, some of it quite astounding.

One major effort in the field is the potential use of individual EEG response patterns for biometric verification. Ruiz-Blondet et al. (2016) recorded the brain activity of fifty people while they looked at a series of images.13 They found that subjects’ brains reacted differently to each image, so much so that a computer system was able to identify each subject’s ERP "brainprint" with 100% accuracy, thus determining which individual was viewing the image. Researchers have also used EEG to identify individuals with 100% accuracy against a baseline assessment using resting EEG rather than ERPs.14 De Gennaro et al. (2008) showed that humans have an individual profile of the EEG spectra in the 8-16Hz range during non-rapid eye movement (NREM) sleep that is stable over time and resistant to experimental changes, indicating that the brain activity profile during sleep is highly unique and can be used to fingerprint people.15 These findings are supported by Finelli et al. (2001), who demonstrated that the pattern of the EEG power distribution in non-REM sleep is characteristic of an individual.16 One group of researchers even managed to attain a 99% accuracy rate at identifying individuals from a single-channel EEG device, albeit out of a small set of fifteen individuals.17

12 Luck, Steven J. An introduction to the event-related potential technique. MIT press, 2014.
13 Ruiz-Blondet, Maria V., Zhanpeng Jin, and Sarah Laszlo. "CEREBRE: A novel method for very high accuracy event-related potential biometric identification." IEEE Transactions on Information Forensics and Security 11.7 (2016): 1618-1629.
14 La Rocca, Daria, et al. "Human brain distinctiveness based on EEG spectral coherence connectivity." IEEE Transactions on Biomedical Engineering 61.9 (2014): 2406-2412.
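For readers unfamiliar with how a resting-EEG "fingerprint" might be matched against a stored template, the following is a deliberately simplified, hypothetical sketch: it enrolls each person as a vector of band powers and identifies a new recording by nearest-neighbour correlation. The studies cited above use far more sophisticated features and classifiers; this only conveys the general idea.

```python
import numpy as np
from scipy.signal import welch

BANDS = [(0.5, 4), (4, 8), (8, 12), (13, 30), (31, 45)]  # delta..gamma, illustrative

def feature_vector(eeg: np.ndarray, fs: float) -> np.ndarray:
    """Toy feature extractor: log power in each canonical band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=int(fs * 2))
    return np.log([psd[(freqs >= lo) & (freqs < hi)].sum() for lo, hi in BANDS])

def identify(probe: np.ndarray, templates: dict, fs: float) -> str:
    """Return the enrolled identity whose stored template best correlates with the probe."""
    v = feature_vector(probe, fs)
    scores = {name: np.corrcoef(v, tmpl)[0, 1] for name, tmpl in templates.items()}
    return max(scores, key=scores.get)

# Hypothetical enrolment of two users from 60-second resting recordings, then a probe.
fs = 256.0
templates = {name: feature_vector(np.random.randn(int(fs * 60)), fs)
             for name in ("user_a", "user_b")}
print(identify(np.random.randn(int(fs * 60)), templates, fs))
```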

Another universal feature of EEG is the P300, a waveform that is produced when the subject observes something familiar and that indicates the familiarity and salience of the stimulus. This can be used to tell whether a person has seen someone or something before, and has been used to recognize the subject’s name out of a list of random names,18 discriminate familiar faces from unfamiliar faces,19 for lie detection,20 and to detect high-impact autobiographical information.21 It is argued that the P300 has the potential for detecting concealed information through a specific response known as a memory and encoding related multifaceted electroencephalographic response (MERMER), which is elicited when a person recognizes and processes a stimulus that is particularly noteworthy to him/her.22 For example, in a seminal study of FBI agent trainees, Farwell and Smith (2001) presented probes consisting of words, phrases, and acronyms which only FBI agents would know, along with Targets and Irrelevants. Non-FBI personnel were also tested. The MERMER system correctly classified all seventeen of the FBI new agent trainees and all four of the control subjects, despite efforts to conceal said knowledge.23 In a review of his studies on the MERMER, Farwell (2012) reported that all studies found accuracy of the MERMER to be 100%, with no false positives, no false negatives, and only 3% indeterminates when determining whether the subject had been exposed to the incident or not.24 Other authors fiercely contest these findings due to Farwell’s low sample sizes and methodological shortcomings.25

15 De Gennaro, Luigi, et al. "The electroencephalographic fingerprint of sleep is genetically determined: a twin study." Annals of neurology 64.4 (2008): 455-460.
16 Finelli, Luca A., Peter Achermann, and Alexander A. Borbély. "Individual ‘fingerprints’ in human sleep EEG topography." Neuropsychopharmacology 25.5 (2001): S57-S62.
17 Chuang, John, et al. "I think, therefore I am: Usability and security of authentication using brainwaves." International Conference on Financial Cryptography and Data Security. Springer, Berlin, Heidelberg, 2013.
18 Rosenfeld, J. Peter, Julianne R. Biroschak, and John J. Furedy. "P300-based detection of concealed autobiographical versus incidentally acquired information in target and non-target paradigms." International Journal of Psychophysiology 60.3 (2006): 251-259.
19 Marcel, Sebastien, and José del R. Millán. "Person authentication using brainwaves (EEG) and maximum a posteriori model adaptation." IEEE transactions on pattern analysis and machine intelligence 29.4 (2007).
20 Abootalebi, Vahid, Mohammad Hassan Moradi, and Mohammad Ali Khalilzadeh. "A new approach for EEG feature extraction in P300-based lie detection." Computer methods and programs in biomedicine 94.1 (2009): 48-57.
21 Rosenfeld et al. (2006).
22 Farwell, Lawrence A., and Sharon S. Smith. "Using brain MERMER testing to detect knowledge despite efforts to conceal." Journal of forensic science 46.1 (2001): 135-143.

However, the P300 waveform is considered to be a well-established phenomenon, having been researched in over a thousand peer reviewed publications, and many studies in leading peer reviewed journals support the use of ERP for the detection of concealed information.26

The P300 test is far from perfect, but in controlled lab tests it is surprisingly good, detecting “lies” between 74% and 80% of the time.27
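To illustrate the logic of a P300 concealed-information test in code, the sketch below compares the average P300-window amplitude evoked by a "probe" item (known only to a knowledgeable subject) against that evoked by irrelevant items. The window, threshold, and data are hypothetical stand-ins; published protocols use bootstrapped statistics across many trials rather than a fixed cutoff.

```python
import numpy as np

def p300_amplitude(erp: np.ndarray, fs: float, window=(0.3, 0.6)) -> float:
    """Mean amplitude of a post-stimulus ERP in an (illustrative) P300 window, in seconds."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return float(erp[lo:hi].mean())

def recognizes_probe(probe_erp: np.ndarray, irrelevant_erps: list, fs: float,
                     z_threshold: float = 1.96) -> bool:
    """Crude decision rule: the probe counts as 'recognized' if its P300 amplitude is an
    outlier relative to the distribution of irrelevant-item amplitudes."""
    irr = np.array([p300_amplitude(e, fs) for e in irrelevant_erps])
    z = (p300_amplitude(probe_erp, fs) - irr.mean()) / (irr.std() + 1e-12)
    return z > z_threshold

# Hypothetical ERPs: 1 second of post-stimulus signal at 256 Hz.
fs = 256.0
irrelevants = [np.random.randn(int(fs)) for _ in range(6)]
probe = np.random.randn(int(fs))
probe[int(0.3 * fs):int(0.6 * fs)] += 3.0  # enlarged P300 to the recognized item
print(recognizes_probe(probe, irrelevants, fs))
```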

The universal patterns of activation can also be used to detect information about a subject’s level of arousal, engagement, and stress. Stopczynski et al. (2014a) were able to determine whether subjects were concentrating on mental math or focusing on driving, claiming this showed an ability to detect whether the driver’s attention was fully focused on driving and presented the potential for continuous monitoring to detect distracted driving to prevent accidents.28 There are generalized EEG markers that researchers claim can quantify mental workload across tasks and individuals, with EEG measures correlating with both subjective and objective performance metrics on a second-to-second time scale.29 Even one-second-long EEG markers have been used to identify drowsiness-alertness,30,31,32 mental workload,33,34,35 and individual differences in the effect of sleep deprivation.36 This ability is the basis of so-called “neuroergonomics” – using the ability to continuously monitor an individual’s level of fatigue, task engagement, and mental workload in operational environments,37 as “EEG is the only physiological signal that has been shown to accurately reflect subtle shifts in alertness, attention and workload that can be identified and quantified on a second-by-second time-frame.”38

23 Farwell and Smith (2001)
24 Farwell, Lawrence A. "Brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials." Cognitive neurodynamics 6.2 (2012): 115-154.
25 Meijer, Ewout H., et al. "A comment on Farwell (2012): brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials." Cognitive neurodynamics 7.2 (2013): 155-158.
26 Ibid.
27 Abootalebi, Vahid, Mohammad Hassan Moradi, and Mohammad Ali Khalilzadeh. "A comparison of methods for ERP assessment in a P300-based GKT." International Journal of Psychophysiology 62.2 (2006): 309-320.
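As a concrete, purely illustrative example of the kind of second-by-second workload metric described above, the sketch below computes a simple task-engagement index from sliding-window band powers. The beta/(alpha+theta) ratio is one index that has been proposed in the neuroergonomics literature; the sampling rate, window length, and any threshold applied to the output are my own assumptions, not any vendor's algorithm.

```python
import numpy as np
from scipy.signal import welch

def band_power(seg: np.ndarray, fs: float, lo: float, hi: float) -> float:
    """Relative power in [lo, hi) Hz for one window of signal."""
    freqs, psd = welch(seg, fs=fs, nperseg=min(len(seg), int(fs)))
    m = (freqs >= lo) & (freqs < hi)
    return float(psd[m].sum())

def engagement_index(eeg: np.ndarray, fs: float, window_s: float = 2.0) -> np.ndarray:
    """Sliding-window beta / (alpha + theta) ratio, one value per non-overlapping window."""
    step = int(window_s * fs)
    ratios = []
    for start in range(0, len(eeg) - step, step):
        seg = eeg[start:start + step]
        beta = band_power(seg, fs, 13, 30)
        alpha = band_power(seg, fs, 8, 12)
        theta = band_power(seg, fs, 4, 8)
        ratios.append(beta / (alpha + theta + 1e-12))
    return np.array(ratios)

# Hypothetical 60-second recording; persistently low values might be flagged as disengagement.
fs = 256.0
eeg = np.random.randn(int(fs * 60))
print(engagement_index(eeg, fs))
```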

Relatedly, researchers claim EEG can detect variations in cognitive workload and reflect cognitive and memory performance. There is some evidence that EEG oscillations in the alpha and theta bands reflect cognitive and memory performance in particular.39 Zhang et al. (2014) claim they can provide an accurate prediction of workload variation and real-time assessment of cognitive workload using only fifteen channels.40

28 Stopczynski, Arkadiusz, et al. "Privacy for personal neuroinformatics." (2014a).
29 Berka, Chris, et al. "EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks." Aviation, space, and environmental medicine 78.5 (2007): B231-B244.
30 Levendowski, D. J., et al. "Correlations between EEG indices of alertness measures of performance and self-reported states while operating a driving simulator." 29th Annual Meeting, Society for Neuroscience. Vol. 25. 1999.
31 Levendowski, Daniel J., et al. "Electroencephalographic indices predict future vulnerability to fatigue induced by sleep deprivation." Sleep 24.Abstract Suppl (2001).
32 Levendowski, Daniel J., et al. "Event-related potentials during a psychomotor vigilance task in sleep apnea patients and healthy subjects." Sleep 25 (2002): A462-A463.
33 Berka, Chris, et al. "Real-time analysis of EEG indexes of alertness, cognition, and memory acquired with a wireless EEG headset." International Journal of Human-Computer Interaction 17.2 (2004): 151-170.
34 Berka, Chris, et al. "Evaluation of an EEG workload model in an Aegis simulation environment." Proceedings of SPIE. Vol. 5797. 2005a.
35 Pacific Science & Engineering Group. DARPA Augmented Cognition Technological Integration Experiment (TIE). 2003 July 7. San Diego, CA; 2003.
36 Berka, Chris, et al. "EEG quantification of alertness: Methods for early identification of individuals most susceptible to sleep deprivation." Proceedings of SPIE Defense and Security Symposium, Biomonitoring for Physiological and Cognitive Performance during Military Operations. Vol. 5797. FL: SPIE: The International Society for Optical Engineering, 2005b.
37 Berka et al. (2007).
38 Berka et al. (2007).
39 Klimesch, Wolfgang. "EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis." Brain research reviews 29.2 (1999): 169-195.

Finally, EEG activity has been proposed as a means to detect biosignatures for various neurological and psychological disorders such as epilepsy,41 child behavioural checklist dysregulation profile,42 depression,43,44 alcoholism,45 and schizophrenia.46,47

Of course, there are serious limits to the reliability of these findings as well as significant challenges to their applicability in real-world settings. For example, many of these determinations are based on information gleaned from ERPs and EPs rather than continuous, real-time EEG, and as such could not be made without repeated stimulus exposure and the wearer’s cooperation. Also, the EEG signal is very susceptible to artifacts from sources not related to brain activity, has poor spatial resolution compared to other neuroimaging techniques, and can only accurately measure surface-level cortical activation, not that of deep-brain structures.

Further, most of this work has been done in well-controlled research settings and as such is unlikely to be easily reproduced in the settings in which DTC devices will be used. The weaknesses of these new technologies are many, and I have only mentioned a few here. But it is important to be aware of the potential implications of new technologies as they emerge and are still in a nascent stage. It is undeniable that the quality of these devices is improving and that there is increasing ability to get useful EEG information in natural settings. Stopczynski et al. (2011) demonstrated a fully functional handheld brain scanner consisting of a fourteen-channel EEG headset wirelessly connected to a smartphone that enables minimally invasive EEG monitoring in naturalistic settings.48 They claim that early tests of the system indicate the potential for minimally invasive and low-cost EEG monitoring in naturalistic settings.49

40 Zhang, Haihong, et al. "Detection of variations in cognitive workload using multi-modality physiological sensors and a large margin unbiased regression machine." Engineering in Medicine and Biology Society (EMBC), 2014 36th Annual International Conference of the IEEE. IEEE, 2014.
41 Smith, S. J. M. "EEG in the diagnosis, classification, and management of patients with epilepsy." Journal of Neurology, Neurosurgery & Psychiatry 76.suppl 2 (2005): ii2-ii7.
42 McGough, James J., et al. "A potential and cognitive biosignature for the child behavior checklist–dysregulation profile." Journal of the American Academy of Child & Adolescent Psychiatry 52.11 (2013): 1173-1182.
43 Gotlib, Ian H. "EEG alpha asymmetry, depression, and cognitive functioning." Cognition & Emotion 12.3 (1998): 449-478.
44 Davidson, Richard J., et al. "Depression: perspectives from affective neuroscience." Annual review of psychology 53.1 (2002): 545-574.
45 Acharya, U. Rajendra, et al. "Computer-aided diagnosis of alcoholism-related EEG signals." Epilepsy & Behavior 41 (2014): 257-263.
46 Karson, Craig N., et al. "Computerized EEG in schizophrenia." Schizophrenia bulletin 14.2 (1988): 193.
47 Sponheim, Scott R., et al. "Resting EEG in first-episode and chronic schizophrenia." Psychophysiology 31.1 (1994): 37-43.

It is such devices and their potential that this paper will focus on, as more and more of them come to market.

The Devices

At the time of writing, at least ten commercial companies were identified that are actively marketing their EEG devices online, with hundreds of smartphone and tablet applications (“apps”) and software programs to accompany them. These devices vary in the number of electrodes and channels they have, and by price, ranging anywhere from $79.9950 to $20,600.51 They are marketed for a range of uses including performance and wellness promotion, brain-training, education, gaming, and research. Whether they can actually fulfill any of these claims, however, is a far more open question.

It is important to note that while the promoters of the devices make many claims about what the devices can do, it behooves the public to maintain an informed and healthy skepticism about the reality of these claims. The purpose of this paper is not to promote the scientific credibility of these devices but rather to consider the ethical implications if even a portion of their claims can in fact be borne out.

48 Stopczynski, Arkadiusz, et al. "A smartphone interface for a wireless EEG headset with real-time 3D reconstruction." Affective Computing and Intelligent Interaction. Springer, Berlin, Heidelberg, 2011. 317-318.
49 Stopczynski, Arkadiusz, et al. "The smartphone brain scanner: a portable real-time neuroimaging system." PloS one 9.2 (2014b): e86733.
50 “MindFlex.” NeuroSky, store.neurosky.com/products/mindflex. Accessed 17 Sept. 2017.
51 “Cognionics Bundles!” NeuroGuide by Applied Neuroscience, Applied Neuroscience, Inc., 2017, www.appliedneuroscience.com/NeuroGuide_Cognionics.htm. Accessed 20 Sept. 2017.

Two companies stand out as producing the preeminent devices most widely adopted for consumer apps and programs – Emotiv and NeuroSky. Emotiv markets two models of EEG headset – the EPOC+ and the Insight – that claim to “monitor cognitive load and discover emotional responses that are preventing you from achieving peak mental performance.”52 The Insight is a five-channel, $299 headset that “allows you to monitor your cognitive health and wellbeing and optimize your performance.”53 The EPOC+ is a fourteen-channel, $799 headset that claims to be able to detect facial expressions (blink, left wink, right wink, furrow/frown, raise brow/surprise, smile, clench teeth/grimace, glance left, glance right, laugh, smirk left side, smirk right side) and emotional states (instantaneous excitement, long term excitement, frustration, engagement, meditation, interest/affinity).54 Their myEmotiv app “takes the complexity out of reading and interpreting your brain waves, so you can easily measure your mental performance and fitness” and has a 3D brain visualizer showing patterns of activation in real time. Their website includes a page of independent research that has been performed using their headsets.55 One of these papers evaluated the EPOC+ for its ability to detect and classify six pairs of mental actions, and found that the system performs significantly better than chance for all mental actions, improves over time with additional training data, and was able to accurately classify mental actions 87.5% of the time.56 Emotiv headsets have been used in many research projects, including one successfully using a smartphone-paired EEG device for bioauthentication.57

52 “Homepage.” Emotiv, www.emotiv.com/. Accessed 20 Sept. 2017.
53 “Insight Brainwear® 5 Channel Wireless EEG Headset.” Emotiv, www.emotiv.com/insight/. Accessed 20 Sept. 2017.
54 “EPOC - 14 Channel Wireless EEG Headset.” Emotiv, www.emotiv.com/epoc/. Accessed 20 Sept. 2017.
55 “Independent Studies Archives.” Emotiv, www.emotiv.com/category/independent-studies/. Accessed 17 Sept. 2017.

NeuroSky has several EEG headsets for sale, including the MindWave Mobile ($99.99), BrainLink Pro ($199), and MyndPlay (£179). All have only one electrode and hence one EEG channel, and connect to a smartphone or tablet. The manufacturers of the NeuroSky MindWave claim that, beyond the typical EEG device functionality, their device is also capable of detecting two mind states (focused and relaxed).58 Some research has been done using NeuroSky devices, as well as some work validating their accuracy. One paper found that the commercially available dry EEG devices produced by NeuroSky are a viable solution for use in real-world applications monitoring drowsiness.59 Rebolledo-Mendez et al. (2009) reported that NeuroSky’s MindSet provides accurate readings regarding attention, since there is a positive correlation between measured and self-reported attention levels.60

Both companies have introduced their own app stores in order to facilitate the expansion of brain-computer interface (BCI) applications available for their products.61 They also both have software development kits (SDKs) available for third-party developers to create new apps for their devices. The result is hundreds of DTC-EEG products marketed for a wide range of purposes.

56 Taylor, Grant S., and Christina Schmidt. "Empirical evaluation of the Emotiv EPOC BCI headset for the detection of mental actions." Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Vol. 56. No. 1. Sage CA: Los Angeles, CA: SAGE Publications, 2012.
57 Klonovs, Juris, and Christoffer Kjeldgaard Petersen. "Development of a mobile EEG-based feature extraction and classification system for biometric authentication." Master's Thesis: Aalborg University Copenhagen (2012).
58 “MindWave.” NeuroSky, store.neurosky.com/pages/mindwave. Accessed 22 Sept. 2017.
59 Jones, Ashley, and Galina Schwartz. "Using brain-computer interfaces to analyze EEG data for safety improvement." Team for Research in Ubiquitous Secure Technology (2010).
60 Rebolledo-Mendez, Genaro, et al. "Assessing neurosky’s usability to detect attention levels in an assessment exercise." International Conference on Human-Computer Interaction. Springer, Berlin, Heidelberg, 2009.
61 Takabi, Hassan, Anuj Bhalotiya, and Manar Alohaly. "Brain Computer Interface (BCI) Applications: Privacy Threats and Countermeasures." Collaboration and Internet Computing (CIC), 2016 IEEE 2nd International Conference on. IEEE, 2016.

The majority of products are marketed with claims relating to performance, wellness promotion, and stress management. Such uses range from sleep aids, to meditation aids, to “concentration training” through neurofeedback – a process where the device informs you of your current mental state so that you can alter it through concentrated effort. One product, Muse, is a seven-channel, $249 “brain sensing headband” marketed as a meditation aid by “guiding you through focused attention training exercises, and delivering real-time feedback while you train.”62 Kokoon markets a pair of €239 EEG headphones with allegedly sleep-aiding audio technology that it claims “helps you rest, relax and focus.”63 It claims to adapt to you and your environment – getting quieter as you fall asleep, reacting to ambient sounds, and “learning what helps you rest and relax.” For $399, myBrain Technologies will sell you their two-channel Melomind – “a drug-free, easy-to-use, and perfectly safe solution to stress.”64 They claim to have “identified the cognitive neuro-marker linked to relaxation and created the perfect companion to enhance your own abilities to relax. Including the world's first audio EEG headset and a dedicated coaching app, the Melomind solution teaches you how to efficiently relax and achieve peace of mind.”65 One EEG app even claims to help with irritable bowel syndrome.66

Other apps and devices claim to use this same neurofeedback process for brain training. “Super Powers for Super Kids” is an app for NeuroSky’s MindWave headset that is marketed as attention and concentration training specifically for children diagnosed with ADHD.67 Referring to this product, Stanley Yang, CEO of NeuroSky, said, “[w]ith the rapidly increasing demand from healthcare providers for computer delivered, wide scale and cost effective mental health therapies, we believe ZenZone [parent company] is very well placed to take a leadership role in this exciting area of brain fitness."68 Despite these marketing claims positioning the device as mental health therapy, Super Powers for Super Kids has not been evaluated by the FDA.

62 “MUSE ™ | Meditation Made Easy.” Muse: the brain sensing headband, www.choosemuse.com/. Accessed 17 Sept. 2017.
63 “Homepage.” Kokoon, www.kokoon.io. Accessed 17 Sept. 2017.
64 “MeloMind.” MyBrain Technologies, www.mybraintech.com/. Accessed 17 Sept. 2017.
65 Ibid.
66 “IBS: Calm & Control your gut.” NeuroSky. https://store.neurosky.com/products/calm-control-your-gut. Accessed 17 Sept. 2017.

Similar to the claims previously made by Lumosity, apps such as Home of Attention’s “Brain Starter” provide cognitive training activities that claim to improve concentration and focus.69 DeepWave is another app for a NeuroSky headset that claims to be “the first widely available mobile tool for everyday brain wave training. It offers intuitive and easy guidance with images and sounds to help you attain beneficial brain states. Benefits include improved performance and creativity, better mood, reduced stress, and the ability to mobilize mental resources on demand.”70 It is free and operates on an iPhone or iPad.

Gaming is perhaps one of the industries most interested in EEG headset technology. Several games, such as Mindflex ($79.99)71 and the Star Wars Force Trainer ($129),72 use the universal frequency bands of EEG signals to move a ball through an obstacle course of tubes using various levels of concentration. BrainCopter allows you to control a virtual helicopter around a computer game,73 and MindDrone is the first commercially available “brain-controlled” drone.74

67 AD HD: Super Powers for Super Kids, NeuroSky, store.neurosky.com/products/super-powers-for-super-kids. Accessed 22 Sept. 2017.
68 Ibid.
69 “Brain-Starter.” NeuroSky, store.neurosky.com/products/brain-starter. Accessed 17 Sept. 2017.
70 “DeepWave.” NeuroSky, store.neurosky.com/products/deepwave. Accessed 17 Sept. 2017.
71 Mindflex, NeuroSky Store, store.neurosky.com/products/mindflex. Accessed 22 Sept. 2017.
72 “Star Wars Science Force Trainer.” Amazon.com, www.amazon.com/Star-Wars-Science-Force-Trainer/dp/B001UZHASY. Accessed 22 Sept. 2017.
73 “BrainCopter.” NeuroSky, store.neurosky.com/products/braincopter. Accessed 22 Sept. 2017.

Finally, DIY scientists have a host of options for leveraging their personal EEG devices to conduct research outside of the lab. MindRec is a $200 software program that pairs with NeuroSky’s MindSet device and is marketed towards researchers and neuromarketers as a tool for continuous data streaming and raw EEG data collection.75 The $499 NeuroView Research Tools is designed to be appropriate for novice to intermediate EEG researchers wishing to view and record EEG data in real time.76

A healthy dose of skepticism is of course warranted regarding these products’ claims. Maskeliunas et al. (2016) evaluated and compared the Emotiv EPOC+ and the NeuroSky MindWave by having ten subjects perform concentration/relaxation and blinking recognition tasks.77 Their findings were that both devices exhibit high variability and non-normality of attention and meditation data, making both of them difficult to use as an input for control tasks. The results of blinking recognition show that the NeuroSky device’s recognition accuracy is less than 50%, while the Emotiv device achieved a recognition accuracy of more than 75%; for tasks that require concentration and relaxation of subjects, the Emotiv EPOC+ device had a recognition accuracy 9% higher than the NeuroSky device. They conclude that the Emotiv EPOC+ device may be more suitable for control tasks using the attention/meditation level or eye blinking than the NeuroSky MindWave, but that neither is particularly impressive regarding the accuracy of inferring mental states, and both are suitable only for beginner-level brain signal measurement and research.

However, the fact that results may not be reliable does not mitigate the potential concerns this neurodata raises, in particular due to the popular mistaken belief that the results of brain scans are highly accurate and telling.78 In an interview, cognitive neuroscientist and Director of the Center for Neuroscience & Society Martha Farah put it quite succinctly: “[p]robably the only thing worse than having people successfully reading your mind with brain imaging is having people unsuccessfully reading your mind with brain imaging and thinking that they can trust that information.”79 There are plenty of issues to address when technologies are deployed and over-hyped. Despite little or no confirmation of the reliability or efficacy of these scans for their advertised purposes, the companies are still collecting a tremendous amount of data from people – data that may, unbeknownst to the data providers, be very useful to the companies even if not to the data providers themselves. Companies and users may have vastly different interests in terms of what information the device is generating.

74 “MindDrone.” Emotiv, www.emotiv.com/product/minddrone/. Accessed 22 Sept. 2017.
75 “MindRec.” NeuroSky. https://store.neurosky.com/products/mindrec. Accessed 17 Sept. 2017.
76 “Research Tools.” NeuroSky, store.neurosky.com/products/mindset-research-tools. Accessed 17 Sept. 2017.
77 Maskeliunas, Rytis, et al. "Consumer-grade EEG devices: are they usable for control tasks?." PeerJ 4 (2016): e1746.

Beyond concerns relating to efficacy, there are other technical aspects that warrant attention. Literature discussing the potential problems relating to these devices has largely come from the computer science sector and focused on the technical aspects of security of the neurodata generated by these devices.80,81,82 Little attention, however, has been paid to the broader ethical concerns arising from this technology and the social implications of having large amounts of information about our neural functioning floating around in the era of big data.

78 Weisberg, Deena Skolnick, et al. "The seductive allure of neuroscience explanations." Journal of cognitive neuroscience 20.3 (2008): 470-477.
79 Olson, Steve. "Brain scans raise privacy concerns: advances in neuroimaging may provide the ability to 'read' someone's mind, rightly or wrongly." Science 307.5715 (2005): 1548-1551.
80 Martinovic, Ivan, et al. "On the Feasibility of Side-channel Attacks with Brain-computer Interfaces." Proceedings of the 21st USENIX conference on Security symposium. USENIX Association, 2012.
81 Frank, M., Hwu, T., Jain, S., Knight, R., Martinovic, I. et al. “Subliminal Probing for Private Information via EEG-Based BCI Devices.” arXiv Preprint arXiv, 1312, 6052 (2013).
82 Li, Ding, and Conti (2015).

The Threats: What Problems Does DTC-EEG Pose?

Privacy

“Normatively and culturally, the mind is an archetypal space of privacy”83

As we have seen, even consumer-grade EEG devices are capable of revealing information about ourselves and our mental life, some of which we might not want shared.

Our (albeit limited) ability to access the contents of someone’s mind despite active efforts to conceal that information, as shown by Farwell et al., marks a new threshold of privacy violation. Here, privacy refers to an individual’s interest in avoiding the unwanted collection of her functional neuroimaging information by a third party.84 The reason this violation of privacy is so egregious is that, if anything can be considered private, one would think it would be the content of your thoughts. In a 1986 opinion, Justice Allen E. Broussard wrote that “[i]f there is a quintessential zone of human privacy it is the mind.”85 If we see the mind as a special sphere of privacy, then the argument could be made that even in the absence of any other physical, psychological, or social harm stemming from that violation, the violation of that sphere alone represents a harm or wrongdoing to the individual. In line with this, the case has recently been made for neuroprivacy as a human right.86

Indeed, legal precedent seems to support the sanctity of mental privacy as a protected right. In Pennsylvania v. Muniz, the court concluded that the privilege protected under the Fifth Amendment is “‘served when the privilege is asserted to spare the accused from having to . . . share his thoughts and beliefs with the Government,’ because it is the attempt to force [the accused] ‘to disclose the contents of his own mind’ that the privilege protects against.”87,88 In her legal analysis of the notion of mental privacy, Amanda Pustilnik concludes that “[c]ase law and cultural norms support the conclusion that people have a reasonable expectation of privacy and security in their physical bodies, in the contents of and in their actions within private spaces like the home, and in their unexpressed or unpublished thoughts and reflections”89 (emphasis mine).

83 Pustilnik, Amanda C. "Neurotechnologies at the intersection of criminal procedure and constitutional law." (2012).
84 Tovino, Stacey A. "Functional neuroimaging information: A case for neuro exceptionalism." Fla. St. UL Rev. 34 (2006): 415.
85 Long Beach City Emps. Ass'n. v. City of Long Beach, 719 P.2d 660, 663 (Cal. 1986).
86 Ienca, Marcello, and Roberto Andorno. "Towards new human rights in the age of neuroscience and neurotechnology." Life Sciences, Society and Policy 13.1 (2017): 5.

However, unpublished thoughts and reflections are now accessible without the thinker’s participation or cooperation. Researchers have shown that it is possible to probe people for information they did not mean to share. For example, researchers Matovu and Serwadda (2016) sought to find out whether they could glean sensitive personal information from brain data captured by two popular EEG-based bioauthentication systems. Using only the information collected for bioauthentication, they were able to correctly identify 25% of the alcoholics in an old medical data set of EEG scans based on a well-documented EEG artifact associated with alcoholism – a delayed P300 response.90,91 Although 25% reflects a high number of false negatives and as such is fairly insensitive, it still means that 25% of those people lost their privacy regarding something that has a strong potential negative impact on their lives. And we are still in the early stages – the concern is the potential invasion represented by the continuing evolution of these devices.

87 Pennsylvania v. Muniz, 496 US 582, 595 (1990).
88 Pustilnik (2012).
89 Ibid.
90 Scudellari, Megan. “EEG Identification Can Steal Your Most Closely Held Secrets.” IEEE Spectrum: Technology, Engineering, and Science News, 9 Sept. 2016, spectrum.ieee.org/the-human-os/biomedical/devices/eeg-identification-can-steal-your-most-private-secrets. Accessed 17 Sept. 2017.
91 Matovu, Richard, and Abdul Serwadda. "Your substance abuse disorder is an open secret! Gleaning sensitive personal information from templates in an EEG-based authentication system." Biometrics Theory, Applications and Systems (BTAS), 2016 IEEE 8th International Conference on. IEEE, 2016.

Other authors have documented how the neurodata generated by BCI technologies can be used to obtain sensitive information. Frank et al. (2013) conducted subliminal probes in which the victims were shown visual stimuli for 13.3ms – a duration not usually long enough for conscious cognitive perception.92 The stimuli flashed images of things the viewer may or may not recognize, in order to elicit a P300 response indicating whether the image was of something particularly familiar or salient to the viewer. The results showed that by carefully designing the visual stimuli, an attacker can reduce the uncertainty of guessing a user’s private information by more than 20% relative to chance, while the victim remains unaware of being probed.

Bonaci et al. (2015) studied how EEG BCI platforms used in games or web navigation can be misused to extract users’ private information.93 They presented subliminal stimuli to the users for approximately 7ms while playing a game and used their EEG signals to extract private information. Based on the user’s reaction to different stimuli presented (for example, logos of different companies) they claim to have been able to determine user preferences between brands, for example if a user preferred Target over Walmart.94

Finally, the first malicious software designed to detect a user’s private information using BCI – or “brain spyware” – was presented by Martinovic et al. at the 2012 USENIX Security Symposium. They investigated how third-party EEG applications could infer private information about users by manipulating the visual stimuli presented on screen and analyzing the corresponding responses in the EEG signal.95 They used Emotiv’s EPOC+ and a computer game they developed to present users with visual stimuli and record their EEG neural signals. Focusing on the P300 response, they analyzed the recorded signals and were able to successfully detect users’: (a) 4-digit PINs (20% correct on the first try), (b) bank information (30% on the first try), (c) months of birth (60% on the first try), (d) locations of residence (30% on the first try), and (e) whether they recognized the presented set of faces (20% on the first try). In their review of existing BCI apps, Takabi et al. (2016) concluded, “our findings show that all applications are capable of collecting EEG signals of their users and extracting private information about them.”96

92 Frank et al. (2013).
93 Bonaci, T. L. B. M. T., J. Herron, and H. J. Chizeck. "How susceptible is the brain to the side-channel private information extraction." American Journal of Bioethics, Neuroscience 6.4 (2015).
94 Frank et al. (2013).
95 Martinovic et al. (2012).
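The general logic of such a P300 side-channel attack can be sketched in a few lines of code: present candidate stimuli (say, the digits 0-9 when the attacker wants a PIN digit), average the responses to each candidate, and guess the candidate whose P300-window amplitude is largest. This is a simplified, hypothetical illustration of the idea, not the pipeline Martinovic et al. actually used; all names and parameters here are invented for the example.

```python
import numpy as np

P300_WINDOW = (0.3, 0.6)  # seconds post-stimulus, an illustrative choice

def p300_score(epochs: np.ndarray, fs: float) -> float:
    """Mean amplitude in the P300 window, averaged over all epochs for one candidate stimulus."""
    erp = epochs.mean(axis=0)
    lo, hi = int(P300_WINDOW[0] * fs), int(P300_WINDOW[1] * fs)
    return float(erp[lo:hi].mean())

def guess_secret(responses: dict, fs: float):
    """responses maps each candidate stimulus (e.g., digit 0-9) to an array of epochs
    (n_presentations x n_samples); the guess is the candidate with the largest P300."""
    scores = {candidate: p300_score(ep, fs) for candidate, ep in responses.items()}
    return max(scores, key=scores.get)

# Hypothetical data: digit 7 evokes an enlarged P300 because it is part of the user's PIN.
fs, n_samples = 256.0, int(256 * 1.0)
responses = {d: np.random.randn(20, n_samples) for d in range(10)}
responses[7][:, int(0.3 * fs):int(0.6 * fs)] += 2.0
print(guess_secret(responses, fs))  # likely prints 7
```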

Confidentiality

While privacy issues stem largely from concerns regarding the security of brain-apps and “brain hacking,” confidentiality issues arise from information wilfully shared by the app developers. Confidentiality in this case refers to “the obligation of an individual or organization to prevent the unauthorized or otherwise inappropriate use or disclosure of appropriately gathered functional neuroimaging information.”97

It is well known that apps share user data, including health information that many users assume is private.98,99 MapMyFitness, for example, earns half of its revenue from partnerships with large health insurance companies such as Humana and Aetna.100 A UK study found that of 79 NHS-accredited health and wellness apps, 89% transmitted information to online services and 66% of the apps that sent identifying information over the internet did not use encryption. 20% did not have a privacy policy at all. Most (90%) communicated with one or more third-party services directly, and one fifth sent information to advertisers and marketers directly. Nearly half (47%) of the apps did not fully disclose that strong personal identifiers would be transmitted, and a quarter sent analytics information without informing users.101 At the heart of the ethical issues presented by these devices is not what the devices can actually do, but the fact that they are collecting data for future uses without any protection for users/data providers.

96 Takabi et al. (2016).
97 Tovino (2006).
98 Grundy, Quinn, Fabian P. Held, and Lisa A. Bero. "Tracing the Potential Flow of Consumer Data: A Network Analysis of Prominent Health and Fitness Apps." Journal of medical Internet research 19.6 (2017).
99 Privacy Rights Clearinghouse. "Mobile health and fitness apps: What are the privacy risks." Retrieved September 7 (2013): 2013.
100 Steel, Emily, and April Dembosky. “Health apps run into privacy snags.” Financial Times, 1 Sept. 2013, www.ft.com/content/b709cf4a-12dd-11e3-a05e-00144feabdc0. Accessed 20 Sept. 2017.

In the US, a 2013 Privacy Rights Clearinghouse study found that 43% of free apps share personal identifying information with advertisers,102 and concluded that the mobile app “ecosystem” is largely unregulated – a concerning fact for health and wellness apps, which often collect both demographic and health information that does not fall under the protections of any health privacy laws.103 In most cases, health and fitness data are only protected to the extent stated in a privacy policy—if there is a privacy policy at all. However, even if an app does agree not to share EEG data or personal identifiers with users’ employers, insurers, or law enforcement, privacy policies do not apply to the third parties with whom they do share the information.104

In 2013, an investigation and majority staff report by the Committee on Commerce, Science, and Transportation found that “[d]ata brokers collect a huge volume of detailed information on hundreds of millions of consumers, including financial, health, and other personal information; consumers have little or no awareness of these activities.”105 From this, data brokers are able to build dossiers on app users to sell to marketers – dossiers that consumers are unable to obtain or correct. When combined, consumer-generated data can be used for statistical modeling for health or financial risk profiling. Such information is purchased by hedge funds, hospitals, large provider networks, payers, pharmaceutical companies, and others.106 Even if the data are de-identified, third parties or app “families” can link multiple user accounts across apps to create aggregated user profiles and a more complete picture of a consumer’s social network and health status. These aggregate profiles are then monetized and used for “marketing, screening prospective tenants or employees, or maliciously for identity fraud.”107

101 Huckvale, Kit, et al. "Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment." BMC medicine 13.1 (2015): 214.
102 Privacy Rights Clearinghouse (2013).
103 Ibid.
104 Ibid.
105 Senate, U. S. "A Review of the Data Broker Industry: Collection, use, and sale of consumer data for marketing purposes." Washington, DC: Committee on Commerce, Science, and Transportation, US Senate (2013).

All of this is big business. Data brokers value health data more than any other kind of user data, and as a result health data is worth a lot of money. One study found that while demographic information such as age, gender, and location was worth only $0.00005 per person, health information, such as the specific diseases or drug prescriptions a person has, was worth $0.26.108

All of this is potentially relevant to the neurodata generated by DTC-EEG devices and their associated apps and software programs. Whether the purpose of the device is purely informative regarding your neural functioning, or collecting the information is simply a means to control an external object, DTC-EEG products generate a lot of health information, and this stands only to increase as these devices are refined and become more popular. Depending on to whom the company sells the data, and in what format, neurodata could pose harms to device users. For example, perhaps a brain-training game records that your EEG consistently shows higher theta, delta, and beta activity than the average user, as is commonly observed in alcoholics.109 Paired with information from MyFitnessPal regarding consumption patterns (i.e., are many of your calories coming from alcoholic beverages or “other”?), or even alone, a data broker might draw the conclusion that the device/app user is an alcoholic – information that a user’s insurance company would surely be interested in.

106 Sarasohn-Kahn, Jane. “Here's Looking at You.” California Health Care Foundation, July 2014, www.chcf.org/publications/2014/07/heres-looking-personal-health-info. Accessed 17 Sept. 2017.
107 Li, Jingquan. "A privacy preservation model for health-related social networking sites." Journal of medical Internet research 17.7 (2015).
108 Comfort, Nathaniel. “The Genetic Self.” The Point Magazine, 15 Jan. 2015, thepointmag.com/2014/examined-life/genetic-self. Accessed 17 Sept. 2017.

Because health, and especially “wellness”, apps are often provided by organizations that are not traditional medical providers, they can fall outside the scope of existing legal and professional confidentiality safeguards.110 Thorough analysis of the different existing legal frameworks that could regulate neurodata has been conducted.111 As such, I will only briefly summarize why each of these is insufficient to protect the confidentiality of DTC-acquired neurodata. The point of this endeavour is to show the scope and validity of the threats posed to the confidentiality of neurodata, before turning to why the nature of neurodata in particular makes data sharing of this type of particular concern.

Difficulties in regulating the confidentiality of DTC-generated neurodata arise from the fact that it falls somewhere between medical information, research data, commercial product data, and educational performance records. The neurodata collected by these DTC-EEG devices is not protected information under HIPAA, since the commercial companies that sell DTC-EEG devices are neither covered entities nor likely to meet the criteria to qualify as “business associates” of covered entities.112 As such, HIPAA is unlikely to offer sufficient protection of consumer-generated neurodata.113

109 Acharya et al. (2014).
110 Huckvale et al. (2015).
111 For a thorough analysis of the protections currently awarded neurodata, see Tovino (2006) and Kostiuk, Stephanie A. "After GINA, NINA-neuroscience-based discrimination in the workplace." Vand. L. Rev. 65 (2012): 933.

Neurodata also fails to be protected as general consumer data collected by commercial entities. The Federal Trade Commission (FTC) has the authority to penalize consumer-facing, for-profit companies for failing to abide by commitments regarding data use as stated in privacy policies.114 However, product developers get to determine which protections to offer their users, or even if they are going to have a privacy policy at all, and the FTC’s stance on mobile privacy is “not intended to serve as a template for law enforcement actions or regulations under laws currently enforced by the FTC.”115 As such, there is no standard assumption of privacy of personal information, and terms and conditions are often heavily skewed towards the commercial companies’ interests.

If one views results from brain training games as educational performance records, then an argument could be made that the neurodata is protected under the US Family Educational Rights and Privacy Act, which heavily protects educational records;116 however, this is unlikely, as the companies collecting the data are not educational institutions.

As for the device itself, the FDA only regulates medical devices, which are defined by their intended use, not their mechanism of action,117,118 and product developers are careful to market their devices as general wellness aids, something the FDA has decided is beyond its purview to regulate.119 The FDA has also made a public policy decision not to regulate health apps unless they can physically harm an individual.120,121 Since EEG apps pose little to no threat of physical harm, these products remain unregulated.122

112 “Health App Use Scenarios & HIPAA.” Health App Developers, what are your questions about HIPAA?, US Dept. of Health and Human Services Office for Civil Rights, Feb. 2016, hipaaqsportal.hhs.gov/community-library/accounts/92/925889/OCR-health-app-developer-scenarios-2-2016.pdf. Accessed 17 Sept. 2017.
113 Ibid.
114 Federal Trade Commission. "Mobile privacy disclosures: Building trust through transparency." USA: Federal Trade Commission (2013).
115 Ibid.
116 Purcell & Rommelfanger (2015).
117 Federal Food, Drug, and Cosmetic Act, Section 201(h).
118 By the FDA’s definition, “the words intended uses... refer to the objective intent of the persons legally responsible for the labeling of devices ... this objective intent may be shown by labeling claims, advertising matter, or oral or written statements by such persons.” Code of Federal Regulations. Title 21, Vol. 8. Sec. 801.4 Meaning of intended uses. Rev. Apr 1, 2016.

The essential question is: what sort of data is this? Is DTC neurodata one of the types of information described above, or is it something else? Is it protected by all of these oversight mechanisms, some, or none? The confidentiality and use of cognitive performance neuroimaging data fall into new territory somewhere between the commercial, research, medical, and educational domains, and as such, their regulation and protection are falling through the cracks.

Discrimination

One of the main reasons why privacy and confidentiality are so important with respect to neurodata in a big data context is the potential for discrimination in education, employment, and insurance based on traits or conditions revealed by EEG data about an individual's neural functioning. Conditions discussed above, such as epilepsy, alcoholism, and depression, can, with varying degrees of precision, be detected with EEG data. The potential for harm from the disclosure of that information is high. For example, an insurance company could demand that you share your neurodata on your phone with them and, upon seeing the delayed P300 response that has been associated with depression,123 record that as a pre-existing condition that disability insurance will not cover.

119 In 2015 the FDA attempted to determine a cutoff for what they were responsible for monitoring and issued a non-legally binding "guidance" that stated: "general wellness products presenting a low risk to safety will not be regulated as medical devices by the FDA" and that a general wellness product is one that makes claims related to "maintaining or encouraging a general state of health" without references to diseases or conditions. Examples of acceptable wellness claims are those relating to "mental acuity," "concentration," "problem solving," and "relaxation and stress management." (United States of America. U.S. Department of Health and Human Services. Food and Drug Administration. General Wellness: Policy for Low Risk Devices. January 20, 2015. Web.) 120 Food and Drug Administration. "Mobile medical applications: guidance for industry and Food and Drug Administration staff." USA: Food and Drug Administration (2013). 121 Food and Drug Administration. "Medical Device Data Systems, Medical Image Storage Devices, and Medical Image Communications Devices: guidance for industry and Food and Drug Administration staff." USA: Food and Drug Administration (2015). 122 EEG is a class II medical device. Low-moderate risk requires premarket notification, meaning the sponsor must demonstrate that the device to be marketed is at least as safe and effective ("ie substantially equivalent") as a legally marketed device (21 CFR § 882.1400). An EEG spectrum analyzer on the other hand is classified as a class I device (21 CFR 882.1420). Thus it is likely that while a DTC-EEG device needs to notify the FDA of its intent to go to market, associated apps do not.
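To make the scale of this threat concrete, the kind of screening imagined above requires very little technical sophistication. The sketch below is not drawn from any DTC vendor's actual software; it simply shows how, in principle, a few lines of Python could estimate a P300 peak latency from an averaged, stimulus-locked EEG epoch and flag it as "delayed". The sampling rate, search window, and 400 ms cutoff are invented placeholders, not validated clinical criteria.

```python
# Illustrative sketch only (not from the thesis or any real product):
# estimating P300 peak latency from an averaged ERP at a single electrode.
import numpy as np

def p300_latency_ms(erp, sfreq=256.0, window=(0.25, 0.60)):
    """Return the latency (ms) of the largest positive peak in the P300 search window.

    erp   : 1-D array holding an averaged, stimulus-locked EEG epoch (e.g., at Pz),
            with index 0 corresponding to stimulus onset.
    sfreq : assumed sampling frequency in Hz (placeholder, not a device spec).
    """
    start, stop = (int(t * sfreq) for t in window)
    segment = erp[start:stop]
    peak_idx = start + int(np.argmax(segment))  # index of largest positive deflection
    return 1000.0 * peak_idx / sfreq

if __name__ == "__main__":
    # Stand-in data: 800 ms of noise with an injected "late" peak around 450 ms.
    rng = np.random.default_rng(0)
    fake_erp = rng.normal(0.0, 1.0, int(0.8 * 256))
    fake_erp[int(0.45 * 256)] += 10.0
    latency = p300_latency_ms(fake_erp)
    flag = "(flagged as delayed)" if latency > 400 else ""
    print(f"Estimated P300 latency: {latency:.0f} ms {flag}")
```

The point is not that such a script would be diagnostically valid, but that nothing technical stands between a third party holding raw neurodata and this kind of crude inference.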

Perhaps more concerning, however, is the potential use of EEG data to assess cognitive functioning. Employers or schools could evaluate patterns of functional activity that have been claimed to be associated with some aspect of "intelligence"124 and discriminate based on perceived innate abilities, rather than based on demonstrated merit. As Kostiuk (2012) points out, with it now illegal in the US to use genetic testing in this context, neuroimaging may be an attractive alternative for employers.125 However, this use of neurodata would constitute discrimination in the same way as use of genetic data, as people should be evaluated based on actual observable, behavioural abilities and not previously invisible biological information or proclivities.

Existing non-discrimination regulations would not protect individuals from the use of such technologies, as these provisions usually only apply to protected classes or disabilities.

For example, the Americans with Disabilities Act (ADA) does not consider "common personality traits such as poor judgment or quick temper" to be a protected disability, and thus the use of EEG devices to screen for personality traits or anything that is not a recognized disability or protected class would not be disallowed.126 The insufficiency of the Genetic Information Non-Discrimination Act (GINA) for neurodata will be discussed below.

123 Olbrich, Sebastian, and Martijn Arns. "EEG biomarkers in major depressive disorder: discriminative power and prediction of treatment response." International Review of Psychiatry 25.5 (2013): 604-618. 124 Thatcher, Robert W., Duane North, and C. Biver. "EEG and intelligence: relations between EEG coherence, EEG phase delay and power." Clinical Neurophysiology 116.9 (2005): 2129-2141. 125 Kostiuk (2012). 126 Tovino (2006).

Discrimination based on neurodata also has implications for mental illness and associated stigma. Of note, many of the DTC companies specifically advertise help with mental illnesses.127 Part of what would make neurodata-based discrimination different from other forms of discrimination is that while people tend to regard physical, biological conditions as illnesses in the technical sense, far too many still regard mental illnesses as signs of a character flaw. Generally, people do not have the same cultural understanding of mental illness as they do of other forms of illness, which makes the risk of discrimination on the basis of neurodata more acute and the potential for stigmatization much greater.

Because of these concerns about privacy, confidentiality, and discrimination, a growing number of scientists, ethicists, and policy makers are calling for increased protections of neurodata.128,129,130

Neurodata – The New Genetic Information Debate?

So far, this analysis of the ethical, legal, and social implications of DTC neurodata may sound very familiar – it has raised many of the same considerations that have been raised regarding DTC-GT and the resulting increasing presence of commercially obtained and owned genetic information. Indeed, neurodata shares many of the same features as genetic information. Both genetics and neuroscience present novel opportunities to collect biological information about ourselves that was previously inaccessible, and this is increasingly being done with DTC products. Both are unique identifiers,131 both have the ability to provide information about future disease burden that applies to individuals and families, both can be potentially stigmatizing or cause for discrimination, and both are closely related to our identity and sense of self. But while the ethics of DTC-GT has received much attention in the literature, there has been insufficient discussion of DTC neurotechnologies and the associated neurodata generated.

127 For a discussion of the ethics and privacy concerns relating to the increasing use of mental health apps, see Giota, Kyriaki G., and George Kleftaras. "Mental health apps: innovations, risks and ethical considerations." E-Health Telecommunication Systems and Networks 3.03 (2014): 19. 128 Ienca and Andorno (2017). 129 Kostiuk (2012). 130 Tovino (2006).

Ubiquitous in the literature about neurodata are comparisons with genetic information.132 However, we must be careful not to assume that these two types of information raise identical problems, and be aware of the unique issues raised by neurodata.

It would be a mistake to simply repeat the genetic information debate and resulting regulations, given that there are essential differences in the nature of the information collected through these DTC products. As this paper will argue, while both markets pose potential risks, there are qualitative differences between genetic information and neurodata, such that the latter requires unique regulation beyond and distinct from that required for genetic information.

131 Genetic information is at this time far more adept at this, but advances in the field of bioauthentication suggest that may not be the case for long. 132 A few examples of the language used to draw these parallels include: "similarities [between genetics and neuroimaging] are striking and span the domains of both research and clinical ethics." Illes, Judy, and Eric Racine. "Imaging or imagining? A neuroethics challenge informed by genetics." The American Journal of Bioethics 5.2 (2005): 5-18. "[M]ost of the ethical perplexities raised by genomics are transposable, perhaps with an even higher degree of urgency, to the field of neurosciences." Mauron, Alex. "Renovating the house of being." Annals of the New York Academy of Sciences 1001.1 (2003): 240-252. "[M]any of the qualities supposedly marking out neurodata are also shared by genetic data." Hallinan, Dara, et al. "Neurodata and neuroprivacy: Data protection outdated?" Surveillance & Society 12.1 (2014): 55. "The committee observed there are important similarities between genetic and brain data, in that: (1) "both genetic and brain data hold out the promise of prediction (not only disease, but also behavior)", and (2) "both types of information expose unique and personal, and to a large extent, uncontrollable aspects of a person that previously were unobservable". Based on these observations, the committee proposed exploring and leveraging for neuroethics those medical, ethical and legal rules already set forth in genetic research." Bonaci, Tamara, Ryan Calo, and Howard Jay Chizeck. "App stores for the brain: Privacy & security in Brain-Computer Interfaces." Ethics in Science, Technology and Engineering, 2014 IEEE International Symposium on. IEEE, 2014. For an excellent discussion of the similarities and differences between neuroethics and genetics, see Illes and Racine (2005), and Greely, Henry T. "Neuroethics and ELSI: Similarities and differences." Minn. JL Sci. & Tech. 7 (2005): 599.

This paper will argue that DTC, commercially obtained neurodata raises unique ethical, legal, sociopolitical, and practical challenges beyond that of DTC genetic testing, and thus the frameworks developed for the management and protection of genetic data will be insufficient for the management and protection of neurodata. To this end, this paper outlines some of the most salient differences between neurodata and genetic information and highlights the associated challenges that arise relating to those differences. It concludes that special considerations for legislation arising from these challenges must be recognized by policymakers interested in regulating the production, use, and operationalization of neurodata in the public sector.

Differences Between Neurodata and Genetic Information

Neurodata and genetic information are qualitatively different for two main reasons: the nature of the information itself is different, and our beliefs about the information are different. I will argue that special considerations for regulation and management arise from those differences.

Differences in the nature of the information

The first way in which the nature of the information expressed by neurodata differs from that expressed by genetic information is that while the latter is static, the former is dynamic. This added temporal aspect of the information raises issues beyond those raised by genetic information.

A genome remains static over time, meaning that once you sequence a person's genome, there is nothing more to learn from future testing. Advancements in genetics and genomics may be able to give us more information about that genome, but it reflects nothing new or changing about the person herself or her genome. That is, the new genetic information gleaned does not reflect changes in the person, but rather a change in our knowledge about that person/genome. With functional neuroimaging, however, we can observe an individual's brain functioning over time with minute temporal resolution and can continuously and repeatedly track cognitive functioning and changes in brain structure. Not only does this mean one can constantly learn new things about an individual as they arise, but it raises the potential for monitoring (either self-monitoring by the individual or by others in the case that personal imaging technology gets "hacked"), as well as the operationalizing of neurodata (for example, through BCIs).

One could argue that epigenetics proves this to be false: our genes are not static and unalterable but responsive to our environment, interactions, and experiences. Childhood trauma, for example, has been shown to lead to increased DNA methylation, leading to greater cortisol stress reactivity and overactive stress response in children and through adulthood.133 Further, the ability to non-invasively monitor in vivo global gene expression means that we can now track genomic "functioning" over time.134 However, this merely reflects changes in what aspects of the genome get expressed, and not a material change in the underlying genome itself. Further, unlike neurodata, humans cannot alter this genetic information at will. It is this manipulability that makes the dynamic nature of neurodata of importance.

133 Houtepen, Lotte C., et al. "Genome-wide DNA methylation levels and altered cortisol stress reactivity following childhood trauma in humans." Nature Communications 7 (2016). 134 Koh, Winston, et al. "Noninvasive in vivo monitoring of tissue-specific global gene expression in humans." Proceedings of the National Academy of Sciences 111.20 (2014): 7361-7366.

This manipulability is the second difference in the nature of neurodata and genetic information. The fact that neurodata is changing and changeable means that an individual can intervene in the brain and alter its functioning in a way that one cannot with one's genes. It is not just that an individual's neurodata changes over time (it could change randomly, for instance) - it is that it changes in response to stimuli, including stimuli that we control. Based on the information that neurodata provides, one can choose to alter neural functioning through a number of means, such as concentrated effort with meditation, neurofeedback, and cognitive behavioural therapy.135 This has therapeutic implications and means that neurodata is operationalizable in a way that genetic information is not – a topic that will be returned to later. Genetic information does not present the same opportunity for active, individual intervention in one's own biology.

Finally, neurodata and genetic information differ in terms of their proximity to behaviour. Based on what behavioural genetics has taught us, practically all behaviours found to have a genetic contribution are polygenic in nature and require the input and interaction of multiple genes and the environment.136 Further, there are a large number of steps between a gene and its purportedly associated behaviour – that is, between genotype and phenotype – when it comes to behaviour.

Neural functioning, on the other hand, can be directly associated with a given behaviour, and this association can be followed on a millisecond-to-millisecond time scale.

135 Whether or not these kinds of manipulations alter the kind of data that one would worry about (for example, evidence of alcoholism or mental illness) varies. For example, it is unlikely that by force of will an alcoholic could "normalize" her theta band activity. On the other hand, research has shown that people can exercise their will to change deep-seated OCD brain patterns. However, one cannot exercise their will to change their genetic expression in any case. 136 For a review of behavioural genetics, see Samek, Diana, Bibiana D. Koh, and Martha A. Rueter. "Overview of behavioral genetics research for family researchers." Journal of Family Theory & Review 5.3 (2013): 214-233.

Furthermore, behaviour can be disrupted in often predictable ways when the brain is damaged.137 This is significant in terms of the import we are able to ascribe to neural vs. genetic information. Except for a handful of monogenic disorders, genes are probabilistic – they indicate an increased likelihood for something to happen. Neurodata, on the other hand, shows something that actually is happening - there is less uncertainty about it. As Roskies (2009) writes: "it does seem that if one traces a causal path from genes through to behaviour, the brain events are much more proximal to behaviours than are details of the genetic code."138 This proximity has an important impact on the way we view neurodata and thus has implications for its management.

Thus neurodata and genetic data differ in the nature of the information itself: neurodata is dynamic in response to the world, manipulable, and closely and directly associated with behaviour in a non-probabilistic way.

Differences in beliefs about neurodata and genetic information

Not only do important qualitative distinctions arise from the differences in the nature of the information provided by neurodata and genetic information, but the way in which we view this information also differs: a reflection of the fact that there are important differences in our cultural assumptions about the brain and genes. Although both give rise to essentialist views of the self — that is, both types of information share the features of "having a largely unobserved, underlying, nontrivial causal influence on people's behaviours and outcomes,"139,140 — we are essentialist about them in different ways and this is important when it comes to the meaning of this information for us.

137 For example, we know that damaging Broca's area directly impedes speech and that a damaged occipital cortex prevents the synthesis of visual data. 138 Roskies, Adina L. "What's "Neu" in Neuroethics?" The Oxford Handbook of Philosophy and Neuroscience. 2009.

Neuroessentialism is similar enough to genetic essentialism to share the associated concerns relating to informational exceptionalism.141 However, neuroessentialism is sufficiently different from genetic essentialism in its implications to pose unique concerns regarding the handling of associated information. While we tend to use essentialist language in both neuroscientific and genetic debates, I will argue that the essentialism is not the same: the sense of Self that is derived from neuroessentialism is more in line with our folk conceptions of the Self than is the sense of Self that derives from genetic essentialism.

Discussions of essentialist tendencies draw materially different pictures of neuroessentialism and genetic essentialism, with the latter far more focused on family, kinship, and group membership. While genetic essentialism is more characterized by its tendency to define entities in terms of group membership, neuroessentialism has the tendency to evoke a sense of Self based on individualism rather than shared group characteristics. This is manifest in the different ways that people discuss genetic essentialism and neuroessentialism, as will be addressed below.

139 Dar-Nimrod, Ilan, and Steven J. Heine. "Some thoughts on essence placeholders, interactionism, and heritability: reply to Haslam (2011) and Turkheimer (2011)." (2011a): 829. 140 Dar-Nimrod et al. propose that people's understanding of genetics with relation to life outcomes is shaped by their psychological essentialist biases. Genetic essentialism refers to the tendency that "learning about genetic attributions for various human conditions leads to a particular set of thoughts regarding those conditions: they are more likely to be perceived as (a) immutable and determined, (b) having a specific etiology, (c) homogeneous and discrete, and (d) natural" (For a discussion of genetic essentialism, see Dar-Nimrod, Ilan, and Steven J. Heine. "Genetic essentialism: on the deceptive determinism of DNA." Psychological Bulletin 137.5 (2011b): 800.) Similarly, neuroessentialism is the view that the definitive way of explaining human psychological experience is by reference to the brain and its activity (Schultz, William. "Neuroessentialism: Theoretical and Clinical Considerations." Journal of Humanistic Psychology (2015): 0022167815617296.), the underlying assumption being that "for all intents and purposes, we are our brains" (Reiner, Peter B. "The rise of neuroessentialism." (2011). p. 161). 141 Genetic exceptionalism is the belief that genetic information displays certain novel characteristics and must therefore be treated differently from other types of medical information. Similarly, the argument has been made for neuroexceptionalism. (For a thorough argument for neuroexceptionalism, see Tovino 2006). This exceptionalism stemming from our essentialist beliefs is similar across domains and leads to concerns relating to privacy and discrimination common to both genetic information and neurodata. As it is very sensitive information about what we hold to be the essence of ourselves, we do not want other people knowing about it without our explicit permission. Further, both provide information that might not necessarily be visible to the naked eye, so one can discriminate not on an individual's behaviour but based on something over which the individual has no control.

Essentialism in general is a psychological tendency defined as "an ordinary mode of category representation that has powerful social-psychological consequences."142 Dar-Nimrod and Heine (2011) write:

“People tend to “essentialize” certain entities that they encounter. They perceive “natural” categories such as chemicals, minerals, and especially living organisms as having an underlying, nontrivial, fundamental nature that makes them what they are… People demonstrate psychological essentialism when they perceive an elementary nature or essence, which is underlying, deep, and unobserved, that causes natural entities to be what they are by generating the apparent shared characteristics of the members of a particular category… As a cognitive heuristic, psychological essentialism facilitates, and at times determines, the formation of categories.”143,144

However, by examining various definitions of essentialism in the literature, there appear to be two senses in which people use the term – essentialism described above as having an essential feature that defines a category, and essentialism as being the defining essence of a given thing. The defining elements of psychological essentialism as they pertain to category membership fit nicely with lay conceptions of genes, as such making them a particularly strong placeholder for such an "essence".145,146

142 Prentice, Deborah A., and Dale T. Miller. "Psychological essentialism of human categories." Current Directions in Psychological Science 16.4 (2007): 202-206. 143 Dar-Nimrod, Ilan, and Steven J. Heine. "Genetic essentialism: on the deceptive determinism of DNA." Psychological Bulletin 137.5 (2011b): 800. 144 Similar accounts include "essentialism represents groups as fundamentally and categorically distinct. If a group's identity is determined by an essence, its members are all deeply different from members of other groups, as well as being fundamentally the same as one another…Whatever form it takes, essentialist thinking has an insidious tendency to deepen divisions among human groups, creating a view of the social world as collection of fixed and segregated categories." Haslam, Nick. "Genetic essentialism, neuroessentialism, and stigma: commentary on Dar-Nimrod and Heine (2011)." (2011): 819, and "Essentialism is the view that certain categories (e.g., women, racial groups, dinosaurs, original Picasso artwork) have an underlying reality or true nature that one cannot observe directly. Furthermore, this underlying reality (or "essence") is thought to give objects their identity, and to be responsible for similarities that category members share." Gelman, Susan A. "Essentialism in everyday thought." Psychological Science Agenda 19.5 (2005): 1-6.

Brodwin (2002) sums up this interpretation of genetic essentialism as a measure of category membership nicely when he says "[k]nowledge of genetic connection alters how we imagine our "significant same": those people who are significantly like me, connected to me, and hence the same as me in some categorical sense… Genetic knowledge has the power to change the group with whom we share a "deep, horizontal comradeship" and "genetic evidence can de-stabilize long-standing patterns of community membership."147 This way of speaking about genetics has a particular flavour – that the essence to which one is essentialized is based on a category membership, of you as an instance of a type. This concept of category membership extends beyond the species level (that our genes make us a member of the category 'human') to the familial level – your genes define your membership in the category of your family, your clan. In this way genetic essentialism is also more about relatedness and connection than neuroessentialism is, with you being an instance of the type that is your family. The genetic self is both about how I am made and how I function, and about where I sit in a network of relatedness.

145 Dar-Nimrod and Heine (2011a). 146 Such accounts of genetic essentialism include: "The defining elements of psychological essentialism (i.e., immutable, fundamental, homogeneous, discrete, natural) are similar to the common lay perception of genes. Such similarity suggests that members who are assumed to share a distinct genetic makeup are also assumed to share their essence. People's understanding of genes may thus serve as an essence placeholder, allowing people to infer their own and others' abilities and tendencies on the basis of assumed shared genes. The tendency to infer a person's characteristics and behaviors from his or her perceived genetic makeup is termed genetic essentialism." Dar-Nimrod and Heine (2011a); "[genetic essentialism is] the tendency for people to think in more essentialist ways upon encountering genetic concepts... A potent cognitive bias" Gould, Wren A., and Steven J. Heine. "Implicit essentialism: genetic concepts are implicitly associated with fate concepts." PloS One 7.6 (2012): e38176; and "genetic essentialism is grounded in the belief that genes determine our species membership and our individual identity and characteristics. There are two critical features embedded within a genetic essentialist view: (i) gene determinism (the notion that genes cause species/group/individual characteristics), and (ii) categories of homogeneity and difference (the notion that genes underline the distinctions and commonalities among us)." Kong, Camillia, Michael Dunn, and Michael Parker. "Psychiatric genomics and mental health treatment: Setting the ethical agenda." The American Journal of Bioethics 17.4 (2017): 3-12. 147 Brodwin, Paul. "Genetics, identity, and the anthropology of essentialism." Anthropological Quarterly 75.2 (2002): 323-330.

Neuroessentialism, on the other hand, seems to be defined more in line with definitions of essentialism as they pertain to underlying individual essences, identifying that underlying essence as being underpinned by the brain. Examples of proposed definitions of neuroessentialism include:

“the belief that brains and their abnormalities define and determine identity.”148

“the position that, for all intents and purposes, we are our brains”149

“representations of the brain as the essence of a person”150

"the view that the definitive way of explaining human psychological experience is by reference to the brain and its activity."151

And the view that "when we conceive of ourselves, when we think of who we are as beings interacting in the world, the we that we think of primarily resides in our brains."152

Thus, while both make for good placeholders for essentialist tendencies, the nature of the essentialist stance that each type of information provokes is qualitatively different, essentializing the person to either one instance of a category or reducing the self to the activity of the brain. Genetic essentialism is thus actually closer in line with what "essentialism" generally refers to in the literature, whereas neuroessentialism seems to be a bit of a misnomer because it is more of a metaphysical standpoint or a psychological theory of personal identity than a reference to a psychological tendency. Neuroessentialism carries with it a far more materialistic/reductionist flavour than genetic essentialism, with genetic essentialists being seemingly more aware of essentialism as a psychological tendency rather than a metaphysical view. As will be discussed below, this has implications for the sensitivity with which we treat the associated data.

148 Haslam, Nick. "Genetic essentialism, neuroessentialism, and stigma: commentary on Dar-Nimrod and Heine (2011b)." (2011): 819. 149 Roskies, Adina. "Neuroethics for the new millenium." Neuron 35.1 (2002): 21-23. 150 Racine, Eric, et al. "Contemporary neuroscience in the media." Social Science & Medicine 71.4 (2010): 725-733. 151 Schultz, William. "Neuroessentialism: Theoretical and clinical considerations." Journal of Humanistic Psychology (2015): 0022167815617296. 152 Reiner, Peter B. "The rise of neuroessentialism." (2011).

In addition, genetic essentialism takes a form with a far more deterministic quality than does neuroessentialism. Individuals feel "a sense of inevitability with regard to their genes."153 Rather than being predictive of our lives, as our genes are viewed to be, we tend to view the brain and its associated activity as underpinning our conscious lived experience (admittedly, a fairly reductionist viewpoint). I will argue that for this reason, we equate the Self with the brain more than we equate the Self with our genes.

While we might be essentialist about both our genes and our brains, I would argue that the essence by which we are defined in neuroessentialism is closer to our folk conceptions of what comprises the Self than is the essence of genetics. What we think of when we think of ourselves mostly has to do with thoughts, behaviours, and personal narrative.154 If our essence is constituted by our brain, our patterns of neural activity, and our neurotransmitters, this is in no way contradictory to the sense of Self that arises from our lived experiences, our thoughts, our mind, our decisions, and our behaviours. Genetic essentialism, on the other hand, means that our essence is our unchanging genetic code – that inherited material from our ancestors that guides our biological development. It in no way appreciates the lived experiences, conscious perception, and individual agency that we think of when we think of ourselves as individuals.

Thus, neuroessentialism in an important way gives us a different "essence" of the Self than genetic essentialism does – it gives us a Self that is more pliable and mobile, more in flux. This is perhaps why we have a tendency to adopt a neuroessentialist position in discussions of our brains – the image of the Self that emerges from it is more aligned with what we already believe about ourselves.

153 Clayton, Ellen Wright. "Ethical, legal, and social implications of genomic medicine." New England Journal of Medicine 349.6 (2003): 562-569. 154 For a discussion of different conceptions of the Self, and the components consistent across different accounts, see Mathews, Debra JH, Hilary Bok, and Peter V. Rabins, eds. Personal Identity and Fractured Selves: Perspectives from Philosophy, Ethics, and Neuroscience. JHU Press, 2009.

Of course, the ubiquity of these neuroessentialist beliefs is an empirical claim requiring evidence of the majority and minority viewpoints of people's senses of selves, which necessarily will be culture and context specific. To date there has been little empirical work carried out on the prevalence of the various beliefs about what comprises the Self. But the work that has been done supports the findings of Fernandez-Duque and Schwartz (2015), who gathered empirical evidence of people's beliefs regarding the centrality of the brain to the Self and found that, at least in the United States, "people readily acknowledge the contribution of the brain to the central self, both when construed abstractly and when construed at the level of personality traits."155 In their study, 172 participants compared the central self to the peripheral self. The central self, construed at this abstract level, was seen as more brain-based than the peripheral self. They found that "participants embraced the brain as the underlying substrate of their central self, that is, of who they truly are."156 Work by Anglin (2014) found that most people believe the Self is located at a single point in the body, rather than distributed throughout, and that participants tended to locate the "self" and mind in the head and the soul in the chest.157 The most frequent free-response provided for the location of the self in the body was the head, brain, or mind.158,159 However, little other empirical work was found.160

155 Fernandez-Duque, Diego, and Barry Schwartz. "Common Sense Beliefs about the Central Self, Moral Character, and the Brain." Frontiers in Psychology 6 (2015). 156 Ibid. 157 Anglin, Stephanie M. "I think, therefore I am? Examining conceptions of the self, soul, and mind." Consciousness and Cognition 29 (2014): 105-116.

In the absence of empirical evidence on cultural views of the Self, more evidence for the centrality of the brain to common conceptions of the Self comes from analysis of the way people talk about the brain. Generally, authors’ discourse about the brain indicates an inclination to discuss it as if the mind, brain, and Self are the same, and many argue directly from the axiom that the mind is the Self. Authors seem comfortable enough with this equation to make such bold, unsupported statements as:

“The essentialist stance is strengthened by the fact we have a tendency to believe that we are our brains.”161

“The brain is used implicitly as a shortcut for more global concepts such as the person, the individual or the self.”162

“The relationship between the brain and the self is far more direct than the link between genes and personal identity.”163

“If one compares “genome based” and “brain based” explanations of Self and behaviour, it turns out that neural aspects of human nature are more directly relevant.”164

Mauron (2003) makes a strong argument for the centrality of the brain to the Self beyond that of genetics. Necessary to the Self is a numerical assumption – that oneself inherently means that there is one and only one of you.165 It is this individuality that is essential to the notion of the Self. It is perfectly possible and in fact common for people to share the same genes, such as in the case of identical twins. Even amongst non-identical family members, part of your genetic material is shared. Furthermore, genes are specifically constructed to replicate.166 Brains, on the other hand, are truly individual. This is because in their wirings and connections they represent the interface, intersection, and integration of all those things that do make us individual – our genes, our environmental exposures, and our experiences. In this way, there can never be two identical brains, just as there can never be two identical selves.

158 Ibid. 159 Anglin concludes: "Most participants defined the self as one's identity, personality, or thoughts, suggesting that self-perception is a major component of how people conceive of the self. Nearly all participants defined the mind as thoughts/consciousness. The fact that most people defined the self and mind in mental terms further demonstrates overlap between individuals' conceptions of these entities. That is, by providing similar definitions of the self and mind and locating them in similar regions of the body, the results from this study suggest that many people consider the mind closely aligned with the self." 160 For a summary of the empirical work done on the mind/brain/self relationship, see Fernandez-Duque, Diego. "Lay Theories of the Mind/Brain Relationship and the Allure of Neuroscience." The Science of Lay Theories. Springer, Cham, 2017. 207-227. 161 Illes, Judy, and Eric Racine. "Imaging or imagining? A neuroethics challenge informed by genetics." The American Journal of Bioethics 5.2 (2005): 5-18. 162 Racine, Eric, Ofek Bar-Ilan, and Judy Illes. "fMRI in the public eye." Nature Reviews Neuroscience 6.2 (2005): 159. 163 Illes and Racine (2005). 164 Mauron, Alex. "Renovating the house of being." Annals of the New York Academy of Sciences 1001.1 (2003): 240-252.

Similarly, as noted by Reid and Baylis (2005), "our genes are in an important sense not "ours" to begin with: they are the endowment of our parents"167 and "there is a very real sense in which they are not exclusively personal information… The gene that is read in testing or sequencing is what it is mostly because of my heritage and ancestry. It becomes part of my personal identity through the stories I tell, and others accept, about my genes."168 Brains, on the other hand, formulate those stories, integrate internal and external stimuli, and mediate the associated responses.

I am not arguing for materialism or neurodeterminism, nor do I need to accept a reductionist position to stand by the argument that the concepts of consciousness, minds, and brains are in many ways intertwined and inextricably linked and central to our sense of identity and Self. That directness of linkage between all these elements that we so closely relate to the Self, even if the exact nature of that relationship is unknown, makes neuroessentialist tendencies intuitive and direct in a manner different than genetic essentialism.

165 Ibid. 166 Ibid. 167 Reid, Lynette, and Françoise Baylis. "Brains, genes, and the making of the self." The American Journal of Bioethics 5.2 (2005): 21-23. 168 Ibid.

Again, epigenetics could present a challenge to this view. Take again the example of childhood trauma and its effects on cortisol levels and hence stress reactivity later in life. In at least this way, your genes are the cause of the way you are (generally anxious) and what you are experiencing (stress). They are not, however, that experience itself. They may contribute to it, but they do not embody it. A given neural state, on the other hand, reflects that experience. There is no need to take a particular metaphysical stance, nor does it require any ground-breaking neuroscientific findings, to say that our brains in part underpin our perceptions and conscious experiences. That is not to say that we are conscious of everything the brain does, indeed there is ample evidence of an active subconscious. But it is to say that everything we are conscious of directly involves some related brain activity giving rise to that consciousness. One does not need to believe in the reduction of the mind to the brain in order to argue that this is true.

As previously mentioned, the related proximity of the brain to behaviour is another reason why we think of neurodata differently than genetic information. Mauron (2003) writes:

"When links are made between neuroimaging findings and our self-concepts in particular, it is even clearer that the ethics of genetics can only partially help settle ethical issues. Genetics and genomics have provided fertile ground for many ethical reflections on human nature, but the relationship between the brain and the self is far more direct than the link between genes and personal identity."169

What we think of when we think of ourselves mostly has to do with thoughts, behaviours, and narrative life story. Genes are so much further removed from our behaviour that they fail to carry the same weight as neurodata in contributing to these thoughts and behaviours and thus our senses of self. However, the close linkage between the brain and behaviour intrinsically ties conceptions of the Self to conceptions of the brain. The nature of this relationship is of course the topic of much debate, and determining the seat of consciousness and the relation of mind and body is well beyond the scope of this paper. It is undeniable, however, that insofar as we associate our behaviour with our Self, the brain more directly links these concepts than do our genes.

169 Mauron (2003).

The effect of the differences discussed above, I argue, is that neurodata is held to be distinctly private because the notion of the "Self" that is associated with neurodata is closer in line with folk conceptions of the Self than is the notion associated with genetic data. Our cultural assumptions and our location of the brain at the center of the mind and Self mean that neurodata is uniquely sensitive and can provide more salient information about what we truly see as the person than can genes. As a result of this centrality of the brain to the Self, we have a heightened interest in protecting it from unwanted direct observation as much as possible. Reid and Baylis write:

“Our genes are part of our biology and one aspect of our lineage. Our thoughts—that is, our reasoning, our motivations, our attitudes, beliefs, and values—are our selves and our personal identity. Our “brains are us” in ways that our genes never could be. In parallel, our thoughts are “private” in ways that are distinct from our shared genetic heritage.”170

As we have seen, while people tend to be essentialist about both their brains and their genes, the form that this essentialism takes is substantially different in the two cases. Genetic essentialism takes on a very deterministic flavour – that you are you from birth and that self is constant, unchanging, and unavoidable. Neuroessentialism, on the other hand, need not be deterministic or posit a driving force behind the Self, but rather sees the brain as giving rise to the Self itself. Neuroessentialism can make the use of neurodata potentially dangerous simply because of the centrality of the brain to conceptions of the Self and personhood, and this direct connection raises different concerns regarding the protections we must afford the associated information.

170 Reid and Baylis (2005).

Effects of the Differences Between Neurodata and Genetic Information

All of the differences in the nature of neurodata and genetic information discussed above have implications for the use of this information and its consequent social, cultural, and political effects. If we use only genetic frameworks to guide our regulation of neurodata, the resulting regulations will focus mostly on privacy, confidentiality, and discrimination in employment and health insurance. I argued in the last section that the primary differences between genetic information and neurodata are that neurodata is dynamic and manipulable, and that, because of the centrality of the brain to conceptions of the Self, we have a stronger privacy interest in neurodata than we do in genetic information. What specific challenges do these differences pose?

Privacy & Confidentiality

“Should thought information have similar privacy status as genetic information? Probably not less, but perhaps more.”171

Firstly, the intimate content of the neurodata not only exacerbates the privacy and confidentiality issues that have been previously raised by genetic information, but also gives them a different form since what is being revealed is inherently different and therefore subject to different uses. Genetic information mostly reveals things about physical conditions, health, and heredity.172 However, neurodata is specifically about one's mental state at a given time – visualizing cognitive processes and, as outlined above, in some instances capable of revealing specific knowledge about the contents of a person's thoughts.

171 Illes and Racine (2005).

Revelation of information of this kind has no analog in the genetic information debate. If we are to follow the response to genetic privacy as a guideline for protecting this information from disclosure, what would be protected against is discrimination in employment and health insurance, as per GINA. However, neurodata also has the potential for compelled testimony and use in the court system in ways that genetic information does not, because the content of neurodata can reveal information on more topics – topics that are often of more interest to other people.173 Accordingly, the privacy concerns raised by neurodata go beyond those raised by genetic information, and require a response to the threat of compelled testimony and use of neurodata in the courtroom.

In addition to presenting different threats to privacy, I would argue that violation of neuroprivacy is also more egregious because of the previously discussed close association with sense of identity and Self. Neurodata presents new potential in the representation of the individual possible through data – allowing the translation of unique and previously unrecordable aspects of the individual into data form.174 EEG neurodata can provide insight into 'real time' brain functioning, allowing the direct recording of processes associated with personality, mood, behaviours, thoughts, or feelings.175 As previously discussed, not only has it shown the potential to predict predispositions to mental and physical illness, but also to predict different sorts of behaviour and personality.176 "These features arguably make neurodata intimate in a very different way to other forms of data."177 A now iconic quote comes from neurobiologist Donald Kennedy: "Far more than our genomes, our brains are us, marking out the special character of our personal capacities, emotions and convictions… I already don't want my employer or my insurance company to know my genome. As to my brainome, I don't want anyone to know it for any purpose whatsoever. It is . . . my most intimate identity."178 Thus regulations are needed outlining the privacy and confidentiality stipulations required to govern the disclosure and sharing of DTC neurodata.

172 So far, research into genes for mental processes or behaviours has been largely unsuccessful and without strong associations. 173 For a thorough overview of neuroimaging in the court and legal system, see Pustilnik (2012). 174 Hallinan, Dara, et al. "Neurodata and neuroprivacy: Data protection outdated?" Surveillance & Society 12.1 (2014): 55. 175 Ibid.

The operationalizability of neurodata

Secondly, the dynamic and manipulable nature of neurodata means that it can be operationalized. While genetic data is purely informative, neurodata can also be functional.179

This is of concern for two reasons – first, it can be inadvertently collected and shared during the ordinary use of BCIs, and second, because one can intervene in the brain, one can interfere with neural functioning in potentially harmful ways.

As noted earlier, neurodata can be functional through the use of BCI for everything from military applications to controlling prosthetics to gaming. I predict that its operationalizable nature is the predominant way that neurodata will be commercially procured – you need to generate it in order to use BCIs. This raises the potential for continuous collection of personal health information on users, potentially without their knowledge or informed consent. The issue is not only that there could be more neurodata collected, but that it is passively collected – the purpose for which the device is being used is not to collect neurodata but to control programs/objects. Thus programs are collecting information that users may not be aware they are sharing. With DTC-GT you are aware you are undergoing genetic testing, and in theory make the decision to share that information with a relative degree of informedness.180 There is no genetic precedent for passively collected data or its management in the big data environment.181

176 Farah, Martha J., et al. "Brain imaging and brain privacy: a realistic concern?" Brain 21.1 (2010). 177 Hallinan et al. (2014). 178 Hamilton, Joan O'C. "If They Could Read Your Mind." Stanford Magazine, Jan. 2004, alumni.stanford.edu/get/page/magazine/article/?article_id=36320. Accessed 18 Sept. 2017. 179 This may also be true of genetic data (e.g., biowarfare), but this risk seems further in the future than the manipulation of brains, and individuals are not welcoming that intervention into their lives in the same way that people are enthusiastically welcoming BCI and DTC-EEG.
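To illustrate how passive collection can occur, consider the following hypothetical sketch of a BCI control loop. All names, thresholds, and the "telemetry" dictionary are invented for illustration and describe no particular product; the point is that the band-power features an app must compute simply to move a cursor are the same features that are informative about health, and nothing in the loop requires the user ever to decide to "share" them.

```python
# Hypothetical sketch (all names and thresholds invented): the features a toy
# BCI computes for control are also health-relevant, and could be retained.
import numpy as np

SFREQ = 256.0  # assumed headset sampling rate (placeholder)

def band_power(epoch, lo, hi, sfreq=SFREQ):
    """Mean power of a 1-D EEG epoch within [lo, hi] Hz (simple FFT periodogram)."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / sfreq)
    power = np.abs(np.fft.rfft(epoch)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return float(power[mask].mean())

def control_step(epoch):
    """One iteration of a toy BCI loop: alpha power steers a game cursor."""
    alpha = band_power(epoch, 8, 12)
    cursor_delta = 1 if alpha > 2.0 else -1  # arbitrary, made-up control rule

    # Side effect of ordinary use: the same features could be kept as
    # "telemetry" (placeholder dict; nothing is actually uploaded here).
    telemetry = {"alpha_power": alpha, "theta_power": band_power(epoch, 4, 8)}
    return cursor_delta, telemetry

if __name__ == "__main__":
    fake_epoch = np.random.default_rng(1).normal(size=int(SFREQ))  # 1 s of noise
    print(control_step(fake_epoch))
```

In an actual product, the telemetry dictionary could just as easily be serialized and uploaded alongside ordinary usage analytics, which is precisely what makes this kind of collection passive.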

Further, the manipulable nature of neurodata means that these DTC-EEG devices can allow individuals to interfere in their own neural functioning in unfounded or potentially harmful ways. Information about neurodata can be used to alter neural functioning, whether through neurofeedback, cognitive behavioural training, or meditation. Information about one's genes cannot similarly be used to alter them: individuals are unable to personally modify their genetic information through conscious effort or based on knowledge of their genetic makeup. Gene expression can be impacted by our environment and exposures, but not by our own effortful behaviour. This will have implications for the sorts of purposes for which these DTC-EEG devices are allowed to market themselves. Perhaps in the future, the FDA will regulate brain-training games despite their not posing any physical product-safety harm. I support the opinion of the Nuffield Council on Bioethics that "The risks of non-invasive EEG are thought to be minimal. However, there has been no systematic research into the long-term effects of their use for recreational purposes. Our still limited knowledge of how the brain works, coupled with its central role in many aspects of a meaningful existence, means that unintended effects of intervening come at a potentially high cost."182 Therefore, it might be necessary to revisit the FDA's definition of a medical device and its decision not to monitor DTC-EEG devices and their associated brain-training games.

180 This point has its limitations. Obtaining truly informed consent for genetic testing is itself a topic of much concern, given both the complexities of the resulting information and the breadth of conditions for which testing can be done. For example, an individual may get DTC-GT for purposes of ancestry, not knowing that the company can also acquire information on some health conditions using the same test. However, this is an issue of company transparency, and not a function of the mode of data collection being passive vs. active. 181 This may not be for long, however. Helix has just introduced the first genetics-based app store. This is perhaps an area where genetics can learn from a neurodata framework in the future.

Implications for monitoring

Finally, the dynamic and manipulable nature of neurodata has implications for monitoring. Unlike genetic information, which is for the most part predictive, neurodata can tell you things about yourself that are happening right now, on a second-to-second basis. Genetic information simply does not pose the threat of the type of surveillance that comes along with that. Even though gene expression, unlike the genome itself, is not static, tracking it is a resource-intensive, costly, and time-consuming endeavour. The potential for real-time, in situ, in vivo neuro-monitoring has no genetic analog, and it raises some serious concerns for neurodata.

The field of neuroergonomics seeks to integrate understanding of the neural bases of cognition and behaviour with the design, development, and implementation of technology in order to develop the capability to continuously monitor an individual's level of attention, fatigue, mental workload, and task engagement in operational environments using neurophysiological parameters.183 Zhang et al. (2014) developed a fifteen-channel real-time means for assessing cognitive workload that they claim has a broad range of applications in cognitive ergonomics and mental health monitoring.184 Proponents claim that "the ability to continuously and unobtrusively monitor levels of task engagement and mental workload in an operational environment could be useful in identifying more accurate and efficient methods for humans to interact with technology."185 Alternatively, the constant monitoring of employees' levels of engagement during work hours could instead be used to punish those who get distracted, are tired, or are not focused or "alert enough". Perhaps in the future, some employers may pay only by the number of "attentive" hours worked. Neural surveillance by employers does not sound like something in tune with a society that values freedom, liberty, and privacy. Therefore we need a clear discussion of our societal values regarding this sort of surveillance, and we must carefully consider the potential need for regulation addressing the use of neuroimaging and neural monitoring of employees by employers.

182 Nuffield Council on Bioethics. "Chapter 8: Non-Therapeutic Applications." Novel Neurotechnologies: Intervening in the Brain. London, UK: Nuffield Council on Bioethics, 2013. 162-90. 183 Berka et al. (2007).
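To give a sense of how low the technical barrier to such monitoring is, the sketch below computes a simple band-power "engagement index" of the kind frequently cited in the workload-monitoring literature, beta power divided by the sum of alpha and theta power. This is not the method of Zhang et al. or Berka et al., and the cutoff that labels an hour "disengaged" is an invented placeholder, which is exactly the worry when such numbers are turned on employees.

```python
# Minimal sketch, assuming nothing about any real product: a band-power
# "engagement index" (beta / (alpha + theta)) with an invented cutoff.
import numpy as np
from scipy.signal import welch

def engagement_index(epoch, sfreq=256.0):
    """Compute the beta / (alpha + theta) band-power ratio for a 1-D EEG epoch."""
    freqs, psd = welch(epoch, fs=sfreq, nperseg=min(len(epoch), 256))

    def power(lo, hi):
        return psd[(freqs >= lo) & (freqs <= hi)].mean()

    theta, alpha, beta = power(4, 8), power(8, 13), power(13, 30)
    return float(beta / (alpha + theta))

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    hourly_epochs = [rng.normal(size=256) for _ in range(8)]  # stand-in "workday" data
    for hour, epoch in enumerate(hourly_epochs):
        idx = engagement_index(epoch)
        label = "engaged" if idx > 0.4 else "disengaged"  # invented cutoff
        print(f"hour {hour}: index {idx:.2f} -> {label}")
```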

Further, although it is thought that employers imposing neuroimaging on current or potential employees for purposes of lie detection would be prohibited by the Employee Polygraph Protection Act (EPPA), Kostiuk (2012) points out that neuroimaging would not come within the definition of "lie detector" if the exam or results were used for reasons other than to detect deception.186 Thus, the EPPA would not provide any limitation on an employer requesting an employee or applicant to submit to neuroimaging exams for non-deception reasons. For example, if an employer wished to form a conclusion about future propensity to disease, recognition of illicit images, or even response time using DTC-EEG devices, requiring the employee to undergo neuroimaging would fall completely within the legal realm under current regulations.

184 Zhang et al. (2014). 185 Berka et al. (2007). 186 Kostiuk (2012).

There also exists the potential for monitoring by insurers. Again, the ability of EEG to detect varying degrees of alertness could imply culpability in certain instances. For example, the previously mentioned work by Stopczynski et al. (2014a), wherein researchers detected whether a user was focusing fully on a driving task or was distracted, could be used to establish culpability in car insurance: if your EEG showed that you were not totally alert at the time of a crash, you may be held accountable and not covered for damages.

A potential challenge to these arguments is: so what? Why should employers not know that their employees are distracted, or that a driver was not paying attention at the time of a crash? Again, this returns to the argument about the Self in the context of privacy, and the brain as a special sphere of privacy that deserves protections beyond those afforded other sorts of data. What is at stake is a fundamental alteration of the way our very identity as persons interacts with the world, and as such these devices should not be implemented unobserved, with their implications unconsidered.

The Need For Regulation

The considerations discussed in this piece portray a perfect storm when it comes to DTC neurodata. Due to its operationalizable nature, and the welcoming of these technologies and apps into individuals' lives, there will be a tremendous amount of neurodata being passively collected through the use of BCI in the big data environment and, due to our cultural attitudes towards the brain and Self, that neurodata is going to be considered by many people to be informative in especially valuable ways. The similarities between DTC neuroimaging technologies and DTC-GT have caused much of the attention on these DTC devices to echo the conversations about genetic information; however, there is a need for regulation beyond that offered for genetic information.

Regulations surrounding genetic information have largely focused on the transgressions that can occur related to privacy and confidentiality insofar as they raise the potential for discrimination based on the genome.187 However, I argue that the most serious threat with regard to neurodata is not the potential for discrimination, no matter how disturbing that potential is, but rather the violation of that sphere of privacy that is the mind, which is in and of itself a harm, regardless of any associated outcomes of the breach. Unfortunately, with regard to this potential, genetic frameworks provide us with little guidance, and the only federal solution offered for the genetic information debate – GINA – cannot provide us with much help.188 GINA relates only to employers and health insurers – not life insurers, the court system, or private commercial entities – and it relates only to compelled genetic screening, not to passively collected information or information obtained from other commercial third parties. Arguably, this solution is not even sufficient to deal with genetics' own ethical challenges. With the first gene-based "app store",189 it looks likely that these big data concerns will soon enter the "genethics" realm as well, in which case perhaps it will not be genetics serving as a jumping-off point for ethical regulation of neurodata, but the other way around.

187 Other considerations exist, such as the right not to know about your genetic make-up; however, these have received less attention in regulatory discussions. 188 For an analysis of the insufficiency of GINA for fMRI neurodata, see Tovino (2006). 189 Mullin, Emily. “A DNA App Store Is Here, but Proceed with Caution.” MIT Technology Review, 24 July 2017, www.technologyreview.com/s/608313/a-dna-app-store-is-here-but-proceed-with-caution/. Accessed 20 Sept. 2017.

Conclusion

Personal health data privacy and confidentiality in the era of big data is going to be the next social struggle to be waged. But neurodata goes a step beyond that. We stand at the edge of a fundamental alteration in the very way we interface with the world, and in the very way other people interact with our Selves.

For those skeptics who say EEG does not have the specificity to pose these levels of threat, these are early days. Advances in the technology occur every day across research, clinical, and commercial endeavours. Further, Functional Near-Infrared Spectroscopy (fNIRS) is coming; indeed, a portable version is already on the market.190 Compared to EEG, fNIRS has a higher signal-to-noise ratio, is better suited for use during normal working conditions, and has a much higher spatial resolution, enabling targeted measurements of specific brain regions rather than just cortical activity.191 If the objection to this critique is that EEG is too inaccurate to provide truly robust information, improved modalities posing the same problems are coming down the pipeline. These concerns apply to all DTC neuroimaging devices, and as such need to be addressed quickly.

The problem is that many of these concerns are not new. Starting as early as 2003, Mauron,192 Illes & Racine,193 Roskies,194,195 Reid & Baylis,196 and Greely197 have all drawn attention to the ways in which, as helpful a framework as the genetic information debate was as a jumping-off point for a discussion around neurodata, it would not be sufficient. To quote a 2005 Illes and Racine paper: “while the ethics of genetics provides a legitimate starting point—even a backbone—for tackling ethical issues in neuroimaging, they do not suffice.”198 Now, 14 years later, those arguments have not changed, but technology has. The realm of the possible has. And it is quite possible that even today consumers’ mental states are being collected, to what ends we cannot yet know, and shared with whom we do not know. Surprisingly, the most thoughtful literature on this topic so far has come from the computer sciences: security experts have long been raising the flag about what these devices could mean from a security standpoint. The same level of consideration needs to be paid to the ethical, social, political, and legal implications of DTC-EEG devices and the neurodata derived from them. The rapidly increasing number of brain apps that record and utilize EEG-derived neurodata brings all the concerns of big data into the world of neuroethics. Lessons from genetics are insufficient to deal with these challenges.

190 “NIRScout fNIRS Neuroimaging - NIRS Data Acquisition.” NIRx Medical Technologies, nirx.net/nirscout/. Accessed 20 Sept. 2017. 191 Serwadda, Abdul, et al. "fnirs: A new modality for brain activity-based biometric authentication." Biometrics Theory, Applications and Systems (BTAS), 2015 IEEE 7th International Conference on. IEEE, 2015. 192 Mauron (2003). 193 Illes and Racine (2005). 194 Roskies (2009). 195 Roskies, Adina L. "Neuroethics beyond genethics." EMBO reports 8.1S (2007): S52-S56. 196 Reid and Baylis (2005). 197 Greely, Henry T. "Neuroethics and ELSI: Similarities and differences." Minn. JL Sci. & Tech. 7 (2005): 599. 198 Illes and Racine (2005).

Works Cited

Abootalebi, Vahid, Mohammad Hassan Moradi, and Mohammad Ali Khalilzadeh. "A comparison of methods for ERP assessment in a P300-based GKT." International Journal of Psychophysiology 62.2 (2006): 309-320.

Abootalebi, Vahid, Mohammad Hassan Moradi, and Mohammad Ali Khalilzadeh. "A new approach for EEG feature extraction in P300-based lie detection." Computer methods and programs in biomedicine 94.1 (2009): 48-57.

Acharya, U. Rajendra, et al. "Computer-aided diagnosis of alcoholism-related EEG signals." Epilepsy & Behavior 41 (2014): 257-263.

“AD HD: Super Powers for Super Kids.” NeuroSky, store.neurosky.com/products/super-powers-for-super-kids. Accessed 22 Sept. 2017.

Anglin, Stephanie M. "I think, therefore I am? Examining conceptions of the self, soul, and mind." Consciousness and cognition 29 (2014): 105-116.

Berka, Chris, et al. "Real-time analysis of EEG indexes of alertness, cognition, and memory acquired with a wireless EEG headset." International Journal of Human-Computer Interaction 17.2 (2004): 151-170.

Berka, Chris, et al. "Evaluation of an EEG workload model in an Aegis simulation environment." Proceedings of SPIE. Vol. 5797. 2005a.

Berka, Chris, et al. "EEG quantification of alertness: Methods for early identification of individuals most susceptible to sleep deprivation." Proceedings of SPIE Defense and Security Symposium, Biomonitoring for Physiological and Cognitive Performance during Military Operations. Vol. 5797. FL: SPIE: The International Society for Optical Engineering, 2005b.

Berka, Chris, et al. "EEG correlates of task engagement and mental workload in vigilance, learning, and memory tasks." Aviation, space, and environmental medicine 78.5 (2007): B231- B244.

Bonaci, Tamara, Ryan Calo, and Howard Jay Chizeck. "App stores for the brain: Privacy & security in Brain-Computer Interfaces." Ethics in Science, Technology and Engineering, 2014 IEEE International Symposium on. IEEE, 2014.

Bonaci, T., J. Herron, and H. J. Chizeck. "How susceptible is the brain to the side-channel private information extraction." American Journal of Bioethics, Neuroscience 6.4 (2015).

“BrainCopter.” NeuroSky, store.neurosky.com/products/braincopter. Accessed 22 Sept. 2017.

“Brain-Starter.” NeuroSky, store.neurosky.com/products/brain-starter. Accessed 17 Sept. 2017.

Brodwin, Paul. "Genetics, identity, and the anthropology of essentialism." Anthropological Quarterly 75.2 (2002): 323-330.

Chuang, John, et al. "I think, therefore I am: Usability and security of authentication using brainwaves." International Conference on Financial Cryptography and Data Security. Springer, Berlin, Heidelberg, 2013.

Clayton, Ellen Wright. "Ethical, legal, and social implications of genomic medicine." New England Journal of Medicine 349.6 (2003): 562-569.

“Cognionics Bundles!” NeuroGuide by Applied Neuroscience, Applied Neuroscience, Inc., 2017, www.appliedneuroscience.com/NeuroGuide_Cognionics.htm. Accessed 20 Sept. 2017.

Comfort, Nathaniel. “The Genetic Self.” The Point Magazine, 15 Jan. 2015, thepointmag.com/2014/examined-life/genetic-self. Accessed 17 Sept. 2017.

Committee on Science and Law. "Are Your Thoughts Your Own? 'Neuroprivacy' and the Legal Implications of Brain Imaging." Record of the Association of the Bar of the City of New York 60.2 (2005): 407.

Dar-Nimrod, Ilan, and Steven J. Heine. "Some thoughts on essence placeholders, interactionism, and heritability: reply to Haslam (2011) and Turkheimer (2011)." (2011a): 829.

Dar-Nimrod, Ilan, and Steven J. Heine. "Genetic essentialism: on the deceptive determinism of DNA." Psychological bulletin 137.5 (2011b): 800.

Davidson, Richard J., et al. "Depression: perspectives from affective neuroscience." Annual review of psychology 53.1 (2002): 545-574.

De Gennaro, Luigi, et al. "The electroencephalographic fingerprint of sleep is genetically determined: a twin study." Annals of neurology 64.4 (2008): 455-460.

“DeepWave.” NeuroSky, store.neurosky.com/products/deepwave. Accessed 17 Sept. 2017.

Du, Changde, Changying Du, and Huiguang He. "Sharing deep generative representation for perceived image reconstruction from human brain activity." arXiv preprint arXiv:1704.07575 (2017).

Duhigg, Charles. “How Companies Learn Your Secrets.” The New York Times Magazine, 16 Feb. 2012, www.nytimes.com/2012/02/19/magazine/shopping-habits.html?_r=1&hp=&pagewanted=all. Accessed 20 Sept. 2017.

“EPOC - 14 Channel Wireless EEG Headset.” Emotiv, www.emotiv.com/epoc/. Accessed 20 Sept. 2017.

Farah, Martha J., et al. "Brain imaging and brain privacy: a realistic concern?" Brain 21.1 (2010).

Farwell, Lawrence A. "Brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials." Cognitive neurodynamics 6.2 (2012): 115-154.

Farwell, Lawrence A., and Sharon S. Smith. "Using brain MERMER testing to detect knowledge despite efforts to conceal." Journal of forensic science 46.1 (2001): 135-143.

Federal Trade Commission. "Mobile privacy disclosures: Building trust through transparency." USA: Federal Trade Commission (2013).

Fernandez-Duque, Diego. "Lay Theories of the Mind/Brain Relationship and the Allure of Neuroscience." The Science of Lay Theories. Springer, Cham, 2017. 207-227.

Fernandez-Duque, Diego, and Barry Schwartz. "Common Sense Beliefs about the Central Self, Moral Character, and the Brain." Frontiers in psychology 6 (2015).

Finelli, Luca A., Peter Achermann, and Alexander A. Borbély. "Individual ‘fingerprints’ in human sleep EEG topography." Neuropsychopharmacology 25.5 (2001): S57-S62.

Food and Drug Administration. "Mobile medical applications: guidance for industry and Food and Drug Administration staff." USA: Food and Drug Administration (2013).

Food and Drug Administration. “Medical Device Data Systems, Medical Image Storage Devices, and Medical Image Communications Devices: guidance for industry and Food and Drug Administration staff." USA: Food and Drug Administration (2015).

Frank, M., Hwu, T., Jain, S., Knight, R., Martinovic, I., et al. “Subliminal Probing for Private Information via EEG-Based BCI Devices.” arXiv preprint arXiv:1312.6052 (2013).

Gelman, Susan A. "Essentialism in everyday thought." Psychological Science Agenda 19.5 (2005): 1-6.

Ginsberg, J. (2017). Mental Privacy in the Age of Big Data. The Neuroethics Blog. Retrieved on September 20, 2017, from http://www.theneuroethicsblog.com/2017/06/mental-privacy-in-age-of-big-data.html

Giota, Kyriaki G., and George Kleftaras. "Mental health apps: innovations, risks and ethical considerations." E-Health Telecommunication Systems and Networks 3.03 (2014): 19.

Gotlib, Ian H. "EEG alpha asymmetry, depression, and cognitive functioning." Cognition & Emotion 12.3 (1998): 449-478.

Gould, Wren A., and Steven J. Heine. "Implicit essentialism: genetic concepts are implicitly associated with fate concepts." PloS one 7.6 (2012): e38176.

Greely, Henry T. "Neuroethics and ELSI: Similarities and differences." Minn. JL Sci. & Tech. 7 (2005): 599.

Grundy, Quinn, Fabian P. Held, and Lisa A. Bero. "Tracing the Potential Flow of Consumer Data: A Network Analysis of Prominent Health and Fitness Apps." Journal of medical Internet research 19.6 (2017).

Hallinan, Dara, et al. "Neurodata and neuroprivacy: Data protection outdated?." Surveillance & Society 12.1 (2014): 55.

Hamilton, Joan O'C. “If They Could Read Your Mind.” Stanford Magazine, Jan. 2004, alumni.stanford.edu/get/page/magazine/article/?article_id=36320. Accessed 18 Sept. 2017.

Haslam, Nick. "Genetic essentialism, neuroessentialism, and stigma: commentary on Dar-Nimrod and Heine (2011)." (2011): 819.

“Health App Use Scenarios & HIPAA.” Health App Developers, what are your questions about HIPAA?, US Dept. of Health and Human Services Office for Civil Rights, Feb. 2016, hipaaqsportal.hhs.gov/community-library/accounts/92/925889/OCR-health-app-developer-scenarios-2-2016.pdf. Accessed 17 Sept. 2017.

“Homepage.” Emotiv, www.emotiv.com/. Accessed 20 Sept. 2017.

“Homepage.” Kokoon, www.kokoon.io. Accessed 17 Sept. 2017.

Houtepen, Lotte C., et al. "Genome-wide DNA methylation levels and altered cortisol stress reactivity following childhood trauma in humans." Nature communications 7 (2016).

Huckvale, Kit, et al. "Unaddressed privacy risks in accredited health and wellness apps: a cross-sectional systematic assessment." BMC medicine 13.1 (2015): 214.

Illes, Judy, and Eric Racine. "Imaging or imagining? A neuroethics challenge informed by genetics." The American Journal of Bioethics 5.2 (2005): 5-18.

“Independent Studies Archives.” Emotiv, www.emotiv.com/category/independent-studies/. Accessed 17 Sept. 2017.

“IBS: Calm & Control your gut.” NeuroSky. https://store.neurosky.com/products/calm- control-your-gut. Accessed 17 Sept. 2017.

Ienca, Marcello, and Roberto Andorno. "Towards new human rights in the age of neuroscience and neurotechnology." Life Sciences, Society and Policy 13.1 (2017): 5.

“Insight Brainwear® 5 Channel Wireless EEG Headset.” Emotiv, www.emotiv.com/insight/. Accessed 20 Sept. 2017.

Jones, Ashley, and Galina Schwartz. "Using brain-computer interfaces to analyze EEG data for safety improvement." Team for Research in Ubiquitous Secure Technology (2010).

Karson, Craig N., et al. "Computerized EEG in schizophrenia." Schizophrenia bulletin 14.2 (1988): 193.

Klimesch, Wolfgang. "EEG alpha and theta oscillations reflect cognitive and memory performance: a review and analysis." Brain research reviews 29.2 (1999): 169-195.

Klonovs, Juris, and Christoffer Kjeldgaard Petersen. "Development of a mobile EEG-based feature extraction and classification system for biometric authentication." Master's Thesis: Aalborg University Copenhagen (2012).

Koh, Winston, et al. "Noninvasive in vivo monitoring of tissue-specific global gene expression in humans." Proceedings of the National Academy of Sciences 111.20 (2014): 7361-7366.

Kong, Camillia, Michael Dunn, and Michael Parker. "Psychiatric genomics and mental health treatment: Setting the ethical agenda." The American Journal of Bioethics 17.4 (2017): 3-12.

Kostiuk, Stephanie A. "After GINA, NINA-neuroscience-based discrimination in the workplace." Vand. L. Rev. 65 (2012): 933.

La Rocca, Daria, et al. "Human brain distinctiveness based on EEG spectral coherence connectivity." IEEE Transactions on Biomedical Engineering 61.9 (2014): 2406-2412.

Lee, Hongmi, and Brice A. Kuhl. "Reconstructing perceived and retrieved faces from activity patterns in lateral parietal cortex." Journal of Neuroscience 36.22 (2016): 6069-6082.

Levendowski, D. J., et al. "Correlations between EEG indices of alertness measures of performance and self-reported states while operating a driving simulator." 29th Annual Meeting, Society for Neuroscience. Vol. 25. 1999.

Levendowski, Daniel J., et al. "Electroencephalographic indices predict future vulnerability to fatigue induced by sleep deprivation." Sleep 24.Abstract Suppl (2001).

Levendowski, Daniel J., et al. "Event-related potentials during a psychomotor vigilance task in sleep apnea patients and healthy subjects." Sleep 25 (2002): A462-A463.

Li, Jingquan. "A privacy preservation model for health-related social networking sites." Journal of medical Internet research 17.7 (2015).

Li, QianQian, Ding Ding, and Mauro Conti. "Brain-computer interface applications: Security and privacy challenges." Communications and Network Security (CNS), 2015 IEEE Conference on. IEEE, 2015.

Long Beach City Emps. Ass'n. v. City of Long Beach, 719 P.2d 660, 663 (Cal. 1986).

Luck, Steven J. An introduction to the event-related potential technique. MIT press, 2014.

Mantri, Shamla, et al. "A Survey: Fundamental of EEG." International Journal 1.4 (2013).

Marcel, Sebastien, and José del R. Millán. "Person authentication using brainwaves (EEG) and maximum a posteriori model adaptation." IEEE transactions on pattern analysis and machine intelligence 29.4 (2007).

Martinovic, Ivan, et al. "On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces." Proceedings of the 21st USENIX conference on Security symposium. USENIX Association, 2012.

Maskeliunas, Rytis, et al. "Consumer-grade EEG devices: are they usable for control tasks?." PeerJ 4 (2016): e1746.

Mathews, Debra JH, Hilary Bok, and Peter V. Rabins, eds. Personal identity and fractured selves: perspectives from philosophy, ethics, and neuroscience. JHU Press, 2009.

Matovu, Richard, and Abdul Serwadda. "Your substance abuse disorder is an open secret! Gleaning sensitive personal information from templates in an EEG-based authentication system." Biometrics Theory, Applications and Systems (BTAS), 2016 IEEE 8th International Conference on. IEEE, 2016.

Mauron, Alex. "Renovating the house of being." Annals of the New York Academy of Sciences 1001.1 (2003): 240-252.

McGough, James J., et al. "A potential electroencephalography and cognitive biosignature for the child behavior checklist–dysregulation profile." Journal of the American Academy of Child & Adolescent Psychiatry 52.11 (2013): 1173-1182.

Meijer, Ewout H., et al. "A comment on Farwell (2012): brain fingerprinting: a comprehensive tutorial review of detection of concealed information with event-related brain potentials." Cognitive neurodynamics 7.2 (2013): 155-158.

“MeloMind.” MyBrain Technologies, www.mybraintech.com/. Accessed 17 Sept. 2017.

“MindDrone.” Emotiv, www.emotiv.com/product/minddrone/. Accessed 22 Sept. 2017.

“MindFlex.” NeuroSky, store.neurosky.com/products/mindflex. Accessed 17 Sept. 2017.

“MindRec.” NeuroSky. https://store.neurosky.com/products/mindrec. Accessed 17 Sept. 2017.

Mullin, Emily. “A DNA App Store Is Here, but Proceed with Caution.” MIT Technology Review, 24 July 2017, www.technologyreview.com/s/608313/a-dna-app-store-is-here-but-proceed-with-caution/. Accessed 20 Sept. 2017.

“MUSE ™ | Meditation Made Easy.” Muse: the brain sensing headband, www.choosemuse.com/. Accessed 17 Sept. 2017.

“NIRScout fNIRS Neuroimaging - NIRS Data Acquisition.” NIRx Medical Technologies, nirx.net/nirscout/. Accessed 20 Sept. 2017.

Nuffield Council On Bioethics, comp. "Chapter 8 Non-Therapeutic Applications." Novel Neurotechnologies: Intervening in the Brain. London, UK: Nuffield Council on Bioethics, 2013. 162-90. Web.

Olbrich, Sebastian, and Martijn Arns. "EEG biomarkers in major depressive disorder: discriminative power and prediction of treatment response." International Review of Psychiatry 25.5 (2013): 604-618.

Olson, Steve. "Brain scans raise privacy concerns: advances in neuroimaging may provide the ability to "read" someone's mind, rightly or wrongly." Science 307.5715 (2005): 1548-1551.

Pacific Science & Engineering Group. DARPA Augmented Cognition Technological Integration Experiment (TIE). 2003 July 7. San Diego, CA; 2003.

Pennsylvania v. Muniz, 496 US 582, 595 (1990).

Privacy Rights Clearinghouse. "Mobile health and fitness apps: What are the privacy risks." Retrieved September 7 (2013): 2013.

Prentice, Deborah A., and Dale T. Miller. "Psychological essentialism of human categories." Current directions in psychological science 16.4 (2007): 202-206.

Purcell, Ryan H., and Karen S. Rommelfanger. "Internet-based brain training games, citizen scientists, and Big Data: ethical issues in unprecedented virtual territories." Neuron 86.2 (2015): 356-359.

Pustilnik, Amanda C. "Neurotechnologies at the intersection of criminal procedure and constitutional law." (2012).

Racine, Eric, Ofek Bar-Ilan, and Judy Illes. "fMRI in the public eye." Nature Reviews. Neuroscience 6.2 (2005): 159.

Racine, Eric, et al. "Contemporary neuroscience in the media." Social Science & Medicine 71.4 (2010): 725-733.

“Research Tools.” NeuroSky, store.neurosky.com/products/mindset-research-tools. Accessed 17 Sept. 2017.

Rebolledo-Mendez, Genaro, et al. "Assessing NeuroSky's usability to detect attention levels in an assessment exercise." International Conference on Human-Computer Interaction. Springer, Berlin, Heidelberg, 2009.

Reid, Lynette, and Françoise Baylis. "Brains, genes, and the making of the self." The American Journal of Bioethics 5.2 (2005): 21-23.

Reiner, Peter B. "The rise of neuroessentialism." (2011).

Rosenfeld, J. Peter, Julianne R. Biroschak, and John J. Furedy. "P300-based detection of concealed autobiographical versus incidentally acquired information in target and non-target paradigms." International Journal of Psychophysiology 60.3 (2006): 251-259.

Roskies, Adina. "Neuroethics for the new millenium." Neuron 35.1 (2002): 21-23.

Roskies, Adina L. "Neuroethics beyond genethics." EMBO reports 8.1S (2007): S52-S56.

Roskies, Adina L. "What's “Neu” in Neuroethics?." The Oxford Handbook of Philosophy and Neuroscience. 2009.

Ruiz-Blondet, Maria V., Zhanpeng Jin, and Sarah Laszlo. "CEREBRE: A novel method for very high accuracy event-related potential biometric identification." IEEE Transactions on Information Forensics and Security 11.7 (2016): 1618-1629.

Samek, Diana, Bibiana D. Koh, and Martha A. Rueter. "Overview of behavioral genetics research for family researchers." Journal of family theory & review 5.3 (2013): 214-233.

Sarasohn-Kahn, Jane. “Here's Looking at You.” California Health Care Foundation, July 2014, www.chcf.org/publications/2014/07/heres-looking-personal-health-info. Accessed 17 Sept. 2017.

Schultz, William. "Neuroessentialism: Theoretical and Clinical Considerations." Journal of Humanistic Psychology (2015): 0022167815617296.

Scudellari, Megan. “EEG Identification Can Steal Your Most Closely Held Secrets.” IEEE Spectrum: Technology, Engineering, and Science News, 9 Sept. 2016, spectrum.ieee.org/the-human-os/biomedical/devices/eeg-identification-can-steal-your-most-private-secrets. Accessed 17 Sept. 2017.

Senate, U. S. "A Review of the Data Broker Industry: Collection, use, and sale of consumer data for marketing purposes." Washington, DC: Committee on Commerce, Science, and Transportation, US Senate (2013).

Serwadda, Abdul, et al. "fnirs: A new modality for brain activity-based biometric authentication." Biometrics Theory, Applications and Systems (BTAS), 2015 IEEE 7th International Conference on. IEEE, 2015.

Shen, Francis X. "Neuroscience, mental privacy, and the law." Harv. JL & Pub. Pol'y 36 (2013): 653.

Smith, S. J. M. "EEG in the diagnosis, classification, and management of patients with epilepsy." Journal of Neurology, Neurosurgery & Psychiatry 76.suppl 2 (2005): ii2-ii7.

Sponheim, Scott R., et al. "Resting EEG in first episode and chronic schizophrenia." Psychophysiology 31.1 (1994): 37-43.

“Star Wars Science Force Trainer.” Amazon.com, www.amazon.com/Star-Wars-Science-Force-Trainer/dp/B001UZHASY. Accessed 22 Sept. 2017.

Steel, Emily, and April Dembosky. “Health apps run into privacy snags.” Financial Times, 1 Sept. 2013, www.ft.com/content/b709cf4a-12dd-11e3-a05e-00144feabdc0. Accessed 20 Sept. 2017.

Stopczynski, Arkadiusz, et al. "A smartphone interface for a wireless EEG headset with real - time 3D reconstruction." Affective Computing and Intelligent Interaction . Springer, Berlin, Heidelberg, 2011. 317 -318.

Stopczynski, Arkadiusz, et al. "Privacy for pe rsonal neuroinformatics." (2014 ).

Stopczynski, Arkadiusz, et al. "The smartphone brain scanner: a portable real-time neuroimaging system." PloS one 9.2 (2014b): e86733.

Takabi, Hassan, Anuj Bhalotiya, and Manar Alohaly. "Brain Computer Interface (BCI) Applications: Privacy Threats and Countermeasures." Collaboration and Internet Computing (CIC), 2016 IEEE 2nd International Conference on. IEEE, 2016.

Taylor, Grant S., and Christina Schmidt. "Empirical evaluation of the Emotiv EPOC BCI headset for the detection of mental actions." Proceedings of the Human Factors and Ergonomics Society Annual Meeting. Vol. 56. No. 1. Sage CA: Los Angeles, CA: SAGE Publications, 2012.

Thatcher, Robert W., Duane North, and C. Biver. "EEG and intelligence: relations between EEG coherence, EEG phase delay and power." Clinical neurophysiology 116.9 (2005): 2129-2141.

Toga, Arthur W. "Neuroimage databases: the good, the bad and the ugly." Nature Reviews. Neuroscience 3.4 (2002): 302.

Tovino, Stacey A. "Functional neuroimaging information: A case for neuro exceptionalism." Fla. St. UL Rev. 34 (2006): 415.

United States of America. U.S. Department of Health and Human Services. Food and Drug Administration. General Wellness: Policy for Low Risk Devices. N.p.: n.p., January 20, 2015. Web.

Weisberg, Deena Skolnick, et al. "The seductive allure of neuroscience explanations." Journal of cognitive neuroscience 20.3 (2008): 470-477.

Wexler, Anna. "A pragmatic analysis of the regulation of consumer transcranial direct current stimulation (TDCS) devices in the United States." Journal of Law and the Biosciences 2.3 (2016): 669-696.

Zhang, Haihong, et al. "Detection of variations in cognitive workload using multi-modality physiological sensors and a large margin unbiased regression machine." Engineering in Medicine and Biology Society (EMBC), 2014 36th Annual International Conference of the IEEE. IEEE, 2014.

Biography

Iris Coates McCall was born in 1990 in Toronto, Canada.

Before starting university, Iris studied photography and video production at Blake College, London, UK.

Iris did her undergraduate degree at McGill University, Montreal, where she majored in Cognitive Science and minored in English, Drama and Theatre. She graduated in 2014.

In 2016, Iris began her Master of Bioethics at Johns Hopkins University. During that time she worked as a research assistant at the Berman Institute of Bioethics under the Bloomberg Philanthropies Data for Health Initiative.
