Privacy by Design in Brain-Computer Interfaces

Tamara Bonaci and Howard Jay Chizeck {bonaci,chizeck}@ee.washington.edu

Dept. of EE, University of Washington, Seattle, WA 98195-2500

UWEE Technical Report Number UWEETR-2013-0001 February 2013


Abstract

Brain-computer interfaces (BCIs) represent a direct communication link between the brain and an external device. By using measured brain signals for communication, BCIs allow non-verbal communication between a user and a device. In the last decade, BCIs have mostly been used in medical applications, typically to assist, augment or repair human cognitive or sensory-motor activities. In recent years, however, BCIs have gained popularity in the gaming and entertainment industry. In this paper, we focus on privacy issues and legal implications arising from the use of BCI devices in medical, as well as in gaming, entertainment and marketing applications. Based on the observation that BCI devices provide access to our unique brain wave patterns, which allow others to make inferences about our memory, intentions, conscious and unconscious interests, as well as about our emotional reactions, we conjecture that the impact of exploiting, or even mishandling, BCI devices will be severe and far-reaching. We thus believe that privacy issues arising from the use of BCI devices deserve immediate attention. The goal of this paper is to start a discussion about possible engineering and legal steps that can be taken to prevent these emerging issues.

Introduction

Brain-computer interfaces (BCIs), also known as brain-machine interfaces, represent a direct communication link between the brain and an external device, such as a computer or a smart phone. By using measured brain signals for communication, BCIs allow non-verbal communication between a user and a device. In the last decade, BCIs have mostly been used in medical research, to assist, augment or repair human cognitive or sensory-motor capabilities. In recent years, however, BCIs have gained popularity in fiction, gaming and the entertainment industry. There currently exist several low-cost consumer BCI devices, including Emotiv Systems [1], NeuroSky [2] and the Neural Impulse Actuator [3].

In this paper, motivated by the recent work of Martinovic et al. [11], we focus on privacy issues and legal implications arising from the use of BCI devices in medical, gaming, entertainment and marketing applications. BCI devices measure and record signals generated by neural activity. Since neurons in the brain are interconnected, these signals contain a wealth of information from different parts of the brain. With sufficient computational power and tools, in the era of Big Data, this extra information can allow others to make inferences about our memory, intentions, conscious and unconscious interests, as well as about our emotional reactions. The impact of exploiting or mishandling BCI devices is difficult to estimate; it may be severe. Thus, privacy issues arising from the use of BCI devices deserve immediate attention.

We give a brief overview of BCIs and recent results. We then present possible privacy threats arising from an unauthorized or malicious use of BCI devices. The goal of this paper is to start a discussion about possible engineering and legal steps that can be taken to prevent these issues. We discuss a possible approach and present some of the challenges in implementing it.

1 Current BCI Systems and Their Applications

BCIs can be divided into three groups: (i) invasive, (ii) partially invasive, and (iii) non-invasive devices. Invasive BCIs are implanted directly into the brain during surgery. These devices enable the highest quality measurements of brain signals. Partially invasive BCIs are implanted inside the skull, typically on top of the brain. They provide signals of lower noise and higher selectivity than non-invasive BCIs, which involve users wearing externally mounted electrodes.

Most non-invasive BCI devices are based on electroencephalography (EEG), a method of directly measuring electrical potentials produced by neural synaptic activities. While known to be susceptible to noise and signal distortion, EEG signals are easily measurable and have good temporal resolution. EEG-based BCIs are also relatively low cost and low risk, which makes them the most often used BCI devices [13].

The Event-Related Potential (ERP) is one of the important neurophysiological phenomena measured by EEG. An ERP is defined as a brain response to a direct cognitive, sensory or motor stimulus [9], and it is typically observed as a pattern of signal changes after the external stimulus. ERP waveforms consist of several positive and negative voltage peaks, related to a set of underlying components. A sum of these components is caused by "higher" brain processes, involving memory, attention or expectation. The more prominent components are called N100, P300, N400, P600, and ERN (error-related negativity). These components are thought to reflect the following [9]:
- N100: a reaction to any unpredictable stimulus;
- P300: processes involving stimulus evaluation or categorization;
- N400: a reaction to a meaningful or potentially meaningful stimulus, including words, pictures, sounds, smells or faces;
- P600: a reaction to hearing or reading a grammatical error, or other syntactic anomaly;
- ERN: processes occurring after an error is committed in multiple-choice tasks, even if a person is not explicitly aware of the error.

In recent years, many investigators have shown how different ERP components can be used to infer things about an individual's personality, memory and preferences. For example, the P300 component was used to recognize the subject's name in a random sequence of personal names [12], to discriminate familiar from unfamiliar faces [10], as well as for lie detection [5]. Similarly, the N400 component was used to infer what a person was thinking about, after he/she was primed on a specific set of words. The P600 component was used to make an inference about a person's sexual preferences, and an estimate of the anxiety level derived from EEG signals was used to draw conclusions about a person's religious beliefs [7].
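To make the ERP discussion concrete, the minimal sketch below (in Python, using only NumPy) illustrates the standard epoch-and-average procedure by which a P300-like component is commonly extracted from EEG: the continuous signal is cut into windows time-locked to stimulus onsets, the windows are averaged to suppress background activity, and the mean amplitude in an assumed 250-500 ms post-stimulus range is compared against a threshold. All numerical parameters, variable names and the synthetic signal are illustrative assumptions, not values taken from the cited studies.

```python
# A minimal, self-contained sketch of P300 detection by epoch averaging.
# All parameters (sampling rate, window, threshold) are illustrative
# assumptions; real ERP analysis involves filtering, artifact rejection,
# and statistics that are omitted here.
import numpy as np

FS = 256                      # sampling rate (Hz), assumed
EPOCH_SEC = 0.8               # length of each post-stimulus window (s)
P300_WINDOW = (0.25, 0.50)    # seconds after stimulus where P300 is expected

def epoch_average(eeg, stim_samples, fs=FS, epoch_sec=EPOCH_SEC):
    """Cut fixed-length windows after each stimulus onset and average them."""
    n = int(epoch_sec * fs)
    epochs = [eeg[s:s + n] for s in stim_samples if s + n <= len(eeg)]
    return np.mean(epochs, axis=0)

def mean_p300_amplitude(erp, fs=FS, window=P300_WINDOW):
    """Mean amplitude of the averaged ERP inside the assumed P300 latency range."""
    lo, hi = int(window[0] * fs), int(window[1] * fs)
    return float(np.mean(erp[lo:hi]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    seconds = 60
    eeg = rng.normal(0.0, 10.0, FS * seconds)             # synthetic background EEG (uV)
    stims = np.arange(2 * FS, FS * seconds - FS, 2 * FS)   # a stimulus every 2 s

    # Inject a synthetic P300-like bump ~300 ms after every stimulus.
    bump = 8.0 * np.hanning(int(0.2 * FS))
    for s in stims:
        start = s + int(0.3 * FS)
        eeg[start:start + len(bump)] += bump

    erp = epoch_average(eeg, stims)
    amp = mean_p300_amplitude(erp)
    print(f"mean amplitude in P300 window: {amp:.2f} uV")
    print("P300-like response detected" if amp > 2.0 else "no clear P300 response")
```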

Privacy Concerns and Possible Legal Implications

In [11], Martinovic et al. introduced what they call "brain spyware": malicious software designed to detect private information about a subject, using his or her BCI-measured EEG signals. They used a commercially available BCI device and developed an application which presents a subject with visual stimuli, then analyzes the measured EEG signal, in particular the P300 ERP component, in order to detect the subject's: (a) 4-digit PIN, (b) bank information, (c) month of birth, (d) location of residence, and (e) whether the subject recognized a presented set of faces. It is easy to imagine how such private information could be used for unauthorized or malicious purposes.

In recent years, several marketing companies (e.g., the Nielsen Company, which acquired NeuroFocus in 2011 [6]) have shown interest in the use of BCI devices for marketing research. If BCI devices become widespread, we might see private information extracted from individuals without their permission. Is it in the public interest that the government or private organizations might collect and analyze measured brain signals to extract private information about our likes, dislikes, beliefs, fears, and prejudices? It appears that there currently are no regulations preventing this, outside of the medical HIPAA privacy and security rules [4]. It is not hard to imagine "brain spyware" applications being developed to extract private information about our memories, prejudices and beliefs, as well as about possible neurophysiological disorders. Unrestricted access to this information about our private thoughts makes the worst associations of "mind reading" not so far-fetched anymore.

It is also important to note that attempts at willful deception can themselves be detected from an individual's neural information [8]. In addition, non-invasive brain stimulators, emitting generally imperceptible DC electrical currents, can be used to make the responses of users noticeably slower if they attempt to lie.
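To clarify the structure of such an attack, the sketch below shows, in simplified form, the general probing logic: candidate items (e.g., PIN digits) are presented repeatedly, the EEG epochs following each candidate are averaged, and the candidate whose average response has the largest amplitude in the assumed P300 latency range is guessed as the "recognized" item. The data are synthetic and all names and parameters are assumptions for illustration; this is not the implementation of Martinovic et al. [11].

```python
# Illustrative-only sketch of the probing logic behind "brain spyware":
# guess which candidate item a user recognizes by comparing ERP responses.
# Everything here is synthetic and simplified; it is NOT the implementation
# from Martinovic et al. [11].
import numpy as np

FS = 256
P300_LO, P300_HI = int(0.25 * FS), int(0.50 * FS)  # assumed P300 latency range

def response_score(epochs):
    """Average the epochs for one candidate and score the P300 window."""
    erp = np.mean(epochs, axis=0)
    return float(np.mean(erp[P300_LO:P300_HI]))

def guess_recognized(epochs_by_candidate):
    """Return the candidate whose averaged response scores highest."""
    scores = {c: response_score(e) for c, e in epochs_by_candidate.items()}
    return max(scores, key=scores.get), scores

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    candidates = list("0123456789")        # e.g., possible digits of a PIN
    secret = "7"                           # digit the simulated user recognizes
    epoch_len, trials = int(0.8 * FS), 30

    # Synthetic data: every candidate evokes noise; the recognized one also
    # evokes a P300-like bump ~300 ms after presentation.
    bump = np.zeros(epoch_len)
    start = int(0.3 * FS)
    bump[start:start + int(0.2 * FS)] = 8.0 * np.hanning(int(0.2 * FS))

    epochs_by_candidate = {}
    for c in candidates:
        noise = rng.normal(0.0, 10.0, (trials, epoch_len))
        epochs_by_candidate[c] = noise + (bump if c == secret else 0.0)

    guess, scores = guess_recognized(epochs_by_candidate)
    print("guessed digit:", guess)   # very likely "7" with these synthetic data
```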

Possible Design Steps to Protect Users' Privacy

These privacy concerns call for a coordinated response. Ideally, we can design devices, standards and regulations to prevent or mitigate these problems. We believe the first step should be an open discussion between neuroscientists, legal experts, ethicists, neural engineers, and others. This discussion should be aimed at answering the following questions: (i) Who should have access to an individual's neural information? (ii) Which components of the neural information should those entities have access to? (iii) How noisy, distorted or distilled should these components be made before making them available? (iv) Which purposes are the entities allowed to use the brain signals for? (v) Are the entities allowed to present individuals with new stimuli, or extract new signal components or combinations thereof, to make inferences about an individual?

Answering these questions will not be an easy task. For example, consider the question of who should have access to an individual's neural information. Current consumer-grade BCIs operate in a manner similar to the smart phone industry. Companies typically offer low-cost devices, as well as software development kits, thus supporting the development of new tools and games for their devices [11]. Such an "open design" allows faster growth of the whole BCI eco-system. At the same time, it puts our privacy at risk, since anyone can develop BCI applications: those that serve people, as well as those that extract our most private information. As with smart phone applications, based only on the description, it might be hard to distinguish good applications from malicious ones. The question then arises whether the access that BCI applications have to our neural information should be restricted, similar to the "app store" paradigm in smart phone and tablet markets. If so, how should this restriction be enforced? Can we expect that this will be done voluntarily by the industry, or should there exist an appropriate set of IEEE standards and an accompanying legal structure? Moreover, an approach of restricting access to our neural data might slow the expansion of BCI applications; at the same time, it might speed the adoption of BCI devices. Thus, another issue arising from efforts to limit access to neural information is finding the right balance between the speed of BCI development and BCI market expansion.
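As a concrete, and entirely hypothetical, illustration of questions (ii) and (iii), the sketch below shows one way an intermediate layer could sit between the BCI driver and third-party applications: instead of exposing raw EEG, it releases only a coarse per-epoch feature matching the application's stated purpose (here, a single band-power value usable as a control signal), optionally perturbed with noise calibrated by a per-application permission level. The permission levels, feature choice and noise scales are assumptions made for the sake of the example, not an existing standard or product.

```python
# Hypothetical sketch of a "privacy filter" layer between a BCI driver and
# applications: raw EEG stays inside the layer; applications receive only a
# distilled, optionally noise-perturbed feature matching their permission.
# Permission levels, feature choice, and noise scales are illustrative
# assumptions, not an existing standard.
import numpy as np

FS = 256  # assumed sampling rate (Hz)

# Per-application policy: which feature may be released and how much noise
# (standard deviation, in feature units) is added before release.
POLICY = {
    "motor_control_app": {"feature": "alpha_power", "noise_std": 0.0},
    "game_app":          {"feature": "alpha_power", "noise_std": 0.5},
    "unknown_app":       {"feature": None,          "noise_std": None},  # no access
}

def alpha_band_power(epoch, fs=FS, band=(8.0, 12.0)):
    """Coarse 8-12 Hz band power of one epoch, via the FFT magnitude."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(epoch)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return float(np.mean(spectrum[mask]))

def release(app_id, epoch, rng):
    """Return only what the policy allows this application to see."""
    policy = POLICY.get(app_id, POLICY["unknown_app"])
    if policy["feature"] is None:
        raise PermissionError(f"{app_id} is not permitted to read neural data")
    value = alpha_band_power(epoch)
    return value + rng.normal(0.0, policy["noise_std"])

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    epoch = rng.normal(0.0, 10.0, 2 * FS)          # synthetic 2-second epoch
    print("control app sees:", release("motor_control_app", epoch, rng))
    print("game app sees:   ", release("game_app", epoch, rng))
    try:
        release("unknown_app", epoch, rng)
    except PermissionError as err:
        print("blocked:", err)
```

The point of the sketch is not the particular feature or noise model, but the design question it encodes: how coarse, noisy or distilled the released information must be so that legitimate applications still work while inference of private attributes becomes impractical.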

References

[1] Emotiv Systems (last accessed: January 14, 2013).

[2] NeuroSky (last accessed: January 14, 2013).

[3] OCZ Technology (last accessed: January 14, 2013).

[4] The Health Insurance Portability and Accountability Act of 1996 (HIPAA) Privacy and Security Rules, 1996.

[5] V. Abootalebi, M.H. Moradi, and M.A. Khalilzadeh. A New Approach for EEG Feature Extraction in P300-based Lie Detection. Computer Methods and Programs in Biomedicine, 94(1):48–57, 2009.

[6] S. Elliott. A Nielsen Acquisition Focused on Brain Waves, May 2011.

[7] M. Inzlicht, I. McGregor, J.B. Hirsh, and K. Nash. Neural Markers of Religious Conviction. Psychological Science, 20(3):385–392, 2009.

[8] B. Luber, C. Fisher, P.S. Appelbaum, M. Ploesser, and S.H. Lisanby. Non-invasive Brain Stimulation in the Detection of Deception: Scientific Challenges and Ethical Consequences. Behavioral Sciences & the Law, 27(2):191–208, 2009.

[9] S.J. Luck. An Introduction to the Event-related Potential Technique. MIT Press, 2005.

[10] S. Marcel and J.R. Millán. Person Authentication Using Brainwaves (EEG) and Maximum A Posteriori Model Adaptation. IEEE Transactions on Pattern Analysis and Machine Intelligence, 29(4):743–752, 2007.

[11] I. Martinovic, D. Davies, M. Frank, D. Perito, T. Ros, and D. Song. On the Feasibility of Side-Channel Attacks with Brain-Computer Interfaces. In Proceedings of the 21st USENIX Security Symposium. USENIX, 2012.

[12] J.P. Rosenfeld, J.R. Biroschak, and J.J. Furedy. P300-based Detection of Concealed Autobiographical Versus Incidentally Acquired Information in Target and Non-target Paradigms. International Journal of Psychophysiology, 60(3):251–259, 2006.

[13] J.R. Wolpaw, N. Birbaumer, W.J. Heetderks, D.J. McFarland, P.H. Peckham, G. Schalk, E. Donchin, L.A. Quatrano, C.J. Robinson, and T.M. Vaughan. Brain-computer Interface Technology: A Review of the First International Meeting. IEEE Transactions on Rehabilitation Engineering, 8(2):164–173, 2000.
