
Hacking Interfaces: How To Control A Computer Using Your Mind

Abstract

Cutting-edge technologies including brain-computer interfaces, kinetic user interfaces, and augmented reality hardware will radically shift the way we interact with technology. These new methods of communication between humans and computers will dictate the ways in which our society – indeed, technology itself – will function in the future. This talk focuses on these new developments in interfaces, including EEG and fMRI brain-computer interfaces which allow people to interact with computers using their minds, kinetic devices such as the LeapMotion and the Kinect, and AR and VR devices including the Oculus Rift and the Omni treadmill. Examples of open frameworks for developing and hacking the software and hardware for these devices will be provided, along with a brief history of how humans and computers have communicated and how that process is being transformed.

Outline

I. Brief history of human/computer interfaces
   a. Key milestones in the development of primary human/computer interaction
      i. Punch card
         1. An ancestor of the punch card was invented in the early 18th century for controlling textile manufacturing and developed into what we would recognize as punch cards in the 19th century, thanks to Charles Babbage.
         2. Until the 1960s, it was the primary method of both input and output communication with computers. [1]

      ii. Keyboard [2]
         1. The teletype machine, originally developed for use with the telegraph, was adapted in the 1940s for use with early computers.
         2. In 1948, the BINAC – the world's first commercially available digital computer – used a modified typewriter to input data onto magnetic tape as well as to print the data it returned.
         3. When MIT, Bell Labs, and GE created Multics, a CRT display combined with a typewriter allowed programmers to see what they were typing on their screens.
      iii. Mouse
         1. Douglas Engelbart and his assistant Bill English unveiled the first mouse prototype in 1963 at the Stanford Research Institute. In 1968, Engelbart took the stage to use the mouse during a presentation that became known as "The Mother of All Demos", as an example of intelligence amplification. [3]
         2. When Bill Gates and Steve Jobs each saw it in use at Xerox PARC, they realized they had to steal it, because it gave people a new way to talk to computers. Interesting sidenote: despite revolutionizing human-computer interaction, Engelbart never received any royalties for his invention. [4]
      iv. What's next?
         1. We've reached a kairotic moment in the development of how we communicate with computers and the way they communicate with us.
         2. Here are some of the advances that will act as the impetus for the next wave of the evolution of computers, the way we communicate with them, and the way they communicate with us.

II. Brain-computer interfaces (BCI)
   a. EEG: Electroencephalography
      i. Disclaimer: IANANS (I Am Not A Neuroscientist)
      ii. How does it work?
         1. EEG uses electrodes to record the electrical activity of your brain by measuring the voltage produced by the billions of neurons in different parts of your brain. [5]
         2. The first human EEG recording was performed in 1924. [6]
         3. To prepare the electrodes, a conductor such as contact-lens saline solution is applied to the sensors.
         4. Machine learning algorithms determine the difference between brain states, training the computer to perform specific actions; a rough sketch of this approach follows.
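The following is a minimal sketch of that training step, not code from the talk: it assumes pre-recorded EEG epochs (synthetic stand-ins here), extracts band-power features with SciPy, and fits a scikit-learn classifier. The 128 Hz rate, 14 channels, and feature choice are illustrative assumptions.

    # Hypothetical sketch: classify two mental states from EEG epochs.
    # Assumes epochs shaped (n_epochs, n_channels, n_samples) at 128 Hz;
    # the data below is synthetic noise standing in for real recordings.
    import numpy as np
    from scipy.signal import welch
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    FS = 128  # samples per second

    def band_power(epochs, lo, hi):
        """Mean spectral power in a frequency band, per channel."""
        freqs, psd = welch(epochs, fs=FS, axis=-1)
        mask = (freqs >= lo) & (freqs <= hi)
        return psd[..., mask].mean(axis=-1)

    def features(epochs):
        # Alpha (8-12 Hz) and beta (13-30 Hz) power as a simple feature set.
        return np.hstack([band_power(epochs, 8, 12), band_power(epochs, 13, 30)])

    # Synthetic stand-ins for "relaxed" vs. "concentrating" recordings.
    rng = np.random.default_rng(0)
    relaxed = rng.normal(size=(50, 14, 256))        # 14 channels, 2 s epochs
    focused = rng.normal(size=(50, 14, 256)) * 1.5  # higher broadband power

    X = np.vstack([features(relaxed), features(focused)])
    y = np.array([0] * 50 + [1] * 50)

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict(features(focused[:5])))  # classify new epochs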

      iii. Emotiv EPOC
         1. The first commercially available BCI comparable to medical-grade scanners.
         2. Only $500 with the SDK, affordable for developers.
         3. An open-source Python library called emokit was released by Daeken after he reverse-engineered the Emotiv protocol. [7]
      iv. OpenEEG: GPLed hardware and software for monitoring brainwaves.
         1. Hardware
            a. ModularEEG: "Made up of two or more EEG amplifiers, and a 6-channel signal capture board that connects to a PC via a standard serial cable. The standard setup has two EEG channels." [8] A sketch of reading its serial stream follows below.
         2. Software
            a. BioEra: "visual designer useful for analyzing [EEG] signals in real time." [9]
            b. OpenViBE: C++ LGPL software platform. [10]
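As a rough illustration of hacking on open hardware like the ModularEEG, here is a pyserial sketch. The packet layout (0xA5/0x5A sync bytes, a version byte, a counter, six big-endian 16-bit channel values, and a switches byte) is my reading of the firmware's documented P2 format, and the port name is an assumption; verify both against the OpenEEG docs before relying on this.

    # Sketch: read EEG samples from a ModularEEG over a serial port.
    # Packet layout below follows my reading of the P2 firmware format
    # (17 bytes: 0xA5, 0x5A, version, counter, six 16-bit channels,
    # switches); check the OpenEEG documentation before trusting it.
    import serial  # pyserial

    PORT = "/dev/ttyUSB0"  # assumption: adjust for your machine
    BAUD = 57600

    def read_packet(ser):
        # Resynchronize on the 0xA5 0x5A header, then read the payload.
        while True:
            if ser.read(1) == b"\xa5" and ser.read(1) == b"\x5a":
                body = ser.read(15)  # version, counter, 12 data bytes, switches
                channels = [
                    (body[2 + 2 * i] << 8) | body[3 + 2 * i] for i in range(6)
                ]
                return channels

    with serial.Serial(PORT, BAUD, timeout=1) as ser:
        for _ in range(10):
            print(read_packet(ser))  # six raw 10-bit channel values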

      v. Other EEG devices
         1. Muse
            a. To be released by InteraXon in December 2013 at $199. [11]
            b. Designed to be worn all day.
         2. NeuroSky's MindWave and MindSet [12]
            a. Detect fewer mental states than the Emotiv, but less expensive.
         3. XWave Sonic
            a. Compatible with iOS and uses NeuroSky chips. [13]
         4. MyndPlay BrainBand
            a. Uses Bluetooth and NeuroSky chips. [14]
            b. Less expensive than competing devices.
      vi. Issues with EEG [15]
         1. Signal-to-noise ratio
            a. Because of interference, even extensive data needs to be scrubbed; a filtering sketch follows below.
            b. The noise issue can be mitigated with invasive EEG, which implants electrodes directly into the brain.
         2. Low resolution
            a. Doesn't provide information about which specific areas of the brain are active, and there are issues recording activity in the deeper layers of the brain.
         3. Time
            a. It usually takes a considerable amount of time to calibrate and orient the EEG device on the user's head.
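To make the scrubbing step concrete, here is a minimal sketch (my own illustration, not from the talk) that removes mains hum and out-of-band noise from one raw channel using SciPy: a 60 Hz notch plus a 1-45 Hz bandpass. The 128 Hz rate and 60 Hz mains frequency are assumptions.

    # Sketch: basic cleanup of one raw EEG channel with SciPy.
    # Assumes a 128 Hz sampling rate and 60 Hz mains interference.
    import numpy as np
    from scipy.signal import butter, filtfilt, iirnotch

    FS = 128  # Hz

    def scrub(raw):
        # Notch out 60 Hz mains hum (use 50 Hz in Europe).
        b, a = iirnotch(w0=60.0, Q=30.0, fs=FS)
        x = filtfilt(b, a, raw)
        # Keep only the 1-45 Hz band where most EEG activity lives.
        b, a = butter(4, [1.0, 45.0], btype="bandpass", fs=FS)
        return filtfilt(b, a, x)

    # Synthetic example: 8 Hz alpha wave buried in 60 Hz hum and noise.
    t = np.arange(0, 4, 1 / FS)
    raw = np.sin(2 * np.pi * 8 * t) + 2 * np.sin(2 * np.pi * 60 * t)
    raw += 0.5 * np.random.randn(t.size)
    clean = scrub(raw)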

   b. fMRI: Functional Magnetic Resonance Imaging
      i. How does it work? [16]
         1. fMRI measures brain activity by monitoring changes in blood flow.
         2. When neurons are activated, blood rushes to that particular part of the brain, and fMRI measures the change.
      ii. A newer, expensive research method with much higher accuracy than EEG; machines range from $150,000 to $2 million.
      iii. Open source libraries: FSL and SPM (Statistical Parametric Mapping); the sketch below shows the kind of analysis they automate.
      iv. fMRI machines make more accurate BCIs, but development has been slow due to cost and complexity. [17]
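As a taste of what FSL and SPM automate, here is a toy sketch (my illustration; the file name and block design are hypothetical) that loads a 4D fMRI scan with nibabel and correlates each voxel's time series against a task on/off pattern to find "active" voxels. Real pipelines add motion correction, smoothing, and proper statistics.

    # Toy sketch: find task-correlated voxels in a 4D fMRI volume.
    # 'scan.nii.gz' is a hypothetical file name.
    import nibabel as nib
    import numpy as np

    img = nib.load("scan.nii.gz")  # 4D: x, y, z, time
    data = img.get_fdata()
    n_vols = data.shape[-1]

    # Simple block design: task alternates off/on every 10 volumes.
    task = np.tile(np.repeat([0.0, 1.0], 10), n_vols // 20 + 1)[:n_vols]
    task -= task.mean()

    # Correlate every voxel's time series with the task regressor.
    voxels = data.reshape(-1, n_vols)
    voxels = voxels - voxels.mean(axis=1, keepdims=True)
    denom = np.linalg.norm(voxels, axis=1) * np.linalg.norm(task)
    corr = (voxels @ task) / np.where(denom == 0, 1, denom)
    corr_map = corr.reshape(data.shape[:-1])

    print("most task-correlated voxel:",
          np.unravel_index(np.argmax(corr_map), corr_map.shape))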

   c. PET: Positron Emission Tomography
      i. How does it work? [18]
         1. A PET scan uses high-resolution radiation imaging to track a tracer in your blood. The tracer can be injected or inhaled as a radioactive material.
      ii. Hailed by Ray Kurzweil, among others.
      iii. HRRT: the High Resolution Research Tomograph has a scanning resolution of 2mm. [19]
      iv. Tons of free software available, including AMIDE, OsiriX, and STIR. [20]

III. Current and Potential Uses of BCIs
   a. Current uses
      i. Robotics software
         1. EEG already powers technology like wheelchairs. [21]
         2. ExtremeTech did a demo controlling a robot over Skype with the Emotiv EPOC, using only their minds. [22]
         3. The new BrainDriver application from the Freie Universität Berlin even allows you to drive a car using your mind. [23]
      ii. Brain input
         1. Transcranial magnetic stimulation uses electromagnetic induction to polarize or depolarize neurons. [24] It is currently used for clinical purposes, but could have brain input uses as the field continues to develop.
         2. Optogenetics uses a concept called neuromodulation to manipulate neurons: light-activated channels and enzymes influence the way that neurons operate.
         3. tDCS (transcranial direct-current stimulation) runs a 2 milliamp current through your brain to slightly depolarize neurons, increasing the learning and performance of test subjects in a study funded by DARPA. [25]
      iii. Polysomnography
         1. Colin Petty, a Brooklyn-based musician, uses audio triggers with his Emotiv EPOC during dreaming to alert him that he's lucid dreaming and to increase his memory of the dream, using Audiokinetic's Wwise software.
      iv. General computer control
         1. The Emotiv EPOC has a built-in gyroscope. In the past, I have experimented with using GlovePIE [26] to map my blinking to mouse clicks, which allows me to use my head as a fully functional mouse; a sketch of the same idea in Python follows below.
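The head-as-mouse trick can be recreated outside GlovePIE; here is a hedged Python sketch using pynput for the mouse side. The headset interface is a stand-in (a hypothetical get_headset_event() that yields gyro deltas and blink flags), since the exact headset API varies by device and library.

    # Sketch: head-as-mouse. Gyro deltas move the pointer; a blink clicks.
    # get_headset_event() is a hypothetical stand-in for your headset's
    # API (e.g. whatever emokit or a vendor SDK exposes on your setup).
    import random
    from pynput.mouse import Button, Controller

    mouse = Controller()

    def get_headset_event():
        """Hypothetical: returns (gyro_dx, gyro_dy, blinked)."""
        return random.randint(-3, 3), random.randint(-3, 3), random.random() < 0.05

    SENSITIVITY = 4  # pixels of pointer travel per unit of gyro delta

    for _ in range(200):  # in practice: loop until the user quits
        dx, dy, blinked = get_headset_event()
        mouse.move(dx * SENSITIVITY, dy * SENSITIVITY)
        if blinked:
            mouse.click(Button.left)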

      v. Neuromarketing
         1. As lucrative as it is ethically questionable.
         2. Main players: Sands Research, MindLab International, and NeuroSense lead an industry that is constantly growing. [27]
         3. Might be more evil than Project MK Ultra.
   b. Eventual uses of BCIs
      i. Exocortices
         1. An exocortex is essentially a motherboard that would augment a user's brain in the same way that AR augments reality.
      ii. Neurocommunities
         1. The ability to network with people based on their brain data, including everything from the way they react to a particular song to the way they react to you reacting to their reaction. What better way to meet people with interests similar to yours than to actually see what their brains look like? This is real neural networking: spanning different brains across different computers, it would create a network of both biological and artificial neurons. A toy sketch of such matching follows below.
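One naive way such matching could work (entirely my speculation, not a described system): represent each person's reaction to a shared stimulus as a feature vector and pair people by cosine similarity. The names and vectors below are made up.

    # Toy sketch of "neurocommunity" matching: pair users whose recorded
    # reactions to the same song are most similar. Vectors are synthetic;
    # in practice they might be band-power features per channel.
    import numpy as np

    rng = np.random.default_rng(7)
    users = {name: rng.normal(size=28) for name in
             ["ada", "grace", "linus", "woz"]}

    def cosine(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

    me = users.pop("ada")
    best = max(users, key=lambda name: cosine(me, users[name]))
    print("closest brain-match for ada:", best)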

      iii. Synthetic telepathy [28]
         1. DARPA is conducting research on synthetic telepathy, which would allow people to communicate silently using their minds. The project is called Silent Talk, and "the research aims to detect and analyze the word-specific neural signals, using EEG, which occur before speech is vocalized, and to see if the patterns are generalizable." As of 2009, the research is focused on military uses. [29]
      iv. Simulated reality
         1. Using a BCI, you could enter a virtual environment in which your brain controls an avatar's actions. Using your desires and movements, you could actually interact in a virtual world. This is already being done extensively with the Emotiv EPOC. [30]
      v. Whole brain emulation
         1. We could simulate an actual human brain – the goal of the Blue Brain Project, an IBM-backed attempt to reverse-engineer the brain – which would allow people's brains to live on after them through the data they've attained. This is the kind of technology that accelerates evolution.
      vi. Nootropics research
         1. EEG devices could be used to measure nootropics' effects on the brain. For instance, trials could record brain data while a subject observes the same media with and without a nootropic, and compare the results.

IV. Kinetic user interfaces (KUI)
   a. LeapMotion
      i. A new gesture-based computing device designed for the consumer market that will allow complex interaction with applications; a sketch of its listener pattern follows below.
      ii. Over 40,000 developers and hundreds of thousands of pre-orders.
      iii. SDK available for Linux and Windows. [31] Their open source initiative is also taking off. [32]
      iv. Some of Leap's competitors: PMD, Three Gear Systems, Microsoft Research's Digits wrist system [33], SoftKinetic. [34]
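The Leap SDK is callback-driven; the sketch below follows the listener pattern of the v1 Python bindings as I recall them. Class and attribute names should be checked against the SDK documentation before use.

    # Sketch of the Leap Motion Python bindings' listener pattern;
    # names follow my recollection of the v1 SDK and should be verified.
    import sys
    import Leap  # shipped with the Leap Motion SDK

    class HandPrinter(Leap.Listener):
        def on_connect(self, controller):
            print("Leap connected")

        def on_frame(self, controller):
            frame = controller.frame()
            for hand in frame.hands:
                # palm_position is a vector in millimeters from the device.
                print("palm at", hand.palm_position)

    listener = HandPrinter()
    controller = Leap.Controller()
    controller.add_listener(listener)

    sys.stdin.readline()  # run until Enter is pressed
    controller.remove_listener(listener)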

   b. Gaming
      i. Kinect
         1. The Kinect 2 is a huge upgrade: the field of view grows from 57.5x43.5 degrees to 70x60 degrees, and the current Kinect's 640x480 camera becomes 1920x1080, increasing depth resolution and adding an IR stream. [35]
         2. The Xbox 720 will also feature IllumiRoom, an AR projection system that changes the features of the room you're in. [36]
         3. Proprietary dev kits plus software like OpenKinect [37][38] will open this up to developers; a depth-grab sketch using its libfreenect bindings follows below.
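For instance, OpenKinect's libfreenect ships Python bindings with a synchronous API; this minimal sketch (assuming those bindings are installed and a Kinect is plugged in) grabs one depth frame and finds the point nearest the sensor.

    # Sketch: grab a single depth frame from a Kinect via OpenKinect's
    # libfreenect Python bindings and locate the closest point.
    import freenect
    import numpy as np

    depth, _timestamp = freenect.sync_get_depth()  # 480x640 raw depth map
    depth = depth.astype(np.uint16)

    # Raw value 2047 means "no reading"; mask it out before searching.
    valid = np.where(depth == 2047, np.iinfo(np.uint16).max, depth)
    y, x = np.unravel_index(np.argmin(valid), valid.shape)
    print(f"nearest point at pixel ({x}, {y}), raw depth {depth[y, x]}")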

      ii. PlayStation Move and Wii
         1. Not as relevant, because these are more akin to location-based awareness systems. [39]
         2. The Move uses a hybrid of a motion-sensing controller and a camera motion capture system. [40]
         3. The Wiimote uses an accelerometer and a PixArt optical sensor, along with a sensor bar containing 10 infrared LEDs. [41]
         4. Later versions of these devices might be more similar to the Kinect.
   c. Oculus Rift
      i. One of the most successful Kickstarters ever, with over $2 million raised. [42]
      ii. Next-generation VR headset backed by John Carmack, Cliffy B, Gabe Newell, Tim Sweeney, and others.
      iii. Linux/Mac/Windows/Android/iOS-compatible SDK with "out-of-the-box engine integrations for Unreal Engine and Unity."
      iv. Competitors: the nVisor SX111 [43] – more expensive, but higher resolution.
   d. Omni treadmill [44]
      i. Natural motion interface made by Virtuix that allows omnidirectional movement.
      ii. Preparing their Kickstarter and integrating with VR/AR devices like the Oculus Rift. [45]
   e. Eyetracking
      i. How does it work?
         1. Eyetrackers use microprojectors to create reflection patterns on users' eyes. [46]
         2. Image sensors capture those patterns, and software interprets them, allowing the device to register what you're focusing on; a calibration sketch follows below. [47]
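A core step that software performs is calibration: mapping raw eye measurements to screen coordinates. Here is a toy sketch (my illustration; the pupil measurements are made up) that fits an affine map from pupil positions to known on-screen calibration targets using least squares.

    # Toy sketch of eyetracker calibration: fit an affine map from raw
    # pupil coordinates to screen coordinates using least squares.
    # The pupil measurements below are invented for illustration.
    import numpy as np

    # (pupil_x, pupil_y) measured while the user stares at known targets.
    pupil = np.array([[0.30, 0.40], [0.70, 0.42], [0.32, 0.80],
                      [0.68, 0.78], [0.50, 0.60]])
    screen = np.array([[0, 0], [1920, 0], [0, 1080],
                       [1920, 1080], [960, 540]], dtype=float)

    # Affine model: screen = [pupil_x, pupil_y, 1] @ A  (A is 3x2).
    P = np.hstack([pupil, np.ones((len(pupil), 1))])
    A, *_ = np.linalg.lstsq(P, screen, rcond=None)

    def gaze_to_screen(px, py):
        return np.array([px, py, 1.0]) @ A

    print(gaze_to_screen(0.5, 0.6))  # should land near (960, 540)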

      ii. Mirametrix
         1. Expensive, enterprise-grade eyetracking solution. [48]
         2. $5,000 API built for C, Python, and MATLAB.
      iii. Tobii Gaze
         1. One of the most impressive eyetracking implementations.
         2. Tobii Gaze SDK available for $995. [49]
      iv. Eyetribe
         1. Android tablet with built-in eyetracking.
         2. Currently in beta – the SDK is already available. [50]
      v. Open source software: ExpertEyes, an open source Java eyetracking application. [51]
      vi. Open hardware: Jason Babcock and Jeff Pelz of the Rochester Institute of Technology developed open hardware for eyetracking. [52]
         1. A parts list is available, along with a schematic. [53]

V. Peripeteia
   a. The evolution of the way humans interact with computers has in many ways dictated the evolution of technology itself. By hacking these new methods of human/computer interaction with open software and hardware, we will experience a dramatic shift in the technological landscape.
   b. What will we eventually build with all this?
      i. The goal is a hackable holodeck: an amalgamation of BCI/KUI/AR/VR.

Notes

1. http://homepage.cs.uiowa.edu/~jones/cards/history.html
2. http://www.daskeyboard.com/blog/?page_id=1329
3. http://sloan.stanford.edu/mousesite/1968Demo.html
4. http://www-sul.stanford.edu/depts/hasrg/histsci/ssvoral/engelbart/engfmst1-ntb.html
5. http://en.wikipedia.org/wiki/Electroencephalography
6. http://en.wikipedia.org/wiki/Electroencephalography
7. https://github.com/daeken/Emokit/
8. http://openeeg.sourceforge.net/doc/modeeg/modeeg.html
9. http://www.bioera.net/
10. http://openvibe.inria.fr/
11. http://interaxon.ca
12. http://www.neurosky.com/
13. http://www.plxdevices.com/product_info.php?id=XWAVESONIC
14. http://myndplay.com/products.php?prod=7
15. http://en.wikipedia.org/wiki/Electroencephalography
16. http://en.wikipedia.org/wiki/Functional_magnetic_resonance_imaging
17. http://www.ncbi.nlm.nih.gov/pmc/articles/PMC2233807/
18. http://www.nlm.nih.gov/medlineplus/ency/article/007341.htm
19. http://corelabs.emory.edu/csi/equipment/hrrt.html
20. http://cdn.intechopen.com/pdfs/27812/InTech-Free_software_for_pet_imaging.pdf
21. http://www.gizmag.com/toyota-wheelchair-powered-brain-waves/12121/
22. http://www.engadget.com/2010/04/27/rovio-robot-controlled-via-skype-with-emotiv-brain-reading-heads/
23. http://spectrum.ieee.org/automaton/robotics/robotics-software/braindriver-a-mind-controlled-car
24. http://en.wikipedia.org/wiki/Transcranial_magnetic_stimulation
25. http://www.extremetech.com/extreme/84232-boost-your-brains-power-with-a-9volt-battery-and-some-wet-sponges
26. https://sites.google.com/site/carlkenner/glovepie
27. http://www.neurosciencemarketing.com/blog/companies
28. http://www.nbcnews.com/id/27162401/ns/technology_and_science-science/t/army-developing-synthetic-telepathy/
29. http://en.wikipedia.org/wiki/Brain%E2%80%93computer_interface#Synthetic_telepathy.2Fsilent_communication
30. http://neurogadget.com/2011/03/21/navigating-in-a-virtual-3d-environment-with-emotiv-epoc/1393
31. http://nodiceatall.wordpress.com/2013/03/26/leap-motion-sdk-is-now-available-for-linux/
32. http://openleap.org/
33. http://www.engadget.com/2012/10/09/microsoft-research-digits-3d-hand-gesture-tracking/
34. http://www.softkinetic.com/en-us/softkinetic.aspx
35. http://www.digitaltrends.com/gaming/rumored-specs-for-kinect-2-describe-a-far-more-precise-motion-controller-for-xbox-720/
36. http://www.extremetech.com/gaming/146536-illumiroom-peripheral-projection-is-this-the-xbox-720s-killer-feature
37. http://openkinect.org/
38. https://github.com/OpenKinect/libfreenect
39. http://asg.unige.ch/publications/TR09/07kineticUI.pdf
40. http://reviews.cnet.com/8301-21539_7-20008059-10391702.html
41. http://en.wikipedia.org/wiki/Wii_Remote
42. http://www.kickstarter.com/projects/1523379957/oculus-rift-step-into-the-game
43. http://nvisinc.com/product.php?id=48
44. http://www.virtuix.com/
45. http://www.engadget.com/2013/04/21/virtuix-omni-treadmill-oculus-rift-demo/
46. http://www.tobii.com/en/gaze-interaction/global/eye-tracking/
47. http://www.tobii.com/en/gaze-interaction/global/eye-tracking/
48. http://mirametrix.com/products/eye-tracker/
49. http://www.tobii.com/en/eye-tracking-integration/global-old/
50. http://theeyetribe.com/
51. https://code.google.com/p/experteyes/
52. http://www.jasonbabcock.com/research/ETRA04_babcock_pelz_color_small.pdf
53. http://129.79.193.155/~busey/EyeTracker/