
Four ethical priorities for neurotechnologies and AI
Artificial intelligence and brain–computer interfaces must respect and preserve people's privacy, identity, agency and equality, say Rafael Yuste, Sara Goering and colleagues.

A man with a spinal-cord injury (right) prepares for a virtual cycle race in which competitors steer avatars using brain signals.

Consider the following scenario. A paralysed man participates in a clinical trial of a brain–computer interface (BCI). A computer connected to a chip in his brain is trained to interpret the neural activity resulting from his mental rehearsals of an action. The computer generates commands that move a robotic arm. One day, the man feels frustrated with the experimental team. Later, his robotic hand crushes a cup after taking it from one of the research assistants, and hurts the assistant. Apologizing for what he says must have been a malfunction of the device, he wonders whether his frustration with the team played a part.

This scenario is hypothetical. But it illustrates some of the challenges that society might be heading towards.

Current BCI technology is mainly focused on therapeutic outcomes, such as helping people with spinal-cord injuries. It already enables users to perform relatively simple motor tasks — moving a computer cursor or controlling a motorized wheelchair, for example.
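The scenario above only gestures at how such an interface maps neural activity to movement. As a rough illustration of the kind of decoder involved (not the method of any particular trial), the sketch below fits a simple linear, ridge-regression mapping from simulated firing-rate features to a two-dimensional cursor command; all data, dimensions and parameters are invented.

```python
# Illustrative sketch only: a linear decoder mapping simulated neural
# firing-rate features to a 2-D cursor command, in the spirit of the
# simple motor tasks described above. Dimensions and data are invented.
import numpy as np

rng = np.random.default_rng(0)

n_samples, n_neurons = 2000, 64          # hypothetical recording: 64 channels
true_weights = rng.normal(size=(n_neurons, 2))

# Simulated training data: firing rates recorded while the user rehearses
# movements, paired with the intended cursor velocity (vx, vy).
firing_rates = rng.poisson(5.0, size=(n_samples, n_neurons)).astype(float)
velocities = firing_rates @ true_weights + rng.normal(scale=2.0, size=(n_samples, 2))

# Fit a ridge-regression decoder: W = (X^T X + lambda * I)^-1 X^T Y
lam = 1.0
X, Y = firing_rates, velocities
W = np.linalg.solve(X.T @ X + lam * np.eye(n_neurons), X.T @ Y)

# Decoding step: turn a new window of neural activity into a cursor command.
new_activity = rng.poisson(5.0, size=(1, n_neurons)).astype(float)
vx, vy = (new_activity @ W)[0]
print(f"decoded cursor command: vx={vx:.2f}, vy={vy:.2f}")
```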



Moreover, researchers can already interpret a person's neural activity from functional magnetic resonance imaging scans at a rudimentary level1 — that the individual is thinking of a person, say, rather than a car.

It might take years or even decades until BCI and other neurotechnologies are part of our daily lives. But technological developments mean that we are on a path to a world in which it will be possible to decode people's mental processes and directly manipulate the brain mechanisms underlying their intentions, emotions and decisions; where individuals could communicate with others simply by thinking; and where powerful computational systems linked directly to people's brains aid their interactions with the world such that their mental and physical abilities are greatly enhanced.

Such advances could revolutionize the treatment of many conditions, from brain injury and paralysis to epilepsy and schizophrenia, and transform human experience for the better. But the technology could also exacerbate social inequalities and offer corporations, hackers, governments or anyone else new ways to exploit and manipulate people. And it could profoundly alter some core human characteristics: private mental life, individual agency and an understanding of individuals as entities bound by their bodies. It is crucial to consider the possible ramifications now.

The Morningside Group comprises neuroscientists, neurotechnologists, clinicians, ethicists and machine-intelligence engineers. It includes representatives from Google and Kernel (a neurotechnology start-up in Los Angeles, California); from international brain projects; and from academic and research institutions in the United States, Canada, Europe, Israel, China, Japan and Australia. We gathered at a workshop sponsored by the US National Science Foundation at Columbia University, New York, in May 2017 to discuss the ethics of neurotechnologies and machine intelligence.

We believe that existing ethics guidelines are insufficient for this realm2. These include the Declaration of Helsinki, a statement of ethical principles first established in 1964 for medical research involving human subjects (go.nature.com/2z262ag); the Belmont Report, a 1979 statement crafted by the US National Commission for the Protection of Human Subjects of Biomedical and Behavioural Research (go.nature.com/2hrezmb); and the Asilomar artificial intelligence (AI) statement of cautionary principles, published early this year and signed by business leaders and AI researchers, among others (go.nature.com/2ihnqac).

To begin to address this deficit, here we lay out recommendations relating to four areas of concern: privacy and consent; agency and identity; augmentation; and bias. Different nations and people of varying religions, ethnicities and socio-economic backgrounds will have differing needs and outlooks. As such, governments must create their own deliberative bodies to mediate open debate involving representatives from all sectors of society, and to determine how to translate these guidelines into policy, including specific laws and regulations.

INTELLIGENT INVESTMENTS
Some of the world's wealthiest investors are betting on the interplay between neuroscience and AI. More than a dozen companies worldwide, including Kernel and Elon Musk's start-up firm Neuralink, which launched this year, are investing in the creation of devices that can both 'read' human brain activity and 'write' neural information into the brain. We estimate that current spending on neurotechnology by for-profit industry is already US$100 million per year, and growing fast.

Investment from other sectors is also considerable. Since 2013, more than $500 million in federal funds has gone towards the development of neurotechnology under the US BRAIN initiative alone.

Current capabilities are already impressive. A neuroscientist paralysed by amyotrophic lateral sclerosis (ALS; also known as Lou Gehrig's or motor neuron disease) has used a BCI to run his laboratory, write grant applications and send e-mails3. Meanwhile, researchers at Duke University in Durham, North Carolina, have shown that three monkeys with electrode implants can operate as a 'brain net' to move an avatar arm collaboratively4. These devices can work across thousands of kilometres if the signal is transmitted wirelessly by the Internet.

Soon such coarse devices, which can stimulate and read the activity of a few dozen neurons at most, will be surpassed. Earlier this year, the US Defense Advanced Research Projects Agency (DARPA) launched a project called Neural Engineering System Design. It aims to win approval from the US Food and Drug Administration within 4 years for a wireless human brain device that can monitor brain activity using 1 million electrodes simultaneously and selectively stimulate up to 100,000 neurons.

Meanwhile, Google, IBM, Microsoft, Facebook, Apple and numerous start-ups are building ever-more-sophisticated artificial neural networks that can already outperform humans on tasks with well-defined inputs and outputs.

Last year, for example, researchers at the University of Washington in Seattle demonstrated that Google's FaceNet system could recognize one face from a million others. Another Google system with similar neural-network architecture far outperforms well-travelled humans at guessing where in the world a street scene has been photographed, demonstrating the generality of the technique. In August, Microsoft announced that, in certain metrics, its neural network for recognizing conversational speech has matched the abilities of even trained professionals, who have the option of repeatedly rewinding and listening to words used in context. And using electroencephalogram (EEG) data, researchers at the University of Freiburg in Germany showed in July how neural networks can be used to decode planning-related brain activity and so control robots5.

Future neural networks derived from a better understanding of how real ones work will almost certainly be much more powerful even than these examples. The artificial networks in current use have been inspired by models of brain circuits that are more than 50 years old, which are based on recording the activity of individual neurons in anaesthetized animals6. In today's labs, researchers can monitor and manipulate the activity of thousands of neurons in awake, behaving animals, owing to advances in optical methods, computing, molecular engineering and microelectronics.

We are already intimately connected to our machines. Researchers at Google calculated this year that the average user touches their phone nearly one million times annually (unpublished data). The human brain controls auditory and visual systems to decipher sounds and images, and commands limbs to hold and manipulate our gadgets. Yet the convergence of developments in neurotechnologies and AI would offer something qualitatively different — the direct linking of people's brains to machine intelligence, and the bypassing of the normal sensorimotor functions of brains and bodies.

PROTECTING PRIVACY: FEDERATED LEARNING
When technology companies use machine learning to improve their software, they typically gather user information on their servers to analyse how a particular service is being used and then train new algorithms on the aggregated data. Researchers at Google are experimenting with an alternative method of artificial-intelligence training called federated learning. Here, the teaching process happens locally on each user's device without the data being centralized: the lessons aggregated from the data (for instance, the knowledge that the word 'weekly' can be used as an adjective and an adverb) are sent back to Google's servers, but the actual e-mails, texts and so on remain on the user's own phone. Other groups are exploring similar ideas. Thus, information systems with improved designs could be used to enhance users' ownership and privacy over their personal data, while still enabling valuable computations to be performed on those data.
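The box describes federated learning only in outline. As a minimal sketch of the idea, assuming a toy linear model and invented data rather than Google's actual system, the code below has each simulated device train locally and upload only its model weights, which the server then averages.

```python
# Minimal federated-averaging sketch: each simulated device fits a local
# linear model on its own (private) data, and only the model weights are
# sent to the server, which averages them. All names and shapes are invented.
import numpy as np

rng = np.random.default_rng(1)
n_features = 8
global_weights = np.zeros(n_features)

def local_update(global_w, X, y, lr=0.1, epochs=20):
    """Gradient-descent steps on one device's private data (never uploaded)."""
    w = global_w.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)
        w -= lr * grad
    return w

# Three simulated devices, each holding its own private dataset.
true_w = rng.normal(size=n_features)
devices = []
for _ in range(3):
    X = rng.normal(size=(100, n_features))
    y = X @ true_w + rng.normal(scale=0.1, size=100)
    devices.append((X, y))

for _ in range(10):
    # Each device trains locally; only the resulting weights leave the device.
    local_weights = [local_update(global_weights, X, y) for X, y in devices]
    # The server aggregates the updates (here, a simple unweighted average).
    global_weights = np.mean(local_weights, axis=0)

print("aggregated model error:", np.linalg.norm(global_weights - true_w))
```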



After having electrodes implanted in the brain to stimulate neural activity, some people have reported feeling an altered sense of identity.

FOUR CONCERNS
For neurotechnologies to take off in general consumer markets, the devices would have to be non-invasive, of minimal risk, and require much less expense to deploy than current neurosurgical procedures. Nonetheless, even now, companies that are developing devices must be held accountable for their products, and be guided by certain standards, best practices and ethical norms.

We highlight four areas of concern that call for immediate action. Although we raise these issues in the context of neurotechnology, they also apply to AI.

Privacy and consent. An extraordinary level of personal information can already be obtained from people's data trails. Researchers at the Massachusetts Institute of Technology in Cambridge, for example, discovered in 2015 that fine-grained analysis of people's motor behaviour, revealed through their keyboard typing patterns on personal devices, could enable earlier diagnosis of Parkinson's disease7. A 2017 study suggests that measures of mobility patterns, such as those obtained from people carrying smartphones during their normal daily activities, can be used to diagnose early signs of cognitive impairment resulting from Alzheimer's disease8.

Algorithms that are used to target advertising, calculate insurance premiums or match potential partners will be considerably more powerful if they draw on neural information — for instance, activity patterns from neurons associated with certain states of attention. And neural devices connected to the Internet open up the possibility of individuals or organizations (hackers, corporations or government agencies) tracking or even manipulating an individual's mental experience.

We believe that citizens should have the ability — and right — to keep their neural data private (see also 'Agency and identity'). We propose the following steps to ensure this.

For all neural data, the ability to opt out of sharing should be the default choice, and assiduously protected. People readily give up their privacy rights to commercial providers of services, such as Internet browsing, social media or entertainment, without fully understanding what they are surrendering. A default of opting out would mean that neural data are treated in the same way that organs or tissues are in most countries. Individuals would need to explicitly opt in to share neural data from any device. This would involve a safe and secure process, including a consent procedure that clearly specifies who will use the data, for what purposes and for how long.

Even with this approach, neural data from many willing sharers, combined with massive amounts of non-neural data — from Internet searches, fitness monitors and so on — could be used to draw 'good enough' conclusions about individuals who choose not to share. To limit this problem, we propose that the sale, commercial transfer and use of neural data be strictly regulated. Such regulations — which would also limit the possibility of people giving up their neural data or having neural activity written directly into their brains for financial reward — may be analogous to legislation that prohibits the sale of human organs, such as the 1984 US National Organ Transplant Act.

Another safeguard is to restrict the centralized processing of neural data. We advocate that computational techniques, such as differential privacy or 'federated learning', be deployed to protect user privacy (see 'Protecting privacy'). The use of other technologies specifically designed to protect people's data would help, too. Blockchain-based techniques, for instance, allow data to be tracked and audited, and 'smart contracts' can give transparent control over how data are used, without the need for a centralized authority. Lastly, open-data formats and open-source code would allow for greater transparency about what stays private and what is transmitted.
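Differential privacy is named above without detail. As one deliberately simplified illustration, the sketch below applies the Laplace mechanism, a standard differential-privacy building block, to a count over hypothetical users, adding noise scaled to the query's sensitivity; the dataset and epsilon values are invented.

```python
# Sketch of the Laplace mechanism, one standard differential-privacy primitive:
# noise calibrated to the query's sensitivity is added before a count is
# released, so no single user's record can be confidently inferred.
# The dataset and epsilon values here are invented for illustration.
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical per-user flags, e.g. "shared neural data this month" (0 or 1).
user_flags = rng.integers(0, 2, size=10_000)

def private_count(values, epsilon):
    """Release a count with Laplace noise scaled to sensitivity / epsilon.

    For a counting query, adding or removing one user changes the result
    by at most 1, so the sensitivity is 1 and the noise scale is 1/epsilon.
    """
    sensitivity = 1.0
    noise = rng.laplace(loc=0.0, scale=sensitivity / epsilon)
    return values.sum() + noise

true_count = user_flags.sum()
for epsilon in (0.1, 1.0, 10.0):   # smaller epsilon: stronger privacy, more noise
    released = private_count(user_flags, epsilon)
    print(f"epsilon={epsilon}: true={true_count}, released={released:.1f}")
```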


Agency and identity. Some people receiving deep-brain stimulation through electrodes implanted in their brains have reported feeling an altered sense of agency and identity. In a 2016 study, a man who had used a brain stimulator to treat his depression for seven years reported in a focus group9 that he began to wonder whether the way he was interacting with others — for example, saying something that, in retrospect, he thought was inappropriate — was due to the device, his depression or whether it reflected something deeper about himself. He said: "It blurs to the point where I'm not sure … frankly, who I am."

Neurotechnologies could clearly disrupt people's sense of identity and agency, and shake core assumptions about the nature of the self and personal responsibility — legal or moral. People could end up behaving in ways that they struggle to claim as their own, if machine learning and brain-interfacing devices enable faster translation between an intention and an action, perhaps by using an 'auto-complete' or 'auto-correct' function. If people can control devices through their thoughts across great distances, or if several brains are wired to work collaboratively, our understanding of who we are and where we are acting will be disrupted.

As neurotechnologies develop and corporations, governments and others start striving to endow people with new capabilities, individual identity (our bodily and mental integrity) and agency (our ability to choose our actions) must be protected as basic human rights.

We recommend adding clauses protecting such rights ('neurorights') to international treaties, such as the 1948 Universal Declaration of Human Rights. However, this might not be enough — international declarations and laws are just agreements between states, and even the Universal Declaration is not legally binding. Thus, we advocate the creation of an international convention to define prohibited actions related to neurotechnology and machine intelligence, similar to the prohibitions listed in the 2010 International Convention for the Protection of All Persons from Enforced Disappearance. An associated United Nations working group could review the compliance of signatory states, and recommend sanctions when needed.

Such declarations must also protect people's rights to be educated about the possible cognitive and emotional effects of neurotechnologies. Currently, consent forms typically focus only on the physical risks of surgery, rather than the possible effects of a device on mood, personality or sense of self.

Augmentation. People frequently experience prejudice if their bodies or brains function differently from most10. The pressure to adopt enhancing neurotechnologies, such as those that allow people to radically expand their endurance or sensory or mental capacities, is likely to change societal norms, raise issues of equitable access and generate new forms of discrimination.

Moreover, it's easy to imagine an augmentation arms race. In recent years, we have heard staff at DARPA and the US Intelligence Advanced Research Projects Activity discuss plans to provide soldiers and analysts with enhanced mental abilities ('super-intelligent agents'). These would be used for combat settings and to better decipher data streams.

Any lines drawn will inevitably be blurry, given how hard it is to predict which technologies will have negative impacts on human life. But we urge that guidelines are established at both international and national levels to set limits on the augmenting neurotechnologies that can be implemented, and to define the contexts in which they can be used — as is happening for gene editing in humans.

Privacy and individuality are valued more highly in some cultures than in others. Therefore, regulatory decisions must be made within a culture-specific context, while respecting universal rights and global guidelines. Moreover, outright bans of certain technologies could simply push them underground, so efforts to establish specific laws and regulations must include organized forums that enable in-depth and open debate.

Such efforts should draw on the many precedents for building international consensus and incorporating public opinion into scientific decision-making at the national level11. For instance, after the First World War, a 1925 conference led to the development and ratification of the Geneva Protocol, a treaty banning the use of chemical and biological weapons. Similarly, after the Second World War, the UN Atomic Energy Commission was established to deal with the use of atomic energy for peaceful purposes and to control the spread of nuclear weapons.

In particular, we recommend that the use of neural technology for military purposes be stringently regulated. For obvious reasons, any moratorium should be global and sponsored by a UN-led commission. Although such commissions and similar efforts might not resolve all enhancement issues, they offer the best-available model for publicly acknowledging the need for restraint, and for wide input into the development and implementation of a technology.

Bias. When scientific or technological decisions are based on a narrow set of systemic, structural or social concepts and norms, the resulting technology can privilege certain groups and harm others. A 2015 study12 found that postings for jobs displayed to female users by Google's advertising algorithm pay less well than those displayed to men. Similarly, a ProPublica investigation revealed last year that algorithms used by US law-enforcement agencies wrongly predict that black defendants are more likely to reoffend than white defendants with a similar criminal record (go.nature.com/29aznyw). Such biases could become embedded in neural devices. Indeed, researchers who have examined these kinds of cases have shown that defining fairness in a mathematically rigorous manner is very difficult (go.nature.com/2ztfjt9).

Practical steps to counter bias within technologies are already being discussed in industry and academia. Such ongoing public discussions and debate are necessary to shape definitions of problematic biases and, more generally, of normality.

We advocate that countermeasures to combat bias become the norm for machine learning. We also recommend that probable user groups (especially those who are already marginalized) have input into the design of algorithms and devices as another way to ensure that biases are addressed from the first stages of technology development.
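The Comment does not prescribe particular countermeasures. Purely as an illustration of the kind of check such countermeasures build on, the sketch below computes a demographic-parity gap, the difference in favourable-prediction rates between two groups, on invented data; real auditing weighs several competing metrics, which is part of why rigorous definitions of fairness are hard.

```python
# Illustrative only: measuring one simple notion of bias (demographic parity)
# on invented predictions. A large gap in favourable-prediction rates between
# groups is one warning sign that a model may privilege one group over another.
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical model outputs: 1 = favourable decision (e.g. a job ad is shown).
group = rng.integers(0, 2, size=5000)                             # two demographics
predictions = rng.binomial(1, np.where(group == 0, 0.30, 0.22))   # built-in skew

rate_g0 = predictions[group == 0].mean()
rate_g1 = predictions[group == 1].mean()
parity_gap = abs(rate_g0 - rate_g1)

print(f"favourable rate, group 0: {rate_g0:.3f}")
print(f"favourable rate, group 1: {rate_g1:.3f}")
print(f"demographic-parity gap:   {parity_gap:.3f}")  # flag if above a chosen threshold
```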
RESPONSIBLE NEUROENGINEERING
Underlying many of these recommendations is a call for industry and academic researchers to take on the responsibilities that come with devising devices and systems capable of bringing such change. In doing so, they could draw on frameworks that have already been developed for responsible innovation. In addition to the guidelines mentioned above, the UK Engineering and Physical Sciences Research Council, for instance, provides a framework to encourage innovators to "anticipate, reflect, engage and act" in ways that "promote … opportunities for science and innovation that are socially desirable and undertaken in the public interest". Among the various efforts to address this in AI, the IEEE Standards Association created a global ethics initiative in April 2016, with the aim of embedding ethics into the design of processes for all AI and autonomous systems.



History indicates that profit hunting will often trump social responsibility in the corporate world. And even if, at an individual level, most technologists set out to benefit humanity, they can come up against complex ethical dilemmas for which they aren't prepared. We think that mindsets could be altered and the producers of devices better equipped by embedding an ethical code of conduct into industry and academia.

A first step towards this would be to expose engineers, other tech developers and academic-research trainees to ethics as part of their standard training on joining a company or laboratory. Employees could be taught to think more deeply about how to pursue advances and deploy strategies that are likely to contribute constructively to society, rather than to fracture it.

This type of approach would essentially follow that used in medicine. Medical students are taught about patient confidentiality, non-harm and their duties of beneficence and justice, and are required to take the Hippocratic Oath to adhere to the highest standards of the profession.

The possible clinical and societal benefits of neurotechnologies are vast. To reap them, we must guide their development in a way that respects, protects and enables what is best in humanity. ■

Rafael Yuste is professor of biological sciences at Columbia University, New York City, New York, USA. Sara Goering is associate professor of philosophy at the University of Washington, Seattle, USA. (A list of 25 co-authors in The Morningside Group appears in the online version.)
e-mails: [email protected]; [email protected]

A full list of authors accompanies this Comment online; see go.nature.com/2ij9bqt.

1. Kay, K. N., Naselaris, T., Prenger, R. J. & Gallant, J. L. Nature 452, 352–355 (2008).
2. Goering, S. & Yuste, R. Cell 167, 882–885 (2016).
3. Sellers, E. W., Vaughan, T. M. & Wolpaw, J. R. Amyotrophic Lateral Sclerosis 11, 449–455 (2010).
4. Ramakrishnan, A. et al. Sci. Rep. 5, 10767 (2015).
5. Burget, F. et al. Preprint at http://arxiv.org/abs/1707.06633 (2017).
6. Hubel, D. H. & Wiesel, T. N. J. Physiol. (Lond.) 160, 106–154 (1962).
7. Giancardo, L., Sánchez-Ferro, A., Butterworth, I., Mendoza, C. S. & Hooker, J. M. Sci. Rep. 5, 9678 (2015).
8. Nieto-Reyes, A., Duque, R., Montana, J. L. & Lage, C. Sensors 17, 1679 (2017).
9. Klein, E. et al. Brain-Computer Interfaces 3, 140–148 (2016).
10. Parens, E. Shaping Our Selves: On Technology, Flourishing, and a Habit of Thinking (Oxford Univ. Press, 2014).
11. Kitcher, P. Science in a Democratic Society (Prometheus, 2011).
12. Datta, A., Tschantz, M. C. & Datta, A. Proc. Priv. Enhancing Technol. 2015, 92–112 (2015).

The catalogue that made metrics, and changed science
As new ways emerge to assess research, Alex Csiszar recalls how the first one transformed the practice and place of science in society.

Cataloguers of the Royal Society developed the first record of published scientific research.

In 1830, Charles Babbage had an unusual idea. Exasperated by how little recognition science was getting in England, the computer pioneer and scientific provocateur suggested that quantifying authorship might be a way to identify scientific eminence. Like many of Babbage's radical ideas, this one persuaded almost nobody, but it eventually proved prophetic. Before the end of the century, listing papers and comparing publication counts had become a popular pursuit among scientific authors and other observers. Within a few decades, academic scientists were coming to fear the creed of 'publish or perish' (see 'Catalogues and counts').

This transformation can inform current debates about the value of algorithms for quantifying scientific credibility and
