The Heart of the Matter: Patient Autonomy as a Model for the Wellbeing of Technology Users

Emanuelle Burton (Dept. of Computer Science, University of Illinois at Chicago)
Kristel Clayville (Philosophy and Religion Dept., Eureka College)
Judy Goldsmith (Dept. of Computer Science, University of Kentucky)
Nicholas Mattei (Dept. of Computer Science, Tulane University)

Abstract

We draw on concepts in medical ethics to consider how computer science, and AI in particular, can develop critical tools for thinking concretely about technology's impact on the wellbeing of the people who use it. We focus on patient autonomy—the ability to set the terms of one's encounter with medicine—and on the mediating concepts of informed consent and decisional capacity, which enable doctors to honor patients' autonomy in messy and non-ideal circumstances. This comparative study is organized around a fictional case study of a heart patient with cardiac implants. Using this case study, we identify points of overlap and of difference between medical ethics and technology ethics, and leverage a discussion of that intertwined scenario to offer initial practical suggestions about how we can adapt the concepts of decisional capacity and informed consent to the discussion of technology design.

Introduction

Algorithms, robots, and other cyber-physical systems play an increasing role in our everyday lives. These systems already make important decisions that affect us: who deserves parole (pro), who is approved for a home loan (Pérez-Martín, Pérez-Torregrosa, and Vaca 2018), and who is in need of medical care.[1] In the near future, such decision-making systems will be even more deeply integrated into individual and social experience, including driving vehicles (Bonnefon, Shariff, and Rahwan 2016), coordinating the rescue of disaster victims (Imran et al. 2014), and providing care to the elderly (Šabanović et al. 2013). In ways both large and small, current and in-development applications of technology and artificial intelligence (AI) are altering the basic conditions of human experience.

All of these AI-driven decisions are necessarily predicated on comparative value judgments about human worth and human goods: the importance of children's lives vs. seniors' lives in a natural disaster, the value of students' security vs. their personal dignity and privacy at a high-risk high school,[2] or the appropriate course of medical care for a terminally ill patient who is physically and emotionally suffering (Avati et al. 2017). These are the same value judgments that transplant teams make every time they prepare to operate, or that govern how other scarce medical interventions are allocated (Persad, Wertheimer, and Emanuel 2009). These ideas have received some attention in matching and allocation markets that are mediated by AI applications (McElfresh, Conitzer, and Dickerson 2018; Li 2017). Whether the values are pre-determined by developers or companies through a "top-down" approach or are learned by example through a "bottom-up" machine learning approach (Anderson and Anderson 2011), these automated decisions—and the comparative values encoded in the decision-making algorithms—will have a profound impact on people's lives and wellbeing.

But what exactly makes a human life valuable and distinctive? What qualities of internal self or external environment need to be in place for a person to be able to live and act as a person? How do particular changes to their environment enhance, or circumscribe, their ability to be a version of themselves that they recognize and prefer? These are highly abstract questions that require careful thought and consideration of many points of view. Even with extensive training and experience in philosophy, it is difficult to formulate an answer that does not rely only on an individual's moral intuition.[3] Intuition can often be a reliable guide in familiar spheres of life, but it can be highly unreliable when one is designing and deploying technology for groups one is less familiar with, e.g., young programmers making devices for elderly or infirm individuals. In addition, technology can and does create new conditions for human experience where no intuition exists. Hence, it is imperative to develop terminology that is clear and useful for non-philosophically-trained technologists, to enable them to realize their goals for bettering human life.

The proliferation of AI in daily life makes it vital that the technologists who design, build, and deploy these systems are well trained not only to build the technology but also to consider its impacts (Burton, Goldsmith, and Mattei 2018; Burton et al. 2017; Narayanan and Vallor 2014). In this article we argue that technologists also need to be able to think specifically about those aspects of a person that make them recognizable and distinct as a person, and furthermore about how those human qualities are amenable to improvement, or vulnerable to harm, through specific changes in the conditions of daily life. It is imperative that AI ethics develop its own conceptual tools that can account for the particular ways in which AI can impact the conditions of daily life that affect personhood. Equipped with these tools, technologists will be able to discuss both the parameters and the significance of the interventions their designs are making, and to think more concretely about how design and programming choices can protect and enhance the lives of individuals and societies.

Using technology to enhance, rather than diminish, human lives is made particularly difficult by the knowledge gap between those who build and maintain the technologies and those who use them without the technological expertise to understand how they work. Normal, i.e., non-specialist, users face several disadvantages when confronted with even "easy to use" technology. For example, these users are less likely to be aware of potential security breaches, the signs of such breaches, or the steps they might take to prevent them; they are also far less likely to be aware of any customization tools that would enable them to fine-tune their experience for their own personal comfort and convenience. Thus, even at the level of everyday personal technology use, there exists a significant power imbalance between technology experts and non-experts. The depth and scope of that power imbalance grow dramatically if one also considers those experts' professional work designing, building, and maintaining the systems that other users rely on but lack the expertise to understand.

This expertise-based power imbalance, while particularly pressing in technology ethics, is not unique. A similar power imbalance has long existed in medicine, a field whose practitioners need extensive specialist knowledge even as they serve a user base of patients who mostly lack that knowledge. Because of the power imbalance implicit in the vast majority of patient-practitioner relationships, patients are often prevented from making choices about their own care even when doctors or nurses are at pains to leave the choice in the patient's hands (Henderson 2003). To mitigate this problem, medical ethics has developed a family of concepts and practices that help its expert practitioners navigate the inevitable imbalance in power and knowledge (Quill and Brody 1996). We argue that these concepts can be usefully imported, with some significant revision, into technology ethics.

Patient Autonomy

The principle that patients deserve autonomy and independence is central to contemporary medical ethics (Jonsen, Siegler, and Winslade 2015). Patient autonomy as a governing concept in medical ethics is relatively recent; the shift toward it and away from medical paternalism was fueled both by broader social movements that sought to empower the individual and by the development of a more consumerist model of medicine as physicians sought to protect themselves from malpractice (Billings and Krakauer 2011).

In practical medical ethics, the term autonomy has two distinct uses, which are related but which also operate independently of each other. The first usage affirms that the patient deserves autonomy, the power to exert influence over what happens to them; the second concerns whether the patient is able to exercise that autonomy. Because people frequently seek medical care at a moment when they are mentally or physically compromised, it is not enough to affirm that a patient deserves autonomy. Medical providers must take deliberate steps to protect the patient's autonomy: to ensure that the patient is able and empowered to make decisions that reflect their wishes, and that those wishes are respected even when the patient is not capable of asserting them.

Neither dimension of autonomy—autonomy-as-recognition or autonomy-as-exercise—simply exists as a given. Because of the systemic power imbalance between expert care providers and their non-expert patients, two key constraints have been put in place to ensure that the patient's autonomy is honored in practice as well as in principle: informed consent and decisional capacity.

In the United States, when a patient undergoes a medical procedure, that patient must consent to it, and that consent must follow a conversation in which the doctor explains the procedure's risks and benefits to the patient, as well as other treatment options. After this conversation has happened, the patient signs a document acknowledging that the conversation took place, thereby giving informed consent to the procedure.

Footnotes

[1] https://www.ibm.com/watson/health/imaging/
[2] https://www.wired.com/story/algorithms-monitor-student-social-media-posts/
[3] However, many philosophers continue to argue that fundamental moral principles are grasped by intuition rather than reason or observation (Stratton-Lake 2016).

Copyright © 2018, Association for the Advancement of Artificial Intelligence (www.aaai.org). All rights reserved.