3. Beyond Narnia: More Problems Await Through the Wardrobe

3. Beyond Narnia: More Problems Await Through the Wardrobe Sharenthood • Sharenthood 3. Beyond Narnia: More Problems Await through the Wardrobe Leah A. Plunkett Published on: Aug 26, 2019 Sharenthood • Sharenthood 3. Beyond Narnia: More Problems Await through the Wardrobe 2 Sharenthood • Sharenthood 3. Beyond Narnia: More Problems Await through the Wardrobe space. It can silently take pieces of information from children and their adults, mine them for more information, and reshare that information with an unknown number of others for unspecified ends. At this point, what is your intuition? Do the bad actors—the identity thieves and others—seem risky enough to lead you to think differently about sharenting? Or are you inclined to see the threat they pose as more avalanche (terrifying but rare) than snowstorm (dangerous but manageable)? How does your risk assessment change, if at all, if you think more about the snowstorm scenario than the avalanche? If you live in an area with winter weather, at some point, you probably will drive in a snowstorm. And if you engage in sharenting, private information about your children will go through the big data blizzard. Time to hit the brakes? Magic Wardrobe2 Let’s say that the identity thief is like the stereotypical burglar who breaks into your house and takes your stuff: you’re left without your possessions and harmed by this loss. The big data thief could be seen as more akin to a customer at a yard sale who buys the old bureau you inherited from your grandmother that you think is worthless, discovers a treasure trove of family photos and other heirlooms inside, and keeps the stash for herself. That’s a helpful but incomplete analogy. Let’s look at where it works. Big data isn’t stealing. You’re welcoming it. You might be rolling out the welcome mat to big data because you don’t realize it’s there. You might know it’s there but think it’s helping you. And perhaps it is helping you or at least not hurting you. 
Let's look at where this analogy breaks down. In your interactions with digital tech and associated big data, you are typically deriving immediate benefits that go beyond the removal of an unwanted possession. To make the analogy more accurate, big data might be like the yard sale customer who gives you a new dresser for free and takes away your old one. The analogy also breaks down because big data doesn't typically deprive you of the use of any of the information you generate yourself: it just uses that information for its own purposes. To make the analogy even more accurate, our yard sale customer might leave you a duplicate set of everything she found in the dresser and then use the set she took for her own purposes. Finally, the analogy breaks down because, in the big data world, you are continually creating new data by engaging with digital tech. The data trail you leave is not a finite set of valuables but an ever-growing one, and one that wouldn't exist but for the digital tech you are using.

Now we're at the point in our analogy where the yard sale customer takes away your old dresser and gives you a new one. For as long as you have it, that new dresser continues to give you new benefits, like a sock matcher so you never lose any socks again. What's so bad about your magic wardrobe? You're starting to think you might actually find Narnia3 after all these years!

Well, you might. But it might also be that, instead of a witch waiting for you on the other side of the wardrobe, the wardrobe itself is bewitched. It starts learning a lot of things about you and your family that you don't even realize it is learning. Let's say you're using the magic wardrobe to house your daughter's clothes.4 The wardrobe is perfectly matching her socks, but you don't notice that it is making a copy of each sock.
The wardrobe also selects your daughter's outfit for each day and coordinates the socks with the outfit. How does the wardrobe know how to produce an outfit that is perfect for the day's events? You gave the wardrobe permission, when it arrived, to communicate with your iPhone calendar via a sensor embedded in its back. That sensor system is also linked to sensors in your daughter's clothes, so the smart wardrobe combines what it learns from those sensors with what it learns from your iPhone calendar to tell you what your daughter should wear. Forget the Lion and the Witch: it's like Mary Poppins has taken up residence in this wardrobe!

You're loving this helpful magic so much that you don't think about what else the magic wardrobe is learning about your daughter. You don't ask if it's figuring out how she's doing in school from your calendar entry that reads "Parent-teacher conference re: bullying issue @ 2 p.m." You don't think about whether it's figuring out how fast she's growing from reading her clothing tags. You don't wonder if it's keeping its discoveries to itself. Your daughter looks awesome, and you have five to ten more minutes each morning to Instagram her #girlpower pics.

What you're actually doing with your children's data in real life is a lot like this magical wardrobe. In exchange for free or inexpensive access to efficient, engaging, interactive digital services and products, you are sharing an ever-expanding amount of your children's personal information with those tech providers. You likely don't realize how much data you are sharing, or how that tech provider can use your children's information and allow an indeterminate number of unidentifiable third parties to use it too. We don't need make-believe to find ourselves in a veritable Fantasia of spying objects.

Out of Narnia, Back to Real Life

Let's move from make-believe enchanted objects to the real-life enchanted objects and other forms of digital tech you're likely using today.
Facebook can add your post about toilet training dilemmas as a data point to its own information about you, as well as to whatever information it is sharing with third parties. Barbie, Elmo, new nanny: it's all data. The question isn't "Who might be interested in this kind of dossier on kids?" but "Who wouldn't be?"

Is this stuff happening already? Yes, it is. We are only beginning to understand the methods and the scope. The rapid pace of tech innovation, the lack of transparency in many major data-related markets, and other factors combine to keep us, as security expert Bruce Schneier tells it, the David to the Goliath of big data.5

Here's what we do know. Federal and state laws impose almost no limits on the ability of parents to share information about their kids online.6 And as soon as private individuals, companies, or nonprofits receive this information from parents, there are few legal limits on what they can do with it. The limits that do exist come from general bodies of law, or from laws that apply to the receiving people or entities, not from specific statutory and regulatory schemes addressing parents' legal rights to divulge their children's private information.

Some significant limits come from criminal law: parents can't steal their children's identities, manufacture child porn, or commit other crimes against or involving their children. Consumer law and contract law require companies to follow their own terms of service and policies, best-practice commitments, and other commitments they make regarding how they will use children's data. A federal statute, the Children's Online Privacy Protection Act, does limit what many private companies can do to collect and use information directly from children under age thirteen. The limit?
The covered companies need to have a parent's permission before collecting and using the data.7 Similar legal limits exist for teachers: they need to obtain parental consent before sharing students' private data, unless an exception applies.8 There are also government actors and institutions outside of education where parental consent is not dispositive. For instance, a juvenile court may be legally barred from sharing information about a child's court case even with parental permission.

But we now find ourselves back more or less where we started: parents stand at the center of a largely consent-based framework for the digital distribution and use of their children's private data. After they have consented to digital data sharing about their kids, either by doing it themselves or by allowing other adults and institutions to do it, the data can travel at warp speed across entities and time.9

Data Brokers

Data brokers facilitate this movement by aggregating and analyzing digital data. Brokers then sell this data to third parties, and those data buyers use discrete data points or larger data sets to engage in data-driven decision making for their own purposes.10 Private companies that collect, store, and share relevant data with individuals and institutions willing to pay are not a new idea. Holdovers from last century include the consumer credit bureaus, real estate brokers, and employment headhunters.