Technocultural Pluralism: A "Clash of Civilizations" in Technology?


Paper Presentation, AIES '20, February 7–8, 2020, New York, NY, USA

Technocultural Pluralism: A "Clash of Civilizations" in Technology?

Osonde A. Osoba
[email protected]
RAND Corporation, Santa Monica, CA

ABSTRACT
At the end of the Cold War, the renowned political scientist Samuel Huntington argued that future conflicts were more likely to stem from cultural frictions – ideologies, social norms, and political systems – rather than political or economic frictions. Huntington focused his concern on the future of geopolitics in a rapidly shrinking world. This paper argues that a similar dynamic is at play in the interaction of technology cultures. We emphasize the role of culture in the evolution of technology and identify the particular role that culture (especially privacy culture) plays in the development of AI/ML technologies. Then we examine some implications that this perspective brings to the fore.

CCS CONCEPTS
• Social and professional topics → Governmental regulations; Privacy policies; Transborder data flow.

ACM Reference Format:
Osonde A. Osoba. 2020. Technocultural Pluralism: A "Clash of Civilizations" in Technology?. In Proceedings of the 2020 AAAI/ACM Conference on AI, Ethics, and Society (AIES '20), February 7–8, 2020, New York, NY, USA. ACM, New York, NY, USA, 6 pages. https://doi.org/10.1145/3375627.3375834

© 2020 Copyright held by the owner/author(s). Publication rights licensed to ACM. ACM ISBN 978-1-4503-7110-0/20/02. https://doi.org/10.1145/3375627.3375834

1 INTRODUCTION
At the end of the Cold War, Samuel Huntington argued that future conflicts were more likely to stem from cultural frictions – ideologies, social norms, and political systems – rather than political or economic frictions [9, 10]. Huntington was focused on the future of geopolitics in a rapidly shrinking world. But his argument applies as forcefully (if not more so) to the interaction of what we might call technocultures.

This discussion will use the term technoculture to refer to the global patchwork of interacting technology ecosystems in which we now live. These do not have to be geographically or geopolitically constrained. But they often do rehash geopolitical boundaries because of the influence that local governance exerts on technology development and adoption. As a simple intuitive illustration of such distinct ecosystems, observe the variation in popular choice of social media platforms across the globe circa 2016 (Figure 1). Given the global reach of such tech platforms (e.g. Facebook, WhatsApp, TikTok), these variations can give noisy hints about where technocultural fault-lines lie. Differentiation into these ecosystems is likely characterized not just by concordance in tech adoption, but also by consensus in culture, values, policies, tech innovation, and deployment priorities.

We develop and explore two key hypotheses in relation to technocultures:

• [Technocultural Frictions]: An AI "technocultural cold war" is already in progress. This refers to a state of ongoing regulatory friction among multiple interacting technocultures or governance regimes. Factors like effective geographic proximity, political necessity, and/or economic advantage conspire to make technocultural isolation rare. And the inevitable interactions result in frictions. The focus here is on competitive or adversarial frictions [1]. Put differently, technocultural friction refers to regulatory frictions due to the necessary interaction among technology policy spheres of influence [2].

• [Technocultural Pluralism]: The prospect of a global monolithic AI technoculture emerging in the near future is implausible [3]. Persistent pluralism is more likely. Pluralism here refers to persistent diversity in the global technoculture. The primary support for this hypothesis is admittedly an extrapolation, based on two premises: (1) technology is strongly influenced by local culture; and (2) there continues to be strong global variation in cultures and values. A basic extrapolation then suggests that technology regimes will continue to reflect the diversity of cultures.

These hypotheses are not necessarily AI-specific. But the current efflorescence of innovation in data-hungry machine learning technology provides a good sandbox for making our discussions more concrete.

This paper has two aims. The first aim is descriptive (like most of Huntington's original 1993 discussion): to describe the underlying factors and dynamics that foster the development of differentiated technocultures. We identify and clarify some key concepts in the process. For example, we paint a clearer picture of the concept of a technoculture. This descriptive exploration is intended to also serve a persuasive function. Technocultures are easier to track once we observe how the warp and weft of technology innovation, deployment, culture-specific norms, and regulation "conspire" to differentiate our global technology environment. The second aim is to extend beyond pure description by highlighting notable dynamics and implications of technocultural pluralism. It is worth highlighting specifically the important implications of data privacy policies, data localization, and population size as mechanisms for differentiation and evolution in current instances of technocultures.

Part of the motivation for this discussion is to counter a specific perspective: one that anticipates a future regulatory scenario featuring a monolithic global technology ecosystem with little to no geographic cultural variation. Although this is admittedly a strawman position, elements of it arise in technology policy conversations. The narrative of an impending technology monoculture may be seductive because it promises a future with a simpler tech regulatory environment. But the simplicity of that hypothetical future is not necessarily an argument for the likelihood of its occurrence. Or its desirability... What can we say about the governance implications? Hopefully this exploration starts us off with basic tools to gain more insight into these types of questions.

Figure 1: Geographical variation in the relative popularity of social media platforms based on 2016 third-party data. Populations in different jurisdictions tend to co-adopt signature collections of social media platforms. Examples of such collections include (WeChat, Sina Weibo, Qzone, Baidu Tieba) and (Facebook, YouTube, WhatsApp, Google+). There can also be variation in rates or patterns of adoption within the same platform collection. For example, the figure shows that North and South America share the same vector of co-adopted platforms, but the two demographics are separable based on the relative rates of adoption for the individual platforms within that collection. Interestingly, the observed platform co-adoption clusterings align quite closely with the cultural fault-lines Huntington outlined almost 30 years ago. We can roughly make out Western-Europe-USA-Australia, China, Eastern Europe, Japan, and Islamic-Hindu spheres of influence. (Data courtesy of GlobalWebIndex.net. © Joshua S. Mendelsohn.)

2 TECHNOCULTURE: A DEFINITION
First there is a question of what we mean by a technoculture. The term technoculture here refers specifically to the combination of a technology ecosystem and the culture [4] in which it is embedded. The concept of a technoculture forces an engagement with questions of how cultural contexts affect, influence, or determine the evolution, deployment, and adoption of technology artifacts. This includes questions about the controlling innovation culture, the prioritization of problems for technological innovation, expected modes of deployment, etc.

Footnotes:
[1] Not focusing on military frictions in this discussion in spite of the use of the "cold war" metaphor.
[2] Whereas Huntington wrote about "civilizations," we can think of the relevant units of analysis here as policy spheres of influence. These may be national (e.g. China, USA), subnational (e.g. California, Texas), or even supranational (e.g. the EU) aggregate entities that exert some form of regulatory control over their geographical jurisdictions.
[3] This is not asserting impossibility in the long run. That would require strong claims about cultural evolution that cannot be supported by the simple inductive extrapolation suggested here.
[4] The word culture tends to raise intellectual hackles because of its supposedly nebulous or imprecise definition. However, definitional imprecision is not sufficient reason for asserting non-existence. We use culture here to refer to persistent societal norms and values that circumscribe observed behavior. The social psychologist Geert Hofstede defines culture as "the collective programming of the mind that distinguishes [groups]." Less abstractly, a recent exploration
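The co-adoption clustering described around Figure 1 can be sketched computationally: represent each jurisdiction as a vector of platform adoption rates and group jurisdictions whose vectors point in similar directions. The sketch below is purely illustrative — the platform set, the adoption rates, and the similarity threshold are invented toy values, not the GlobalWebIndex data behind Figure 1, and the greedy grouping is a stand-in for whatever clustering method produced the figure.

```python
from math import sqrt

# Toy, illustrative co-adoption vectors (NOT the GlobalWebIndex data):
# fraction of surveyed users active on each platform, in the order
# (Facebook, WhatsApp, YouTube, WeChat, Sina Weibo, Qzone).
adoption = {
    "USA":     [0.80, 0.30, 0.75, 0.02, 0.01, 0.01],
    "Brazil":  [0.85, 0.75, 0.70, 0.01, 0.01, 0.01],
    "Germany": [0.65, 0.60, 0.70, 0.02, 0.01, 0.01],
    "China":   [0.02, 0.01, 0.03, 0.85, 0.60, 0.55],
}

def cosine(u, v):
    """Cosine similarity between two adoption-rate vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

def cluster(vectors, threshold=0.9):
    """Greedy single-pass grouping: a jurisdiction joins the first
    cluster whose seed member it resembles closely enough; otherwise
    it seeds a new cluster."""
    clusters = []
    for name, vec in vectors.items():
        for members in clusters:
            if cosine(vec, vectors[members[0]]) >= threshold:
                members.append(name)
                break
        else:
            clusters.append([name])
    return clusters

print(cluster(adoption))  # → [['USA', 'Brazil', 'Germany'], ['China']]
```

Even with invented numbers, the sketch shows the two properties the figure relies on: jurisdictions sharing a platform collection cluster together, while differing relative rates within a collection (USA vs. Brazil here) still leave them separable under a tighter threshold.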