
Obama's Legacy: A Study of Technological Change, the Internet and Data Gathering from the Perspective of a Political Star

By Rik van Eijk (10003025)
Supervisor: dhr. dr. E.F. van de Bilt
University of Amsterdam

2015

Introduction

“I hope we will use the Net to cross barriers and connect cultures...”

...was the answer Tim Berners-Lee, the principal inventor of the World Wide Web, gave when CNN asked him what he thought the internet's biggest impact would be (CNN 2005). In 1990, the general public was introduced to the World Wide Web, developed in the rooms of CERN in Geneva. Through this World Wide Web, people could talk, discuss and refine their opinions on any subject they wanted. The World Wide Web quickly became known simply as the internet. Academics could easily send and peer-review their colleagues' publications. E-mail introduced a fast way to send text to a particular person; families all over the world were able to quickly (re)connect when instant messengers took over and webcams were introduced. Companies got involved and started their own websites. Sites became less static, introduced pictures and videos, and designers and programmers improved websites to optimize the user experience. Games focused more and more on multiplayer functions, because the internet allowed people all over the world to play with and against each other. SixDegrees launched personal profile pages in the 1990s, much as social networks such as Facebook offer today. By selling advertisements based on specific user data, internet companies such as Google and Yahoo form a billion-dollar industry. They started to enhance the experience of the internet with their own web browsers and operating systems and by introducing whole new technologies and standards to society, such as Google Glass or the Oculus Rift. Almost one third of the world's population has access to the internet: from cable and wireless modem connections to a 3G connection on a smartphone, the internet is available everywhere.

In the winter of 2011 I challenged my 120 Twitter followers to try getting from the online shopping mall Amazon to the website of the University of Amsterdam, without typing the UvA address in the browser, without copying and pasting and without the help of search engines. The only thing that was allowed was clicking: the links on different sites would determine whether it was possible to travel from one site to the other. One of my followers actually tried it and needed fifteen steps to accomplish the journey, whilst it would only have taken one query to accomplish this with the help of a search engine. Now imagine that you do not want to go to the website of the University of Amsterdam, but want to know whether CERN published a paper on Pluto's orbit in relation to that of Mars and Venus in 1999 (the answer is no). The point I want to make is that the current experience of the internet has been changed by the implementation of online tools by commercial companies: they make the internet easier to use. These companies, as the short history of the internet above also suggests, use the information of people who enter searches in order to better target their advertisements. The user pays for the service with information: every piece of information that can be obtained will be obtained, analyzed and used for marketing purposes. It is used to read the user and to achieve capital gain. The internet pioneers started with a positive vision: the first users of the internet also became the main contributors to the medium. The position of these contributors has been taken over by tech companies trying to maintain their dominance in internet services and data collection. It is this data collection that recently created a controversy in the discourse of privacy vis-a-vis security. Edward Snowden, an NSA contractor employed by Booz Allen Hamilton, stored classified information which he passed on to journalists of The Guardian and The Washington Post. The discourse of data collection – the so-called big data – shifted from commercial gain towards a security and privacy issue.
The data was not only used to enhance the experience of the internet – to make it easier for people to search for what they want, to communicate and to share. The information also formed the major database for the secret service agencies of the United States and other countries, used to analyze and correlate when and where security threats were most likely to occur. Snowden revealed that the NSA conducts mass surveillance by collecting internet and telephone data of its own and of foreign citizens. The agency even tapped foreign leaders known as allies, such as Germany's Chancellor Angela Merkel. The discourse of internet privacy became relevant overnight: everyone, from politicians to newspapers and bloggers, talked about Snowden's revelations, the effects they had on privacy and the position of the governments that were involved. This paper will not discuss the difficult privacy perceptions of the current state of affairs with regard to data collection by commercial companies. Neither will it discuss the history of the internet, how it should have been used and why it did not turn out that way. Instead, this paper will research the position of one particular man in the internet discourse, a man who has used data-collecting technology in such a way that he now holds residence in the most powerful estate of the world. A man who is smart, charismatic and a good public speaker; a man known for his skin color and his position, but who is now in a difficult position regarding the discourse of the internet and its freedom, the path he chose and the population's awareness of the issue. This paper will discuss the current president of the United States, Barack Obama: his accomplishments and failures with regard to his ideology and the way he uses the internet and the digital sphere. The United States plays a major role in the current data collection debate. As the president of the United States, Obama needed to take a stand in the discourse.
In his State of the Union address of 2014 he said: "That's why, working with this Congress, I will reform our surveillance programs – because the vital work of our intelligence community depends on public confidence, here and abroad, that the privacy of ordinary people is not being violated." With this statement he took a stance on internet privacy for US citizens, whilst not devaluing the needs of the intelligence community. At the same time, publications such as those of Sabato (2010) and Issenberg (2012) showed that social media, internet tools and big data played a major role in the campaign strategies for the elections of 2008 and 2012, either to reach out to voters or to collect donations. This contradiction in Obama's use of data, surveillance and social media to appeal to voters and citizens, combined with his background, education, career and his Change and Hope campaign strategy, led to the subject of this master's thesis. One of the main subjects of Obama's campaign was the introduction of a collective, nation-wide health care system. Through his experiences as a community organizer on the South Side of Chicago, Obama realized that a collective health care system would solve major problems of the poor. The Patient Protection and Affordable Care Act (PPACA, also known as Obamacare) was signed into law by Obama in 2010. The accompanying website was launched in October 2013, but faced major technical problems. One part of Obama's solution was to ask CEOs and other prominent employees of tech companies to advise him on the issue, inviting them into the White House. Among the people invited were Tim Cook, CEO of Apple, Reed Hastings, CEO of Netflix, and Dick Costolo, CEO of Twitter (Hall 2013). The introduction of these CEOs raises the question how Obama and his administration interpret the internet. As described in the first chapter of this thesis, there are two dominant categories in the current internet paradigm, as identified by Morozov: the cyber-utopians and the internet-centralists. According to Morozov, both categories have an unrealistic view of the internet and its possibilities. Cyber-utopians think of what the internet should be, in all kinds of ways – a set of political, social and technological beliefs:

Cyber-utopians ambitiously set out to build a new and improved United Nations, only to end up with a digital Cirque du Soleil. Even if true – and that's a gigantic 'if' – their theories proved difficult to adapt to non-Western and particularly non-democratic contexts (Morozov 2011: xvii).

Internet-centrism does not exactly oppose cyber-utopianism, but it certainly does not agree with it. Where the cyber-utopians think about the what of the internet, internet-centralists think about how practices should be done:

Unlike cyber-utopianism, Internet-centrism is not a set of beliefs; rather, it's a philosophy of action that informs how decisions, including those that deal with democracy promotion, are made and how long-term strategies are crafted. (...) Internet-centralists like to answer every question about democratic change by first reframing it in terms of the Internet rather than the context in which that change is to occur. They are often completely oblivious to the highly political nature of technology, especially the internet, and like to come up with strategies that assume that the logic of the internet, which, in most cases, they are the only ones to perceive, will shape every environment that it penetrates rather than vice versa (Morozov 2011: xvii).

Following the path paved by Morozov, I will examine the current state of affairs in the Obama administration from both the cyber-utopian and the internet-centralist perspective. I will examine the life of the young Barack Obama and the influence of the counterculture that eventually led to the utopian internet vision of Silicon Valley (which will be chronologically explained in the second chapter of this paper). Obama is known to have used digital technologies, such as social media platforms and big data, particularly in his (re)election as president. Moreover, as president, he dealt with internet-related issues, such as the SOPA/PIPA debate, his Obamacare project and the revelations of global espionage by the NSA and affiliated American secret services through WikiLeaks and Edward Snowden. What is his stance on these issues, does he influence them, and how aware is he actually of the technology? Why does he invite the CEOs of tech companies to the White House, whilst they were not invited as advisors in the SOPA and PIPA hearings in Congress? What are the position and function of the prominent tech people, their culture and their technology in the race for Obama's election and re-election? Questions such as these will be answered in this paper.

Careful analysis of several biographies, such as Kloppenberg's Reading Obama, in combination with Obama's speeches, his own writings, his campaigns and his legislative decisions on technological issues while in office, such as the SOPA and PIPA acts, will provide an answer to the above-mentioned questions. The main objective of this paper is to place Obama in the society described above, where the internet is commonplace and data gathering a must, and, most importantly, to show how Obama uses this technology to reflect his own ideology. This paper is divided into three parts. First comes a theoretical background of the current digitized world, relating it to the theory of cybernetics. It is followed by an account of how Obama and his campaign team used digital technology to reach the presidential seat, a seat that is a necessity for carrying out Obama's ideology. The last chapter will focus on the use of digital technologies during Obama's terms. The result is a study of Obama in which his youth, education, career and presidency are all evaluated from a technological perspective: the internet, social media, big data and more. The result will be Obama's technological legacy.

Cybernetics and the Commercialization of the Internet

In the spring of 2014, the Associated Press reported the existence of a Twitter-like social medium especially created for the Cuban market, called ZunZuneo and launched in 2009. So far, nothing seems odd about this story, until one discovers that the medium was created by the US Agency for International Development, a federal international development organization run under the aegis of the Department of State. The plan was to introduce the medium among Cuban youth so they could post about the weather, sports and other subjects. After a while, ZunZuneo was supposed to become a medium comparable to the role Twitter played in the Arab Spring, in which several Arab governments fell, partially – but not exclusively – because of the public use of Twitter (and other social media). US officials hoped that a Cuban Spring would occur with the help of ZunZuneo, but the medium failed in its purpose and was discontinued in 2012 when the grant money ran out (The Guardian 2014). The example of ZunZuneo can be linked to 2009, when a controversy arose around the Iranian elections. "The Revolution Will Be Twittered" was the title of the first of a series of blog posts on The Atlantic by Andrew Sullivan (2009), or, as Jon Stewart, the satirical news anchor of The Daily Show, put it: "Why did we have to send an army when we could have liberated them the same way we buy shoes?" (Siegel 2011). Though the number of Twitter users in Iran was only a small fraction of the total population, the almost-revolution in Iran changed the way the internet is used in these kinds of political contexts. The example of ZunZuneo portrays the naivety of the current American discourse on internet-related topics: "we should give them – the citizens of authoritarian countries – Twitter, so they will start a revolution".

Cybernetics

In his book The Net Delusion, the Belarus-born Evgeny Morozov argues that the popular image of the internet – social media, blog posts and entertainment sites – does not match the reality one encounters when the internet is put into a political context. In his chapter The Google Doctrine, which deals with the naivety surrounding the Iranian revolution of 2009 and the democratic, Western response to it, the Iran example is explored further. The book critiques the current paradigm in the discourse of political cyberspace by offering another, according to Morozov more realistic, view. The history of internet analysis can be divided into two categories. First there are the cyber-utopians, especially prominent in the 1990s, who reflect a positive, political way of approaching the internet. Members of the other category are the internet-centralists: "While cyber-utopianism stipulates what has to be done, Internet-centrism stipulates how it should be done" (Morozov 2011: xvii). To understand Morozov's point, we first need to examine the two paradigms that he is critiquing. In 1940, Vannevar Bush, a former MIT professor and administrator, persuaded Roosevelt to create the National Defense Research Committee. Through this institution, government dollars for military research were funneled to civilian contractors, including MIT's Radiation Laboratory, better known as the Rad Lab. The Rad Lab's main goal was to develop a more effective way to track and shoot down enemy planes, and, because of the bombings in Britain and the US's own involvement in the war after Pearl Harbor, it grew exponentially during the war years. Though the lab was funded through large bureaucracies, the Rad Lab was a site of flexible, collaborative work and a non-hierarchical management style in which researchers crossed formerly uncrossable boundaries, constantly pursuing the goal of creating better gear for the troops abroad.
It was this flexible, boundary-crossing working style that led to the creation of the computational metaphor and the new philosophy of technology in which it made its first public appearance: cybernetics. The father of this theory was a former mathematics prodigy, Norbert Wiener, who joined the faculty of MIT in 1919 and worked closely with Vannevar Bush on analogue computers in the 1930s. Wiener was interested not only in mathematics, but also in biology, computers and electrical engineering. Together with a young engineer, Julian Bigelow, Wiener created the predictor, a machine meant to embody the philosophy of tracking the future path of airplanes:

Early in the process, Wiener and Bigelow recognized that the enemy bomber and the anti-aircraft system each depended on both mechanical and human components. From a theoretical point of view, this combination of the organic and the mechanical presented a problem (Turner 2006: 21).

They began to imagine the pilots of the bombers as mechanical devices, so that they were able to model these men's behavior:

… [B]y means of the same imaginative transformation of men into information processing devices, Wiener and Bigelow offered up a picture of humans and machines as dynamic collaborating elements in a single, highly fluid, socio-technical system. Within that system, control emerged not from the mind of a commanding officer, but from the complex, probabilistic interactions of humans, machines, and events around them. (...) In the predictor Wiener and Bigelow presented an example of a system in which men and machines collaborated, amplifying their respective capabilities, sharing control and ultimately serving the human good of stopping the Nazi onslaught (Turner 2006: 21).

Two years after the publication of Cybernetics, and after applying his philosophy to fields such as biology, Wiener published his book The Human Use of Human Beings: Cybernetics and Society. According to Wiener, society as a whole could be seen as a mechanism, receiving and sending messages; people and machines alike were simply patterns of ordered information in a world otherwise tending to entropy and noise. Against Vannevar Bush's vision of dismantling the intense collaborative research climate after the end of the war, that climate continued to exist in the military-industrial complex, becoming an everyday practice and training a whole generation of computer engineers, scientists and technicians. Wiener's cybernetics found its way into research projects and academic disciplines – management theory, clinical psychology, political science, biology and ecology – and ultimately into the urban renewal projects of Lyndon Johnson. "By the late 1950's, many Americans had begun to fear that the military, industrial and academic institutions that had brought the atomic bomb into being, were beginning to transform all of American life" (Turner 2005: 28). Cybernetics as a computational metaphor implied that the dominant minority – the minority controlling the military-industrial complex – would dehumanize man's role: man would be fed into the machine or controlled for the benefit of collective organizations. This vision found a large and passionate audience on the college campuses of the 1960s, where students rose up against this dehumanization of people, not wanting to be cogs in a machine, using their intellect to feed the bureaucracies of the military-industrial complex; they demanded the humanization of the universities. At the same time, the proportion of students rose exponentially, from fourteen percent in 1961 to fifty percent in 1970. These students did not have difficulties finding jobs, but they feared feeding the postwar industry – feeding the Vietnam War. This anxiety led to the creation of two different kinds of youth movements, combined in the concept of the counterculture:

The first grew out of the struggles for civil rights in the Deep South and the Free Speech Movement and became known as the New Left. (...) The second bubbled up out of a wide variety of cold war-era cultural springs, including Beat poetry and fiction, Zen Buddhism, action painting, and, by the mid-1960s, encounters with psychedelic drugs. If the New Left turned outward, toward political action, this wing turned inward, toward questions of consciousness and interpersonal intimacy, and toward small-scale tools such as LSD or rock music as ways to enhance both (Turner 2006: 33).

The New Left was a movement that used existing tools – politics, networking – to create a new world, while the New Communalists, the second category mentioned above, did the exact opposite and saw the mind as the key to social change. They created communes in the forests and countrysides of California and New Mexico, pursuing Charles Reich's Consciousness III: communities leveled of bureaucracy, harmonious collaborations in which each citizen was honest and open with every other; novels pictured the idealistic ecotopia, a future, car-less, community-driven California (Barbrook & Cameron 1996). They openly declared their rejection of mainstream society through their clothing preferences, sexual promiscuity, music and drugs. They were liberal in the social sense of the word. They turned away from the New Left, the parties and politics in general, and at the same time opened the door to a new kind of mainstream culture, particularly to high-tech research culture: "If the mind was the first site of social change, then information would have to become a key part of a countercultural politics" (Turner 2006: 38). As Turner claims, the New Communalists did not reject technology, as many historians have suggested. "Even as they set out for the rural frontier, the communards of the back-to-the-land movement often embraced the collaborative social practices, the celebration of technology and the cybernetic rhetoric of mainstream military-industrial-academic research" (Turner 2006: 33).

Cybernetics offered an ideological alternative in which a world was built around looping circuits and information: "These circuits presented the possibility of a stable social order based not on the psychologically distressing chains of command that characterized military and corporate life, but on the ebb and flow of communication" (Turner 2006: 38). Some even proposed an electronic agora, where the convergence of media, telecommunications and computers would lead to a virtual place where everyone could freely express their opinions, following the thoughts of Marshall McLuhan:

Electronic media … abolish the spatial dimension … By electricity, we everywhere resume person-to-person relations as if on the smallest village scale. It is a relation in depth, and without delegation of function or powers … Dialogue supersedes the lecture (in Barbrook & Cameron 1996).

Encouraged by McLuhan's predictions, collaborative communities on the West Coast started developing new communication technologies for the alternative press and radio stations. These communities believed they were at the forefront of the fight to build a new America, resisting the then-dominant military-industrial paradigm by embracing technology as a tool of communication. The creation of the electronic agora was the first step towards the goal of creating a direct democracy; hypermedia would be the tool to reach it. What followed was a growing do-it-yourself culture in and around Silicon Valley, built upon the free-wheeling spirit of the hippie culture created by the New Communalists. At the time of the emergence of the counterculture, Obama was a child, living either in Indonesia with his mother or attending Punahou School in Hawaii, living with his maternal grandparents. It was at Occidental College that Obama probably first came into contact with the counterculture, reading literature related to the Free Speech Movement, such as Du Bois and the autobiography of Malcolm X (Kloppenberg 2011: 16). Obama left free-wheeling California, at the time home to research laboratories such as Xerox PARC and IBM that would contribute to the creation of the personal computer, to attend Columbia University. At Columbia he wanted to experience racial segregation and its aftermath, eventually exploring the idea of becoming a writer. After his graduation and a short commercial job, he travelled to Chicago to become a community organizer before attending Harvard Law School.

The Californian Ideology

Where the hippies of the 1960s saw technology as a means by which the military-industrial complex dehumanized them, the idea was reversed in the 1990s, when the internet, and earlier communication networks such as ARPANET, became the way to express oneself freely in cyberspace. This bizarre fusion of the cultural bohemianism of San Francisco with the hi-tech industries of Silicon Valley is sometimes called the Californian Ideology, after an article published by Barbrook and Cameron in 1995. This utopian view of the internet in the early 1990s is best explained by Vedel:

Not only did the emergence of the Internet in the 1990s bring about an entirely new communication medium that became inexpensive, instantaneous and user-friendly (in the context of industrialized countries), but it was also accompanied by a new ideology that boasts a new way of being together and a novel polity, which no longer takes place within the bounded territories of nation states, but in an open, deterritorialized, non-hierarchical space (Vedel 2006: 229).

The members of this so-called Californian Ideology – the virtual class, as Barbrook and Cameron call them – shared the free-wheeling spirit of the hippies of the sixties, but were at the same time right-wing minded: the government should stay out of their lives. The virtual class enjoys the cultural freedoms, but is not actively involved in the creation of the ecotopia. These are the people, the core cluster, who started the utopian view of the cyber-utopians and internet-centralists. Where the New Communalists turned inward and started the emergence of a hypermediated world, the Free Speech Movement in the South turned outward, toward political action. Though born too late to participate in this Free Speech Movement himself, Obama was influenced by its members. Confronting him with his race and his definition of it, fueling and expanding his self-consciousness, the Free Speech Movement provided enough thought and meaning to feed Obama's mind. This indirectly influenced Obama to participate politically – wanting Chicago's mayoral seat, according to Kloppenberg – and thus not to participate in the New Communalist lifestyle and its antecedents in California (Kloppenberg 2011: 28-38). One must add that Obama is a gifted legal academic. He graduated magna cum laude from Harvard, where he also held the prestigious position of president of the Harvard Law Review. After graduating, Obama refused several prestigious jobs, started to teach at the University of Chicago and did part-time consulting work. According to Kloppenberg, Obama must be seen as a product of antifoundationalism, particularism, perspectivism and historicism, which can be seen in his books, his speeches and the issues of the Harvard Law Review for which he was responsible – opposing the idea of law and politics as an unchangeable basis on which to build. The internet was formed on the idea of the academic world, in which scholars review and peer-review their colleagues, using citations to refer to other sources. Collectively and by trial and error, knowledge is expanded for the greater good. In 1996, Larry Page and Sergey Brin, two PhD students at Stanford, started the project that would become Google. With the idea of sources and reviews in mind, they started to value sites according to the links between one site and others, using crawlers to index the internet and scoring these links (some carry more value than others). Websites to which many other websites linked received a higher score than websites with few or no incoming links. Offering a search tool on its website, Google is nowadays one of the biggest multinationals, surpassing earlier technology companies such as IBM. Having started with small text-based advertisements, Google still earns the majority of its revenue from selling advertisements on its websites. By collecting data such as (estimated) location, time, possible personal information if the user is linked to Google's social network Google+, and previously collected data, Google is able to personalize advertisements according to the demands and interests of the user of the service.
It is this data collection – the quantification of human thought – that Obama used in his campaigns of 2008 and 2012. Google was not the first start-up of the late nineties (think of Amazon), but Google exemplifies the possibility of making money on the internet.
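The link-valuing idea described above – pages gain standing from the pages that link to them – can be illustrated with a deliberately simplified sketch in the spirit of PageRank. This is an illustrative reconstruction, not Google's actual implementation; the example graph, the damping factor and the iteration count are all assumptions made for the sake of the demonstration.

```python
# Simplified link-based ranking in the spirit of PageRank (illustrative only).

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}  # start with a uniform score
    for _ in range(iterations):
        # every page keeps a small base score regardless of links
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outgoing in links.items():
            if outgoing:
                # a page shares its current score evenly over its links
                share = damping * rank[page] / len(outgoing)
                for target in outgoing:
                    new_rank[target] += share
            else:
                # a page without links spreads its score evenly over all pages
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
        rank = new_rank
    return rank

# Hypothetical link graph: the page that many others link to ends up highest.
graph = {
    "uva.nl": ["cern.ch"],
    "amazon.com": ["cern.ch"],
    "blog.example": ["cern.ch", "uva.nl"],
    "cern.ch": ["uva.nl"],
}
scores = pagerank(graph)
```

In this toy graph, cern.ch receives the most incoming links and therefore the highest score, while pages that nothing links to keep only the small base score – the core of the idea that a link acts like an academic citation.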

This meant that gradually the utopian idea of the internet, the electronic agora, was surpassed by the capitalist motives of companies such as Google and Amazon (and later Facebook, Twitter and others). Money was to be earned by selling personalized advertisements, thus creating a demand for personal information and for information about information: metadata. Instead of internet users creating their own websites and blogs, companies started to offer services, such as the above-mentioned Google search function, but also social media such as Friendster, Twitter, Facebook and YouTube, making it easy to share and publish information, to connect with friends and distant relatives and to stumble upon new knowledge. Obama and his campaign team started a similar web-based service called MyBarackObama, making it possible for supporters to participate in the campaign. These sites all worked through the collection of data and the selling of advertisements – though in the case of MyBarackObama, this meant knowing to whom to sell Obama, instead of offering the best product to a particular person. The campaign was able to collect the data because of a user's own participation in the service: a status update involving certain brand names or politicians might be useful to certain parties. Such a status update is analyzed, and the user is categorized into a certain group. The companies that collect, analyze and categorize their users then offer this service to parties wanting to promote their product or (political) message, in exchange for money. This so-called participation internet is referred to as Web 2.0, a term introduced by Tim O'Reilly in 2004 (O'Reilly 2004). Web 2.0 is seen as having an end-user focus and is built upon user-generated content, as described by Morison (2010). Users become an integral part of the experience of the web, adding new content and information to it daily.
Social network sites such as Facebook and Twitter would not exist without this user-generated content. Data added by one source and altered by a user with new data to provide new information creates something beyond the official purpose: mashups are created, for commercial, open-source or personal use. From then on, information would be the fuel on which the internet runs.
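The categorization of users by the content of their status updates, described above, can be illustrated with a deliberately naive sketch. The keyword lists and group labels below are invented for this example and bear no relation to any real platform's or campaign's software.

```python
# Naive illustration of sorting users into interest groups by keyword matching.
# Keyword lists and group labels are invented for this example.

INTEREST_GROUPS = {
    "healthcare": {"insurance", "hospital", "obamacare"},
    "economy": {"jobs", "taxes", "wages"},
}

def categorize(status_update):
    """Return the interest groups whose keywords appear in the update."""
    words = set(status_update.lower().split())
    return sorted(group for group, keywords in INTEREST_GROUPS.items()
                  if words & keywords)

groups = categorize("Worried about taxes and my hospital bills")
# groups == ['economy', 'healthcare']
```

Real systems use far more sophisticated text analysis, but the principle is the same: a freely shared status update becomes a data point that places its author in a marketable category.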

The growth in processing power and data-storage capacity, together with the exponential increase in internet services and the information gathering related to them, creates an almost mythical idea of the possibilities of data. Recent discourse in the field of big data is dominated by these possibilities, such as self-driving cars, the changing landscape of academic research and the quantification of human information, together with increasing concern about the lack of privacy, the question of who owns information and (the prevention of) misinformation. Though the services offered by commercial companies can be used to envision the internet as a utopia, as happened in the Arab Spring, there is, according to Morozov, a downside to the idea of big data: capitalism. In his second book, To Save Everything, Click Here (2014), Morozov discusses the concept of technological solutionism: addressing every problem with a technologically minded idea. In this line of thinking, people in the United States do not need health insurance; they should simply be made aware of their health, and technology should help them – such as the Fitbit, an electronic wristband that collects data on the wearer's health and counts his or her steps, providing information on how to live healthily. According to Morozov, technological solutionism is not the answer to everything, because poor health may stem from poor nutrition, caused by a person needing three jobs to provide for his family. In his two books, Morozov tries to create awareness of a dystopian future of the internet – oddly enough without providing any solution to the problem he describes. With the help of cybernetics, the counterculture, the Californian Ideology and information gathering, one should now have a sufficient theoretical frame in which the subsequent topics of campaigning, Government 2.0, Obamacare, SOPA/PIPA and Snowden can be analyzed.

Campaign

Imagine a young woman. Twenty-two years of age, she is in the prime of her life. She has a job at a communications company, works out at a local gym and parties at the weekend. One morning she receives a letter in the mail containing information from presidential nominee Obama, promising the continuation of birth-control and abortion possibilities and lower taxes for young starters on the labour market. All these promises can be directly linked to the woman's current position in society and in life. By tailoring these points to this person in this particular way, Obama's campaign team enlarged the chance of her voting (or donating) for Obama. The question is whether it was simply good luck to send the mail to this young woman, or whether the campaign team had information about her that allowed them to target her perfectly. It is the way the Obama campaign made use of targeting that made the difference between his campaign and, for example, McCain's in 2008. This chapter will go into the presidential campaigns of 2008 and 2012, both of which Obama won. It will make a connection between Obama's past, including his college life, his ideas about a grassroots campaign in combination with his work as a community organizer, and his practical approach to connecting his followers by means of technology, such as Twitter and Youtube, creating a grassroots campaign in which people felt empowered to help with Obama's Change. At the same time, this chapter will unravel the data-crunching practices of the Obama campaign, relating the grassroots campaign that authors such as Sabato (2010, 2013), Johnson (2009) and Clayton (2010) write about to the possibilities of crunching and analyzing data for the sole purpose of targeting possible voters for either their vote or their donation, as described by writers such as Issenberg (2012), Baker (2011) and Lee (2013). This results in an analysis of Obama's use of digital technology to reach the presidency.

Social Media Marketing

Obama, then a state senator from the South Side of Chicago, started his national career when he was selected by Mary Beth Cahill, Kerry's campaign manager, to deliver the keynote address at the Democratic National Convention in 2004 (Clayton 2010: 22-3). Defeated by Bobby Rush in the 2000 race for a seat in the House of Representatives, Obama ran for the Senate in 2004 and won, defeating Alan Keyes. In the speech, Obama talked about his parents, his grandparents, his background as a kid of many places and the false division into 'Red and Blue states'. The atmosphere of the convention, the speaker and the message were all aligned, people were crying, and overnight Obama became a political superstar. The (media) attention remained, and Obama announced his candidacy for the presidency on the 10th of February 2007, opposing senator and former first lady Hillary Clinton in the race for the Democratic nomination. "Let's talk, let's chat, let's start a dialogue about your ideas and mine" was one of the lines Hillary Clinton used when she announced that she was running for the Democratic presidential nomination in 2008. In a video clip published on the 20th of January 2007 on the social networking and video-sharing platform Youtube, Clinton sits on a sofa, talking or 'chatting' to imaginary people about the current state of affairs. As a second-term senator and former presidential wife, her name was already known, her fundraising possibilities were enormous, she had a better track record of accomplishments than Obama and the polls gave her the lead (Johnson 2009: 57). But then Obama came along. Obama and his campaign team rejected the traditional top-down model of campaigning. This started in 2003, when Obama ran for a Senate seat. David Plouffe, one of Obama's close consultants during all of his campaigns, says about

Obama's campaign philosophy for senator in 2003: "... he was determined to win not with thirty-second ads and clever sound bites, but by building a grassroots campaign throughout Illinois" (Plouffe 2009). This philosophy continued and was perfected for his presidential campaign. Instead of a top-down model of campaigning, Obama created a grassroots campaign; some even call it a movement (Clayton 2010: 136). This rejection of the top-down model can be traced back to Obama's time as a community organizer on the South Side of Chicago, where he learned the tricks of organizing political and social action. He was taught the community-organizing philosophy of Saul Alinsky, but rejected it and started experimenting with organizing change together with his colleagues on the South Side (Kloppenberg 2011: 31-8). A gifted speaker, Obama quickly knew the ins and outs of organizing, which is reflected in his campaign strategy. As his national field director Temo Figueroa says: "We decided we didn't want to train volunteers, we wanted to train organizers … folks who can fend for themselves" (Clayton 2010: 139). According to Obama, real change comes from the bottom up. The objective was to empower the volunteers to make decisions at the local level; to skip the micromanagement of the how-to-campaign guide. His mission to use the internet as the basis for a grassroots campaign fitted well with its appeal to young people, the digital natives who know their way around the internet, and with the demand for Change. These people, young, excited and usually with a low voter turnout, were given the tools to organize, to participate and to spread the Change. By 2008, the internet was a fully institutionalized medium, offering a variety of free services such as games, sharing platforms and news.
The internet is a space in which news and breaking stories follow each other rapidly, where you can befriend someone living 5000 kilometers away or, as Castells

puts it, a space that offers a new way of looking at time and space, introducing his concepts of the Space of Flows and Timeless Time (Castells 1998). In 2004, presidential candidate Howard Dean paved the way for using the internet as a medium to raise money. The Dean campaign employed blogger outreach, online mobilization, online fundraising, social networking and niche outreach in its 2004 primary campaign (Johnson 2009: 153), but because of poorly instructed volunteers the output was not as expected, and Dean finished third (Clayton 2010: 143). As Joe Trippi, Howard Dean's campaign manager, said: "I like to say that we at the Dean campaign were the Wright Brothers. We put this rickety thing together and got it off the ground" (Johnson 2009: 153). The Obama campaign perfected the online strategy with the help of tools such as Facebook, MyBO, MySpace, Twitter and Youtube. Communication over the social networking sites Facebook and MySpace became an important part of the campaign. Unlike in the McCain campaign, Obama's new media department was not part of the campaign's tech team, but a separate team in which team leader Joe Rospars reported directly to campaign manager David Plouffe (Hendricks & Denton 2010: 56). In 2004, active online volunteer Joe Anthony set up the MySpace page barackobama. Over the next two years, Anthony published, improved and maintained information relating to Barack Obama. When Obama announced his candidacy in 2007, the page already had 30.000 friends. The campaign team and Anthony worked on the page in close collaboration and used it as the official Barack Obama MySpace page, but a few members of the team disliked the idea of a volunteer having control of such an important part of the campaign. When Anthony rejected an offer of $39.000, the campaign team went directly to the MySpace administrators and received control of the page.
Although this was not the campaign's finest moment, it illustrates the importance of social tools and of control in the context of campaigning. At the end of the 2008 campaign, the Obama MySpace page

had more than 800.000 friends, compared with McCain's 200.000 (Hendricks & Denton 2010: 56-8). When Barack Obama announced his candidacy on February 10, 2007, Farouk Olu Aregbe, affiliated with the University of Missouri, started the Facebook group "One Million Strong For Barack" (Hendricks & Denton 2010: 57). Membership grew exponentially, reaching almost two hundred thousand within a week. By this time, hundreds of groups had been created in support of Obama's candidacy, but none reached the number of members Aregbe had. In the end there were almost five hundred unofficial Facebook groups, and close to 2.4 million people 'friended' Obama, compared with six hundred thousand friends for McCain. Another major advantage of the Obama campaign was the recruitment of Chris Hughes, one of the founders of Facebook. He joined the campaign team in 2007 and became responsible for the creation of a separate social networking site: my.barackobama.com, launched in 2007. Members of MyBO, as the site was called by campaign members, could participate in events and fundraisers for Obama. At the same time they were encouraged to talk to undecided voters. Hughes called MyBO "the connective tissue" (Johnson 2009: 156). In contrast to the opponent's McCainSpace, MyBO was set up to be a fundamental part of the campaign, used both to target voters and to organize get-out-the-vote activities. Members had conversations with each other rather than receiving information through a top-down system: "The system was designed to harness all of the excitement and energy ... of a charismatic online candidate and channel that energy into real activities that met the goals of the campaign" (Johnson 2009: 156). The idea of a grassroots campaign was reflected in the creation of MyBO. In some areas tens of thousands of volunteers had already organized fundraisers or campaign activities before a paid campaign staffer even showed up.
Almost 75 thousand offline events were organized by more than a million members

with the help of MyBO. The Obama campaign raised nearly 650 million dollars, compared to McCain's 360 million. More importantly, these donations were relatively small (almost always under 50 dollars), but because the campaign reached a large base, the pool of donors was large. Youtube, part of online multinational Google, became a vital channel for the campaign. In 2006 Youtube launched YouChoose, a section of the site devoted to videos of candidates running for various offices. Candidates were encouraged to start a channel and contribute videos to reach people, and viewers could contribute to a candidate via Google Checkout, a payment service. At the same time, Youtube started to host presidential debates (together with CNN). Obama ended with over 1800 videos posted on his Youtube channel, while McCain had 330. Sites such as BarelyPolitical.com started to post videos relating to Obama, such as the famous "I Got a Crush on Obama" song, which was viewed more than sixty million times. Artists participated in their own way by posting songs on Youtube expressing their support for Obama, among them rappers Will.i.am of the Black Eyed Peas and Jay-Z. MySpace, Facebook, MyBO and Youtube played a major role in reaching the youth vote. By popularizing the elections, and especially the suggestion to vote for Obama, the campaign was able to win this crucial part of the electorate from its opponent. However, this major role of new media is not the only reason for the high proportion of youth votes for Obama, as Johnson suggests.
Other factors behind the high proportion of youth votes were the momentum from 2004-2006, when youngsters 'doubled, tripled, and even quadrupled their turnout'; the outreach of youth-vote organizations such as Rock the Vote, which allowed voters to register online, and the New Voters Project, which actively promoted the participation of students in politics; the outreach of the political campaigns themselves, for example by hiring youth outreach directors (for Obama this was Leigh Arsenault); the organization of Camp Obama to let supporters actively participate in the

grassroots campaign; and the above-mentioned use of social media to reach the youth (Johnson 2009: 110-122). Obama received 15 million votes in the 18-29 age range, as many votes as this group cast in the entire 2000 elections. Their support put him over the top in crucial states such as North Carolina and helped him reach 53 percent of the vote nationwide, with 66 percent of the youth vote against 31 percent for McCain (Pew Research 2008). Though all these social media strategies turned out to be profitable (and changed the way campaigns are set up dramatically), there is a hidden layer in the campaign strategy of Obama. The keyword of this hidden layer is data. Behind what looked like a purely grassroots campaign, data made it possible to trace people and place them in certain categories. As mentioned in the introduction of this chapter, the campaign was able to target voters precisely. The hidden layer of the campaign consisted of big data and algorithmic analysis. The next part will go into the data collected by the campaign, for example through its own social networking site MyBO or by buying data from Google and other commercial companies, and into how these data were used to target specific people for specific purposes.

Metamarketing

As a two-party democracy, the US has a history of fierce campaign battles between candidates. The demand for knowing voters is increased by the winner-takes-all principle: when the popular vote in a state tips to one side, all the electors of that state vote for that candidate. The pressure on candidates to publicize themselves is huge, but they find an outlet in the different media platforms at their disposal. The media have always influenced voters.

Initially this happened through the printed press (aided by improvements in transportation), then through radio, television and the internet. In the 1950s, television became the most important medium for reaching voters: a medium of joy and information, standing in every household. The expansion of television made presidential campaign workers aware of its possibilities and led to the first debate between two candidates: the famous Kennedy-Nixon debate of 1960. This was also the period in which (negative) commercials were introduced into US households through television. The continuous interplay between campaigns and the media lasted until the introduction of the internet. By sending particular messages into the households of certain people, keeping age and geographical location in mind, targeting was possible through the placement of commercials. Information about which households needed to be targeted, and where, became of interest to campaign workers seeking to reach voters (Maarek 2011). It is this voter information, in combination with big data, that changed the way Obama campaigned. There are different ideas about big data. Looking at the concept, one might ask: if there is big data, what can be seen as small data, or medium data? One could say that big data is a hype, a buzzword. Seen from a business perspective, big data can be described in terms of three V's (Zikopoulos & Eaton 2011): Volume (kilobytes, megabytes, gigabytes, terabytes, etc.), Variety (movies, blog posts, interlinks, metadata, etc.) and Velocity (real time, near real time, etc.). Commercial companies such as Google, Facebook and Microsoft, but also lesser-known companies such as LexisNexis and Acxiom, collect and sell data (or services relating to collected data). Google is able to collect and label what you search for on its website, on Youtube or in Gmail, because these are all Google services.
By analyzing what is searched, when it is searched and where it is searched, and by cross-linking this information, they are able to create a profile of the person searching. This profile can be used to target consumers with advertorials. With the help of a share button, for example Facebook's Like button, and cookies (small

temporary files stored on your computer, such as the one behind a pre-filled login box when you want to log in to Facebook), they can track and trace you all over the web. By analyzing this information, they know what sites you visit and what you do on the web. A good example of the possibilities of big data is the Google Flu1 project, in which searches for flu symptoms are linked to geographical location; Google can predict weeks in advance when and where a flu epidemic will occur in a country. Considering that Facebook now owns WhatsApp and that Google's browser Chrome is used by more than half of all internet users (W3Schools), these companies are able to track your preferences and target their services at almost everyone.
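The kind of cross-linking described above can be illustrated with a small Python sketch. The event log, user IDs, topics and regions below are entirely invented; the point is only to show how grouping logged events per user yields a crude interest profile, and how aggregating flu-related queries per region yields a Google Flu-style signal.

```python
from collections import Counter, defaultdict

# Hypothetical event log standing in for the cross-linked search, video
# and e-mail data described above: (user_id, service, topic, region).
# All identifiers and values are invented for illustration.
events = [
    ("u1", "search", "flu symptoms", "Ohio"),
    ("u1", "video", "flu symptoms", "Ohio"),
    ("u1", "search", "running shoes", "Ohio"),
    ("u2", "search", "flu symptoms", "Ohio"),
    ("u3", "search", "flu symptoms", "Iowa"),
]

def build_profiles(events):
    """Group logged events per user and count topics: a toy 'profile'."""
    profiles = defaultdict(Counter)
    for user, _service, topic, _region in events:
        profiles[user][topic] += 1
    return profiles

def flu_queries_per_region(events):
    """Aggregate flu-related searches per region, Google Flu style."""
    counts = Counter()
    for _user, _service, topic, region in events:
        if "flu" in topic:
            counts[region] += 1
    return counts

profiles = build_profiles(events)
print(profiles["u1"].most_common(1))   # → [('flu symptoms', 2)]
print(flu_queries_per_region(events))  # Ohio shows elevated flu interest
```

Real systems of course work at vastly larger scale and cross-link far more signals, but the principle of the profile is the same: accumulate, group and count.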

The idea of big data has provoked a diverse set of reactions in the academic and public worlds. The idea of having a big pile of data that you can search for your own purposes has influenced the way science takes place. Chris Anderson, a writer for WIRED, published an article in 2008 called The End of Theory (Anderson 2008), in which he predicts a future of science without hypotheses, in which data will answer every question. Hypotheses formulated up front no longer count, because the data will reveal anything of interest. Lev Manovich (2011: 5) argues this is not the case; rather, big data will create a digital divide between those who have access to the data, for example researchers within Facebook and Google, and those who do not. According to Manovich, one still has to find a way to analyze the data; a big pile is not enough. Though there are more definitions of big data, and the academic discussion of big data and related subjects continues with authors like Shirky (2009), Morozov (2013), and Mayer-Schönberger and Cukier (2013), for the purposes of this thesis the definition of big data, from now on simply data, will be Manovich's: "Big Data is a term applied to data sets whose size is beyond the ability of commonly used software tools to capture, manage, and process the data within a tolerable elapsed time. Big data sizes are a constantly moving target currently ranging from a few

1 Google Flu Trends. Google. Accessed 21-08-2014. http://www.google.org/flutrends/

dozen terabytes to many petabytes of data in a single data set”. This definition will help the reader understand the journey of Obama’s data crunching.

Seventy-two hours before the last voting offices closed, George W. Bush led in the polls of the 2000 presidential elections, which makes it remarkable that the election needed a special recount procedure and that Bush finally won only with the help of the Supreme Court. A special delegation of the GOP started a task force to investigate this odd event, which eventually advised intensifying the principle of micro targeting: approaching people in a clear, non-political way that could change per person or per group (Baker 2011: 100-128). In the 2004 elections, a substantial part of Bush's budget went to voter surveys: what was happening among the people, where was more targeting needed, which persons were eligible targets according to the campaign's principles and which persons were likely to convert to voting for Bush? With the help of focus groups, the Bush campaign started to survey groups in society. In 2005, inspired by the micro targeting activities of the GOP and the re-election of Bush, Joshua Gotbaum, a former member of the Carter and Clinton campaign staffs, started Spotlight Analysis and began cooperating with Yankelovich, a consumer-survey company. By buying commercial data, such as that of LexisNexis and Acxiom, they created a database in which they could search for certain characteristics of citizens, such as financial records, magazine subscriptions and the value of the houses they live in. The keyword of targeting became proxies. With the help of the bought database, they could categorize people according to superficial facts, commonly known in society as prejudices ('people who buy organic products tend towards a more social state'), which they turned into proxies (or simply 'searchable words'). By cross-analyzing these proxies with each other, in combination with previous voting records and focus groups, they were able to divide the whole database into five categories, ranked by the likelihood of voting for the Democratic party.
The middle groups, in comparison with the two

outer groups, were the most interesting, because these are the people open to political change. In total they formed 47 percent of all the people in the database. By cross-analyzing these proxies, each voter could be assigned to a group with a 25 percent margin of error. It was now possible to target certain groups for certain purposes, as the GOP had done with surveys. The difference was that Gotbaum let algorithms do the work that had previously been done by surveyors and focus groups. By the time Obama was the official Democratic candidate for the presidency, 10.000 people were being called and surveyed every week, followed by 1.000 people who took a longer and more intensive survey (Issenberg 2012). These people formed a sample of society. With the help of this sample, every piece of data that could be found about a person was compared with the sample group and categorized. Every possible voter received a number between 1 and 100. The groups 1-10 and 90-100 were not interesting, because the campaign already knew they would vote for either McCain or Obama. The people with a score between 55 and 75 received the most attention from the campaign (Issenberg 2012 and Beckett 2012). Obama's focus was on the group of voters who had voted for Bush in 2004 but had started to dislike the government after Hurricane Katrina and the ongoing wars in the Middle East. Making up nine percent of all voters, these people attached a higher than average importance to family values. By focusing on this group, Obama was able to win three important swing states, though it is impossible to prove that this was achieved by data alone. The collection of data became a goal in itself. The reason MyBO received so much attention from the campaign could be that it represented the grassroots idea of the campaign, but it can at the same time be seen from the perspective of data collection.
Everyone who participated on the MyBO platform generated data: at what times they were online, how active they were, to whom they talked, where they lived, and so on. By analyzing this information and comparing it with, for example, donation records or Facebook profiles, the campaign had real-time

track of every voter: to categorize them, to target them, to get their vote, to get them to participate in the campaign or to donate. Former Google employee Dan Siroker introduced A/B testing: showing different versions of the website to different visitors. By receiving feedback on how long people stayed, which pages they visited and the amount of donations received per layout, the team could perfect the website to attract as many voters and donations as possible (Murphy 2012). In doing so, the campaign database grew by a factor of ten: "accruing 223 million new pieces of info in the last two months of the campaign alone, the issue was integrating and accessing it" (Murphy 2012). Obama won the elections, becoming the first African-American president in history. In his speech on the night of his victory, Obama spoke to more than 100.000 people, emphasizing the collaborative spirit and the grassroots philosophy of the campaign, the organizers, the volunteers and the campaign team that had made the victory possible; emphasizing the 'we':

I was never the likeliest candidate for this office. We didn't start with much money or many endorsements. Our campaign was not hatched in the halls of Washington. It began in the backyards of Des Moines and the living rooms of Concord and the front porches of Charleston. It was built by working men and women who dug into what little savings they had to give $5 and $10 and $20 to the cause. It grew strength from the young people who rejected the myth of their generation's apathy who left their homes and their families for jobs that offered little pay and less sleep. It drew strength from the not-so-young people who braved the bitter cold and scorching heat to knock on doors of perfect strangers, and from the millions of Americans who volunteered and organized and proved that more than two centuries later a government of the people, by the people, and for the people has not perished from the Earth. This is your victory (CNN 2008).

The night Obama became president, the campaign for 2012 began.
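The 1-to-100 support scores and the 55-75 attention band described in this chapter can be illustrated with a short Python sketch. The band boundaries follow the figures reported by Issenberg (2012); the function name and the voters below are invented for illustration.

```python
def contact_priority(score):
    """Toy triage of a 1-100 support score, after the bands the campaign
    is reported to have used (Issenberg 2012). Hypothetical sketch."""
    if score <= 10 or score >= 90:
        return "ignore"      # outcome already certain either way
    if 55 <= score <= 75:
        return "priority"    # persuadable: receives the most attention
    return "monitor"         # everyone else

# Invented voters with invented scores.
voters = {"Ann": 8, "Ben": 62, "Cato": 73, "Dee": 95, "Eva": 40}
targets = [name for name, s in voters.items() if contact_priority(s) == "priority"]
print(targets)  # → ['Ben', 'Cato']
```

The analytical work, of course, lay not in this final filter but in producing the score itself from surveys, voting records and commercial data.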

The 2012 campaign and improved meta-targeting

In 2010 the Democratic party absorbed its worst midterm loss since 1938: six seats in the Senate, 63 seats in the House, while the Republicans captured majorities in 26 states, in 21 of which they also held the governor's seat. "I'm not recommending for every future president that they take a shellacking like I did last night. I'm sure there are easier ways to learn these lessons", the president said the day after. Obama's response was to talk to seasoned advisors, people with a track record who were not involved in the political landscape at that time. This group included a former Senate majority leader; Leon Panetta, Obama's first CIA director; and David Gergen, who had advised presidents of both parties and was now stationed at Harvard. The advice given included comments on his relationship with Capitol Hill, critiquing his habit of only contacting senators and representatives when he needed something done, and on his failure to change the culture of politics in Washington. These conversations were, according to Balz, only a part of his presidential reorientation. Obama needed to negotiate over the Bush-era tax cuts, an arms treaty with the Russians and the Don't Ask, Don't Tell policy of the Pentagon. Obama got through, with the help of his Vice President, and when he departed to Hawaii for Christmas, one member of the traveling party said he was "as happy as I've ever seen him" (Balz 2013: 35-9). Mitch Stewart said about the midterm loss:

Failure is always a better teacher than success, and 2010 was tough. We learned tactically some lessons, but ultimately I think what probably helped us more than anything else is a lot of our volunteers and staff had only been involved in the ‘08 campaign, which was a lot of highs, and unless you were very early on in the process like I was, there weren’t a lot of lows. So 2010 was a good learning experience just in that [it showed us] this isn’t all rainbows and bubblegum. I think it actually helped harden some of our volunteers and staff to prepare for 2012 (Balz 2013: 80).

Jim Messina was one of the people who joined Obama on his trip to Hawaii. In the summer of 2008, when David Plouffe was on the verge of a burnout, Messina had joined the Obama campaign team. In 2011 Obama sent him to Chicago to lead the 2012 re-election campaign. Messina proposed not to rerun the 2008 campaign, pointing to Carter and H.W. Bush, who lost their re-election campaigns after midterm defeats, in comparison with Clinton and W. Bush, who won theirs. But Obama did not want to lose his connection to the people, and wanted to run a grassroots campaign for the second time (Balz 2013: 40). Messina presented a list of five objectives that needed to be met before the elections. The third objective was to improve the use of technology in the campaign, a statement that surprised the president, as he was credited nationally and internationally for the use of technology in his campaigns. But Messina was obsessed with technology and with how it had changed over the past years. Facebook was now the major platform; MySpace was gone. Smartphones were the major symbol of change, and this needed to be incorporated into the campaign. Eric Schmidt, former Google CEO, became a key advisor on everything: "how to manage a start-up, to the kind of computer platforms to set up, to the most efficient placement of online advertising" (Balz 2013: 41). As Schmidt said to Balz: "Jim is extremely analytical, so what he wanted was a technology base and analytical base to make decisions. For advertising, he would like to have a scientific basis for where you put the money, and so we worked at some length to try to understand what kind of marketing made sense" (Balz 2013: 41). After Obama's 2008 victory, a five-hundred-page document filled with recommendations had been circulated among prominent figures of the campaign, proposing changes and improvements to an already state-of-the-art campaign. Mitch Stewart, who had worked as a campaign organizer in the winning 2008 campaign, said:

We did very detailed postmortem where we looked at all kinds of numbers, looking at the general stuff like the number of door knocks we made, phone calls we made,

number of voters that were registered. But then we broke it down by field organizer, we broke it down then by volunteer. We looked at the best way or the best examples in states of what their volunteer organization looked like (Balz 2013: 75-6).

The first step was to connect all the data. By 2012, three different databases of voter information existed, each used by a different part of the campaign. Operation Narwhal was initiated to link the fieldworkers' database, the campaign database and the party database: "we realized there was a problem with how our data and infrastructure interacted with the rest of the campaign, and we ought to be able to offer it to all parts of the campaign" (Issenberg 2012). An external party developed and stored the MyBO information, while the fieldworkers' information was stored in the Build the Hope database. Together the databases contained 170 million potential voters and three million volunteers and donors, but the campaign and the party had no idea which voters, volunteers or donors overlapped. Operation Narwhal unified the databases and provided information per voter, which was used to micro target. Messina:

What I wanted was, I didn’t care where you organized, what time you organized, how you organized, as long as I could track it, I can measure it, and I can encourage you to do more of it. So what I said to them was I want all of our data together, I want for the first time to treat [a voter] like a voter and not like a number, because right now you’re just a voter number, your voter ID number in your state, your FEC number for how much you contribute, your census data, whatever we know about you from commercial vendors. But we don’t treat [a voter] like a person (Balz 2013: 77).
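The unification Messina describes, where one person appears as a voter ID, an FEC donor number and a volunteer record in three unconnected databases, can be illustrated with a minimal Python sketch. All keys and field names below are invented; real record linkage is far messier, since the source databases rarely share one clean common key.

```python
# Hypothetical sketch of Narwhal-style record linkage: three separate
# files, each knowing a person by a different partial record, are merged
# into a single profile per person. All data below is invented.

voter_file = {"OH-001": {"name": "J. Smith", "registered": True}}
donor_file = {"OH-001": {"donated_usd": 25}, "OH-002": {"donated_usd": 10}}
volunteer_file = {"OH-001": {"events_attended": 3}}

def unify(*databases):
    """Merge per-person dictionaries keyed on a shared voter ID."""
    unified = {}
    for db in databases:
        for key, fields in db.items():
            unified.setdefault(key, {}).update(fields)
    return unified

people = unify(voter_file, donor_file, volunteer_file)
# "OH-001" is now one person (voter, donor and volunteer at once)
# instead of three unconnected numbers in three databases.
print(people["OH-001"])
```

The payoff is exactly what Messina asked for: any activity, wherever it was recorded, becomes measurable against the same person.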

Operation Narwhal led to a massive database that formed the foundation on which the 2012 campaign could be built, and to the second step in the improvement of the campaign's use of technology: Dashboard. "Dashboard is what we needed to communicate", Jennifer O'Malley Dillon said. Dillon had taken over and renamed the Obama field organization after the 2008 elections. She and Messina wanted a simple-to-use

program that would allow everyone to communicate simply and seamlessly. Dashboard needed to be the main hub of communication, both among the volunteers and between volunteers and campaign managers. It was hard to develop, partly because the engineers building the system lacked the experience of being a field organizer; some were even sent into the field to better understand field organizers' needs. Dashboard needed to be the field organizer's office, but online. Harper Reed, one of the executive programmers, said:

When you walk into a field office, you have many opportunities. We’ll hand you a call sheet. You can make calls. You can knock on doors, and they’ll have these stacks there for you. They’ll say, ‘Harper, you’ve knocked on fifty doors. That’s great. Here’s how you compare to the rest of them.’ But it’s all very offline. It’s all very ad hoc and it’s not very modern. And so what we set out to do was create that offline field experience online (Balz 2013: 78).

Next to Narwhal and Dashboard, there was a third project, involving Facebook. The campaign had learned from the re-election of Bush in 2004 that people who are not actively involved or interested in politics respond better when approached by a relative or someone they know personally than by an unknown volunteer. When Facebook invited the campaign in early 2011, encouraging it to spend some of its millions in campaign funding on Facebook ads, Messina proposed not to buy ads but to work together:

We started saying, ‘Okay, that’s nice if we just advertise’. But what if we could build a piece of software that track all this and allowed you to match your friends on Facebook with our lists and we said to you, ‘Okay, [so-and-so] is a friend of yours, we think he’s unregistered, why don’t you get him to register? Or [so-and-so] is a friend of yours, we think he’s undecided. Why don’t you get him to be decided?’ And we only gave you a discrete number of friends. That turned out to be millions of dollars and year of our lives. It was incredibly complex to do (Balz 2013: 78).

The result was called Targeted Sharing. With permission, Dashboard could see a user's friends and compare that data with the campaign's lists to produce the three or four people most likely to be persuaded to vote for Obama by someone they knew. The campaign knew who was and was not registered to vote, which of these friends had a low propensity to vote, and who were solid Obama supporters. It proposed a gentle nudge in the right direction. Eric Schmidt: “If you don’t know anything about campaigns, you would assume it’s national, but a successful campaign is highly, highly local, down to the zip code. The revolution in technology is to understand where the undecideds are in this district and how to reach them“ (Balz 2013: 79).

Not only online services supplied information to the databases. The majority of the campaign budget was used to buy television time. In 2008, Obama spent almost 300 million dollars to broadcast a total of almost 550 thousand commercials during the campaign period, with a focus on the swing states. In 2012, the campaign developed its own television audience survey system, called the Optimizer. It divided each day into 96 blocks of 15 minutes and analyzed what was shown to which audience at what time on which channel: “The revolution of media buying in this campaign was to turn what was a broadcast medium into something that looks a lot more like a narrowcast medium” (Issenberg 2012). The audience information provided by commercial companies included geographical information and the age of the audience, but the Obama campaign needed more detail to place viewers into separate categories (Issenberg 2012). Rentrak, an audience-measurement company, provided the campaign with specific information about viewers. To get around privacy laws, the campaign provided Rentrak with information on possible voters; Rentrak compared it with its own database and, for each match, reported what was watched at which moment of the day, with personal information censored. In September another source was added (Buckeye CableSystem, a cable provider), which supplied per-second information per household (Farnam 2012 and Issenberg 2012). This way the campaign found out that the voters it wanted to target – the ones that could vote for Obama – did not watch national television, but niche channels such as The Food Channel or Hallmark.
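The Optimizer's scheme of 96 fifteen-minute blocks can be sketched in a few lines. The greedy cost-per-viewer logic, the numbers and the channel names below are invented assumptions for illustration, not the campaign's actual algorithm, which has not been published.

```python
# Hypothetical sketch of narrowcast ad buying: divide the day into 96
# fifteen-minute blocks and buy the slots that reach target viewers
# most cheaply, rather than the slots with the largest raw audience.

BLOCKS_PER_DAY = 24 * 4  # 96 fifteen-minute blocks

def cheapest_slots(ratings, budget):
    """ratings: list of (block, channel, target_viewers, cost) tuples.
    Greedily buy slots in order of lowest cost per target viewer."""
    by_efficiency = sorted(ratings, key=lambda r: r[3] / r[2])
    bought, spent = [], 0
    for block, channel, viewers, cost in by_efficiency:
        if spent + cost <= budget:
            bought.append((block, channel))
            spent += cost
    return bought

# Invented ratings data: a broad prime-time slot versus cheap niche slots.
ratings = [
    (20, "national network", 1000, 50000),  # many viewers, expensive per viewer
    (20, "niche channel", 400, 4000),       # fewer viewers, cheap per viewer
    (70, "niche channel", 300, 3500),
]
plan = cheapest_slots(ratings, budget=8000)
```

With these invented numbers, the greedy rule skips the expensive national slot entirely and buys the two niche slots, mirroring the campaign's finding that its target voters were cheaper to reach on niche channels.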

Dan Wagner joined the campaign team and was responsible for modeling the elections. In 2010, when Scott Brown won Edward M. Kennedy's seat in Massachusetts, many Democrats were surprised by the electoral defeat. Wagner, using a computer model, had predicted that the Democratic candidate, Martha Coakley, would lose. He delivered his conclusions and data to Messina and said “we’re going to lose, and here’s why we’re going to lose” (Balz 2013: 80). Wagner modeled everything: voters, states, volunteers, donors, anything the team could think of to improve its efficiency. They saved money by telling call centers who not to call. They built a model to predict the likelihood of donating. The same was done with e-mail: by sending e-mails that differed in content and layout, the campaign found out which version led to the most donations. Half of the e-mails were sent the traditional way, while the other half were sent the way Wagner proposed (Scherer 2012 and Balz 2013: 81). “I’m going to over-perform them”, Wagner said, and he did, by fourteen percent. Eight different companies provided polling information, and this data, together with the campaign's own, was run through more than sixty thousand simulations each night. By morning, a report showed where the campaign should focus that day (Scherer 2012).
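The e-mail experiment described above is a simple A/B test. The split procedure, the donation numbers and the helper names below are invented for illustration; only the split-and-compare principle comes from the source.

```python
# Hypothetical sketch of the campaign's e-mail A/B test: randomly split
# the list into two halves, send each half a different variant, then keep
# the variant with the higher donation rate.

import random

def ab_split(addresses, seed=0):
    """Randomly assign each address to variant A or B."""
    rng = random.Random(seed)  # seeded so the split is reproducible
    groups = {"A": [], "B": []}
    for addr in addresses:
        groups[rng.choice("AB")].append(addr)
    return groups

def winner(results):
    """results: {'A': (donors, recipients), 'B': (donors, recipients)}.
    Return the variant with the higher donation rate."""
    rates = {variant: donors / recipients
             for variant, (donors, recipients) in results.items()}
    return max(rates, key=rates.get)

groups = ab_split([f"user{i}@example.org" for i in range(1000)])
# Invented outcome numbers: variant B converts better.
best = winner({"A": (30, 500), "B": (42, 500)})
```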

When one starts a computer and an internet browser and begins searching for something, Google's commercial aim is to know what one is searching for. Based on previously collected data, Google assumes that at certain times of the day, or in certain locations, one wants to search for something in particular. This means that a search for information on the weather or politics shows different results than the same search performed by a user in a different region or with a different political preference. Every Google search is personalized, and thus there is not one internet, but as many different internets as there are internet users. The same holds for the Obama campaign. By starting a grassroots campaign, volunteers received freedom and control over a small part of the campaign.

Rejecting the traditional top-down idea of a campaign and focusing on the grassroots campaign in combination with internet tools, the campaign had to search for alternative ways of control. Data provided information about voters, volunteers and donors, and at the same time provided control by determining what the voter, volunteer or donor needed to see to produce the vote, action or donation required to defeat McCain or Romney. Whether data was the solution to this problem of control, or whether data (and the data business model of Google and Facebook) was the origin of the idea of a grassroots campaign, is a chicken-and-egg question that cannot be answered. The fact is that data meant control, and in a grassroots campaign control seems the hardest part.

Obama's Presidency

The previous chapter addressed how digital technologies such as social media and data aggregation were used by Obama to reach the presidency, both in his election in 2008 and in his re-election in 2012. This chapter analyzes several key concepts and subjects Obama introduced during his presidency. First, the concept of Government 2.0 is explained to create a theoretical background; the issue of transparency is also addressed in that subchapter. After that, Obamacare is discussed. Obamacare is the result of Obama's successful push to introduce a nationwide healthcare system. The main analysis of that subchapter concerns the Obamacare website, as its introduction was not flawless. Next to Obamacare, the debates around SOPA and PIPA and the Snowden revelations are discussed. During his terms, Obama faced several important socio-technological controversies relating to technology, and especially to the issues of privacy and data gathering. Both controversies relate to privacy: a government changing, or keeping secret, policies on the streaming or gathering of citizens' data. The ubiquity of the internet and people's dependence on it, both commercially and personally, have led to a societal discussion of the thin line the government walks between security and the privacy of citizens. The SOPA and PIPA bills and the debate around them conflicted with the ideas of the online community and of technologically minded people and multinationals (especially those found in Silicon Valley). The debates will be placed in the context of current ongoing debates on net neutrality. The last section addresses the revelations of classified material by Edward Snowden in the summer of 2013, which made public the ways government bodies were able to aggregate data for analysis, with the prevention of terrorist attacks as these bodies' main justification.
Though these programs started before Obama was elected, the revelations by Edward Snowden happened during Obama's presidency, which makes how Obama addressed the issue worth analyzing. The chapter results in an analysis of how Obama used the digital technologies of his campaign during his presidency. It makes a distinction between issues directly involving the White House, such as Obamacare, and issues not directly related to it, such as the SOPA and PIPA debate, which originated in Congress. The Snowden revelations are a somewhat more interesting case: they are directly linked to White House policy, while at the same time not directly linked to Obama, as the programs were initiated before he became president. The chapter subsequently addresses how digital technologies and their fast-paced change problematize the bureaucratic process of democracy and challenge existing powerful interest groups.

Government 2.0

When one looks at the term Government 2.0, one could easily associate it with Web 2.0: it implies a second generation of government. This subchapter goes into the Government 2.0 policy of the Obama administration in his first and second terms. A simple form of Government 2.0 is the use of social media by politicians; a step higher is encouraging feedback from citizens who are able to contact the politician. The third step is to create a participatory space in which citizens and government can debate government services. A fourth step is to publish non-personal data for mash-up use (Morison 2010: 561). Simply defined, a mash-up is a new kind of information created by combining two existing datasets (mostly found in the digital space), producing information that was beyond the official purpose of the data.
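The mash-up idea can be illustrated in a few lines. The datasets and the per-capita statistic below are invented; the point is only that joining two independently published datasets yields information neither was published for.

```python
# Hypothetical mash-up: combine an (invented) public-safety dataset with an
# (invented) census dataset to derive per-capita incident rates, a statistic
# neither dataset was originally published to provide.

incidents = {"District 1": 120, "District 2": 45}        # e.g. a public-safety dataset
population = {"District 1": 30000, "District 2": 9000}   # e.g. a census dataset

def mashup(counts, pop):
    """Join the two datasets on district and derive incidents per 1,000 residents."""
    return {d: counts[d] / pop[d] * 1000 for d in counts if d in pop}

per_capita = mashup(incidents, population)
# District 2 has fewer incidents in absolute terms but a higher per-capita rate,
# an insight neither source dataset contained on its own.
```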

According to Morison, Web 2.0 was built with an end-user focus and upon user-generated content. In the fourth step of Government 2.0 another qualification is added: “The availability of information of a qualitatively new nature is the third key component which, when added to the existing elements of end user focus and user generated content, produces something with hugely significant potential” (Morison 2010: 560). Whether the actor is a government body or an individual is not important; what matters is that the new information adds an opinion, perspective or revelation to an issue.

On his first day in office, Obama launched the Memorandum on Transparency and Open Government (Orszag 2009), initiating measures to promote transparency, participation and collaboration. It directed agency heads to use new technology in their policies. The important changes were that Obama wanted all government data to be published online; to improve the quality of government data; to create and institutionalize a culture of open government; and to create an enabling policy framework for open government. In September 2011 the Obama administration published the Open Government Partnership National Action Plan (White House 2011). According to its website, the Open Government Partnership aims to secure concrete commitments from governments to promote transparency, empower citizens, fight corruption, and harness new technologies to strengthen governance.2 The first members of the Open Government Partnership were Brazil, Indonesia, Mexico, Norway, the Philippines, South Africa, the United States, and the United Kingdom, as well as a group of international civil society (non-governmental) organizations; by now over 55 countries are part of the partnership. The National Action Plan consists of twenty-six commitments, divided into four different sections:

● Already accomplished commitments, like the memorandum Obama sent to his agency heads on his first day in office, but also making non-personal governmental data public through the website Data.Gov, and publishing the records of who visits the White House.

● Open Government to Increase Public Integrity, which involves commitments relating to the participation of citizens in government, such as the We The People platform, a way for citizens to propose subjects of change to the government. When enough people sign a petition, the government commits itself to look into the subject and come up with an official response. Other commitments to increase public integrity are expanding declassification regulation and the protection of whistleblowers.

● Open Government to Manage Public Resources More Effectively, which involves commitments relating to industries and to information on the results of resources, such as the website Performance.Gov.

● Open Government to Improve Public Services, which involves commitments to improve government websites, to use Data.Gov to promote innovation, and to publish data that helps consumers and scientists.

In December 2013, Obama announced the second stage of the National Action Plan, containing another twenty-three commitments to improve or introduce new possibilities for opening up the government, improving transparency and preventing corruption. First of all, the second National Action Plan proposes improvements on previous commitments, like advancing the protection of whistleblowers; technological advancements, like application programming interfaces (APIs) on platforms such as We The People and Data.Gov; and the implementation of laws. Secondly, big changes in the second version are the improvement of transparency around the Foreign Intelligence Surveillance Act (FISA) and of the privacy policies of government bodies. This second action plan should lead to an increase in transparency within the government.

2 “Open Government About Page”. Open Government Partnership. Visited: 19-08-2014. http://www.opengovpartnership.org/about

Next to increasing the transparency of the government, a large part of the National Action Plan's focus is on opening up government data to citizens. As mentioned above, the government opened the website Data.Gov to provide this information. As of today, more than one hundred thousand datasets are available on the website, divided into twenty-one subjects such as education, law, public safety and finance. The information can also be divided by city, county or state scope. Data is provided in PDF, Excel or CSV form, readable on an ordinary personal computer or viewable in a web browser. Data.Gov itself is an open-source platform, and the government encourages developers to improve it. To support and streamline the introduction of this data to the public, Obama created a Chief Technology Officer position, first held by Aneesh Chopra and later by Todd Park. In an interview with CNN in 2012, Park said the following:

My job is to be tech entrepreneur-in-residence at the White House. My role is to be an internal change agent that works with the best innovators inside and outside of government to conceive of, and then execute, at high speed, a portfolio of initiatives that unleash the power of data, tech and innovation to improve the well-being of the American people. And about 20% of my time is functioning a senior advisor on [these] issues. (...) The president's team asks me to get involved in everything from, say, helping to move the health insurance exchange programs along to assisting on, say, initiatives to help streamline the student loan process. It actually tends to be issues that relate to how technology and data can be applied to advance the ball on key national priorities, like education, energy, health care, public safety, job creation (Ferenstein 2012).

In his book Facebook Nation, Newton Lee quotes Steven VanRoekel, Obama's Chief Information Officer at the time, saying that America has to become a Facebook Nation, which demands increased transparency and interactivity from the federal government. With the introduction of Web 2.0, users became an integral part of the experience of the web. Government 2.0 is the result of an increased demand from users to be involved in the creation of content: platforms such as We The People were created to meet this demand and to let citizens (or users) become an integral part of the government. For example, when one enters the website of the White House, one is asked to join a mailing list to stay updated on government policies. Links to social websites such as the White House Facebook page can be found on the homepage. Data collection, as in the campaign, becomes an integral part of the government, almost copying commercial technology companies such as Google and Facebook (Lee 2012: 117). First of all, data can be used in policy making; as Park said, to “help streamline the student loan process”. Another part of data is more commercial. Park:

It's the notion of government taking a public good, which is this data – say weather data, or the global-positioning system or health-related knowledge and information – making it available in electronic, computable form and having entrepreneurs and innovators of all stripes turn it into an unbelievable array of products and services that improves lives and create jobs (Lee 2012: 117).

These jobs and life improvements are reflected in apps such as SeeClickFix (an app for reporting non-emergency issues with the help of GPS), Street Bump (which uses a phone's built-in accelerometer to detect potholes and road irregularities while one is driving), Eco Finder (which shows where to deposit hazardous material in the San Francisco area, currently called Recycle Where), Are You Safe (which shows which neighborhoods are safest to live in), Discover BPS (which informs about Boston Public Schools) and Adopt a Hydrant (which encourages people to shovel out fire hydrants after it snows). Data.Gov provides a list of over three hundred applications made with the help of the data and accessible to the public. Though these apps were not made by the Obama administration, they reflect the administration's grassroots campaign by letting citizens take action and participate. They also reflect the idea of creating new information beyond the purpose of the original data: mash-ups. As Clay Johnson, director of Sunlight Labs, says: “I see [the Gov 2.0 applications] as the death of a passive relationship with government. (...) Instead of people saying ‘Well, it’s the government’s job to fix that’, … people are taking ownership and saying, ‘hey, wait a minute. Government is us. We are the government. So let’s take a responsibility and start changing things ourselves’” (Lee 2012: 119).

The data that is collected is used by (local) government to provide better services to citizens. Instead of delivering passive services, the government has to change into Government 2.0 (O'Reilly 2009). Where Web 2.0 changed the internet and the experience of the personal computer by delivering new and unique features, user-generated content and the interlinking of platforms and data through APIs, Government 2.0 means that, as Steven VanRoekel proposes above through his Facebook Nation metaphor, the government has to change from a passive service provider into an active platform. State-funded projects such as GPS and ARPANET created new standards within society: they were changed from passive services usable only by the government into active services, just as Government 2.0 should be. As with Data.Gov today, data needs to be provided to serve people's interests, leading to renewed participation, new apps and new jobs. The implementation of social tools such as SeeClickFix on the one hand, and data aggregation on the other, reflects how Obama's campaign was structured. He surrounds himself with tech-savvy geeks. On the one hand he does so to provide information to and connection with citizens, just as he did with his own social media site MyBO, Facebook and Twitter in his campaign. By creating a form of Government 2.0, as he did by opening up government data through Data.Gov, Obama depended on the innovation of people and start-ups to provide new insights. On the other hand, the services provide data – the new insights – to use in policy making. Obama's grassroots campaign did not only put him in the most prestigious seat of the United States; it also changed the way people look at the government.
The implementation of the social tools mentioned above meant data. This data, once again, could be used by local or federal government as a basis for policy: to see what is popular and what is not, and thus to let citizens indirectly influence the democratic process through technology rather than voting.

Obamacare

During his three years as a community organizer on the South Side of Chicago, Obama experienced the problems caused by a lack of health insurance. In The Audacity of Hope and in many of his speeches, Obama addresses the fact that healthcare is a divisive concept within the American population (Obama 2006: 184-7). In his book he acknowledged that his proposal might be too progressive, but he nevertheless proposed letting an independent group of medical professionals create a list of basic health insurance needs. This would not solve the problem of health care, but it would prevent unnecessary costs: “With the money saved through increased preventive care and lower administrative and malpractice costs, we would provide a subsidy to low-income families who wanted to purchase the model plan through their state pool…” (Obama 2006: 185). These pools would have to be set up by every state to provide the subsidy. In this way Obama took advantage of the nation's federal structure to conduct controlled experiments in the states (Kloppenberg 2011: 166). Thanks to Mitt Romney, Obama's Republican opponent in the 2012 presidential elections and former governor of Massachusetts, Obama had the results of a state-mandated health insurance system. The plan eventually adopted in March 2010 resembles the Massachusetts model more than any other model proposed by either side. Starting the process in his first year of presidency, Obama managed it through his willingness to compromise instead of demanding a decisive victory. On both sides of the political spectrum, the issue of healthcare was deeply debated.

Obama, with Tocqueville in mind according to Kloppenberg, realized that “Americans have always sought a variety of goals consistent with their very ideals and aspirations. Democracy means squabbling about differences, reaching tentative agreements, then immediately resuming debate” (Kloppenberg 2011: 178). In Tocqueville's time, France was in a mess of revolution turning into civil war: “[Tocqueville] marveled at the willingness displayed by Americans of different backgrounds to find common grounds. Or at least to tolerate their differences” (Kloppenberg 2011: 179). Instead of pushing for the single-payer model, Obama waited patiently and let both houses debate. He heard objections and new alternatives. When this was done and Obama noticed that his efforts to reach a compromise for the common good were categorically rejected, he jumped in and threw himself into the battle (Kloppenberg 2011: 168). The result was the Patient Protection and Affordable Care Act, also known as the ACA or Obamacare. A short summary of the ACA: having health insurance becomes mandatory; minimum standards for healthcare policies are established; companies with more than fifty employees are obligated to offer a health care plan; and small businesses and incomes within a certain range receive state-funded subsidies to pay for health care. As described above, the amount of technology that Obama and his campaign used to reach the presidency and to promote the ACA was tremendous. The young and active digital generation, the digital natives, together with knowledge of online communication, targeting and marketing, led to one of the most revolutionary campaigns in the history of presidential elections.
Next to his accomplishments in data collection and campaigning, the creation of We The People, and the publishing of government data to enhance transparency, prevent corruption and create jobs, the implementation of Obamacare would be the first and, for Obama, the most prestigious part of his presidency. With this track record in mind, one would not doubt the success of the healthcare bill's implementation. It turned out differently.

“Millions of Americans negotiating America’s health care system know all too well what the waiting room of a doctor’s office looks like. Now, thanks to HealthCare.gov, they know what a “virtual waiting room” looks like, too” (Johnson & Reed 2013). This was the first sentence of an article by Clay Johnson and Harper Reed criticizing the website needed to apply for state-funded healthcare. As stated in the National Action Plan, government should try to provide its services through any means. The most logical way to deliver the Affordable Care Act was to let people sign up via the internet. The website was offered as a marketplace where citizens could find the health insurance they need. According to Johnson and Reed, Healthcare.gov was doomed to failure, given the government's history of failing computer systems: “... 94 percent of large federal information technology projects over the past 10 years were unsuccessful — more than half were delayed, over budget, or didn’t meet user expectations, and 41.4 percent failed completely.” The reason is that the government must follow the Federal Acquisition Regulation, an 1,800-page body of rules used to grant federal contracts, which has resulted in mediocre companies winning bids with the help of lawyers and contract managers. What is seen here is the difference between government IT projects and commercial ones. Due to his campaign and the National Action Plan, citizens expected Obama to install a new form of government. His election rhetoric was one of togetherness; his election messages were change and hope. Society was ready for Government 2.0. They wanted the apps and the platform; they got the data and made it work. The government spoke the language of the digital natives and played it like a commercial company; a Facebook Nation.
To speak in the terms of the previously mentioned Morozov, the ideas of internet-centrism and technological solutionism were accepted by society, and then brought down hard by the reality of political procedures and bureaucracy. The result was the president of the United States, for the first time in history, apologizing for the botched implementation of a website:

Now, we've had this problem with the website. I'm not happy about that. We're working overtime to make sure that it gets fixed because right now we put in place a system where people can get affordable health care plans. (...) I promise you, nobody has been more frustrated. I wanted to go in and fix it myself but I don't write code (Epstein 2013).

The website was launched on the first of October 2013, but it lacked speed, and only a few percent of the people who tried could apply for health insurance. Jeff Zients was brought in to fix the website, and by December it worked sufficiently well. In a first results sheet published by the White House on April 17, over eight million people were enrolled in the program and healthcare costs were already falling (White House 2014b). More important than the implementation of Obamacare itself is that the image of Obama as a technology-minded president was damaged.

SOPA/PIPA

Obama's reputation as a technology-minded president was somewhat restored by his approach to the SOPA and PIPA bills. The debate around these bills is worth analyzing as it exemplifies how a grassroots spirit can challenge existing power lobbies. In the SOPA and PIPA debate, the Motion Picture Association of America was challenged by a collective of technologically minded sites, such as Reddit and Wikipedia – later joined by larger multinationals such as Google.

SOPA and PIPA were supposed to create a framework in which companies are forced to regulate content relating to intellectual property: patents, copyrights, trademarks and trade secrets. The concept of intellectual property changed with the introduction of the internet: instead of being physical objects, works could simply be copied and shared among users. By downloading or streaming a movie or music album, users are able to get around paying for products (online piracy). Both bills try to address this problem of online piracy. PIPA was proposed in the Senate in May 2011 by Patrick Leahy, whilst SOPA was proposed in the House in October 2011 by Lamar Smith, both after lobbying by the Motion Picture Association of America. The MPAA claimed to be losing income, which directly affected the tax income of the government (O'Leary 2012). The SOPA bill “targets foreign sites which engage in copyright infringement and/or intellectual property theft” (Connoly 2012: 68). Sites that are hosted, or partially hosted, outside United States borders can be flagged under this bill if the owner or administrator is involved in or facilitating online piracy, such as sharing patented documents or streaming movies. The Attorney General is the one who has to take action in case of violations. Many of these sites are fully hosted outside US borders, so the US does not have the jurisdiction to take them down. When the Attorney General decides to pursue action, a “court order would require internet service providers, search engines, ad providers and payment providers to take action and to block all their services to the sites” (Connoly 2012: 69). All companies, including multinationals like Google, Yahoo and PayPal, would have to stop providing services to these websites. Google, for instance, would have to stop indexing such a website in its results; a company like PayPal would have to stop its payment services to these websites and freeze their accounts. The main difference between the bills is that PIPA only targets sites with no significant use other than copyright infringement, while SOPA targets any foreign site that is committing or facilitating copyright infringement.
Thus a site like Google, which might index a piracy site in response to a particular search query, might be fined under SOPA but not under PIPA, as Google is not primarily concerned with facilitating the sharing of copyrighted material. The main problem for companies such as Reddit, Google/YouTube and Microsoft is that the SOPA and PIPA bills do not fully define the meaning of 'facilitating' copyright infringement.

Facilitating copyright infringement is as easy as putting a link to a download website beneath a blog post in a forum. Every website with a comment function runs the risk of facilitating online piracy, and websites such as Reddit or YouTube receive millions of comments per day. This is why sites such as Reddit, Wikipedia and Google actively started to lobby against the SOPA and PIPA bills. Multinationals and new start-ups in digital technology would have had to invest a substantial part of their money in preventing such infringement, because otherwise their sites would go down and income would be lost. An unexpectedly strong opposition to the acts was mobilized after the introduction of the bills. Participants in the lobby against the bills were NGOs, librarians (the American Library Association), academics, (media) websites (Reddit, Wikipedia) and internet activists (Isakova 2013 and Plumer 2011), later joined by online multinationals like Google and Microsoft. They united in NetCoalition, comprising 222 interest groups against the bills. Two camps were created: one, representing the MPAA and other interest groups, wanted a strongly regulated internet that prevented online piracy and allowed companies to sue websites that supposedly facilitated it; on the other side were Google, Reddit and other large multinationals and websites representing an open and innovative internet. Though the bills were proposed in the Senate and the House of Representatives, Obama was asked for his involvement in the process through his own We The People platform. As noted in the subchapter on Government 2.0, the platform was created to include citizens in government; a representation of a grassroots government within a Government 2.0 ideology. On the SOPA and PIPA bills, the Obama administration responded to two petitions started on the platform.
The response was written by Victoria Espinel, Intellectual Property Enforcement Coordinator at the Office of Management and Budget; Aneesh Chopra, U.S. Chief Technology Officer; and Howard Schmidt, Special Assistant to the President and Cybersecurity Coordinator for the National Security Staff (2011). According to their statement, the Obama government would not support any legislation that would disrupt the open standards of the internet:

While we believe that online piracy by foreign websites is a serious problem that requires a serious legislative response, we will not support legislation that reduces freedom of expression, increases cybersecurity risk, or undermines the dynamic, innovative global Internet.

Both petitions asked the President to veto the two bills, but instead the White House acknowledged the issue of online piracy and advised that Congress needed to fix it itself:

Moving forward, we will continue to work with Congress on a bipartisan basis on legislation that provides new tools needed in the global fight against piracy and counterfeiting, while vigorously defending an open Internet based on the values of free expression, privacy, security and innovation.

The SOPA and PIPA debate of 2011 is an example of the protection of the internet culture as we know it now. The NetCoalition started a successful lobby. For example, it organized a black-out day on which 115,000 websites made their front pages black as a representation of censorship. Because millions of people visit a website like Wikipedia alone every day, attention was instantaneous, resulting in the acts being shelved. As various media outlets have noted, the anti-SOPA/PIPA lobby was not very organized (Gross 2012), and the big multinationals entered the game rather late:

Rather, the protest was decentralized and organic. The tsunami of opposition transcended political divides, with extensive participation from individuals and groups on both the left and the right. It was driven by a commonality of interest in the continued vitality of social networking and "user-generated content" sites – an interest broadly and actively shared by both rank-and-file Internet users and the technology innovation community (entrepreneurs, venture capitalists, technology companies, bloggers, established Internet advocacy groups like CDT and Public Knowledge and savvy new online grassroots organizations) (Harris 2012).

Though not directly linked to the White House, the SOPA and PIPA debate led to a realization of the internet culture; of the omnipresent availability of constant connection within a network; and of the privacy, censorship and surveillance issues relating to this network. Most importantly, the ongoing change in internet culture and the technological development related to it make policy-making problematic: technological advancements follow each other so fast that a policy is already outdated by the time it is proposed (Harris 2012). The SOPA/PIPA debate is an example of how digital technologies can challenge existing lobby structures. Internet freedom is currently a hot topic in Congress. Net neutrality and fast lanes have replaced the discussion around copyright infringement (Wyatt 2014), but the challenge is the same: existing powers and their lobby groups want to protect their position by encapsulating the 'new power' mostly related to, and participating in, digital technology and its economy. In the case of SOPA and PIPA, big multinationals and their lobby money were defeated by smaller, technologically more progressive-minded people of the digital age with the population on their side, collectively organized as a grassroots movement much like Obama's own. By supporting their camp, Obama presented himself once again as a technologically minded president, progressive and innovative.

Edward Snowden

In June 2013, Edward Snowden revealed himself in a hotel room in Hong Kong. As a system administrator for Booz Allen Hamilton, a major government contractor, Snowden leaked classified documents to Guardian journalist Glenn Greenwald and documentary maker Laura Poitras. Stationed in Hawaii, Snowden fled to Hong Kong and finally to Russia, where he found temporary asylum. The United States government has charged him under the Espionage Act and asked Russia to extradite him. Asked about his motivation to become a whistleblower and publish the classified data, Snowden answered in an interview with New York Times journalist James Risen:

So long as there’s broad support amongst a people, it can be argued there’s a level of legitimacy even to the most invasive and morally wrong program, as it was an informed and willing decision. However, programs that are implemented in secret, out of public oversight, lack that legitimacy, and that’s a problem. It also represents a dangerous normalization of ‘governing in the dark,’ where decisions with enormous public impact occur without any public input (Risen 2013).

The classified documents that Snowden shared with Greenwald and Poitras contained information about NSA programs and projects that focus on the collecting, aggregating and analyzing of personal (meta)data. The first article that was published contained information about Verizon sharing call data with the NSA (Berghel 2013), but soon more and bigger programs were exposed. A day later PRISM was revealed, “an NSA program that targeted the Internet communications and stored data of non-US persons outside the US and those communicating with them, and the extent to which US companies cooperate with the government” (Landau 2013). It became clear that companies such as Google, Facebook and Yahoo handed information to the NSA rather quickly and even changed their systems for it:

In at least two cases, at Google and Facebook, one of the plans discussed was to build separate, secure portals, like a digital version of the secure physical rooms that have long existed for classified information, in some instances on company servers. Through these online rooms, the government would request data, companies would deposit it and the government would retrieve it, people briefed on the discussions said (Miller 2013).

More leaks exposed the sharing of information between the Five Eyes countries: Australia, Canada, Great Britain, New Zealand and the United States. XKeyscore revealed that the NSA could track “nearly everything a typical user does on the internet, including the content of emails, websites visited and searches, as well as their metadata. Analysts can also use XKeyscore and other NSA systems to obtain ongoing ‘real-time’ interception of an individual's internet activity” (Greenwald 2013). Other actions the NSA approved were phone-tapping G20 leaders, including German Chancellor Angela Merkel (who was seen as a ‘spy friend’), and a plan to expand NSA surveillance activities between 2012 and 2016: entering a ‘golden age’ of signals intelligence; acquiring the data the agency needs from ‘anyone, anytime, anywhere’; decrypting, bypassing or infiltrating the codes that keep communications secret by influencing ‘the global commercial encryption market through commercial relationships, human spies and intelligence partners in other countries’; and addressing the need to ‘revolutionize analysis of its vast collections of data to radically increase operational impact' (Risen & Poitras 2013). What we have here, then, is a massive surveillance program by the United States government and a person who leaked the programs to the media. Other media picked up the news and people were informed. For organizations such as the Electronic Frontier Foundation and the ACLU the answer was clear: they opposed the infringement of citizens’ privacy by the US government. Government officials, including President Obama, responded with more nuance but mostly opposed the leaks.
According to Berghel (2013), there were two sorts of responses in the first two weeks of the revelations. Firstly, there was the general appeal by government officials and directors of the NSA, CIA and other intelligence services to ‘trust me’: “While the “trust me” defense has been a staple of totalitarian governments worldwide, it hasn’t been effective with the educated electorate in the US, at least since Watergate—it has become a “throw away” concept.” Secondly, there was the defense of a false dilemma: “the choice is to either endorse government surveillance as it is or run the risk of increased terrorist attacks, death, and disorder.” It is the latter that President Obama used. In a press briefing on June 7, Obama responded to the question ‘Do you welcome the leaks, sir? Do you welcome the leaks? Do you welcome the debate?’ as follows:

I don't welcome leaks, because there's a reason why these programs are classified. I think that there is a suggestion that somehow any classified program is a "secret" program, which means it's somehow suspicious. The fact of the matter is in our modern history, there are a whole range of programs that have been classified because – when it comes to, for example, fighting terror, our goal is to stop folks from doing us harm. And if every step that we’re taking to try to prevent a terrorist act is on the front page of the newspapers or on television, then presumably the people who are trying to do us harm are going to be able to get around our preventive measures. That's why these things are classified. But that's also why we set up congressional oversight. These are the folks you all vote for as your representatives in Congress, and they're being fully briefed on these programs. And if, in fact, there was – there were abuses taking place, presumably those members of Congress could raise those issues very aggressively. They're empowered to do so (White House 2014a).

Next to congressional oversight, Obama refers to judicial oversight:

We also have federal judges that we put in place who are not subject to political pressure. They’ve got lifetime tenure as federal judges, and they're empowered to look over our shoulder at the executive branch to make sure that these programs aren’t being abused (White House 2013).

The judicial oversight Obama refers to is the FISA act and its related court. The Foreign Intelligence Surveillance Act, enacted in 1978, requires probable cause that the target is a foreign power or an agent related to a foreign power, and the purpose of the tap should be to gather foreign intelligence (Landau 2013). The FISC, the Foreign Intelligence Surveillance Court, oversees the process and permits the tapping of foreign powers. Berghel argues that there are certain myths about the FISC. First, there is the claim that FISA operates within the law; but when it became public that the Bush administration had illegally tapped telephones in 2005, the law was adjusted, promptly making the practice legal: “So when a government official reports that agency activities comply with the law, this is true a priori” (Berghel 2013). Second, there is the approval rate: according to a Mother Jones article, only 0.03% of all FISC requests are denied, which amounts to 11 of 33,900 requests since 1978 (Eichelberger 2013). If only 11 requests were denied in 36 years, why not abolish the FISC altogether, Berghel (2013) argues. Finally, it is worth mentioning that 9 of the 11 federal judges currently on the FISC were appointed by Republican presidents (Reagan 3, George H.W. Bush 1, George W. Bush 5, Clinton 1, Obama 1) and all judges are appointed by the Chief Justice of the Supreme Court (currently a Republican appointed by George W. Bush): “If the intention of the legislation that created the FISA court was to give the appearance of nonpartisanship, it didn’t happen” (Berghel 2013). The public outcry was small compared to the SOPA/PIPA outcry, because of a combination of factors. Firstly, in the SOPA/PIPA debate people were protesting against something, whilst after the NSA revelations they had to push for something. The collective day of demonstration was named The Day We Fight Back and was held on February 11th. Around six thousand websites participated in the action, a fraction of the more than one hundred thousand sites participating in the SOPA/PIPA protests (Robertson 2014, Perlroth 2014). Secondly, the revelations confirmed what people already knew. Mass surveillance was already a part of society in the seventies, when Echelon, the NSA’s “global system for intercepting private and commercial communications,” was deployed.
In 1997 Carnivore and NarusInsight infiltrated operating systems and networks: the predecessors of programs like PRISM or XKeyscore. Even 9/11 did not change much, because in 2000 the NSA had already published a memo stating: “The volumes and routing of data make finding and processing nuggets of intelligence information more difficult. To perform both its offensive and defensive missions, NSA must ‘live on the network’” (Berghel 2013). At the same time, public surveillance technologies were exposed in documentaries such as Terms and Conditions May Apply. Though Obama announced in January 2014 that he had reviewed and would change the process around the NSA and its programs, he did not believe that the programs needed to be shut down. In the bigger picture, the combination of a public unwilling to push for change and a tacit awareness of mass surveillance practices by governmental agencies led to a smaller backlash against Obama as the technology president than, for example, the failed introduction of the Obamacare website.

Conclusion

This research started with a short history of cybernetics and the internet: from the introduction of the cybernetics metaphor in the forties, through the ideologically minded Silicon Valley of the Californian Ideology, towards the commercial implementation of data use by companies such as Google and Facebook. The massive amounts of data that are generated and analyzed for advertising proved very lucrative for these companies. In the second chapter, one read how the same model of Facebook, Google and others was used in Obama’s campaigns of 2008 and 2012 to target those who most needed targeting in order to win their vote. On the one hand, the chapter showed how new technologies became a platform through which messages could be targeted. This also included the grassroots ideology wherein volunteers became empowered to take on certain small parts of the campaign. On the other hand, the chapter described the data collection practices that were needed to know whom to target. By collecting massive amounts of data and placing key figures in the campaign, Obama and his campaign team were able to ‘map’ the current voting landscape so well that they could optimize their campaign spending across different media. The third chapter concerns Obama as president and his use of, or conflict with, digital technology during his presidency. First, Obama's Government 2.0 ideology was addressed. By implementing certain laws and opening up data, an ideology is represented wherein citizens get more power in the government, for example through the We The People platform. This can be directly traced back to the grassroots campaigns that Obama ran. The first damage to this image of Obama occurred with the introduction of the Obamacare website. The law was a prestigious project for the president, and the failed launch, combined with his comments in the media, damaged his image.
Next to that, privacy issues vis-à-vis anti-terrorism were discussed in the sub-chapters on SOPA/PIPA and the Snowden revelations. These chapters have given the reader examples of technology-related subjects into which the President was unwillingly dragged. As one may have noticed, Obama was freer in his use of data for specific purposes during his campaign. As seen in the chapters on Obamacare, SOPA/PIPA and the Snowden revelations, Obama as president still made tremendous use of data and technology, but it became a subject of policies, protocols and bureaucracy. Centralizing Obama as an object of study, we can see a clear ideological approach towards politics and the role a president has to play in a democracy: first, his use of a grassroots campaign in combination with a control function established by data aggregation; secondly, his approach towards citizens of the United States by opening up government data and trying to involve these citizens in the governing process; lastly, his picking the side of the 'new' digital economy, involving sites such as Reddit and companies such as Google, in the SOPA/PIPA debate. One could argue that Obama prefers a more progressive approach as a president towards these issues. Ignoring established power lobbies, Obama instead focuses his attention on emerging businesses, businesses such as Google that build upon customers' use of their products. In this sense, his idea of a grassroots campaign is still reflected, as the citizen retains his agency. 'Old' lobby groups, such as the MPAA, represent the idea that a select few decide what is shown. A movie is the same for every viewer, whilst the products of the 'new' lobby are personalized and give the user a form of agency – at least more than a static movie does. In this way, the government is actively involved in how technology is presented to the customer.
At the same time it thus represents an ideology in which the government is an entity that actively participates in (or maybe even needs to 'protect') a citizen's life, which is also reflected in the introduction of laws such as Obamacare. By centralizing Obama in relation to technology, this research has tried to introduce the reader to a new era of political engagement. In the first place, one now has an overview of the history leading up to the campaign, the campaign practices, and what Obama did during his terms. These are all very practical examples, and they form, in some sense, Obama’s technological legacy – though he is still in office and things can still change. Many of these subjects would suffice for a master's thesis in themselves; only their most important aspects are reflected in this research. I have chosen this approach because, secondly, I wanted to add another perspective to this research. In the introduction I recalled Morozov’s internet-centrists and cyber-utopians. Morozov introduces these concepts to address the fact that society builds more and more upon technologically induced changes. I would argue that the introduction of data into the political field has problematized politics specifically and social life in general. There is an increasing dependency on data and statistics as ‘proof’ of societal changes. In this case, Obama was just an example, but he proved an exceptionally well-chosen example for this research, as Obama became one of the most powerful people in an era in which so many changes occurred relating to technology, whether data gathering, analysis or privacy issues. In this research, I picked a few of the largest technological changes that occurred during Obama’s presidency and that can be directly linked to the White House in general and to Obama specifically. But many more, smaller technological changes are occurring. I have mentioned self-driving cars and wristwatches for health purposes, but these are all applications built upon gathered data. In a broader context, this research must be seen within the debate on information gathering, as nowadays every technology-based application provides, or is built on, information. Information aggregated out of data formed the basis of Obama's micro-targeting strategies.
The opening up of data by his government formed a kind of ideological approach to open government, as if saying: ‘data equals transparency’. The dependency of people on the internet (reflected in the Obamacare website) and the controversies relating to innovation and privacy (SOPA/PIPA and Snowden) are all part of a societal change in which (access to) information becomes one of the biggest goods to build on. Obama exemplifies this ongoing societal technological change in a political context: where technology can change within a couple of years – look, for example, at how MySpace disappeared and Facebook took over – Obama as a politician is bound to a democratic context in which he is subject to protocols, power relations and bureaucracy. This questions the position of a democracy within this changing society. As said above, Obama and his campaign team used data-gathering practices extensively because they proved a powerful way to win elections. But what happens to the potential voter when you tailor your message to one particular voter based on his information, or, maybe more importantly, what happens to the voter when he is reduced not to a voter but to a datapoint within a network of possible voters? Is the message still important, or are particular datapoints, or proxies, more important? What if the message does not fit? Are there, as for example Deleuze (1992) suggests, two kinds of people: the individual and the dividual? These are all very media-theoretical questions, which will not be answered in this research, but they exemplify how information (and data) has problematized the relation between the voter and democracy, whilst at the same time exemplifying how society – including its democracy – has changed into a computational society in which everything needs to be measured, improved and stored.

Literature

Anderson, Chris. 2008. “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete.” WIRED. http://archive.wired.com/science/discoveries/magazine/16-07/pb_theory.
Baker, Stephen. 2009. The Numerati. Reprint edition. Boston: Mariner Books, Houghton Mifflin Harcourt.
Balz, Dan. 2013. Collision 2012. First edition. New York: Viking Adult.
Barbrook, Richard, and Andy Cameron. 1996. “The Californian Ideology.” Science as Culture 6 (1): 44–72. http://www.imaginaryfutures.net/2007/04/17/the-californian-ideology-2/.
Beckett, Lois. 2012. “Three Things We Don’t Know About Obama’s Massive Voter Database.” ProPublica. http://www.propublica.org/article/three-things-we-dont-know-about-obamas-massive-voter-database.
Berghel, Hal. 2013. “Through the PRISM Darkly.” http://www.berghel.net/col-edit/out-of-band/july-13/oob_7-13.php.
Castells, Manuel. 1998. The Rise of the Network Society. West Sussex: Wiley-Blackwell. http://samples.sainsburysebooks.co.uk/9781444310146_sample_415190.pdf.
Clayton, Dewey M. 2010. The Presidential Candidacy of Barack Obama: A Critical Analysis of a Racially Transcendent Strategy. First edition. London: Routledge.
CNN. 2005. “Web Inventor: Online Life Will Produce More Creative Children.” CNN. http://edition.cnn.com/2005/TECH/internet/08/30/tim.berners.lee/.
———. 2008. “Transcript: ‘This Is Your Victory,’ Says Obama.” http://edition.cnn.com/2008/POLITICS/11/04/obama.transcript/.
Connoly, Mary V. 2012. “Sopa, Pipa, the OPEN Act: Where Is This Going.” In Association of Small Computer Users in Education. https://ascue.org/wp-content/uploads/2014/11/2012-final.pdf.
Deleuze, Gilles. 1992. “Postscript on the Societies of Control.” October 59: 73–77.
Eichelberger, Erica. 2013. “FISA Court Has Rejected .03 Percent Of All Government Surveillance Requests.” Mother Jones. http://www.motherjones.com/mojo/2013/06/fisa-court-nsa-spying-opinion-reject-request.
Epstein, Jennifer. 2013. “Obama on HealthCare.gov Problems: ‘I Don’t Write Code’.” Politico. http://www.politico.com/politico44/2013/11/obama-laments-healthcaregov-but-not-cancelled-plans-177104.html.
Espinel, Victoria, Aneesh Chopra, and Howard Schmidt. 2011. “Combating Online Piracy While Protecting an Open and Innovative Internet.” Petitions.Whitehouse.Gov. https://petitions.whitehouse.gov/response/combating-online-piracy-while-protecting-open-and-innovative-internet.
Farnam, T.W. 2012. “Obama Allies Turn to Targeted TV Ads to Shore up Niche Voters.” Washington Post. http://www.washingtonpost.com/politics/obama-allies-turn-to-targeted-tv-ads-to-shore-up-niche-voters/2012/07/31/gJQA2i4NNX_story.html.
Ferenstein, Gregory. 2012. “Obama’s Chief Tech Officer: Let's Unleash Ingenuity of the Public.” CNN. http://edition.cnn.com/2012/06/14/tech/web/white-house-tech-officer/.
Gleick, James. 2011. The Information: A History, a Theory, a Flood. New York: Pantheon Books.
Greenwald, Glenn. 2013. “XKeyscore: NSA Tool Collects ‘Nearly Everything a User Does on the Internet.’” The Guardian.
Gross, Grant. 2012. “Who Really Was behind the SOPA Protests?” Macworld. http://www.macworld.com/article/1165221/who_really_was_behind_the_sopa_protests.html.
Hall, Wynton. 2013. “Obama Huddles with Tech Giants on Obamacare, NSA Scandal.” Breitbart. http://www.breitbart.com/big-government/2013/12/17/obama-huddles-with-tech-giants-on-obamacare-nsa-scandal/.
Harris, Leslie. 2012. “PIPA/SOPA and the Online Tsunami: A First Draft of the Future.” ABC News. http://abcnews.go.com/Technology/pipa-sopa-online-tsunami-draft-future/story?id=15500925#.TzA2ssVA9S0.
Hendricks, John Allen, and Robert E. Denton. 2010. Communicator-in-Chief: How Barack Obama Used New Media Technology to Win the White House. Washington D.C.: Lexington Books.
Isakova, Yulia. 2013. “Factors of Lobbying Success in the USA: Case of SOPA and PIPA (2011–2012).” Central European University Department of Public Policy.
Issenberg, Sasha. 2012. “How President Obama’s Campaign Used Big Data to Rally Individual Voters.” MIT Technology Review, December 19. http://www.technologyreview.com/featuredstory/509026/how-obamas-team-used-big-data-to-rally-voters/.
Johnson, Clay, and Harper Reed. 2013. “Getting to the Bottom of HealthCare.gov’s Flop.” The New York Times. http://www.nytimes.com/2013/10/25/opinion/getting-to-the-bottom-of-healthcaregovs-flop.html?_r=1&.
Johnson, Dennis W. 2009. Campaigning for President 2008: Strategy and Tactics, New Voices and New Techniques. First edition. London: Routledge.
Kenski, Kate, Bruce W. Hardy, and Kathleen Hall Jamieson. 2010. The Obama Victory: How Media, Money, and Message Shaped the 2008 Election. New York: Oxford University Press.
Kloppenberg, James T. 2011. Reading Obama: Dreams, Hope, and the American Political Tradition. New Jersey: Princeton University Press.
Landau, Susan. 2013. “Making Sense from Snowden: What’s Significant in the NSA Surveillance Revelations.” http://privacyink.org/html/MakingSense.pdf.
Lee, Newton. 2012. Facebook Nation: Total Information Awareness. 2013 edition. Springer.
Maarek, Philippe J. 2011. Campaign Communication & Political Marketing. West Sussex: John Wiley & Sons Ltd.
Manovich, Lev. 2011. Trending: The Promises and the Challenges of Big Social Data. http://manovich.net/content/04-projects/065-trending-the-promises-and-the-challenges-of-big-social-data/64-article-2011.pdf.
Mayer-Schönberger, Viktor, and Kenneth Cukier. 2013. Big Data: A Revolution That Will Transform How We Live, Work, and Think. Boston: Mariner Books, Houghton Mifflin Harcourt.
Miller, Claire Cain. 2013. “Tech Companies Concede to Surveillance Program.” New York Times. http://www.nytimes.com/2013/06/08/technology/tech-companies-bristling-concede-to-government-surveillance-efforts.html.
Morison, John. 2010. “Gov 2.0: Towards a User Generated State?” The Modern Law Review 73 (4): 551–77.
Morozov, Evgeny. 2011. The Net Delusion. New York: Public Affairs.
———. 2013. To Save Everything, Click Here: The Folly of Technological Solutionism. New York: Public Affairs.
Murphy, Tim. 2012. “Inside the Obama Campaign’s Hard Drive.” Mother Jones. http://www.motherjones.com/politics/2012/10/harper-reed-obama-campaign-microtargeting.
O’Leary, Michael. 2012. MPAA Response to White House Position on Anti-Piracy Legislation. http://blog.mpaa.org/BlogOS/post/2012/01/14/MPAA-Response-to-White-House-Position-on-Anti-Piracy-Legislation-.aspx.
O’Reilly, Tim. 2004. “What Is Web 2.0?,” September 30. http://www.oreilly.com/pub/a/web2/archive/what-is-web-20.html.
Obama, Barack. 2006. The Audacity of Hope. New York: Crown.
Orszag, Peter R. 2009. Memorandum for the Heads of Executive Departments and Agencies. http://www.whitehouse.gov/sites/default/files/omb/assets/memoranda_2010/m10-06.pdf.
Perlroth, Nicole. 2014. “The Day the Internet Didn’t Fight Back.” New York Times. http://bits.blogs.nytimes.com/2014/02/11/the-day-the-internet-didnt-fight-back/?_php=true&_type=blogs&_r=0.
Pew Research. 2008. Young Voters in the 2008 Election. http://www.pewresearch.org/2008/11/13/young-voters-in-the-2008-election/.
Plouffe, David. 2010. The Audacity to Win: How Obama Won and How We Can Beat the Party of Limbaugh, Beck, and Palin. Updated edition. Penguin Books.
Plumer, Brad. 2011. “Everything You Need to Know about Congress’s Online Piracy Bills, in One Post.” Washington Post. Accessed February 19, 2015. http://www.washingtonpost.com/blogs/wonkblog/post/everything-you-need-to-know-about-congresss-online-piracy-bills-in-one-post/2011/12/16/gIQAz4ggyO_blog.html.
Risen, James. 2013. “Snowden Says He Took No Secret Files to Russia.” New York Times. http://www.nytimes.com/2013/10/18/world/snowden-says-he-took-no-secret-files-to-russia.html?pagewanted=all&_r=0.
Risen, James, and Laura Poitras. 2013. “N.S.A. Report Outlined Goals for More Power.” The New York Times. http://www.nytimes.com/2013/11/23/us/politics/nsa-report-outlined-goals-for-more-power.html.
Robertson, Adi. 2014. “The Day We Fight Back: Can an Internet Protest Stop the NSA?” The Verge. http://www.theverge.com/2014/2/10/5398638/the-day-we-fight-back-asks-the-internet-to-fight-surveillance/in/4483763.
Sabato, Larry J. 2010. The Year of Obama: How Barack Obama Won the White House. Longman.
———. 2013. Barack Obama and the New America: The 2012 Election and the Changing Face of Politics. Lanham: Rowman & Littlefield Publishers.
Scherer, Michael. 2012. “Obama Wins: How Chicago’s Data-Driven Campaign Triumphed.” Time. http://swampland.time.com/2012/11/07/inside-the-secret-world-of-quants-and-data-crunchers-who-helped-obama-win/.
Shirky, Clay. 2009. Here Comes Everybody: The Power of Organizing Without Organizations. 2nd edition. New York: Penguin Books.
Siegel, Lee. 2011. “Twitter Can’t Save You.” The New York Times. http://www.nytimes.com/2011/02/06/books/review/Siegel-t.html?pagewanted=all&_r=0.
Sullivan, Andrew. 2009. “The Revolution Will Be Twittered.” The Atlantic. http://www.theatlantic.com/daily-dish/archive/2009/06/the-revolution-will-be-twittered/200478/.
The Guardian. 2014. “Senate Committee Probes ‘Cuban Twitter’ USAid ZunZuneo Programme.” http://www.theguardian.com/world/2014/apr/10/senate-committee-cuban-twitter-usaid-zunzuneo.
The White House. 2011. “The Open Government Partnership: National Action Plan for the United States of America.” http://www.whitehouse.gov/sites/default/files/us_national_action_plan_final_2.pdf.
———. 2014a. “Remarks by the President on Review of Signals Intelligence.” http://www.whitehouse.gov/the-press-office/2014/01/17/remarks-president-review-signals-intelligence.
———. 2014b. “FACT SHEET: Affordable Care Act by the Numbers.” http://www.whitehouse.gov/the-press-office/2014/04/17/fact-sheet-affordable-care-act-numbers.
Turner, Fred. 2006. “The Shifting Politics of the Computational Metaphor.” In From Counterculture to Cyberculture, 11–28.
Vedel, Thierry. 2006. “The Idea of Electronic Democracy: Origins, Visions and Questions.” Parliamentary Affairs 59 (2): 226–35.
W3schools. 2015. Browser Statistics. http://www.w3schools.com/browsers/browsers_stats.asp.
Wyatt, Edward. 2014. “F.C.C., in a Shift, Backs Fast Lanes for Web Traffic.” The New York Times. http://www.nytimes.com/2014/04/24/technology/fcc-new-net-neutrality-rules.html.
Zikopoulos, Paul, Chris Eaton, Dirk DeRoos, Tom Deutsch, and George Lapis. 2012. Understanding Big Data. New York: McGraw Hill.
