Algorithms are the new highway: On algorithmic contribution to the parochialization of public space

Master’s Thesis Media Studies: New Media and Digital Cultures Amsterdam, The Netherlands July 10, 2015

Evelien Christiaanse

Supervisor: Marc Tuters
Second Reader: Jan Simons

For Bob

Table of Contents

Introduction
Chapter 1: Establishing Space

Part One: Parochialization of public spaces
  Jane Jacobs
  Critique
Part two: Fluidity
  Realms
  Space and Place
  Critical Cartography
  Non-Western Cartography
Chapter 2: Software Studies
Part One: Building blocks of algorithms
  Captured Data
  Ubiquity and Opacity
Part Two: Algorithmic Governance
  Paradigm shift
  Discipline Society
  Control Society
  Automated management
  Critiquing Automation

Chapter 3: Democratization of Public Space
Part one: The Tension between Software and Space
Part two: Opportunities
  Cathedral and Bazaar
  Democratization
  Rethinking algorithms
  Physical design
Conclusion
Bibliography

I give the University of Amsterdam library permission to archive the digital thesis in a repository and to publicise it and make it available for consultation upon request.

Introduction

Highways are people movers. They are convenient: they bring drivers to their destinations quickly, without wasting time on small city streets. Yet Jane Jacobs, an urban activist living in New York City, opposed building highways through neighborhoods. To Jacobs, cars disturb the city, and highways reflect the priority of planning over people. In the 1950s and 1960s Robert Moses worked as an urban planner in New York City (some called him the "Master Builder"). He and Jacobs had a famous clash over plans to build a Lower Manhattan Expressway, which would have displaced 2,000 families and 800 businesses, and over a road that would cut through a park (Chatelain). The infrastructure needed to support cars destroys the pedestrian's experience of the city. "Downtowns and other neighborhoods that are marvels of close-grained intricacy and compact mutual support are casually disemboweled...city character is blurred until every place becomes more like every other place" (Jacobs 440). This close-grained intricacy refers to the diverse and lively city street which she idealized. Her concern is that the roads break the city up into a loose collection of places and eliminate the uniqueness of those places.

Imagine that, like highways breaking up neighborhoods, algorithms increasingly drive like-minded people towards each other. This effectively eliminates public spaces, which are open to everyone, if certain people no longer have access to them. Based on the personal profile of data built up about them online, individuals are increasingly presented with personalized suggestions or results. This means they are more likely to come into contact with like-minded people with similar interests and to form what is called a parochial realm. Algorithms are convenient as they bring the searcher their information quickly, without wasting time digging through useless websites.
Algorithms are the new highways. Algorithms are increasingly used to take over menial tasks like planning a driving route, so much so that there is a move towards ubiquitous computing, which is "when we were no longer aware of [computing] presence. Computers would be so thoroughly enmeshed in our lives that they'd be invisible to us" (Carr). At the same time, the United Nations estimates that 60 percent of the world's population will live in cities by 2030. De Waal explores the field of digital media and society and argues that the combination of ubiquitous computing and the growth of cities changes the city as a society. He thereby problematizes the change of public spaces into parochial spaces (de Waal 8). I would like to explore this further using the questions: How

might algorithms contribute to the parochialization of public space? What are some opportunities to mitigate this change?

The change from public into parochial spaces is neither good nor bad; it simply happens. Yet this development does affect the way that urbanites engage with the city. In the first chapter, I will discuss the role of public spaces and how they 'parochialize'. Then, I will discuss elements of Jacobs' ideal city. After this, I will explore theoretical concepts of space over time. The purpose of this is to demonstrate how space has been constructed over time and across cultures. The second chapter departs from urban geography to move on to software studies. Here I discuss how algorithms work and how they gain the agency to influence urbanites. Examples of how algorithms achieve this are included throughout the second chapter. The digital media that I take into consideration in this thesis are the software and algorithms which produce recommendations or offer search results, because it is these softwares that I think have the potential to aid parochialization. Finally, the third chapter discusses the tension between urban geography and software. It concludes with some opportunities for mitigating the parochialization of public spaces. Each chapter includes an introduction to the research and theorists that I draw on to answer my questions. This thesis finds itself at the juncture of urban geography and software studies. To clarify, it is not about how new media create new publics and new public spaces via platforms such as Facebook. It is about the influence of algorithms on urban dwellers' exposure to each other through public spaces. This thesis will draw a few examples from the city of Amsterdam and The Netherlands; as I have lived and studied here, I can draw on my own observations. I conclude with some considerations for future research, as mine is not the last word.
Again, the increase in the use of digital media and the increased number of people living in cities mean that urban geographers should take the role of digital media into consideration when designing new research. Thus, this thesis contributes a basic look at how, theoretically, algorithms may play a role in the parochialization of public space, so that geographers can build upon it.

Chapter 1

Establishing Space

First, I will consider various ideas from Martijn de Waal's book, The City as Interface: How New

Media are Changing the City. He uses the book to explore how software fundamentally transforms almost everything we do, including how we engage with the city (de Waal 7). The book focuses on how public spaces have changed over time. In order to understand the changes at hand, I first look at Jane Jacobs' book, The Death and Life of Great American Cities, for her opinions on what makes a city street lively. Diversity, monotony, and infrastructure were important themes in Jacobs' book; by highlighting them I show tangible examples of how urban planning impacts city life. This forms the basis for the research question: here I explain how spaces can parochialize, which needs to be established before I can move on to explaining how algorithms could be responsible for it.

Then, I will veer away from physical space and explore theoretical accounts of how space is constructed, starting with the three realms put forth by Lyn Lofland, an American sociologist. Briefly stated, these are the roles people can play in various spaces. This matters because this thesis will later be concerned with how strangers meet, and physical spaces play an important role in that. The goal here is to show that space can be constructed by many factors, and so algorithms can construct space as well.

Finally, I discuss critical cartography and non-Western cartography using Jeremy Crampton, a researcher on critical cartography, and John Law, a sociologist at the Open University. This section aims to expose the reader to ways of thinking about space that are alternatives to the Euro-American tradition of measurable spaces. This concept of alternative constructions of space will come back in the third chapter, when I discuss ways in which algorithms may be dismissive of these types of spaces.

Part One: Parochialization of public spaces

In the age of new media, not only is the way in which we communicate changing; our relationship with the city is said to be changing as well. As Martijn de Waal puts forth in his book, "...what we tend to forget is that [digital media] also changes the city as a society" (de Waal 8). Information is increasingly brought to city dwellers via 'suggestions', or based on some sort of algorithm which shapes the city to one's needs (de Waal 143), rather than the inhabitant adjusting to the city, as de Waal claims was the case in the past. As will become clear later, developments in technology allow the city to become less uniform; urban space increasingly has the potential to become a different place for each individual, turning modern man into "a nomad who could choose which communities and publics to join," be that as a family, a squatter, a rock 'n' roller, etc. (de Waal 41). Yet that very dynamic also brings us to the core of the problem, which de Waal terms telecocooning: "while we move around the city, we use the mobile phone and living map to wrap ourselves in a cocoon that

hinders all spontaneous interaction with unknown others; we are constantly connected with people we know and guided to places where we are again surrounded by familiars!" (de Waal 78). Thus, the paradox presented by new media is that, at the same time as the "total diversity in the urban landscape might increase... [the] urban individual's world of experience might actually become more monotonous" (de Waal 85). Although cities generally offer a diverse array of resources, that very diversity may go unexperienced by an individual, as the telecocoon shapes the urban dweller's experience. The threat to diversity, and thus to the public domain as understood by Jacobs, is considered to be tied to an increased parochialization of urban public space.

Diversity, in people as well as functions, has historically been considered important for a thriving city. It forces people to be confronted by strangers, as the rich were confronted by the poor when the newest boulevards of Paris were built (de Waal 13). Simply seeing each other sparks interest and consideration; de Waal calls this a "creation of collective awareness" (de Waal 102). The debate on the importance of the public domain has been raging since at least the 1950s. De Waal summarizes the political theorist Hannah Arendt: "the public domain plays an essential role in the republican idea of a democratic urban society. It is the places where city dwellers with different backgrounds meet, are confronted with each other and must come to terms with each other" (de Waal 92). The parochialization of public space is considered a threat to the experience of the diversity of a city. Its most extreme consequence is segregation, the spatial and social separation of people (Attkinson and McGarrigle 1). Segregation does not necessarily have to be written into laws, as in the pre-Civil Rights United States, but can also be a result of housing markets excluding lower-income families, thus segregating the city along socioeconomic lines (Attkinson and McGarrigle 2). One element that Jacobs finds important for diversity is mixed use of spaces, to ensure that there is always activity on the street and that it draws more than just those who live in the area (Jacobs 290). Some cities, such as Amsterdam, achieve this with small shops in residential areas or housing above a row of shops. In this chapter, the relationship between the physical city and people will be explored by revisiting Jane Jacobs after the role of public spaces has been discussed.

Public spaces have a long and varied history; some trace them back to ancient Greek society, where they served as democratic spaces where citizens could vote (Carmona et al. 23). In his book Convivial Urban Spaces, the English urban designer Henry Shaftoe starts by exploring why public spaces are important for city life. Learning tolerance is among several reasons. As de Waal says: "the very nature of the city means that we are always surrounded by people who are different from ourselves...Yet in

one way or another we must find a way to live with each other" (14). These reasons resonate throughout the rest of this thesis. How can a physical space play a role in such, almost idealistic, circumstances? First and foremost, as Shaftoe puts it, "...by definition they are universally accessible, they offer one of the few opportunities for people to directly encounter other people with different norms, behaviours and cultures" (Shaftoe 13). They are open to anyone. A public space can be a square in the middle of a city, a cafe, or a library, as long as anyone is welcome and can access it. Shaftoe summarizes the ideas of Richard Sennett, who writes on cities, labor, and culture while teaching sociology in New York and London:

'People grow only by the processes of encountering the unknown' and the best places to encounter difference and the unfamiliar are public spaces, where all segments of society can cross paths... we are in danger of becoming increasingly prejudiced and narrow-minded, as we only choose the company of like-minded individuals. (Shaftoe 19)

In encountering strangers, it becomes easier for an individual to put themselves in another person's shoes. As an extreme example, never seeing a person of a different religion, or a person living in poverty, makes it difficult to believe these differences even exist. Most importantly, Shaftoe believes choosing the company of like-minded individuals is a threat to open-mindedness. What this thesis will suggest is that this choice is decreasingly left to the individual, as they increasingly rely on algorithms to choose for them.

Jane Jacobs

Although written in the early 1960s by someone who was not an academic, The Death and Life of Great American Cities remains a relevant book. It is referred to by geographers, urban planners, sociologists, and many more (Carmona et al., de Waal, Glaeser, Shaftoe, Simon and Hospers). Although it has attracted its fair share of critique, the book is nevertheless accepted by some as relevant for European cities in 2015 (Simon and Hospers 9). In order to better discuss Jacobs' stance, I will draw on other authors who write on her work: Edward Glaeser, a Harvard professor of economics who has focused his work on the economics of cities, as well as a collection of essays on her work put together by Simon Franke and Gert-Jan Hospers, an urban planner and an economic geographer, respectively. Jacobs was against large-scale development for fear that the development would serve cars and businesses (Glaeser 144). Her stance is that the city should be built for the people who live in it.

The key point of her work is the importance of diversity, not only in people but in function as well: "The diversity, of whatever kind, that is generated by cities rests on the fact that in cities so many

people are so close together, and among them contain so many different tastes, skills, needs, supplies, and bees in their bonnets" (Jacobs 192). Although separated by several decades, this diversity is what Jacobs and de Waal believe brings a city together: "a well-functioning city is not modeled like a hierarchical tree structure. Rather, it has countless junctions of overlapping worlds and structures. All those networks overlap a little so that the paths of city dwellers are constantly crossing" (de Waal, de Stad, translated from Dutch by me). Jacobs claims that these constant small meetings between strangers build up trust in one's city and fellow inhabitants (Jacobs 72). Additionally, the presence of people on the street provides a sense of safety, through what she calls "eyes on the street" (Jacobs 45). A diverse, mixed-use city based on small-scale urban planning is Jacobs' ideal. The opposite, a city where the functions are separated, represents for her a failed and monotonous city.

As the city is constantly evolving, due to changes in the makeup of its inhabitants and in economic times, diversity hangs in balance with monotony. Here, I will discuss how the built environment can have an impact on the diversity of a city. By showing how the built environment can influence diversity, I will show how individuals can be influenced by their environment, which will help make the argument that software can also influence people. Jacobs describes the self-destruction of diversity as a "street suffering under its own success". In this case, a charming street lined with unique shops gives way to a street lined with restaurants: monotony (Jacobs 319-320). What once was a thriving, interesting street, lived on at all hours of the day, is changed into a monotonous street inhabited only during small portions of the day. Consider a central business district like the Zuidas in Amsterdam, with few people in the evening. To Jacobs, diversity draws people to a place, and people make up a city.

A major threat to diversity is 'clustering' or 'zoning', a tradition that arose after World War Two, where:

Different types of uses were allocated to different areas of the city; so that all industry was in one place, residential in another and leisure facilities in yet another, and so on...[which] resulted in the abandonment of central public areas for significant times of day or days of the week, as well as creating bland and indeterminate open spaces around residential blocks. (Shaftoe 48)

This style of planning was applied to post-war Rotterdam (wederopbouwrotterdam.nl). It led to places which were empty during parts of the day or week. If people work in the city center but live outside it, there is no reason for them to go to the city center on the weekend, leaving it deserted. Despite this, "modern urban planning is experiencing a dramatic rise in partitioning. Residential areas, industry parks, university

campuses, airports, (tech clusters), and shopping centers...homogenizing and monofunctionality, which leads to gentrification and exclusion, are natural processes which Jacobs warned of" (Christiaanse, K. 25, emphasis by the author). This is the act of 'clustering' functions, which may be ideal for those in a particular cluster who want to meet others in their cluster; however, it divides the city and drives it towards monotony. Sprawl leads to monotony as well. As suburbs have expanded, they have done an excellent job of sorting people based on their socioeconomic status (Montgomory 48). Sprawl also leads to a 'stretched' life in which people have fewer social connections due to the time spent commuting. Because of this, there are fewer meetings between strangers; "by omitting strangers from our lives, sprawl leaches away our capacity for dealing with radically different perspectives" (Montgomory 56). Although this particular example stems from sprawl rather than clustering, the result is the same: the built environment shapes how inhabitants of a city can meet strangers, and meeting strangers increases our ability to understand different perspectives. This monotony may be exacerbated by algorithms as well, which will be explored in the following chapter.

A busy road with no crosswalks (Jacobs 476) can divide a city just as much as any segregation, as people no longer have access to the other side of the street without making a dangerous crossing. Hajer and Reijndorp agree, lamenting the priority cars have earned over people: "The city centre...was cut off: the public space became dominated by motorized traffic" (Hajer and Reijndorp 8). Consider another example from Amsterdam: the Wibautstraat, which is often cited as the ugliest street in Amsterdam (Boer et al.).
It was built to get cars in and out of the city center, but it also acts as a divide between the wealthy Weesperzijde neighborhood and the struggling Transvaalbuurt, which lies between a rail line and the Wibautstraat. As a pedestrian (or cyclist, in the case of Amsterdam) will choose the path of least resistance, the inhabitants of each side are unlikely to come into contact with one another, which is detrimental to the public realm. This infrastructure is a physical divide in the city. Later, this concept will help shape the idea that algorithms might work in a similar fashion.

Critique

There is much critique of Jacobs' arguments. I would like to focus on three strands which I believe relate most directly to her call for diversity, her dislike of large-scale urban planning, and her praise of eyes on the street: preservationism, not-in-my-backyard (NIMBY), and the neighborhood watch, respectively. Jacobs insisted on maintaining the city as it was, simply because she thought it worked well; for example, maintaining low-rise buildings in order to keep costs down for "budding entrepreneurs" (Glaeser 147). However, as Glaeser writes, her reasoning is flawed. According to him, supply and

demand would eventually lead to "...protected areas [becoming] more expensive and more exclusive. On average, people who live in historic districts in Manhattan are almost 74 percent wealthier than people who live outside such areas...People living in historic districts are 20 percent more likely to be white" (Glaeser 150). The socioeconomic and racial homogenization of these districts is the opposite of what Jacobs wanted!

There is a thin line between preservationism and NIMBYism, as Glaeser calls it (260). Glaeser describes NIMBYism as the blocking of developments in order to protect the status quo (260). The difference is that with NIMBY, "the interests of the people who oppose change are certainly comprehensible, but their interests usually don't match the public interest" (Glaeser 262). An example would be blocking a new high-rise because it will impede the view. NIMBYism is characterized by a concern for protecting an individual's experience of their property from changes; the challenge is that these individuals' motivations seem perfectly reasonable, which gives them strong power over someone else's property. Although the high-rise may decrease the pressure on the local housing market, it will not be built because those who currently reside there love their existing view. NIMBYism is about keeping things the way they are to protect one's experience of the city. Those blocking developments, however, forget that "some change is necessary if the world is to become more productive, affordable, exciting, innovative, and environmentally-friendly" (Glaeser 262). Clearly there is a fine line between what an individual wants and planning a city that can grow to accommodate a diverse population. Keeping things the way they were is, paradoxically, a threat to diversity.

Finally, the neighborhood watch concept stems from Jacobs attributing relative safety to having many people on the street to keep an eye on things. A pitfall here is that watching your neighbors and acting as an extension of the police, rather than just observing the street, can reinforce fear and violence (Goodyear). Despite these critiques, Jacobs' concepts of diversity and eyes on the street remain important, as long as they do not regress into the pitfalls described above.

Even though I compare contemporary phenomena to the past, I consider some of Jacobs' concerns to be applicable, most importantly her call for diversity. Jacobs describes her vision of a city based on, what I would call, an 'analogue' experience, when cell phones were non-existent and the internet was still the dream of a few scientists. However, the introduction of the internet and mobile media has arguably changed the experienced geography of a city (de Waal 8), as it is no longer primarily based on

measurable physical spaces. Rather, as I will explore in the second part of this chapter, spaces are increasingly constructed by experiences, yet even these spaces require the diversity that Jacobs and de Waal call for. This will be important for the remainder of this thesis, as the fluidity of space is how I think new media contribute to the parochialization of public spaces.

Part two: Fluidity

Realms

There are many ways to operationalize the makeup of a city. One of these is to describe the realms which one occupies at different times. Realms are not exclusive to a single place, and no single place is exclusively one realm; realms are not physically defined spaces. Originally there were two realms, the public and the private, but these alone were not enough to accurately discuss city life. Lofland is credited with introducing a "third, intermediate sphere between the public and private: the parochial" (de Waal 15). Now a city is said to be made up of three realms: the private, the public, and the parochial. This concept is used by de Waal, as well as by Hajer and Reijndorp (84), an urban planner and a researcher focusing on architecture, urbanism and cultural developments, respectively. As their work will be featured later and de Waal remains a prominent part of this thesis, I will use Lofland's conceptualization of realms.

Understanding the fluidity between realms lays the groundwork for the following section on space and its fluidity. Using a market, I will demonstrate how one physical space may be a stage for all three realms. The private realm is "characterized by ties of intimacy among primary group members who are located within households and personal networks" (Lofland 9); these are generally made up of families and close friend groups. In a city, the market may become a private realm if one is there with family, for example using the market as a place to teach a child the names of vegetables. As this is a private interaction, the public place becomes private. The public realm, the quintessential component of a city, is "those areas of urban settlements in which individuals in copresence tend to be personally unknown or only categorically known to one another" (Lofland 9); it is made up of strangers, or those who only know each other in terms of expected roles. A market can be considered a public realm in which a shopper does not know the other shoppers and only interacts with the vendors in their expected role (to buy vegetables or other wares). Most important for this thesis is the parochial realm. This is "characterized by a sense of commonality among acquaintances and neighbors who are involved in interpersonal networks that are located within 'communities'" (Lofland 10). Again, in the

case of the market, the parochial realm could best describe the vendors themselves, who may travel from different corners of the country to stand together each weekend in their usual spots. They form a community of people who work at the market. Another example is the people who come from different parts of the city to shop at an organic market based on a common interest in a particular lifestyle. The parochial realm falls in between the public and private realms: "Although open to the public, these are spaces that evidently constitute the space of a certain group: Whoever wanders in as a stranger feels like a guest, often unwanted" (Hajer and Reijndorp 85). Each of these realms is experienced by all city dwellers in different geographical locations, as just shown with the case of the market. The realms overlap much like the networks described above.

Yet the prominence of the public realm may be diminishing, in part due to the rise of new media. "With digital media we can see that members of a specific subculture have started using a specific hangout, cafe, or restaurant more often and now consider that place their place" (de Waal 68, emphasis from the author). This development can be understood from many perspectives: for an individual it is convenient, since they are more likely to find their subculture, while an urban geographer might be concerned about social cohesion and a decrease in interaction with strangers. Places are different from realms. Places are physically bound, yet can serve as a backdrop for the realms.

De Waal attributes the parochialization of public spaces to the increase in new media. How do new media accomplish this? Mostly through the algorithms which drive the information that people receive through recommendations from platforms such as Facebook: "the digital traces we leave behind when we use our mobile phone or social networks and search machines are deployed to create profiles of us that are used to personalize our experience of the city" (de Waal 171). How can an algorithm influence space? In order to understand this, I will first need to look at some abstract concepts regarding space.
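Before moving on, the mechanism de Waal describes can be made concrete with a deliberately simplified sketch. This is not any platform's actual algorithm; the names, categories, and scoring rule are all hypothetical. The point is structural: a profile built from a user's past choices is used to rank candidate places, so places the user has already shown affinity for rise to the top, while unfamiliar ones sink out of view.

```python
from collections import Counter

def build_profile(visited_categories):
    """Build a simple interest profile: how often the user chose each category."""
    return Counter(visited_categories)

def recommend(places, profile, top_n=3):
    """Rank candidate places by how well they match the profile.

    Categories the user has never visited score zero and sink to the
    bottom: the 'telecocoon' effect in miniature.
    """
    return sorted(places, key=lambda p: profile[p["category"]], reverse=True)[:top_n]

# Hypothetical user history and candidate places.
profile = build_profile(["organic_market", "cafe", "organic_market", "cafe", "cafe"])
places = [
    {"name": "Hip Cafe", "category": "cafe"},
    {"name": "Organic Market", "category": "organic_market"},
    {"name": "Working-class Pub", "category": "pub"},       # never visited
    {"name": "Community Centre", "category": "community"},  # never visited
]

print([p["name"] for p in recommend(places, profile, top_n=2)])
# prints ['Hip Cafe', 'Organic Market']
```

Even this toy ranking never surfaces the pub or the community centre, because the user has no history with them. A real recommender is vastly more sophisticated, but the structural bias toward the already familiar is the same.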

Space and Place

Two frustratingly vague yet important concepts, space and place have been defined in many ways over the years. Stuart Elden, a professor of political theory and geography, writes on the different ways in which geographers have thought about space. Although space and place are often used interchangeably, they are different. "If nobody asks us what it is, we know; yet if we are asked, we do

not know," Elden quotes St. Augustine in his encyclopedic article on space (Elden 1). Arguably, little has changed since St. Augustine's time: there are endless discussions about what space is, including some which overlap with 'place', as Kitchin, a professor of geography, demonstrates in his article on space, which also appears in the International Encyclopedia of Human Geography. How space has been understood has changed over time, even over the last 70 years, and Kitchin retraces this academic geographic discourse in his article. He starts with the absolute space of the pre-1950s geographers, who defined space with coordinates or other measurable units (kilometers, miles, etc.) and treated space as a container in which events happened. He then discusses the constructionist geography of the 1970s, which focuses on how space is produced by socioeconomic status or gender. Finally, Kitchin covers the more recent debates regarding cyberspace (Kitchin 269-270).

The concept of place, according to human geographers, has a long and dynamic history. What remains constant over time is its importance to humans and society, and how they shape one another. Broadly speaking, place is space where something happens. Tim Cresswell, a human geographer and professor of history and international affairs at Northwestern University, traces this to a metaphor of a home in the woods which Heidegger used: "Human existence is existence 'in the world'. This idea of being-in-the-world was developed in his notion of 'dwelling'. A way of being-in-the-world was to build a world. Dwelling in this sense does not mean simply to dwell in (and build) a house, but to dwell in and build a whole world to which we are attached" (Cresswell 171). This dwelling metaphor turns a space, in this case a home, into a place. This is easy to understand, as most people have a dwelling to relate this concept to. Over time, however, this concept of place as confined by a rigid space changed. In the 1970s the humanists attributed place-making to experience, and in the 1980s Marxists contended that society makes place (Cresswell 173). "Space is not simply a container in which things happen; rather, spaces are subtly evolving layers of context and practices that fold together people and things and actively shape social relations" (Kitchin and Dodge 13). Thielmann, a senior research fellow at the University of Siegen, contributes to work on mobile media and geomedia. He describes the key difference between these concepts as being that people make a space into a place:

The social geographic conceptualization of space radically turned to a social constructivist basis. That meant that space was no longer seen as an entity existing on its own terms, with a greater or lesser impact on the social sphere. Space, in these perspectives, was rather the result of social construction. Or, more pointedly, if nobody thought about space, space would not exist. (Thielmann et al. 19)

Space can be redefined in endless ways, but according to these lines of thought it only becomes a place once people interact with it. This relational characteristic is the reason space and place can be constructed according to different experiences. Of course, as these discourses are rooted in human geography, this is an anthropocentric way of looking at space and place as it is dependent on humans to define it. Places are more personal than space as the individual makes up a part of it. Space can be constructed by countless factors which are constantly in flux, which means there is always a new angle to study.

Critical Cartography Now that I have explored existing and historical concepts of space, in particular the attempt to define it, the next step is to discuss how geographers and cartographers have always struggled to represent space accurately. Maps are always a simplified representation of the world, simply because it is impossible to draw everything while still having a readable map. Thus, all maps are made up of aggregated information. This means that categories were defined and choices were inevitably made by the cartographer, who has the power to shape the map and hence, to some extent, to shape our conception of space. There are many opportunities to influence what a map depicts. "Mapping is involved in what we choose to represent, how we choose to represent objects such as people and things, and what decisions are made with those representations. In other words, mapping is in and of itself a political process." (Crampton 41, emphasis in original). The what, how, and what are each opportunities to sway the meaning of a map. Map making was, until somewhat recently, historically reserved for experts (Crampton 37), which gives maps a unique characteristic: they are often considered a representation of the (only) truth. In reality, what we consider to be knowledge is often influenced by power (Crampton 39). As discussed earlier, if maps have traditionally been created by experts who have to make choices in order to aggregate the information on the map, then the knowledge presented on a map could represent the interests of other parties: those who commissioned the map, for example (Crampton 18). Consider which points appear on the free maps given out in most cities: big museums, the Hard Rock Cafe, some shops. A tourist might accept this as a realistic representation of the city, when in reality these places are featured simply because they paid to be included.
Crampton introduces his reader to critical cartography in order to teach them to question these assumptions. Critiquing the information displayed on a map is much like critiquing an algorithm. As I will discuss later, software programmers may be a modern-day version of cartographers: well-educated individuals who make decisions in writing something which is not often critiqued.

Non-Western Cartography One such assumption, only briefly discussed in Crampton's book, is the assumption that the Western definition of space is the only one. As mentioned earlier, space can be defined in any number of ways. Space can be shaped by experiences, class, gender, and many other factors, and that is only if it is considered from a Western perspective. This perspective is defined by the West's attachment to measurable units, such as time and distance (Law 131). Crampton introduces this idea when he describes maps as "culturally learned knowledge" (Crampton 43). However, John Law expands on it with an in-depth analysis of the difference between, to use his wording, “Euro-American” (122) and Aboriginal traditions of constructing space. Perhaps the best example is that of the Aboriginal Australians' 'dreamtime': Australia can be defined in the Euro-American sense of definite spaces and a linear timeline, but also as dreamtime. Dreamtime does not follow the Euro-American linear sense of time, which is often used as a tool to describe the development of spaces; geologists use layers in the soil to describe change over time, for example. "It is not possible to discover and represent [Aboriginal stories], so to speak, as some kind of operation which is separate from their existence. If they hold their shape at all it is because they are participating together in their continuing re-creation," (Law 113). Dreamtime is past, present and future happening simultaneously. It is also experienced differently by different members of the community; some stories are only shared amongst initiated men, for example (Law 134). This makes dreamtime timeless and unique to each individual who engages with it, but also ever-changing.

In this line of thinking, then, a city is not a geographical place, but rather a “flection in a field” (Flusser 322). This means that a city can be defined by the interactions of its users rather than by its place on a map. The word 'user' is chosen in lieu of 'inhabitant' because one cannot inhabit a non-geographical space. There are different ways of knowing the world, and each of these ways leads to different spaces.

Conclusion What began as a discussion of the parochialization of public spaces via digital media has taken a detour to explore the various ways in which space can be constructed. This was necessary because digital media is becoming an actor in making space. In order to understand how a seemingly ungeographical thing can impact public spaces, it is important to first know how spaces are made.

The chapter opened with Jacobs' writings on the importance of diversity to a city and the side effects of modernity. Although her perspective is important for this thesis, she has been critiqued for several ideas brought forth in her book. The lobby for “historical preservationism”, the “Not In My Back Yard” (NIMBY) movement, as well as the “neighborhood watch” can all be understood as stemming, to a greater or lesser extent, from Jacobs' ideas. Despite these critiques, Jacobs' book remains an important reference point when discussing the importance of diversity. Then, a brief introduction to the difference between spaces and places followed. Space is often seen as a box where things happen; place is what is created when people interact with that space. Through critical cartography it became clear how difficult it is to represent space in an entirely neutral manner: there are always decisions made while producing a map, which makes maps inherently political. Not only politics complicates the production of a map; the construction of space does as well. As explored with the example of Aboriginal dreamtime, there are ways to construct space that do not resemble the Euro-American mapping tradition. This basis should open the reader's mind to the contention that digital media can influence the city and its society. Later, I will demonstrate how I think space can be constructed by new media with the help of increased computing power and algorithms.

Chapter 2

Software Studies

"Packet by packet, flow by flow, they identify patterns, build models, and in accordance with network policies use these findings to manage communications dynamically. It is this process that produces dividuality." (McKelvey 605)

Introduction The previous chapter dealt with space and the shift from public realms to parochial realms. First, I discussed Jane Jacobs' book and her contention that a city is dependent on diversity and the ability for people to mix. In 1960s New York, building a highway through the city was perceived as a threat to that ability. With this discourse in mind, I contend that algorithms function as a highway. This chapter will take a closer look at algorithms and their role in shaping individuals' behavior from the software studies perspective.

This lengthy chapter is dedicated to bringing the reader through the theories that build up to the concept of “algorithmic governance”. It is broken up into three sections. The first is dedicated to building up the algorithm, from small pieces of captured data to software's ubiquity. The second will draw on the French philosophers Deleuze and Foucault for their work on the paradigm shift in governance, which I believe can be largely attributed to algorithms. The third will review some critiques of automation. Interwoven into the chapter are examples of research into the algorithms of various platforms, including Netflix, Facebook and Google. In order to answer the research question, I need to explore how algorithms work and how they may be able to influence people in general, before I can apply this idea to public spaces, which I will do in the third chapter, where I discuss the tension between urban geography and software studies.

The first part will open with Agre's concept of “captured data”, which I consider the smallest unit within algorithmic governance. Philip Agre contributed many articles to new media research from the late 1990s through the early 2000s. I will then make the case for the ubiquity of software, drawing on multiple researchers. Having all of this in mind, I believe, will better prepare the reader for the second part of this chapter, because it relies on the concepts developed in the first part.

The second part will be dedicated to a lengthy review of Deleuze's theory of the control society as a paradigm shift in governance: from the disciplinary society to the control society. This theory was laid out in a short manifesto entitled Postscript on Societies of Control, published in 1992. This shift is similar to the shift in the way spaces are constructed. Then, the control society's new form of governance will be brought to light with examples of how algorithms work and why they might influence people's behavior. Next I will consider Kitchin and Dodge, a geographer and computer scientist duo who have contributed to human geography and software studies; they often write on how code can influence space, and have produced an Atlas of Cyberspace. Their 2011 book Code/Space provides the methodology for the new form of governance in what they call “automated management”, which they argue is driven by captured data.

Since the first two sections will have discussed how an algorithm could theoretically influence an individual's behavior, the chapter will conclude with a third section reflecting on critiques of how algorithms are made. I will use Nicholas Carr, an author who writes on technology and culture and is critical of technological utopianism, applying his critique specifically to the use of 'expected behavior' and the agency granted to software. Additionally, Stephen Graham's article Software-Sorted Geographies will be used for its perceptive considerations of the future of cities. Although this section will not focus on the tension between software and urban geography, some thoughts from this article apply well to the critique.

Part One: Building blocks of algorithms Within computing, algorithms are the written instructions which a computer follows in order to solve a problem. They vary in scope, though are rarely simple. Programs can be seen as very large algorithms, as are the instructions for returning a search query (techtarget.com). Programs and software are made up of algorithms and are written by programmers. Fenwick McKelvey, an internet researcher most interested in algorithmic media and the increasing use of software in communication, states:

Software is much more front and centre than the hidden, infrastructural work of algorithms. Algorithms function as the background of the material, symbolic, cultural, or communicative processes of new media and information technologies. Despite its association with and indebtedness to software studies, the study of algorithmic media focuses on specific facets of software. (McKelvey 610)
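To make the textbook definition above concrete for readers unfamiliar with code, here is a minimal, generic sketch (not drawn from any of the cited works) of an algorithm as a finite list of explicit instructions, in this case the classic binary search:

```python
def binary_search(items, target):
    """A classic algorithm: explicit, finite instructions a computer follows
    to solve a problem (here, locating a value in a sorted list)."""
    low, high = 0, len(items) - 1
    while low <= high:
        mid = (low + high) // 2      # inspect the middle element
        if items[mid] == target:
            return mid               # found: report its position
        elif items[mid] < target:
            low = mid + 1            # discard the lower half
        else:
            high = mid - 1           # discard the upper half
    return -1                        # not present

# The same instructions produce the same result every time:
print(binary_search([2, 5, 8, 13, 21], 13))  # → 3
```

Even a search engine's ranking pipeline is, at bottom, a (vastly larger) composition of instruction lists like this one.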

Wendy Chun is a professor and Chair of Modern Culture and Media at Brown University as well as a contributor to new media and software studies. In her 2004 work On Software, she explains that software is what forms a user's relationship with the hardware. Users simply use the program without much concern for how it works. As I will discuss in more depth later, the human-like language of software forms what Chun calls an 'ideology', because the relationship is symbolic of interaction rather than real interaction:

Software, or perhaps more precisely operating systems, offer us an imaginary relationship to our hardware: they do not represent transistors but rather desktops and recycling bins from reassuring sounds that signify that a file has been saved to folder names such as “my documents,” which stress personal computer ownership. Computer programs shamelessly use shifters, pronouns like “my” and “you,” that address you, and everyone else, as a subject. (Chun 43)

Here she describes a distancing from the hardware: the user no longer actually controls the computer, although they think they do. If what she claims is true, then users are actually manipulating symbols which represent actions, but no longer know how to perform the underlying actions themselves. This brief introduction to the difference between software and algorithms should aid in reading the rest of the thesis: algorithms make up software, and software functions as a symbolic relationship with hardware.

Captured Data In his 2003 article Surveillance and Capture, Agre discusses two models for gathering data in relation to privacy. Though this thesis is not concerned with the debate surrounding personal privacy on the web, the methods of data gathering discussed by Agre are relevant to my argument that the way data are gathered matters. Data, as I will show later, drive algorithms, which is why I think a discussion of the method by which data are gathered is important. This concept will play an important role in the second part of this chapter as “capta”, the shortened version of “captured data”. Capta are the building blocks of algorithmic governance, which this chapter aims to outline in order to show how an algorithm may contribute to the parochialization of public spaces.

Agre uses the methods of surveillance from the novels 1984 and Brave New World as examples in his work. Surveillance is often discussed in a way which echoes George Orwell's 1984: a state of constant surveillance by something an individual can observe, like Big Brother's two-way video screens, used to transmit propaganda while simultaneously observing a room. Surveillance writers invoke ideas of trespassing in an individual's personal life, with terms such as 'invasion', or they "identify [surveillance] with the state," (Agre 743).

The "Capture Model" is a less conspicuous manner of collecting data than the "Surveillance Model" just described. The surveillance model is based on Foucault's panopticon: a prison “designed in such a way that one can never be certain whether one is being watched or not...[which] leads the subject to adjust his or her behaviour accordingly so as to behave as if they indeed were being permanently observed” (Bucher 1170). Unlike the large screens showing Big Brother's face or the Panopticon, captured data are exactly that: captured. To catch data implies that they are taken against their nature, like an animal caught in a trap. The capture process is dynamic and complex. Agre breaks it down into five steps, which, he explains, can happen simultaneously and can influence one another (Agre 746). What is important to capturing data is that the process uses human-like actions and human words. This turns data collection into something almost invisible, as it feels natural.

These two methods of gathering data could be seen as representing governance before and after the paradigm shift that Deleuze puts forth (to be explained in depth later). Kitchin and Dodge also make this connection, observing the invisibility of the capture model:

Individuals are often largely unaware that such capta is being generated about their activities and interests, except when it is explicitly revealed to them, for example, as itemized entries on a telephone bill or when a credit application is rejected. Such a form of governance is described by Deleuze in his notion of "societies of control." Here, expressions of power are not visible and threatening as with sovereign or disciplinary regimes, rather power is exerted subtly through distributed control and protocols that define and regulate access to resources and spaces, without those who are being governed necessarily being aware of such processes. (Kitchin and Dodge 86)

The key difference between the two models seems to be how the “observed person” experiences the process of data gathering. The invisibility can be attributed to the manner in which capta are gathered without the individual seeming to notice. Individuals would notice a large screen, like the 1984 method, but not how Facebook gathers information, because users give it freely, unaware of the rich data they are supplying.

Here I would like to move away from theory to provide an example of how the capture model works to build up a capta profile, using Facebook (which in turn provides information to algorithms). The social network site has 1.44 billion monthly users, which makes it a rich source of data (Facebook Newsroom). Taina Bucher is a media researcher and associate professor at the Centre for Communication and Computing at the University of Copenhagen. In her research into how the Facebook algorithm may influence users' behavior she writes: "The algorithm is not merely modeled on a set of pre-existing cultural assumptions, but also on anticipated or future oriented assumptions about valuable and profitable interactions that are ultimately geared towards commercial and monetary purposes," (Bucher 1169). Facebook uses human-like language, one of the five elements that Agre considers part of the captured data model. The “like” functions as an endorsement. One can “like” almost anything: brands, music, books, public figures, shared content, celebrities, stores, and the list goes on. However, the term “endorsement” is dry and corporate sounding, while a “like” is something one would naturally say when discussing an article or picture a friend had posted (another example of the 'human-like' language incorporated into Facebook). The “like” has become so accepted that we do not question its reason for being. The like acts as a trap to capture the user's data. Users engage with it in a way that seems natural (I "like" Iron Maiden), which provides Facebook with metadata about the user (Evelien is a metal fan). These metadata are then used for targeted advertisements so that Facebook can earn money. Key here is that “capture is never purely technical but always sociotechnical in nature," (Agre 748), which means that the system must work with human nature in order to work well (read: remain unnoticed). The algorithms that run Facebook thus need to anticipate how humans will interact with them, and are constructed with certain social expectations in mind. However, users do not seem to notice this. Kristie Best interviewed several people for her 2010 research Living in the Control Society.
She questions the paradox of the public’s apathetic response to surveillance in a post-Snowden world where academics and activists continuously share new developments. What she learned from her respondents, among other conclusions, is that they still imagine being surveilled in a manner similar to Agre's "Surveillance Model" rather than the more complex "Capture Model". This is, in itself, a testament to the success of the capture model (Best 20), and it shows that captured data can be gathered without the observed person truly being aware.
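The like-to-metadata pipeline described above can be sketched schematically. This is a hypothetical illustration with invented names and data, not Facebook's actual implementation: a natural-feeling human action (a "like") is translated into machine-readable capta, which then feeds targeted advertising.

```python
# Invented interest taxonomy: maps a liked page to an advertising category.
INTEREST_TAXONOMY = {"Iron Maiden": "heavy metal", "Tolstoy": "literature"}

def capture_like(profile, liked_page):
    """Record the raw action and derive metadata the user never typed."""
    profile.setdefault("likes", []).append(liked_page)
    interest = INTEREST_TAXONOMY.get(liked_page)
    if interest:
        profile.setdefault("interests", set()).add(interest)
    return profile

def target_ads(profile, ad_inventory):
    """Select ads whose category matches the captured interests."""
    return [ad for ad, category in ad_inventory
            if category in profile.get("interests", set())]

evelien = capture_like({}, "Iron Maiden")        # I "like" Iron Maiden...
ads = [("Wacken festival tickets", "heavy metal"),
       ("Garden furniture", "home")]
print(target_ads(evelien, ads))                  # → ['Wacken festival tickets']
```

The point of the sketch is that the user only performs a single, socially natural gesture; the categorization into sellable metadata happens invisibly, on the other side of the interface.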

The concept of captured data is important to this thesis because these data provide the information which algorithms use to 'solve' a problem put to them. Capta, as they will be referred to for the remainder of the thesis, are what help personalize results. It is in this ability of algorithms to personalize that I see the threat of parochialization of public spaces. Again, tailoring results to an individual's likes based on their data risks that they encounter fewer strangers. Now that the method of gathering data has been made clear, it is time to discuss the pervasiveness of software. I do this in order to illustrate that algorithms have vast influence on an individual's life.
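The risk named above, that personalization filters out strangers, can be illustrated with a deliberately simplified, entirely invented example: the same query, ranked through two different capta profiles, returns the results in a different order, so users with different profiles stop seeing the same things first.

```python
# Invented result pool for a single query (not any real platform's data).
RESULTS = {
    "concert": ["metal night", "jazz evening", "classical matinee"],
}

def personalized(query, interests):
    """Rank results matching the captured interests first.

    Python's sort is stable, so non-matching results keep their order.
    """
    hits = RESULTS[query]
    return sorted(hits, key=lambda r: 0 if any(i in r for i in interests) else 1)

print(personalized("concert", {"metal"}))  # "metal night" ranked first
print(personalized("concert", {"jazz"}))   # "jazz evening" ranked first
```

Each user sees a result list reordered toward their own profile; pushed far enough, this is the parochial realm rendered in a ranking function.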

Ubiquity and Opacity Broadly stated, software drives everything. As Matthew Fuller, a professor and head of the Centre for Cultural Studies at Goldsmiths in London, writes, “Software structures and makes possible much of the contemporary world," (Fuller 1). In their effort to describe the scale of software’s influence on the contemporary world, Kitchin and Dodge describe the lengths one would have to go to avoid it:

...not appearing in government databases; not using any utilities such as water and electricity or banking services; not using the many kinds of household appliances that rely on digital code to control functions, ranging from bathroom scales to washing machines; not watching or taking part in commercial entertainment or recreational activity (e.g., watching television, reading a newspaper); excluding oneself from professional health care, avoiding all manner of everyday activities such as shopping (thereby eluding the significant role of software in barcode systems, computerized cash registers, credit cards, and the like); and not traveling anywhere by any mode of transport (pedestrians are registered on digital surveillance camera networks). (Kitchin and Dodge vii)

This highlights the pervasiveness of software's influence on daily life. Each of these examples makes it clear that software is everywhere. Kitchin and Dodge attribute the ubiquity of software to “pragmatic issues such as physical design, interface usability, reliability, and real cost for daily use [which] have undergone significant improvements... have made computation attractive in terms of widespread consumer confidence and affordability,” (Kitchin and Dodge 9). From phones to heating systems to the car one drives, code is everywhere.

The clearest sign of this ubiquity is that we only seem to see software's effects once it fails (Kitchin and Dodge 5). Perhaps we only question its power in that moment, when the layperson becomes aware of their inability to influence the code. As cars become more like computers, for example, repairs increasingly mean replacing parts rather than fixing them. Several large automotive companies even wish to make it illegal to work on one's own car because cars are now mostly code (Perez). This would make individuals powerless over their own property; yet the layperson might not be aware of that fact until their car breaks down.

Since software is incorporated into nearly everything, the world has to be coded in order for software to make sense of it, simply because a computer cannot reason the way a human can. Everything needs to be broken down “...into rules, routines, algorithms, and captabases (a collection of capta stored as fields, typically within a tabular form, that can easily be accessed, managed, updated, queried, and analyzed; traditionally named a database)," (Kitchin and Dodge 5) in order to make people "machine readable" (Kitchin and Dodge 92). Computers are “stonily stupid" (Fuller 10), while people are extremely complex. In order for a piece of software to understand a person, people need to be broken down into what a computer can understand; they must become machine readable. Done poorly, this categorization could oversimplify complex situations that have been studied by social scientists for decades. Algorithms are, in many cases, the trade secret of big software companies and, as such, are protected. When Germany asked Google to reveal how search results are ranked, Google refused, "claiming that publishing the search engine algorithm would mean revealing its business secrets and opening up the service to exploitation by spammers," (Cook). As a result of this secrecy and complexity, few people know what drives algorithms. “It is therefore time for a concerted, multidisciplinary effort to try and open up the black boxes that trap software-sorting, and the cultural and spatial politics of code, within their esoteric, largely unknown, and almost completely opaque, technocratic worlds,” (Graham, S. 575). The black box recalls memories of learning algebra. To learn functions we were given an input, a black box, and the output. From a single pair alone, it was impossible to know what happened to the number in the box in order to produce the output; only when comparing a longer series of inputs and outputs could we deduce the function inside the black box. Today, the black box serves as a metaphor for an algorithm. The ubiquity of code, and the layman's ignorance of how software really works, gives it power over us.
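The algebra-class intuition can be made concrete. In this small illustration (the candidate functions are invented for the example), a single input/output pair is consistent with more than one function inside the box; only a longer series of observations lets us deduce which one it is, which is exactly the predicament of anyone probing an opaque algorithm from the outside.

```python
# Two hypothetical functions that could be hiding inside the black box.
candidates = [
    ("x + 3",  lambda x: x + 3),
    ("2x + 1", lambda x: 2 * x + 1),
]

def consistent(pairs):
    """Return the candidate functions that reproduce every observed
    (input, output) pair."""
    return [name for name, f in candidates
            if all(f(x) == y for x, y in pairs)]

print(consistent([(2, 5)]))          # → ['x + 3', '2x + 1']  (ambiguous)
print(consistent([(2, 5), (3, 6)]))  # → ['x + 3']            (deduced)
```

With one observation the box stays opaque; with two, the function is pinned down. Real ranking algorithms have vastly more candidates, which is why outside observation alone rarely opens the black box.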

Perhaps the most striking feature of software is its potential for ubiquitous computing, called ubicomp by Weiser and described as computing which disappears (Weiser 75). Paul Dourish is a professor of Informatics with an interest in computer science and social science who worked with Genevieve Bell, a widely published anthropologist also interested in culture and technology. Together they describe what they think is software's (and thus algorithms') unique ability to be everywhere without an individual ever thinking about it. They think that:

This is not the "disappearance" of rapidly miniaturizing technology that becomes so small it literally shrinks from view; instead, this is an infrastructural disappearance, a form of easy habituation and invisibility in use that follows from casual familiarity. This sort of infrastructure is still quite visible if you go looking for it; it is just that, in most cases, we don't. (Dourish and Bell 96)

Sir Nigel Thrift and Shaun French, both prominent British geographers, agree. They note that ubiquitous computing is developing towards surrounding users as “a part of the taken-for-granted background, to the extent that some have suggested that it constitutes a kind of virtual skin (or set of skins) around human bodies," (Thrift and French 329). Later, Dourish and Bell recommend the following method for finding examples of ubicomp:

When undertaking these kinds of investigations, it focuses us not on the ways that ubicomp infrastructures will seamlessly and unproblematically support human activity but rather on the processes by which human activities are aligned with infrastructures that themselves are fragile and brittle. In other words, it turns our attention to ubicomp as a process and practice as opposed to an object. (Dourish and Bell 114)

What they are saying here is that when investigating how ubiquitous computing plays a role in daily life, one should look for how people adjust themselves to the software, rather than the other way around. Thrift and French do so using the proliferation of the spreadsheet program: they present an example in which financial organizations reorganized themselves around spreadsheets (Thrift and French 327). This suggests that humans already adjust their behavior to how software is built.

To conclude, the first part of this chapter developed the concept of capta and presented the ubiquity of code, software, and algorithms. To review: capta are small pieces of data about an individual that are taken without the individual necessarily noticing. These capta then inform algorithms so that they can provide a personalized result. Now I move on to how software and algorithms can influence human behavior, drawing on Deleuze's and Foucault's work on the control society's new form of governance.

Part Two: Algorithmic Governance Here I will discuss the paradigm shift in governance put forth by Deleuze and Foucault, both French philosophers. Both write on a shift in governance, “the action or manner of governing a state, organization, etc.,” (Oxford Dictionary), which they believe changes as technology delivers increased storage and computing power for a decreasing price (Kitchin and Dodge 8). They developed the following concepts independently of each other. Many writers use Deleuze's theory of the disciplinary and control society, though Foucault is credited with developing the disciplinary society years before Deleuze, in 1975.

Theorists and researchers in various fields use the control and discipline society theory as written in Deleuze's Postscript on Societies of Control. Taina Bucher, the media researcher introduced earlier, uses Deleuze's discipline society in combination with Foucault’s panopticon to analyze how Facebook guides its users' behavior. Foucault's theory refers to governmentality: “the semantic linking of governing (gouverner) and modes of thought (mentalité) indicates that it is not possible to study the technologies of power without an analysis of the political rationality underpinning them,” (Lemke 50). Governmentality is concerned with how things are governed and why they are governed the way they are; Foucault was concerned with the reasoning behind governing. What is important to note is that he also made a clear point that governmentality does not apply solely to governments, but to anything that can be governed.

Paradigm shift What Deleuze and Foucault have in common is an interest in how people are governed using data. Deleuze's Postscript offers a starting point for discussing changes in governance; again, not in terms of government but of power. This shift in governance could be said to have developed together with the opportunities new technologies present, which together allow algorithms greater influence.

The argument that Deleuze puts forth is that there is a paradigm shift in the way that people and their actions are being governed. He theorizes that rather than being governed based on the space they occupy, which he calls the discipline society (as borrowed from Foucault), people are increasingly governed based on their data, which he calls the control society. The control society is a theory used by many as a sort of periodization. However, as with any periodization, the transition is gradual rather than immediate: examples of the discipline society can exist today, just as elements of the control society existed in the past. Alexander Galloway dedicates an article to the importance of the Postscript, and Deleuze’s periodization in particular (Galloway 517). I believe that this shift in governance is key to explaining why algorithms are able to contribute to the parochialization of public spaces: the control society is able to govern spaces by granting some people access while barring others. Yet, in order to understand the change, I will first examine the discipline society.

Discipline Society Not only are there several ways to define space, as explored in the previous chapter, but also several ways to control the people in it. Deleuze's Postscript describes a paradigm shift in governance. Historically, control was defined by space, in what Deleuze calls the 'disciplinary society'.

This kind of control is characteristic of the eighteenth and nineteenth centuries, where "the individual never ceases passing from one closed environment to another, each having its own laws," (Deleuze 3). He uses the school, military, factory, jail, and hospital as examples throughout his paper to illustrate this shift. Each space in the disciplinary society is separate, with its own rules. These spaces are measured in absolutes, similar to Euclidean geography. There is a person or institution in charge, enforcing the rules (Mom, Principal, Warden, or Commander). These people are able to control individuals based on their authority within that space. Those living in the disciplinary society 'start over' in each new space: the individual has to learn the rules and adjust themselves each time (Deleuze 4). The change is abrupt and distinct; one is aware of the change in their new surroundings, and it is easy to control people in this way. However, "the family," according to Deleuze, "is an 'interior,' in crisis like all other interiors," (Deleuze 4). With this imagery, Deleuze evokes closed spaces separate from one another, with their own rules, whose integrity is somehow jeopardized as technologies advance.

Control Society Control here is not about intentionally forcing someone to do something. Rather, control is about rules governing hardware, like software instructing a computer to display the letters typed on a keyboard, or the algorithm which decides which results to show for a query in a search engine. Control, in this context, means the algorithm, in the sense that it controls hardware. Here the individual is constantly controlled across all spaces, through small and personalized adjustments. The spaces in the control society are much like the spaces that came later in human geography: they need not be strictly formed by borders. The control here is tailored to fit the individual: “controls are a modulation, like a self-deforming cast that will continuously change from one moment to the other," (Deleuze 4). Using the factory’s transition into a corporation to illustrate this idea, Deleuze explains that the transition from school to factory was quite clear; everyone receives the same salary and goes to work doing what they know. Now, with corporations, each person is handled individually. This could be through adjusting salaries, or giving people extra training based on their potential. In this case, the corporation acts as the code which drives hardware to do work, and it is the people who are the hardware being molded and driven to maximize profit. People are a tool in the corporation. “We are taught that corporations have a soul, which is the most terrifying news in the world," (Deleuze 6). If corporations are the soul, does that make their employees the body, further driving home the point that the corporation uses people as tools? Are people 'hardware' being driven to consume?

In contrast to the disciplinary society, where one constantly starts over, in the control society one's data follows the individual through all the phases of life. The modulation of control means that an individual is always the sum of their data, so starting over becomes more difficult (Deleuze 5). As Galloway believes, “the crux of this short text has to do with technology” (Galloway 517), meaning that although Deleuze does not grant much attention to technology specifically in the Postscript, technology is what makes the control society possible. What follows is an example of an algorithm using capta to personalize results, which will highlight how algorithms contribute to the control society.

Netflix, the legal movie streaming service, set out a challenge in 2009 to improve its recommendation algorithm. Using three years of notes on the development of the new algorithm, Hallinan and Striphas discuss the challenges in defining what 'culture' is (Hallinan and Striphas 3). The concern here is whether an algorithm should be used to define culture, at the risk of oversimplifying the concept. Their work is a look into how the control society uses capta to deliver culture: the capta provides the algorithm the data it needs to suggest new films. To define culture is in itself daunting, let alone to write an algorithm that delivers it successfully to a diverse public. The Netflix recommendation algorithm is an example of a company adjusting to the “emergent framework of identification” (Hallinan and Striphas 7). This new framework can be considered akin to the factory's switch to a corporation in the control society, in which capta becomes more important than anything. Here “'algorithmic culture' [is] the use of computational processes to sort, classify, and hierarchize people, places, objects, and ideas, and also the habits of thought, conduct, and expression that arise in relationship to those processes" (Hallinan and Striphas 3). An algorithm with a concept of culture built into it will influence what an individual is exposed to. While someone can still search manually, the recommendations will not suggest ideas outside of a sort of comfort zone. While ideal for the individual, who receives recommendations for new films they will like, the algorithmic culture may isolate individuals from genres outside of their own consideration, much like a parochialization of spaces. Now that the periodization of governance, discipline and control, has been thoroughly explained, I will move on to what this shift means for the individual. I do so because this thesis is ultimately interested in how individuals are affected by the shift in governance.
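The narrowing dynamic described above can be made concrete with a minimal Python sketch of a capta-driven recommender. Everything here, titles, genres, and the scoring rule, is invented for illustration; Netflix's actual algorithm is far more complex and not public.

```python
# Hypothetical sketch: a recommender narrows suggestions toward the
# "taste profile" built from capta (the captured viewing history).
# Catalog, genres, and scoring are invented for illustration only.

CATALOG = {
    "Alien": "sci-fi",
    "Blade Runner": "sci-fi",
    "The Matrix": "sci-fi",
    "Amélie": "romance",
    "Notting Hill": "romance",
    "Heat": "crime",
}

def taste_profile(watch_history):
    """Count how often each genre appears in the captured history."""
    profile = {}
    for title in watch_history:
        genre = CATALOG[title]
        profile[genre] = profile.get(genre, 0) + 1
    return profile

def recommend(watch_history):
    """Rank unwatched titles by how well they match the dominant genres."""
    profile = taste_profile(watch_history)
    unwatched = [t for t in CATALOG if t not in watch_history]
    return sorted(unwatched, key=lambda t: -profile.get(CATALOG[t], 0))

# A viewer whose capta shows two sci-fi films is steered toward more
# sci-fi; other genres sink to the bottom of the list.
print(recommend(["Alien", "Blade Runner"]))
```

The point of the sketch is that nothing outside the established "comfort zone" can ever rank first, which is the isolating effect discussed above.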

The fate of the individual The disciplinary society and control society govern people differently. In the disciplinary society, the individual is molded into a mass: “The signature that designates the individual, and the number or administrative numeration that indicates his or her position within a mass" (Deleuze 5). For example, the university registers a person’s name and gives them a code in return: 10244360. This number follows the student through their time at the university. It is used for all transactions, from borrowing a library book to submitting papers. The number makes the student easily identifiable amongst the masses, or 'machine readable'. This, in turn, makes controlling the student easier. Did the student return their books three months after the deadline? Then the student’s diploma is held until they have paid. This number grants the university ease in conditioning a large group of individuals to learn its rules.

With the shift from disciplinary society to control society, individuals are no longer forced into a single mass, but broken up into pieces. In the control society one becomes a 'dividual' (Deleuze 5). My interpretation of the dividual, based on Deleuze, is that it is made up of separate bits of capta. Capta:

are units that have been selected and harvested from the sum of all potential data, where data are the total sum of facts in relation to an entity; in other words, with respect to a person, data is everything that it is possible to know about that person, capta is what is selectively captured through measurement. (Kitchin and Dodge 5)

Deleuze's dividual could be said to become the sum of all capta. This is taken, as needed, from any number of places: drivers' licenses, social media profiles, supermarket membership cards (Kitchin and Dodge 98), some of it willingly supplied and some of it measured. No whole 'individual' is used at any given time, unlike in the disciplinary society, where whole individuals occupied spaces. In the control society one is never whole. That the individual is split up into many pieces to make a 'dividual' reflects the modularity of the control society. Advances in technology have made this modularity possible. Increased data storage along with lower costs for running computers allows an extensive database of 'you' to exist. Increased computation speed allows queries to be run faster than before, so that this software becomes almost unnoticeable (Kitchin and Dodge 8). This means that queries and other experiences can be increasingly personalized, to the individual's great convenience.
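The idea that a dividual is assembled on demand from scattered capta, rather than addressed as a whole person, can be sketched as follows. The data sources, field names, and values are all invented for illustration.

```python
# Illustrative sketch: a "dividual" assembled on demand from separate
# capta stores. Sources, fields, and values are hypothetical.

CAPTA_STORES = {
    "drivers_license": {"name": "Mary", "birth_year": 1985},
    "social_media": {"interests": ["reading", "metal"]},
    "loyalty_card": {"weekly_groceries": 62.50},
}

def assemble_dividual(fields_needed):
    """Pull only the capta a given system asks for; the 'whole'
    person is never used, just the selected fragments."""
    dividual = {}
    for store in CAPTA_STORES.values():
        for field, value in store.items():
            if field in fields_needed:
                dividual[field] = value
    return dividual

# An advertiser's query and an insurer's query produce two different
# 'persons' from the same underlying capta; neither is ever "Mary".
advertiser_view = assemble_dividual({"interests", "weekly_groceries"})
insurer_view = assemble_dividual({"birth_year"})
```

Each querying system receives a different partial composite, which is the modularity the control society depends on.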

To summarize: Deleuze has written a short yet influential piece on the periodization of governance. Recent developments in technology have allowed a control society to develop. This makes it easier to govern individuals by breaking them down into dividuals, making them “machine readable”. From here I would like to develop some broader theoretical context surrounding these ideas, using Foucault.

Ten years prior to Deleuze’s Postscript, Foucault had published a similar theory. Where they differ is that Foucault approaches this paradigm shift from what I consider to be a “bottom-up” perspective. While Deleuze writes from the perspective of the governing bodies, Foucault uses points of resistance to find sources of power; he is interested in the subject being governed (Foucault 778). He puts himself in the shoes of an individual in order to explore how that individual is being governed. Foucault does not simply accept the governing power as an 'authority' but tries to discover what gives authority its power (Foucault 780). He is intrigued by what makes individuals become subjects (Foucault 781), what makes an individual surrender control over themselves. In what follows, I will discuss how the control society manages to control individuals. With two theorists’ perspectives on the matter, Deleuze and Foucault, I hope to make a stronger case for algorithms as an aid in the new form of governance.

Anti-authority struggles have several things in common, including:

the status of the individual: on the one hand, they assert the right to be different, and they underline everything which makes individuals truly individual. On the other hand, they attack everything which separates the individual, breaks his links with others, splits up community life, forces the individual back on himself, and ties him to his own identity in a constraining way. (Foucault 781)

The individuals here are split yet whole, never to start over and tied to their own identity forever. Much like a dividual in the control society. People are not controlled by a who, but a what. In the disciplinary society the institution of power, group or elite would be the school, army, prison or hospital. However, after the shift to the control society the technique is the power. Individuals are broken up and controlled through passwords and through their new identities made up of capta. This form of power, “applies itself to immediate everyday life which categorizes the individual, marks him by his own individuality, attaches him to his own identity, imposes a law of truth on him” (Foucault 781).

This, according to Foucault, is the power which makes an individual a subject. In this thesis I would apply this “power” to the algorithm. As discussed earlier, the algorithm has the capability to handle complex processes. The new state did not develop by “ignoring what they [individuals] are ... but, on the contrary, as a very sophisticated structure, in which individuals can be integrated, under one condition: that this individuality would be shaped in a new form and submitted to a set of very specific patterns” (Foucault 783). If someone becomes the sum of their capta, their identity is then dependent on the information which is deemed most interesting by whichever power is governing the space in question. One is no longer “Mary” but a role. This individual may be addressed, in the form of tailored information, as a woman, a mother, an avid reader. But, while the capta that is gathered is becoming increasingly specific, one is never Mary. In this way an individual becomes the sum of their capta.

There is a paradigm shift in how people can be controlled, just as there is a shift in how spaces are made. Deleuze's argument shows a shift from having control over spaces to having control over the forces which drive a person’s life. For the public space this means that although individuals may be present, their choice to be there may be less 'their own' than they expect. Much like the spaces from chapter one, the control is ever changing and different for each person. There are numerous examples of Google's Autocomplete offering racist, offensive, or otherwise stereotyped results (Hafiz, OII, and Langston). In this case the control is the algorithmic influence on the choices presented to a city dweller.

Now that the shift in governance has been thoroughly explained, I want to relate this back to the first part of the chapter. Agre's surveillance and capture models look similar to the discourse on the discipline and control society. In a disciplinary society, individuals are governed through the spaces they occupy, so an invasion of space is an effective way to describe surveillance. Similarly, the disciplinary society can be related to the surveillance model of data gathering in that an individual attributes the gathering to some large institution, like the state (just as, in the disciplinary society, an individual or institution governs the spaces one occupies). Now I will put forth Kitchin and Dodge's argument that software is especially well suited to facilitating the shift from disciplinary to control society (Kitchin and Dodge 85), and thus well suited to governing individuals.

Automated management Kitchin and Dodge discuss how algorithms are used to implement a new form of governance, much like how Deleuze describes the control society. They are very clear in their argument that “software is ideally suited to monitoring, managing, and processing capta about people, objects, and their interaction, and is leading to a new mode of governmentality that we term automated management,” (Kitchin and Dodge 85). As discussed earlier, the algorithms within software and the computing power that exists today together provide a tool for control-society-style governance. Automated management describes the development of technology to aid control and oversight over a diverse, complex, yet ever centralizing society (Kitchin and Dodge 109). Since software is everywhere, it has the potential to collect, analyze, and catalog enormous amounts of capta. This gives software a powerful role in how society is defined, and therefore, managed. Automated management allows the “capture model” (Agre 743) to take hold, which allows for the transition from traditional surveillance (a policeman on the corner) to processes which are “automated (technologically enacted), automatic (the technology performs the regulation without prompting or direction), and autonomous (regulation, discipline, and outcomes are enacted without human oversight) in nature," (Kitchin and Dodge 84-85, emphasis of the authors). Thus, automated management is driven by software and algorithms. In this way the technology which runs automated management could be said to be what allows the control society to take hold. Bucher's research into the Facebook algorithm provides a good example of how automated management guides people's behavior.

In an analysis of Facebook's EdgeRank algorithm, Bucher set out to learn how visibility on the platform is constructed. She learned that Facebook rewards participation with visibility. This visibility encourages further interaction from friends (Bucher 1174), which then encourages more participation. It works "by ‘training’ subjects to think and behave in certain ways and thus to become the principle of their own regulation of conduct. Through the means of correct training, subjects are governed so as to reach their full potentiality as useful individuals," (Bucher 1175). Bucher writes her article taking the stance that Facebook acts as a panopticon. However, I disagree. This model of visibility acts in the way that Deleuze and Kitchin and Dodge have described: the algorithm is written to shape how people interact with the platform. More specifically, Facebook trains individual users to act in a certain way, prompting each to participate with tailored content based on the capta gathered. "While Facebook is certainly a space that allows for participation, the software suggests that some forms of participation are more desirable than others," (Bucher 1176). This research has shown that training people to interact with software in a particular way is possible, and even rewarding, as Facebook uses the interaction to generate advertising revenue.

Capta goes into the 'black box' and results come out; in Facebook's case the capta guides certain information to more visible positions on the page. “Capta, although inextricably tied to an individual, is thus made to work in new ways independent of the original person" (Kitchin and Dodge 103). The capta is put to work gathering advertising revenue and pulling in more interaction. Thrift and French agree on this: “the governmentality of software is perhaps best approximated to by Deleuze’s notion of ‘societies of control’ in that it provides a set of modulations that constantly direct how citizens act (and who, therefore, is important)," (Thrift and French 326). They are concerned, however, with the ubiquity of objects which are augmented with software, rather than with the software itself. Yet, as these algorithms are written by people, who themselves are not perfect, algorithms are subject to faults. This will be explored in the following sections.

This section discussed the importance of the shift in power from a discipline society to a control society through three works. Deleuze offers a top-down approach to governance in the new control society. Foucault brings forth a similar argument in The Subject and Power, but uses another perspective to test the limits of this new power. Finally, Kitchin and Dodge demonstrate how this switch was made, with a closer look at the role Agre's capta plays in controlling: something which Deleuze and Foucault had foreseen but had not put into words, and what might be called “Algorithmic Governance”.

Part 3: Unintended Consequences The previous sections were dedicated to developing the concept of algorithmic governance. There I discussed how algorithms can be used as a tool to govern individuals. In relation to the research question, I believe the previous sections of this chapter answer how algorithms might impact people's behavior in general. In the third chapter I will further explore how algorithms may contribute to the parochialization of public spaces, as well as some opportunities to mitigate that trend. In this section, however, I will turn my attention to some critiques of automation (allowing software to do menial tasks so that humans do not have to). I will focus specifically on the agency that individuals grant to software, as well as the faults or ignorance of social science written into code. Although much of this chapter has been written on an abstract level, it is important to remember that algorithms are written by people, called programmers, and to consider the impact they have on how code works.

Critiquing Automation Carr contends that becoming reliant on software can have a dramatic impact on our way of life. He sees the continuous automation as a threat to skills and questions the ethics of programming with questions like: “who controls the software? who chooses what’s to be optimized? whose intentions and interests are reflected in the code?” (Carr). Like Dourish and Bell, Thrift and French, and Weiser, Carr agrees that technology is developing towards ubicomp, which is what inspires his critique. Evoking the imagery of a car, he states: “We're behind the wheel, but we can't be sure who's driving,” (Carr). He argues that while we are increasingly dependent on technology, we are decreasingly aware of how it works.

As software increasingly takes over menial tasks, users are giving software agency to make decisions for them. The risk Carr sees is that “we habituate ourselves to it; the technology comes to exert more power over us, not less. We may be oblivious to the constraints it imposes on our lives, but the constraints remain,” (Carr). Trusting a piece of code to aid in making choices is convenient in a world with almost endless possibilities. Code gets its agency from a network of relationships: Kitchin and Dodge argue that users, as part of that network, grant code agency over themselves: “power arises out of interrelationships and interactions between code and the world. From this perspective, code is afforded power by a network of contingencies, but in and of itself does not possess power” (Kitchin and Dodge 40). Each time someone uses an app or searches for a term, that person allows the algorithms behind that app or search engine to make certain decisions on their behalf. Herein lies the heart of the critique: the user is often unaware of what those decisions are.

One trait which algorithms might have, especially recommendation systems or search engines, is to program “expected behavior” into their results (Graham, S. 573) in order to provide users with what the system thinks they want. However, the phrase 'expected behavior' is problematic. It brings up several questions: what is expected behavior? Whose behavior is expected, and who expects it? Is this expected behavior the same for everyone? Graham worries that programming 'expected behavior' into a system, like closed-circuit television (CCTV) used to surveil the streets, can ingrain old stereotypes into the expected behavior of the street (Graham, S. 573), much like the critique of Jacobs' neighborhood watch. Responses to 'unexpected behavior' evoke a feeling of Huxley’s Brave New World. In the novel, those living in civilization have little freedom to do anything beyond what they were conditioned to do by the government (or desire to, as they do not know any better). Behavior not conforming to the conditioning that the rest of society shared was alarming enough for the police to be called and the novel's protagonist to be banished. Assigning expected behaviors to specific places in a city could exaggerate existing racial, social, and economic divides. A homeless person could be targeted for sitting on the corner, but a busker could be targeted just the same: the code simply 'sees' someone sitting on the street with “unusual body movements,” (Graham and Wood 236). The code might not see a difference between busking and begging.
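To illustrate Graham's worry, here is a deliberately naive rule of the kind an 'expected behavior' system might encode. The rule, thresholds, and labels are all invented; the point is that a rule blind to context flags the busker and the homeless person identically.

```python
# A deliberately naive 'expected behavior' rule, in the spirit of
# Graham's critique. The rule and all fields are hypothetical.

def flag_unexpected(observation):
    """Flag anyone stationary on the pavement for too long."""
    return (observation["posture"] == "sitting"
            and observation["minutes_stationary"] > 15)

busker = {"posture": "sitting", "minutes_stationary": 45,
          "activity": "playing guitar"}
homeless_person = {"posture": "sitting", "minutes_stationary": 45,
                   "activity": "resting"}

# Both observations trigger the flag; the 'activity' field the rule
# ignores is exactly the context a human observer would use to tell
# busking from begging.
```

The stereotype is not added by the machine; it is written in when the programmer decides which fields count as 'behavior' at all.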

Here I would like to take a closer look at how Google increasingly personalizes search results based on the profile of a user, in an effort to learn their expected behavior. This profile is made up of capta gathered while using Google, as well as information that is voluntarily given. When one queries Google, the algorithm responsible for returning results is called PageRank. Richard Rogers, an internet researcher particularly interested in research that can only be done with the internet, summarizes PageRank as:

Dependent on one's personal settings, preferences such as language, safe search and the quantity of results, or on one's histories of sessions, searches, purchases, etc.. As personal settings and personal histories fuse, the search engine's acquaintance with the user would ultimately provide returns that seem uncanny, as if it knew what you were looking for and desiring all along. (Rogers 85)

The complexities in PageRank are disguised by the simplicity of the home search page (Rogers 90). “Today Google’s algorithms rely on more than 200 unique signals or “clues” that make it possible to guess what you might really be looking for,” (Google Algorithms). Each of these 200 signals is answered by data that could be considered capta. The black box that produces the results is a mystery which intrigues researchers from many backgrounds (Rogers 84). For example, in their paper on search engine bias, Rieder and Sire use a microeconomic approach to show that Google orients the results page in "self-serving ways" (207). They show that Google favors its own services above competitors (208). While not surprising, these results are telling of the amount of influence a search engine can have in shaping the results presented. Not only economic factors but also one's own capta contribute. This results in a continuous loop, best summarized by Rushkoff, a media theorist: “Recommendation engines, by telling me what people like me do and encouraging me to be like a person like me, they help me to become more prototypically one of my kind of person. And the more like one of my kind of person I become, the less me I am, and the more I am a demographic type,” (Rushkoff). What Rushkoff describes is the result of being addressed as a dividual. As one is catered to with results based on capta, one is slowly pushed into a particular group (or a particular parochial realm). If I look up music, Google may remember that I like Iron Maiden and provide me with similar music; the more I interact with the metal-themed music that Google returns, the more my personal profile may be branded as that of a metal fan (although I also like other genres of music). A closer look at the Google Places algorithm will be taken in the third chapter, when I discuss the tension between urban geography and software studies; this example serves to show how capta and interaction with PageRank can be considered part of how parochial realms are formed.
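The feedback loop Rushkoff describes can be sketched as a few lines of Python: each click on a recommendation reinforces the very profile that produced it. Genres, weights, and the update rule are invented for illustration; no real engine is this simple.

```python
# Sketch of Rushkoff's feedback loop: the profile recommends its own
# top genre, the click reinforces it, and the profile drifts further.
# Genres and the update rule are hypothetical.

def update_profile(profile, clicked_genre, weight=1.0):
    """Reinforce whichever genre the user just interacted with."""
    profile = dict(profile)  # leave the caller's profile untouched
    profile[clicked_genre] = profile.get(clicked_genre, 0.0) + weight
    return profile

def top_genre(profile):
    """The genre the engine will recommend next."""
    return max(profile, key=profile.get)

# A listener who starts with balanced tastes...
profile = {"metal": 1.0, "jazz": 1.0}

# ...is shown the current top genre, clicks it, and is shown more of it.
for _ in range(5):
    profile = update_profile(profile, top_genre(profile))
```

After five rounds the once-balanced profile is dominated by a single genre: the listener has been nudged into "a demographic type" without any single step looking coercive.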

Another critique of decisions being made on behalf of the user, and of using expected behavior in algorithms, concerns the “Ghetto Tracker”. It was a crowdsourced map launched so that “users could pin digital maps with safety ratings to enable those new to town to avoid dodgy neighborhoods," (Silver). While avoiding 'bad' areas may seem like a good idea, the consequences of living in an area which has been defined as 'bad' by those writing the algorithms can be detrimental. The designers of Ghetto Tracker expected people to want to avoid certain areas and built a platform around that assumption. According to Joe Silver, writing for the American Civil Liberties Union, these neighborhoods are “already marginalized and disenfranchised communities whose businesses would lose foot traffic. And this redlining would likely serve to reinforce existing harmful and negative stereotypes about poor communities and communities of color,” (Silver). As discussed in the first chapter, diversity and mixing are important for a city to thrive, and simple confrontation with difference is enough to adjust to it. It is then not a difficult leap to imagine that avoiding certain areas could be detrimental for the city as a whole. While the “Ghetto Tracker” was taken offline soon after its release due to sharp critique, other large tech companies hold patents for algorithms which can help avoid unsafe areas, dubbed the “avoid ghettos” feature. Microsoft, for example, has held such a patent since 2012. According to their patent: “it can be more dangerous for a pedestrian to enter an unsafe neighborhood then a person in a vehicle since a pedestrian is more exposed and it is more difficult for her to leave an unsafe neighborhood quickly,” (U.S. Patent No. 8,090,532). Their algorithm “was designed to provide navigational walking routes that factored in such considerations as the weather, crime statistics and demographic information,” (Silver).

Although the Ghetto Tracker was a crowdsourced site, rather than an algorithm, the public’s reaction to its existence is telling of people's opposition to such a feature. When users could 'see' the process behind the Ghetto Tracker, there was a great deal of critique of the potential effects of such a site on neighborhoods (Badger Enough Already, Silver, Matyszczyk). However, due to the opacity of Microsoft's maps algorithm, it is hard to know whether the 'avoid the ghetto' feature is already in use. Imagine if the 'avoid ghettos' feature became a standard part of map-based applications: the casual user would likely not notice it, yet the consequences would remain. Jim Thatcher, a geospatial application scholar, suggests: “Microsoft may or may not ever put this technology into one of their products, but we have no way of knowing how these decisions are being made. How are they using demographic information? Are they saying, this area has a median income of X, therefore, this user with an income of Y would not like to go there?” (Silver, quoting Thatcher). The integration of such a feature could shrink back into the invisibility of ubiquitous computing as put forth by Dourish and Bell. This section shows how people would react if they knew this feature was used, as evidenced by the critique of the Ghetto Tracker. It also brings forth an existing patent to show that these concerns are not limited to the imagination of concerned citizens: the technology exists, and users cannot know whether they are being guided around 'unsafe' areas or not.
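The mechanism such a routing feature would need is straightforward, which is part of what makes its invisibility plausible. Below is a minimal sketch of how 'safety' capta could be folded into a standard shortest-path search; the graph, crime figures, and penalty formula are all invented, and the patent's actual method is not public.

```python
# Minimal sketch: a per-area 'crime' penalty folded into Dijkstra's
# shortest-path search. Graph, figures, and formula are hypothetical.
import heapq

def shortest_path_cost(graph, start, goal, crime_rate, penalty=0.0):
    """Dijkstra over edge costs inflated by a per-area crime penalty."""
    queue, seen = [(0.0, start)], set()
    while queue:
        cost, node = heapq.heappop(queue)
        if node == goal:
            return cost
        if node in seen:
            continue
        seen.add(node)
        for neighbor, distance in graph[node]:
            extra = distance * crime_rate.get(neighbor, 0.0) * penalty
            heapq.heappush(queue, (cost + distance + extra, neighbor))
    return float("inf")

# Two routes from A to D: short, through B (rated 'high-crime'), or
# longer, through C.
graph = {"A": [("B", 1.0), ("C", 2.0)],
         "B": [("D", 1.0)],
         "C": [("D", 1.0)],
         "D": []}
crime_rate = {"B": 0.9}
```

With `penalty=0` the router picks the short route through B; with any substantial penalty the same algorithm silently steers the pedestrian around B. The only difference is one parameter the user never sees, which is exactly Thatcher's point about opacity.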

The risk of using 'expected behavior' is that it is culturally determined, much like space can be culturally determined (as discussed in the first chapter with the example of Australian Aboriginal spaces). "This tension between informatization processes that facilitate individualized self-sorting and those that possess the power to ascribe identities and access independently of individual ‘choices’ is likely to be at the heart of much sociological inquiry in the coming years" (Burrows and Gane 809). This brings me back to the importance of diversity, which has been a recurring theme in this thesis. The diversity in gender and culture among programmers seems to play a role in how algorithms and software function.

With the cultural determination of expected behavior in mind, I would like to highlight the current situation at some tech companies. Consider Google's (among others') abysmal diversity statistics: they “predominantly employ white and Asian men...” (Baer). This leads to the question: how does this affect the algorithm which is responsible for filtering content for 3.5 billion searches each day (InternetLiveStats)? Recent research into Google's advertising system “suggests that there are parts of the ad ecosystem where kinds of discrimination are beginning to emerge and there is a lack of transparency,” (Simonite). Google has been critiqued for an algorithm which misidentified a young African-American woman as a gorilla (Pulliam-Moore), Nikon's facial recognition repeatedly misread Asian people in pictures as blinking (Ibid.), and Hewlett-Packard's face tracking failed to track an African-American man (Chen). Each of these examples demonstrates a lack of consideration of diversity on even a shallow level.

This section has been dedicated to exploring some critiques of automation. First, I discussed Carr's critique, specifically of the agency we grant software and the decisions it makes for us. These “code-based technologized environments continuously and invisibly classify, standardize, and demarcate rights, privileges, inclusions, exclusions, and mobilities and normative social judgments across vast, distanciated, domains,” (Graham, S., 563). Again, these decisions on how to classify or sort answers are programmed by people who may have vastly different experiences and expectations of what expected behavior is. Following this, the Ghetto Tracker and Microsoft's “avoid ghettos” feature were discussed. The Ghetto Tracker received harsh critique and was pulled offline soon after it was launched. Microsoft's algorithm, however, is capable of similar decisions, yet the public has no way of knowing if this is already being taken into account when getting directions. This example highlights a discrepancy between what some consider to be expected behavior (wanting to avoid a certain area) and what others see as detrimental to an area.

Conclusion With the introduction of computing to almost every aspect of life, it is appropriate to consider how that may affect our behavior. This includes the ways in which we engage with a city. As urbanites increasingly engage with their city through apps, recommendations, and review sites, it is time to consider how software influences what appears on the screen, and thus, how software can effectively “govern” the conduct of the user. In order to understand why I think there may be consequences for the city because of the increasing use of algorithms, I had to discuss several theories.

I started with Deleuze's proposed paradigm shift from the traditional disciplinary society to the control society. This shift can be said to be made possible by the development of technologies that help turn individuals into dividuals. The dividual, as explained, is the identity of a person made up of many bits of capta. An individual is no longer the same to every system, but rather divided and pieced together based on what the algorithm needs. The increasing modularity of such a mode of governance can be said to be made possible by Kitchin and Dodge's “automated management”, which is, in turn, driven by algorithms. Each algorithm is made by many programmers. A question which comes up is: how does their individual experience influence the way in which algorithms are written? Or, how can algorithms written by a certain group of people have unintended consequences for people in another group? These circumstances are exaggerated by software's ubiquity, opacity, and the agency we grant it. Building on these concepts, the subject of the following chapter will be the influence of software on space and how to mitigate these problems.

Chapter 3

Democratization of Public Space

Introduction The first chapter was dedicated to urban geography, specifically public space; I discussed the role of public space from Jacobs' perspective as an urban activist. Then, I explored how space might be constructed, using non-Western cartography as an example. The purpose was to demonstrate the many ways in which space has been produced temporally and globally. Building on that, I contend that

algorithms can construct space as well. I also made the case that public spaces could be developing into parochial realms with the aid of new media. As de Waal argues, public spaces as we know them are developing into parochial realms. In his book, The City as Interface, he attributes this largely to the "telecocoon" (de Waal 78) and the algorithms which drive recommendations. These algorithms could create what Stephen Graham, professor of Cities and Society at Newcastle School of Architecture, Planning and Landscape, calls "software-sorted geographies": new urban geographies that have been shaped by recommendations and query returns (Graham, S. 564). The risk is, again, the invisibility of such sorting: "most processes of software-sorting are actually invisible from the point of the users, these prioritizations are often not evident either to the favored groups or places or to the marginalized ones" (Graham, S. 566). As like-minded people are being 'sent' to the same places, these "software-sorted geographies" could be considered parochial realms.

Having developed the concept of parochialization, the second chapter was dedicated to exploring algorithms. Here, I reviewed Deleuze's Postscript, in which he puts forth a paradigm shift from a disciplinary society to a control society. The control society, as Deleuze describes it, can be considered a new method of governance made possible by advancements in technology, explained through Dodge and Kitchin's concept of automated management and Agre's captured data (or capta). Capta, as Dodge and Kitchin explain, are what inform algorithms, which in turn return results for an urbanite's query. In the second chapter I also dedicated some attention to how algorithms are written and their potential flaws. I suggest opening the black box in order to better understand how the city may be shaped by algorithms.

This chapter is also split into two parts. The first part will explore the tension between the two previous chapters, which represent two academic disciplines in tension with one another: Urban Geography and Software Studies. If, as discussed in the first chapter, public spaces and diversity are important for the existence of a healthy city, then algorithms that personalize results may be detrimental to the city as we know it. This tension reveals itself in public spaces transforming into parochial spaces. If this is the case, then the algorithm functions much like Robert Moses' proposed highway through New York, which Jane Jacobs fought because it would have destroyed the neighborhoods (Burns); a story so vital to New York City that some would like to make an opera of it (Moses Jacobs Opera).

For an individual, algorithms that personalize results are very convenient. De Waal suggests that they allow an individual to find their way through the vast variety of parochial realms. This individual convenience could be said to be in tension with the needs of a city. This does not mean, however, that

the public realm has to be lost. In the second part of the chapter I will explore how Open Source Software, programmed serendipity, and physical planning, might aid in democratizing public spaces by working with this tension rather than attempting to reverse it.

Part one: The Tension between Software and Space When the web was new, geography was thought to become obsolete (Rogers Mapping 194 and Thielmann et al. 3 and 16), as anyone with an Internet connection could go online and gain access to information from anywhere. Recently, however, the importance of geography has re-emerged. This phenomenon came to be known as the "revenge of geography" (Rogers Mapping 194). Rogers attributes the revenge to the increase in location-aware search abilities: "with location aware web devices (e.g. search engines), cyberspace becomes less an experience in displacement than one of re-placement – you are sent home by default" (Rogers Mapping 194). Location-aware web devices are becoming more ubiquitous as "the ongoing convergence of mobile communication, Internet technology and geospatial technology is leading to an increasing integration of geospatial technology into mainstream IT," (Thielmann, et al. 5). Mobile search surpassed desktop search for the first time at the start of 2014 (Murtagh, Vincent, Sterling), which confirms Thielmann's claim that "navigation with maps and GPS is currently the most popular motivation behind Location Based Services uptake (46 percent), but there is growing interest in more diverse activities," (Thielmann, et al. 9). Ubiquitous, opaque, complex, dynamic, personalized, and now geo-located, algorithms play an increasing role in the way in which people get information about space. This geographical layer adds yet another level of personalization to aid in returning 'ideal results'. This personalization, however, risks separating people based on their profiles, and this is how I propose that algorithms contribute to parochializing public spaces.

Graham calls the result of this polarization "software-sorted geography" (Graham, S. 562). As I have explored extensively in the previous chapter, software is increasingly structuring spaces. The effect of code is often ignored when discussing geographic inequalities, even as code is "automatically and continuously allocating social or geographical access to all sorts of critical goods, services, life chances or mobility opportunities to certain social groups or geographical areas, often at the direct expense of others," (Graham, S. 564). The concern here is how access is algorithmically defined. Graham describes systems which allow their users to find neighborhoods to move to that reflect certain geodemographic qualities (Graham, S. 571). There are decades of human geographical and sociological research that explore the functions of a city, a point briefly discussed in the first chapter of this thesis.

Attempts to oversimplify the complexities of a city risk ignoring this knowledge.

A point which this study endeavors to highlight is the notion that space is nuanced; geographers already consider it a concept beyond simply a box in which events happen. Dourish and Bell discuss this tension between the geographer's perspective on space and that of the programmer:

Space is organized not just physically but also culturally; cultural understandings provide a frame for encountering space as meaningful and coherent and for relating it to human activities. Technological infrastructures are inherently given social and cultural interpretations and meanings; they render the spaces that they occupy ones that can be distinguished and categorized and then understood through the same processes of collective categorization and classification that operate in other domains of social activity. Technological infrastructures and services, then, need to be understood as operating in this context. (Dourish and Bell 115)

They see technology molding space into the old definition geographers previously subscribed to: a box where things happen, rather than something created by myriad cultural and other social influences. It seems as if ubicomp has not yet made the paradigm shift from the box to culturally created spaces, and the consequence may be a further polarizing of the city. They recommend that technology should be aware of this context in order to prevent this. Having developed this concept of the tension between geography and software, I would like to highlight how this works using a popular algorithm, Google Places.

Among the many services offered by Google is Places, which plots places onto its map results. In addition to one's personal profile (Google Personalized Search), Places relies on two other factors: the validity of the place and its geographical location. Very few people know exactly how much each of these factors weighs in the final display of results. First: "a rankings premium may be assigned to geospatial entities based on the user's interest or preferences. User data collected at a client may be stored in the memory of the entity ranking module and used by the ranking engine to generate entity rankings that are personal to the user," (Google Entity Display Priority). The 'client' has built up a particular profile based on their capta, which Google terms "interests or preferences". These capta, in turn, allow Google to produce results for each dividual. What is most important for the parochialization of public spaces is that the algorithm also takes the dividual's friends into account:

A part of the profile is conditioned by how one interacts with the search engine: our system automatically compares the places you’ve rated against the places rated by other Google

Places users, and identifies people whose taste overlap—meaning you both tend to like and dislike the same places. Now, you can see all the places that people who are ‘like-minded’ with you enjoy, since there’s a very good chance you’re going to love them too. (Google Discover more Places)

This particular algorithm's contribution to parochialization of public spaces is apparent. The Google Places algorithm tries to send clients to places with similar people. As discussed in the first chapter, this feature of the algorithm can result in bringing certain groups of people together, so that a place acquires a parochial identity. However, it equally denies access by excluding certain results for certain people.

Second, the validity of the place itself is assessed. This works much like PageRank works for websites: a place is considered more valid if there are links to its website or reviews. In 2012, Google announced that it had integrated scores (restaurant rating system) into the Places information displayed. Additionally, Places are pushed higher up the rankings depending on whether your friends have been to or interacted with the Places page (Google Local). The key here is that Google is not simply incorporating an aggregate of review websites; it is tailoring the results to a particular set of friends. Prioritizing friends' visits to certain places is exactly what exacerbates the process of parochialization of the city.

Third, the geographical location of the place is taken into account. The latest version of the algorithm seems to rely heavily on the 'hyperlocal': queries that do not specify a geographical place will still display local results (Barreneche 335). This means that Google is taking neighborhoods into account. With the newest update, Google is said to have improved its distance and location ranking parameters (Patel and Shotland) in an attempt to show links 'near Amsterdam' or 'near the Staatsliedenbuurt'. Consider again Graham's concern: if geographical location is taken into consideration, this could exacerbate existing geographical inequalities in the 'real' world. Areas that are already segregated by socio-economic circumstances may only become more so if fewer people come there because places in those areas do not appear on the map.
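To make the interplay of these three factors concrete, the kind of weighting described above can be imagined as a simple scoring function. The sketch below is purely hypothetical: the weights, the friend boost, and the proximity formula are my own illustrative assumptions, not Google's actual (and secret) algorithm.

```python
from dataclasses import dataclass
from math import hypot

@dataclass
class Place:
    name: str
    quality: float        # stand-in for review/link-based 'validity' (0..1)
    friend_visits: int    # how many of the user's friends interacted with it
    x: float              # simplified planar coordinates
    y: float

def rank_places(places, user_xy, interest_scores, weights=(0.4, 0.3, 0.3)):
    """Score each place by personal interest, validity, and proximity.

    interest_scores: hypothetical per-place affinity inferred from the
    user's profile (capta), in [0, 1]. The weights are arbitrary.
    """
    w_interest, w_valid, w_geo = weights
    scored = []
    for p in places:
        dist = hypot(p.x - user_xy[0], p.y - user_xy[1])
        proximity = 1.0 / (1.0 + dist)                          # nearer -> higher
        validity = min(1.0, p.quality + 0.1 * p.friend_visits)  # friends boost
        score = (w_interest * interest_scores.get(p.name, 0.0)
                 + w_valid * validity
                 + w_geo * proximity)
        scored.append((score, p.name))
    return [name for _, name in sorted(scored, reverse=True)]
```

Even in this toy version, two users standing in the same spot receive different orderings once their interest profiles or friend networks differ, which is precisely the sorting mechanism this chapter argues contributes to parochialization.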

Carlos Barreneche, a researcher on digital culture, media theory, and software studies, believes that Google has built a powerful new kind of "geodemographics system". Considering just these three elements used in returning search results, Barreneche writes:

[A] geodemographic system is an information technology that combines databases on

consumers’ data and geographic information systems (GIS) in order to enable ‘marketers to predict behavioural responses of consumers based on statistical models of identity and residential location’ ...These systems are built upon the sociological assumption that location, particularly where we live, signals social and cultural characteristics of a given population. (Barreneche 337)

Such a system has powerful effects, resting particularly on the assumption that where we live relates to certain cultural and social characteristics. If the geodemographic algorithm relies on these sorts of assumptions for delivering content, then it can contribute to polarizing a city's inhabitants.

In ignoring or choosing not to engage with space as it has developed theoretically, programmers brush aside decades of social science research. There are already "significant concerns about the way in which such systems rely on and reinforce the categorization of certain sociospatial risk categories: high crime neighbourhoods, known criminals or dangerous socioeconomic groups" (Graham and Wood 237). In their article, Graham and David Wood, of the Surveillance Studies Centre at Queen's University in Canada, discuss the implications of automation by "critically exploring the social implications of a new and emerging raft of surveillance practices," (Graham and Wood 227). Although this thesis is not about surveillance, it is about "digital techniques and the changing political economies of cities and urban societies," (Graham and Wood 227). As such, this could be seen as a consequence of algorithmic governance, simply because the digital surveillance described is given its instructions by algorithms. Evgeny Morozov, a writer and researcher interested in the political and social implications of technology, thinks that relying on technology (and algorithms) to solve complex social problems ignores those problems' complexity. In an interview about his book To Save Everything, Click Here, he uses the quantified self movement as an example to explain that:

[it] reduces everything to a single number and while you may learn how to adjust your behavior to that number, it doesn’t necessarily translate into any holistic understanding of the self who is behaving. So in a sense the person becomes a kind of a black box with an input and an output, but the user himself has no idea how the input relates to the output. (Morozov Interview)

The quantified self movement is one in which people, those on a diet for example, make their progress quantifiable in hopes of greater success. This idea can be applied to complex problems in cities: an algorithm may be useful for narrowing down an individual's options when searching, yet ignore years of research and oversimplify a problem. Graham and Wood conclude that "the techniques may facilitate better services for mobile, affluent citizens, but that this is often paralleled by a relative worsening of

the position of more marginalized groups who are physically or electronically excluded or bypassed by automated surveillance," which resembles the concept of software-sorted geography (Graham and Wood 229). Basically, they expect that algorithms will be written in such a way that their influence will only exacerbate existing social segregation. It would be naïve, however, to attribute the occurrence of software-sorted geographies and segregation solely to algorithms. Rather, algorithms add another variable to be considered when researching these situations.

The polarization of cities could also be attributed to the conditions in which algorithms are written. "Increasingly, the encoding of software to automatically stipulate eligibility of access, entitlement of service or punishment is often done far away in time and space from the point of application," (Graham and Wood 233). Consider the physical geographies of where software is written, as Thrift and French do: The geography of software production is concentrated into a very few key places and regions: Silicon Valley..., New York..., London...However, because writing software requires skills that are still in short supply, especially for newer languages, software writers from all over the world often gravitate to the main software-writing centres. (Thrift and French 324)

A result of the influx of well-educated software writers to these centers is a wealthy group of people who may not themselves experience the diversity of the cities they live in. Some describe such communities as a 'bubble', and this bubble seems to be written into the software that they are producing. Thus: "the digital divide in contemporary societies is based on the broader disconnections of certain groups from IT hardware and the growing use of automated surveillance and information systems to digitally red-line their life chances within automated regimes of service provision," (Graham and Wood 234).

Just like the change in how space is viewed, the digital divide is becoming a more nuanced concept. The digital divide describes the inequalities in access to digital media. Initially, the theory focused on the segregation of developing countries from the rest of the world, yet division also exists within developed and 'well connected' countries. Even in California, home to Silicon Valley, it has been found that "those with less education and low income levels or who have a disability or language barrier are most likely to be left behind," (Avalos). In being left behind, their needs might not be met. Graham and Wood, and Carr, seem to be concerned that the process of automation will eventually get out of hand. Their concern is that as humans increase automation while continuing to treat space as a box, the "proliferation of automatic systems raises clear concerns that social exclusion itself will be automated" (Graham and Wood 233). The tension seems to be that algorithms

treat space as a box in which things happen, which may contribute to the parochialization of public spaces. Opacity and ubiquity make it "hard to identify how the shift to automated, digital and algorithmic surveillance practices relates to current radical shifts in the political economies of welfare states, governance, punishment and urban space," (Graham and Wood 233). Opacity contributes to this challenge in that much of the most popular software is proprietary and protected as a business secret (recall the case of Google in Germany from the previous chapter), making these shifts difficult to study. The ubiquity of software, as described in the second chapter, means that multiple systems are at work which overlap and interact with each other, further complicating any attempt to control for one system in order to study that software's contribution to the shifts described by Graham and Wood.

In this section I have developed the tension between urban geography and algorithms: the effects of software-sorted geography, and of ignorance of the social sciences, on the city, namely the further polarization of groups which leads to the parochialization of public spaces. Algorithms are not the sole reason for the parochialization of public space; rather, they are one more factor to consider when researching how people experience the city. In the following section I would like to move on to opportunities to mitigate the change from public to parochial. The rest of this chapter will consider Open Source Software theories, projects which critically investigate algorithms, and the effects of physical planning on public spaces, in an effort to learn how the public realm may be maintained in the digital age.

Part two: Opportunities Open Source Software is not a new concept. In fact, it has been around since software has existed (Raymond). Lawrence Lessig is an academic and political activist, as well as the director of the Center for Ethics at Harvard University. For some, "the issues of open source and free software are fundamental in a free society," (Lessig 350). As Stallman, software freedom activist and programmer, puts it: "free in terms of freedom, not free beer" (GNU.org). Open source advocates see software as a common. The commons is a concept stemming from English herders agreeing to graze only a certain number of sheep on a shared field. Each shepherd had access to the common field and could use it freely, yet equally. If, however, one shepherd decided to add sheep to their flock, that individual might benefit, but the other shepherds would not, as there is less grass to feed their sheep. The commons are not managed by anyone (de Waal), but by everyone. The commons as a concept have

been the center of much debate and critique. Garrett Hardin's "tragedy of the commons", for example, describes how he believes the commons will fail as the earth becomes more populated, because there will always be individuals acting in self-interest who do not have the best interest of the rest in mind.

The way in which software is produced in the open source community is interesting for this thesis because it gives an example of how many people with a variety of skills can work together to produce something of high quality without a top-down disciplinary approach. The open source discourse is varied, with many schools of thought, as detailed in Christopher Kelty's book Two Bits: The Cultural Significance of Free Software. He writes that there are two kinds of narratives: the Free Software of Stallman and the Open Source approach of Raymond (Kelty 99). Stallman, for example, believes that proprietary software is a problem because it is obstructive, and is thus against keeping software for yourself and not sharing, which he calls "hoarding software," (Kelty 99 and Stallman). Eric Raymond is considered to be a "libertarian pro-business hacker," (Kelty 99). It is in Raymond's essay that I find interesting opportunities for developing algorithms.

Cathedral and Bazaar In the essay "The Cathedral and the Bazaar", Raymond explains the process of producing open source software. While some parts are more technical than others, the overarching story explains why he, and the open source community, find building software in a Bazaar to work better than in a Cathedral. In his essay the Cathedral is a metaphor for any large software company, such as Microsoft. These are "carefully crafted by individual wizards or small bands of mages working in splendid isolation, with no beta to be released before its time," (Raymond 3). Wizards and mages are high-level characters in online games. To release a beta is to release a test version of the software and gather feedback from early users (techterms.com). Raymond thus claims that such companies work isolated from their users while positioning themselves as the authority on the matter of programming. The Cathedral model could be compared with the disciplinary society as Deleuze put forth: here the software company is the authority controlling the masses by designing software in a top-down manner, deciding what can and cannot be done with it. The Bazaar, in contrast, is made up of "differing agendas and approaches (aptly symbolized by the Linux archive sites, who’d take submissions from anyone) out of which a coherent and stable system could seemingly emerge only by a succession of miracles...[it] didn’t fly apart in confusion but seemed to go from strength to strength at a speed barely imaginable to cathedral-builders," (Raymond 3). Here, like in a Bazaar, many people of different

specialties and interests work together to produce something almost utopic. Some sell vegetables, others oversee the organization of the market, and others come to shop but buy only what they need and nothing more. Some de-bug code and others simply use bits of code for other projects. Much as some would say there is a paradigm shift in organizing control, Raymond's essay could be considered a paradigm shift in organizing work flows.

Raymond shares many lessons for bazaar-style development; one that he finds important for the process could also play an important role in democratizing urban spaces. This lesson is: "Given a large enough beta-tester and co-developer base, almost every problem will be characterized quickly and the fix obvious to someone," (Raymond 8). More simply put, many eyes see more because they collectively have more diverse experience. Raymond attributes this to the Delphi effect: "Sociologists years ago discovered that the averaged opinion of a mass of equally expert (or equally ignorant) observers is quite a bit more reliable a predictor than the opinion of a single randomly-chosen one of the observers. They called this the Delphi effect," which for software writing means that 'many eyeballs tame complexity' (Raymond 9). The Delphi effect means that it is not effective to rely on a few people to find all the mistakes in the code, but rather to take advantage of the experiences, opinions and skills of all those involved (consider the concept of thought diversity from the previous chapter) in order to gain a holistic view of the problem at hand. Additionally, the Delphi effect could also be seen as resembling Foucault's Panopticon: many eyes also means that the work an individual contributes to the code is seen by many others and can theoretically be checked for flaws. As anyone can contribute to Open Source software, this might also grant social scientists an opportunity to contribute to the work being done.
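The statistical intuition behind the Delphi effect is easy to demonstrate. The toy simulation below is my own illustration, not Raymond's: each observer estimates a true value with random error, and the averaged opinion is compared against a single randomly chosen observer.

```python
import random

def delphi_demo(true_value=100.0, n_observers=200, noise=30.0, seed=0):
    """Simulate the Delphi effect.

    Each observer's estimate is the true value plus Gaussian noise.
    Returns (error of one random observer, error of the averaged opinion).
    """
    rng = random.Random(seed)
    estimates = [true_value + rng.gauss(0, noise) for _ in range(n_observers)]
    single_error = abs(rng.choice(estimates) - true_value)
    averaged_error = abs(sum(estimates) / n_observers - true_value)
    return single_error, averaged_error
```

With 200 observers, the averaged estimate typically lands within a few units of the true value, while a lone observer is off by roughly the full noise level; in Raymond's terms, many eyeballs tame complexity.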

The manner in which the open source community organizes its work and operationalizes its position is important because it shows its interest in the idea of the commons and in freedom from what Deleuze would call a disciplinary society. As discussed, many eyes may be better than few and can catch potential faults in the software. These faults could include any of those discussed in the second chapter on unintended consequences. Next, I will review some work done on democratizing algorithms in order to explore more opportunities to mitigate the influence of algorithms on the parochialization of public space.

Democratization McKelvey and Nicholas Diakopoulos, a journalist advocating for “Algorithmic Accountability”, would

argue that a critical unpacking of the black box is necessary to democratize public space. This can be done by making people aware of how algorithms work. Making the algorithm more transparent should allow people to understand the 'decisions' made in producing recommendations and query results, as Diakopoulos might argue. Democratizing public space is about being able to access it. Having access to that space, in this thesis, means that software is not sorting individuals based on what it is programmed to think 'suits' particular places. Thus, an algorithm that has decided to label an individual as a 'metal fan' and uses this categorization to determine where this person should go would be contributing to the parochialization of public spaces. In his article, McKelvey bases his argument for democratizing algorithms on Deleuze's theory of the control society.

He also calls for opening black boxes. "A principal shortcoming of democratic theory...is that it all too frequently depends on a mythic omnipotent citizen capable of processing volumes of information daily," (McKelvey 604). What McKelvey suggests is that democracy is dependent on everyone being able to understand what is happening and to participate, rather than a select few. His point is reminiscent of Crampton's argument, from the first chapter, that knowledge is determined by power. This "principal shortcoming" mimics the way algorithms are built: a few people know how they work and many people use them. Thus, McKelvey calls for creating democratic methods for studying algorithmic media: "a democratic response to algorithmic media [which] requires translating the fleeting operations of software system routines into knowledge that is conducive with democratic debate. However, methods must be created to study algorithmic media," (McKelvey, 599). For knowledge to be conducive to democratic debate, it should be understandable and open to the people it affects.

This may be a daunting task, since "one algorithm informs another, creating a continuous modulation that Gilles Deleuze (1992) identified as distinct to contemporary power. Algorithmic media emphasize these computational routines as general characteristics—as overall processes that dynamically adjust outputs and inputs," (McKelvey 598). Algorithms rarely function alone. Individuals are being broken up into dividuals, the dividual's capta is then used to drive algorithms, and these algorithms are in turn informed by other algorithms. This, theoretically, means each search result could be entirely new each time someone searches. Or, as McKelvey states: "dividuality increases differences and fragmentation as it dissects users into a variety of profiles and data types" (McKelvey 603). This personalization is more reason to break open the black boxes, as they may divide social groups or reinforce existing social problems. McKelvey calls on an example from an app store. There, a

“troubling assumption [was made] when examining the Grindr app, an online dating app for gay men. When looking at this particular app, the Store recommended a Sex offender Search tool as another app to download,” (McKelvey 598). The danger here is that these assumptions can be damaging; in this case, the association of gay relationships with sex offenders.

Given such grand claims about the complexity of the task at hand, Kitchin and Dodge propose a way to start learning:

We can start to chart the various ways in which software makes a difference to everyday life by studying where these blips occur-how capta is worked upon and made to do work in the world in various ways; to make visible the "dynamics, structures, regimes, and drives of each little event ... to look behind the blip" (Fuller 2003, 32). Moreover, these blips are contextual and signifier of other relations-for example, a person's bank balance is an instantiation of class relations-and they themselves can be worked upon and reinterpreted by code to invent a sequence of new blip. (Kitchin and Dodge 40)

The blips referred to here are “events... some outcome or action in an assemblage that the software contributes” (Kitchin and Dodge 40). The blips could be a starting point to search for how algorithms work.

In his article, McKelvey discusses what he perceives to be successful projects which aim to democratize the algorithm. The key ingredient of each of these projects is the same lesson which Raymond put forth in the Cathedral and the Bazaar: the Delphi effect. "The projects discussed here all succeeded in enlisting publics and, importantly, in developing an adaptive response to the invisibility of algorithms by overcoming their a-semiotic and instant nature. They flourished because they combined democratic inquiry with an awareness of the qualities of algorithmic media," (McKelvey 609). The most successful projects crowdsourced at least some of the data, rather than relying on expert data. He found that experts often failed "to recognize the diverse influences of algorithmic media on everyday life, or to acknowledge that the institutional attention algorithms have garnered is due in large part to public research," (McKelvey 605). This idea of using the crowd's knowledge to expose how an algorithm works is key to democratization. "The public is a participant, not a spectator," (McKelvey 606); involving the public pushes back against algorithmic governance of dividuals as the status quo. Making the project about learning how an algorithm works could best be done, as McKelvey offers, by those it directly affects:

It is the affected public that can best explain the consequences of algorithms and why they

matter. So although finding resolutions to less desirable aspects of algorithmic media might be another step, people have an important role to play in making algorithms matter. While this process might be imperfect, it remains the most viable for algorithmic media. (McKelvey 607).

Although many contend that opening the algorithm’s black box is important to our democracy, and many seem driven to do so (as evidenced by the projects McKelvey discusses), learning how an algorithm actually works seems to be the bottleneck in the democratization process.

One such method of unpacking these black boxes is what Diakopoulos calls “Algorithmic Accountability...at its core it’s really about reverse engineering—articulating the specifications of a system through a rigorous examination drawing on domain knowledge, observation, and deduction to unearth a model of how that system works” (Diakopoulos 2013). The Wall Street Journal named “algorithmic accountability” a trend to watch in 2015 (Dwoskin). Diakopoulos proposes rigorously testing algorithms, much like learning functions in math class. Based on these data, researchers should be able to unravel what happens to the inputs in order to produce each output. “You can’t see what’s going on inside directly, but if you vary the inputs in enough different ways and pay close attention to the outputs, you can start piecing together some likeness for how the algorithm transforms each input into an output. The black box starts to divulge some secrets” (Diakopoulos 2013). In short, he calls for reverse engineering the algorithm. Although written for journalists, his proposed methods might be a good starting point for future research.
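Diakopoulos’s input-output probing can be illustrated with a small sketch. The `black_box_rank` function below is entirely hypothetical, a stand-in for an opaque recommendation algorithm; the point is the method itself: hold every input constant, vary one, and watch how the output shifts.

```python
def black_box_rank(query, profile):
    """Hypothetical stand-in for an opaque ranking algorithm."""
    score = len(query)              # some base relevance measure
    if profile.get("likes_cafes"):  # a hidden personalization rule
        score += 10
    return score

# Reverse-engineering probe: vary one input at a time, observe the output.
baseline = black_box_rank("coffee", {"likes_cafes": False})
varied = black_box_rank("coffee", {"likes_cafes": True})

# A nonzero difference suggests the profile field influences the result.
print(varied - baseline)  # -> 10
```

In practice a researcher would not know the rule inside the function; the difference between the two outputs is the only evidence that the profile field matters, which is exactly the “likeness” Diakopoulos describes.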

The Open Source community has written extensively on freedom and on how best to organize a large group of people toward a single goal, and it has inspired several approaches for analyzing algorithms. Kitchin and Dodge suggest investigating blips as a starting point. McKelvey and Diakopoulos, in turn, believe that the democratization of algorithms will happen with input from the crowd. By using crowdsourced data to bring many eyes to the problem, an informed public can work together to break open the black boxes.

One such project is OpenStreetMap (OSM). Described as the Wikipedia of maps, it is “built by a community of mappers that contribute and maintain data about roads, trails, cafes, railway stations, and much more, all over the world” (openstreetmap.org). The OSM is built following the principles discussed above. First, place is treated as a common, as one OSM supporter explains: “Place is a shared resource, and when you give all that power to a single entity, you are giving them the power not only to tell you about your location, but to shape it. In summary, there are three concerns: who decides what gets shown on the map, who decides where you are and where you should go, and personal privacy” (Warclawski). Warclawski’s summary is strikingly similar to Crampton’s motivation for critical cartography. The second consideration relies on the fact that many eyes make for a more accurate and democratic map: “OpenStreetMap is both neutral and transparent. OpenStreetMap is a wiki-like map that anyone in the world can edit. If a store is missing from the map, it can be added in, by a store owner or even a customer” (Warclawski). What appears on the OSM is not subject to the Google Maps algorithm. By this I do not mean the map an individual sees on their computer, but another one: the ‘deep map’, the master map. Unsurprisingly, an algorithm scrapes the images made by the Street View cars and uses that data to add to the deep map: “what's significant about the photographs in Street View is that Google can run algorithms that extract the traffic signs and can even paste them onto the deep map within their Atlas tool” (Madrigal). Finally, the OSM is transparent in how, and which, places are displayed; there is no black box. Users are encouraged to join the discussions on how best to represent the world.

The API is open, allowing anyone to use it as they please, which has resulted in many mobile applications (OSM wiki). As a result, many countries, such as the Netherlands, are mapped to a high level of detail. Groups are organized around improving the OSM, including the Humanitarian OpenStreetMap Team, best known for its disaster response (hotosm.org). Other organizations aim to teach people how to contribute to the OSM in an effort to expand the number of eyes editing the map. Mapping Day, for example, is an event hosted in Kampala or at universities around Uganda with the goal of teaching Ugandans how to contribute to the OSM. The events hosted at universities aim to expose new people to the OSM and hopefully inspire them to add their knowledge, as many live in villages which may not appear on Google's map. The events hosted in Kampala aim to improve existing skills so that those in attendance can map on their own (Mappingday.com).
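The openness of the data itself can be made concrete with a small sketch. OSM distributes its map as plain XML, where each node carries key/value tags; the fragment below is hand-written for illustration (the coordinates and names are invented), and a few lines of Python suffice to pull every café out of it, no black box involved.

```python
import xml.etree.ElementTree as ET

# A hand-written fragment in OSM's XML format: nodes with key/value tags.
# Coordinates and names are invented for illustration.
OSM_FRAGMENT = """
<osm version="0.6">
  <node id="1" lat="52.3702" lon="4.8952">
    <tag k="amenity" v="cafe"/>
    <tag k="name" v="Example Cafe"/>
  </node>
  <node id="2" lat="52.3710" lon="4.8960">
    <tag k="amenity" v="bench"/>
  </node>
</osm>
"""

root = ET.fromstring(OSM_FRAGMENT)
# Keep only nodes tagged as cafes.
cafes = [node for node in root.findall("node")
         if any(tag.get("k") == "amenity" and tag.get("v") == "cafe"
                for tag in node.findall("tag"))]
print(len(cafes))  # -> 1
```

That anyone can read, filter, and republish this data is precisely what distinguishes the OSM from a proprietary deep map.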

The OpenStreetMap is, perhaps, an answer to the critiques of other maps. It certainly democratizes space by allowing anyone to contribute to the map, and it is often more detailed than commercial maps: a testament that many eyes can create quality work. However, the OSM does not do so by exposing an existing algorithm; it simply provides an alternative map.

Rethinking algorithms

The majority of this chapter has been dedicated to discovering how algorithms work in order to democratize access to public space. However, “the intent here is not to demonize algorithms, but to recognize that they operate with biases like the rest of us...Any investigation must therefore consider algorithms as objects of human creation and take into account intent, including that of any group or institutional processes that may have influenced their design” (Diakopoulos). As de Waal puts forth in his book, it may be wise to work with the existing situation, as technology will not fade away and neither, perhaps, will increasing parochialization. He summarizes Hajer and Reijndorp, who argue that:

The public domain is not a neutral platform where all city dwellers despite their differences still come together as in Arendt's vision. It is exactly the opposite: different urban groups have their own network of parochial domains. An experience of the public domain occurs when we visit the domain of another group. (de Waal 131)

In other words, the public domain, in Hajer and Reijndorp’s eyes, is not any particular physical space where strangers can meet. It is the decision to visit another parochial realm in order to meet each other. Their concept of public space demands more effort from an individual: it demands that an individual leave their comfort zone. They embrace the parochial realms which emerge from a city; the paradox of more publics yet increasing monotony is broken by visiting other realms.

One way to work with increasing parochialization would be to adjust how we think about programming algorithms. Rather than sending like-minded people to meet each other, users could be guided towards new experiences: “the algorithms in the software would not be used to bring people together who already have many things in common, but to tempt users to try something new for a change” (de Waal 148). This could be called ‘programming serendipity’. Graham and Wood suggest something similar:

On the one hand, systems can be designed to socially exclude, based on automated judgments of social or economic worth; on the other hand, the same systems can be programmed to help overcome social barriers and processes of marginalization. The broad social effects and policy implications of digital surveillance are thus contingent and, while flexible, are likely to be strongly biased by the political, economic and social conditions that shape the principles embedded in their design and implementation. (Graham and Wood 229)

They propose that programmers take social barriers and geographical inequalities into consideration when writing software, in order to produce software that is more inclusive and does not exacerbate social issues. There is a very fine line between this and nudging: the manipulation of default settings with the intention of guiding the user’s actions, which some consider worse than the current situation (Morozov Click 85).

Writing software that deliberately sends people to another parochial realm would create public space from the overlap between parochial realms. The public space would become an experience rather than a physical location; consider the first chapter, where I discussed how space is created and experienced by Aboriginal Australians in dreamtime. The public space, in this case, would become a sort of dreamtime.
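A minimal sketch can make ‘programming serendipity’ concrete. Everything here is my own assumption, not de Waal’s design: the function names, the example places, and the 20% mixing ratio are invented. The idea is simply that a recommendation list reserves a fixed share of its slots for places outside the user’s profile.

```python
import random

def recommend(profile_matches, out_of_profile, k=5, serendipity=0.2, seed=1):
    """Return k places, reserving a share of slots for out-of-profile ones.

    Hypothetical sketch: `serendipity` is the fraction of recommendations
    deliberately drawn from outside the user's profile.
    """
    rng = random.Random(seed)               # seeded for reproducibility
    n_new = max(1, round(k * serendipity))  # at least one unexpected place
    picks = profile_matches[:k - n_new] + rng.sample(out_of_profile, n_new)
    rng.shuffle(picks)                      # hide which pick is 'the odd one out'
    return picks

recs = recommend(
    profile_matches=["cafe A", "cafe B", "cafe C", "cafe D", "cafe E"],
    out_of_profile=["street fair", "atelier", "weekly market"],
)
```

With `serendipity=0.2` and five slots, exactly one recommendation is drawn from outside the profile: a small, tunable nudge toward another parochial realm rather than a wholesale override of the user’s preferences.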

The technology to bring this kind of change to algorithms is already available. Walking tours of a city for tourists or a scenic stroll are easy to find. What some researchers at Yahoo! Labs did, however, is new. Using crowdsourcing, these researchers developed a way to make a walking route more enjoyable based on 'beauty', 'quiet', or 'happy' (Berko). Programming an algorithm to increase an individual's enjoyment of the city thus seems possible, with very little effort on the individual's part: “the recommended paths are only 12% longer...which correspond to roughly 7 and a half additional minutes” (Quercia et al. 4). There is much potential in incorporating experience into a walk:

Future work might go into studying temporal dynamics at different levels: time of the day (day vs. night), day of the week (e.g., Saturday vs. Monday), time of the year (e.g., different seasons). It would be also interesting to see how those dynamics change depending on, for example, weather conditions. (Quercia et al., 9)

The routes could pass through public spaces, or include a 'social', a 'solitude', or a 'discover' option. They could be dynamic, based on events happening in town (passing by a street fair, ateliers, the weekly market, or a newly opened shop). As briefly addressed earlier, nudging people to act in a particular way may be worse than the existing problems with algorithms if the users are unaware of the nudge. However, giving someone the option to select how they would like their route could lessen this problem.
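The mechanism behind such routing can be sketched as ordinary shortest-path search over a composite edge cost. The toy graph, the beauty scores, and the trade-off parameter `beta` below are all invented for illustration; Quercia et al.’s actual method is more involved, but the principle is the same: penalize distance while rewarding crowd-rated ‘beauty’.

```python
import heapq

# Toy street graph: node -> list of (neighbor, distance_km, beauty in [0, 1]).
# All values are invented for illustration.
GRAPH = {
    "A": [("B", 1.0, 0.0), ("C", 1.1, 1.0)],
    "B": [("D", 1.0, 0.0)],
    "C": [("D", 1.1, 1.0)],
    "D": [],
}

def route(start, goal, beta=0.0):
    """Dijkstra over a composite cost: distance plus a penalty for ugliness.

    beta = 0 gives the plain shortest path; a larger beta trades extra
    distance for more beautiful streets.
    """
    frontier = [(0.0, start, [start])]
    seen = set()
    while frontier:
        cost, node, path = heapq.heappop(frontier)
        if node == goal:
            return path
        if node in seen:
            continue
        seen.add(node)
        for nxt, dist, beauty in GRAPH[node]:
            edge_cost = dist + beta * (1.0 - beauty)
            heapq.heappush(frontier, (cost + edge_cost, nxt, path + [nxt]))
    return None

print(route("A", "D"))            # -> ['A', 'B', 'D'] (shortest)
print(route("A", "D", beta=0.5))  # -> ['A', 'C', 'D'] (slightly longer, beautiful)
```

The detour via C is 10% longer in distance, echoing the paper’s observation that the recommended paths need only be marginally longer; the 'social' or 'discover' options suggested above would amount to swapping in different edge scores.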

Physical design

Perhaps the changes to the public realm cannot be solved by many eyes or planned serendipity; maybe an analogue solution is relevant. Public spaces can be created through good design. There is extensive research in the area of urban design, which basically states 'if you build it, they will come'. Both Montgomery and Shaftoe acknowledge that the way the physical environment is built has an impact on how humans act, but also acknowledge that other factors are at play when discussing who uses a public space. Montgomery shares an anecdote of the transformation of public-repelling spaces in New York: from odd edges and sunken, dark areas to comfortable seating and trees. Small adjustments turned those once-empty spaces into public spaces (Montgomery 166). Shaftoe concludes his book with a list of considerations for building an ideal convivial urban public space, including: just the right size, balance, intriguing details, few or no cars, human scale, and many others (Shaftoe 139-143). The right combination of these elements, he argues, creates inviting spaces for people to stop to enjoy city life and encounter each other.

Conclusion

In this chapter I discussed the tension between urban geography and software, bringing together the ideas that I developed in the first and second chapters. Simply put, the tension between the two fields is that software developers seem not yet to have adopted the existing social science research that takes constructed space into consideration. Yet it is nearly impossible to know whether this is the case, due to the complexity and opacity of these algorithms. Software's role in polarizing urban residents could be mitigated by abandoning the historical concept that space is a box which should be filled, as well as by incorporating existing social science research into the software they build.

The second part of the chapter was dedicated to exploring some opportunities to diminish the polarizing effect of search and recommendation algorithms. Looking to the open source community's approach, I specifically addressed its use of the Delphi effect. The essay The Cathedral and the Bazaar laid the basis for explaining how many eyes may be better at solving a problem than one central authority. Then, I discussed the democratization of algorithms and how that might be done using algorithmic accountability. The OpenStreetMap and Yahoo! Labs' research were brought forth as examples of democratized space and as opportunities to address the tensions expressed at the opening of this chapter. Finally, I proposed that not everything can be solved by technology and gave a few examples of successful physical planning that has revived urban public spaces.

Conclusion

This thesis set out to answer the questions: How might algorithms contribute to the parochialization of public space? What are some opportunities to mitigate this change? In order to answer this, I first took a step back to discuss what public spaces are and what parochialization is. The first chapter was dedicated to different elements of human geography. I used de Waal and Lofland to provide the definition of parochialization, and Jacobs as well as Shaftoe and Montgomery to discuss public spaces' role in the city. Public spaces are open to anyone and have served as the place where individuals from diverse backgrounds encounter each other. These meetings, as de Waal, Jacobs, and Shaftoe agree, contribute to an individual's development as an open-minded and tolerant person. As public spaces parochialize they are increasingly used by like-minded people, so this function would be lost. De Waal attributes this parochialization to the profiles that are built up around us online (de Waal 171). These profiles are important because they essentially mold an individual into something that can be easily categorized and understood by the algorithms, also known as a dividual.

From here I took a detour to explore how space has been constructed over time and across cultures. The aim was to demonstrate that space is not simply a fixed box in which things happen. Rather, it can be defined and shaped by any number of factors, including feminist and Marxist geographies (Kitchin 269-270). The box can be constructed in different ways for various academic analyses. To further demonstrate this point, I discussed Aboriginal spaces, using Law. He explained how dreamtime is non-linear, always developing, and different for each Aboriginal person, depending on their position in society (Law 134). This concept of how space can be constructed served to prepare for the contention that algorithms can influence how public spaces change into parochial spaces, by being a factor in constructing spaces.
Having established that space can change from public to parochial, and that spaces can be constructed in many ways, I moved on to the second chapter, which focused on the software studies side of the question. Here I described how algorithms can play a role in the parochialization of public space. First, I wrote an in-depth analysis of how algorithms work before discussing their role in governance. I started with Agre's concept of capta, because I saw this as the most basic unit necessary for discovering how algorithms can change space. Capta are the small pieces of data collected about an individual. These data are then used to inform algorithms so that they can provide personalized results (Kitchin and Dodge 85). The second part was about the paradigm shift in governance, using Deleuze and Foucault: from a disciplinary society, in which people were governed based on the physical space they occupied, to the control society, in which dividuals can be governed based on their profiles (Deleuze). This shift was necessary to establish how digital media has provided the tools that make this shift in governance possible. The increased use of algorithms, in addition to improved hardware and decreasing storage and computing costs, makes it possible to incorporate algorithmic governance into many platforms. This chapter demonstrated what algorithmic governance is. This is key to answering the question left at the end of the first chapter, specifically how algorithms contribute to the parochialization of public space, by explaining how they can govern individuals in general. If algorithms can govern individuals and contribute to constructing parochial spaces, then algorithms can contribute to the parochialization of public spaces.

The third chapter focused on bringing the first two chapters together to discuss the tension between urban geography and software. The aim was to apply algorithmic governance to urban geography with examples of software-sorted geography (Graham S. 562). It seems that the tension between the fields stems from software programmers' ignorance of their impact on social situations in the 'real' world. Dourish and Bell find that space is still treated as a box in which things happen. Additionally, a detailed look at the Google Places algorithm shows that Google is, by de Waal and Lofland's definition, contributing to parochialization, because the algorithm takes an individual's friends and personal profile into consideration when producing results, thereby creating parochial realms within the city. Algorithms resemble a highway, in that they cut through the many public realms of a city to deliver the user to their parochial realm, quickly and without detour.

Finally, to answer the second question: What are some opportunities to mitigate this change? Although I opened this thesis stating that parochialization is neither good nor bad, I believe in maintaining a balance between the three realms present in a city: private, parochial, and public. So while some public spaces may parochialize, others should remain public.

I addressed some opportunities to mitigate the influence that algorithmic governance has on parochialization. Because algorithms function as a black box, opaque and complex, some call for “algorithmic accountability” to discover how algorithms work and learn what kinds of decisions they make on the user's behalf.
The methods that Open Source communities use to develop their software offer opportunities for a diverse group of people to be involved in developing software. More eyes see more solutions, but also generate a Foucauldian panopticon where one never knows who is watching. Together, these elements should offer more creative solutions from diverse perspectives while providing a social control which would, hopefully, negate the effects of proprietary software development and its reputation for stereotyped or racist algorithms. Then, I explored programmed serendipity. This calls for actively programming mixing and new experiences into the algorithms, so as to expose people to unexpected places that fall outside of their profile. However, this should be approached with caution, as it can easily become problematic: nudging people to act in a certain way is almost worse than accidentally doing so. Finally, I suggested that not all the answers lie in digital media. Physical urban planning can still have a noticeable impact on people's behavior by simply enticing people to be there and people-watch.

Further research could include a closer look at how popular algorithms work. A longitudinal study following Google users would be interesting, to see how the diversity of the places an individual visits changes with increased use of the software. Would those individuals visit decreasingly diverse places as a result of having a particular profile? Would they notice? Ideally, this would be done in partnership with the company that wrote the program. This way a researcher would have access to the data and expertise necessary to understand how the algorithm works.

Additionally, algorithmic governance could be considered a new phase in the construction of space for geographers to take into consideration. Much like the phases of geography presented in the first chapter, perhaps geographers can discuss space in a new way. They could take into consideration the construction of space as algorithms form it when researching new urban geographical situations, starting with: How is (this space) influenced by (this algorithm)?

I have demonstrated how algorithms, specifically those which produce recommendations, can contribute to the parochialization of public space. Computing will become increasingly ubiquitous and more people will be living in cities than ever before, so the tolerance and open-mindedness that develops from encountering strangers remains important. Such research is relevant for urban geographers as well as software developers, to stay in touch with how technology can impact the urban dweller's experience of the city.

Bibliography

Agre, Philip. "Surveillance and Capture: Two Models of Privacy." The NewMediaReader. Ed. Noah Wardrip-Fruin and Nick Montfort. Cambridge, MA: MIT, 2003. 740-60. Print.

"Algorithms – Inside Search – Google." Google, n.d. Web. 02 July 2015.

"Applications." OpenStreetMap Wiki. N.p., n.d. Web. 01 July 2015.

Atkinson, R., and J. McGarrigle. "Segregation, Urban." Ed. N. J. Thrift and Rob Kitchin. N.p., 2 July 2009. Web. 6 May 2015.

Avalos, George. "Poll: California's Digital Divide Narrowing Slightly." San Jose Mercury News. N.p., 16 June 2015. Web. 16 June 2015.

Badger, Emily. "Computers Can Now Automatically Stereotype 'Hipsters' and 'Bikers'." City Lab. The Atlantic, 12 Dec. 2013. Web. 10 June 2015.

Badger, Emily. "Enough Already With the Avoid-The-Ghetto Apps." City Lab. The Atlantic, 4 Sept. 2013. Web. 10 June 2015.

Baer, Drake. "Google Has An Embarrassing Diversity Problem." Business Insider. Business Insider, Inc., 29 May 2014. Web. 21 May 2015.

Barreneche, Carlos. "Governing the Geocoded World: Environmentality and the Politics of Location Platforms." Convergence: The International Journal of Research into New Media Technologies 18.3 (2012): 331-351.

Berko, Lex. "What If You Could Choose Between the Fastest Route and the Most Beautiful?" City Lab. The Atlantic, 17 July 2014. Web. 28 May 2015.

Best, Kirsty. "Living in the Control Society: Surveillance, Users and Digital Screen Technologies." International Journal of Cultural Studies 13.1 (2010): 5-24.

"Beta Software." Tech Terms. N.p., 5 Apr. 2013. Web. 29 May 2015.

Boer, Rene, Michiel Van Iersel, and Mark Minkjan. "FA Workshop: Amsterdam's Club Trouw and Wibautstraat - Failed Architecture." Failed Architecture. N.p., 10 July 2014. Web. 02 July 2015.

Bucher, Taina. "Want to Be on the Top? Algorithmic Power and the Threat of Invisibility on Facebook." New Media & Society 14.7 (2012): 1164-180. Web.

Burrows, R., and N. Gane. "Geodemographics, Software and Class." Sociology 40.5 (2006): 739-812.
Carmona, Matthew, Claudio De Magalhaes, and Leo Hammond. Public Space: The Management Dimension. London: Routledge, 2008. Print.

Carr, Nicholas. "Your Inner Drone: The Politics of the Automated Future." The Glass Cage: Automation and Us. N.p.: W.W. Norton, 2014. N. pag. Longreads Blog. 30 Sept. 2014. Web. 24 June 2015.

"Chapter Six: Less Crime, More Punishment." To Save Everything, Click Here. Saphari. Web. 01 July 2015.

Chatelain, Phillipe. "The NYC That Never Was: Robert Moses' Lower Manhattan Expressway (LOMEX)." Untapped Cities RSS. N.p., 11 Sept. 2013. Web. 08 July 2015.

Chen, Brian. "HP Investigates Claims of 'Racist' Computers." Wired.com. Conde Nast Digital, 22 Dec. 2009. Web. 07 July 2015.

Christiaanse, K. "Een smeulend vuur dat oplicht in de duisternis." De levende stad: Over de hedendaagse betekenis van Jane Jacobs. First edition. Ed. Simon Frank and Geert-Jan Hospers. Amsterdam: SUN, 2009. 21-30. Print.

Chun, Wendy Hui Kyong. "On Software, or the Persistence of Visual Knowledge." Grey Room 18 (2004): 26-51.

"Company Info." Facebook Newsroom. N.p., n.d. Web. 24 June 2015.

Cook, James. "Germany Just Asked Google To Do The Impossible: Reveal Its Secret Search Algorithm." Business Insider. N.p., 16 Sept. 2014. Web. 03 June 2015.

Crampton, Jeremy W. Mapping: A Critical Introduction to Cartography and GIS. Critical Introductions to Geography. Chichester, U.K.; Malden, MA: Wiley-Blackwell, 2010. Print.

Cresswell, T. "Place." Ed. N. J. Thrift and Rob Kitchin. N.p., 2 July 2009. Web. 15 Apr. 2015.

de Waal, Martijn. "De Stad is (g)een algorithme." Ruimtevolk, 19 Feb. 2015. Web. 4 Mar. 2015.

de Waal, Martijn. The City as Interface. Rotterdam: nai010, 2014. Print.

Deleuze, Gilles. "Postscript on the Societies of Control." October 59 (1992): 73-77.

Diakopoulos, Nicholas. "Algorithmic Accountability: On the Investigation of Black Boxes." Tow Center. Columbia Journalism School, 3 Dec. 2014. Web. 28 May 2015.

Diakopoulos, Nicholas. "Rage Against the Algorithms." The Atlantic. Atlantic Media Company, 03 Oct. 2013. Web. 24 May 2015.

Dourish, Paul, and Genevieve Bell. Divining a Digital Future: Mess and Mythology in Ubiquitous Computing. Boston: MIT Press, 2011. 95-116.

Dwoskin, Elizabeth. "Trends to Watch in 2015: From Algorithmic Accountability to the Uber of X." Wall Street Journal. N.p., 8 Dec. 2014. Web. 29 May 2015.

Elden, S. "Space 1." Ed. N. J. Thrift and Rob Kitchin. N.p., 2 July 2009. Web.

"Feiten en Cijfers." Iamsterdam. Amsterdam Marketing, 5 Mar. 2014.

Foucault, Michel, and Alan Sheridan. Discipline and Punish: The Birth of the Prison. 2nd ed. New York: Vintage, 1995. Print.

Foucault, Michel. "The Subject and Power." Critical Inquiry 8.4 (1982): 777-795.

Frank, Simon, and Geert-Jan Hospers. "De actualiteit van Jane Jacobs." De levende stad: Over de hedendaagse betekenis van Jane Jacobs. First edition. Ed. Simon Frank and Geert-Jan Hospers. Amsterdam: SUN, 2009. 7-20. Print.

Fuller, Matthew, and Usman Haque. Urban Versioning System 1.0. New York: Architectural League of New York, 2008. Print.

Fuller, Matthew. "Introduction: The Stuff of Software." Software Studies: A Lexicon. Cambridge, MA: MIT Press, 2008. 1-14.

Galloway, Alexander R. "Computers and the Superfold." Deleuze Studies 6.4 (2012): 513-28. Web.

Glaeser, Edward L. Triumph of the City: How Our Greatest Invention Makes Us Richer, Smarter, Greener, Healthier, and Happier. New York: Penguin, 2011. Print.

Goodyear, Sarah. "A New Way of Understanding 'Eyes on the Street'." City Lab. Atlantic Media Company, 22 July 2013. Web. 02 July 2015.

Google Inc. "Global Mobile Research on the Smartphone User and the Mobile Marketer from the MMA and Google." Google Mobile Ads Blog, 16 June 2011. Available at: http://googlemobileads.blogspot.com/2011/06/global-mobile-research-on-smartphone.html (accessed March 2015).

Google Inc. "Discover More Places You'll Like Based on People Who're Like You." 2011. Available at: http://places.blogspot.com/2011/05/discover-more-places-youll-like-based.html (accessed March 2015).

Google Inc. Entity Display Priority in a Distributed Geographic Information System. US Patent 7933897. 2011.

"Statistics." Internet Live Stats. N.p., n.d. Web. 21 May 2015.

"Governance." Oxford Dictionaries. N.p., n.d. Web. 23 June 2015.

Graham, S. "Software-Sorted Geographies." Progress in Human Geography 29.5 (2005): 562-580.

Graham, S., and D. Wood. "Digitizing Surveillance: Categorization, Space, Inequality." Critical Social Policy 23.2 (2003): 227-248.

Hafiz, Yasmine. "Google Suggest Reveals The Internet's Offensive Religious Stereotypes." The Huffington Post. TheHuffingtonPost.com, 22 July 2014. Web. 02 July 2015.

Hajer, Maarten A., and Arnold Reijndorp. In Search of New Public Domain: Analysis and Strategy. Rotterdam: NAi, 2001. Print.

Hallinan, B., and T. Striphas. "Recommended for You: The Netflix Prize and the Production of Algorithmic Culture." New Media & Society (2014): 1461444814538646. Web.

Hardin, Garrett. "The Tragedy of the Commons." Science 162.3859 (1968): 1243-1248.

"Humanitarian OpenStreetMap Team." Humanitarian OpenStreetMap Team. N.p., n.d. Web. 29 May 2015.

Huxley, Aldous. Brave New World. 1932. Great Britain: Vintage, 2007. Print.

"Information Geographies." Oxford Internet Institute, n.d. Web. 02 July 2015.
Jacobs, Jane. The Death and Life of Great American Cities. New York: Random House, 1961. Print.

Kitchin, R. "Space 2." Ed. N. J. Thrift and Rob Kitchin. N.p., 2 July 2009. Web.

Kitchin, Rob, and Martin Dodge. Code/Space: Software and Everyday Life. Software Studies. Cambridge, MA: MIT Press, 2011. Print.

Langston, Jennifer. "Who's a CEO? Google Image Results Can Shift Gender Biases." UW Today. University of Washington, 9 Apr. 2015. Web. 02 July 2015.

"Laurenskwartier - Wederopbouw Rotterdam." Wederopbouw Rotterdam. N.p., n.d. Web. 02 July 2015.

Law, John. After Method: Mess in Social Science Research. London: Routledge, 2004. Print.

Lemke, Thomas. "Foucault, Governmentality, and Critique." Rethinking Marxism 14.3 (2002): 49-64.

Lessig, Lawrence. "Open Code and Open Societies." Perspectives on Free and Open Source Software. Ed. Joseph Feller, Brian Fitzgerald, Scott A. Hissam, and Karim R. Lakhani. Cambridge, MA: MIT Press, 2005. 349-360.

"Local—now with a Dash of Zagat and a Sprinkle of Google+." Google Blog. Google, 29 July 2013. Web. 20 May 2015.

Lofland, L. The Public Realm: Exploring the City's Quintessential Social Territory. New York: Walter de Gruyter, 1998. Print.

Madrigal, Alexis C. "How Google Builds Its Maps—and What It Means for the Future of Everything." The Atlantic. Atlantic Media Company, 06 Sept. 2012. Web. 29 May 2015.

"Mapping Day." Mapping Day. N.p., n.d. Web. 28 May 2015.

Matyszczyk, Chris. "The Joy of Microsoft's 'Avoid Ghetto' GPS Patent." CNET. N.p., 7 Jan. 2012. Web. 10 June 2015.

McKelvey, Fenwick. "Algorithmic Media Need Algorithmic Methods: Why Publics Matter." Canadian Journal of Communication 39.4 (2014). Web. 25 May 2015.

Montgomery, Charles. Happy City: Transforming Our Lives Through Urban Design. New York: Farrar, Straus and Giroux, 2013. Print.

"Moses Jacobs Opera." Moses Jacobs Opera. N.p., n.d. Web. 15 June 2015.

Murtagh, Rebecca. "Mobile Now Exceeds PC: The Biggest Shift Since the Internet Began." Search Engine Watch. N.p., 8 July 2014. Web. 15 June 2015.

New York: A Documentary Film. Dir. Ric Burns. PBS, 1999. YouTube. Web.

"OpenStreetMap." OpenStreetMap. N.p., n.d. Web. 29 May 2015.

Orwell, George. 1984. 1949. United States: Signet Classics, 1977. Print.

Patel, Neil. "Everything You Need To Know About Google's Local Algorithm, Pigeon." Search Engine Land. Search Engine Land, 09 Jan. 2015. Web. 14 Mar. 2015.

Perez, J. "GM, Ford, And Others Want to Make Working on Your Own Car Illegal." Yahoo Autos. Yahoo, 22 Apr. 2015. Web. 24 Apr. 2015.

"Personalized Search for Everyone." Google Blog. Google, 4 Dec. 2009. Web. 20 May 2015.

Pulliam-Moore, Charles. "Racist 'Glitch' Identifies Black People as Gorillas." Fusion. N.p., 1 July 2015. Web. 07 July 2015.

Quercia, Daniele, Rossano Schifanella, and Luca Maria Aiello. "The Shortest Path to Happiness: Recommending Beautiful, Quiet, and Happy Routes in the City." (2014). Web.

Raymond, Eric. The Cathedral and the Bazaar. 2000.

Rieder, B., and G. Sire. "Conflicts of Interest and Incentives to Bias: A Microeconomic Critique of Google's Tangled Position on the Web." New Media & Society 16.2: 195-211. Web.

Rogers, Richard. "Mapping and the Politics of Web Space." Theory, Culture & Society 29.4-5 (2012): 193-219.

Rogers, Richard. Digital Methods. Cambridge, MA: MIT Press, 2013.

Rushkoff, D. The Virtual Revolution: The Cost of Free. BBC, 2010. http://www.youtube.com/watch?v=3l6W3baAPMY. Viewed 8 June 2015.

Shaftoe, Henry. Convivial Urban Spaces: Creating Effective Public Places. London: Earthscan in Association with the International Institute for Environment and Development, 2008. Print.

Shotland, Andrew. "Is Your Local Business Ready for Google's Neighborhood Algorithm?" Search Engine Land. Search Engine Land, 19 June 2014. Web. 20 May 2015.

Silver, Joe. "Is Your Turn-By-Turn Navigation Application Racist?" American Civil Liberties Union. N.p., 2 Oct. 2013. Web. 28 May 2015.

Simonite, Tom. "Study Suggests Google's Ad-Targeting System May Discriminate." MIT Technology Review. N.p., 06 July 2015. Web. 08 July 2015.

"What Is Algorithm? - Definition from WhatIs.com." WhatIs.com. Web. 03 June 2015.

"What Is Free Software?" Free Software Foundation, n.d. Web. 01 July 2015.