UNIVERSITEIT GENT

FACULTY OF ECONOMICS AND BUSINESS ADMINISTRATION

ACADEMIC YEAR 2014 – 2015

TOWARDS A DEEPER UNDERSTANDING OF FACEBOOK RESISTANCE

Master's thesis submitted to obtain the degree of

Master of Science in Business Economics

Petra Truant

supervised by

Dr. Katrien Verleye and Dr. Simon Quaschning


Confidentiality clause

PERMISSION

The undersigned declares that the contents of this Master's thesis may be consulted and/or reproduced, provided the source is acknowledged.

Name of student: ......

Summary

This Master's thesis focuses on Facebook resistance, a highly topical phenomenon that has received hardly any attention in the academic literature. Van Dijck's classification of social media along a techno-cultural dimension (users and usage, content, technology) and a socio-economic dimension (policy, ownership, business model) seems well suited to frame the topic in its proper context. Across these six levels, van Dijck's concepts of 'friending' and 'sharing' appear repeatedly. These concepts underwent a shift from 'connectedness' to 'connectivity', which implies a far-reaching commercialization and commodification of user information. This has profound consequences for the social privacy (privacy towards other people) and the institutional privacy (privacy towards Facebook) of Facebook users. Several critiques of Facebook's privacy policy are discussed. Social and institutional privacy appear very important here; moreover, they run through all six dimensions of Facebook.

Despite the many critiques concerning Facebook and privacy, people keep using Facebook. Motives for Facebook use and continued use are discussed; attitude proves very important here. Yet there appear to be people who want to resist Facebook. Various forms of Facebook resistance are discussed along an external and internal dimension and along a passive and active dimension. In addition, several motives for Facebook resistance are presented.

The research in this Master's thesis is both quantitative and qualitative and focuses on the motives for and the various forms and types of Facebook resistance. In addition, the research attempts to provide a typology of Facebook users and non-users along a dimension of attitude towards Facebook (represented by the scale Facebook love) and a dimension of Facebook resistance (represented by the scale Facebook resistance). Numerous forms of Facebook resistance appear to exist, which can be divided into three types: general Facebook resistance, social Facebook resistance and institutional Facebook resistance. The latter two are linked to social and institutional privacy respectively. Motives for Facebook resistance are also classified along these dimensions, and the terms general Facebook resistance, social Facebook resistance and institutional Facebook resistance are given definitions. Among the respondents, five groups can be distinguished along the scales Facebook love and Facebook resistance. They are named as follows: Facebook indifferent resisters, Facebook lovers, Anti-Facebook resisters, Alter-Facebookers and Ambiguous Facebook resisters. Several conclusions are drawn and discussed. Implications for Facebook and societal implications are considered. Limitations and recommendations for further research conclude this Master's thesis.

Preface

I want to thank everyone who supported the realization of this Master's thesis. My thanks especially go to:

Dr. Katrien Verleye, Dr. Simon Quaschning and Mr. Arne De Keyser: for your professional support
Prof. Dr. José van Dijck, Marc Stumpel and Tobias Leingruber: for your time and consideration
Prof. Dr. Pieter Verdegem: for your support
Dr. Brecht Wyseur: for your inspiration
My parents: for your emotional support
Frederik: for reviewing spelling

Table of Contents

Abstract
Introduction
    Problem Statement
Context
    Facebook's six dimensions
        Facebook's techno-cultural dimension
        Facebook's socio-economical dimension
    Facebook and privacy
        Facebook and social privacy
        Facebook and institutional privacy
    Motives for (continued) Facebook use
    Facebook resistance
        Forms of Facebook resistance
        Motives for Facebook resistance
    Conclusion
Research methodology
    Test sample
    Instrument development
Data analysis and results
Conclusion and discussion
    Towards a definition
    Effects of Facebook resistance
    How does Facebook react on Facebook resistance?
    Implications of this research
    Limitations and future research
References

Appendix 1
Appendix 2
Appendix 3
Appendix 4
Appendix A: Surveying Facebook resistance
Appendix B: Descriptives reducing Facebook's influence (Q5), resisting Facebook's influence
Appendix C: Reliability analysis: scale attitude towards Facebook (Q1, Q2), scale Facebook resistance (Q5, Q6)
Appendix D: Frequencies scale Facebook resistance (Q5, Q6)
Appendix E: Correlation between scale Facebook love and Facebook resistance
Appendix F: Clustering respondents
Appendix G: Cross tabulation clusters – Facebook account
Appendix H: Cross tabulation clusters – gender
Appendix I: Cross tabulation clusters – age
Appendix J: Cross tabulation scale Facebook resistance – adjust privacy settings
Appendix K: Cross tabulation scale Facebook resistance – adjust privacy settings in order to resist Facebook/reduce Facebook's influence
Appendix L: Cross tabulation scale Facebook resistance – adjust use
Appendix M: Cross tabulation scales – Facebook account
Appendix N: Correlation scales – frequency Facebook use
Appendix O: Correlations Facebook love – frequency of actions
Appendix P: Presence of social and institutional privacy and of GFBR, SFBR, IFBR and privacy FBR
Appendix Q: Cross tabulation clusters – GFBR, SFBR & IFBR
Appendix R: Cross tabulation presence of social privacy – SFBR, adjust privacy – SFBR
Appendix S: Cross tabulation SFBR – IFBR
Appendix T: Facebook account – broad GFBR

Abbreviations

FBR Facebook resistance
Q Question
GFBR General Facebook resistance
SFBR Social Facebook resistance
IFBR Institutional Facebook resistance

Tables & Figures

Table 1: Internal drivers for the adoption and use of Facebook

Table 2: External drivers for the adoption and use of Facebook

Table 3: Internal drivers for the continued use of Facebook

Table 4: Correlation Facebook love and Facebook actions

Table 5: Cluster description along dimensions Facebook love and Facebook resistance

Table 6: Drivers of Facebook resistance

Table 7: Forms of general Facebook resistance, social Facebook resistance and institutional Facebook resistance

Table 8: Appearance of GFBR, SFBR and IFBR in relation to clustering along the Facebook love and Facebook resistance dimension

Table 9: Naming cluster groups based on scale Facebook love, scale Facebook resistance and Account

Figure 1: Facebook Purity (FBP) and Social Fixer (screw wrench) buttons incorporated in the upper Facebook toolbar

Figure 2: Facebook Purity options popup screen

Figure 3: Social Fixer options popup screen

Figure 4: Types and number of Facebook users and non-users divided along the Facebook love and Facebook resistance dimension

Figure 5: Concentric model: three types of FBR forms

Abstract

This research paper focuses on the concept of Facebook resistance, a phenomenon that is highly topical but has hardly received any academic attention. The research covers quantitative and qualitative aspects of Facebook resistance. It offers a review of Facebook resistance forms and of Facebook resistance drivers. Moreover, different types of Facebook resistance are distinguished. A typology of Facebook users and non-users is proposed along two dimensions: Facebook resistance and attitude towards Facebook. Five groups are found: Facebook indifferent resisters, Anti-Facebook resisters, Alter-Facebookers, Facebook ambiguous resisters and Facebook lovers. Of these five groups, three appear to be resisters. Three types of Facebook resistance can be distinguished: general Facebook resistance, social Facebook resistance and institutional Facebook resistance. A definition for each of the three types is proposed. Qualitative results reveal new forms of Facebook resistance, and several Facebook resistance drivers are found along the same dimensions. With reference to the literature section, societal implications and implications for Facebook are discussed. Research limitations and propositions for further research are stated.


Introduction

Problem Statement

In the third quarter of 2014, Facebook counted 1.3 billion monthly active users and 864 million daily active users on average (Statista, 2014 1; Facebook Newsroom, 2014 2). This shows that after ten years of existence, Facebook is still very popular. Nevertheless, Facebook has never stopped being the subject of harsh criticism. Several authors have criticized its privacy issues (e.g. Jones & Soltren, 2005; Roosendaal, 2011; Wondracek et al., 2010; Liu et al., 2011; Madejski et al., 2011; Debatin et al., 2009; van Dijck, 2013) and its general approach and policy (van Dijck, 2013). As a consequence of complaints about these topics, Facebook has encountered many forms of protest (e.g. Stumpel, 2010; Stumpel, 2013; van Dijck, 2013; Fernback, 2013; Debatin et al., 2009).

Facebook enjoys a lot of attention in academic research. For instance, research has looked at Facebook's privacy issues and users' handling of their own information (e.g. Madejski et al., 2011; Debatin et al., 2009; Kwan & Skoric, 2013; Nosko et al., 2010). Moreover, studies have investigated the motives for using Facebook (e.g. Ross et al., 2009; Carpenter, 2011) and for continued Facebook use (e.g. Kourouthanassis et al., 2015; Gwebu et al., 2014; Yang & Lin, 2014).

Research into Facebook resistance (FBR), however, is still rare. Although some descriptive analyses on the topic are available (Boyd, 2008; van Dijck, 2013; Stumpel, 2010; Stumpel, 2013; Truant, 2013), only little is known about this phenomenon and why it occurs (Boyd, 2008; van Dijck, 2013; Truant, 2013). Nevertheless, the topic is highly topical, counting many recent protests and actions against Facebook. Recent examples are 'A day without Facebook' 3 and Max Schrems' initiative to institute legal proceedings against Facebook ("Lawyer suing Facebook overwhelmed with support", 2014, August 6th).

We address this gap in the literature by investigating different types of FBR, distinguishing types of Facebook resisters and identifying FBR drivers. To what degree is FBR to be found among Facebook users and non-users? How can we describe Facebook resisters? What are their motives? Which strategies and techniques do they use? Which types of FBR can be identified? Moreover, we discuss the implications of this research for Facebook, other companies and society.

1 Consulted on November 25, 2014 on the World Wide Web: http://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/
2 Consulted on November 25, 2014 on the World Wide Web: http://newsroom.fb.com/company-info/
3 Consulted on January 7 on the World Wide Web: https://www.facebook.com/events/1556867964555260/?fref=ts

Context

In a context of globalized interconnectedness and independence of time and space by means of networks and technology (Castells, 1996; Rantanen, 2005), people experience a large sense of interaction and interdependency (Thompson, 1995). The network society influences our personal and social lives significantly (Castells, 1996; Bell, 1973; van Dijk, 1999; Rainie & Wellman, 2012): a new society in which information plays a vital role (Castells, 1996; van Dijk, 1999; Bell, 1973; Rainie & Wellman, 2012). Castells speaks of a new economy, which he calls 'informational capitalism'. Social media such as Facebook combine these elements with corporate surveillance (van Dijk, 2012; Castells, 2009; van Dijck, 2013a, 2013b; Kaplan & Haenlein, 2010; Fuchs, 2011). Facebook is one of the best-known social network sites: with over 1.3 billion users, almost one in five people have an account they use at least monthly 4.

In order to frame the topic of FBR, the following chapters discuss van Dijck's (2013) techno-cultural and socio-economical dimensions in relation to Facebook, and the concepts of social and institutional privacy (Raynes-Goldie, 2010). Although privacy is a large source of dissatisfaction, people still use and continue to use the site; another chapter focuses on these motives. Nonetheless, many people seem to resist Facebook. The concept of FBR is discussed and motives are suggested. In the following part, we report on research conducted into FBR: types of Facebook resisters are proposed and analyzed, and FBR drivers are reviewed. Finally, conclusions and remarks are discussed.

Facebook's six dimensions

In order to look at Facebook from a more theoretical point of view, van Dijck (2013a) proposes a classification of social media along a techno-cultural and a socio-economic dimension. Three elements are considered in each category, all applicable to Facebook: the former entails technology, users and their use, and content; the latter entails ownership, policy and business models.

Facebook's techno-cultural dimension

Users and usage

The large number of users, 1.3 billion people, is key to Facebook's success: 'Facebook's main development characteristics are speed and growth' (Feitelson et al., 2013). The critical mass has been reached: more and more people create a profile and people experience pressure to do so (Chiu et al., 2008; Cheung et al., 2011; van Dijck, 2013a; Chang & Liu, 2011; Chen, Yen & Hwang, 2012). It is therefore hard to step out of the system, which is moreover technically difficult to do (van Dijck, 2013a).

4 Consulted on January 7th, 2015 on the World Wide Web: http://www.statista.com/statistics/264810/number-of-monthly-active-facebook-users-worldwide/ and http://www.statisticbrain.com/facebook-statistics/

Facebook's core message is openness and connectedness (van Dijck, 2013a; Data Policy, last updated on January 30th, 2015), incorporated in the terms 'sharing' and 'friending'. However, these terms have experienced a shift in meaning, from a narrow understanding as 'connectedness' towards a broader understanding as 'connectivity' (van Dijck, 2013a; 2013b). Connectivity implies a broader meaning of both 'friends' and 'sharing': few Facebook 'friends' are actual real friends (Boyd, 2008; Hull et al., 2011; Brooks et al., 2014), and information is shared with Facebook and third parties instead of with Facebook friends only (van Dijck, 2013a).

Content

The second aspect of Facebook's techno-cultural dimension is content: 'anything you or other users post, provide or share using Facebook Services' (Statement of Rights and Responsibilities, last updated on January 30, 2015). Content also experienced a shift in meaning, from connectedness (narratives, diversity) towards connectivity (databases, standardization and uniformity) (van Dijck, 2013a; van Dijck, 2013b).

Technology

Technology is the last aspect of Facebook's techno-cultural dimension. 'Coding technologies' are key to recoding social activities into usable data. Five elements are important here: the 'default' status, protocols, interfaces, algorithms, and data and metadata (van Dijck, 2013a).

Every action conducted on Facebook forms a source of data and metadata for the company (e.g. Stumpel, 2010; van Dijck, 2013a). The 'default' status (the standard setting for the sharing of users' information), Facebook's interface, algorithms and protocols are all adjusted to Facebook's core message and to both connectedness and connectivity (van Dijck, 2013a), encouraging users to openly share personal information, desires and feelings (van Dijck, 2013a; 2013b; Debatin et al., 2009; Raynes-Goldie, 2010). Facebook's interface is standardized and built up chronologically, making it easier to apply algorithms that determine what Facebook users see. Facebook's rules are implemented in its protocols, which users have to meet in order to use Facebook.

Facebook's socio-economical dimension

Ownership

Facebook promotes itself as the social network experience. It employs different strategies to maintain this dominant position. One method is acquiring other successful upcoming players, such as WhatsApp (van Dijck, 2013a), or forming ongoing strategic partnerships in niche markets, such as with Skype and Spotify (van Dijck, 2013a). On the one hand, Facebook wants to reach the status of an entertainment platform; on the other hand, it wants to keep as many users as long as possible active on its platform (Shearman, 2011, September 22nd; Halliday, 2011, September 22nd).


Facebook's presence on the Nasdaq stock exchange is an important driver for encouraging users to stay active, providing Facebook with more information. The IPO creates pressure to maintain profits, but at the same time Facebook has to please its users, who become more and more aware of Facebook's commercial aspects (van Dijck, 2013a). This makes the conflicting relationship between users, advertisers and shareholders clear (van Dijck, 2013a, 2013b).

Policy

Technical and social protocols imply implicit and explicit rules that regulate privacy, acceptable behavior, intellectual rights, and the rights of Facebook to use and sell data and metadata (van Dijck, 2013a; Data Policy, last updated January 30th, 2015; Statement of Rights and Responsibilities, last updated January 30th, 2015). They are integrated in Facebook's terms and conditions of use, which are changed regularly. These terms have no legislative authority, creating a grey area in which Facebook decides one-sidedly what is allowed and what is not. Facebook, however, formulates the terms and conditions of use as being in the sole interest of the users (van Dijck, 2013a).

Mark Zuckerberg, founder and CEO of Facebook, considers changing social norms an external force, claiming that privacy is the changing norm, not 'sharing' and 'friending'. Facebook operates in a 'perpetual development' framework, entailing small incremental changes based on what users like, without a predefined objective during the process of development (Feitelson et al., 2013). Van Dijck does not agree: Facebook itself steers social norms by enhancing the contextual evolution of 'sharing' and 'friending' towards connectivity, presenting it as richer social experiences (van Dijck, 2013a). Christofides et al. (2009) also disagree: privacy is changing, but it is Facebook itself that changes its nature.

Business model

The third and last element of Facebook's socio-economic dimension is its business model. Three values are key to it: connectivity, attention and popularity. Algorithms are specialized in detecting and promoting these values: advertisers seek the attention of Facebook users, and popular subjects potentially influence other users (van Dijck, 2013a).

Facebook's business model consists of old and new formats. Old formats, however, are presented in new forms: advertisements are now personalized (van Dijck, 2013a) and can be quite effective (Taylor et al., 2011; Fuchs, 2011). The possibilities to personalize advertisements for a certain target audience are nearly endless 5. All the necessary information is extracted from the users' content (van Dijck, 2013a) (see infra). Advertisements also take new forms: Page Post Ads, Promoted Posts (Darwell, 2013) and Sponsored Stories (Interacting with ads, n.d. 6) would be even more effective than normal personalized advertisements (Constine, 2011). Also new is the selling of data and metadata to third parties 7. New formats have found their role in Facebook's business model: selling pages to brands, or paid services within applications or games on Facebook (van Dijck, 2013a).

5 Consulted on March 10, 2015 on https://www.facebook.com/ads/create?campaign_id=357409530988309&placement=header

Considering the above, the logic of connectivity, with its aspects of sharing and friending, is to be found in all six aspects van Dijck considers. Taking this into account, and regarding van Dijck's and Fuchs' statements, one can agree that Facebook's main goal is to make money by commercializing user data and other activities. Facebook's IPO in 2012 could be seen as an open confirmation of this idea. However, Facebook has to please its users, otherwise it risks losing them. Given the high number of users, Facebook definitely offers its users advantages. In this context Feitelson et al. (2013) are right in claiming that Facebook introduces features to users' liking, but their claim remains subjective, considering that two of the three authors are employed by Facebook.

Facebook and privacy

Facebook's privacy policy is integrated on different levels: the Statement of Rights and Responsibilities and the Data Policy (Johnston & Wilson, 2012; van Dijck, 2013a; Statement of Rights and Responsibilities, last updated on January 30, 2015; Data Policy, last updated on January 30, 2015). It has been criticized by many authors for being long and complex (Jones & Soltren, 2005; Turow, 2008; Johnston & Wilson, 2012; Asif & Khan, 2012; van Dijck, 2013a), as well as unclear and contradictory (Jones & Soltren, 2005; van Dijck, 2013a). Even if people read the Facebook privacy policy, their knowledge of it remains poor (Jones & Soltren, 2005; Govani & Pashley, 2005; Debatin et al., 2009; Madejski et al., 2011; Asif & Khan, 2012; van Dijck, 2013a).

By changing and adding specific features, Facebook gives users the impression that it cares about their privacy and is taking steps towards better privacy protection. However, Facebook only changes its privacy policy under pressure or for marketing reasons (Soghoian, 2008, March 19; Wray, 2009, August 27). With the changes of January 30th, 2015, Facebook altered the way it leads users towards the privacy settings. Claiming to help people, it created a privacy shortcut. However, this shortcut only contains partial possibilities to adapt privacy settings. Most likely Facebook does this because it hopes people will feel safe after having adjusted only these settings.

6 The definition of Sponsored Stories is no longer retrievable from the Facebook help pages, as it was before on Interacting with ads (n.d.). Consulted on May 22nd, 2013 on the World Wide Web: https://www.facebook.com/help/499864970040521
7 The Big Data business of all companies associated with European citizens' personal data alone is presumed to be worth one trillion euro by 2020 (European Commission, 2015, April 15).


Surma (2013) indeed points out that users are likely to put more information and updates on their profiles if they make use of the privacy controls Facebook offers (for more information, see appendix 1).

For several years, Facebook has been criticized for endangering its users' privacy. In order to give true meaning to this debate, it is necessary to consider two approaches to privacy: 'social privacy' and 'institutional privacy'. The first entails privacy of one Facebook user towards another, or towards all other people surfing the Internet. The latter entails Facebook users' privacy towards the Facebook company and its partners (Raynes-Goldie, 2010). Other authors share the same ideas, though without using these explicit terms (Jones & Soltren, 2005; Debatin et al., 2009; Fuchs, 2011; Young & Quan-Haase, 2013; van Dijck, 2013a, 2013b).

Jones and Soltren (2005) distinguish three sorts of issues endangering Facebook users' privacy. Firstly, people give away too much information about themselves, with possible negative consequences for both social and institutional privacy. Secondly, Facebook neglects its promise to protect its users' privacy, since the privacy settings are solely intended to protect information from other people. The third and last point of criticism is Facebook's profound data mining (the gathering of users' data and metadata) and putting this at the disposal of companies that use it for advertising purposes (see infra).

Facebook and social privacy

Based on Jones and Soltren's distinction of privacy dangers, excessive self-disclosure is one important aspect endangering social privacy. Several authors focus on this topic on Facebook and its possible harmful effects (e.g. Messmer, 2007; Grimmelman, 2009; Nosko, Wood & Molema, 2010; Kwan & Skoric, 2013). Since Facebook requires its users to use their authentic name (see infra), this could make users behave more responsibly, but it can also have negative effects on their personal lives (Willaerts, 2013). The public posting of location and identity details and the publishing of political preference, religion and sexual orientation all entail possible dangers (Nosko et al., 2010). Other possible harmful effects are forms of online aggression (Kwan & Skoric, 2013; Walraeve, 2013; Dredge et al., 2014): Facebook bullying (Walraeve, 2013; Dredge et al., 2014), threats, stalking, intimidation, provocation (Kwan & Skoric, 2013; Lyndon et al., 2011; Hull et al., 2011) and identity theft (Messmer, 2007; Gross & Acquisti, 2005; Hull et al., 2011). Moreover, checks by tax authorities and insurance companies (Tange, 2013, July 22; Steenackers, 2012, August 1st; "De fiscus volgt je op Facebook", 2009, July 19) and the screening of future and current employees are threats to social privacy (HR.square, 2011, December 2nd; Smith & Kidder, 2010; van Dijck, 2013b).


Two important elements contribute to excessive public self-disclosure: not changing privacy settings (Gross & Acquisti, 2005; Acquisti & Gross, 2006; Ellison et al., 2007; Debatin et al., 2009; Madejski et al., 2011; Nosko et al., 2012; van Dijck, 2013a) and accepting strangers as Facebook friends (Walraeve et al., 2013; Gross & Acquisti, 2005; Jones & Soltren, 2005; Messmer, 2007). There are two main reasons why people do not change their privacy settings: they are not aware their data is publicly accessible (van Dijck, 2013a; Brandtzæg, 2010), or they have a lax attitude towards privacy settings (Govani & Pashley, 2005; Gross & Acquisti, 2005; Acquisti & Gross, 2006; Madejski et al., 2011; van Dijck, 2013a): 'People are so involved in getting socially connected that they don't see the dangers it poses' (Asif & Khan, 2012). People who do adjust their privacy settings often cannot achieve what they intended (Liu et al., 2011; Madejski et al., 2011), which shows that the possibilities to control privacy settings are not yet sufficient (Liu et al., 2011; Madejski et al., 2011). The threat of identity theft thus still stands, and accepting strangers as Facebook friends plays an important role in it (Messmer, 2007).

Fuchs (2011) puts the topic of self-disclosure and the dangers for social privacy in a different perspective. He accuses these authors of a 'victimization discourse', in which especially adolescents are pushed into the role of victim (e.g. George, 2006; Acquisti & Gross, 2006; Walraeve, 2013). This is a one-sided view that presumes young people show passive and irresponsible behavior and are badly informed (Fuchs, 2011). In fact, adolescents and younger people handle privacy settings better than people aged above 40, who often experience more problems in understanding the Facebook privacy settings in particular and Facebook in general (Brandtzæg, 2010). Young people do adjust the default privacy settings (Young & Quan-Haase, 2013), remove and untag unwanted photographs (Robards, 2010; Lang & Barton, 2015), limit friendship requests from unknown people, adjust privacy towards different groups within their friends list (Robards, 2010), exclude contact information and display limited profiles (Young & Quan-Haase, 2013; Moreno, 2014). Therefore, young people are confident in their self-disclosure on Facebook (Young & Quan-Haase, 2013).

Facebook makes efforts to protect its users' social privacy, although not always voluntarily. The current possibilities are still not sufficient: adjusting privacy settings does not always achieve the wanted results, and Facebook's new privacy shortcut, implemented in January 2015, claims to help users protect their privacy while actually making things worse by creating a false feeling of safety. This shows that it is still necessary to address social issues when studying Facebook. Popular news media, such as radio, television and newspapers, regularly mention this problem. Indeed, most Facebook users' privacy concerns deal with social privacy (Debatin et al., 2009; Raynes-Goldie, 2010; Young & Quan-Haase, 2013). However, many authors, popular media and users often overlook the concepts of connectivity and institutional privacy. Only recently has some change come, with for example a report for the Belgian Privacy Commission (Van Alsenoy et al., 2015) and messages in news media (e.g. Vanhecke & Deckmyn, 2015, May 15; Tierens, 2015). There are, however, authors (Fuchs, 2011; van Dijck, 2013a, 2013b; Jones & Soltren, 2005; Debatin et al., 2009) who previously supported Raynes-Goldie's notion of institutional privacy.

Facebook and institutional privacy

Excessive self-disclosure can endanger both the social and the institutional privacy of Facebook users. The threat of self-disclosure for institutional privacy is related to Jones and Soltren's second and third critiques: Facebook does not keep its promise to protect its users' privacy, and it conducts profound data mining. These critiques are supported by other authors (e.g. Debatin et al., 2009; Roosendaal, 2011; Wondracek et al., 2010; van Dijck, 2013a). Although Facebook claims in its Facebook Principles that people can choose to share their information with whomever they want (Facebook Principles, n.d.), they are subject to 'frictionless sharing' (van Dijck, 2013a): Facebook is allowed to use and share every piece of information and content the user provides to the company. This covers every action on the Facebook website, on other platforms in its possession and on other websites connected with it by means of social plugins (van Dijck, 2013a; van Dijck, 2013b; Stumpel, 2013, personal message, April 4th; Statement of Rights and Responsibilities, last updated on January 30, 2015). This shows that Facebook's privacy policy is highly contradictory: the Facebook Principles refer to social privacy (van Dijck, 2013a), but Facebook's elaborate rights refer to institutional privacy. Facebook thus offers possibilities to protect users' social privacy (Debatin et al., 2009), but it does not protect users' institutional privacy (Jones & Soltren, 2005) (for more information see appendix 2).

Critique on Facebook’s data mining and it’s broken promise to protect users’ privacy does not only focus on the happening of it, but also on the manner it is conducted. Facebook conserves data, even though claiming to have removed it (van Dijck, 2013a) and it gathers information of users without being logged on to the website (Facebook, n.d.; van Dijck, 2013a; Roosendaal, 2011). Facebook even gathers information on users who don’t have an account by means of shadow profiles (Lomas, 2013, June 24). Facebook had been suspected of it (Debatin et al., 2009; Jones & Soltren, 2005), but whistle-blower Edward Snowden revealed that Facebook gave user information to the United States’ secret service National Security Agency, known as the PRISM scandal (Munger, 2015; Moran, 2015). Facebook’s face recognition techniques also have been subject to criticism (Andrade et al., 2013;


Buckley & Hunter, 2011) 8. Facebook Connect (a tool to log on to other websites using one's real/Facebook identity) is one example of Facebook regarding its users as having one single true identity, represented by their authentic identity 9 (Raynes-Goldie, 2010; van Dijck, 2013b; Willaerts, 2013; Andrade et al., 2013). Back et al. (2010) conclude that Facebook users comply with this image: they present their actual personality. However, according to van Dijck (2013b), Facebook makes an error of judgment here, overlooking people's changing identity on social networks as acting performances (Van House, 2009). Leingruber (2012) criticizes the general acceptance of the Facebook identity as the real identity outside the social network site, for instance for identity-control purposes (Lown, 2013, May 3rd; Schenker, 2012, March 5th). Mobile tracking is another contested valuable source of information (Van Alsenoy et al., 2015).

Other reasons for claiming that Facebook does not protect its users' privacy sufficiently are several past leaks in Facebook's system, whereby sensitive information from Facebook's datasets appeared online (Johnson, 2007, August 13; Lomas, 2013, June 24), and the fear of hacked datasets (Jenkins, 2007, December 7).

Facebook's algorithms (van Dijck, 2013a; Fuchs, 2011), applications of third parties (Hull et al., 2011; Egele et al., 2012), the partnership with Spotify (O'Carroll & Halliday, 2011, September 29) and the legal framework in which Facebook acts (Farrell, 2014, June 20; Matussek, 2013, February 15) have all become subject to critique for threatening institutional privacy (for more information see appendix 3).

Facebook tries to counter these critiques by attributing its actions to security reasons (Facebook's name policy, last updated on March 10, 2015). It also claims to strip all user data from the users' identity before selling it to third parties and advertisers (Data Policy, last updated on January 2015). However, Backstrom et al. (2007) show that even after anonymization of data, it is still possible to discover users' identities. Making user data anonymous thus does not guarantee users' privacy (Backstrom et al., 2007; Clarkson et al., 2010; Narayanan & Shmatikov, 2009; Wondracek et al., 2010). Moreover, 'custom audience' (offering advertisers the possibility to upload their customer database to Facebook and target those customers directly with advertisements 10) directly contradicts Facebook's claim.
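The re-identification risk described by Backstrom et al. (2007) and Narayanan & Shmatikov (2009) can be illustrated with a toy linkage attack: a dataset stripped of names can often be joined to a public auxiliary dataset on shared quasi-identifiers. The sketch below is purely illustrative; the records, field names and matching rule are invented and do not reproduce the actual methods of the cited studies.

```python
# Toy linkage attack (illustration only): all data and field names are invented.
anonymized = [  # names removed, quasi-identifiers kept
    {"zip": "9000", "birth_year": 1985, "gender": "F", "interest": "yoga"},
    {"zip": "9050", "birth_year": 1990, "gender": "M", "interest": "chess"},
]
public = [  # auxiliary data, e.g. publicly visible profile listings
    {"name": "Alice", "zip": "9000", "birth_year": 1985, "gender": "F"},
    {"name": "Bob", "zip": "9050", "birth_year": 1990, "gender": "M"},
]

def reidentify(anon_rows, aux_rows, keys=("zip", "birth_year", "gender")):
    """Link 'anonymous' rows back to names by joining on quasi-identifiers."""
    index = {tuple(row[k] for k in keys): row["name"] for row in aux_rows}
    return [(index.get(tuple(row[k] for k in keys)), row) for row in anon_rows]

matches = reidentify(anonymized, public)
```

When the combination of quasi-identifiers is unique, each 'anonymous' record is linked straight back to a name, which is why stripping names alone does not guarantee privacy.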

8 In the European Union the use of face recognition on Facebook is no longer allowed, since it conflicts with the European Data Protection law (Out-Law.com, 2012, September 21). Facebook is nevertheless still eager to encourage people to tag themselves and other people on Facebook.
9 Consulted on July 6th 2015 on the World Wide Web: https://www.facebook.com/help/159096464162185
10 Consulted on May 31st 2015 on the World Wide Web: https://www.facebook.com/business/a/custom-audiences?campaign_id=579705038731764&placement=OnlineAdvCADrp_Trg

Another issue consists of Facebook users not taking institutional privacy into consideration (Raynes-Goldie, 2010; Debatin et al., 2009; Young & Quan-Haase, 2013). They are either not conscious of it, believing their privacy is protected once they adjust their privacy settings (Debatin et al., 2009), or they are not concerned about it (Raynes-Goldie, 2010; Young & Quan-Haase, 2013). The gathering and use of personal information for commercial ends has become an accepted social norm (Young & Quan-Haase, 2013), which is exactly Facebook's goal (van Dijck, 2013a). More and more people, however, are becoming conscious of the commercialization of personal user data and its threats to institutional privacy (van Dijck, 2013a). The Beacon case in 2007 made this truly visible to users for the first time (e.g. Perez, 2007, November 30; Debatin et al., 2009; van Dijck, 2013a). The introduction of the Timeline in 2011 also made many users suspicious of privacy issues, as they realized they had posted more data on their profiles than they had previously thought (van Dijck, 2013a).

After the Beacon incident, Facebook relocated the commercial aspects in its terms and conditions of use to the background (Turow, 2008; Debatin et al., 2009; Fuchs, 2011; van Dijck, 2013a), resulting in a long, complex and contradictory privacy policy (see supra). Promising more interactivity and connectedness with new features (The Facebook Blog, 2009, February 10; Halliday, 2011, September 22; van Dijck, 2013a; Press Association, 2013, June 13), it gradually started changing the norm of sharing (van Dijck, 2013a). Although sharing is here a euphemism for commodification, it is important not to overlook Facebook's capitalist contextual framework (Fuchs, 2011). 'Privacy fetishism' considers privacy to be solely an individual responsibility, forgetting the political-economic context in which Facebook operates: 'advertising, capital accumulation, the appropriation of user data for economic ends, and user exploitation' (Fuchs, 2011). In order to survive in this competitive environment, Facebook keeps searching for new ways to gather data on its users in order to accumulate capital (Fuchs, 2011). Especially since Facebook's IPO in 2012, the pressure to make profits has become more pressing (van Dijck, 2013a).

Privacy fetishism 'advances the view that increasing privacy levels will technologically solve societal problems and ignores that this might create new problems'. When evaluating privacy in relation to Facebook, it is thus necessary to first ask: 'Privacy for whom?' (Fuchs, 2011). Debatin et al. (2009) point out a privacy dilemma: 'It would seem that if privacy is protected, then sociability and content sharing will be compromised, whereas if sociability and content sharing are promoted, then privacy will suffer'. Although paying for privacy could be an option, hardly anyone is willing to do so (van Dijck, 2013a). An equilibrium between respecting privacy and defending connectivity is therefore necessary (Debatin et al., 2009; van Dijck, 2013a).


Motives for (continued) Facebook use

Many authors have already contributed to this strand of the literature on Facebook. By reviewing 42 articles about motives for using Facebook among undergraduate and graduate students, Nadkarni & Hofmann (2012) distinguish several motives, including personality characteristics, impression formation, social connectedness, general uses of Facebook and self-esteem. Chang and Liu (2011), however, conclude that 'effort expectancy, performance expectancy, perceived playfulness, social influence, perceived critical mass, human-message interaction, human-human interaction, information sharing, right of privacy protection, search function, free to use, and habit are factors influencing behavioral intention to use facebook'. According to Wilson et al. (2012), researchers do not know the exact reasons for Facebook's popularity. The above shows it is difficult to propose all-embracing motives for using Facebook, let alone to argue that the motive has been found. This is one reason why research into these motives is interesting.

Wilson et al. (2012) state that motives for using Facebook can be divided into two subcategories: external and internal motives. In order to create a more thorough understanding, two tables list the different motives found in the academic literature, divided along Wilson et al.'s external-internal dimension. Considering these tables, it seems that Wilson et al. (2012) are right in their conclusion that scholars do not know the exact reason for adopting Facebook. It is interesting to see how many motivations, and interactions between them, drive this subject.

The gratifications of Facebook seem to exceed the perceived threats to privacy (Debatin et al., 2009) and feelings of dissatisfaction, since a large majority continues using Facebook. Research into motives to continue using Facebook has been conducted to a much lesser degree. It is reasonable to presume that the above motivations to adopt and use Facebook are also reasons why people continue using it. Only recently have some authors mentioned reasons to proceed with the use of Facebook, or focused entirely on post-adoption continued use of the social network site. Again, different theoretical frameworks were used; these are displayed in the following tables along Wilson et al.'s (2012) internal and external motives. Bonson et al. (2014) show the above presumption to be at least partly incorrect, concluding that social influence has no effect on the intention of continued use of Facebook. Therefore it also seems hard to distinguish external drivers. 'Intention to continue using Facebook is affected mainly by stakeholders'/users' attitude toward using this platform' (Bonson et al., 2014).

Several authors focus on integrated models explaining continued use of Facebook. Hsu et al. (2014a) state that each model only partially explains a user's extrinsic and intrinsic motivations for continued use of Facebook, and provide a more thorough explanation integrated in one model. All models offer acceptable explanations, but the integrated model of Hsu et al. (2014a) shows the largest explained variance of R² = 0.72. The second best explained variance is found in the model of Chiang (2013), with R² varying from 0.631 for innovators/early adopters to 0.699 for the early and late majority. In these models, attitude towards Facebook plays an important role. The lowest variance was found in the research of Hsu et al. (2014b). Although similarities between models are found, it is clear that there is not yet full consensus among authors. Most of these frameworks, however, focused their research on young people and students; possibly these results are not valid for older people (Yang & Lin, 2014). It is nevertheless worthwhile to review different theoretical views in order to frame this subject.


INTERNAL DRIVERS FOR ADOPTION AND USE OF FACEBOOK

Social drivers
- Social engagement: need to belong and stay connected (Gosling, 2009; Nadkarni & Hofmann, 2012; Raacke & Bonds-Raacke, 2010; Cheung et al., 2011; Sheldon et al., 2011; van Dijck, 2013a)
- Keep contact with bridging capital (Putnam, 2009): general drivers of bridging capital (Boyd, 2008; Hull et al., 2008; Brooks et al., 2014)
- Contact with geographically spread network (Gosling, 2009; Ellison et al., 2007)
- Relational development (Krishnan & Hunt, 2015)
- Connection strategies with strangers: latent ties (Ellison et al., 2011; Raacke & Bonds-Raacke, 2008; Yang & Lin, 2014)
- Make use of social capital: strong and weak ties (Granovetter, 1973): general drivers of social capital (Gross & Acquisti, 2005; Ellison et al., 2007; Steinfield et al., 2008; Papacharissi & Mendelson, 2008; Valenzuela, 2009; Burke et al., 2010; Ellison et al., 2011; Johnston et al., 2013; Brooks et al., 2014)
- School related initiatives (e.g. Khan et al., 2014; Yang et al., 2014; Lampe et al., 2011; Yu et al., 2010)
- Favors and information (Ellison et al., 2011; Chang & Liu, 2011; Ellison et al., 2013; Ellison et al., 2014; Jung et al., 2013; Panovich et al., 2012; Gray et al., 2013; Lampe et al., 2014; Raacke & Bonds-Raacke, 2010; Krishnan & Hunt, 2015)
- Seeking social support (Ross et al., 2009; Liu, 2011)
- Minimizing loneliness (Xenos, 2011; Burke et al., 2010)

Hedonic drivers
- Entertainment value (Ross et al., 2009; Liu, 2011)
- Boredom: passing time (Lampe et al., 2008; Krishnan & Hunt, 2015)

TAM (Technology Acceptance Model) and TRAM (Technology Readiness & Acceptance Model)
- Perceived ease of use (Jin, 2013)
- Perceived usefulness (Jin, 2013)
- Perceived playfulness (Jin, 2013)

Personal drivers
- Positive attitude towards new communication technologies (Krishnan & Hunt, 2015)
- Desire for communication (Ross et al., 2009; Liu, 2011; van Dijck, 2013a, 2013b)
- Personality characteristics, contributing to a more unconscious feeling that connects more directly with the decision to have Facebook or not: Big 5: neuroticism, extraversion, openness to experience, agreeableness & conscientiousness (Ross et al., 2009; Amichai-Hamburger & Vinitzky, 2010; Moore & McElroy, 2012; Nadkarni & Hofmann, 2012; Carpenter, 2012; Buffardi & Campbell, 2010; Mehdizadeh, 2010; Ong et al., 2011; Orr et al., 2009; Wilson, Fornasier & White, 2010; Ryan & Xenos, 2011; Butt & Phillips, 2008)
- Shyness (Orr et al., 2009; Ross et al., 2009; Butt & Phillips, 2008)
- Experience (Moore & McElroy, 2012)
- Self-representation (Carpenter, 2012; Nadkarni & Hofmann, 2012; Lee Won, 2014; van Dijck, 2013b; Nie & Sundar, 2013)
- Self-esteem: general drivers (Carpenter, 2012; Christofides et al., 2009; Gonzales & Hancock, 2010; Lou, 2010; Stefanone et al., 2011; Yu et al., 2010; Zhao et al., 2008)
- Self-worth (Carpenter, 2012; Toma & Hancock, 2013)
- Self-integrity (Toma & Hancock, 2013)
- Self-construal (Kim et al., 2010)
- Self-promotion (van Dijck, 2013b)
- Gender (Moore & McElroy, 2012)

Table 1: Internal drivers for the adoption and use of Facebook

EXTERNAL DRIVERS FOR ADOPTION AND USE OF FACEBOOK
- External press (Viswanath et al., 2009)
- Social influence (Chiu et al., 2008; Cheung et al., 2011; Chang & Liu, 2011; van Dijck, 2013a; Chen et al., 2012)
- Social pressure (Chiu et al., 2008; Cheung et al., 2011; Chang & Liu, 2011; van Dijck, 2013a; Chen et al., 2012)

Table 2: External drivers for the adoption and use of Facebook


INTERNAL DRIVERS FOR CONTINUED USE OF FACEBOOK

Social drivers
- Following birthdays (Viswanath et al., 2009)

Hedonic drivers
- Value theory: epistemic value (perceived novelty), indirect effect with high trust (Yang & Lin, 2014)
- Value theory: hedonic value (entertainment), direct effect with high & low trust (Yang & Lin, 2014)
- Value theory: social value, indirect effect with high trust (Yang & Lin, 2014)
- Enjoyment: indirect effect (Yoon & Rolland, 2015); direct effect (Lin & Lu, 2011)

Information Systems Continuance Model
- Attitude, perceived usefulness (Technology Acceptance Model), satisfaction (effects) & social influence (no effect) (Yoon & Rolland, 2015; Bonson et al., 2014)

TAM & TRAM (Suki et al., 2012; Jin, 2013)
- Perceived usefulness (Yoon & Rolland, 2015; Bonson et al., 2014; Lin & Lu, 2011; Sagioglou & Greitemeyer, 2014)

Personal drivers
- Theory of Reasoned Action (Yoon & Rolland, 2015)
- Automatic behavior & habit (Chang et al., 2011; Yoon & Rolland, 2015)
- Attitude (Bonson et al., 2014; Lin et al., 2014; Chiang, 2013)
- Gender: indirect effect (Lin & Lu, 2011)
- Forecasting error (Sagioglou & Greitemeyer, 2014)
- Subjective norms (Ku et al., 2013)
- Self-affirmation (Toma & Hancock, 2013)

Satisfaction
- General satisfaction (van Dijck, 2013a; Kourouthanassis et al., 2015; Gwebu et al., 2015; Yoon & Rolland, 2015; Bonson et al., 2014; Hsu et al., 2014a; Lin et al., 2014)
- Expectation-confirmation model: self-image matching with Facebook (Kourouthanassis et al., 2015)
- Trust-satisfaction relation (Kourouthanassis et al., 2015)
- Trust (Kourouthanassis et al., 2015; Gwebu et al., 2014)

INTEGRATION OF DIFFERENT MODELS
- Analytical model (TAM), emotional (satisfaction & loyalty), habit/automaticity (Gwebu et al., 2014)
- TAM (perceived usefulness and perceived ease of use), Theory of Planned Behavior (behavioral intention influenced by attitude, subjective norm and perceived behavioral control), Expectation Disconfirmation Model (disconfirmation and satisfaction) & Flow Theory (Hsu et al., 2014a)
- Uses and Gratifications Theory (relationship maintenance, entertainment, sociability, information), perceived critical mass and subjective norms as positive influences; privacy issues as a negative influence (Ku et al., 2013)
- Theory of Reasoned Action (attitudes & social norms), Uses and Gratifications Theory (playfulness, informativeness & social interactivity), Innovation Diffusion Theory (Chiang, 2013)
- Social Network Theory, Social Presence Theory, Cognitive Absorption (Hsu et al., 2014b)

Table 3: Internal drivers for the continued use of Facebook


Facebook resistance

Notwithstanding social and institutional privacy issues and dissatisfaction about Facebook, many people continue to use it. There are, however, also people who show forms of Facebook resistance (FBR). A definition of FBR has not yet been clearly formulated. The following chapter discusses the phenomenon and attempts to offer a more thorough understanding of the concept.

Forms of Facebook resistance

There are different levels of FBR. In an e-mail, Stumpel (personal message, 2013, April 4th) sums up several forms: minimizing the time spent on Facebook (with or without the help of software), not using the mobile application, using an alias instead of the real name, avoiding censorship by placing text in pictures or images, deactivating an account, deleting an account, and installing browser add-ons, user scripts and hacks. The most extreme form of FBR is not to go on Facebook at all (Stumpel, personal message, 2013, April 4th; van Dijck, 2013a). This list of examples is not exhaustive (Stumpel, personal message, 2013, April 4th). Academic literature confirms this: protest actions, for example, are another important form of FBR (van Dijck, 2013a; Boyd, 2008; Hoadley et al., 2010). Moreover, several Facebook users hold back information from Facebook and even lie about specific personal data on their profiles (van Dijck, personal message, 2013, November 26th). Actively adjusting the default privacy settings is another form (van Dijck, personal message, 2013, July 30). Young people often don't fill in certain information on their Facebook profiles, or fill it in with poetry or images (Robards, 2010). Although these are light forms, they all fit in the category of FBR (van Dijck, 2013a).

In this respect, it is possible to make subclasses of different forms of FBR along at least two dimensions. A first distinction is between internal and external Facebook resistance. A second distinction could be between active forms of resistance, such as hacks, and more passive forms, such as deliberately not liking commercial pages. The following attempts to propose a division of FBR forms along these two dimensions.

Internal Facebook resistance

It is definitely possible to be a Facebook resister and still continue using Facebook. Internal FBR is conducting FBR from inside the social network site, thus influencing the way the platform is used in order to resist Facebook. We can distinguish passive and active forms of internal FBR.

Active forms of Facebook resistance

An active form of internal FBR is actively configuring Facebook (Raynes-Goldie, 2010; van Dijck, 2013a) by means of user scripts or by hacking the website (van Dijck, 2013a). It is 'augmented freedom': users are 'not moving away from everything that is imposed by Facebook, but instead seeking the opportunities to bend the rules/laws and implement our own' (Stumpel, personal message, 2013,


April 4th). Thousands of applications, browser add-ons and user scripts are available 11, changing, adding or removing certain functional and aesthetic features (Stumpel, personal message, 2013, April 4th). They are easy to apply and can easily be updated, modified or removed (Stumpel, 2013).

Examples of these user scripts and add-ons are the Facebook Phishing Protector 12 and tiny-URL decoders such as 'Long URL Please' and 'Tiny URL Decoder', which protect users against malicious websites or software. Other examples are Unfriend Finder and EnemyGraph, different initiatives around the dislike button, and the Facebook Color Changer 13 (Stumpel, 2013). Adblock Plus 14, although not created specifically for Facebook, removes ads, but not sponsored stories. Social Fixer 15 and Facebook Purity 16 allow users to change all kinds of features, including sponsored stories. Although they require some testing and learning time before they can be used fluently 17, Facebook Purity and Social Fixer, integrated in the blue top toolbar, are easily accessible once installed.

Figure 1: Facebook Purity (FBP) and Social Fixer (screw wrench) buttons incorporated in the upper Facebook toolbar

11 Consulted on January 11th 2015 on the World Wide Web: https://addons.mozilla.org/nl/firefox/search/?q=facebook&appver=34.0&platform=windows, https://greasyfork.org/nl/scripts/search?q=facebook and http://userscripts-mirror.org/scripts.html
12 Consulted on January 14th 2015 on the World Wide Web: https://addons.mozilla.org/en-US/firefox/addon/facebook-phishing-protector/
13 Consulted on January 14th 2015 on the World Wide Web: https://addons.mozilla.org/nl/firefox/addon/color-change-for-facebook/
14 Consulted on January 14th 2015 on the World Wide Web: https://adblockplus.org/en/
15 Consulted on January 11th 2015 on the World Wide Web: http://socialfixer.com/
16 Consulted on January 11th 2015 on the World Wide Web: http://www.fbpurity.com/install.htm
17 Consulted on January 13th 2015 on the World Wide Web: http://www.fbpurity.com/user-guide.htm and http://socialfixer.com/features.html#ex1

Figure 2: Facebook Purity options popup screen

Figure 3: Social Fixer options popup screen

'Facebook Resistance', an artistic FBR initiative founded by Tobias Leingruber, organizes workshops in which participants critically discuss Facebook's protocols and create tools and hacks to change Facebook's rules from within the platform (Stumpel, 2013; FBresistance, n.d.).


Some add-ons, however, do more than just change features on the platform. There are add-ons available that protect users' SP, such as the Facebook Seen Notification Remover 18. Before, it was not possible to claim 'I haven't read your message' when in truth the user had. By making this possible, the add-on enlarges the SP of the user who installed it. There are also add-ons available that protect users' institutional privacy by preventing advertising and analytical data-mining companies from tracking people while they surf the Web. Examples are Disconnect 19, 'Blur' 20 (previously called 'DoNotTrackMe'), Privacy Badger 21, Ghostery 22, PoX (Egele et al., 2012) and Facebook Disconnect 23, created specifically for Facebook. Other initiatives focusing on institutional privacy are likejacking, Reclaim my privacy and Givememydata (Stumpel, 2010). Counter-protological control (Galloway & Thacker, 2007) refers to resisting Facebook's protocols (Stumpel, 2010; 2013), or control over Facebook's control (Galloway & Thacker, 2007; Stumpel, 2010; van Dijck, 2013a). User scripts are an example (van Dijck, 2013a) of these 'counter-protological attacks' (Galloway, 2004). Real hacks are, for example, the Web 2.0 Suicide Machine from 2009 and the similar Social Roulette, created in 2013 24. Leingruber remarked in a comment 25 on the FB Resistance Artists Facebook group that Facebook had disabled the latter application after only one day. These kinds of hacks can thus easily be understood as forms of heavy resistance.
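Conceptually, tracker-blocking tools of this kind maintain a blocklist of known tracking domains and filter outgoing requests against it. The following is a minimal sketch of that principle only; the domain list and function names are illustrative and not taken from any of the tools mentioned above.

```python
from urllib.parse import urlparse

# Hypothetical blocklist; real tools ship curated lists of tracking domains.
BLOCKLIST = {"facebook.com", "doubleclick.net", "tracker.example"}

def is_blocked(url: str) -> bool:
    """True if the request's host is a blocklisted domain or one of its subdomains."""
    host = urlparse(url).hostname or ""
    return any(host == d or host.endswith("." + d) for d in BLOCKLIST)

def filter_requests(urls):
    """Drop requests that would reach known trackers."""
    return [u for u in urls if not is_blocked(u)]
```

Matching on the domain suffix rather than the full URL is what lets such tools block, for instance, a social plugin embedded on a third-party news site.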

Protest groups are another possible form of active internal FBR. They are discussed frequently in academic literature (e.g. Boyd, 2008; Hoadley et al., 2010; Debatin et al., 2009; Stumpel, 2010; Hull et al., 2011). Mainstream media gave these actions a lot of attention, which resulted in more consciousness and more protest among users (Hoadley et al., 2010). Examples are 'Students against Facebook' (Boyd, 2008), the Facebook group 'Facebook, Stop Invading My Privacy' (Debatin et al., 2009), 'Change Facebook back to normal' and 'Please give us our old news feed back' (Theguardian PDA The digital content blog, 2009, October 27). The Facebook group FB Resistance Artists 26 was created, inspired by Leingruber's artistic project (Stumpel, 2013).

18 Consulted on January 14th 2015 on the World Wide Web: https://addons.mozilla.org/nl/firefox/addon/fbchatseenblocker/?src=api
19 Consulted on January 12th 2015 on the World Wide Web: https://disconnect.me/disconnect
20 Consulted on January 11th 2015 on the World Wide Web: https://dnt.abine.com/#feature/tracking
21 Consulted on May 3rd 2015 on the World Wide Web: https://www.eff.org/privacybadger
22 Consulted on May 3rd 2015 on the World Wide Web: https://www.ghostery.com/nl/
23 Consulted on January 12th 2015 on the World Wide Web: https://addons.mozilla.org/nl/firefox/addon/fbdc/?src=userprofile
24 Consulted on January 16th 2015 on the World Wide Web: http://socialroulette.net/
25 https://www.facebook.com/groups/189135107782024/
26 Consulted on January 12th 2015 on the World Wide Web: https://www.facebook.com/groups/189135107782024/

Other active forms of FBR are some forms mentioned at the beginning of this chapter: consciously lying about profile information, using an alias instead of the real name, avoiding censorship by placing text in pictures or images, minimizing the time spent on Facebook with the help of software, and not using the mobile application. Encouraging users to protect themselves against Facebook's tracking methods by installing add-ons is also an active form of resistance. Automated buttons indicating feelings and desires, such as the like button, can easily be manipulated by users (van Dijck, 2013a; van Dijck, 2013b) in order to mislead Facebook's algorithms. Another example is to untag pictures, avoid faces on profile pictures or make faces on profile pictures blurry, as many members of the Facebook group FB Resistance Artists do. This may be a reaction to the facial recognition techniques Facebook uses (see supra).

One remarkable example is a status update that circulated in November 2012 and regularly shows up again. It contains a copyright statement claiming that, if posted on users' timelines, it would protect those users' intellectual property, forbidding its use without explicit written agreement 27. Facebook users who post this status update clearly believe they can act against Facebook's use of their data.

Passive forms of Facebook resistance

We could argue that minimizing the time spent on Facebook without the help of software is rather a passive form of FBR, as are untagging pictures and consciously leaving certain profile information boxes blank in order not to give Facebook, or anyone else, this kind of personal information. Not liking any commercial pages could be another example. However, it is difficult to state exactly which FBR forms

27 In response to the new Facebook guidelines I hereby declare that my copyright is attached to all of my personal details, illustrations, comics, paintings, crafts, professional photos and videos, etc. (as a result of the Berner Convention).

For commercial use of the above my written consent is needed at all times!

(Anyone reading this can copy this text and paste it on their Facebook Wall. This will place them under protection of copyright laws.)

By the present communiqué, I notify Facebook that it is strictly forbidden to disclose, copy, distribute, disseminate, or take any other action against me on the basis of this profile and/or its contents. The aforementioned prohibited actions also apply to employees, students, agents and/or any staff under Facebook's direction or control. The content of this profile is private and confidential information. The violation of my privacy is punished by law (UCC 1 1-308-308 1-103 and the Rome Statute).

Facebook is now an open capital entity. All members are recommended to publish a notice like this, or if you prefer, you may copy and paste this version. If you do not publish a statement at least once, you will be tacitly allowing the use of elements such as your photos as well as the information contained in your profile status updates...

are passive. The active or passive nature of some of the above forms, such as not using the mobile application, consciously untagging pictures or not liking commercial pages, is open to discussion. It is thus questionable whether distinguishing between active and passive forms of FBR provides good insight into the matter.

External Facebook resistance

External FBR is conducting Facebook resistance from outside the social network site. Again, a subdivision into active and passive external FBR is proposed. Both online and offline initiatives are possible.

Active forms of Facebook resistance

Active external FBR entails conscious actions resisting Facebook from outside the platform. Several online petitions are an example, such as 'SaveFacebook.com' (Facebook.com Users Against The "News Feed" and "Mini Feed", 2006, September 5th) and MoveOn.Org's petition against the Beacon feature (van Dijck, 2013a). Other campaigns call on users to quit Facebook for a while, such as the 'A Day Without Facebook' campaign 28 and the '99 Days of Freedom' initiative 29. 'Quit Facebook Day' went further, calling on users to delete their accounts 30. Other possible actions of FBR are 'How to annoy Facebook' (O'Carrol & Halliday, 2011, September 29) and 'sousveillance' (Fernback, 2013), of which the watchdog website 'Europe versus Facebook' 31 is an example.

Although a lot of these protest actions operate online, not all of them do (Stumpel, 2013). Facebook has already faced several lawsuits, among which are some group claims (van Dijck, 2013a; Facebook forced into privacy business; Johnson, 2009, December 10). In 2014 Max Schrems filed a lawsuit against Facebook for violating users' privacy, mass surveillance and its part in the NSA's PRISM scandal of 2013 32. Recent protests show that other countries in the European Union are not pleased that Facebook does not comply with local privacy rules (Matussek, 2013, February 15; Neyskens, 2015, April 15). The Belgian Privacy Commission recently decided to take Facebook to court (Facebook voor Brusselse rechter, 2015, June 15; Rechtszaak tegen Facebook pas in september gepleit, 2015, June 16). There are also several legislative initiatives that try to address the privacy issues caused by Facebook's policy (European Commission, 2012, January 25; European Commission, 2015, April 15; European Commission, 2015) and want to implement the Right to be Forgotten (European Commission, 2014). (For more information, see appendix 4.)

28 Consulted on January 12th 2015 on the World Wide Web: http://daywithoutfacebook.blogspot.be/
29 Consulted on May 1st 2015 on the World Wide Web: http://99daysoffreedom.com/
30 Consulted on January 12th 2015 on the World Wide Web: http://www.quitfacebookday.com/
31 Consulted on January 13th 2015 on the World Wide Web: http://europe-v-facebook.org/EN/en.html
32 Consulted on May 1st 2015 on the World Wide Web: https://www.fbclaim.com/ui/page/faqs

Another form is to encourage people to install add-ons and user scripts to prevent Facebook from tracking their online behavior, as the Belgian Privacy Commission does (Commission for the Protection of Privacy, 2014a). Creating an account on an alternative non-commercial social network site such as Diaspora or Ello is another active form of FBR. Deleting an account, or 'Facebook suicide' (Justice, 2007), seems a heavy form. Consciously not having a Facebook account at all is considered to be the heaviest form of FBR (Stumpel, personal message, 2013, April 4th; van Dijck, 2013a).

Leingruber, artist and founder of the FB Resistance Artists, created a Facebook ID card in 2012 to draw attention to the broadly accepted social network identity becoming more important than 'standard' identity documents (Leingruber, 2012; Lown, 2013, May 3rd).

Passive forms of Facebook resistance

Passive external FBR most likely entails light forms of FBR. Merely criticizing Facebook, without undertaking further action, could be considered as such. It seems, however, hard to further distinguish passive forms of external FBR, and again opportunities for discussion arise. Deleting an account and not having a Facebook account are the heaviest forms of FBR, but are they incontrovertibly active? Moreover, can they always be considered FBR? Hardly any research has been conducted into motives for not having Facebook. Recently, however, Vanhaelewyn et al. (2014) concluded that more than half of the people without an account have 'no interest or need to be active on social media'. Privacy issues are the second main reason, followed by 'other' as third. Having no internet access and having no time to be active on social media conclude this list. Dindar and Akbulut (2014) concluded that, next to privacy concerns, waste of time, disturbance, lack of interest, coping with a break-up and partner pressure were all reasons to delete an account. It is hard to argue that people who don't have the time, need or interest to create or maintain an account should be considered Facebook resisters. If they are, however, they would fit in the passive external FBR category.

Motives for Facebook resistance

The main motive for FBR is dissatisfaction with Facebook (van Dijck, 2013a). This is however a rather broad term, and should be specified in order to give a more thorough understanding of the motives for FBR.

Taylor et al. (2011) state that dissatisfaction with excessive advertisements may lead to Facebook abandonment. Dissatisfaction further arises due to changing lay-outs (Gross, 2011, September 23), the impossibility of changing certain features, or the absence of wanted features, such as a dislike button (van Dijck, 2013a; Stumpel, 2013). Matt Kruse created Social Fixer when he got tired of seeing the same posts over and over again 33.

Other motives for dissatisfaction triggering FBR are boredom and tiring experiences (Stumpel, personal message, 2013, April 4th). Social media fatigue is caused by an overload of information; privacy concerns also relate positively to social media fatigue (Bright et al., 2015).

Privacy concerns in general seem to trigger dissatisfaction and FBR (Stumpel, personal message, 2013, April 4th; Ku et al., 2013; van Dijck, 2013a; Vanhaelewyn et al., 2014; Dindar & Akbulut, 2014; Zlatolas et al., 2015). The higher people's privacy awareness, privacy social norms, privacy concerns and knowledge about Facebook's privacy, the lower their self-disclosure on Facebook (Zlatolas et al., 2015). Given the above chapters it is reasonable to consider both social and institutional privacy (Raynes-Goldie, 2010; Boyd, 2008; Stumpel, 2010; van Dijck, 2013a; Stumpel, personal message, April 4th). Openness, transparency and too much self-disclosure trigger social privacy discontent (Raynes-Goldie, 2010; Justice, 2007; Zlatolas et al., 2015) or are simply perceived as annoying (Moreno, 2014). Dissatisfaction also arises around Facebook's ownership (see supra), regarding its fusions, partnerships and IPO. Other reasons are unannounced changes to Facebook's policy and the secrecy around its algorithms, both raising institutional privacy issues and leaving Facebook users, who are expected to be constantly open, with feelings of unfairness (van Dijck, 2013a).

In 2014, the '99 Days Without Facebook' initiative started after it became clear Facebook had conducted an experiment on 690,000 Facebook users 34, manipulating posts in order to research whether "emotional contagion" by means of more positive or more negative posts is possible (Booth, 2014, June 30). Fernback (2013) offers equality, social responsibility, cultural integrity and individual empowerment as motives: Facebook has to become more open and more transparent. By conducting sousveillance (see supra), resisters want to change the power dynamics between Facebook and its users, hoping to provoke more consciousness about the amount of information Facebook gathers about its users.

Individuality, identity and one’s community role influence FBR behavior (van Dijck, 2013a). Age would influence it only to a lesser degree; younger people disclosing much information would have a smaller chance of being a Facebook resister.

33 Consulted on January 3rd 2015 on the World Wide Web: http://socialfixer.com/about.html
34 Consulted on May 1st on the World Wide Web: http://99daysoffreedom.com/

Conclusion

Van Dijck's classification of Facebook along a techno-cultural (users and usage, content and technology) and a socio-economical (ownership, policy and business model) dimension provides a thorough contextual framework for the discussed literature. Connectedness, entailing user-centered, narrative and diverse content, has experienced a shift towards connectivity, which implies uniformity, standardization and databases. This connectivity is to be found throughout all six Facebook levels. Sharing and friending are two important concepts embracing this shift. Facebook has become a commercialized platform conducting profound data mining, selling users' data and metadata, and advertising. It therefore realizes it has to provide high entertainment value and benefits in order to keep its user base and maintain profits.

Facebook's actions, however, have large consequences for users' social and institutional privacy, concentrated around three interconnected points of critique: too much self-disclosure, profound data mining and Facebook not keeping its promise to protect users' privacy. More and more people are aware of these issues. Nonetheless, many people use and continue to use Facebook, driven by many complex motives. This does however not prevent several people from resisting Facebook.

There prove to be several forms of FBR. They were divided along an internal versus external and an active versus passive dimension. It should however be questioned whether this is the optimal manner, as it is hard to clearly distinguish examples of passive resistance. Moreover, it is questionable whether this list of examples is complete. Several motives for FBR were posed, but no scientific research has yet been conducted into this issue. There is thus no certainty about the correctness or the completeness of these drivers. Given the complex motives for Facebook use and continued use, it is however reasonable to assume there are also multiple and complex motives for FBR. It also seems interesting to distinguish the features of Facebook resisters. How can we describe them? To what degree is Facebook resistance to be found among Facebook users and non-users? How do different types of users and non-users express FBR? Which strategies do they use? Which types of FBR can be identified? To address these issues, we conducted a study specifically focused on these subjects.

Research methodology

Test sample

Since there is no certainty about the drivers of FBR or about who Facebook resisters are, the research consisted of a survey conducted among both Facebook users and non-users. In order to reach a higher number of older people and non-users, the survey was distributed both online and offline. Since the survey was mainly distributed in Belgium, we created a Dutch as well as an English version. Several appeals in both Dutch and English, containing a link to the online survey, were publicly posted on Facebook and were shared multiple times. We contacted 184 people via a mailing list from a previous research project on FBR. Furthermore, the survey was handed out on paper at two locations. We handed out surveys to a group of theater spectators waiting for the start of the performance of a local Belgian amateur theatre group. This happened on two occasions, on a Friday and a Saturday evening. The survey was also handed out on a train journey through Belgium on two occasions: on a Monday morning at 09h11 from Poperinge to Antwerp and on a Friday afternoon at 15h25 from Ghent to Poperinge. This was not during a school holiday. On all occasions, Dutch and English versions were available. The responses were gathered from February 26th till March 20th. 315 respondents took part in the offline survey. As an incentive, a sweet was offered to the respondents of the offline survey. Moreover, all participants, both online and offline, had a chance of winning two cinema tickets.

In total there were 451 participants of thirteen different nationalities (N=419). The majority of the respondents (95.9%) had the Belgian nationality. 57.7% of the participants were female, 42.3% were male (N=421). The average age of the respondents was 33; the youngest respondent was 11 years old, whereas the oldest was 82. More than half of the respondents were 25 years old or younger. 46.4% of the respondents (N=369) indicated not to have completed higher education. 31.2% indicated having a Bachelor degree, whereas 22.5% of the respondents had a Master degree or higher. 18.2% of the respondents indicated not to have a Facebook account (N=451).

Instrument development

The questionnaire featured thirteen self-report questions designed to give a clearer image of FBR drivers and of the degree of FBR among Facebook users and non-users (see appendix A). Both quantitative and qualitative questions were implemented. Several authors stressed the importance of a positive correlation between attitude towards Facebook and the use and continued use of Facebook. Attitude is also one of the aspects of the Technology Acceptance Model, where it served as a strong explanatory factor (see supra). Reversing this, we assume a negative attitude towards the Facebook website and the Facebook company could trigger FBR. Questions 1 and 2 measured the respondents' attitude towards the Facebook website and the Facebook company.

Questions 5 and 6 respectively measured respondents' attempts to reduce and to resist the influence of Facebook. Question 8 addressed the topic of FBR by surveying how and why the respondents reduce the influence of Facebook or resist it. These questions are based on the above literature on privacy issues and FBR forms. A qualitative component encouraged respondents to specify their exact actions of resistance. Questions 3 and 4 surveyed whether respondents had an account and the frequency of their Facebook use. Question 9 focused on the use of particular Facebook features. Question 7 verified the adjusting of privacy settings and included a qualitative component. Questions 10 till 13 concluded the questionnaire, requesting demographic information.

Data analysis and results

To what degree is Facebook resistance to be found among Facebook users and non-users?

Frequencies show that 20.3% and 20.2% of the respondents respectively try to reduce and to resist Facebook's influence much or as much as possible (see appendix B). In order to verify to what extent total FBR is to be found, we conducted a reliability analysis on Q5 and Q6, resulting in α=0.73 (N=444) (see appendix C). We call this the 'Facebook resistance' scale. Frequencies now show that 21.7% of the respondents indicate trying to resist Facebook's influence much or as much as possible (N=444). 48.9% of the respondents claim to resist Facebook's influence moderately or more than moderately. Only 9% indicate not resisting Facebook at all (see appendix D).

We assumed that the more negative people's attitude towards Facebook, the more likely they would try to resist Facebook. In order to test this, we first conducted a reliability analysis on Q1 and Q2, measuring the attitude towards Facebook. This results in α=0.62 (see appendix C). Due to discussions about the reliability of α on two-item scales, we also used the Spearman-Brown coefficient (Eisinga, te Grotenhuis and Pelzer, 2012), which however resulted in the same value as Cronbach's α. In using this scale, we thus have to take this low value into account. We named this scale 'Facebook love'. The above assumption proves correct: a negative correlation of -0.20 is found between the scales Facebook love (N=434) and Facebook resistance (N=444). Although this is only a weak relation, it proves significant at the 0.000 level (see appendix E).
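Both reliability coefficients for a two-item scale can be computed as follows. This is a minimal standard-library sketch, not the procedure actually used in the thesis (presumably run in SPSS), and the score rows in the test are invented examples:

```python
from statistics import mean, variance

def cronbach_alpha(rows):
    """Cronbach's alpha for a list of respondent rows (each a list of item scores)."""
    k = len(rows[0])
    items = list(zip(*rows))                    # one tuple per item (column)
    item_var_sum = sum(variance(col) for col in items)
    total_var = variance([sum(row) for row in rows])
    return k / (k - 1) * (1 - item_var_sum / total_var)

def spearman_brown(x, y):
    """Spearman-Brown reliability of a two-item scale: 2r / (1 + r)."""
    mx, my = mean(x), mean(y)
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    r = num / den                               # Pearson correlation of the two items
    return 2 * r / (1 + r)
```

For two items, α equals the Spearman-Brown value exactly when the item variances are equal and is lower otherwise, which is why the latter is recommended for two-item scales; this also explains why both coefficients coincided here.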

In order to further describe Facebook resisters, it is reasonable to assume on the basis of the literature that the higher respondents score on the Facebook resistance scale, the more they will adjust privacy settings, adjust privacy settings in order to resist Facebook, adjust their Facebook use and do other things. As installing external aids such as add-ons, user scripts or hacks is considered a heavy form of Facebook resistance, we expect people who score high on the Facebook resistance scale to have installed them more often than other respondents.

The first statement proves to be correct. Among people with a Facebook account, a cross tabulation (N=365, significant at the 0.05 level) (see appendix J) makes clear that from a level of moderate resistance on, people adjust their privacy settings more often. However, this is not the case for respondents who indicate 'very much'. Presumably, this is due to some respondents using another technique of protecting their social privacy. Qualitative results indicate that of the four respondents who score highest on this scale but do not adjust privacy settings, three visit the Facebook website as little as possible. The fourth respondent indicates only reading other posts.

The second statement also proves to be correct. A cross tabulation (N=349) (see appendix K) of people who have a Facebook account shows that respondents with a higher score on the Facebook resistance scale significantly (p=0.000) more often adjust privacy settings in order to resist or reduce Facebook's influence. This effect is noticeable starting from a moderate level on the Facebook resistance scale, and also holds for people who score highest on this scale.
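The significance values for these cross tabulations rest on Pearson's chi-square test. A minimal sketch of the test statistic is given below (the degrees of freedom and the p-value lookup are omitted, and the counts in the test are invented, not survey data):

```python
def chi_square(table):
    """Pearson chi-square statistic for a cross tabulation (list of rows of counts)."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    total = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            # expected count under independence of the row and column variables
            expected = row_totals[i] * col_totals[j] / total
            stat += (observed - expected) ** 2 / expected
    return stat
```

The larger the statistic relative to the table's degrees of freedom, the stronger the evidence against independence of the two variables.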

Respondents with a higher score on the Facebook resistance scale indicated more often to adjust their Facebook use in order to resist/reduce Facebook's influence than people who score low on this scale. This proves significant at the 0.000 level. However, compared to the previous statements, a higher score proves necessary: the effect holds for scores of more than moderate and higher. Another cross tabulation (N=420) makes this clear (see appendix L). In this analysis, non-users were taken into account, as it is possible some respondents adjusted their Facebook use by deleting their account and thus stopped using it.

The fourth and fifth statements (doing other things and using external aids) surprisingly prove non-significant. Based on the Facebook love scale, one could expect the same results as above: the more negative the attitude towards Facebook, the more likely respondents would be to adjust privacy settings, to adjust privacy settings in order to resist/reduce Facebook's influence and to adjust Facebook use. However, none of these statements proved significant. The same non-significant results were found for Facebook love in relation to using external aids and doing other things to resist/reduce Facebook's influence.

Nonetheless, we found some interesting results regarding the two Facebook scales and whether people indicate having an account or not (see appendix M). A cross tabulation considering Facebook love (N=440, sig. p=0.000) proves that people with a little positive or more positive attitude towards Facebook more often have an account. All respondents who scored very positive had an account. Respondents who score neutral or lower seem to have an account less frequently. This is what we would expect based on the above presumptions. Another cross tabulation considering Facebook resistance (N=444) showed some peculiar results. At a significance level of 0.000, respondents scoring from a little resistance up to much Facebook resistance proved to more often have a Facebook account. Participants who score very low on this scale, showing no or very little resistance, proved to have a Facebook account less often. Only respondents who showed as much resistance as possible behaved as expected, more often not having an account.

Consequently, we tested the correlation of the two scales (Facebook love N=440, Facebook resistance N=444) with the frequency of Facebook use (N=450) (see appendix N). In line with the scientific literature, we found a clear positive correlation of R=0.49 (sig. p=0.000) for attitude towards Facebook. However, no significant correlation was found for the Facebook resistance scale. We also tested the correlation of the two scales with different Facebook actions (see appendix O). All respondents were taken into account, due to the possibility of people without a Facebook account logging in to another account. Nine people without an account indeed indicate going on Facebook, albeit only 'hardly ever' or 'sometimes' (see appendix O). All correlations proved significant: the more positive respondents' attitudes towards Facebook, the more frequently they conduct these actions. The results are to be found in Table 4.

CORRELATION FACEBOOK LOVE AND ACTIONS ON FACEBOOK

Facebook action                           N     Significance   Correlation R
Sharing/posting updates                   408   0.000          0.33
Sharing/posting pictures                  408   0.000          0.44
Sharing/posting films                     403   0.000          0.31
Sharing/posting links to other websites   407   0.000          0.26
Sharing/posting hashtags                  407   0.000          0.26
Reply to posts                            407   0.000          0.46
Indicate going to events                  409   0.000          0.37
Playing games                             409   0.003          0.15
Chatting                                  407   0.000          0.39

Table 4: Correlation Facebook love and Facebook actions

We conducted the same analyses regarding the scale Facebook resistance. No significant correlations were found between this scale and these Facebook actions.

In attempting to further describe Facebook resisters, we conducted a cluster analysis along the scales Facebook love (attitude towards Facebook) and Facebook resistance (N=434). This resulted in five groups (see appendix F). All five groups differed significantly at the 0.000 level. The values of Facebook love and Facebook resistance can be found in Table 5. Further cross tabulation analyses found significant differences on 'Do you have a Facebook account' (N=434) (see appendix G), gender (N=405) (see appendix H) and age (N=403) (see appendix I). The category age was recoded into three groups: 11 till 30, 30 till 50 and 50 plus. All conditions were ready for interpretation. The values displayed for each cluster occur more often than the other values of that variable; they are to be found in Table 5. No significant differences were found with education.

Facebook love    Facebook resistance   Account (sig. p=0.000)   Gender (sig. p=0.006)   Age (sig. p=0.011)
Moderate         Moderate              Yes                      Female                  11-30
Lower            High                  No                       Male                    30-50, 50+
Lower-moderate   Low                   No                       Male                    50+
High             Higher                Yes                      Female                  11-30
High             Low                   Yes                      Male                    No difference

Table 5: Cluster description along dimensions Facebook love and Facebook resistance
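The grouping step can be illustrated with a plain k-means sketch on the two scale scores. This is only an illustration: the thesis does not state which clustering algorithm was used, and the points in the test below are invented example scores, not survey data.

```python
def kmeans(points, k, iters=20):
    """Plain k-means on 2-D points, e.g. (Facebook love, Facebook resistance) scores.

    Deterministic initialization (k >= 2): initial centers are spread across the input order.
    """
    centers = [points[i * (len(points) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        # assignment step: each point joins its nearest center
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k),
                          key=lambda c: (p[0] - centers[c][0]) ** 2 + (p[1] - centers[c][1]) ** 2)
            clusters[nearest].append(p)
        # update step: each center moves to the mean of its cluster
        for i, cl in enumerate(clusters):
            if cl:
                centers[i] = (sum(p[0] for p in cl) / len(cl),
                              sum(p[1] for p in cl) / len(cl))
    return centers, clusters
```

On well-separated scores this recovers the groups; the five groups in Table 5 were obtained from a comparable clustering of respondents along the two scale values.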

In order to provide a more profound image of Facebook resisters, the survey questioned respondents' motives for FBR. Examining the qualitative answers, a large share of the drivers embraces the privacy dimension. We can distinguish both social and institutional privacy drivers. There are however also FBR motives that do not fit in either of these two categories; these drivers seem to depend on more general issues regarding Facebook and FBR. Consequently, we divide FBR motives along three dimensions: social privacy drivers, institutional privacy drivers and general FBR drivers. The motives are displayed in Table 6.

Participants of the survey point out several strategies for resisting/reducing the influence of Facebook. Looking closer at the qualitative results, different types of FBR seem to emerge. Respondents quote several actions mentioned in the above distinction along the internal versus external and active versus passive dimensions. Next to other, more general forms of FBR, a large number of actions however address social and institutional privacy. It thus seems more useful to follow a distinction similar to that for FBR motives. Therefore, we propose to divide the resistance forms along three topics: general Facebook resistance (GFBR), social Facebook resistance (SFBR) and institutional Facebook resistance (IFBR). Taking motives into consideration, this results in several forms found among both SFBR and IFBR, or among SFBR, IFBR and GFBR. The respondents mentioned conducting the following forms, which they consider as FBR. They are displayed in Table 7 along the three proposed dimensions.


DRIVERS OF FACEBOOK RESISTANCE (along the dimensions GFBR, SFBR and IFBR)

General action of FBR:
Not create an account

General privacy:
Follow up on privacy settings
Adjust visibility of posts
Not filling in certain aspects of profile
Hide certain aspects of profile (for example films, music, …)
Remove friends / limit number of friends 35
Selectively adding friends
Using special e-mail address
Falsifying name
Installing add-ons

Limiting use of the Facebook website:
Logging out after use
Consciously checking Facebook, not random scrolling
Not being logged in to Facebook when surfing other websites
Not logging in with Facebook account to other websites
Not using social plugins on other websites
Limiting locations of Facebook use
Limiting time spent on Facebook
Turning off the router
Limiting general actions on Facebook (e.g. block games, turn off chat, only commenting not posting information, only use it for communication, not sending files on Facebook)
Limiting likes and clicks
Remove account
Deactivate account
Remove Facebook app on mobile phone
Remove and avoid tags

Adjust posting behavior:
Adjusting behavior in indicating events
Limit number of posts
Think about content of posts
Limit content of posts
Limit posting of personal information
Misleading Facebook in acquiring information
Create secret groups
Formulating things unclearly so only a few people know what is meant
Remove data

Advertisements:
Ignoring advertisements
Removing advertisements
Installing add-ons

Social behavior:
Putting face-to-face contacts first
Warn people about anti-social behavior
Convert an activity to outside of Facebook (for example asking for pictures)

35 The FBR forms in italics were not previously mentioned in the introduction section of this research paper

Table 6: Drivers of Facebook resistance


FORMS OF FACEBOOK RESISTANCE (along the dimensions GFBR, SFBR and IFBR)

General action of FBR:
Not create an account

General privacy:
Follow up on privacy settings
Adjust visibility of posts
Not filling in certain aspects of profile
Hide certain aspects of profile (for example films, music, …)
Remove friends / limit number of friends 36
Selectively adding friends
Using special e-mail address
Falsifying name
Installing add-ons

Limiting use of the Facebook website:
Logging out after use
Knowingly checking Facebook
Not being logged in to Facebook when surfing other websites
Not logging in with Facebook account to other websites
Not using social plugins on other websites
Limiting locations of Facebook use
Limiting time spent on Facebook
Turning off the router
Limiting general actions on Facebook (e.g. block games, turn off chat, only commenting not posting information, only use it for communication, not sending files on Facebook)
Limiting likes and clicks
Remove account
Deactivate account
Remove Facebook app on mobile phone
Remove and avoid tags

Adjust posting behavior:
Adjusting behavior in indicating events
Limit number of posts
Think about content of posts
Limit content of posts
Limit posting of personal information
Misleading Facebook in acquiring information
Create secret groups
Formulating things unclearly so only a few people know what is meant
Remove data

Advertisements:
Ignoring advertisements
Removing advertisements
Installing add-ons

Social behavior:
Putting face-to-face contacts first
Warn people about anti-social behavior
Convert an activity to outside of Facebook (for example asking for pictures)

36 The FBR forms in italics were not previously mentioned in the introduction section of this research paper

Table 7: Forms of general Facebook resistance, social Facebook resistance and institutional Facebook resistance


Considering the quantitative results, we determined for each individual respondent the presence of social privacy consciousness, institutional privacy consciousness, GFBR, SFBR and IFBR. If SFBR or IFBR was present, we classified this respondent as a privacy resister (see appendix P). 71.9% of the respondents conducted their actions, for example the adjusting of privacy settings, based on consciousness about social privacy (N=445). However, only 54% of the respondents showed SFBR (N=441), influenced by the intention to resist Facebook or reduce its influence. Only 18.6% of the respondents (N=435) conducted actions based on consciousness of institutional privacy; 16.8% of the respondents seemed to conduct these actions out of IFBR. Considered together, 56.2% of the participants showed FBR for privacy reasons (N=441). Narrow GFBR (forms of FBR that do not entail privacy concerns) was found among 34.8% of the respondents (N=442). If any form of FBR occurred, either GFBR, SFBR and/or IFBR, we indicated that the respondent showed broad FBR. 70% of the respondents conducted their actions due to at least one form of resistance; only 30% showed no FBR whatsoever (N=443).
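The per-respondent coding described above can be sketched as follows; the class and field names are hypothetical, chosen for illustration only, and do not reflect the actual coding scheme used in the study:

```python
from dataclasses import dataclass

# Hypothetical per-respondent flags, coded from the qualitative answers.
@dataclass
class Respondent:
    gfbr: bool   # general FBR forms (no privacy component)
    sfbr: bool   # social Facebook resistance
    ifbr: bool   # institutional Facebook resistance

def privacy_resister(r: Respondent) -> bool:
    """A respondent counts as a privacy resister if SFBR or IFBR is present."""
    return r.sfbr or r.ifbr

def broad_fbr(r: Respondent) -> bool:
    """Broad FBR: any form of resistance (GFBR, SFBR and/or IFBR)."""
    return r.gfbr or r.sfbr or r.ifbr
```

Averaging these flags over all coded respondents yields shares such as the 56.2% of privacy resisters and 70% of broad FBR reported above.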

Relating these data to the five groups of respondents we had created earlier, some interesting results become clear (see appendix Q). The following table shows the results of a cross tabulation indicating for each of the five groups whether GFBR (N=428), SFBR (N=427) and IFBR (N=420) occurred more or less often than expected. Each analysis proved significant at the 0.000 level. It is remarkable that no respondents in the last group showed IFBR.

APPEARANCE OF GFBR, SFBR, IFBR WITHIN CLUSTER GROUPS

Facebook love    Facebook resistance   GFBR         SFBR         IFBR
Moderate         Moderate              Less often   More often   More often
Lower            High                  More often   More often   More often
Lower-moderate   Low                   Less often   Less often   Less often
High             Higher                More often   More often   Less often
High             Low                   Less often   Less often   Less often

Table 8: Appearance of GFBR, SFBR and IFBR in relation to clustering along the Facebook love and Facebook resistance dimension


Conclusion and discussion

In order to provide a contextual framework for the topic of FBR, this research paper started by discussing Facebook along van Dijck's techno-cultural and socio-economical dimensions, containing six levels. The shift from connectedness towards connectivity, and consequently the changing concepts of friending and sharing, proved central herein. This entails large implications for both Facebook users' social and institutional privacy. Given the definitions of these concepts discussed in the introduction, one could connect Raynes-Goldie's aspect of social privacy with van Dijck's concept of connectedness. Extending this, a connection between Raynes-Goldie's institutional privacy and van Dijck's concept of connectivity becomes clear. As connectedness and connectivity are to be found in all six levels of Facebook, we can conclude the same for both social and institutional privacy. Given the critiques on Facebook's privacy issues, and given the connection between social and institutional privacy and connectedness and connectivity, we can conclude the nonexistence of institutional privacy on Facebook.

Regarding Fuchs' statements, it is however clear this privacy discussion is complex. On the one hand, Facebook's data gathering and exploitation is very profound, seriously endangering users' institutional privacy. On the other hand, it is not possible to offer complete institutional privacy and at the same time offer free use. Users thus pay for the free platform with their privacy. Although it is questionable whether alternatives can become success stories in a capitalist context (van Dijck, 2013a), this does not explain the alternative network sites Diaspora and the more recent Ello, which want to maintain a non-commercial framework 37 38.

Social and institutional privacy proved very important throughout this entire research paper. Van Dijck stated that more and more people realize their social and institutional privacy are at risk. However, Raynes-Goldie states people are still mainly concerned about social privacy. This research proves this statement correct, as 71.9% of the respondents indicated acting based on social privacy concerns whereas only 18.6% of the respondents mentioned institutional privacy concerns. There is thus much room for improvement. Given the recent attention in news media towards this topic, there are reasons to believe more consciousness about institutional privacy will arise.

The introduction pointed out dissatisfaction to be the main factor triggering FBR. However, this term proved too broad when considering motives for Facebook resistance. It seems an umbrella term for several other reasons pointed out in the introduction: excessive advertisements, boredom, changes to policy and social and institutional privacy factors. These factors were indeed found among the qualitative results surveying motives for FBR. However, due to a lack of scientific research we stated these few motives would not give a sufficient view on FBR drivers. This research found many other FBR drivers, along three dimensions: a more general one and an institutional and a social privacy dimension. This supports the importance of these concepts as stated by Raynes-Goldie. The introduction, however, also pointed out motives for FBR which were not found in the qualitative survey results. An example is the critique on Facebook's algorithms. This could for example be due to a lack of knowledge among Facebook users.

37 https://ello.co/manifesto
38 https://joindiaspora.com/

Although this research showed there are many motives to conduct FBR, the situation is ambiguous. The perceived benefits of Facebook use still seem to overrule many FBR drivers. Nonetheless, the introduction pointed out many forms of FBR. There, we listed several examples along two dimensions: external versus internal and active versus passive. The latter dimension was used in line with already existing literature on this topic. As it proved hard to distinguish passive forms of FBR with certainty, we questioned the distinction along these two dimensions. Moreover, although the number of FBR forms was already quite elaborate, we stated that the list of examples was not proven to embrace all possible forms. The qualitative research results surveying FBR forms showed both issues to be correct. Facebook resistance forms were again distinguished along the institutional and social privacy dimensions; other factors were considered as GFBR. Many previously mentioned examples were found in the qualitative survey results. However, several new forms were also found. We should also mention that some FBR forms from the introduction were not found among respondents' answers. For example, nobody mentioned having taken part in protest actions. We propose two more forms of FBR. As one form of FBR was to limit clicks, we could extend this towards copying web links into the address bar of the browser instead of clicking on them. Another form takes van Dijck's ownership dimension into consideration. As Facebook has acquired Instagram and WhatsApp, and all these data are thus also exploited (see supra), deliberately not using Instagram and WhatsApp could be considered as FBR.

Towards a definition

Because of the many alternative strategies, it is very hard to generalize FBR (Stumpel, personal message, 2013, July 30). Different forms of Facebook resistance were discussed, but no clear definition of FBR has yet been posed. In order to do so, it is useful to consider other definitions related to the matter. Galloway and Thacker (2007) regard acting against social network sites' policies as 'counterprotological control [...]: tactical implementations and intensifications of protocological control'. The introduction listed several forms of FBR posed by Stumpel (personal message, 2013, April 4th), who considered it as ceasing to use Facebook or keeping distance from it. The many examples of FBR in both the introduction and this research prove FBR is much more than that. Van Dijck (personal message, 2013, July 30) stresses the importance of intention: 'I don't want it in the way Facebook wants it to be standard'. Van Dijck thus places the term in a very broad context: every change to and action on Facebook should be regarded as FBR, as long as it is intended to resist Facebook.

Regarding Stumpel (2010; 2013), van Dijck (2013a) and Leingruber's e-mail (personal message, 2013, April 5th), we can conclude that a certain degree of consciousness is important. However, we propose this does not imply Facebook resisters have to be conscious of being Facebook resisters. They should, however, be very conscious of the implications the platform has for them, and decide to act upon them. Knowledge also seems important regarding FBR (Stumpel, 2010; van Dijck, 2013a; Leingruber, personal message, 2013, April 5th). Stumpel (2010) applies this to Facebook users acting against data exploitation: they need a certain degree of knowledge of how Facebook exploits user data and of how to control personal data. We agree with this in the case of IFBR. For the other forms used in the research, GFBR and SFBR, this seems less the case: regarding van Dijck's intentional statement, knowing what Facebook wants you to do, in order to do it differently, seems sufficient.

In an attempt to pose a definition of Facebook resistance, and given van Dijck's broad context of the term, we state the following: 'FBR is deliberately acting against Facebook and its power'. In this context, not having Facebook at all indeed seems to be the heaviest form of FBR, as these people want to back away or stay away from Facebook's power over them. 'Power' has to be understood more broadly than just the exploitation of data; it has to be regarded as 'how Facebook wants it to be standard'. Since Facebook wants to keep users on its platform as long as possible, since it encourages people to be open and connected, and since it standardizes the platform entirely, every action that intentionally goes against this power has to be considered FBR. Minimizing the time spent on Facebook, deliberately withholding information from Facebook and other users, and changing features of Facebook with the help of add-ons and user scripts are thus indeed all to be considered FBR.

When narrowing down the concept of power to a context of user and data exploitation, and when considering the important motive of privacy, it is possible to extend this definition to the previously posed concept of IFBR. As already concluded, consciousness, knowledge and intention must be included here. IFBR is thus 'wanting, with a conscious intention, to act against Facebook's power, in whichever way possible, in order to protect institutional privacy. This happens with a certain degree of knowledge of Facebook's policy and of ways to manage personal information and user data in order to act against Facebook's power'. This definition leaves an opening for lawsuits against Facebook over privacy violations, since in a broad context these could be considered an attempt to manage personal information and user data.

The concept of social Facebook resistance is to be considered in the light of social privacy. Again, consciousness and intention are important. The required knowledge of Facebook's policy is much less profound here, as the institutional context is largely left out. Being conscious of the fact that everyone could see what you post if you take no action can be considered sufficient, as this embraces the concept of social privacy. SFBR is thus 'wanting, with a conscious attitude, to act against Facebook's power, in whichever way possible, in order to protect social privacy. This happens with a certain degree of knowledge of Facebook's policy and of ways to manage personal information and user data in order to act against Facebook's power'. This entails, for example, not posting certain information, adjusting privacy settings, coding certain information, or using secret groups. People who adjust privacy settings but do not have the intention to resist Facebook or to reduce its influence, however, cannot be considered Facebook resisters. A cross tabulation (N=441, sign. p=0.000) (see appendix R) shows 79 respondents who indicate acting on social privacy but do not have the intention to resist Facebook. This intentional factor was measured in question 8A, which asked whether respondents adjusted their privacy settings in order to resist Facebook or reduce its influence. People who do not adjust privacy settings but conduct other actions to protect their social privacy, such as not posting any information, and who do have the intention to resist Facebook, are social Facebook resisters. Another cross tabulation (N=440, sign. p=0.000) (see appendix R) shows 17 respondents who have not adjusted their privacy settings but do show SFBR.

The two privacy-specific definitions share great resemblance; however, IFBR has much more elaborate implications for both user and Facebook. It implies at least two aspects: on the one hand, institutional Facebook resisters have sufficient knowledge; on the other hand, they are conscious of and care about institutional privacy. Moreover, it would be hard to believe institutional Facebook resisters would not care about social privacy. This statement is confirmed by the survey data: 54% of the respondents showed SFBR and 16.8% showed IFBR. Taken together, 56.2% of the participants showed FBR for privacy reasons. This means a large proportion of institutional Facebook resisters are also social Facebook resisters. A cross tabulation (N=433, sign. p=0.000) (see appendix S) shows only twelve respondents displaying IFBR but not SFBR, whereas 34 respondents were expected.
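The appendix cross tabulations are not reproduced here, but the observed-versus-expected comparison they rest on can be sketched with a standard chi-square test of independence. The counts below are hypothetical (only the 12 IFBR-without-SFBR respondents reported above are real; the other cells are invented for illustration):

```python
# Hypothetical 2x2 cross tabulation of IFBR by SFBR.
# Rows: IFBR yes/no; columns: SFBR no/yes. Cell values are
# illustrative, not the thesis data, except the reported 12.
import numpy as np
from scipy.stats import chi2_contingency

observed = np.array([
    [12, 61],    # IFBR: no SFBR / SFBR
    [134, 226],  # no IFBR: no SFBR / SFBR
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2={chi2:.2f}, p={p:.4f}, dof={dof}")
print("expected counts:")
print(expected.round(1))
```

Comparing the observed cell counts with the `expected` matrix is exactly the kind of reasoning behind the "twelve observed versus 34 expected" statement above.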


Social Facebook resisters have at least some knowledge of Facebook's policy: they know their social privacy is at risk, and they care about social privacy. They may have knowledge about institutional privacy, but they do not necessarily care about it. This implies SFBR is FBR, although to a lesser degree than IFBR, which is more profound and active. We can thus pose a small model of concentric circles, with general Facebook resistance (GFBR) as the outermost of the three. The first definition of FBR is broad and general, even comprising acts of resistance against Facebook that are not inspired by privacy reasons but by other intentions or dissatisfaction. This is, for example, the case for adding a dislike button, Social Fixer and Facebook Purity, since these add-ons change specific features of Facebook. The further inwards, the more specific and intense FBR becomes. The model indicates SFBR and IFBR are forms of the broader GFBR, as the above definition states. In order to give a clearer image of GFBR, we also consider narrow GFBR: GFBR not considering privacy issues.

Figure 1: Concentric model: three types of FBR forms

We previously assumed that resistance entails 'I don't want it in the way Facebook wants it to be standard'. As Facebook offers its users tools to adjust social privacy settings, it can be discussed whether Facebook wants this to be standard or not. If it did, adjusting privacy settings would not be a form of FBR. However, Facebook still wants its users to be connected and open, and the default privacy settings are set to public. We can thus argue that adjusting privacy settings is to be considered a form of FBR. Social privacy tools on Facebook can be considered a concession towards Facebook users, letting users believe their privacy is protected. Facebook realizes it has every interest in not losing users, since this high user base has proven important in maintaining high profits (see supra).

Of the previously mentioned concepts of consciousness, knowledge and intention, intention seems most important. In general, the heaviest form of FBR is not having a Facebook account at all, or deleting the account. Taking this intentional effect into account, it is difficult to state that people who do not know how to work with Facebook, do not have time for it or do not perceive its usefulness can be called Facebook resisters, when they lack the intention to reduce or resist Facebook's power or do not act out of conscious reasons. The same holds for the laggards in the Technology Diffusion Model: some of these people might like to learn to work with Facebook, or would realize the usefulness of the platform once shown what it can do. A cross tabulation (N=443, sign. p=0.001) (see appendix T) with broad GFBR as dependent variable shows that people without a Facebook account show broad GFBR less often. This confirms the statement that not all people without a Facebook account are Facebook resisters. This effect was also found in the results section: participants who score very low on the Facebook resistance scale, showing no or very little resistance, proved less often to have a Facebook account.

Attitude was found to be the most important factor in continued Facebook use. We assumed a negative attitude towards Facebook would trigger FBR, and there indeed proved to be a significant negative correlation between the Facebook love scale and the Facebook resistance scale. When relating attitude to specific actions, however, no significant relations were found. Attitude towards Facebook thus seems to drive Facebook resistance, which in turn drives Facebook resistance actions. Attitude did, however, seem to drive various actions on Facebook: the higher the score on the Facebook love scale, the more respondents post or share updates, pictures, films, links to other websites and hashtags, reply to posts, indicate going to events, play games and chat. Remarkably, no significant correlations were found for the Facebook resistance scale, although respondents with high resistance scores would be expected to conduct these actions less.
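The scale-to-scale relation reported here is an ordinary bivariate correlation. A minimal sketch of how such a negative correlation between two self-report scales can be computed follows; the scores are simulated under an assumed negative relation and are not the survey data:

```python
# Simulated sketch of correlating two scales, in the spirit of the
# 'Facebook love' vs. 'Facebook resistance' analysis. Not real data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
love = rng.uniform(1, 7, 400)                    # 7-point Likert-style scores
resistance = 8 - love + rng.normal(0, 1.0, 400)  # built-in negative relation

r, p = pearsonr(love, resistance)
print(f"r={r:.2f}, p={p:.4f}")
```

A significantly negative `r` is what the thesis reports for the two scales; the per-action tests, by contrast, returned non-significant p-values.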

Results show that the higher respondents scored on the Facebook resistance scale, the more likely they were to adjust privacy settings, to adjust privacy settings out of resistance intentions, and to adjust their use in order to resist Facebook or reduce its influence. No such significant relation was found for installing external aids: it seems not all respondents who install add-ons, user scripts or hacks have the intention to resist Facebook.


Based on the new information concerning the different FBR types, and considering whether the five distinguished groups more or less often have an account, we can distinguish the group names displayed in Table 9. Facebook indifferent resisters are rather indifferent towards Facebook, but they do display resistance, especially SFBR and IFBR. Anti-Facebook resisters more often display all three types of resistance, even though they do not have a Facebook account. Facebook ambiguous resisters are named as such since they display a positive attitude towards Facebook while also showing high resistance, of all three types. Facebook lovers and Alter-Facebookers, however, are no resisters: they show low resistance and seem to have no intention to resist Facebook. The three types of resistance are found less often in these two groups than in the others. Remarkably, Facebook lovers are of all ages, whereas significant age differences were found in the other groups. When plotting the five groups by size (see appendix F) along the Facebook love (attitude) and Facebook resistance dimensions, the image displayed in Figure 4 is created.

NAMING CLUSTERS

| Facebook love  | Facebook resistance | Account | Name                           |
| Moderate       | Moderate            | Yes     | Facebook indifferent resisters |
| Lower          | High                | No      | Anti-Facebook resisters        |
| Lower-moderate | Low                 | No      | Alter-Facebookers              |
| High           | Higher              | Yes     | Facebook ambiguous resisters   |
| High           | Low                 | Yes     | Facebook lovers                |

Table 9: Naming of the cluster groups based on the Facebook love scale, the Facebook resistance scale and account


APPEARANCE OF GFBR, SFBR, IFBR WITHIN CLUSTER GROUPS

| Facebook love  | Facebook resistance | GFBR       | SFBR       | IFBR       |
| Moderate       | Moderate            | Less often | More often | More often |
| Lower          | High                | More often | More often | More often |
| Lower-moderate | Low                 | Less often | Less often | Less often |
| High           | Higher              | More often | More often | Less often |
| High           | Low                 | Less often | Less often | Less often |

Table 10: Appearance of GFBR, SFBR and IFBR in relation to clustering along the Facebook love and Facebook resistance dimensions

Figure 4: Types and number of Facebook users and non-users divided along the Facebook love and Facebook resistance dimensions
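A grouping of respondents along the two scale dimensions, as in the five-cluster typology above, can be sketched with a generic k-means routine. The respondent scores below are simulated, and k-means is only one possible clustering choice; the thesis does not specify its algorithm in this chapter:

```python
# Hypothetical sketch of a five-group clustering on two scale scores
# ('love', 'resistance'). Points are simulated, not survey responses.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(1)
# one standardized (love, resistance) pair per respondent
scores = rng.normal(0, 1, size=(441, 2))

centroids, labels = kmeans2(scores, k=5, seed=2)
print(centroids.shape)  # one (love, resistance) centroid per cluster
```

Each centroid corresponds to a cell of the naming table above (e.g. high love / low resistance as 'Facebook lovers'); on real data the cluster positions, not the names, come out of the algorithm.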

Effects of Facebook resistance

Certain initiatives, such as Social Roulette, certainly succeed in annoying Facebook. Under pressure of successful user protests, privacy settings have been adjusted several times (Soghoian, 2008, March 19; Wray, 2009, August 27; van Dijck, 2013a). The 'A Day Without Facebook' campaign proved successful 39. Due to the many protest actions, Facebook had to adapt the News Feed, had to offer an opt-out possibility for Beacon and later even had to remove the feature (e.g. Johnson, 2007, November 30; Johnson, 2007, December 1st; van Dijck, 2013a). There was the successful lawsuit in 2008 with regard to Beacon (van Dijck, 2013a; Facebook forced into privacy business, 2010; Johnson, 2009, December 10). However, this happened at the beginning of the connectedness-connectivity shift, and Facebook has learnt from its mistakes (van Dijck, 2013a; Debatin et al., 2009). Since the protests around Beacon, no other action has had more effect. A form of open policy regarding policy changes failed, since at least 30% of active Facebook users had to vote in order to create a binding result (van Dijck, 2013a). According to Stumpel (2010), the effects of FBR are thus rather small, since Facebook's protocological control is hardly affected by it. Although effective, add-ons, user scripts and hacks, for example, are only used by a small number of Facebook users (Stumpel, 2010).

39 Consulted on January 12 on the World Wide Web: http://daywithoutfacebook.blogspot.be/

All protest groups grew very large very quickly; after a while, however, the number of protesters in these groups declines, and the remaining users spam the pages promoting events. This can be seen, for example, in the Facebook group 'Facebook, Switch back to the old news feed!!!' 40. This is possibly due to users adjusting to the changes Facebook had implemented, again proving the statement about changing social norms right. Pressing charges against Facebook in Europe is not an easy task: Facebook argues this should be done under Irish jurisdiction, where Facebook Europe is established (see Appendix 4). The status update claiming to protect users' personal info against commercialization seems useless, since by using Facebook, people give Facebook their consent to commercialize it (e.g. Ingham, 2015, January 9; Cohen, 2014, December 1st; Statement of Rights and Responsibilities, last updated on January 30, 2015).

How does Facebook react to Facebook resistance? Fernback (2013) admits that the changing power dynamics between Facebook and its users depend on Facebook's reaction to resistance initiatives. Most protest actions Facebook simply ignores, although it has already had to change privacy settings under pressure (see supra). Facebook has every interest in avoiding FBR, as the effects of the Beacon protests show. Facebook continues to preach openness and connectedness, since it cannot avoid or counteract every act of FBR. There are, however, protests Facebook simply cannot overlook: 'Facebook's response to lawsuits and other allegations over its privacy policies has been to listen to user complaints and make amends when required by law' (van Dijck, 2013a). Still, Facebook has a clear strategy in handling lawsuits in Europe, acting under Irish jurisdiction (see supra). Facebook makes it technically hard to conduct the heaviest form of FBR, removing an account. Moreover, when users do remove an account, Facebook tries to appeal to the emotional bonds with friends by e-mailing pictures and messages (van Dijck, 2013a).

40 Consulted on January 12 2015 on the World Wide Web: https://www.facebook.com/groups/155541434951/?fref=ts

There are other examples where Facebook reacts to initiatives: Leingruber remarked in a comment 41 in the FB Resistance Artists Facebook group that the hack application Social Roulette was disabled by Facebook after only one day. Facebook does not want users to change elements of the platform, nor does it acknowledge this happens (Stumpel, 2013). It is, however, remarkable that Facebook allows Facebook fan pages of these kinds of applications to exist 42 and leaves pages such as Leingruber's FB Resistance Artists undisturbed. Facebook forbids the use of its name in resistance initiatives (Terms and Conditions of use, last updated on January 30, 2015). It does not want to draw attention to questions about the platform or its features, and it always reacts when this happens. Examples are to be found with Social Fixer (formerly called Better Facebook), FB Purity and Leingruber's social ID card (Leingruber, 2012).

Facebook let Kruse and Social Fixer exist without taking further notice, as Social Fixer only changes the appearance of Facebook and has nothing to do with privacy matters. Other add-ons, such as Unfriend Finder and FB Purity, did not enjoy the same goodwill: Unfriend Finder ceased to exist 43, and FB Purity was nearly stopped from spreading and had issues with Facebook's legal division. When Facebook was asked why it turned a blind eye to Social Fixer and not to FB Purity, it refused to comment (Boffard, 2013, August 25).

Implications of this research

Facebook seems to maintain its position as the largest social network site in the world. With its takeovers and partnerships, Facebook's ownership makes it more powerful than ever. Although its effects are small, FBR offers hope for protecting social and institutional privacy and for addressing other forms of dissatisfaction. As privacy concerns proved to be positively related to social media fatigue and to trigger FBR (see supra), Facebook will have to keep in mind that more people are becoming conscious of institutional privacy issues. Social media confidence is therefore important for Facebook, as it is the strongest factor counteracting social media fatigue (Bright et al., 2015). Facebook in particular, and similar companies in general, have to create more trust in their handling of user data, but not through the currently used techniques of merely providing social privacy tools: more transparency is necessary. Instead of ignoring or counteracting resistance, Facebook and other companies could work together with users who want to improve their platforms, as Kruse, for example, intended with Social Fixer. This would probably result in higher feelings of satisfaction among many resisters.

41 https://www.facebook.com/groups/189135107782024/
42 Consulted on January 12 2015 on the World Wide Web: https://www.facebook.com/pages/Unfriend-Finder/343657649045698, https://www.facebook.com/disconnecters, https://www.facebook.com/purityupdates?fref=ts, https://www.facebook.com/bookfacepurity
43 Consulted on January 14th 2015 on the World Wide Web: http://www.unfriendfinder.com/

Several respondents mentioned Facebook is a fake environment that triggers resistance through its constant encouragement to post only positive feelings and information. Facebook should realize its users adapt their behavior to this environment and do not show their true identity (see supra).

As already stated, more people are aware of social and institutional privacy issues. Raynes-Goldie and the results of this research indicate, however, that institutional privacy concerns are still under-addressed compared to social privacy concerns. There has been recent improvement in popular news media, but we agree with Fernback that consciousness of this issue should be developed further. Educational efforts should not only focus on social privacy, but also on institutional privacy and on ways of protecting it. One should remark that privacy issues concerning Facebook are always regarded in relation to Facebook users; however, it was shown that Facebook also gathers information about non-Facebook users. The privacy debate should thus be extended to everyone.

Nonetheless, we also agree with Leingruber and Fuchs that critique of Facebook should be reviewed in the right context: that of a capitalist environment exerting pressure to make profits. Leingruber states: 'I actually criticize people that just see Facebook as an evil entity, because when you boil it down their critique is a lot of times much more about the system of capitalism in general, in which Facebook is playing (very well)' (personal message, 2013, April 5th). Educational efforts should make people aware of this total picture, and should not only focus on adolescents, but on all ages.

Limitations and future research

This research paper addressed the topic of Facebook resistance, providing a contextual framework along van Dijck's techno-cultural and socio-economic dimensions. It confirmed the importance of both social and institutional privacy, provided a clearer view of FBR forms, of motives for FBR and of Facebook resisters themselves, and discussed several types of FBR, for which definitions were posed. A strength of this research is that it did not focus only on students, as other research on motives and Facebook often does, but on all ages.

There are, however, some limitations to this research that have to be addressed. The research was not based on a random sample; it is therefore disputable whether the results can be generalized to the whole population of Facebook users and non-users. Moreover, the Facebook love and Facebook resistance scales are the result of self-reported values, and self-reported data in research on social network sites often tend to be irregular and infrequent (Staddon et al., 2013). As intention proves to be important in defining FBR, and thus in defining types of Facebook resisters, this does not pose a large problem. However, some respondents clearly show FBR forms and indicate the intention to resist, but do not consider themselves true resisters. In searching for a definition of Facebook resistance, we concluded it is not necessary that people realize they are resisters, as long as they show the intention and conduct FBR actions. On the other hand, there are respondents who claim to resist Facebook, but who do not indicate an intention to resist or any FBR actions. These two issues affect the clustering and therefore the typology of Facebook users and non-users we proposed. We also have to take into account that the reliability value of the Facebook love scale proved too low to conduct reliable analyses.
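The reliability check referred to here can be illustrated with Cronbach's alpha on simulated items; the three-item structure and the scores below are assumptions for illustration, not the thesis scale (for two-item scales, the Spearman-Brown coefficient discussed by Eisinga et al., 2013, is the usual alternative):

```python
# Illustrative Cronbach's alpha computation on simulated item scores.
# The items and their number are hypothetical, not the thesis data.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix of scores."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

rng = np.random.default_rng(3)
latent = rng.normal(0, 1, 400)
# three noisy indicators of the same latent attitude
items = np.stack([latent + rng.normal(0, 0.8, 400) for _ in range(3)], axis=1)
alpha = cronbach_alpha(items)
print(f"alpha={alpha:.2f}")
```

A scale whose alpha falls below the conventional 0.7 threshold, as the Facebook love scale apparently did, gives noisy composite scores and weakens any analysis built on them.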

In reviewing the qualitative results, no software was used to list the different forms and motives of FBR, making the distinction highly subjective. Given the literature and the importance of institutional and social privacy, there are, however, reasons to believe a reasonable attempt was made.

We recommend future research into more objective scales in order to create a better typology of Facebook users and non-users based on Facebook resistance. It also seems interesting to conduct resistance research on other platforms connected with Facebook, such as WhatsApp, Instagram and Spotify, and on other companies exploiting user data.


References

Acquisti, A. & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on Facebook. PET 2006. Consulted on April 14 2013 on the World Wide Web: http://people.cs.pitt.edu/~chang/265/proj10/zim/imaginedcom.pdf

Albrow, M. (1990). Globalization, knowledge and society: Readings from international sociology. London: Sage Publications Ltd.

Amichai-Hamburger, Y. & Vinitzky, G. (2010). Social network use and personality. Computers in Human Behavior, 26, 1289-1295.

Andrade, N.N.G., Martin, A. & Monteleone, S. (2013). "All the better to see you with, my dear": Facial recognition and privacy in online social networks. Security & Privacy, IEEE, 11, 21-28.

Asif, Z. & Khan, M. (2012). Users' perceptions on Facebook's privacy policies. ARPN Journal of Systems and Software, 3, 119-125.

Back, M.D., Stopfer, J.M., Vazire, S., Gaddis, S., Schmukle, S.C., Egloff, B. & Gosling, S.D. (2008). Facebook profiles reflect actual personality, not self-idealization. Psychological Science, 21(3), 372-374.

Backstrom, L., Dwork, C. & Kleinberg, J. (2007). Wherefore art thou r3579x? Anonymized social networks, hidden patterns, and structural steganography. Research paper for the International World Wide Web Conference Committee in Alberta, Canada.

Bell, D. (1973). The coming of post-industrial society: A venture in social forecasting. New York, NY: Basic Books.

Boffard, R. (2013, August 25). Meet Matt Kruse, the man making Facebook better. Ars Technica. Consulted on January 12 2015 on the World Wide Web: http://arstechnica.com/business/2013/08/meet-matt-kruse-the-man-making-facebook-better/

Bonson, E., Escobar, T. & Ratkai, M. (2014). Testing the inter-relations of factors that may support continued use intention: The case of Facebook. Social Science Information Sur Les Sciences Sociales, 53(3), 293-310.

Booth, R. (2014, June 30). Facebook reveals news feed experiment to control emotions. The Guardian.

Boyd, D.M. (2008). Facebook's privacy trainwreck: Exposure, invasion and social convergence. Convergence, 14, 13-20.


Brandtzæg, P.B., Luders, M. & Skjetne, J.H. (2010). Too many Facebook "Friends"? Content sharing and sociability versus the need for privacy in social network sites. International Journal of Human-Computer Interaction, 26, 1006-1030.

Brennan, C. (2015). 'If I had done this in Ireland it would've taken 25 years' – Facebook privacy campaigner. Consulted on May 1st 2015 on the World Wide Web: http://www.thejournal.ie/facebook-madness-irish-courts-2040163-Apr2015/

Bright, L.F., Kleiser, S.B. & Grau, S.L. (2015). Too much Facebook? An exploratory examination of social media fatigue. Computers in Human Behavior, 44, 148-155.

Brooks, B., Hogan, B. & Ellison, N.B. (2014). Assessing structural correlates to social capital in Facebook ego networks. Social Networks, 38, 15-15.

Buckley, B. & Hunter, M. (2011). Say cheese! Privacy and facial recognition. Computer Law and Security Review: The International Journal of Technology and Practice, 27, 637-640.

Buffardi, L.E. & Campbell, W.K. (2010). Narcissism and social networking web sites. Personality and Social Psychology Bulletin, 34, 1303-1314.

Burke, M., Marlow, C. & Lento, T. (2010). Social network activity and social well-being. In: Proceedings of the 2010 ACM Conference on Human Factors in Computing Systems. New York: ACM, 1909-1912.

Butt, S. & Phillips, J.G. (2008). Personality and self reported mobile phone use. Computers in Human Behavior, 24(2), 346-360.

Carpenter, C.J. (2012). Narcissism on Facebook: Self-promotional and anti-social behavior. Personality and Individual Differences, 52, 482-486.

Castells, M. (2009). Communication power. Oxford/New York: Oxford University Press.

Castells, M. (1996). The rise of the network society. Oxford/Malden MA: Blackwell Publishers.

Chang, S.-C. & Liu, H. (2011). Using focus group to investigate why people use Facebook: The case of Taiwan. In D.F. Kocaoglu, T.R. Anderson & T.U. Diam (Eds.), Proceedings of Picmet 11: Technology Management in the Energy-Smart World (Picmet).

Chen, S.C., Yen, D.C. & Hwang, M.I. (2012). Factors influencing the continuance intention to the usage of Web 2.0: An empirical study. Computers in Human Behavior, 28, 933-941.


Cheung, C.M.K., Chiu, P.-Y. & Lee, M.K.O. (2011). Online social networks: Why do students use Facebook? Computers in Human Behavior, 27(4), 1337-1343.

Chiang, H.-S. (2013). Continuous usage of social networking sites: The effect of innovation and gratification attributes. Online Information Review, 37(6), 851-871.

Choi, J., Jung, J. & Lee, S.W. (2013). What causes users to switch from a local to a global social network site? The cultural, social, economic, and motivational factors of Facebook's globalization. Computers in Human Behavior, 6, 2665-2673.

Clarkson, K.L., Liu, K. & Terzi, E. (2010). Towards identity anonymization in social networks. In P.S. Yu, J. Han & C. Faloutsos (Eds.), Link mining: Models, algorithms and applications. New York, NY: Springer.

Commission for the Protection of Privacy. (2014a). Facebook. Consulted on May 18 2015 on the World Wide Web: http://www.privacycommission.be/nl/faq-themas/facebook

Commission for the Protection of Privacy. (2014b). Het recht om vergeten te worden: kunt u online uw sporen wissen, en hoe moet u dat dan doen? [The right to be forgotten: can you erase your traces online, and how should you do so?] Consulted on May 3rd 2015 on the World Wide Web: http://www.privacycommission.be/en/node/16953#section-1

Constine, J. (2011, January 24). Facebook's sponsored stories turns News Feed posts into home page ads. Social Times. Consulted on May 22nd 2013 on the World Wide Web: http://www.insidefacebook.com/2011/01/24/sponsored-stories-feed-ads/

Chiu, P.-Y., Cheung, C.M.K. & Lee, M.K.O. (2008). Online social networks: Why do "we" use Facebook? In M.D. Lytras, J.M. Carroll, E. Damiani, R.D. Tennyson, D. Avison, G. Vossen & P. Ordóñez de Pablos (Eds.), The Open Knowledge Society. A Computer Science and Information Systems Manifesto (pp. 67-74). Berlin: Springer.

Christofides, E., Muise, A. & Desmarais, S. (2009). Information disclosure and control on Facebook: Are they two sides of the same coin or two different processes? Cyberpsychology & Behavior, 12(3), 341-345.

Darwell, B. (2013, January 11). Understanding the difference between Facebook Sponsored Stories, Page Post Ads, Promoted Posts and Marketplace Ads. Consulted on May 22nd 2013 on the World Wide Web: http://www.insidefacebook.com/2013/01/11/understanding-the-difference-between-facebook-sponsored-stories-page-post-ads-promoted-posts-and-marketplace-ads/


Dindar, M. & Akbulut, Y. (2014). Why do pre-service teachers quit Facebook? An investigation on 'quitters forever' and 'quitters for a while'. Computers in Human Behavior, 39 , 170-176.

Debatin, B., Lovejoy, J.P., Horn, A. & Hughes, B.N. (2009). Facebook and online privacy: Attitues, behavior, and unintended consequences. Journal of Computer-Mediated Communication, 15 , 83-108.

Deci, E.L. & Ryan R.M. (1987). The support of autonomy and the control of behavior. J Pers Soc Psychol, 53 (6), 1024–1037.

Dredge, R., Gleeson, J., Garcia, X. (2014). Presentation on Facebook and risk of cyberbullying victimization. Computers in Human Behavior, 40 , 16-22.

Eisinga, R., Te Grotenhuis, M. & Pelzer, B. (2013). The reliability of a two-item scale: Pearson, Cronbach or Spearman-Brown? International Journal of Public Health,58 (4), 637–642.

Egele, M., Moser, A., Kruegel, C. & Kirda, E. (2012). PoX: Protecting users from malicious Facebook applications. Computer Communications, 35 , 1507, 1515.

Ellison, N.B., Gray, R., Lampe, C. & Fiore, A.T. (2014). Social capital and resource requests on Facebook. New media & society, 16 (7), 1104-1121.

Ellison, N.B., Gray, R., Vitak, J., et al. (2013) Calling all Facebook friends: exploring requests for help on Facebook. In: Proceedings of the 7th annual international conference on weblogs and social media , Boston, MA, 8-10 July 2013, pp. 155–164. Washington, DC: Association for the Advancement of Artificial Intelligence.

Ellison, N.B., Steinfield, C. & Lampe, C. (2007). The benefits of Facebook “friends:” Social capital and college students’ use of online social network sites. Journal of Computer-Mediated Communication, 12 ( 4), 1143-1168.

Ellison, N.B., Steinfield, C. & Lampe, C. (2011). Connection strategies: Social capital implications of Facebook-enabled communication practices. New Media & Society, 13 , 873-892.

European Commission. (2012, January 25). Commission proposes a comprehensive reform of the data protection rules. Consulted on March 15 on the World Wide Web: http://ec.europa.eu/justice/newsroom/data-protection/news/120125_en.htm

European Commission. (2014). Myth-Busting: The Court of Justice of the EU and the “Right to be Forgotten”. Consulted on May 3rd 2015 on the World Wide Web: http://ec.europa.eu/justice/data-protection/files/factsheets/factsheet_rtbf_mythbusting_en.pdf

European Commission. (2015). The EU data protection reform and big data protection factsheet. Consulted on May 3rd 2015 on the World Wide Web: http://ec.europa.eu/justice/data-protection/files/data-protection-big-data_factsheet_web_en.pdf

European Commission. (2015, April 15). What will the EU Data Protection Reform bring for startup companies and Big Data? Consulted on May 3rd 2015 on the World Wide Web: http://ec.europa.eu/justice/newsroom/data-protection/news/150415_en.htm

The Facebook Blog (2009, February 10). “I like this”. Consulted on February 1st 2015 on the World Wide Web: https://blog.facebook.com/blog.php?post=53024537130

Facebook. (n.d.). Cookies, Pixels & similar technologies. Consulted on May 5th 2015 on the World Wide Web: https://www.facebook.com/help/cookies/?ref=sitefooter

Facebook’s Name Policy. (last updated on March 10, 2015). Consulted on July 6 2015 on the World Wide Web: https://www.facebook.com/help/292517374180078

Facebook Data Policy. (2015). Consulted on February 2nd 2015 on the World Wide Web: https://www.facebook.com/policy.php

Facebook forced into privacy business (2010). Computer Fraud & Security, 2010 , 1-2.

Facebook principles . (n.d.). Consulted on March 1st 2015 on the World Wide Web: https://www.facebook.com/principles.php

Facebook Statement of Rights and Responsibilities. (2015). Consulted on February 2nd 2015 on the World Wide Web: https://www.facebook.com/legal/terms/update

Facebook.com Users Against the "News Feed" and "Mini Feed" (2006, September 5th). Consulted on January 17 2015 on the World Wide Web: http://www.petitiononline.com/faceb00k/petition.html

Facebook voor Brusselse rechter. (2015, June 15). De Standaard.

Farrell, H. (2014, June 20). The case that might cripple Facebook. The Washington Post .

Feitelson, D.G., Frachtenberg, E. & Beck, K.L. (2013). Development and deployment at Facebook. IEEE Internet Computing, 17 (4), 8-17.

FBresistance (n.d.). Consulted on the World Wide Web on January 14th 2015: http://fbresistance.com/

Fernback, J. (2013). Sousveillance: Communities of resistance to the surveillance environment. Telematics and Informatics, 30(1), 11-21.

“De fiscus volgt je op Facebook”. (2009, July 18). De Standaard.

Fuchs, C. (2011). An alternative view of privacy on Facebook . Information, 2 , 140-165.

Galloway, A.R. & Thacker, E. (2007). The exploit: A theory of networks. Minneapolis, MN: University of Minnesota Press.

Galloway, A.R. (2004). Protocol: How control exists after decentralization. Cambridge, MA: MIT Press.

Giddens, A. (1990). The consequences of modernity . Cambridge: Polity Press.

Gonzales, A.L., Hancock, J.T. (2010). Mirror, mirror on my Facebook wall: Effects of exposure to Facebook on self-esteem. Cyberpsychology Behavior and Social Networking, 14 , 79-83.

Gosling, S. (2009). The ancient psychological roots of Facebook behavior. The Harvard Business Review. Consulted on the World Wide Web on March 28 2015. http://www.blogs.hbr.org/now-new- next/2009/03/the-ancient-psychological-root.html

Govani, T. & Pashley, H. (2005). Student awareness of the privacy implications when using Facebook. Carnegie Mellon. Consulted on April 14 2013 on the World Wide Web: http://lorrie.cranor.org/courses/fa05/tubzhlp.pdf.

Granovetter, M.S. (1973). The strength of weak ties. American Journal of Sociology, 78 , 1360-1380.

Gray, R., Ellison, N., Vitak, J., et al. (2013). Who wants to know? Question-asking and answering practices among Facebook users. In: Proceedings of the 16th annual conference on computer- supported cooperative work and social computing (CSCW) , San Antonio, TX, 23–27 February 2013, pp. 1213–1224. New York: ACM.

Grimmelmann, J. (2009). Saving Facebook. Iowa Law Review, 94 , 1137-1206.

Gross, D. (2011, September 23). Users not happy with new Facebook changes. CNN. Consulted on February 25 on the World Wide Web: http://edition.cnn.com/2011/09/21/tech/social- media/facebook-changes-react/index.html?_s=PM:TECH

Gross, R. & Acquisti, A. (2005). Information revelation and privacy in online social networks. Workshop on Privacy in the Electronic Society (WPES) . Consulted on January 5 2015 on the World Wide Web: http://heinz.cmu.edu/~acquisti/papers/privacy-facebook-gross-acquisti.pdf

Gwebu, K.L., Wang, J. & Guo, L. (2014). Continued usage intention of multifunctional friend networking services: A test of a dual-process model using Facebook. Decision Support Systems, 67 , 66-67.

Halliday, J. (2011, September 22). Facebook to transform into an entertainment hub. The Guardian.

Hoadley, CM., Xu, H., Lee, J.J. & Rosson, M.B. (2010). Privacy as information access and illusory control: The case of the Facebook News Feed privacy outcry. Electronic Commerce Research and Applications, 9 , 50-60.

Hsu, C.-L., Y, C.-C. & Wu, C.-C. (2014a). Exploring the continuance intention of social networking websites: an empirical research. Information Systems and E-business Management, 12 (2), 139-163.

Hsu, M.-H., Chuang, L.-W., Chiu, S.-P., Chu, W.-C. (2014b). Understanding users' intentions to continue using social media: The role of cognitive absorption, social network, social presence. 2nd International Conference on Innovation, Communication and Engineering , 179-182.

Hull, G., Lipford, H.R. & Latulipe, C. (2011). Contextual gaps: privacy issues on Facebook. Ethics and Information Technology, 13 , 289-302.

Ingham, A. (2015, January 9). Facebook ‘copyright notice’ a hoax. Liberty Voice . Consulted on May 6 2015 on the World Wide Web: http://guardianlv.com/2015/01/facebook-copyright-notice-a-hoax/

Jackson, S.A. & Marsh, H.W. (1996). Development and validation of a scale to measure optimal experience: the flow state scale. J Sport Exerc Psychol, 18 (1), 17–35.

Jenkins, S. (2007, December 7). In the age of leaky data, there is no such thing as a secure online computer. The Guardian.

Jin, C.H. (2013). The perspective of a revised TRAM on social capital building: The case of Facebook usage. Information Management, 50(4), 162-168.

Johnson, B. (2007, August 13). Facebook’s code leak raises fear of fraud. The Guardian.

Johnson, B. (2009, December 10). Facebook privacy change angers campaigners. The Guardian.

Johnson, R. (2010, July 21). Scaling Facebook to 500 Million Users and Beyond. Consulted on January 12 on the World Wide Web: https://www.facebook.com/notes/facebook-engineering/scaling-facebook-to-500-million-users-and-beyond/409881258919

Johnston, A. & Wilson, S. (2012). Privacy compliance risks for Facebook. Technology and Society Magazine, 31 , 59-64.

Johnston, K., Tanner, M., Lalla, N. & Kawalski, D. (2013). Social capital: The benefit of Facebook ‘friends’. Behavior and Information Technology, 32 (1), 24-36.

Jones, H. & Soltren, J.H. (2005). Facebook: Threats to privacy. Consulted on January 2nd 2015 on the World Wide Web: http://citeseerx.ist.psu.edu/viewdoc/summary?doi=10.1.1.112.3154

Jung, Y., Gray, R., Lampe, C., et al. (2013) Favors from Facebook friends: unpacking dimensions of social capital. In: Proceedings of the conference on human factors in computing systems , Paris, France, 27 April – 2 May 2013, pp. 11–20. New York: ACM.

Justice, E. (2007, September 15). Facebook suicide: The end of a virtual life. The Times .

Kaplan, A.M. & Haenlein, M. (2010). Users of the World, unite! The challenges and opportunities of social media. Business Horizons, 53 , 59-68.

Khan, M.L., Wohn, D.Y. & Ellison, N.B. Actual friends matter: An internet skills perspective on teens' informal academic collaboration on Facebook. Computers & Education, 79, 138-147.

Kim, J.H., Kim, M-S. & Nam, Y. (2010). An analysis of self-construals, motivations, Facebook use, and user satisfaction. International Journal of Human-Computer Interaction, 26 (11-12), 1077-1099.

Kourouthanassis, P., Lekakos, G., Gerakis, V. (2015). Should I stay or should I go? The moderating effect of self-image congruity and trust on social networking continued use. Behavior & Information Technology, 34 (2), 190-203.

Krishnan, A. & Hunt, D.S. (2015). Influence of a multidimensional measure of attitudes on motives to use social networking sites. Cyberpsychology Behavior and Social Networking, 18 (3), 165-172.

Ku, Y.-C., Chen, R. & Zhang, H. (2013). Why do users continue using social networking sites? An exploratory study of members in the United States and Taiwan. Information & Management, 50 (7), 571-581.

Kwan, G.C.E. & Skoric, M.M. (2013). Facebook bullying: An extension of battles in school. Computers in Human Behavior, 29 , 16-25.

Lampe, C., Gray, R., Fiore, A., et al. (2014) Help is on the way: patterns of responses to resource requests on Facebook. In: Proceedings of the 2014 ACM conference on computer-supported cooperative work , Baltimore, MD, 15–19 February 2014, pp 3–15. New York: ACM.

Lampe, C., Wohn, D.Y., Vitak, J., Ellison, N.B. & Wash, R. (2011). Student use of Facebook for organizing collaborative classroom activities. International Journal of Computer-supported Collaborative Learning, 6 , 329–347.

Lang, C. & Barton, H. (2015). Just untag it: Exploring the management of undesirable Facebook photos. Computers in Human Behavior, 43 , 147-155.

Lee-Won, R.J., Shim, M., Joo, Y.K. & Sung, G.P. (2014). Who puts the best "face" forward on Facebook?: Positive self-presentation in online social networking and the role of self-consciousness, actual-to-total Friends ratio, and culture. Computers in Human Behavior, 39, 413-423.

Leingruber, T. (2012). Social ID Bureau. Pick-up your Social Identity Card. Consulted on January 12 2015 on the World Wide Web: http://fbbureau.com/

Leong, P. (2011). Role of social presence and cognitive absorption in online learning environments. Distance Education, 32 , 5-28.

Lin, K.Y., Lu, H.P. (2011). Why people use social networking sites: An empirical study integrating network externalities and motivation theory. Computers in Human Behavior, 27 (3), 1152-1161.

Lin, T.H., Lu, H.P., Hsiao, K.L. & Hsu, H.H. (2014). Continuance intention of Facebook check-in service users: An integrated model. Social behavior and personality, 42 (10), 1745-1760.

Liu, Y., Gummadi, K.P., Krishnamurthy, B. & Mislove, A. (2011). Analyzing Facebook privacy settings: User expectations vs. reality. IMC’11 Proceedings of the 2011 ACM SIGCOMM conference on Internet measurement conference, 61-70.

Lomas, N. (2013, June 24). Facebook’s creepy data-grabbing ways make it the borg of the digital world. TechCrunch . Consulted on January 15 2015 on the World Wide Web: http://techcrunch.com/2013/06/24/creepy-facebook/

Lou, L.L.L. (2010). Loneliness, friendship, and self-esteem: First-year college students’ experience of using facebook. Dissertation Abstracts International: Section B: The Sciences and Engineering, 70 , 7902.

Lown, M. (2012, May 3rd). Bouncers 'checking Facebook on phones' as identification. BBC. Consulted on January 12 2015 on the World Wide Web: http://www.bbc.co.uk/newsbeat/article/17930370/bouncers-checking-facebook-on-phones-as-identification

Lyndon, A., Bonds-Raacke, J. & Cratty, A.D. (2011). College students’ Facebook stalking of ex-partners. Cyberpsychology, Behavior and Social Networking, 14, 711-716.

Madejski, M., Johnson, M. & Bellovin, S.M. (2011). The failure of online social network privacy settings. Paper published by Columbia University. Consulted on June 28 2013 on the World Wide Web: http://academiccommons.columbia.edu/catalog/ac:135406

Matussek, K. (2013, February 15). Facebook scores win in legal-regime dispute with Germany. BloombergBusiness. Consulted on May 12 2015 on the World Wide Web: http://www.bloomberg.com/news/articles/2013-02-15/facebook-scores-win-in-legal-regime-dispute-with-germany

McCrae, R. R. (1992). The five-factor model: Issues and applications [Special issue]. Journal of Personality, 60 (2).

Mehdizadeh, S. (2010). Self-presentation 2.0: Narcissism and self-esteem on Facebook. Cyberpsychology, behavior, and social networking, 13 , 357-364.

Messmer, E. (2007). Study: Facebook users easy targets for identity theft. Consulted on January 3rd 2015 on the World Wide Web: http://www.macworld.com/article/59488/2007/08/facebook.html

Moore, K. & McElroy, J.C. (2012). The influence of personality on Facebook usage, wall postings, and regret. Computers in human behavior, 28 , 267-274.

Moran, C. (2015). Turning against the CIA: Whistleblowers during the ‘Time of Troubles’. History, 100(340), Special Issue, 251-274.

Moreno, M.A., Kelleher, E., Ameenuddin, N. & Rastogi, S. (2014). Young adult females' views regarding online privacy protection at two time points. Journal of Adolescent Health, 55 (3), 347-351.

Munger, M. (2015). No place to hide: Edward Snowden, the NSA, and the US surveillance state. Independent Review, 19 (4), 605-609.

Nadkarni, A. & Hofmann, S.G. (2012). Why do people use Facebook? Personality and Individual Differences, 52 , 243-249.

Nasralla, S. & Gruber, A. (2015, April 9). Austrian student's lawsuit vs Facebook bogged down in procedure. Reuters. Consulted on May 1st 2015 on the World Wide Web: http://www.reuters.com/article/2015/04/09/us-facebook-austria-lawsuit-idUSKBN0N019420150409

Narayanan, A. & Shmatikov, V. (2009). De-anonymizing Social Networks. Security & Privacy, IEEE .

Neyskens, M. (2015, May 15). Reactie staatssecretaris voor Privacy Bart Tommelein op de aanbevelingen van de Privacycommissie aan Facebook. Politics.be . Consulted on May 18 2015 on the World Wide Web: http://www.politics.be/persmededelingen/41262/

Nie, J. & Sundar, S. (2013). Who would pay for Facebook? Self esteem as a predictor of user behavior, identity construction and valuation of virtual possessions. Human-Computer interaction – Interact 2013, PT III, Book Series: Lecture Notes in Computer Science, 8119 , 726-743.

Nosko, A., Wood, E. & Molema, S. (2010). All about me: Disclosure in online social networking profiles: The case of Facebook. Computers in Human Behavior, 3 , 406-418.

Nosko, A., Wood, E., Kenney, M., Archer, K., De Pasquale, D., Molema, S. & Zivcakova, L. (2012). Examining priming and gender as a means to reduce risk in a social networking context: Can stories change disclosure and privacy setting use when personal profiles are constructed? Computers in Human Behavior, 28 , 2067-2074.

O’Carroll, L. & Halliday, J. (2011, September 29). Facebook flooded with campaigners' requests for hard copies of information. The Guardian.

Ong, E.Y.L., Ang, R.P., Ho, J.C.M., Lim, J.C.Y., Goh, D.H., Lee, C.S. & Chua, A.Y.K. (2011). Narcissism, extraversion and adolescents’ self-presentation on Facebook. Journal of Personality and Individual Differences, 50, 180-185.

Orr, E.S., Sisic, M., Ross, C., Simmering, M.G., Arseneault, R.R. (2009). The influence of shyness on the use of Facebook in an undergraduate sample. Cyberpsychology, behavior, and social networking, 12 , 337-340.

Out-Law.com (2012, September 21). Facebook agrees to delete facial recognition image 'templates' in response to EU privacy concerns. Consulted on December 28 2014 on the World Wide Web: http://www.out-law.com/en/articles/2012/september/facebook-agrees-to-delete-facial-recognition-image-templates-in-response-to-eu-privacy-concerns/

Panovich, K., Miller, R. & Karger, D. (2012). Tie strength in question & answer on social network sites. In: Proceedings of the 2012 ACM conference on computer supported cooperative work, Savannah, GA, 6–10 February 2012, pp. 1057–1066. New York: ACM.

Papacharissi, Z. & Mendelson, A. (2008). Toward a new(er) sociability: Uses, gratifications, and social capital on Facebook. Paper presented at the Internet Research conference, Copenhagen, Denmark, October 2008.

Perez, J.C. (2007, November 30). Facebook’s Beacon more intrusive than previously thought. PC World. Consulted on January 15 2015 on the World Wide Web: http://www.pcworld.com/article/140182/facebooks_beacon_more_intrusive_than_previously_thought.html

Pihlström, M. & Brush, G.J. (2008), Comparing the perceived value of information and entertainment mobile services. Psychology and Marketing, 25 (8), 732-755.

Press Association (2013, June 13). Facebook to introduce clickable hashtags. The Guardian.

Putnam, R.D. (2000). Bowling alone: The collapse and revival of American community. New York, NY: Simon and Schuster.

Raacke, J. & Bonds-Raacke, J. (2008). MySpace and Facebook: Applying the uses and gratifications theory to exploring friend-networking sites. Cyberpsychology and Behavior, 11 , 169-174.

Raacke, J. & Bonds-Raacke, J. (2010). MySpace and Facebook: Identifying dimensions of uses and gratifications for friend networking sites. Individual Differences Research , 8, 27-33.

Rainie, L. & Wellman, B. (2012). Networked: The social operating system . Cambridge, MA: MIT Press.

Rantanen, T. (2005). The media and globalization . London: Sage Publications Ltd.

Raynes-Goldie, K. (2010). Aliases, creeping, and wall cleaning: Understanding privacy in the age of Facebook. First Monday, 15, article 32. Consulted on January 2nd 2015 on the World Wide Web: http://firstmonday.org/ojs/index.php/fm/article/view/2775/2432

Rechtszaak tegen Facebook pas in september gepleit. (2015, June 16). De Standaard .

Robards, B. (2010). Randoms in my bedroom: Negotiating privacy and unsolicited contact on social network sites . PRism Online PR Journal,7 (3) .

Rogers, E.M. (1995). Diffusion of Innovations . New York, NY: The Free Press.

Roosendaal, A. (2011). Facebook tracks and traces everyone: Like this! Tilburg Law School Legal Studies Research Paper Series No. 03/2011.

Ross, C., Orr, E.S., Sisic, M., Arseneault, J.M., Simmering, M.G. & Orr, R.R. (2009). Personality and motivations associated with Facebook use. Computers in human behavior, 25 , 578-586.

Ryan, T. & Xenos, S. (2011). Who uses Facebook? An investigation into the relationship between the Big Five, shyness, narcissism, loneliness, and Facebook usage. Computers in Human Behavior, 27 (5), 1658-1664.

Sagioglou, C. & Greitemeyer, T. (2014). Facebook's emotional consequences: Why Facebook causes a decrease in mood and why people still use it. Computers in Human Behavior, 35 , 359-363.

Schenker, D. (2012, March 5th). Artist explores online identity and privacy with Facebook ID cards. The Creators Project. Consulted on July 16 2015 on the World Wide Web: http://thecreatorsproject.vice.com/blog/artist-explores-online-identity-and-privacy-with-facebook-id-cards

Schrems, M. (2015, March 10). Final response by Facebook. Facebook Class Action. Consulted on May 1st 2015 on the World Wide Web: https://www.fbclaim.com/ui/page/updates

Sheldon, K.M., Abad, N., Hirsch, C. (2011). A two-process view of Facebook use and relatedness need- satisfaction: Disconnection drives use, and connection rewards it. Journal of Personality and Social Psychology, 100 , 766-775.

Smith, W.P. & Kidder, D.L. (2010). You’ve been tagged! (Then again, maybe not): Employers and Facebook. Business Horizons, 53 , 491-499.

Soghoian, C. (2008, March 19). Flaws emerge in Facebook’s new privacy controls. Consulted on May 11 2015 on the World Wide Web: http://news.cnet.com/8301-13739_3-9898098-46.html

Staddon, J., Acquisti, A. & LeFevre, K. (2013). Self-reported social network behavior: Accuracy predictors and implications for the privacy paradox. ASE/IEEE International Conference on Social Computing (Socialcom) , 295-302.

Steenackers, J. (2012, August 1). De fiscus, uw vriend op Facebook. MoneyTalk.

Stefanone, M.A., Lackaff, D., Rosen, D. (2011). Contingencies of self-worth and social-networking-site behavior. Cyberpsychology, Behavior, and Social Networking, 14 , 41-49.

Steinfield, C., Ellison, N.B. & Lampe, C. (2008). Social capital, self-esteem, and use of online social network sites: A longitudinal analysis. Journal of Applied Developmental Psychology, 29 , 434-445.

Stumpel, M. (2013). Facebook resistance: Augmented freedom. In G. Lovink & M. Rasch (Eds.), Unlike us reader. Social media monopolies and their alternatives (pp. 274-288). Amsterdam: Institute of Network Cultures.

Stumpel, M. (2010). The politics of social media. Facebook: control and resistance. Unpublished master's thesis, Amsterdam, Department of Media Studies.

Suki, N.M., Ramayah, T. & Ly, K.K. (2012). Empirical investigation on factors influencing the behavioral intention to use Facebook. Universal Access in the Information Society, 11 (2), 223-231.

Surma, J. (2013). The privacy problem in big data applications: An empirical study on Facebook. ASE/IEEE International Conference on Social Computing , 955-958.

Swallow, S. (2011, October 23rd). How recruiters use social networks to screen candidates. Mashable. Consulted on February 15 2015 on the World Wide Web: http://mashable.com/2011/10/23/how-recruiters-use-social-networks-to-screen-candidates-infographic/ and http://www.mjhsbnn.com/mjhs/nick/employers_use_social_media.htm

Tanghe, N. (2013, July 22). Verzekeraars voeren strijd tegen fraude op: ‘We screenen alles: Facebook, Youtube, websites…’. De Standaard.

Taylor, D.G., Lewin, J.E. & Strutton, D. (2011). Friends, fans and followers: Do ads work on social networks? How gender and age shape receptivity. Journal of Advertising Research, 51 (1), 258-275.

Tierens, S. (Anchor man). (2015). Koppen XL: Locatievoorzieningen aan/uit. Vlaanderen: Een.

The Guardian PDA: The digital content blog. (2009, October 27). Facebook users protest over news feed. Consulted on May 18 2013 on the World Wide Web: http://www.guardian.co.uk/media/pda/2009/oct/27/new-facebook-newsfeed-protest

Thompson, J.B. (1995). The media and modernity: A social theory of the media . Cambridge: Polity Press.

Toma, C.L. & Hancock, J.T. (2013). Self-affirmation underlies Facebook use. Personality and social psychology bulletin, 39 (3), 321-331.

Truant, P. (2013). Vechten tegen de bierkaai? Een onderzoek naar ‘Facebook resistance’ in Vlaanderen (Master's thesis, Ghent University, Ghent, Belgium).

Turow, J. (2008). Niche envy: Marketing discrimination in the digital age. Cambridge, MA: MIT Press.

Valenzuela, S., Park, N. & Kee, K.F. (2009). Is there social capital in a social network site?: Facebook use and college students’ life satisfaction, trust, and participation. Journal of Computer-mediated Communication, 14, 875–901.

Vallerand, R.J. (1997). Toward a hierarchical model of intrinsic and extrinsic motivation. Adv Exp Soc Psychol, 29, 271–360.

Van Alsenoy, B., Verdoodt, V., Heyman, R., Ausloos, J., Wauters, E. & Acar, G. (2015). From social media service to advertising network: A critical analysis of Facebook’s revised policies and terms. Research paper commissioned by the Belgian Privacy Commission .

Van Dijck, J. (2013a). The culture of connectivity: A critical history of social media . New York, NY: Oxford University Press.

Van Dijck, J. (2013b). 'You have one identity': performing the self on Facebook and LinkedIn. Media Culture & Society, 35(2), 199-215.

Van Dijk, J. (1999). The network society: Social aspects of new media . London: Sage Publications.

Van Dijk, J. (2012). The network society (3rd ed.). London: Sage Publications.

Van House, N. (2009). Collocated photo sharing, storytelling, and the performance of self. International Journal of Human-Computer Studies, 67 (12), 1073–1086.

Vanhaelewyn, B., Pauwels, G., Maes, M. & De Marez, L. (2014). Digimeter: Measuring digital trends in Flanders. IMinds, Digimeter Report Aug-Sept 2014 .

Vanhecke, N. & Deckmyn, D. (2015, May 15). Privacycommissie treedt op: Facebook op het matje. De Standaard .

Viswanath, B., Mislove, A., Cha, M., & Gummadi, K. P. (2009). On the evolution of user interaction in Facebook. In Proceedings of the 2nd ACM Workshop on Online Social Networks (pp. 37–42). New York, NY: ACM.

Cohen, D. (2014, December 1st). They’re Back: Hoax Status Updates Claiming Copyright to Intellectual Property on Facebook. Social Times. Consulted on May 6 2014 on the World Wide Web: http://www.adweek.com/socialtimes/hoax-status-updates-copyright-intellectual-property/439700

Wallace, E., Buil, I., de Chernatony, L. & Hogan, M. (2014). Who "likes" you ... and why? A typology of Facebook fans from "fan"-atics and self-expressives to utilitarians and authentics. Journal of Advertising Research, 54 (1), 92-109.

Walraeve, M. (2013). Jongeren 2.0: Zelfrepresentatie & vriendschappen in sociale netwerksites. Summary research 12-18 year-olds. Unpublished summary, Antwerp, Department of Communication Sciences.

Willaerts, C. (2013). Altijd naakt. Manage je identiteit online. Leuven: Lannoo Campus.

Wilson, K., Fornasier, S. & White, K.M. Psychological predictors of young adults’ use of social networking sites. Cyberpsychology, Behavior, and Social Networking, 13, 173-177.

Wilson, R.E., Gosling, S.D., Graham, L.T. (2012). A review of Facebook research in the social sciences. Perspectives on Psychological science, 7 (3), 203-220.

Wondracek, G., Holz, T., Kirda, E. & Kruegel, C. (2010). A practical attack to de-anonymize social network users. Security & Privacy, IEEE .

Wray, R. (2009, 27 augustus). Facebook forced to tighten up privacy rules. The Guardian .

Yang, H.-L. & Lin, C.-L. (2014). Why do people stick to Facebook web site? A value theory-based view. Information Technology & People, 27(1), 21-37.

Yang, Y.; Crook, C. & O’Malley, C. (2014). Can a social networking site support afterschool group learning of Mandarin? Learning media and technology, 39 (3), 267-282.

Yoon, C. & Rolland, E. (2015). Understanding continuance use in social networking services. Journal of Computer Information Systems, 55 (2), 1-8.

Young, A.L. & Quan-Haase, A. (2013). Privacy protection strategies on Facebook: The internet privacy paradox revisited. Information Communication & Society, 16 (4), 479-500.

Yu, A.Y., Tian, S.W., Vogel, D. & Kwok, R.C.W. (2010). Can learning be virtually boosted? An investigation of online social networking impacts. Computers and Education, 55, 1494-1503.

Zhao, S., Grasmuck, S. & Martin, J. Identity construction on Facebook: Digital empowerment in anchored relationships. Computers in Human Behavior, 24, 1816-1836.

Zlatolas, L.N., Welzer, T., Hericko, M. & Höbl, M. (2015). Privacy antecedents for SNS self-disclosure: The case of Facebook. Computers in Human Behavior, 45 , 158-167.

Table of Contents

Appendix 1 ...... 2
Appendix 2 ...... 3
Appendix 3 ...... 4
Appendix 4 ...... 7
Appendix A: Surveying Facebook resistance ...... 9
Appendix B: Descriptives reducing Facebook’s influence (Q5), resisting Facebook’s influence ...... 15
Appendix C: Reliability analysis: scale attitude towards Facebook (Q1, Q2), scale Facebook resistance (Q5, Q6) ...... 16
Appendix D: Frequencies scale Facebook resistance (Q5, Q6) ...... 17
Appendix E: Correlation between scale Facebook love and Facebook resistance ...... 18
Appendix F: Clustering respondents ...... 19
Appendix G: Cross tabulation clusters – Facebook account ...... 20
Appendix H: Cross tabulation clusters – gender ...... 21
Appendix I: Cross tabulation clusters – age ...... 22
Appendix J: Cross tabulation scale Facebook resistance – adjust privacy settings ...... 23
Appendix K: Cross tabulation scale Facebook resistance – adjust privacy settings in order to resist Facebook/reduce Facebook’s influence ...... 25
Appendix L: Cross tabulation scale Facebook resistance – adjust use ...... 27
Appendix M: Cross tabulation scales – Facebook account ...... 29
Appendix N: Correlation scales – frequency Facebook use ...... 32
Appendix O: Correlations Facebook love – frequency of actions ...... 33
Appendix P: Presence of social and institutional privacy and of GFBR, SFBR, IFBR and privacy FBR ...... 40
Appendix Q: Cross tabulation clusters – GFBR, SFBR & IFBR ...... 42
Appendix R: Cross tabulation presence of social privacy – SFBR, adjust privacy – SFBR ...... 45
Appendix S: Cross tabulation SFBR – IFBR ...... 47
Appendix T: Facebook account – broad GFBR ...... 48

Appendix 1

Facebook tries to draw as little attention as possible to these rules. Having consciously followed Facebook's policy updates for several years, one can clearly see how Facebook changed some elements. Before the 2015 update, the now more neutral term Data Policy was called the Data Use Policy, a name that drew more attention to the fact that Facebook uses data. Moreover, Facebook makes it quite hard to find the privacy policy. Tucked away at the bottom of each page, it is not accessible from the Timeline, since scrolling down only loads more statuses, stories, posts and adverts. It can thus only be found at the bottom of each non-continuously scrolling page, among other links such as ‘create advert’, ‘create page’, ‘help’,…

Facebook has also changed the way it leads users to the privacy settings. In the upper right corner, where a triangular arrow points down, there used to be two entries for adjusting the Facebook settings. One specifically mentioned the privacy settings 44 , the other mentioned settings in general 45 . The shortcut button mentioning the privacy settings has now been removed, leaving only the latter behind 46 and forcing the user to go to the general settings first and then click through to the privacy settings. This change was not announced to users. It makes the privacy settings harder to find, especially for users who are not very familiar with Facebook. Instead of the removed button, Facebook created a ‘Privacy shortcut’ right next to the blue arrow, accompanied by a blue dinosaur, in order to make this shortcut more attractive. This feature was announced in a pop-up to all users. The privacy shortcut, however, only offers the possibility to change three elements: who can see your posts, some app settings and the visibility of some profile info. Other important topics such as ‘who can look me up’ or ‘who can contact me’ are left out; in order to change those, the user has to go to the specific privacy settings 47 . Facebook most likely hopes people will feel safe after having adjusted only these settings.

44 This button led directly to: https://www.facebook.com/settings?tab=privacy. 45 This button led directly to: https://www.facebook.com/settings?tab=account. 46 This button leads directly to: https://www.facebook.com/settings?tab=account.

47 https://www.facebook.com/settings?tab=privacy

Appendix 2

The implications of sharing information with partners and third parties become clearer in light of the following definitions: ‘By "information" we mean facts and other information about you, including actions taken by users and non-users who interact with Facebook’ […] ‘By "Facebook" or "Facebook Services" we mean the features and services we make available, including through (a) our website at www.facebook.com and any other Facebook branded or co-branded websites (including sub-domains, international versions, widgets, and mobile versions); (b) our Platform; (c) social plugins such as the Like button, the Share button and other similar offerings; and (d) other media, brands, products, services, software (such as a toolbar), devices, or networks now existing or later developed’ (Statement of Rights and Responsibilities, last updated on January 30 2015).

When Facebook actually does use a user’s content or information, this is subject to a ‘non-exclusive, transferable, sub-licensable, royalty-free, worldwide license to use any IP content that you post on or in connection with Facebook (IP License)’. Remarkably, this right is transferable. Moreover, users are not compensated when the company uses their content or information, while the company also has the right to sub-license these data. Facebook thus acts as if it becomes the new owner of the content provided by the Facebook user.


Appendix 3

Facebook’s algorithms, which are related to its data mining capacity (see supra) and which are highly secretive, arouse discontent among Facebook users, since Facebook expects its users to be open and have no secrets towards the website (and thus towards the company) (van Dijck, 2013a). Algorithms such as EdgeRank determine, on the basis of metadata, from whom and what users see on their News Feed: the more users interact with specific persons, the more likely they are to see more of these people (Bucher, 2012). Bucher argues that the consequences for Facebook users are not solely good: in the end, users only see the actions of people they have a close relationship with on Facebook, which endangers their visibility to their bridging capital (Bucher, 2012). In light of the statements of Boyd (2008) and Hull et al. (2011), one should conclude that this contradicts the intentions of many Facebook users. Moreover, if van Dijck’s connectivity argument holds, these algorithms provide Facebook with a great deal of information on its users.

Third-party applications have also become subject to criticism (Hull et al., 2011; Egele et al., 2012). Many users are not happy that the music they listen to on Spotify appears on others’ News Feeds. This sharing is enabled by default unless users opt out. Moreover, the partnership gives Facebook the opportunity to know what kind of music Facebook users like (O’Carroll & Halliday, 2011, September 29). The introduction of the Timeline in 2011 also made many users suspicious of privacy issues. It marked the culmination of the shift towards standardization, showing every status update in chronological order. For some users this was a wake-up call: they realized they had posted more data on their profiles than they had thought (van Dijck, 2013). Although standardization can enhance connectedness between users, it also makes it easier to apply algorithms and to personalize advertisements more efficiently.

After some academic authors had already expressed their concerns (Jones & Soltren, 2005), several authors recognize the Beacon case as the first instance in which Facebook’s road towards commercialization (and thus connectivity) became truly visible to users (e.g. Perez, 2007, November 30; Debatin et al., 2009; van Dijck, 2013a). With the arrival of Beacon, the concept of ‘sharing’ changed (van Dijck, 2013a): for the first time, Facebook made it openly clear that it would enhance the sharing of content with commercial third parties. Facebook worked together with 44 commercial companies: if Facebook users bought something from these companies via the internet, an announcement appeared on the News Feed of their friends. Users perceived this as an attack on their privacy, especially because they were opted in by default and no opt-out setting was provided. Due to large protests (see infra), Facebook had to introduce an opt-out possibility and eventually it had to discontinue the feature (van Dijck, 2013a). Debatin et al. (2009) consider the Beacon affair to be an ‘accident’. Facebook, however, did not make the same mistake twice: it relocated the commercial aspects of its terms and conditions of use to the background (Turow, 2008; Debatin et al., 2009; Fuchs, 2011; van Dijck, 2013a), resulting in a long, complex and contradictory privacy policy (see supra). Comparing this to an iceberg, the part in the foreground encourages the user to constantly feed the information gathering (Debatin et al., 2009; van Dijck, 2013a). ‘The invisible part, on the other hand, is constantly fed by the data that trickle down from the interactions and self-descriptions of the users in the visible part […] As long as users feed the invisible part of the iceberg with extensive personal data that they update voluntarily and continually, their privacy is at risk’ (Debatin et al., 2009).

The face recognition techniques that Facebook uses have also been subject to criticism: they can identify specific persons on the basis of physiological and behavioral features (Andrade et al., 2013; Buckley & Hunter, 2011). These techniques make automated tagging possible, for example, threatening our bodily privacy through repeated validation and by transforming the biometric identity into a single and primary identity (Andrade et al., 2013). Again, Facebook allegedly did not inform its users about the use of these techniques, nor did it sufficiently explain how they work (Buckley & Hunter, 2011).

Facebook regards its users as having one single true identity (van Dijck, 2013b), which entails more than physical appearance alone. Facebook expects its users to use only their authentic identity, as stated on their birth certificate, driver’s license, passport 48 ,… Nicknames are only allowed when they meet certain criteria (Facebook’s name policy, last updated on March 10, 2015). Facebook also encourages the use of Facebook Connect, which is implemented in many other websites so that users can log on to them with their real (Facebook) identity. Facebook Connect hereby makes it possible to trace and predict online behavior to the benefit of advertisers (Willaerts, 2013; van Dijck, 2013a), thus seriously endangering institutional privacy. This is also what van Dijck means by the term ‘frictionless sharing’. Facebook, however, makes a fundamental judgmental mistake (van Dijck, 2013b), overlooking people’s changing identities on social networks as acting performances (Van House, 2009). Leingruber (2012) also criticizes the general acceptance of the Facebook identity as the real identity outside the social network site for the purpose of identity control, as Lown (2013, May 3) and Leingruber in an interview (Schenker, 2012, March 5) exemplify.

48 Consulted on July 6, 2015 on the World Wide Web: https://www.facebook.com/help/159096464162185

One strand of criticism of Facebook and its privacy policy focuses specifically on the legal framework implementing privacy rules. Being based in the United States, Facebook operates according to that country’s laws. Facebook’s European division, however, is based in Ireland and, when operating in the European Union, it thus has to comply with Irish rules. Europe is well known for imposing stricter privacy rules (European Commission, 2015; van Dijck, 2013a). Yet Ireland seems carefully chosen, as its privacy rules prove to be less strict than those of other European countries (Farrell, 2014, June 20; Matussek, 2013, February 15).


Appendix 4

In 2014 Max Schrems filed a lawsuit against Facebook for violating users’ privacy, for mass surveillance and for its part in the NSA’s PRISM scandal of 2013. He managed to gather the support of 25,000 people to join his 10 million euro Facebook Class Action 49 . Schrems originally instituted the legal proceedings against Facebook in Ireland, but dropped these and refiled in Vienna, claiming it would take too long in Ireland (Brennan, 2015, April 10). This is exactly the reason Facebook wants the court to declare the lawsuit inadmissible: ‘There is basically no word on the actual violations of the law. Facebook is in essence only targeting the jurisdiction of the court’ (Schrems, 2015, March 10). Facebook has successfully used this strategy in the past (Matussek, 2013, February 15). For now, however, a hearing was held in April 2015 to determine whether this matter falls under the jurisdiction of the Austrian court. Schrems argues that Facebook will do everything to delay the court case, hoping people will run out of time and money. Schrems, however, is not likely to give up, as his cause is supported by a litigation funding company (Schrems, cited in Nasralla & Gruber, 2015, April 9).

There are several legislative initiatives that try to address the privacy issues caused by Facebook’s policy. In 2012 the European Commission announced it would adjust the European directives for data protection, which dated from 1995, in response to the risks associated with today’s globalization and new technologies (European Commission, 2012, January 25). These changes also imply rules for the gathering of Big Data (European Commission, 2015, April 15; European Commission, 2015), which Facebook, too, would have to meet. The European Commission argues: ‘Organisations will be required to publish transparent and easily accessible data protection policies. Simple icons on a website could explain how, by whom and under whose responsibility personal data will be processed’ (European Commission, 2015). In 2014 a judgment of the European Court of Justice established the Right to be Forgotten in relation to online search engines (European Commission, 2014); however, this does not seem to be a sound protection (Commission for the Protection of Privacy, 2014b).

49 Consulted on May 1, 2015 on the World Wide Web: https://www.fbclaim.com/ui/page/faqs

Belgium is one of the first countries to argue that it wants to better defend its citizens against Facebook’s privacy violations. Following a privacy report on Facebook (Van Alsenoy, 2015) issued by the Commission for the Protection of Privacy, Bart Tommelein, State Secretary for Privacy, threatened that, if Facebook did not bend to the stricter Belgian rules, the Commission for the Protection of Privacy would take the case to court (Neyskens, 2015, May 15). Recently this step was indeed taken (Facebook voor Brusselse rechter, 2015, June 15; Rechtszaak tegen Facebook pas in september gepleit, 2015, June 16). Meanwhile, the Commission for the Protection of Privacy advises Facebook users to take matters into their own hands and install add-ons and user scripts in order to prevent Facebook from tracking their online behavior (Commission for the Protection of Privacy, 2014a).
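The kind of user-script countermeasure the Commission alludes to can be sketched, in outline, as follows. This is purely an illustrative sketch under assumptions: the domain list and the helper function `isFacebookResource` are invented for this example and do not describe the Commission’s advice or any particular add-on.

```javascript
// Illustrative sketch of the filtering logic a tracking-blocking user
// script might use. The domain list is an assumption for this example.
const FACEBOOK_DOMAINS = ["facebook.com", "facebook.net"];

// Decide whether a resource URL points at a Facebook-operated domain,
// matching both the bare domain and any of its subdomains.
function isFacebookResource(url) {
  const host = new URL(url).hostname;
  return FACEBOOK_DOMAINS.some(
    (domain) => host === domain || host.endsWith("." + domain)
  );
}

// In a browser user script, such a check could then be used to remove
// embedded social plugins (Like buttons, tracking pixels) before they
// report the page visit back to Facebook, e.g.:
// document.querySelectorAll("iframe, script, img").forEach((el) => {
//   if (el.src && isFacebookResource(el.src)) el.remove();
// });
```

Real add-ons recommended for this purpose work on the same principle, though typically at the network-request level rather than by removing page elements after the fact.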


Appendix A: Surveying Facebook resistance

Dear respondent

As a master student at Ghent University (Belgium), I am doing research about Facebook.

Anyone, with or without a Facebook account, can fill in this survey. Please read and answer the following questions carefully, indicating the answers that best represent your opinion. There are no right or wrong answers.

Your data will be processed anonymously and won’t be given to third parties.

Completing the survey takes about 10 minutes. By filling in the survey completely, you have a chance to win two cinema tickets.

Thank you for your participation! Petra Truant


1. What is your opinion about the Facebook website?

Very negative Neutral Very positive

2. What is your opinion about the Facebook company?

Very negative Neutral Very positive

3. Do you have a Facebook account?

Yes No

4. How often do you use Facebook?

Never Moderately Very much

5. To what extent are you trying to reduce the influence of Facebook?

Not at all Moderately As much as possible

6. To what extent are you trying to resist against the influence of Facebook?

Not at all Moderately As much as possible


7. Do you adjust your privacy settings?

Yes No

Specify what you do exactly:

8. How are you trying to reduce / resist against the influence of Facebook?

A. Adjust the privacy settings, specifically to reduce / resist against the influence of Facebook.

Yes No

B. Adjust your use of Facebook.

Yes No

Specify what you do exactly:


C. Use external aids/devices (such as add-ons, hacks or user-scripts).

Yes No

Specify what you do exactly:

D. Are you doing other things?

Yes No

Specify what you do exactly:


E. Why are you trying to reduce / resist against the influence of Facebook?

9. To what extent are you doing the following things on Facebook?

Never Moderately Very much

Chatting
Indicating that you go to events
Post / share hashtags
Post / share pictures
Post / share films
Reply to posts from friends / acquaintances (incl. “like”)
Post / share status updates
Post / share links to other websites
Playing games


10. How old are you?

…………………….

11. What is your gender?

Female Male

12. What is your nationality?

………………………………………….…………………….

13. What is your education?

…………………….…………………….…………………….…………………….…………………….…………………….……

Thank you for participating. We are still looking for people who are willing to participate in a short (completely anonymous) interview about Facebook usage.

If you want to participate, please give us your e-mail address in the next field. We will contact you soon.

(This information will only be used for this study and will not be given to other parties)

…………………………………………………………………………………………..

Do you want to have a chance to win two cinema tickets? Please leave your e-mail address. (This information will only be used for this study and will not be given to other parties)

…………………………………………………………………………………………..


Appendix B: Descriptives of reducing Facebook’s influence (Q5) and resisting Facebook’s influence (Q6)

Q5

Q6


Appendix C: Reliability analysis: scale attitude towards Facebook (Q1, Q2) and scale Facebook resistance (Q5, Q6)

Q5, Q6

Q1, Q2


Appendix D: Frequencies scale Facebook resistance (Q5, Q6)


Appendix E: Correlation between Scale Facebook love and Facebook resistance


Appendix F: Clustering respondents


Appendix G: Cross tabulation clusters – Facebook account


Appendix H: Cross tabulation clusters – gender


Appendix I: Cross tabulation clusters – age


Appendix J: Cross tabulation scale Facebook resistance – adjust privacy settings


Appendix K: Cross tabulation scale Facebook resistance – adjust privacy settings in order to resist Facebook/reduce Facebook’s influence


Appendix L: Cross tabulation scale Facebook resistance – Adjust use


Appendix M: Cross tabulation scales – Facebook account

Facebook love – Facebook account


Facebook resistance – Facebook account


Appendix N: Correlation scales – frequency of Facebook use

Facebook love

Facebook resistance


Appendix O: Correlations Facebook love – frequency of actions

Sharing/posting updates


Sharing/posting pictures

Sharing/posting films


Post/share links to other websites


Post/share hashtags


Reply to posts


Indicate going to events

Playing games


Chatting

People without a Facebook account using Facebook


Appendix P: Presence of social and institutional privacy and of GFBR, SFBR, IFBR and privacy FBR


Appendix Q: Cross tabulation clusters – GFBR, SFBR & IFBR

GFBR


SFBR


IFBR


Appendix R: Cross tabulation presence of social privacy – SFBR, adjust privacy – SFBR

Social privacy - SFBR


Adjust privacy - SFBR


Appendix S: Cross tabulation SFBR - IFBR


Appendix T: Facebook account – broad GFBR
