Tschersich et al. Impact of Default Settings on Self-Disclosure in SNS

Understanding the Impact of Default Privacy Settings on Self-Disclosure in Social Network Services: Building a Conceptual Model and Measurement Instrument

Completed Research Paper

Markus Tschersich, Goethe University Frankfurt, [email protected]
Reinhardt A. Botha, Nelson Mandela Metropolitan University, [email protected]

ABSTRACT

Restrictive default privacy settings might threaten the business model and functionality of social network services (SNS). As a first step in understanding the impact of default privacy settings on self-disclosure in SNS, this paper proposes a conceptual model that reflects both the decision-making around self-disclosure in SNS and its dimensions. The conceptual model depicts aspects of benefits, costs and trust as impacting five dimensions of self-disclosure. Thereafter, an instrument measuring the impact of default privacy settings on self-disclosure in SNS was developed. An item-sorting task, comprising three rounds, was performed to ensure the validity of the constructs used in the measurement instrument. The instrument comprises a 53-item questionnaire based on 16 constructs.

Keywords

Default Privacy Settings, Social Network Services, Privacy by Default

INTRODUCTION

The privacy of users in information systems is a major concern, actively discussed in society and in the scientific literature (Smith et al., 2011). Within the context of online communities and social network services (SNS), several research projects have investigated the needs and behavior of users (Gross and Acquisti, 2005; Taraszow et al., 2010), while others have developed tools to support users in managing their privacy settings (Tschersich et al., 2011). Motivated by several incidents of privacy abuse, the European Commission (EC) decided to intervene. A study across the European Union found that the majority of citizens feel that they do not have complete control over their personal data (European Commission, 2011). To strengthen the safety of citizens, the EC is working on a reform of the European Union Data Directive 95/46/EC (European Parliament, 1995), because rapid technological change has brought new challenges for data protection (European Commission, 2012). The reform aims to make Privacy by Default (PbDef) mandatory for services that deal with personal information.
Proceedings of the Nineteenth Americas Conference on Information Systems, Chicago, Illinois, August 15-17, 2013.

PbDef delivers the maximum degree of privacy as the default, requiring every user to decide explicitly what s/he wants to share with others (Cavoukian, 2008). These privacy-friendly default settings could enhance the protection of users' personal information (European Commission, 2012). Although it improves the privacy of individuals, PbDef, by regulation, is a potential threat for social networks: without explicit permission, no personal information can be processed. Literature shows that users tend to keep default settings (Goldstein et al., 2008; Shah and Kesan, 2006). This also holds for Facebook (Gross and Acquisti, 2005). Thus, restrictive default privacy settings could cause less self-disclosure, but some self-disclosure is critical for active SNS participation and for the sustainability of SNS (Ledbetter et al., 2011). Therefore, providers of social network services maintain that their business model and the functionality of their social network will suffer from PbDef (Europe versus Facebook, 2012). A paradox therefore exists: users keep default settings, but an effective SNS requires personal information (boyd, 2007). Current literature is mute on how users self-disclose when restrictive rather than privacy-friendly default settings prevail.

This paper describes a conceptual model to help investigate the impact of default privacy settings on self-disclosure in SNS. One needs to understand: (i) the decision to self-disclose, and (ii) the dimensions of the self-disclosure. Thereafter, an instrument to measure the impact of default privacy settings on the Facebook SNS can be developed and validated. This paper is structured as follows. After the theoretical background, the conceptual model is presented. The description of the measurement instrument and the validation of the constructs follow in section four. Finally, the paper concludes with the road ahead.

THEORETICAL BACKGROUND

The reasoning behind selecting the term 'social network services' starts with observing that these services enable users "to articulate and make visible their social networks" (boyd, 2007). Services can have many delivery mechanisms, web sites and mobile phone applications being two of these. However, privacy is a fundamental issue, regardless of the delivery mechanism. Therefore the paper steers away from the terms 'site' and 'online'. Beer (2008) questions the analytic value of choosing the wider term 'network' over 'networking'. However, taking a lifecycle view (Iriberri and Leroy, 2009), it is contended that privacy is not only a concern when acquiring new contacts, as 'networking' implies (boyd and Ellison, 2008). The term 'social network' is thus used, not 'social networking'.
While not all people are interested in participating in social networks (Andrews, 2002), those who wish to feel a sense of belonging, to build esteem through self-presentation, or to heed peer pressure often use a social network service, and must necessarily share some personal information (Krasnova et al., 2008). The decision to disclose personal information in a social network can be based on the privacy calculus (Smith et al., 2011): the anticipated benefits and perceived costs/risks are weighed to reach a decision. Krasnova et al. (2008) present a conceptual framework based on the privacy calculus to model the decision to self-disclose when using an SNS. The model combines perceived benefits and perceived privacy costs with trust factors that may influence the perception of users regarding benefit and risk.

Concerns about privacy and safety, the lack of anonymity, and shyness about public posting may inhibit participation in a social network service (Iriberri and Leroy, 2009). Privacy concerns access to individually identifiable personal information (Smith et al., 2011), that is, any information about the self. Voluntarily sharing 'information about the self' that is unlikely to be known or discovered from other sources is called self-disclosure (Pearce and Sharp, 1973). While self-disclosure is often discussed, its nature is generally only considered to be 'information about the self' (e.g. Cozby, 1973; Ledbetter et al., 2011; Pearce and Sharp, 1973). Wheeless and Grotz (1976) also start from that definition, but then observe that the process of self-disclosure is the process of communication through self-disclosive messages of various degrees. Wheeless and Grotz (1976) confirmed five dimensions of self-disclosure: consciously intended disclosure (intent); amount of disclosure (frequency and duration); the positive-negative nature of the disclosure (valence); the honesty-accuracy of the disclosure; and the general depth and intimacy of the disclosure. Posey et al. (2010) empirically confirmed the five dimensions in the context of self-disclosure in online communities.
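The privacy calculus described above can be thought of as a simple net-benefit comparison. The following toy sketch illustrates the weighing of benefits against trust-discounted costs; the construct names mirror those used later in the paper, but the scores, the function and the weighting scheme are hypothetical, for illustration only:

```python
def discloses(benefits, costs, trust):
    """Toy privacy-calculus decision (illustrative only): a user
    self-discloses when anticipated benefits outweigh perceived
    privacy costs, with trust discounting the perceived costs.
    The weighting scheme is hypothetical, not the paper's model."""
    perceived_benefit = sum(benefits.values())
    perceived_cost = sum(costs.values()) * (1.0 - trust)  # trust in [0, 1]
    return perceived_benefit > perceived_cost

# Hypothetical 1-7 scores for the constructs of the conceptual model
benefits = {"enjoyment": 6, "relationship_maintenance": 5, "self_presentation": 4}
costs = {"perceived_likelihood": 5, "perceived_damage": 6}
print(discloses(benefits, costs, trust=0.7))          # benefits 15 vs. discounted costs 3.3 -> True
print(discloses({"enjoyment": 2}, costs, trust=0.0))  # benefits 2 vs. full costs 11 -> False
```

The point of the sketch is only that a change in any one term (for example, higher trust induced by restrictive defaults) can flip the disclosure decision without any change in the underlying benefits.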


Joinson (2001) found that people using computer-mediated communication disclose more information about themselves than people communicating face-to-face. Explanations are offered in terms of public and private self-awareness and visual anonymity. In addition, Trudeau et al. (2009) provide some counter-intuitive empirical evidence that users who apply introspection to privacy policies are more likely to share their personal data openly. This suggests that a default setting that encourages introspection may not be desirable; the choice of default setting is thus quite important.

Default settings have been investigated in numerous domains and fields of application. They have been shown to influence the security of WiFi access points (Shah and Sandvig, 2008); the purchase of seat reservations on railways (Goldstein et al., 2008); the percentage of organ donors (Johnson, 2003); and response rates in web surveys (Jin, 2011). The common theme is that users tend to accept default settings. Several possible reasons for not changing the default settings exist: cognitive and physical laziness; perceiving the default as correct; perceiving endorsement from the provider; using the default as a justification for choice; lacking transparency of implications; or lacking skill (Bellman et al., 2001; Dhar and Nowlis, 1999; Iyengar and Lepper, 2000; Samuelson and Zeckhauser, 1988; Shah and Kesan, 2006). Given the plethora of reasons for not changing default settings, understanding the effect of default settings, and specifically PbDef, on self-disclosure is important. The effect of default privacy settings on self-disclosure in SNS has, to the best of our knowledge, not been studied empirically. This paper therefore sets out to develop a conceptual model and an instrument that can be used to investigate these questions. Our model is grounded in Krasnova and Veltri (2010) and Posey et al. (2010).
Krasnova and Veltri (2010) address the decision-making regarding self-disclosure using the privacy calculus, but do not unpack self-disclosure any further. Aiming to understand self-disclosure better, the five dimensions of self-disclosure identified by Posey et al. (2010) are adopted: amount, depth, honesty, intent and valence. These two models are combined in the conceptual model presented next.

CONCEPTUAL MODEL

The process of self-disclosure includes both the decision to self-disclose and the properties of the self-disclosure. According to social exchange theory and the privacy calculus, people weigh up the benefits and costs of their self-disclosure (Cozby, 1973; Dinev and Hart, 2006; Homans, 1961). A change in the default privacy settings can change the ratio of costs and benefits (as described in the following) and therefore also the self-disclosure. This change in the trade-off also influences the dimensions of the self-disclosure in SNS (Posey et al., 2010). Consequently, a conceptual model that includes both the decision-making to self-disclose and the dimensions of self-disclosure is required to understand the impact of default privacy settings on the self-disclosure of users in SNS (cf. Figure 1). In the following, the two parts of the conceptual model are described in more detail.

Decision-Making regarding Self-Disclosure

As described previously, the decision to self-disclose is a weighing up of the costs and benefits resulting from the revelation of personal information. In the context of SNS, costs can be seen as the risk that something unintended happens with the personal information of users (Dinev and Hart, 2006). Besides the privacy calculus, trust also plays an important role in the decision-making for self-disclosure (boyd, 2007; Dwyer et al., 2007; Posey et al., 2010). Krasnova and Veltri (2010) thus built a conceptual model that merges benefit, cost and trust factors to explain decision-making regarding self-disclosure.


This conceptual model can also be used to measure the impact of default privacy settings on the decision-making to self-disclose, as described in the following paragraphs.

Figure 1. Conceptual model

To measure benefits, Krasnova and Veltri (2010) highlight enjoyment, relationship maintenance and self-presentation as the main drivers for the usage of social networks and for the sharing of personal information. Benefits might be influenced by the strictness of the default privacy settings. Default settings that are very privacy-friendly will probably lead to less available personal information from the peers of a user in an SNS. Consequently, enjoyment can be reduced, and the benefit of relationship maintenance becomes harder to realize. With restrictive privacy settings, and a missing awareness of those settings on the side of the user, the benefit of self-presentation can also be limited (Shah and Kesan, 2006).

While benefits motivate users to self-disclose, privacy costs, represented by privacy concerns, perceived likelihood and perceived damage, can deter users from revealing personal information in a social network (Krasnova and Veltri, 2010). In the conceptual model of Krasnova and Veltri (2010), privacy concerns describe how users assess the consequences of revealing personal information. They are influenced by several aspects, inter alia, privacy experiences and privacy awareness (Smith et al., 2011; Xu et al., 2008). More restrictive default privacy settings, as in the case of PbDef, might decrease privacy concerns because of a lower risk of privacy vulnerability. Similarly, the perceived likelihood and perceived damage of privacy violations can depend on the default privacy settings. With PbDef, users need to decide explicitly what to share with whom. This transparency can reduce the perceived likelihood and perceived damage.

Trust in the provider, in other users and in regulation positively influences self-disclosure in social networks (Dwyer et al., 2007; Krasnova and Veltri, 2010; McKnight et al., 2002). Having more restrictive default privacy settings possibly increases trust towards these different parties.
For example, providers might seem more trustworthy to their users when they offer functionality to protect the privacy of users (Bowie and Jamal, 2006). Furthermore, the trust of users can be enhanced by their belief in the effectiveness of regulation (McKnight et al., 2002). Consequently, trust in regulation could be enhanced if regulation compels providers to implement PbDef.


Finally, the constructs of perceived control over personal information and awareness about information handling from the conceptual model of Krasnova and Veltri (2010) are relevant for identifying the impact of default privacy settings. For instance, perceived control can be positively influenced by PbDef owing to the required active decision to self-disclose. Awareness can be influenced positively as well, because PbDef should ensure that personal information is not processed by anyone without the explicit permission of the user.

Dimensions of Self-Disclosure

The five dimensions of self-disclosure (Posey et al., 2010) have been adopted. These dimensions are partly interdependent and show some inverse relationships. For instance, Cozby (1973) highlights the inverse relationship between amount and depth (intimacy). Thus, an influence on one of the dimensions through a change in the default privacy settings can change the whole way in which users disclose personal information in an SNS. To understand the impact of default privacy settings on self-disclosure, it is necessary to identify the impact on the various dimensions.

It can be expected that default privacy settings influence the dimensions as follows. Users share a greater amount of personal information, and in more depth, when they are less concerned about privacy risks (Gross and Acquisti, 2005). Consequently, more restrictive default privacy settings should also lead to a change in the amount and depth of self-disclosure in SNS. When users are concerned about their privacy in a service, they tend to publish false information (Son and Kim, 2008). Depending on the privacy-friendliness of the defaults, the honesty of self-disclosure can thus be increased by reducing privacy concerns. Since, with restrictive default privacy settings, users need to decide actively what to disclose, the intent to self-disclose is most probably also influenced by the default privacy settings.
Finally, more restrictive default privacy settings might lead users to disclose content that is less positive, because they feel safer.

Amount
- I frequently share personal information on my timeline or on my profile on Facebook.
- I usually write about myself on Facebook for fairly long periods of time.
- I often publish status messages on Facebook where I write about myself.

Depth
- I fully disclose personal information on Facebook to a high degree of intimacy.
- Personal information I disclose on Facebook is confidential and intimate.
- Status messages on Facebook where I reveal myself are very intimate.

Honesty
- I always feel completely honest when I reveal my own feelings and experiences on Facebook.
- My self-disclosures on Facebook are completely accurate reflections of who I really am.
- I am not always honest in status messages and profile entries on Facebook.
- My statements about my own feelings, emotions and experiences on Facebook are always accurate self-perceptions.
- I am always honest in the information I reveal on Facebook.

Intent
- When it is my intention, my self-disclosures on Facebook reveal the information about me that I want to reveal.
- When I reveal personal information on Facebook, I intentionally publish personal information in status messages and on my profile.
- When I reveal personal information about myself on Facebook, I consciously intend to do so.

Valence
- Usually, information I disclose about myself on Facebook is positive.
- I normally express my "good" feelings on Facebook about myself.
- On the whole, my disclosures about myself on Facebook are more positive than negative.

All items are based on Wheeless and Grotz (1976) and Wheeless (1978).

Table 1. Items of dimensions of self-disclosure by construct


Privacy costs

Privacy Concerns
How concerned are you that the information submitted to Facebook ...
- ... can be used in a way you did not foresee?
- ... can become available to someone without your knowledge?
- ... can be continuously spied on (by someone unintended)?

Perceived Likelihood
Please assess the likelihood of the following events:
- ... Information you provide on Facebook will be used in a way you did not foresee.
- ... Information you provide on Facebook will be accessed by someone you don't want to access it.
- ... Information you provide on Facebook will be misinterpreted by someone.

Perceived Damage
Please assess the amount of resulting damage to you if ...
- ... the personal information you provide to Facebook was used in a way you did not foresee.
- ... the personal information you provide to Facebook was accessed by someone you don't want to access it.
- ... the personal information you provide to Facebook was misinterpreted by someone.

Benefits

Enjoyment
- I have fun on Facebook.
- I spend enjoyable and relaxing time on Facebook.

Self-Presentation
- Facebook allows me to make a better impression on others.
- Facebook allows me to present myself in a favorable way to others.

Relationship Maintenance
- Facebook is useful in supporting relationships with my friends.
- Facebook is convenient to stay in touch with my friends.
- Facebook is useful for developing relationships with people (business or private).

Trust

Other Users
Generally, I trust that Facebook users ...
- ... will not misuse my sincerity on Facebook.
- ... will not embarrass me with some information they learned about me through Facebook.
- ... will not use the information they found about me on Facebook against me.
- ... will not use the information about me in a wrong way.
- ... are trustworthy.
- ... are open to and delicate with each other.

Platform
- In general, Facebook is open and receptive to the needs of its members.
- In general, Facebook is honest in its dealings with me.
- In general, Facebook keeps its commitments to its members.
- In general, Facebook is trustworthy.
- In general, Facebook tells the truth related to the collection and use of personal information.
- In general, Facebook is competent in protecting the information I provide.

Legislation
- I feel confident that existing laws protect me against abuse of my information on Facebook.
- Existing laws adequately protect my personal information on Facebook.
- The existing legal framework, the policies and regulations are good enough to make me feel comfortable using Facebook.

Perceived Control
How much control is given to you by Facebook (e.g. through functionality, privacy policies) over ...
- ... the information you provide on Facebook (e.g. in your profile, on the Wall, etc.).
- ... how and in which cases the information you provide can be used.
- ... who can view your information on Facebook.
- ... who can collect and use the information you share on the platform.
- ... the actions of other users (e.g. tagging you in pictures, writing on your Wall).

Awareness
- Generally, I am aware of how the personal information I provide can be used by Facebook.
- I am aware of what information Facebook is collecting about me.

All items are based on Krasnova and Veltri (2010).

Table 2. Items of decision-making for self-disclosure by construct

INSTRUMENT DEVELOPMENT

Based on the conceptual model described in the previous section, a measurement instrument was built to identify the impact of PbDef on self-disclosure in SNS. The procedure proposed by O'Leary-Kelly and Vokurka (1998) was followed to assess the validity of the instrument. Construct validation is a necessary and important step to ensure that "a construct sufficiently measures the intended concept" (O'Leary-Kelly and Vokurka, 1998) and to counter corrupting elements embedded in measures, for example measurement errors. Construct validity requires the empirical assessment of the suitability of the measure by ensuring the unidimensionality, reliability and validity of the constructs. These components were ensured by following the three stages of Moore and Benbasat (1991). The first stage requires the identification of existing items, or the creation of new ones, consistent with the definitions of the related constructs. For this instrument, existing items were taken for the different constructs and adapted to the specified scope (cf. Tables 1 and 2). The second stage comprises the assessment of construct validity as well as the identification and refinement of ambiguous items. In the final stage, instrument testing and factor analysis are performed. This is not covered by this paper, but will be carried out in future work.
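The construct validity assessment in the second stage relies on two indices from Anderson and Gerbing (1991): the proportion of substantive agreement (Psa) and the substantive validity coefficient (Csv). A minimal Python sketch of how they are computed from item-sorting data follows; the judge assignments in the example are hypothetical:

```python
from collections import Counter

def substantive_validity(assignments, intended):
    """Compute Anderson and Gerbing's (1991) substantive validity
    indices for one item from an item-sorting task.

    assignments: constructs chosen by the N judges for this item
    intended: the construct the item is meant to measure

    Returns (psa, csv):
      psa = n_c / N, the proportion of judges assigning the item to
            the intended construct (range 0.0 to 1.0)
      csv = (n_c - n_o) / N, where n_o is the highest number of
            assignments to any single other construct (range -1.0 to 1.0)
    """
    n = len(assignments)
    counts = Counter(assignments)
    n_c = counts.get(intended, 0)
    n_o = max((c for k, c in counts.items() if k != intended), default=0)
    return n_c / n, (n_c - n_o) / n

# Hypothetical round: 12 judges sort one 'Amount' item; 9 pick the
# intended construct, 2 pick 'Depth', 1 picks 'Intent'.
judges = ["Amount"] * 9 + ["Depth"] * 2 + ["Intent"]
psa, csv = substantive_validity(judges, "Amount")
print(round(psa, 3), round(csv, 3))  # 0.75 0.583 -> both above the 0.5 threshold
```

In this example the item would be retained; an item with, say, only 5 of 12 intended assignments would fall below the 0.5 threshold and be revised for the next round.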

Table 3. Substantive validity pretest aggregated per construct

Construct validity is ensured by a pretest assessment of the substantive validities of the measures, in order to predict the performance of the measures (Anderson and Gerbing, 1991). Substantive validity is a prerequisite for the validity of a construct. The substantive validity assessment is performed through an item-sorting task: representatives of the population of interest are asked to assign each item in a list to the construct that, in their judgment, is the intended one. The participants of the pretest received a list of all items in randomized order, together with the constructs and their definitions.

Two indices are used to analyze the data across all replies and to assess substantive validity. The proportion of substantive agreement, Psa, indicates the extent to which an item reflects its intended construct. The substantive validity coefficient, Csv, reflects the extent to which the representatives assign an item to the intended construct more than to any other construct. The value of Psa ranges from 0.0 to 1.0, and Csv ranges from -1.0 to 1.0. For both indices the recommended threshold is 0.5, and the larger the value, the greater the substantive validity.

Three rounds, with 12 to 15 participants each, were carried out. An item-sorting task requires expert participants; in this case, expertise in making personal decisions is present in all humans, and expertise in the Facebook SNS can be judged by active use of Facebook. In each round the two indices were calculated. If the index of an item fell below the threshold, the item was revised for the next round. Modifications were necessary after the first two rounds. In general, the participants had difficulty correctly assigning the items of self-disclosure to their dimensions. In the case of amount, depth, intent and valence, one item each was removed to prevent wrong measurement. For the construct of platform, two items were removed, because it was not possible to distinguish them from other constructs without losing the connection to the related construct. As displayed in Table 3, the number of items that did not reach the threshold of 0.5 decreased from round to round. In the final, third round, the indices of all items passed the threshold, so that no further modification was required. The high values of both indices, Psa and Csv, in the third round (cf. the aggregated figures in Table 3) indicate a high measurement performance of the items. The final result was a 53-item instrument that operationalizes the conceptual model for the experiment. Tables 1 and 2 display the instrument, including all items that passed the tests.

CONCLUSION

In the research conducted, it was considered how default privacy settings impact self-disclosure in SNS. In a first step, a conceptual model was built that combines the understanding of the decision-making regarding self-disclosure with the dimensions of self-disclosure. This conceptual model helps to understand the impact of default privacy settings. To be able to measure the concrete impact of very strict as compared to weak default privacy settings, a measurement instrument was built, based on the conceptual model, that can measure this impact on the SNS Facebook. The measurement instrument passed a multistage process in which the constructs were successfully tested for substantive validity. Both the conceptual model and the measurement instrument are limited to SNS that are used for private purposes.
SNS that are used professionally will most probably have other requirements regarding the benefits of using the platform, and might also have other costs. The measurement instrument directly addresses users of Facebook. Whether the instrument can easily be applied to an SNS other than Facebook remains an open question. In future research, a field experiment will be conducted based on the described measurement instrument, to obtain valid data about the impact of default privacy settings on SNS. Using a two-group pretest/posttest experimental design, it will be possible to understand how the decision-making regarding self-disclosure and the dimensions of self-disclosure are influenced by stricter default privacy settings. With these results, it will be possible to identify whether or not PbDef is a threat to the business model and the functionality of SNS. It would also become known whether the EC can achieve its goal of protecting citizens while allowing business innovation to continue (European Commission, 2012).

REFERENCES

1. Anderson, J. C., and Gerbing, D. W. (1991) Predicting the performance of measures in a confirmatory factor analysis with a pretest assessment of their substantive validities, Journal of Applied Psychology, 76, 5, 732.
2. Andrews, D. C. (2002) Audience-specific online community design, Communications of the ACM, 45, 4, 64–68.
3. Beer, D. D. (2008) Social network(ing) sites…revisiting the story so far: A response to danah boyd & Nicole Ellison, Journal of Computer-Mediated Communication, 13, 2, 516–529.
4. Bellman, S., Johnson, E. J., and Lohse, G. L. (2001) To opt-in or opt-out?: It depends on the question, Communications of the ACM, 44, 2, 25–27.
5. Bowie, N. E., and Jamal, K. (2006) Privacy rights on the internet: Self-regulation or government regulation?, Business Ethics Quarterly, 323–342.


6. boyd, d. m. (2007) Why Youth (Heart) Social Network Sites: The Role of Networked Publics in Teenage Social Life, in Buckingham, D. (Ed.) Youth, Identity, and Digital Media, Cambridge, 119–142.
7. boyd, d. m., and Ellison, N. B. (2008) Social network sites: Definition, history, and scholarship, Journal of Computer-Mediated Communication, 13, 1, 210–230.
8. Cavoukian, A. (2008) www.privacybydesign.ca.
9. Cozby, P. C. (1973) Self-disclosure: a literature review, Psychological Bulletin, 79, 2, 73.
10. Dhar, R., and Nowlis, S. M. (1999) The effect of time pressure on consumer choice deferral, Journal of Consumer Research, 25, 4, 369–384.
11. Dinev, T., and Hart, P. (2006) An Extended Privacy Calculus Model for E-Commerce Transactions, Information Systems Research, 17, 1, 61–80.
12. Dwyer, C., Hiltz, S. R., and Passerini, K. (2007) Trust and privacy concern within social networking sites: A comparison of Facebook and MySpace, Proceedings of AMCIS, 1–12.
13. Europe versus Facebook (2012) Facebook's views on the proposed data protection regulation, www.europe-v-facebook.org.
14. European Commission (2011) Attitudes on Data Protection and Electronic Identity in the European Union (No. 359), Special Eurobarometer.
15. European Commission (2012) Regulation of the European Parliament on the protection of individuals with regard to the processing of personal data and on the free movement of such data, ec.europa.eu.
16. European Parliament (1995) Directive 95/46/EC of the European Parliament and of the Council of 24 October 1995 on the protection of individuals with regard to the processing of personal data and on the free movement of such data, Official Journal L 281.
17. Goldstein, D. G., Johnson, E. J., Herrmann, A., and Heitmann, M. (2008) Nudge your customers toward better choices, Harvard Business Review, 86, 12, 99–105.
18. Gross, R., and Acquisti, A. (2005) Information revelation and privacy in online social networks (the Facebook case), ACM WPES'05.
19. Homans, G. C. (1961) Social Behavior: Its Elementary Forms, Harcourt, Brace & World, New York.
20. Iriberri, A., and Leroy, G. (2009) A life-cycle perspective on online community success, ACM Computing Surveys, 41, 2, 1–29.
21. Iyengar, S. S., and Lepper, M. R. (2000) When choice is demotivating: Can one desire too much of a good thing?, Journal of Personality and Social Psychology, 79, 6, 995–1006.
22. Jin, L. (2011) Improving response rates in web surveys with default setting: The effects of default on web survey participation and permission, International Journal of Market Research, 53, 1, 75.
23. Johnson, E. J. (2003) Medicine: Do defaults save lives?, Science, 302, 5649, 1338–1339.
24. Joinson, A. N. (2001) Self-disclosure in computer-mediated communication: The role of self-awareness and visual anonymity, European Journal of Social Psychology, 31, 2, 177–192.
25. Krasnova, H., and Veltri, N. F. (2010) Privacy calculus on social networking sites: Explorative evidence from Germany and USA, HICSS'10, 1–10.


26. Krasnova, H., Hildebrand, T., Guenther, O., Kovrigin, A., and Nowobilska, A. (2008) Why Participate in an Online Social Network? An Empirical Analysis, ECIS 2008 Proceedings, Paper 33.
27. Ledbetter, A. M., Mazer, J. P., DeGroot, J. M., Meyer, K. R., Mao, Y., and Swafford, B. (2011) Attitudes Toward Online Social Connection and Self-Disclosure as Predictors of Facebook Communication and Relational Closeness, Communication Research, 38, 1, 27–53.
28. McKnight, H. D., Choudhury, V., and Kacmar, C. (2002) The impact of initial consumer trust on intentions to transact with a web site: A trust building model, The Journal of Strategic Information Systems, 11, 3, 297–323.
29. Moore, G. C., and Benbasat, I. (1991) Development of an instrument to measure the perceptions of adopting an information technology innovation, Information Systems Research, 2, 3, 192–222.
30. O'Leary-Kelly, S. W., and Vokurka, R. J. (1998) The empirical assessment of construct validity, Journal of Operations Management, 16, 4, 387–405.
31. Pearce, W. B., and Sharp, S. M. (1973) Self-disclosing communication, Journal of Communication, 23, 4, 409–425.
32. Posey, C., Lowry, P. B., Roberts, T. L., and Ellis, T. S. (2010) Proposing the online community self-disclosure model: The case of working professionals in France and the U.K. who use online communities, European Journal of Information Systems, 19, 2, 181–195.
33. Samuelson, W., and Zeckhauser, R. (1988) Status quo bias in decision making, Journal of Risk and Uncertainty, 1, 1, 7–59.
34. Shah, R. C., and Kesan, J. P. (2006) Policy through software defaults, Proceedings of the 2006 International Conference on Digital Government Research, 265–272.
35. Shah, R. C., and Sandvig, C. (2008) Software Defaults as de facto Regulation, Information, Communication & Society, 11, 1, 25–46.
36. Smith, H. J., Dinev, T., and Xu, H. (2011) Information privacy research: An interdisciplinary review, MIS Quarterly, 35, 4, 989–1015.
37. Son, J.-Y., and Kim, S. S. (2008) Internet users' information privacy-protective responses: A taxonomy and a nomological model, MIS Quarterly.
38. Taraszow, T., Aristodemou, E., Shitta, G., Laouris, Y., and Arsoy, A. (2010) Disclosure of personal and contact information by young people in social networking sites: An analysis using Facebook™ profiles as an example, International Journal of Media and Cultural Politics, 6, 1, 81–101.
39. Trudeau, S., Sinclair, S., and Smith, S. W. (2009) The Effects of Introspection on Creating Privacy Policy, WPES, Chicago, 1–10.
40. Tschersich, M., Kahl, C., Heim, S., Crane, S., Böttcher, K., Krontiris, I., and Rannenberg, K. (2011) Towards privacy-enhanced mobile communities—Architecture, concepts and user trials, Journal of Systems and Software, 84, 11, 1947–1960.
41. Wheeless, L. R. (1978) A follow-up study of the relationships among trust, disclosure, and interpersonal solidarity, Human Communication Research, 4, 2, 143–157.
42. Wheeless, L. R., and Grotz, J. (1976) Conceptualization and measurement of reported self-disclosure, Human Communication Research, 2, 4, 338–346.
43. Xu, H., Dinev, T., Smith, H. J., and Hart, P. (2008) Examining the formation of individual's privacy concerns: Toward an integrative view, AMCIS 2008 Proceedings, Paper 179.
