
THREE ESSAYS ON INFORMATION PRIVACY OF MOBILE USERS

IN THE CONTEXT OF MOBILE APPS

Mehrdad Koohikamali

Dissertation Prepared for the Degree of

DOCTOR OF PHILOSOPHY

UNIVERSITY OF NORTH TEXAS

August 2016

APPROVED:

Dan J. Kim, Major Professor
Chang Koh, Committee Member
Aaron French, Committee Member
Victor Prybutok, Committee Member and Vice Provost of the Toulouse Graduate School
Mary C. Jones, Chair of the Department of Information Technology and Decision Sciences
Marilyn Wiley, Dean of the College of Business

Koohikamali, Mehrdad. Three Essays on Information Privacy of Mobile Users in the

Context of Mobile Apps. Doctor of Philosophy (Business Computer Information Systems),

August 2016, 151 pp., 12 tables, 9 figures, references, 339 titles.

The increasing demand for mobile apps is outstripping the current capacity of mobile app developers. In addition, the growing trend in smartphone ownership and the time people spend on mobile apps has created both opportunities and risks for users and developers. On average, users spend more than two hours every day using mobile apps on their smartphones. Worldwide mobile app revenue is estimated to grow by 33% ($19 billion). Three quarters of the time spent on mobile apps is devoted solely to game and social networking apps. To provide more customized services and functions to users, mobile apps need access to personal information. However, 80% of mobile apps put people’s information privacy at risk. There is a major gap in the literature about the privacy concerns of mobile device users in the context of mobile apps. This dissertation addresses one fundamental research question: how does individuals’ privacy change in the context of mobile apps? More precisely, this dissertation focuses on the role of information privacy in individuals’ and mobile app developers’ protective behaviors. We investigate how information sensitivity level influences mobile app developers’ emphasis on privacy across mobile app categories. The results show that information sensitivity level has a significant impact on developers’ emphasis on secondary usage of information. Moreover, we analyze the dynamism of the privacy trade-off in using a new social networking app and how it can result in emotional attachment. Results show that initial use and initial disclosure influence the privacy trade-off from the pre-use to the initial-use period. Finally, the effect of privacy concern and engagement on emotional attachment is demonstrated.

Copyright 2016

by

Mehrdad Koohikamali


ACKNOWLEDGMENTS

This achievement could not have happened without the help and devotion of many important people in my life. First, this dissertation was made possible, in part, by my graduate committee’s continuous help and guidance during this research. I wish to express my sincere appreciation to my advisor, Dr. Dan J. Kim, for his support and patience during this research. I would also like to thank Dr. Victor Prybutok, Dr. Chang Koh, and Dr. Aaron French for their very helpful suggestions. Special thanks to my family and all those who have supported my continuing education. I would like to thank my wife for her patience; she has always supported me and given me the strength and optimism to persevere through the hard times. I want to thank Allah for giving me the strength to overcome obstacles. Finally, I want to dedicate this dissertation to the people of my country, Iran.

TABLE OF CONTENTS

ACKNOWLEDGMENTS ...... iii

PROLOGUE ...... 1

CHAPTER 1: DO DIFFERENT MOBILE APPS EMPHASIZE DIFFERENT PRIVACY

DIMENSIONS? AN INVESTIGATION OF PRIVACY POLICIES OF MOBILE APPS

THROUGH FRAMING AND CONTENT ANALYSIS ...... 10

1.1.Introduction ...... 10

1.2. Literature Review...... 13

1.2.1. Multidimensional Concept of Privacy...... 13

1.2.2. Privacy Policies of Mobile Apps ...... 15

1.2.3. Sensitivity Level of Personal Information ...... 16

1.3. Theoretical Foundation ...... 18

1.3.1. Mobile User Privacy Concern ...... 18

1.3.2. Prospect Theory...... 21

1.3.3. Research Hypotheses...... 22

1.4. Methodology ...... 27

1.4.1. Data Collection ...... 27

1.4.2. Data Analysis ...... 28

1.5. Results ...... 31

1.5.1. Ease of Reading ...... 31

1.5.2. Associative Frequencies of Privacy Dimensions ...... 32

1.5.3. ANOVA Results ...... 35

1.6. Discussion and Implications ...... 37

1.6.1. Implications ...... 40

1.6.2. Limitation and Future Research ...... 42

1.7. Conclusion ...... 43

CHAPTER 2: PRIVACY TRADE-OFF DYNAMICS: THE ROLE OF INITIAL USE AND

SELF-DISCLOSURE ...... 45

2.1. Introduction ...... 45

2.2. Literature Review...... 48

2.2.1. Use of Social Networking Apps ...... 48

2.2.2. Privacy Concern and Perceived Benefit Trade-Off...... 49

2.2.3. Self-Disclosure ...... 52

2.3. Theoretical Background ...... 54

2.3.1. Privacy Trade-Off Dynamism ...... 54

2.3.2. Research Model and Hypotheses ...... 57

2.4. Methodology ...... 66

2.4.1. Measurement Development...... 66

2.4.2. Data Collection ...... 66

2.5. Analysis and Results ...... 68

2.5.1. Reliability and Validity ...... 68

2.5.2. Structural Model Assessment ...... 70

2.6. Discussion ...... 72

2.6.1. Implications ...... 75

2.6.2. Limitations and Future Research...... 77

2.7. Conclusion ...... 77

CHAPTER 3: EMOTIONAL ATTACHMENT TO SOCIAL NETWORK APPS ...... 79

3.1. Introduction ...... 79

3.2. Literature Review...... 81

3.2.1. Mobile Apps ...... 81

3.2.2. Privacy Concern, Anonymity, and Self-Disclosure ...... 83

3.2.3. Engagement with SNAs ...... 85

3.2.4. Emotional Attachment to SNAs ...... 87

3.3. Theoretical Background ...... 89

3.3.1. Research Model and Hypotheses ...... 93

3.4. Research Methodology ...... 98

3.4.1. Research Design and Procedure ...... 99

3.5. Data Analysis and Results ...... 100

3.5.1. Measurement Model ...... 100

3.5.2. Structural Model ...... 101

3.6. Discussion ...... 102

3.6.1. Discussion of Results ...... 102

3.6.2. Implications ...... 104

3.7. Conclusion ...... 105

EPILOGUE ...... 107

APPENDICES ...... 109

REFERENCES ...... 128


PROLOGUE

A recent trend in information technology (IT) is the prevalent use of mobile applications (apps) on smartphones. Gartner reports that by the end of 2017 the demand for mobile app development will outstrip the capacity of IT organizations to deliver apps. Further, smartphone ownership has reached the tipping point of exceeding desktop ownership. The continuous climb in the adoption of smartphones is due to the increasing demand for apps as important tools providing various services. On average, people spend 158 minutes on their smartphones every day, and more than 80% of that time is spent using mobile apps (Novak, 2016). As of June 2015, the number of mobile app downloads worldwide had reached over 100 billion, a figure projected to rise to more than 268 billion by 2017 (Statista, 2015). Worldwide mobile app revenue is estimated to grow by 33% ($19 billion) from 2016 to 2017. Of the total time spent on mobile apps, people spend 69% on game and social networking apps (Rudolph, 2015).

The fundamental success of mobile apps and the services they offer is contingent on access to personal information. Interestingly, reports indicate that 80% of mobile apps put users’ privacy at risk (Musthaler, 2013). Among all mobile app users, 60% have declined to install an app after they discovered how much personal information would be available to it, and 43% have uninstalled an app due to privacy concerns (Olmstead and Atkinson, 2015). Protecting the privacy of personal information, especially for more sensitive data such as health information, is critical for both users and mobile app developers (Martínez-Pérez et al., 2015). Compared to web sites, mobile apps are more intimate to users, and people have immediate and ubiquitous access to them. Thus, concerns about the privacy of information provided to mobile apps become more important (Meng et al., 2016).

The adoption of social network sites and the role of privacy concern have been fairly well investigated in prior information systems (IS) research. Mainstream privacy research has focused on privacy concern as the mediating variable between a set of antecedents and outcomes, known as the APCO model (Dinev et al., 2015). Perceived behavioral control, subjective norm, gratification, attitude, convenience, social identity, usefulness, and information sensitivity are among the antecedents in privacy research; self-disclosure, adoption intention, continuance intention, engagement, relationship development, and usage are among the outcomes. Privacy is an important determinant of users’ decisions to use technologies, particularly when personal information is disclosed (Ayalon and Toch, 2013; Squicciarini et al., 2011).

There is a major gap in the literature about the privacy of mobile device users in the context of mobile apps. In addition to this major gap in the current IS literature, there has been inadequate investigation of information privacy from organizations’ perspectives (Smith et al., 2011), of a longitudinal perspective on information privacy concern (Belanger and Xu, 2015), and of the context-dependencies of people’s concerns (Acquisti et al., 2015). This dissertation focuses on the role of information privacy concerns in both providers’ protective behaviors and individuals’ usage behaviors in the context of social networking apps (SNAs) and addresses one fundamental research question:

Research question: Does mobile users’ personal information privacy differ in the context of mobile apps?

To answer the fundamental research question, this dissertation investigates three sub-research questions:

(1) Is there any difference in privacy aspects among privacy policies of mobile apps?

(2) How does the privacy trade-off change over time due to self-disclosure and initial use experiences of a new social networking app?


(3) How is the ongoing emotional attachment to a new SNA influenced by ongoing engagement and ongoing privacy concern?

Three Essays in this Dissertation

Essay I

Essay I studies the role of information sensitivity level in different dimensions of information privacy on mobile apps. It specifically focuses on the privacy statements of different mobile app categories to understand how mobile app developers address users’ privacy concerns. Essay I applies the framing concept of prospect theory and the framework provided by Smith et al. (1996) about four key privacy dimensions (collection, secondary use, improper access, and error) to understand the privacy policies of different app categories. Essay I contributes to the literature in several ways. It extends the current user-centric view on information privacy by providing mobile app developers’ perspectives on privacy. It also provides practical insights into privacy research to enhance the effectiveness of privacy policies.

Privacy policies are widely applied to assure users of information privacy practices; however, for various reasons, they are ineffective from users’ perspectives. In this study, the four key dimensions of mobile app privacy are collection, secondary use, improper access, and error. Through the lens of prospect theory and the framing concept, this study investigates the role of information sensitivity level in the emphasis that the privacy policies of mobile apps place on these dimensions. To differentiate between the statements of privacy policies, readability and length are first calculated. Then, a text mining method developed in R is used to analyze the weights of the four key privacy dimensions in the privacy policies of three mobile app categories (health, navigation, and game). Based on the sensitivity level of the information they handle, mobile apps are categorized into three classes: type I, dealing with highly sensitive information; type II, dealing with moderately sensitive information; and type III, dealing with non-sensitive information.
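The dissertation implements its text mining pipeline in R. As a rough illustration of the same idea, the Python sketch below computes a policy's length and Flesch-Kincaid grade level and derives normalized keyword weights per privacy dimension. The keyword lists, the syllable heuristic, and the sample policy text are illustrative assumptions, not the dissertation's actual coding scheme.

```python
import re

# Illustrative keyword lists per privacy dimension (assumed for this
# sketch; NOT the dissertation's actual coding scheme).
DIMENSIONS = {
    "collection":      ["collect", "gather", "obtain"],
    "secondary_use":   ["share", "third party", "advertis", "sell"],
    "improper_access": ["access", "security", "encrypt", "safeguard"],
    "error":           ["accurate", "correct", "update", "review"],
}

def count_syllables(word):
    # Crude heuristic: count groups of consecutive vowels.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text):
    # Standard Flesch-Kincaid grade-level formula.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return 0.39 * len(words) / sentences + 11.8 * syllables / len(words) - 15.59

def dimension_weights(text):
    # Count keyword hits per dimension, then normalize to weights.
    lower = text.lower()
    hits = {d: sum(lower.count(k) for k in kws) for d, kws in DIMENSIONS.items()}
    total = sum(hits.values()) or 1
    return {d: n / total for d, n in hits.items()}

policy = ("We collect your location data. We may share information with "
          "third party advertisers. You can review and correct your data.")
print(len(policy.split()), flesch_kincaid_grade(policy), dimension_weights(policy))
```

A real pipeline would also stem or lemmatize terms and weight them (e.g., tf-idf) rather than using raw substring counts; this sketch only shows the shape of the computation.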

The final dataset contains privacy policies of 90 mobile apps from three categories. Readability calculations reveal that privacy policies of mobile apps are very complex and that users would need at least 12 years of education to understand them completely. In addition, the average length of privacy policies is at least 1,900 words. The extensive length of privacy policies hinders users from reading them completely. ANOVA results show a significant difference in the secondary use and error dimensions among health, navigation, and game apps. In addition, findings demonstrate that the collection and secondary use dimensions are more emphasized in privacy policies of health apps than of game apps. This study makes several contributions. First, building upon the framing concept of prospect theory provides an appropriate framework for understanding the organizational perspective on privacy concerns. Second, it is shown that information sensitivity level could be useful in future research defining privacy trade-off functions. Third, the developed methodology could help practitioners design more effective privacy policies.
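The ANOVA comparison of dimension weights across app categories can be illustrated with a small one-way F-test, implemented here from first principles. The dimension weights below are made-up numbers for three hypothetical categories, not the study's data.

```python
def one_way_anova_F(*groups):
    # One-way ANOVA F statistic: ratio of between-group to
    # within-group mean squares.
    k = len(groups)                          # number of groups
    n = sum(len(g) for g in groups)          # total observations
    grand = sum(sum(g) for g in groups) / n  # grand mean
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# Hypothetical secondary-use weights for three app categories.
health     = [0.42, 0.45, 0.40, 0.44]
navigation = [0.30, 0.28, 0.33, 0.31]
game       = [0.18, 0.22, 0.20, 0.19]
F = one_way_anova_F(health, navigation, game)
print(F)  # a large F suggests the category means differ
```

In practice one would use a statistics package (e.g., `scipy.stats.f_oneway`) and compare the F statistic against the F distribution with (k-1, n-k) degrees of freedom to obtain a p-value.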

Essay II

The information privacy concern is not static, and researchers have discussed changes in privacy concerns (Hong and Thong, 2013; John et al., 2011; Knijnenburg and Kobsa, 2013). Variations in concern for privacy are due to a new experience or phenomenon, increased understanding of the context, changes in the environment, and modifications in legislation (Acquisti et al., 2015; John et al., 2011). The use of new technologies may have advantages for the individuals involved, such as utilitarian and hedonic benefits (Lee, 2009). Prior research has focused only on the influence of privacy concern on disclosure behavior and the role of contexts in perceptions of privacy concern. However, further research is needed to evaluate the dynamic changes in privacy concern when people use social networking mobile apps.

During the beginning usage of new technologies, it is important to study changes in the privacy trade-off over a period of time as individuals build new relationships, gain more experience, and disclose personal information. Belanger and Xu (2015) call for IS privacy studies to measure actual amounts of information disclosed in a longitudinal fashion. Building upon previous literature, the current research addresses the following questions: (1) How does the privacy trade-off change over time due to self-disclosure and initial use experiences? (2) What is the relationship between the privacy trade-off and intention to use SNAs in the pre-adoption phase? (3) How does personal innovativeness in IT moderate the relationship between pre-perceptions and intention to use a new SNA?

Essay II explores the dynamism of the privacy concern, particularly when users adopt a new social networking app (SNA) and disclose their personal information on it. On SNAs, users develop social interactions as a result of a mental trade-off between their concern for privacy and rewards (benefits), defined as the privacy trade-off (Jiang et al., 2013). During the usage phases of SNAs, the initial perceptions of privacy concerns and benefits (the privacy trade-off) change due to usage experiences and self-disclosure (Liu et al., 2014b).

The integration of innovation diffusion theory, the theory of planned behavior, the theory of belief updating, and the privacy trade-off model provides the theoretical underpinning of this research. Essay II develops a new research model to explain the change in the privacy trade-off from pre-use to initial use and its relationship with self-disclosure. Essay II provides a longitudinal perspective on the privacy trade-off using actual usage data of a new SNA. Applying this longitudinal perspective, this research posits a research model that shows the role of initial use and initial self-disclosure in updating users’ pre-perceptions of using a new SNA to initial perceptions of using it. The trade-off between pre-privacy concern and pre-perceived benefit determines the intention to use an SNA. Then, initial perceived benefit is explained by pre-perceived benefit, initial use, and initial self-disclosure. Initial privacy concern is explained by pre-privacy concern and initial use.

The research model is tested using survey data collected in two phases. Phase 1 of the survey was conducted to capture pre-use perceptions and yielded 390 respondents. Phase 2 of the survey was conducted with a one-week delay after phase 1 to capture initial-use perceptions and usage experiences. Respondents were asked to install and use the new SNA (Sociabile) on their smartphones. The second survey concluded with 349 respondents who were adopters of the app. Only users who completed both phases were included in the analysis, so that pre- and post-evaluations could be conducted. The final dataset contained 239 usable responses suitable for analysis.

The research model shows that intention to use a new SNA is determined by pre-perceived benefit and pre-privacy concern. Personal innovativeness in IT has a negative influence on the relationship between pre-privacy concern and intention to use. Further, the research model suggests a relationship between initial use and initial self-disclosure. Initial perceived benefit is determined by pre-perceived benefit, initial use, and initial self-disclosure. Initial privacy concern is determined by pre-privacy concern and initial use. Finally, initial self-disclosure is not related to initial privacy concern.

Essay II contributes to the literature by extending the privacy calculus model. Findings also advance our understanding of the context-specific privacy trade-off over time. By providing a dynamic model of the privacy trade-off, the findings of this study offer significant contributions to theory and practice in the privacy literature in the IS field. This study serves as an initial attempt to investigate the dynamics of the privacy trade-off during the use of a new social networking app. The research model is built upon an integration of innovation diffusion theory, the theory of belief updating, the theory of planned behavior, and the privacy trade-off model to investigate the dynamism of the privacy trade-off through the pre-use and initial use phases.

Essay III

The adoption and use of social network sites (SNSs) has been examined in many studies (Broeck et al., 2015; Chang and Zhu, 2011; Ku et al., 2013). Among the variables studied, self-disclosure, intention to adopt, and continuance intention have been investigated more than other constructs. Prior literature is divided into two main categories. The first group of research focuses on the motivations and deterrents behind using SNSs. The second group of studies investigates the consequences of using SNSs. Research shows the paradoxical influence of using SNSs among different groups of users. For example, while in some studies privacy concern is negatively related to SNS usage behaviors such as self-disclosure (Lin and Liu, 2012), in another series of research there is no significant relationship between privacy concern and self-disclosure (Koohikamali et al., 2015). Furthermore, in the SNS context, the negative influence of privacy concerns on engagement has also been demonstrated (Mahan et al., 2015; Vasalou et al., 2015).

Engagement with SNSs is lower for people with higher privacy concerns (Staddon et al., 2012). Engagement with SNSs is one of the indicators of the success of implementing them for businesses and organizations (Chu and Kim, 2011). In the later stages of using SNSs, people build emotional attachment toward them (Abouzahra et al., 2014). When users experience

an information technology (IT) for the first time, their emotion toward the IT is a response to the

anticipated experience (Lowry et al., 2015). To keep gaining the anticipated response, positive

emotions motivate users to continue using an IT (Lowry et al., 2015).

Nowadays, the average time people spend on their mobile apps is increasing. However, the

life cycle of a new mobile app could be very short. For mobile app providers, it is important to

have a profound impact on users during the first experience. The extent to which people engage with social networking apps (SNAs) is not clear, and previous research has not explored this area. Prior work has shown the antecedents of app usage and the role of privacy concern in self-disclosure. However, there is a gap in the literature about the antecedents of emotional attachment to SNAs. Furthermore, the relationship between engagement and emotional attachment has received little attention in the information systems (IS) literature.

Finally, most of the previous literature has extensively examined the negative influence of privacy concerns on self-disclosure but has not included the influential role of initial perception of anonymity on the depth of self-disclosed information and ongoing privacy concerns. These gaps in the literature lead this research to pose the following research questions: (1) How is the ongoing emotional attachment to a new SNA influenced by ongoing engagement and ongoing privacy concern? (2) What is the influence of initial self-disclosure depth and initial self-anonymity on ongoing privacy concern and ongoing engagement? (3) What is the influence of ongoing privacy concern on ongoing engagement with a new SNA?

Essay III proposes a research model to investigate privacy concern and engagement as two antecedents of emotional attachment to an SNA. In addition, it examines the importance of perceived anonymity and self-disclosure depth for privacy concerns and engagement with the SNA. The research model is tested with survey data. Essay III examines the relationship between engagement, emotional attachment, and privacy concern during the usage stage of an SNA.

Similar to information privacy concern, emotional attachment is not constant; it changes over time (Mugge et al., 2006). Privacy concern is an important gate to engagement on social networks (Staddon et al., 2012). Essay III explores the extent to which the relationship between privacy concern, engagement, and emotional attachment influences ongoing use. Essay III offers several contributions. First, it suggests an integrated model of engagement, attachment, and privacy concern. Second, it extends attachment theory by showing change in engagement and privacy concerns where people continuously build relationships.

CHAPTER 1

DO DIFFERENT MOBILE APPS EMPHASIZE DIFFERENT PRIVACY

DIMENSIONS? AN INVESTIGATION OF PRIVACY POLICIES OF MOBILE APPS

THROUGH FRAMING AND CONTENT ANALYSIS

1.1. Introduction

As the adoption of smartphones becomes more prevalent, the popularity of mobile applications (apps) continues to rise. More than 50% of smartphone users download apps (MobiForge, 2013), and more than 4.4 billion individuals are anticipated to use mobile apps worldwide by 2017 (MobiForge, 2013). Users can download from more than 5 million apps available in app markets such as Google Play, the Apple App Store, the Amazon App Store, the Windows Phone Store, and BlackBerry World (Statista, 2015). The two most popular app markets are the Apple App Store and Google Play (Miller, 2011). Despite their popularity, the privacy and security of mobile apps are still of critical concern (Barkhuus and Dey, 2003; Snekkenes, 2001), because app providers can potentially access sensitive user data (Liu et al., 2014a). To protect both app users and providers, app markets require a certain level of security and privacy agreement between the user and provider (Li and Clark, 2013).

Privacy statements and user agreements (PSUA) are legal contracts between users and app providers that explain users’ personal information risks and providers’ obligations (Breaux and Baumer, 2011; Gindin, 2009). Statements of privacy policies generally cover the obligations of app users to secure personal information and also the intention of app providers to collect, store, use, and share users’ personal information (Young and Anton, 2010). To prevent violations of individuals’ privacy, legal authorities require mobile markets to compel app providers to provide comprehensive privacy policies to users (Breaux and Rao, 2013; Myles et al., 2003). To install and use mobile apps, users must agree with the provider’s privacy policies, which determine the extent to which the app may access and use users’ information (Gerlach et al., 2015).

Privacy policies contain information to minimize the legal consequences for providers of

collecting users’ personal information and to maximize the protection of personal information

from hidden disclosures and risks for users (Bansal et al., 2010). Statements in privacy policies include information that may influence users’ perceptions of privacy risks (Gerlach et al., 2015).

There are two streams of research in this area: first, a group of studies focused on the benefits of

posting privacy policies for companies (Andrade et al., 2002; Hann et al., 2007) while a second

group of studies examined the effectiveness of privacy policies for users (Cranor et al., 2015; Xu

et al., 2008).

Inconsistencies exist between the findings of prior research mentioned earlier in terms of

reading privacy policies. The first group of research finds the majority of users are likely to read

or skim privacy policies the first time they visit a website (Gerlach et al., 2015); however, privacy

policies are too complicated for users to easily comprehend (Liccardi et al., 2014). Other groups

of studies show privacy policies are hard to read, and the likelihood of users reading these policies

is low (McDonald and Cranor, 2008; Sunyaev et al., 2015). Based on the findings of a study by

McDonald and Cranor (2008), the average time people should spend reading the privacy policy of

a website is 10 minutes. Findings of previous research have shown privacy policies are ineffective

at helping to evaluate possible risks to information privacy and that most users do not read the

privacy policies (Cotton and Bolan, 2012; Gindin, 2009). A survey by the Pew Research Center revealed that half of Americans online do not know what a privacy policy is (Smith, 2014). Overall, previous research has failed to show how mobile app developers protect users’ privacy in various situations (Koohikamali and Kim, 2015). A gap remains in the literature with regard to the contents of privacy policies and what type of information providers include. Furthermore, despite our expectation, there is also a gap in the literature on how app providers protect highly sensitive information in privacy policies.

To address these gaps in the literature, this study investigates how mobile app privacy

policy statements are designed to protect personal information. More specifically, understanding

how different categories of mobile apps protect users’ information privacy when personal

information is more or less sensitive is important. Mobile users want a better grasp of how their sensitive personal information is protected by app providers. In order to reveal whether users’ expectations – that the privacy of their highly sensitive personal information should be carefully protected – match the statements of providers’ privacy policies, this study focuses on privacy policies of mobile apps as the voice of developers. This study answers the following research questions. RQ1: Is there any difference in privacy aspects among privacy policies of mobile apps? RQ2: How does the information sensitivity level influence app providers’ emphasis on the protection of personal information privacy as reflected in the privacy policies of mobile apps?

To better understand the risks associated with the protection of information privacy on mobile apps, this research focuses on three categories of mobile apps (i.e., health, navigation, and game apps) that deal with three different levels of sensitivity (i.e., very sensitive, sensitive, and less sensitive personal information, respectively). Health-related information is perceived as more sensitive than other types of information (Bansal et al., 2010). Personal location information is considered sensitive, but less so than health information (Li et al., 2013). Finally, information disclosed to game apps, except financial information for premium versions, is less sensitive compared to the other two categories. The objectives of this research are twofold: 1) to understand how information sensitivity level influences the key dimensions of information privacy among different mobile app categories; and 2) to provide theoretical and practical insights to the mobile app security and information privacy research community.

1.2. Literature Review

1.2.1. Multidimensional Concept of Privacy

Most recent privacy research has focused on the importance of online personal information privacy and the role of individuals’ perceived privacy concerns in the disclosure of personal information (Pavlou, 2011; Smith et al., 2011). Investigating multiple aspects of privacy requires a privacy taxonomy. Privacy is usually seen as a social issue (Westin, 1968). Conceptualizations of privacy in the previous literature fall into three categories. First, some researchers have argued that privacy is an individual’s right to determine what information is communicated to others (Yu et al., 2015). The second group of researchers has discussed privacy as an individual’s measure of control over information about himself/herself (Schoeman, 1984). The third group of scholars has defined the right to privacy as a key aspect of human dignity and of limited access to a person (Schoeman, 1984). Overall, individuals’ privacy concerns are related to people’s real-life situations (Laufer and Wolfe, 1977).

Individuals’ privacy is a three-dimensional concept: self-ego, environmental, and interpersonal (Laufer and Wolfe, 1977). The self-ego dimension focuses on autonomy and personal dignity in every society and is a developmental process. In other words, the process of separating from the social and physical environment is necessary for the development of the self. Accordingly, individuals develop appropriate privacy expressions, and privacy experiences affect individuals’ self-esteem, identity, and behavior. The second dimension covers environmental elements, which affect people’s “abilities to perceive, have, and use available options” (Laufer and Wolfe, 1977, p. 28). Environmental elements include cultural meanings, social arrangement, interaction with the physical setting (socio-physical), and the life cycle stage, and all of them have a temporal component in common (Laufer and Wolfe, 1977, p. 28). The last dimension, interpersonal privacy, is the basis for the environmental and self-ego dimensions and constitutes the core of the privacy phenomenon (Laufer and Wolfe, 1977, p. 28).

Laufer and Wolfe (1977, p. 28) view privacy as the ability to be alone, to have control over space, to have a non-intrusive situation where no one bothers the person, and to have control over access to information. The first three meanings are related to interaction management, and the last is related to information management (Hong and Thong, 2013; Laufer and Wolfe, 1977). In online environments, online privacy is defined as users’ rights regarding what personal information they want to be known by others (Westin, 2003). Concern for online information privacy includes four dimensions: collection, unauthorized secondary use, improper access, and error (Smith et al., 1996).

With the advent of the Internet, new aspects of privacy concerns emerged. The intimacy of the personal information stored on mobile apps heightens concerns for personal information privacy (Liu et al., 2014a). To take control over their personal information, users need sufficient knowledge about the consequences of disclosing it (Cranor, 2012). Many smartphone users struggle to control the personal information they disclose to apps (Rainie and Duggan, 2016). Users’ privacy concerns should be addressed properly to reduce the possible risks that arise when information is collected and shared by an app (Barkhuus and Dey, 2003).


1.2.2. Privacy Policies of Mobile Apps

Privacy policies are public documents posted on Web sites to state the responsible organization’s policy regarding the collection, use, and disclosure of personal information (Gindin, 2009; Mai et al., 2010). Certain organizations are required by law to inform users how their information will be used and disclosed (Vail et al., 2008). The complexity of privacy policies makes it difficult for users to read and understand them (Massey et al., 2013; Vail et al., 2008). Vague privacy policies can lead users to quit using a service or to stop disclosing personal information (Mcdonald et al., 2009). Two key principles of information privacy protection are notice and choice (Cranor, 2012).

Effective privacy policies can protect privacy because they draw a clear picture of the possible risks to users’ privacy (Joinson et al., 2010). Goel and Chengalur-Smith (2010) offered three metrics for measuring the effectiveness of privacy statements: clarity, breadth, and brevity. First, clarity is the ease of reading and understanding a text; it depends on characteristics of the user, such as prior knowledge and education, and is quantified by the number of words per sentence and the number of complex words. Second, breadth measures a policy’s comprehensiveness, captured by the number of technical terms in the policy statement. Last, brevity is measured as the total number of unique words; shorter statements reduce confusion (Goel and Chengalur-Smith, 2010). Mobile app providers have tried to meet their responsibilities by providing comprehensive privacy policy documents, but these documents are not easily interpretable for users (Kwon et al., 2009).

Privacy policy statements cover two practices: awareness of information collection and information usage. To better understand how privacy policy statements differ across the dimensions of privacy, the distribution and frequency of privacy keywords can be analyzed. Privacy keywords are representations of specific aspects of privacy (Licorish et al., 2015) and are limited to the most common challenges in protecting personal information (Massey et al., 2013). For example, the literature suggests 57 common privacy keywords as a metric to show differences between privacy policies (Stufflebeam et al., 2004). To better capture the active practice of protecting personal information, privacy goal keywords are used. The twenty-one most common privacy protection goal keywords are given in Table 1 (Massey et al., 2013).

Table 1 - Most common privacy protection goal keywords (Massey et al., 2013)

Goal keywords: Access, Apply, Change, Collect, Comply, Connect, Display, Help, Honor, Inform, Limit, Notify, Opt-in, Opt-out, Post, Request, Reserve, Share, Specify, Store, Use

1.2.3. Sensitivity Level of Personal Information

Sensitivity is an important attribute of personal information. Sensitive information is associated with higher intimacy (Mothersbaugh et al., 2011), and intimate, sensitive information is vulnerable to losses (Lwin et al., 2007). Possible losses from disclosing personal information include psychological, physical, and material risks (Mothersbaugh et al., 2011). Sensitivity of information is defined as the extent of the possible losses associated with its disclosure (Mothersbaugh et al., 2011). Some types of data are more sensitive than others; information that increases users’ vulnerability to risks is more sensitive (Badrul et al., 2016). For example, the Cranor et al. (2000) study suggests that financial data (e.g., credit card numbers), health data (e.g., medical history), and location data (e.g., postal address) are perceived as more sensitive than other types of personal information. People are least comfortable when highly sensitive personal information is disclosed (Cranor et al., 2000). Consequently, divulging highly sensitive information can cause greater losses than divulging less sensitive information (Cranor et al., 2000).

For the purpose of this study, mobile apps are classified by the sensitivity level of the information they generally handle. The three classes of mobile apps in this study (health, navigation, and game) differ in the sensitivity level of users’ information, ranging from highly sensitive to low-sensitivity information. Game apps are known as experiential apps, in contrast to health apps, which are informational (Bellman et al., 2011). According to the Department of Homeland Security handbook on sensitive information, all personally identifiable information (PII) is sensitive (Callahan, 2012). Both health and location information are sensitive (Bansal et al., 2010; Li et al., 2013), but the Health Insurance Portability and Accountability Act suggests that personal health information is usually the most sensitive data and must be intensely protected (Martínez-Pérez et al., 2015). For example, many people are reluctant to share their personal health records (PHR) with third-party companies (Ozdemir et al., 2011). The Location Privacy Protection Act of 2014 in the US is an effort to ensure users’ privacy: “the legislation would require a non-governmental entity to obtain the consent of a user before collecting or sharing location information collected from an electronic device” (Hiller and Park, 2014). Finally, game app users are not required to provide personal information to use a game app’s functionality (Crampton and Betke, 2003). As a result, users usually do not disclose personally identifiable information to game apps, except for purchases (Bellman et al., 2011).


1.3. Theoretical Foundation

1.3.1. Mobile User Privacy Concern

Online privacy concern relates to users’ control over their information and reflects their awareness of their rights (Sheehan and Hoy, 2000). Sheehan and Hoy (2000) discuss five dimensions of online users’ privacy concerns: awareness of information collection, information usage, information sensitivity, familiarity with the entity, and compensation. First, awareness of information collection refers to situations in which users know that some of their information is being collected, such as registration forms presented before using a website. Second, information usage addresses the ways users’ information is used and the purposes of that use. For example, users’ personal data may be delivered to third-party agencies for various purposes, and third-party use of personal information can deprive users of control over it. Third, information sensitivity emerges from the ambiguity of information ownership; it is “the level of privacy concern an individual feels for a type of data in a specific situation” (Weible, 1993, p. 30). Fourth, when users’ willingness to share sensitive information depends on their trust in the data-gathering entity, this is known as familiarity with the entity. Last, compensation is a procedure that mitigates privacy concerns about information collection by providing benefits to users (Sheehan and Hoy, 2000).

Among the five dimensions of online users’ privacy concern suggested by Sheehan and Hoy (2000), information sensitivity is the most complicated. Sensitivity of information refers to “a personal information attribute that informs the level of discomfort an individual perceives when disclosing specific personal information to a specific external agent” (Dinev et al., 2013, p. 302). “Sensitivity to information is the degree of privacy concerns towards certain information in a specific situation” (Badrul et al., 2016, p. 39). Information sensitivity may change across contexts, among different groups of people, and with respect to other information (Lips and Eppel, 2016).

Consequently, the level of information sensitivity influences concerns for privacy (Badrul et al., 2016). On mobile apps, users may indirectly give access to much of the personal information stored on their smartphones or directly disclose personal information on user profiles. Depending on what types of information individuals disclose (directly or indirectly), their concern for personal information privacy is influenced differently. For example, when people use a health app they can enter their personal health information into their user profiles (direct disclosure), or they may connect the app to a health kit and give access to their health data (indirect disclosure). Users who enter very sensitive health records may feel very uncomfortable and concerned about their privacy (Badrul et al., 2016).

To explain individuals’ privacy concern in the context of mobile apps, this study adopts the information privacy concern framework developed by Smith et al. (1996) and the conceptualization of privacy concern for context-aware mobile applications suggested by Liu et al. (2014b). The four dimensions of privacy concern in the context of mobile apps are collection, secondary use, improper access, and error (Junglas et al., 2008b; Liu et al., 2014b; Smith et al., 1996). Concern for collection relates to the amount of information being collected by the app (Junglas et al., 2008b; Smith et al., 1996). Concern for secondary use relates to the disclosure of users’ information to other companies or its use for purposes other than those stated in the policy (Junglas et al., 2008b; Smith et al., 1996). Concern for improper access is users’ concern that unauthorized parties may access their information (Junglas et al., 2008b; Smith et al., 1996). Concern for error relates to the lack of adequate procedures to verify the accuracy of users’ data or to update users’ information (Junglas et al., 2008b; Smith et al., 1996).

In the context of mobile apps, different types of information are shared or made accessible. Mobile apps have access to an extensive range of personal information, either directly through user disclosure on the app profile or indirectly through access to data stored on the device. Consequently, depending on the sensitivity level of the information that mobile apps can access, people may express higher or lower concern for privacy (Badrul et al., 2016).

From app providers’ standpoint, protecting individuals’ privacy is the primary way to avoid inconsistencies between users’ expectations and providers’ privacy protection practices (Breaux and Rao, 2013). Privacy policies allow both users and providers to outline their preferences and concerns pertaining to private data (Allison et al., 2009). As a result, providers often publish compliant privacy policies to protect users’ privacy (Cranor et al., 2015). Users are not comfortable disclosing sensitive personal information, and they expect companies that deal with such information to be more meticulous, as expressed in the statements of their privacy policies (Allison et al., 2009; Cranor et al., 2015).

There is a very strong relationship between information disclosure behavior and its context (Knijnenburg and Kobsa, 2013). Many strategies are pursued to influence the decision to disclose information, such as providing justifications for disclosure (Patil et al., 2011), reordering disclosure requests (Acquisti and Grossklags, 2005; Acquisti et al., 2012), and displaying privacy statements (Xu et al., 2011a). The presence of a privacy statement has a proven, significant effect on information disclosure (Hui et al., 2007). Both the sensitivity of personal information and the context of the mobile app affect perceptions of privacy concern (Anderson and Agarwal, 2011; Bansal et al., 2010).

1.3.2. Prospect Theory

In the context of mobile apps, mobile users disclose personal and sensitive private information. Prospect theory is an appropriate framework for conceptualizing how the sensitivity level of information influences the emphasis placed on dimensions of privacy within privacy policies. Prospect theory explains that the ultimate decision among risky choices is based on an evaluation of gains and losses (Kahneman and Tversky, 1979); it models how people choose between options involving gains and losses (Yechiam and Hochman, 2013). Prospect theory suggests that human behavior changes in response to hypothetical risks and that decision making under uncertainty is based on a personal utility (Scheufele and Tewksbury, 2007).

Framing, a concept at the center of prospect theory, refers to the manner in which a statement is worded (Tversky and Kahneman, 1992; Young et al., 2012). Prospect theory suggests three types of framing: framing under risk, attribute framing, and goal framing (Scheufele and Tewksbury, 2007). Framing can be defined as the description of analogous choices in different ways (Edwards et al., 2001); it refers to situations in which individuals’ responses to uncertainties (decisions) vary based on the way those uncertainties are formulated (Keren, 2012). Framing as a modern communication tool is crucial in policy making, because different framings of information within a policy could yield different consequences (Keren, 2012).

Based on the sociological foundation of framing, individuals’ understanding of the surrounding environment and its phenomena depends on the interpretive tools people possess, such as life experiences and knowledge (Goffman, 1974). Research shows that practitioners’ privacy decisions are influenced by choice framing (Adjerid et al., 2016). Through the lens of prospect theory, Mothersbaugh et al. (2011) investigate a value function to explain the gains and losses that information sensitivity levels impose on information disclosure behavior. They argue that perceived, rather than potential, gains and losses should be included in the value function, and they note that contextual factors can significantly influence perceived loss (Mothersbaugh et al., 2011). Prospect theory explains that losses may weigh more than gains because of how the relative size of a loss is evaluated, a phenomenon known as loss aversion (Tversky and Kahneman, 1992).

Mobile app companies’ choices about protecting sensitive personal information can be viewed as a function of presumptive gains and losses to both users and the company. Mistreatment of non-sensitive personal information may not impose serious consequences, but a breach of sensitive data can put individuals (whose information is breached) and companies (who should protect the information) at serious risk. In the context of mobile apps, users’ decisions to disclose personal information can be explained by prospect theory (Galletta et al., 2015). The linkage between individuals’ privacy concerns and organizations’ privacy protection efforts is not clear, because organizations use privacy policies for their own benefit (Montesdioca et al., 2015). Privacy policy development is a common practice for foreseeing risks to customers and the company via an explicit communication approach (Xu et al., 2005).

1.3.3. Research Hypotheses

It is anticipated that app providers expend greater effort to protect the privacy of more sensitive personal information of mobile app users. This study distinguishes three types of mobile apps according to the sensitivity level of the personal information they handle. Type I apps deal with the most sensitive information; health apps represent this category. Type II apps deal with moderately sensitive information; navigation apps are examples of this type. Type III apps deal with the least sensitive information; game apps are instances of this category. Apps dealing with highly sensitive data are expected to protect information privacy more cautiously. Findings of the study by Mothersbaugh et al. (2011) demonstrate that information sensitivity level influences perceived privacy concern.

Providers put more effort into protecting highly sensitive information than low-sensitivity information because the extent of the potential loss is greater for highly sensitive information (Mothersbaugh et al., 2011). Framing information within the statements of a privacy policy assures companies that their customers sufficiently understand the potential risks to their personal information and, more importantly, minimizes the company’s responsibilities in case of mistreatment of users’ data. Building on the previous literature, specifically the privacy frameworks proposed by Smith et al. (1996) and Liu et al. (2014b), the four dimensions of mobile users’ personal information privacy concerns in the context of mobile apps are collection, secondary use, improper access, and error.

From users’ perspective, concern for collection of personal information is the degree to which users are concerned about the amount of personal information processed by mobile apps (Culnan and Williams, 2009; Hong and Thong, 2013; Liu et al., 2014b; Malhotra et al., 2004). The amount of personal information held by mobile apps can substantially influence the possible losses and negative consequences for the mobile app companies that must protect it. When users disclose more personal information to a mobile app, the company would presumably have to develop more practices to protect users’ data.

Mobile app providers’ concern for collection reflects how much of users’ identifiable information is collected by them. Providers’ choices about protecting sensitive personal information can be viewed as a function of presumptive gains and losses to both users and the company. Companies should invest more time and effort in protecting highly sensitive data because of the serious negative consequences such data may impose on individuals. Highly sensitive data may be targeted more by hackers, so protecting it is more difficult and expensive (Liu and Kuhn, 2010). Because health-related information has the highest sensitivity, companies are concerned about collecting users’ health information (Choi et al., 2006), and highly sensitive data such as personal health information should be better protected when collected by mobile apps (Huckvale et al., 2015). Building on prospect theory, mobile app providers try to minimize the possibility of negative consequences to themselves and to individuals. Through the framing of information in privacy policies, companies highlight individuals’ rights and companies’ responsibilities (Straub Jr and Collins, 1990). As a result, mobile app providers are expected to emphasize the collection dimension more in Type I apps (dealing with highly sensitive information) than in Type II and Type III apps, which deal with moderately sensitive and non-sensitive personal information, respectively. This study hypothesizes:

H1a and H1b: In mobile apps’ privacy policies, Type I apps place more emphasis on the collection-of-information dimension of privacy than Type II apps (H1a) or Type III apps (H1b) do.

Individuals’ concern for secondary use of information is the degree to which users feel uneasy if their information is used for other purposes without their consent (Culnan and Williams, 2009; Hong and Thong, 2013; Liu et al., 2014b; Malhotra et al., 2004; Smith et al., 1996). In other words, a user is concerned about the use of personal information for unknown purposes without the user’s direct consent, known as concern for secondary usage of information (Hong and Thong, 2013; Liu et al., 2014b). From a mobile app company’s perspective, secondary use of users’ information could result in both benefits (e.g., financial gains) and losses (e.g., losing customers’ trust). Companies must have a clear understanding of the potential risks and benefits of the secondary usage of customers’ information (Safran et al., 2007). Highly sensitive data (such as the personal health information disclosed to health mobile apps) makes mobile app providers more vulnerable to various risks if they lack a sufficient understanding of it (Safran et al., 2007).

For companies, the benefits of the secondary usage of highly sensitive information include enhanced service quality and revenues (Hodge Jr et al., 1999). On the other hand, the secondary usage of highly sensitive information raises many legal and ethical challenges (Hodge Jr et al., 1999; Safran et al., 2007) and can impose higher costs for training additional staff (Cresswell et al., 2013). For mobile app developers, the consequences of the secondary usage of more sensitive information could be more serious than those of less sensitive information. Building upon the framing concept in prospect theory, mobile app providers usually trade off the potential risks and benefits involved in dealing with personal information. As a result, there should be more emphasis on the secondary use dimension when more sensitive personal information is available to the app. Thus, this research proposes:

H2a and H2b: In mobile apps’ privacy policies, Type I apps place more emphasis on the secondary-usage-of-information dimension of privacy than Type II apps (H2a) or Type III apps (H2b) do.

Customers’ concern about the availability of their personal information to unauthorized people is the concern for improper access (Hong and Thong, 2013; Liu et al., 2014b). Improper access can occur when there are insufficient mechanisms to prevent unauthorized access to personal information stored on databases or computers (Hong and Thong, 2013). Unauthorized access to personal information may also result from unintended disclosure by a user (Mothersbaugh et al., 2011). Mobile app providers’ concern for improper access to users’ information forces them to take more steps to limit access to personal information (Hong and Thong, 2013).


Certain vulnerabilities, such as security holes, unprotected copies of personal information, and a lack of clear access policies, increase the probability of unauthorized access (Pearson, 2009). Highly sensitive data needs more effort to be fully protected, such as the incorporation of multiple security levels and protocols and more expensive security software. Through the wording of a mobile app’s privacy policy, a company dealing with more sensitive information should foresee better practices to ensure the protection of personal information privacy. Following users’ expectations, mobile app providers consider improper access to more sensitive information more rigorously than improper access to less sensitive information. Consequently, this study proposes:

H3a and H3b: In mobile apps’ privacy policies, Type I apps place more emphasis on the improper-access dimension of privacy than Type II apps (H3a) or Type III apps (H3b) do.

The concern for errors relates to the lack of adequate mechanisms to protect the accuracy of users’ data (Hong and Thong, 2013). The error dimension of privacy concern refers to privacy issues related to the quality of personal information and its maintenance (Hong and Thong, 2013; Liu et al., 2014b; Pergler et al.). Users want easy access to their stored personal information and the ability to check the accuracy of the data (Pearson, 2009). The sensitivity level of information increases the possible risks to the individuals who disclose it and the companies that store it (Mothersbaugh et al., 2011). Under the privacy protection act, facilitating individuals’ access to their personal information enables people to check and update its accuracy (Pearson, 2009). Mobile app providers should implement better procedures to correct errors in personal information and devote more time to verifying the accuracy of personal information (Hong and


Thong, 2013). App providers should invest more time and money in protecting more sensitive personal information than less sensitive personal information. As a result, this research suggests:

H4a and H4b: In mobile apps’ privacy policies, Type I apps place more emphasis on the error dimension of privacy than Type II apps (H4a) or Type III apps (H4b) do.

1.4. Methodology

1.4.1. Data Collection

As mentioned previously, the Apple App Store is one of the biggest providers of smartphone apps. We selected three mobile app categories (health, navigation, and game) and downloaded the privacy statements of the top-ranked apps. To remove duplicates, we discarded any privacy statement provided by a company that had already contributed one, because such statements were identical across that company’s apps. Many game apps offer in-app purchases, and some handle transaction information such as credit card numbers themselves (Vitticci, 2015). Because credit card data is sensitive personal information, we removed game apps that handle purchases and store credit card numbers, so that the game category includes only apps dealing with low-sensitivity personal information.

The final set of privacy statements for the game category therefore does not include the privacy policy of any app that collects financial information. A final set of 90 different privacy statements was created, with 30 different privacy statements in each of the three app categories. A complete list of the mobile apps whose privacy policies we analyzed is provided in Appendix A.


1.4.2. Data Analysis

Our analysis includes three main steps: (1) analysis of the readability of the privacy statements, (2) a text mining approach, and (3) statistical analysis of the distribution of privacy dimensions across the three mobile app categories.

1.4.2.1. Ease of Reading

Two metrics are used to measure the ease of reading of privacy policies: a readability measure and a word count that reflects the length of the policies. Privacy policies are mechanisms often used to increase perceived control over privacy and to reduce perceived risks to privacy (Xu et al., 2008). Consequently, the statements of mobile apps’ privacy policies should assure users that their personal information is protected. To deliver crucial information effectively, the writing style needs to be simple, because users have different education levels. Readability indices show the difficulty level of the language in textual documents (Smith et al., 1996). In this study we used the automated readability index (ARI) to gauge the ease of understanding privacy statements. ARI is a popular measure for evaluating the readability of texts (Senter and Smith, 1967) and is an efficient indicator because, unlike many other measures, it is based on the number of characters in a word (Hu et al., 2012). We used the koRpus package in R version 3.1.1 to analyze the readability of the privacy statements. The ARI value is interpreted as the minimum level of education required to understand a specific text and can be calculated using the following formula (Hu et al., 2012):

ARI = 4.71 × (total characters / total words) + (total words / (2 × total sentences)) − 21.43    (1)

We also calculated the total word count of the privacy policies. The length of a privacy policy has a

negative influence on users’ intentions to read them.
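As a minimal illustration of formula (1) above, the ARI of a text can be computed directly from character, word, and sentence counts. This sketch uses Python rather than the koRpus R package the study employed, and it approximates sentence boundaries with terminal punctuation, so its counts may differ slightly from koRpus:

```python
import re

def automated_readability_index(text: str) -> float:
    """ARI = 4.71*(chars/words) + words/(2*sentences) - 21.43."""
    # Words: maximal runs of non-whitespace; characters: letters/digits only.
    words = re.findall(r"\S+", text)
    chars = sum(len(re.sub(r"[^A-Za-z0-9]", "", w)) for w in words)
    # Sentences: approximated by runs of terminal punctuation.
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    n_words = max(1, len(words))
    return 4.71 * chars / n_words + n_words / (2.0 * sentences) - 21.43

sample = "We collect your information. We may share it with partners."
print(round(automated_readability_index(sample), 2))
```

Longer sentences and longer words both raise the index, which is read as the minimum years of education needed to understand the text.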


1.4.2.2. Associative Frequencies of Privacy Dimensions

In this section, we describe our approach to processing the privacy statements. Following the procedure described by Cao et al. (2011), the text mining approach is implemented with the tm package in R version 3.1.1. The text mining process includes multiple steps: parsing (corpus creation, cleaning, and transformation), stemming, filtering, and analysis of term associations. The overall approach is shown in Figure 1.

[Pipeline: set of privacy statement documents → corpus generation and text cleaning → stemming → text filtering → analysis]

Figure 1 - Overall text mining approach in the R environment
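The cleaning, stemming, and filtering stages of Figure 1 can be sketched in Python (the study itself used R's tm package). The stop-word list and the suffix-stripping stemmer below are deliberately tiny stand-ins for tm's stop-word corpus and Porter stemmer:

```python
import re
import string

# A small illustrative stop-word list; the tm package ships a much larger one.
STOP_WORDS = {"the", "a", "an", "and", "or", "of", "to", "is", "are", "we", "your"}

def clean(text: str) -> list:
    """Lowercase, strip punctuation and extra whitespace, drop stop words."""
    text = text.lower().translate(str.maketrans("", "", string.punctuation))
    return [w for w in re.split(r"\s+", text) if w and w not in STOP_WORDS]

def stem(word: str) -> str:
    """Very naive suffix stripping, standing in for tm's Porter stemmer."""
    for suffix in ("ing", "tion", "ed", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def build_corpus(documents: list) -> list:
    """Parse -> clean -> stem each privacy statement into a token list."""
    return [[stem(w) for w in clean(doc)] for doc in documents]

corpus = build_corpus(["We collect and store your information.",
                       "Information may be shared with third parties."])
print(corpus)
```

The output of this stage, one token list per privacy statement, is what the later frequency analysis operates on.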

When a large stack of documents is available, it is important to summarize them based on their important characteristics (Patra and Singh, 2013). To understand how the four key privacy dimensions differ among the privacy statements of the three mobile app categories (Type I, Type II, and Type III), it is necessary to determine the weight of each dimension within every document. It is impossible to contrast textual information without measurable scales such as term frequency (Sebastiani, 2002). Textual data should therefore be decomposed into quantifiable elements that are suitable for data mining (Salton et al., 1975), and those quantified elements must then be transformed into a compact and informative format.

The first step in text mining is to create a corpus of documents. The corpus is built to place the textual information into a vector space so that it becomes measurable; a corpus is a collection of texts brought together for knowledge acquisition (McDonagh et al., 2011). The second step is part-of-speech tagging, which assigns words to grammatical categories (Lossio-Ventura et al., 2014). After removing unnecessary parts of the texts, such as white space, punctuation, and stop words, the frequency and distribution of the key privacy dimensions are analyzed based on their related words.

Frequent-term association is used to determine the weight of each privacy dimension. The frequency measure is an efficient basis for text mining (Hu et al., 2012), and the basic method for comparing documents is to associate the frequency of similar terms with each privacy dimension. To differentiate between the four key privacy dimensions suggested by Smith et al. (1996) and extended by Liu et al. (2014b), we followed the procedure suggested by Kim et al. (2005) to create a hit density value for each of the four key privacy dimensions in each privacy policy.

Hit density is the ratio of the number of occurrences of each privacy dimension to the total number of filtered terms (Kim et al., 2005). The frequency of hits for each dimension is an indicator of its importance (Kim et al., 2005; Morris, 1994). Four search strings were developed to calculate the hit densities of the four key privacy dimensions, as shown in Table 2. For example, for collection of information, the search string counts a hit whenever one of the main keywords (collect, keep, store) appears with one of the sub-keywords (user, customer, your) within a proximity of 50 words, together with the term "information." Keywords are extracted from the literature and from similar studies such as Massey et al. (2013). The search strings for each key privacy dimension are shown in Table 2.
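The hit density computation for the collection dimension can be sketched as follows. This is a Python illustration under assumed simplifications (tokens are matched after lowercasing, and the 50-word window is applied around each main-keyword occurrence); the study computed hit densities with its own search-string tooling:

```python
import re

# Simplified keyword sets for the "collection" dimension (see Table 2).
MAIN_KEYWORDS = {"collect", "keep", "store"}
SUB_KEYWORDS = {"user", "customer", "your"}
WINDOW = 50  # proximity window, in words

def hit_density(text: str) -> float:
    """Ratio of collection-dimension hits to the total number of terms."""
    tokens = re.findall(r"[a-z]+", text.lower())
    hits = 0
    for i, tok in enumerate(tokens):
        if tok in MAIN_KEYWORDS:
            window = tokens[max(0, i - WINDOW): i + WINDOW + 1]
            # A hit requires a sub-keyword and "information" nearby.
            if SUB_KEYWORDS & set(window) and "information" in window:
                hits += 1
    return hits / len(tokens) if tokens else 0.0

policy = "We collect your information and store it securely."
print(hit_density(policy))
```

Repeating this with the keyword sets of the other three dimensions yields a four-value hit density vector per privacy policy, which feeds the ANOVA below.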


Table 2 – Search strings of key privacy dimensions (hit density)

Dimension: Collection
  Keywords: collect, store, keep, acquire, gather, reserve, retain
  Search string: (Collect or keep or store) w/50 [(1OF user, customer, your) and (information)]

Dimension: Secondary Use
  Keywords: secondary use, share, disclose, third-party
  Search string: (Use or disclose or share) w/50 [(secondary, third party, other) and (information)]

Dimension: Improper Access
  Keywords: access, authorize, permit
  Search string: (Improper or authorized or certified) w/50 [(1OF user, customer, your) and (information)]

Dimension: Error
  Keywords: error, mistake, fault, incorrect, accurate
  Search string: (Error or mistake or inaccuracy) w/50 [(1OF user, customer, your) and (information)]

1.4.2.3. ANOVA Analysis

To test whether the mean hit densities of the mobile app categories differ for each dimension, the ANOVA method is used. Analysis of variance (ANOVA) provides statistical support for differences between the key dimensions of the privacy statements of the three mobile app categories. The ANOVA procedure is conducted in SPSS® using the hit density matrix of the four key privacy dimensions.
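For readers who want to see what the ANOVA table reports, a one-way ANOVA can be computed by hand from the between-group and within-group sums of squares (the BTG and WG rows of Table 6). This is a generic sketch with hypothetical toy data, not the study's SPSS® output.

```python
def one_way_anova(groups):
    """One-way ANOVA: return (SS_between, SS_within, F).

    SS_between (BTG) measures variation of group means around the
    grand mean; SS_within (WG) measures variation inside each group.
    F = MS_between / MS_within, where MS = SS / degrees of freedom.
    """
    all_values = [x for g in groups for x in g]
    grand_mean = sum(all_values) / len(all_values)
    group_means = [sum(g) / len(g) for g in groups]
    ss_between = sum(len(g) * (m - grand_mean) ** 2
                     for g, m in zip(groups, group_means))
    ss_within = sum((x - m) ** 2
                    for g, m in zip(groups, group_means) for x in g)
    df_between = len(groups) - 1
    df_within = len(all_values) - len(groups)
    f_stat = (ss_between / df_between) / (ss_within / df_within)
    return ss_between, ss_within, f_stat

# Hypothetical hit densities for two app categories (toy numbers only).
ssb, ssw, f = one_way_anova([[1.0, 2.0, 3.0], [2.0, 4.0, 6.0]])
```

A large F (relative to the F distribution with the given degrees of freedom) indicates that the category means differ more than within-category noise would explain.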

1.5. Results

1.5.1. Ease of Reading

To measure the readability of privacy policies, the automated readability index (ARI) is calculated for each privacy policy. The average ARI of privacy policies for health, navigation, and game apps is shown in Table 3. Overall, results indicate that at least 12 years of education is necessary to completely understand the privacy policy statements of the three categories of apps.
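The ARI itself is a standard formula: ARI = 4.71·(characters/words) + 0.5·(words/sentences) − 21.43, where the result approximates the US grade level needed to read the text. A minimal sketch follows; the word and sentence splitting here is deliberately naive, so a production implementation may score slightly differently.

```python
import re

def automated_readability_index(text):
    """ARI = 4.71*(chars/words) + 0.5*(words/sentences) - 21.43.

    The result approximates the US school grade level required to
    understand the text; tokenization is simplified for illustration.
    """
    words = re.findall(r"[A-Za-z0-9']+", text)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    chars = sum(len(w) for w in words)
    return (4.71 * (chars / len(words))
            + 0.5 * (len(words) / len(sentences))
            - 21.43)

ari = automated_readability_index(
    "This policy explains how we collect data. We may share it with partners.")
```

A score of about 12, as reported in Table 3, corresponds to roughly a high-school senior's reading level.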

The average ARI of privacy policies of health apps (Type I) is 12.9 years of education, indicating that at least one year of college education is necessary to completely understand them. The average ARI of privacy policies of navigation apps (Type II) is 12.3 years of education. The readability values of both navigation and health apps show that users should have a college education to comprehend the documents entirely. The average ARI of privacy policies of game apps (Type III) is 11.9 years of education. The difference between the ARI of Type I and Type III indicates that privacy policies of game apps (Type III) are easier for users to understand than those of health and navigation apps (Type I and Type II).

The average word counts of health apps (Type I) and navigation apps (Type II) are more than 2,000 words, indicating lengthy documents. The average length of privacy policies of game apps (Type III) is 1,897 words, still a long text that requires considerable effort to read. Therefore, to understand privacy statements, users must be highly educated and able to read very lengthy documents. The differences in word count between health apps (Type I), navigation apps (Type II), and game apps (Type III) indicate that privacy policies of game apps are both shorter and easier to understand than those of health and navigation apps.

Table 3 – ARI descriptive statistics

Mobile app category         Average ARI   Std. dev.   Average word count
Type I (e.g. Health)        12.9          2.42        2217
Type II (e.g. Navigation)   12.39         1.65        2137
Type III (e.g. Game)        11.9          3.13        1897

1.5.2. Associative Frequencies of Privacy Dimensions

Text mining is conducted to calculate the term frequency matrix and key privacy term associations (hit densities). We used the tm package in R version 3.1.1 to calculate the term frequency matrix and hit densities. The top 10 term frequencies for all three categories are shown in Table 4. Based on the term frequency table, there are only marginal differences between the privacy policies of the three types in terms of the most frequent keywords. The term information is the most frequent keyword across all three types. The second and third most frequent terms of health mobile apps are personal and privacy, while the second and third most frequent terms of navigation and game apps are use and services.

Table 4 – Top 10 term frequency of health, game, and navigation mobile apps’ privacy policies

      Type I (e.g. Health)      Type II (e.g. Navigation)   Type III (e.g. Game)
No.   Term          Frequency   Term          Frequency     Term          Frequency
1     Information   2017        Information   2133          Information   1598
2     Personal      682         Use           1355          Services      675
3     Privacy       569         Services      1084          Use           550
4     Services      517         Location      712           Privacy       513
5     Site          477         Personal      687           Personal      483
6     Third-party   462         Provide       648           Third-party   450
7     Policy        419         Collect       593           Policy        372
8     Data          384         Name          532           Data          283
9     Health        351         Party         527           Collect       275
10    Cookies       291         Take          524           Cookies       219

Term frequency alone is not a suitable indicator of differences between textual documents, because word frequencies by themselves cannot capture actual meaning. To overcome this, term association is necessary to provide a more meaningful measure.
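A term frequency matrix like the one in Table 4 can be approximated in a few lines. The study used R's tm package; this Python sketch, with a made-up mini stop-word list and a toy two-document corpus, illustrates only the counting step.

```python
import re
from collections import Counter

# Illustrative stop-word list; the tm package ships a much larger one.
STOP_WORDS = {"we", "you", "your", "our", "the", "and",
              "of", "to", "be", "may", "about", "when"}

def term_frequencies(documents, top_n=10):
    """Lowercase, tokenize, drop stop words, and count terms corpus-wide."""
    counts = Counter()
    for doc in documents:
        tokens = re.findall(r"[a-z][a-z\-]+", doc.lower())
        counts.update(t for t in tokens if t not in STOP_WORDS)
    return counts.most_common(top_n)

corpus = ["We collect information about your use of our services.",
          "Information you share may be used to improve services."]
top_terms = term_frequencies(corpus, top_n=3)
```

Even in this tiny corpus, "information" and "services" dominate the ranking, mirroring how the table above surfaces the most repeated policy vocabulary.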

Following the search strings procedure explained earlier, hit densities of four key privacy dimensions of privacy policies of three mobile apps categories are calculated. The resulting hit densities demonstrate the importance of each key privacy dimension throughout the privacy policies of Type I, Type II, and Type III mobile apps (Table 5). According to the results, there are differences between key privacy dimensions that must be statistically tested.

First, the hit density of the collection dimension in Type II is greater than in Type I and Type III. Results show that the mean hit densities of the collection dimension of health, navigation, and game apps are 3.43, 5.03, and 1.73, respectively. The total hit densities of the collection dimension are 103 for health apps, 151 for navigation apps, and 52 for game apps.

Second, the hit density of the secondary use dimension in privacy policies of health apps is greater than in those of navigation and game apps. The average hit densities of the secondary use dimension of health, navigation, and game apps are 2.17, 0.50, and 0.80, respectively. The total hit densities of the secondary use dimension are 65 for health apps, 15 for navigation apps, and 24 for game apps.

Third, the hit density of the improper access dimension in privacy policies of health apps is greater than in those of navigation and game apps. The average hit densities of the improper access dimension of health, navigation, and game apps are 0.87, 0.57, and 0.73, respectively. The total hit densities of the improper access dimension are 26 for health apps, 17 for navigation apps, and 22 for game apps. Finally, the average hit density of the error dimension in privacy policies of health apps is greater than in those of navigation and game apps. The total hit density of the error dimension of health apps is 27, compared to 3 for navigation apps and 12 for game apps.

Table 5 – Hit density of key privacy dimensions for three app categories

                   Type I (e.g. Health)    Type II (e.g. Navigation)   Type III (e.g. Game)    Rank
Privacy Dimension  Total   Mean   Std.     Total   Mean   Std.         Total   Mean   Std.     (max)
Collection         103     3.43   3.50     151     5.03   3.76         52      1.73   2.81     Type II
Secondary Use      65      2.17   2.57     15      0.50   0.68         24      0.80   1.52     Type I
Improper Access    26      0.87   0.90     17      0.57   0.97         22      0.73   1.02     Type I
Error              27      0.90   1.47     3       0.10   0.55         12      0.40   1.10     Type I

To better see the differences between privacy dimensions, a radar chart of hit densities for the three app categories is provided in Figure 2. Based on the radar chart, the biggest differences among privacy policies of the three mobile app categories are in the collection, secondary use, and error dimensions. The hit density analysis shows that the means of three privacy dimensions differ.

The totals and means of the hit densities represent the overall emphasis on the privacy dimensions. ANOVA is used in the next step to statistically test the differences between the total hits of the key privacy dimensions.

Figure 2 - Hit density chart for three app categories

1.5.3. ANOVA Results

In order to test the set of hypotheses proposed earlier, ANOVA is an appropriate method to compare means across multiple groups of samples. The null hypothesis in ANOVA is that the means of different groups are equal (Tabachnick and Fidell, 2001, p. 19). We ran the ANOVA in SPSS® between the hit densities of health (Type I), navigation (Type II), and game (Type III) mobile apps.

ANOVA results are provided in Table 6. Mean differences between Type I and Type II hit densities are significant at the 0.05 level for the secondary use and error dimensions. At the 0.10 significance level, the collection dimension is statistically different between Type I and Type II. Comparing means between Type I and Type III hit densities shows support at the 0.05 significance level for the collection and secondary use dimensions.

Table 6 –ANOVA Analysis Results

                     Between Type I and Type II             Between Type I and Type III
Dimension                  SS       MS       F       Sig.        SS       MS       F       Sig.
Collection        BTG    38.40    38.40    2.906    0.094*     43.35    43.35    6.084    0.017**
                  WG    766.33    13.21                       413.23     7.12
Secondary Use     BTG    41.67    41.67   11.750    0.001***   28.02    28.02    7.421    0.009***
                  WG    205.67     3.55                       218.97     3.77
Improper Access   BTG     1.35     1.35    1.918    0.171       0.27     0.27    0.270    0.605
                  WG     40.83     0.70                        57.33     0.99
Error             BTG     9.60     9.60    7.798    0.007***    3.75     3.75    2.222    0.142
                  WG     71.40     1.23                        97.90     1.69
Note. *** Significant at 0.01; ** significant at 0.05; * significant at 0.1. BTG: between groups; WG: within groups; SS: sum of squares; MS: mean square; Sig.: significance level.

Based on the ANOVA results between privacy policies of health (Type I) and navigation

(Type II) categories, we find that H2a and H4a were supported and H3a was not supported. The results indicate that there is more emphasis on the secondary use (F = 11.750, p = 0.001) and error (F = 7.798, p = 0.007) dimensions in Type I than in Type II, supporting H2a and H4a. Contrary to what we hypothesized in H1a, the collection dimension is more emphasized in Type II than in Type I: ANOVA results show a significant difference (F = 2.906, p = 0.094), with the mean hit density of the collection dimension significantly higher in navigation apps than in health apps. For the improper access dimension, we could not find statistical support for a difference between the hit densities of Type I and Type II privacy policies (F = 1.918, p = 0.171).

Results of the ANOVA between privacy policies of the health (Type I) and game (Type III) categories show that H1b and H2b were supported, but H3b and H4b were not. Results indicate more emphasis on the collection (F = 6.084, p = 0.017) and secondary use (F = 7.421, p = 0.009) dimensions in Type I than in Type III, supporting H1b and H2b. A summary of hypothesis testing results is provided in Table 7.

Table 7 - Summary of hypothesis testing results

Hypothesis                                                                                       Result
H1a: In mobile apps’ privacy policies, Type I apps place more emphasis on the collection of
     information dimension of privacy than Type II apps do.                                      Not Supported
H1b: In mobile apps’ privacy policies, Type I apps place more emphasis on the collection of
     information dimension of privacy than Type III apps do.                                     Supported
H2a: In mobile apps’ privacy policies, Type I apps place more emphasis on the secondary usage
     of information dimension of privacy than Type II apps do.                                   Supported
H2b: In mobile apps’ privacy policies, Type I apps place more emphasis on the secondary usage
     of information dimension of privacy than Type III apps do.                                  Supported
H3a: In mobile apps’ privacy policies, Type I apps place more emphasis on the improper access
     of information dimension of privacy than Type II apps do.                                   Not Supported
H3b: In mobile apps’ privacy policies, Type I apps place more emphasis on the improper access
     of information dimension of privacy than Type III apps do.                                  Not Supported
H4a: In mobile apps’ privacy policies, Type I apps place more emphasis on the error of
     information dimension of privacy than Type II apps do.                                      Supported
H4b: In mobile apps’ privacy policies, Type I apps place more emphasis on the error of
     information dimension of privacy than Type III apps do.                                     Not Supported

1.6. Discussion and Implications

In this study, we investigate the differences in privacy dimensions among mobile apps to clarify what is actually provided inside the privacy policy statements of three app categories. This study is an attempt to unpack mobile app providers’ view of individuals’ privacy. In the mobile phone context, privacy statements are principles of action provided by mobile app providers to protect users’ information when users disclose their personal information to apps (Talib et al., 2014). Mobile app users demand the highest protection of their personal information privacy compared to other contexts because they constantly share their information (from very sensitive to not sensitive) with apps to take advantage of apps’ full functionality. As a result, mobile app providers offer privacy policy statements to clarify probable risks to individuals’ privacy and how they protect the privacy of personal information (Stufflebeam et al., 2004).

There are at least a couple of key findings. First, results of the readability analysis of privacy statements show that users need at least 12 years of education to fully understand privacy policy statements. The readability of privacy statements for the three mobile app categories (health, navigation, and game) indicates a high difficulty in understanding the documents, especially for mobile app users. Privacy policies of health and navigation mobile apps require at least one year of college education to be easily interpretable. It is difficult for users to understand the terminology inside privacy policies, and as a result many people simply accept or reject privacy policy statements without reading them.

Second, results of the text mining method and ANOVA analysis provide partial support for the set of hypotheses proposed. The basic assumption of this research is that if a mobile app deals with highly sensitive personal information, disclosed data should be protected more carefully. Consequently, privacy policy statements should contain more terms related to personal information privacy. When sensitive information is involved, the privacy policy of the app provider should give more weight to the key privacy dimensions. The hit densities of the key privacy dimensions revealed that the secondary use and error dimensions are emphasized more in privacy policies of health apps than navigation apps. Also, collection and secondary use privacy-related terms receive more consideration in privacy policies of health apps than game apps.

One explanation for the partial support of the proposed hypotheses could be that app developers and companies consider their own benefits and risks, which are reflected in the statements of privacy policies. App providers therefore overlook user-specific concerns and display general, ambiguous, and overly detailed privacy policies to avoid unwanted consequences for themselves.

for themselves. Mobile app providers are more concerned about legal consequences of disclosing

users’ information than actually protecting different dimensions of users’ privacy in the mobile

app context (Felt et al., 2012). Another reason is that while according to the law app providers

have to make users aware of the use of their personal information (Vail et al., 2008), they do not

sufficiently invest in different aspects of personal information in various circumstances and users’

personal information is seen as a uniform concept. While both health and location data are

considered sensitive data (Bansal et al., 2010; Li et al., 2013), their undesired use could affect individuals differently.

Among the four dimensions of privacy concern, secondary use of information is more

emphasized when the disclosed information is highly sensitive. First, privacy policies of all three

mobile apps sufficiently discuss collection dimension while navigation apps contain more terms

related to it. Interestingly, results show concerns for collection of personal information is more

discussed in privacy policies of navigation apps than health apps. Generally, location privacy is

not well protected, and recent legal attempts to give users greater control over their location-related information have begun to fill this gap (Koohikamali et al., 2015; Xue et al., 2015).

With the growing use of location-based services many people would like to know both the possible

risks and benefits of disclosing their location data (Cottrill, 2015; Koohikamali and Peak, 2015).

Users trade-off the levels of privacy protection with the benefits to make their decisions (Cottrill,

2015). Navigation app providers are aware of the negative consequences that collecting location-related information has for users’ privacy concerns, and they put greater effort into addressing it in their privacy policy statements.

Second, the secondary use dimension is less emphasized in privacy policies of navigation apps compared to the health and game categories. This is an interesting finding: although privacy policies of navigation apps discuss collection of information more, they do not contain enough terms about the secondary use of information. It can be argued that many navigation apps do not collect personally identifiable information, and as a result the secondary use of such information becomes less important.

Third, the improper access dimension of privacy is emphasized more in health apps than in navigation apps. Users of health apps often provide personal health information to the app to take advantage of its benefits. As a result, app providers make users aware of the ways their information is accessible to authorized people. Users would like to have more control over the ways their health-related information is disclosed, used, and accessed (East and Havard, 2015). Health mobile app providers have realized the importance of protecting health-related data specifically based on users’ requests (Semple et al., 2015). Fourth, the error dimension is discussed more in privacy policies of health apps than navigation and game apps. Health data is known to be the most sensitive type of personal information, and inaccuracies in stored data could cause serious negative results. Users are therefore more concerned about the consequences of errors in their personal health information, and they want to know the ways they can correct inaccuracies in health-related data.

1.6.1. Implications

This paper provides a new direction for understanding and analyzing privacy policies. This study makes several contributions to both theory and practice in the area of mobile security and privacy research. The major contribution of this research is that it extends the current user-centric view of information privacy by providing an organizational perspective. There are few studies explaining privacy policies of mobile apps, and this study has investigated a new aspect of privacy in the context of mobile apps from the providers’ and companies’ points of view. Theoretically, findings of this study provide the companies’ perspective on the privacy of individuals. The lack of a theoretical model explaining privacy protection strategies from an organizational perspective opens a new direction for research. Also, there is a misalignment between the priorities of individuals and organizations in terms of privacy protection, and the findings of this study make an initial attempt to fill that gap.

The second contribution of this study is the emphasis on the role of information sensitivity level in privacy practices. Our study contributes to the Information Systems (IS) literature by providing a different way of thinking about users’ privacy based on the level of information sensitivity. Findings of this study demonstrate that providers of mobile apps have realized the importance of considering sensitive information to protect privacy, although only to a moderate degree, and in their privacy policy statements they have made an initial attempt to make users aware of different types of risks to their privacy. Third, through the lens of prospect theory and based on the framing concept, this study investigates how providers’ privacy trade-off drives the framing of the policies they provide to customers. Future research in IS could focus on different ways of framing to improve communication between companies and customers.

In practice, the results of this study provide remarkable insights for enhancing the effectiveness of privacy policies. While the importance of privacy policies is recognized by organizations, the findings of this study demonstrate that users often find them difficult to read and understand. Practitioners can take into account the complexity of privacy policies in terms of readability and length to provide easier documents to users. Different framing of positive and negative choices for customers within privacy policies would influence perceptions of privacy concern and future decisions. Furthermore, developing a text mining algorithm based on word frequencies and associations offers a new mechanism for privacy policy providers. In addition, findings suggest that privacy policies should differentiate between dimensions of privacy, and providers could take this into account. Finally, practitioners may use the findings of this study to weight dimensions of privacy and visualize them to help users better understand privacy policies.

1.6.2. Limitation and Future Research

Similar to other research, our study has several limitations. The first limitation relates to the data collection methodology. To capture different levels of sensitivity, we only considered three mobile app categories (health, navigation, and game). However, users can provide sensitive information in other categories of mobile apps as well. Future research can consider more categories to provide a more comprehensive view. Furthermore, several mobile apps from the pool we selected did not have a privacy policy and had to be removed, and some had exactly the same privacy policy document because they were developed by the same team, developer, or organization. Future research can consider more mobile app categories to provide a better picture of providers’ perspectives on personal information privacy.

The third limitation relates to the text mining method we developed. Our method is based on the association of terms and is subject to some uncertainty in interpreting the contents of privacy policy documents. As suggested by Bansal et al. (2015), research should find new ways to enhance the effectiveness of privacy statements; for example, clear language and interactive designs are more appropriate ways of delivering information in privacy statements (Bansal et al., 2015). The text mining approach we conducted in this study to quantify the content of privacy policies creates hit density indicators for the different dimensions of privacy throughout privacy policies. Future research may adopt the hit density mechanism to provide visualization tools that enhance the effectiveness of privacy policies. Finally, future studies may focus on users’ perceptions and explore how mobile app users actually distinguish between privacy dimensions.

1.7. Conclusion

This study has taken a first step toward distinguishing between key privacy dimensions when sensitive information is involved on mobile apps. Privacy policies are designed to enhance users’ awareness of the use and disclosure of their information. The complexity of privacy policies impedes users from reading and fully understanding them. The readability calculations showed that, for all three mobile app categories (health, navigation, and game), users need at least 12 years of education to completely understand the privacy policies. Among them, privacy policies of health apps are longer and more complex.

Due to the popularity of mobile apps and the disclosure of personal information on them, protecting personal information becomes more crucial than in other contexts. It is necessary to recognize how privacy policies deal with personal information privacy when data is more sensitive. We focused on three mobile app categories that deal with high, medium, and low information sensitivity: health apps (Type I) deal with highly sensitive information, navigation apps (Type II) with moderately sensitive information, and game apps (Type III) with less sensitive information. Through the framing conceptualization of prospect theory and the four dimensions of privacy, we built the theoretical basis to uncover mobile app developers’ perspective on information privacy. A text mining approach in R was implemented to differentiate between the key privacy dimensions of privacy policies of the three mobile app categories. We identified how mobile app developers weight dimensions of privacy differently when sensitive information is involved.

Findings demonstrate that there is more emphasis on the secondary use dimension in privacy policies of mobile apps dealing with more sensitive data. In addition, results show more emphasis on the collection of personal information in privacy policies of health apps than game apps. Finally, we could not find any difference in the improper access dimension among the categories of mobile apps.


CHAPTER 2

PRIVACY TRADE-OFF DYNAMICS: THE ROLE OF INITIAL USE AND SELF-DISCLOSURE

2.1. Introduction

Mobile devices have become the leading technology among the digital landscape (Furner

et al., 2015) with nearly 64 percent of Americans owning a smartphone (Smith and Page, 2015).

The primary reason fueling this exponential growth in smartphone usage is the growing market for mobile applications (apps) (Furner et al., 2015). Mobile apps are software applications designed

specifically for smartphones, tablets, and other mobile devices (Statista, 2016a). Mobile phones

have evolved from communication-only devices into sophisticated multi-tasking tools, and with the addition of mobile apps, these devices have become the Swiss army knives of technology

(Christensen and Prax, 2012; Wellman, 2010). The use of social networking services continues to rise, with the predicted number of users reaching 2.5 billion by 2018, encompassing approximately one-third of the global population (Statista, 2016a). Furthermore, it was reported in January 2016 that

52% of social networking users in North America accessed social networking services via their smart devices (Statista, 2016a).

The ease of availability and accessibility of mobile devices has led mobile users to believe that the future of the web is mobile (Kumar et al., 2015; Laird, 2012). The transition from web to mobile faces many challenges, such as app indexing, app installation limitations, auto-update, and user privacy (Kumar et al., 2015). Information privacy concerns can be seen as a matter of human rights or a matter of contractual negotiation (Dinev et al., 2015). Either way, the protection of people’s personal information is an important issue. The level of privacy concern is not constant across various contexts or situations for every person (Angst and Agarwal, 2009).

Privacy concern could reflect the desire for sufficient control and fairness regarding personal information (Bansal and Zahedi, 2015; Malhotra et al., 2004). Previous research has shown the negative

influence of privacy concerns on intention to use social network sites (SNSs) and continuance

intention (Ku et al., 2013) while other research demonstrates the insignificance of this relationship

in the context of location-based social network apps (Koohikamali et al., 2015). Privacy concern

is an important indicator for many people deciding whether to use SNSs or not (Jiang et al., 2013).

The majority of smartphone users view the need to protect their privacy as greater than with conventional technologies due to the amount of personal information commonly

stored on smartphones (Xu et al., 2013b). While mobile apps are akin to online contexts, there are

some features that are specific only to mobile apps such as their ubiquity, immediacy and offline

connectivity (Junglas et al., 2008a; Zhou et al., 2010). Prior research has accorded less attention to the adoption life cycle from adoption to continuance of social networking sites (SNSs); instead, it has focused mainly on the usage of SNSs at a specific point in time (Huang et al., 2014; Ku

et al., 2013). Through various stages of the adoption lifecycle, the perceptions of risks and benefits may differ for various groups of people (Meade and Rabelo, 2004).

Information privacy concern is not static, and researchers have discussed changes in privacy concerns (Hong and Thong, 2013; John et al., 2011; Knijnenburg and Kobsa, 2013).

Variations in concern for privacy could result from a new experience or phenomenon, increased

understanding of context, changes in the environment, or modifications in legislation (Acquisti et

al., 2015; John et al., 2011). The adoption of new technology may possess advantages involving perceptions of utilitarian and hedonic benefits (Lee, 2009). Within social networking contexts such as social networking apps (SNAs), users develop social interactions as a result of a mental trade- off between their concern for privacy and rewards (benefits), defined as the privacy trade-off (Jiang et al., 2013). During the adoption phases of SNAs, the initial perceptions of the privacy and the 46

benefit (privacy trade-off) change due to usage experience and self-disclosure. Therefore, it is

important to study the changes in privacy trade-off in a period of time.

Prior research has generally focused on the influence of privacy concern on disclosure behavior and the role of context in perceptions of privacy concern. However, further research evaluating the dynamic changes in privacy concern when people use social networking mobile apps is needed. Building upon previous literature, the current research addresses the following questions: (1) How does the privacy trade-off change over time due to initial self-disclosure and initial use experiences of a new SNA? (2) What is the relationship between the privacy trade-off and intention to use SNAs in the pre-use phase? (3) How does personal innovativeness in IT moderate the relationship between pre-use perceptions and intention to use a new SNA?

Using the model of privacy trade-off developed by Jiang et al. (2013) and through the lens of the belief updating process suggested by Hogarth and Einhorn (1992), this study has three primary objectives: 1) to propose a dynamic privacy trade-off model explaining the change in privacy concern and perceived benefit of individuals during initial usage of, and initial self-disclosure on, a new social networking app (SNA); 2) to empirically test the proposed dynamic privacy trade-off research model using data collected from users of a new SNA at two phases (before using the SNA and after initial use); and 3) to improve theoretical and practical understanding of using a new technology (e.g. an SNA) with the insights of a longitudinal privacy trade-off model. In other areas of technology usage, it is important to adjust initial users’ perceptions (Kim and Malhotra, 2005). Findings of this study answer the research questions to provide new insights into social networking usage in the context of mobile apps and the malleability of users’ perceptions due to initial experience with a new SNA.


2.2. Literature Review

2.2.1. Use of Social Networking Apps

The proliferation of smartphones has produced a new industry consisting of applications

(apps) that have increased the functionality of smartphones beyond mere communication devices

(Dinner et al., 2015). With millions of mobile apps available, it is imperative to study the factors driving the adoption and usage of a specific app. Five events that define the lifecycle of mobile apps are installing, updating, uninstalling, opening, and closing the app (Böhmer et al., 2011). The

preliminary work has been started in the computer science literature to show determinants of app

usage (Taylor and Levin, 2014; Xu et al., 2013b). Findings by Xu et al. (2013b) reveal three key

factors in everyday use of mobile apps: user preference, context, and community behavior.

First, intrinsic user app preferences originate from the user’s historical usage patterns. Second, the environment and user activities are observable through sensor-based contextual signals. Third, in user communities, shared aggregate behavior is based on patterns of usage behavior. Kranz et al. (2013) explored the reasons behind failure in app adoption. They explained the importance of two release cycles, the preview version and the feature-complete version, for adoption success. Early-stage release of mobile apps could help developers improve design features based on immediate feedback from users (Kranz et al., 2013). Appealing visuals are enough to attract users to download and try an app. Then, after introducing the feature-complete version, developers should make an effort to increase user satisfaction via app functionality and marketing strategies to motivate current users to continue using the app (Kranz et al., 2013).

A comparison between pre-adoption and post-adoption of social network sites reveals that

prior experience is important in determining users' behaviors (Chang and Zhu, 2011). However, a

significant difference between attitude, subjective norm, and behavioral control perceptions of pre-

adopters and post-adopters was not found (Chang and Zhu, 2011). Karahanna et al. (1999) argued

that post-adoption attitude is based on the perceived usefulness and perceived image enhancement

whereas pre-adoption attitude is determined by ease-of-use, perceived usefulness, visibility, and trial-ability.

Only a few studies in the adoption literature have focused on both the pre-adoption and post-adoption of a new technology. For example, findings in the previous literature demonstrate that changes in people's attitudes are due to experiences and situational factors during adoption and use (Dinev et al., 2009; Kroenung and Eckhardt, 2015). A study by Ku et al. (2013) explores the factors affecting the intention to continue using SNSs. Their findings indicate privacy concerns,

perceived critical mass, subjective norms, and gratification influence the intention to continue to

use SNSs. Also, the regional difference could moderate the effect of both privacy concerns and

gratification on continuance intention (Ku et al., 2013).

2.2.2. Privacy Concern and Perceived Benefit Trade-Off

Defining privacy is not trivial because it depends on the context that frames it, the culture that cultivates it, and an individual who perceives it (Moore, 2003). General privacy is often

explored under an ethical perspective (Smith et al., 2011). Privacy could have different meanings

and connotations across different groups of the same culture. Consequently, the expectations of

individuals to protect their privacy vary greatly (Dinev et al., 2013). Concerns for information

privacy have been growing since the 1960s (Dinev et al., 2015). According to Smith et al. (2011), there are three major areas to which the previous privacy literature has contributed most: the conceptualization of information privacy, the relationships between the information privacy construct and other constructs, and the context in which information privacy resides. Personal concepts of

privacy are interwoven with values, perceptions, beliefs, and experience, such that privacy research requires a taxonomy of privacy (Solove, 2006).

Information privacy is the amount of information individuals choose to share with others

(Westin, 2003). Personal information privacy is the optimal level of control over personal information (Malhotra et al., 2004). Three identified dimensions of privacy situations are self-ego, environmental, and interpersonal (Laufer and Wolfe, 1977). The self-ego dimension focuses on autonomy and personal dignity in every society. The self-ego is reinforced by appropriate privacy expressions and experiences. The second dimension covers the environmental elements that affect a person's ability to perceive and use available options. Environmental elements are the interaction of social arrangements, cultural meanings, and the physical setting (socio-physical). The interpersonal dimension establishes the core of the privacy phenomenon, and it is the synthesis of the self-ego and environmental dimensions. Smith et al. (1996) investigated the dimensions of individuals' privacy concerns. The four dimensions of privacy concern are collection, improper access, errors, and unauthorized use (Smith et al., 1996). We follow the same perspective in this study and explore four dimensions of individual privacy concern in the context of SNAs.

In previous research, privacy concern is an important determinant of the usage of SNSs. Boyd and Ellison (2007) argue that the actual behavior of people to protect their privacy is not always the same as their desire. The privacy-trust model developed by Dwyer et al. (2007) shows that privacy concern could determine information sharing and relationship building on SNSs as major post-adoption behaviors. Following an opportunity and risk perspective, Livingstone (2008) discusses how the balance between the two could influence the use of SNSs. Livingstone's (2008) findings show that the use of SNSs involves possible risks, such as privacy concern and abuse, and opportunities, such as identity and sociability. The results demonstrate that usage experience with SNSs shapes users' perceptions of opportunities and risks. For example, young people with less experience of using SNSs relish the opportunities more than more experienced users do. Mobile app users' psychological decision to take risks in exchange for the potential benefits of a new SNA is an important indicator (Aloudat and Michael, 2011). Thus, the adoption of LBAs is related to the trade-off between perceived risks and expected benefits (Keith et al., 2013a).

Perceived benefit is a multidimensional construct known as value dimensions (Overby and Lee, 2006). Utilitarian and hedonic benefits are classifications of the perceived benefits of IT (Xu et al., 2015). Utilitarian benefit refers to the functional and practical benefits, and hedonic benefit reflects the aesthetic and enjoyment benefits (Chitturi et al., 2008). The direct and indirect advantages of adopting an IS comprise the two main types of perceived benefits (Lee, 2009). For example, location-based social network applications (LB-SNAs) let users benefit society as well as themselves (Koohikamali et al., 2015). Users of LB-SNAs receive incentives when sharing location as well as offering great deals to their networks (Koohikamali et al., 2015). In the previous literature, perceived usefulness is used to measure utilitarian benefit and perceived enjoyment is used to capture hedonic benefit (Sun et al., 2014). Thus, the perceived benefit on SNAs is the reward expected by the user (Chen and Dubinsky, 2003). Mobile apps give users more interactivity with technology and enhanced visuals as well as exclusive experiences (Christensen and Prax, 2012). Mobile apps add an externality of mobility to traditional technologies (Chmielewski, 2015; Han et al., 2014).

Hedonic benefits generate a feeling of excitement (Grange and Benbasat, 2014). Based on the literature, the hedonic benefits of SNAs comprise two determinants: social reward and enjoyment (Jiang et al., 2013; O'Brien and Toms, 2008; Zhou et al., 2015). Enjoyment is related to the pleasant experiences of using SNAs (Zhou et al., 2015). Social reward explains the value of the social network for a user. Social rewards refer to the satisfaction and gratifications users derive through interaction on social networks (Jiang et al., 2013). On SNAs, social reward can facilitate relationship building (Bateman et al., 2011). Perceived benefit is the expected value of a technology (Li et al., 2011). For SNAs, two of the most important utilitarian benefits are immediacy and ubiquity. Hedonic benefit reflects the enjoyment benefits (Chitturi et al., 2008). In addition, people on social networks gain satisfaction through interaction with others (Jiang et al., 2013), and social reward can facilitate relationship building (Bateman et al., 2011). Dimensions

of perceived benefits and privacy concern are shown in Figure 3.

[Figure: Pre-Perceived Benefit comprises utilitarian dimensions (Ubiquity, Immediacy) and hedonic dimensions (Social Reward, Enjoyment); Pre-Privacy Concern comprises Collection, Secondary Use, Improper Access, and Error.]

Figure 3 - Dimensions of perceived benefit and privacy concern

2.2.3. Self-Disclosure

Information sharing on SNSs is a natural phenomenon as a post-adoption behavior (Cheung and Lee, 2010). The content that SNS users generate can be classified as self-information (e.g., age, home address, phone number, personal views, and personal images) and non-self-information (e.g., news and entertainment content) (Bryce and Klang, 2009). The intentional and voluntary sharing of one's own information is self-disclosure (Posey et al., 2010). Self-disclosure could include personal information, personal photos and videos, ideas, and locations (Koohikamali et al., 2015; Posey et al., 2010). Self-disclosed information could be specific information (such as a home address) or general information (such as a nickname) (Taddicken, 2014). Research shows that it is necessary to differentiate between different forms of personal information (Acquisti et al., 2015; Taddicken, 2014). French and Read (2013) view information sharing as a two-dimensional phenomenon that may differ in type and depth. The rapid change of content on SNSs has raised the concern for privacy (Baek et al., 2011). Self-disclosure behavior is characterized by intent, amount, frequency, depth, valence, and honesty (Posey et al., 2010). The information disclosure behavior on SNSs has made the concern for personal information privacy more salient (Squicciarini et al., 2011).

SNS research has investigated motivations for sharing user-generated information

(Scanfeld et al., 2010; Zarsky, 2008) and for sharing self-information (Ahn et al., 2007). For example, Baek et al. (2011) identified the motivations of users to share links on Facebook. They show that information sharing's primary motivations include convenience, entertainment, passing time, interpersonal utility, control, and work promotion. These results indicate how the use of privacy control mechanisms could limit the amount of information available on Facebook.

Building on TPB, the previous literature found that self-disclosure is influenced by many factors such as perceived benefit, perceived risk, privacy concerns, information control, and sensitivity (Xu et al., 2013a).

The primary inhibitors of self-disclosure behavior are privacy-related factors such as privacy concerns (Koohikamali et al., 2015). The findings of Gross and Acquisti's (2005) study demonstrate the potential threats to privacy due to the disclosure of personal information on SNS users' profiles. For example, their study shows that it is possible to reconstruct a user's social security number based on the self-disclosed information on the user's profile. In another study, Choi et al. (2015) found that the dissemination and exposure of personal information have a strong effect on perceptions of privacy invasion. They discuss how the effect of posting self-information on privacy invasions is amplified by embarrassing exposures. As a result, the current research posits that when people disclose information on social networks (voluntarily or involuntarily), depending on the type of information, their privacy concern could increase. A summary of the previous literature on SNS usage and users' privacy concerns is provided in Appendix B.

2.3. Theoretical Background

2.3.1. Privacy Trade-Off Dynamism

Continuous changes in laws and information technologies potentially result in changes in privacy concerns (Chen et al., 2008). Privacy is a multidimensional and elastic construct, and it varies in response to people's experiences (Smith et al., 2011). The concern for information privacy (i.e., privacy concern) is neither static nor absolute because individuals' perceptions could shift over time (Hong and Thong, 2013; Smith et al., 1996). Malhotra et al. (2004) investigated the dynamic privacy construct and introduced the Internet users' information privacy concern (IUIPC) construct. They suggest the dynamic nature of IUIPC but do not provide a time-related perspective. Findings of the study by Xu et al. (2012b) in the context of location-based services emphasize the importance of prior experience on privacy concern. Their study demonstrates that negative prior privacy experiences increase the level of privacy concern and suggests that privacy is not static, with context and users' experiences potentially changing initial privacy concerns.

In another study, Squicciarini et al. (2011) developed a collaborative privacy management tool and demonstrated that people's privacy concerns on SNSs change rather than remaining static.

The behavior change model suggests capturing the range of behaviors on SNSs (Fogg and Eckles, 2007). Building on the behavioral chain model, individuals have different levels of involvement with a new SNS that could induce different experiences and behaviors over time (Vasalou et al., 2010). According to Vasalou et al. (2010), three phases of involvement with a

new SNS are discovery, superficial involvement, and true commitment. During the discovery

phase, people are aware of the new SNS without an actual usage experience. During the superficial

involvement phase, users decide to try the new SNS and create content for the first time. Finally,

users get involved in seeking relationships with others and sharing content with them. When users

are exposed more to a new SNS, they gain more experiences (Vasalou et al., 2010). In this study,

the discovery phase is the pre-use phase and the superficial involvement phase is the initial use phase. We focus only on the transition from the pre-use stage (Phase 1) to the initial use stage (Phase 2).

The theory of belief updating explains the adjustment in individuals' beliefs in response to interaction with information (Hogarth and Einhorn, 1992). In the IS field, the theory of

belief updating is used to explain post-adoption behaviors such as IS continued use with a

longitudinal perspective. For example, Kim and Malhotra (2005) discuss how consumer

evaluations can be explained by theory of belief updating. The adjustment process of prior knowledge with the aid of succeeding pieces of evidence is the main principle of theory of belief updating (Hogarth and Einhorn, 1992). Applying the same mechanism, users’ perceptions undergo the similar adjustments through the usage of a new IT (Kim and Malhotra, 2005). Based on the above discussion, the conceptual model of dynamic privacy trade-off is depicted in Figure 4.

[Figure: three phases - Pre-Use (Phase 1), Initial Use (Phase 2), and Continued Use (Phase 3) - correspond to the Pre-Privacy Trade-Off, the Initial Privacy Trade-Off, and the Ongoing Privacy Trade-Off; the Initial Use Experience links Phase 1 to Phase 2, and the Continued Use Experience links Phase 2 to Phase 3.]

Figure 4 - The conceptual model of dynamic privacy concern

Research shows privacy concerns on SNSs are cultivated by initial use experiences of users

(Boyd and Ellison, 2007). Privacy concerns could decrease the use of SNSs among users (Ku et

al., 2013). Similarly, during the pre-use (discovery) phase, many people are aware of a new SNA. Then, a portion of these people decide to try the app for the first time, and they gain actual experience with it. The dynamic privacy model explains that initial privacy concerns change after users try an SNA for the first time (superficial involvement) and gain further experience. According to the

dynamic privacy model, the ongoing privacy concern is always changeable due to the interactive

relationship between usage experiences and privacy concerns.

Integration of the theory of planned behavior (TPB), the innovation diffusion theory (IDT), the theory of belief updating, and the privacy trade-off model provides the basis for the proposed research model. Extant research has drawn on IDT to examine the adoption and usage of IT (Agarwal and Prasad, 1998a; Ahuja and Thatcher, 2005). IDT explains whether an innovation is adopted or rejected (Rogers, 1983; Rogers, 2010). The main elements in the diffusion of new ideas are the innovation, the communication channel, time, and the members of a social system (Rogers, 1983; Rogers, 2010). The five adopter categories of innovations are innovators, early adopters, early majority, late majority, and laggards (Rogers, 1983; Rogers, 2010). Further, drawing upon IDT, personal innovativeness in IT (PIIT) serves as a predictor of intentions to use. The theory of planned behavior (TPB) suggests that intentions ultimately cause behaviors (Ajzen and Madden, 1986). The first actual experience with social networks could change the pre-use

perceptions of privacy concern and benefits (Xu et al., 2013a; Xu et al., 2009). In IS, theory of

belief updating posits users’ perceptions update during the use of a new IT (Kim and Malhotra,

2005; Sun, 2013). New pieces of evidence obtained during the initial use experience change users' pre-perceptions of the new IT.

Individuals typically have psychological trade-offs between the value of information privacy and the costs of providing personal information when they decide to use a new technology

(Hann et al., 2002). Privacy calculus model describes the positive influence of perceived benefits and negative effect of privacy risks on intention to disclose personal information (Dinev and Hart,

2006). Disclosure behavior is a post-adoption behavior that precedes the usage behavior (Hong et al., 2015). To understand contemporary privacy concerns, the calculus perspective is the most useful framework (Xu et al., 2009). When users confront both privacy concerns (losses) and perceived benefits (gains), they psychologically perform a privacy calculus to trade off between them (Jiang et al., 2013).

Jiang et al. (2013) extended the privacy calculus framework to include social interactions.

In this study, the focus is the change in the privacy trade-off as a result of initial usage of a new social networking app, and it is necessary to include social interactions. Consequently, the privacy trade-off model is appropriate. When users believe that the app provider offers useful features and functions, the perceived benefits act as a gain function in the privacy calculus model (Li et al., 2011). Most of the previous research has not considered the change in the privacy trade-off from pre-use (the discovery phase) to initial use (the superficial involvement phase).

2.3.2. Research Model and Hypotheses

The proposed research model of privacy concern dynamics and the hypotheses are displayed in Figure 5. Definitions of the constructs are provided in Appendix C. The proposed research model is based on the conceptual model of dynamic privacy concern, and it explains how the trade-off between privacy concern and perceived benefits changes from the pre-adoption phase to the initial use phase through initial use and self-disclosure. The research model shown in Figure 5 also posits that personal innovativeness in IT influences the intention to use.

[Figure: the research model spans the Pre-Use (Phase 1) and Initial Use (Phase 2) periods. Hypothesized paths: H1 (+) pre-perceived benefit to intention to use; H2 (-) pre-privacy concern to intention to use; H3a (+) and H3b (-) PIIT moderates these two paths; H4 (+) intention to use to initial use; H5 (+) initial use to initial self-disclosure; H6 (+) pre-perceived benefit to initial perceived benefit; H7 (+) pre-privacy concern to initial privacy concern; H8 (+) initial use to initial perceived benefit; H9 (-) initial use to initial privacy concern; H10 (+) initial self-disclosure to initial perceived benefit; H11 (+) initial self-disclosure to initial privacy concern. The pre-privacy trade-off and initial privacy trade-off link the benefit and concern constructs in each phase. Note: layered rectangles represent second-order constructs.]

Figure 5 - The proposed privacy trade-off research model

Perceived benefit is the value of using social networks (Zhou et al., 2015). Functionalities

of SNAs or utilitarian benefits could be very different from one app to another but the main features

that are common among them are the ability to use them everywhere and anytime, known as

ubiquity and immediacy (Gudelunas, 2012). Social networks provide utilitarian and hedonic values

(Zhou et al., 2015). SNAs, similar to social network sites, give users a chance to experience specific functions and pleasant feelings (Keith et al., 2013b). While mobile apps are akin to online contexts, some features are specific only to mobile apps, such as their ubiquity, immediacy, and offline connectivity (Junglas et al., 2008a; Zhou et al., 2010). There is an abundance of research supporting the positive effect of perceived benefit on intention to use prior to using an IT (Xu et al., 2013a). Perceived benefit is one of the most important factors influencing the intention to use an IT (Lee, 2009). Consistent with the prior literature, this research suggests:

H1: Pre-perceived benefit is positively related with the intention to use a new SNA.


Personal concepts of privacy are interwoven with values, perceptions, beliefs, and experience, such that privacy research requires a taxonomy of privacy (Solove, 2006). Privacy is the amount of information that an individual chooses to share with others (Westin, 2003). Privacy research conceptualizes privacy concern as the extent to which a user can control personal information (Bryce and Klang, 2009). Most prior research confirms that the use of SNSs increases threats to users' privacy (Xu et al., 2012b). The privacy-related perceptions of human beings originate from their information processing during their engagement with a technology (Dinev et al., 2015).

People who have higher concerns over their privacy may have lower intentions to adopt technologies that require their personal information. Privacy experiences form users' privacy perceptions, which are usually built on automatic cognitive heuristics (Dinev et al., 2015). The initial privacy concern that a user has would influence the intention to use a new technology (e.g., an SNA). For example, users who have experienced privacy invasions would generally have higher privacy concerns compared to others (Choi et al., 2015). As a result, victims of privacy invasions practice avoidance mechanisms to minimize potential harms (Choi et al., 2015). To avoid future risks, privacy concern has a negative impact on the intention to use. Following the same argument, this study proposes:

H2: Pre-privacy concern is negatively related with intention to use a new SNA.

Personal innovativeness in IT (PIIT) is an enduring trait that every person possesses

(Jackson et al., 2013). Privacy concern is related with the possible losses during the adoption of

SNAs. Privacy concern is salient for SNAs because they directly deal with personal information. Through the lens of the IDT perspective, PIIT has an effect on both antecedent and consequence perceptions of a new technology (Agarwal and Prasad, 1998a; Agarwal and Prasad,

1998b; Yun, 2013). The integrated model of IDT, TPB, and UTAUT offered by Jackson et al.

(2013) discusses how PIIT influences behavioral intentions. During initial use of a new SNA,

people who have higher intentions to try new technologies would value the technology higher than

others. While privacy trade-off could change during the adoption of a new technology, personal

innovativeness is a stable, situation-specific trait (Zhang, 2013).

Research findings demonstrate that individuals who are innovative toward IT are expected

to have greater intentions toward using a new IT (Lu et al., 2005; Sánchez-Franco et al., 2015).

PIIT moderates the relationship between perceptions and intention to use IT (Agarwal and Prasad,

1998a). Thus, the relationship between pre-perceived benefit and intention to use a new SNA will be stronger for people with greater PIIT. In addition, there will be a negative

influence of PIIT on the relationship between pre-privacy concern and intention to use a new SNA.

To account for stable individual differences in the tendency to try a new SNA, the current research suggests the following hypotheses:

H3a: Personal innovativeness in IT positively moderates the relationship between pre-perceived benefit and intention to use a new SNA.

H3b: Personal innovativeness in IT negatively moderates the relationship between pre-privacy concern and intention to use a new SNA.

The relationship between intentions and actual behavior has been discussed in the previous literature, specifically in the theory of planned behavior (TPB). TPB confirms that an intention toward a behavior is positively related with actually performing the behavior (Ajzen, 1991; Ajzen,

2011; Ajzen and Madden, 1986). Information Systems (IS) literature and studies of adoption


confirm that usage intentions are positively related with actual usage behaviors (Davis, 1989).

The technology acceptance model (TAM) predicts the use construct by measuring intention to use, and

many IS studies have adopted intention to use as a predictor of actual usage (Taylor and Todd,

1995; Van der Heijden, 2003; Venkatesh et al., 2003). On social networks, usage behavior could

be conceptualized and measured differently. SNS usage behavior could range from relationship

building (Utz, 2015) to content sharing (Shi et al., 2014) or to only entertainment purposes (Lee

and Ma, 2012). While the type of usage behavior could vary, the actual usage of a new technology could be conceptualized by the total time users spend and/or the different features they try (Kang et al., 2006). During the initial use of a social networking app, this study proposes:

H4: Intention to use SNAs is positively related with the initial use of a new SNA.

Building upon social penetration theory, self-disclosure is the main component in building relationships in online social networks (Altman and Taylor, 1973). Self-disclosure is the motivating factor in the process of building relationships (Huang, 2016). To build relationships with others on social networks, people need to initially disclose personal information to build their identity (Ellison et al., 2014). During users' communications on SNAs, self-disclosure behavior occurs regularly (Lowry et al., 2011). On social networks, self-disclosure is generally beneficial as it builds intimacy in relationships (Lowry et al., 2011). When individuals use a SNA for the first time, their initial perceptions could change due to their actual experience. As a result, if users invest more time during their first experience with a SNA and use more functions, they would be motivated to disclose more personal information. Thus, this study suggests:

H5: Initial use is positively related with the initial self-disclosure on a new SNA.


Perceived benefits are desirable and motivational constructs that significantly influence the

evaluation of outcomes and actions (Schwartz, 1994). Perceived benefit is a multidimensional construct referred to as value dimensions. Perceived benefit is formed by utilitarian and hedonic

values (Chiu et al., 2014). On social networks, pre-perceived benefits are formed by hedonic and utilitarian values that users expect to get during the usage (Zhang et al., 2016). Based on the

previous literature, two important utilitarian benefits are ubiquity and immediacy and two

important hedonic benefits are social reward and enjoyment. The utilitarian dimension involves the advantageous functions that a social network's features provide to its users, and the hedonic dimension

refers to pleasant gains users get on the social network (Chitturi et al., 2008; Xu et al., 2015). On

the other hand, privacy concern describes the level of control users desire to have on a social

network (Dinev et al., 2013). The optimal level of control over personal information depends on

the context in which the information resides (Solove, 2006).

Most IS studies have recognized the change in users' perceptions during the adoption of

new ITs (Karahanna et al., 1999; Montazemi and Qahri-Saremi, 2015; Sun, 2013; Venkatesh and

Davis, 2000). The process of using an IT makes a link between individuals’ pre-use and post-use

perceptions (Karahanna et al., 1999; Montazemi and Qahri-Saremi, 2015). Pre-use beliefs are

positively related with initial-use beliefs (Sun, 2013). The theory of belief updating posits that new pieces of evidence influence individuals' prior knowledge (Hogarth and Einhorn,

1992). Sun (2013) uses the term belief-adjusting to refer to the process of updating individuals’

beliefs during the use of a new IT which is primarily based on the direct experience with the

technology after the first usage. When people are informed about a new SNA, their pre-perceptions

of benefits and privacy concern will be updated after the first usage experience. This study

suggests:

H6: Pre-perceived benefit is positively related with initial perceived benefit of a new SNA.

H7: Pre-privacy concern is positively related with initial privacy concern of a new SNA.

The psychological trade-off between privacy concerns and perceived benefit determines the total time people spend on, and the number of functions they use in, a new SNA. Users' understanding of the trade-off between the benefit and the privacy concern becomes more realistic through the usage experience of a SNA. Social networking apps are recognized as one of the most important types of mobile applications (Xu et al., 2011b). The value dimensions of perceived benefit contain both hedonic and utilitarian benefits (Overby and Lee, 2006). To better capture perceived benefit, this study focuses on the general characteristics of social networking apps that are common to every app.

Immediacy and mobility features of a SNA define the utilitarian benefit dimension. Many

social networking apps are used when users are moving around (Xu et al., 2011b). Mobility and

immediacy are important features of a SNA that people want and they can affect the performance

of the app (Han et al., 2014; Xu et al., 2011b). Social reward and enjoyment describe the hedonic

dimension of perceived benefit. Perceived benefit positively influences users’ intentions (Xu et al.,

2013a). Because perceived benefit represents values in users’ minds (Overby and Lee, 2006), an

increase in the perception of values could be due to the use of social networks (Lee, 2009). After users

try a SNA for the first time, they have better understanding of benefits. Thus, it is suggested:

H8: Initial use is positively related with the initial perceived benefit.

Effects of usage behaviors on many factors have been investigated in the previous research.

Liebermann and Stashevsky (2002) discuss how, when users put more time into using the Internet, they

perceive lower risks, specifically about the misuse of their personal information. Furthermore,

users’ initial expectations of IT could differ during the usage period (Lankton et al., 2016). The


usage intensity is shown to be associated with privacy concern (Lin and Liu, 2012). While prior research has investigated the negative effect of privacy concern on usage intensity, the current research argues that when people use an IT they accumulate more experience and knowledge about it. The more time people spend and the more functions they use on a SNA for the first time, the more vivid their image of the usage behavior becomes. Thus, the current research proposes:

H9: Initial use is negatively related with the initial privacy concern.

Prior to the actual disclosure of personal information, people weigh the costs and benefits of self-disclosure to anticipate which is greater (Altman and Taylor, 1973; Greenberg and Stone,

1992). People wish to gain benefits when they participate in a social networking activity (Xu et al., 2013a). The positive effect of perceived benefit on self-disclosure is examined in the prior literature (Ellison et al., 2007; Xu et al., 2013a). People are encouraged to self-disclose on social networks to take advantage of the benefit of a social network (Xu et al., 2013a). Most social networks are based on the sharing of information and users may choose to disclose their personal information (Contena et al., 2015).

When using a social networking service for the first time, users should disclose personal information to be able to grow the network, to change settings, to express themselves openly, to build identity, and to enjoy (Forest and Wood, 2012; Utz, 2015). On SNSs, the self-disclosure of personal information results in greater returned value through saving time and getting better recommendations (Xu et al., 2013a). Social network users are motivated to share personal information to be able to interact with their desired network (Xu et al., 2013a). Based on this, this


study proposes that initial self-disclosure has a positive influence on the initial perceived benefit of a SNA. Consequently, this study hypothesizes:

H10: Initial self-disclosure is positively related with the initial perceived benefit.

Even though pre-perceived benefit encourages people to share personal information, users consider inhibitors such as privacy concern after disclosing personal information (Yu et al., 2015).

High privacy concern is adversely related to individuals' tendency to disclose personal information (Lowry et al., 2011). Users' expectation of losses caused by risks to personal information is defined as privacy concern (Xu et al., 2013a). Previous studies have posited that self-disclosure is negatively influenced by privacy concerns, as individuals want to avoid future losses (Krasnova et al., 2010).

The perception of control over personal information influences the extent of self-disclosure

(Dinev and Hart, 2006). For a new social networking app, individuals initially disclose some personal information to build their profiles, to use the app, and to communicate with other users

(Nosko et al., 2010). Types of self-disclosed information on SNS profiles include default information, sensitive personal information, and potentially stigmatizing information (Nosko et al., 2010). As a result, users may disclose different depths of information on their SNS profiles

(French and Read, 2013). Research findings demonstrate that initial privacy concern is related to the privacy concerns of users of a SNA (Cheung et al., 2015). Nonetheless, the amount and type of self-disclosed information raise concerns for information privacy. Self-disclosure is an intentional behavior of sharing personal information within a group of people (Posey et al., 2010). This study defines self-disclosure as the sharing of general and specific types of


personal information. People who initially disclose personal information are likely to be concerned about future negative consequences for their privacy. Thus, the current research suggests:

H11: Initial self-disclosure is positively related with the initial privacy concern.

2.4. Methodology

To explain the role of initial use and initial self-disclosure on the privacy trade-off

dynamics, this research develops a research model based on IDT, TPB, and the privacy trade-off

model. The proposed model examines the effect of initial use and self-disclosure on perceived

benefit and privacy concern during the pre-use to initial use period of a new SNA.

2.4.1. Measurement Development

To test the proposed model, a two-phase survey method is used for pre- and post-evaluation.

Measurement items are provided in Appendix D. Phase 1 is conducted to capture pre-usage

perceptions and phase 2 evaluates perceptions after initial-usage experiences. The survey

instrument is composed of previously validated scales identified in related literature and modified for the context of the current study. To ensure content validity of the instrument, a panel of two faculty members with subject-matter and survey-development expertise reviewed the items for appropriateness in the current research context (Lynn, 1986). Construct

measurement items are developed on 7-point Likert scales ranging from strongly disagree to

strongly agree because research shows Likert-scaled measures are reliable for capturing respondent perceptions (Chen et al., 2011; Stephenson et al., 2003).

2.4.2. Data Collection

Survey questionnaires were distributed to students at a major university in the US. Students are typical users of mobile apps and are thus representative of the population of users for predicting


disclosure behavior during the adoption of a new SNA. A new privacy-based SNA called Sociabile (social + mobile), which is available on iTunes and Google Play, was used for the current study.

The research model was tested using survey data. The authors recruited respondents and collected data from students enrolled in various undergraduate business courses. Two samples were collected, one per phase. Phase 1 of the survey was conducted to capture pre-use

perceptions and yielded 390 respondents. Phase 2 of the survey was conducted with a one-week delay

after phase 1 to capture initial-use perceptions and usage experiences. Respondents were asked to

install and use the new SNA (Sociabile) on their smartphones. The second survey concluded with

349 respondents who were adopters of the app. Users who completed both phases were included

in the analysis so that pre and post evaluations could be conducted. The final dataset contained

239 usable responses suitable for analysis. Demographic information of the sample is provided in

Table 8.

Table 8 - Demographic information

Phase 1 (Pre-Use)
  Gender: Male (40%), Female (60%)
  Age: Min = 18, Max = 49, Mean = 24
  Disposable income (yearly): Less than $5k (48%), $5k-$10k (22%), $10k-$15k (13%), $15k-$20k (6%), More than $20k (11%)
  Academic standing: Freshman (2%), Sophomore (14%), Junior (57%), Senior (26%), Graduate (1%)

Phase 2 (Initial Use)
  Gender: Male (50%), Female (50%)
  Age: Min = 18, Max = 46, Mean = 22
  Disposable income (yearly): Less than $5k (51%), $5k-$10k (20%), $10k-$15k (13%), $15k-$20k (7%), More than $20k (9%)
  Academic standing: Freshman (5%), Sophomore (15%), Junior (60%), Senior (20%), Graduate (<1%)


2.5. Analysis and Results

2.5.1. Reliability and Validity

The proposed research model was tested using partial least squares (PLS) analysis because

PLS employs a component-based approach to estimation that minimizes residual variances (Chin, 1998) and is well suited for testing complex relationships by avoiding inadmissible solutions and factor indeterminacy (Chen et al., 2011). SmartPLS 3.2.3 was used to calculate t-tests for each path. A three-step analysis procedure was implemented: assessment of measurement model reliability and validity, checking for common method bias, and evaluation of the structural model. The reliability and validity of the measurement model were evaluated to increase the rigor of the study by examining measurement reliability (composite and indicator reliabilities), convergent validity, and discriminant validity (Hair Jr et al., 2013; Hulland, 1999).

Table 9 presents the descriptive statistics, composite reliability values, Cronbach's alpha values, and the average variance extracted (AVE) of the principal constructs. First, measurement reliability was evaluated using composite reliability and Cronbach's alpha values. Composite reliability is established if all calculated values exceed 0.70 (Fornell and Larcker, 1981; Nunnally et al., 1967). Also, Cronbach's alpha scores of 0.70 or greater are considered acceptable (Nunnally et al., 1967), while scores between 0.8 and 0.9 are considered satisfactory (Henseler et al., 2009).

Table 9 shows all composite reliabilities and Cronbach's alpha values exceed 0.70, verifying measurement reliability. Factor loadings of measurement items are presented in Appendix E.
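The three indices just described can be computed directly from item scores and standardized outer loadings. The sketch below is illustrative only: the loadings (0.85, 0.88, 0.90) and the simulated Likert responses are hypothetical, not values from this study.

```python
# Hedged sketch: the reliability and convergent-validity indices reported
# in Table 9, computed from illustrative (not the study's) data.
import numpy as np

def cronbach_alpha(items):
    """items: (n_respondents, k_items) array of Likert scores."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_var / total_var)

def composite_reliability(loadings):
    """Composite reliability from standardized outer loadings."""
    l = np.asarray(loadings, dtype=float)
    return l.sum() ** 2 / (l.sum() ** 2 + (1.0 - l ** 2).sum())

def average_variance_extracted(loadings):
    """AVE: mean squared standardized loading of a construct's indicators."""
    l = np.asarray(loadings, dtype=float)
    return float((l ** 2).mean())

# Hypothetical construct with three indicators loading 0.85, 0.88, 0.90
cr = composite_reliability([0.85, 0.88, 0.90])
ave = average_variance_extracted([0.85, 0.88, 0.90])

# Simulated 3-item scale driven by one latent factor (n = 200 respondents)
rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 1))
items = latent + rng.normal(scale=0.5, size=(200, 3))
alpha = cronbach_alpha(items)
```

Against the thresholds above, this hypothetical construct would pass: its composite reliability exceeds 0.70 and its AVE exceeds 0.50.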


Table 9 - Descriptive statistics, reliability, and validity

Standard Cronbach’s Composite Principal Construct Mean AVE Deviation Alpha Reliability 1 Pre-Privacy Concern (2nd Order) 5.05 1.61 0.93 0.95 0.69 2 Collection 5.03 1.38 0.83 0.92 0.86 3 Secondary Use 5.16 1.82 0.89 0.95 0.90 4 Improper Access 5.30 1.66 0.93 0.97 0.93 5 Error 4.71 1.58 0.89 0.95 0.90 6 Pre-Perceived Benefit (2nd Order) 4.94 1.25 0.91 0.92 0.55 7 Immediacy 5.00 1.21 0.81 0.89 0.72 8 Ubiquity 5.17 1.26 0.91 0.96 0.91 9 Social Reward 4.90 1.21 0.87 0.92 0.79 10 Enjoyment 4.68 1.30 0.93 0.96 0.93 11 Intention to Use 4.31 1.32 0.94 0.95 0.76 12 Personal innovativeness in IT 4.68 1.44 0.87 0.92 0.80 13 Initial-Privacy Concern (2nd Order) 4.25 1.43 0.94 0.95 0.72 14 Collection 4.36 1.37 0.79 0.91 0.83 15 Secondary Use 4.18 1.60 0.96 0.98 0.96 16 Improper Access 4.27 1.41 0.85 0.93 0.87 17 Error 4.17 1.35 0.84 0.92 0.86 18 Initial-Perceived Benefit (2nd Order) 4.30 1.25 0.95 0.95 0.69 19 Immediacy 4.10 1.16 0.90 0.94 0.83 20 Ubiquity 4.23 1.22 0.88 0.94 0.86 21 Social Reward 4.10 1.34 0.90 0.94 0.83 22 Enjoyment 4.77 1.27 0.92 0.96 0.93 23 Initial Use 4.07 3.05 0.72 0.77 0.63 24 Initial Self-Disclosure 3.38 2.31 0.97 0.98 0.97 Next, convergent validity was assessed by evaluating the AVEs (Hair Jr et al., 2013).

Convergent validity is established if the AVE reaches at least 0.50 or if principal hypothesized constructs load much higher than other constructs (Chin, 1998). Table 9 shows all AVEs exceed 0.50, thus establishing convergent validity. Third, discriminant validity is established by first ensuring that each indicator's outer loading on its construct is greater than its cross loadings on other constructs, and then by ensuring that for each construct the square root of the AVE is higher than its correlations with the other constructs (Hair Jr et al., 2013). The results show all outer loadings are greater than cross loadings


for each construct, and the square roots of the AVEs are higher than the inter-construct correlations. These results affirm discriminant validity. Overall, the results show high reliability and validity of the posited measurement model.
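The Fornell-Larcker comparison described above can be verified mechanically: each diagonal entry of Table 10 (the square root of a construct's AVE) must exceed every correlation in its row and column. The sketch below runs that check using the values from Table 10; the short construct labels are abbreviations introduced here for readability.

```python
# Check the Fornell-Larcker criterion against the Table 10 values.
import numpy as np

# Lower-triangular entries from Table 10; diagonal = square root of AVE.
m = np.array([
    [0.74, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],  # Pre-Perceived Benefit
    [0.18, 0.82, 0.00, 0.00, 0.00, 0.00, 0.00, 0.00],  # Initial Perceived Benefit
    [0.07, 0.12, 0.98, 0.00, 0.00, 0.00, 0.00, 0.00],  # Initial Self-Disclosure
    [0.53, 0.11,-0.03, 0.87, 0.00, 0.00, 0.00, 0.00],  # Intention to Use
    [0.40, 0.12, 0.01, 0.51, 0.86, 0.00, 0.00, 0.00],  # Personal Innovativeness in IT
    [0.23, 0.02,-0.19, 0.19, 0.17, 0.80, 0.00, 0.00],  # Pre-Privacy Concern
    [0.04, 0.09,-0.18, 0.13, 0.08, 0.37, 0.83, 0.00],  # Initial Privacy Concern
    [0.34, 0.43, 0.22, 0.30, 0.18, 0.08, 0.05, 0.83],  # Initial Use
])
full = m + m.T - np.diag(np.diag(m))  # symmetrize, keeping the diagonal once

def fornell_larcker_ok(matrix):
    """True when every sqrt(AVE) diagonal exceeds all |correlations| in its row."""
    d = np.diag(matrix)
    off = np.abs(matrix - np.diag(d))
    return bool(all(d[i] > off[i].max() for i in range(len(d))))

ok = fornell_larcker_ok(full)
```

Running the check on the reported values confirms the criterion holds for all eight constructs, consistent with the discriminant validity claim above.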

Table 10 - Construct correlations and square root of AVEs (diagonal values)

No.  Principal Construct              1     2     3     4     5     6     7     8
1    Pre-Perceived Benefit            0.74
2    Initial Perceived Benefit        0.18  0.82
3    Initial Self-Disclosure          0.07  0.12  0.98
4    Intention to Use                 0.53  0.11 -0.03  0.87
5    Personal Innovativeness in IT    0.40  0.12  0.01  0.51  0.86
6    Pre-Privacy Concern              0.23  0.02 -0.19  0.19  0.17  0.80
7    Initial Privacy Concern          0.04  0.09 -0.18  0.13  0.08  0.37  0.83
8    Initial Use                      0.34  0.43  0.22  0.30  0.18  0.08  0.05  0.83

2.5.2. Structural Model Assessment

The PLS results of the structural model show that during the pre-use stage (phase 1), 55% of the variance in intention to use is explained by pre-perceived benefit and pre-privacy concern, and the moderation effect of PIIT on the relationship between pre-use perceptions and intention to use is significant. During the initial use stage (phase 2), 20% of the variance in initial use is explained by intention to use. Also, 14% of the variance in self-disclosure is explained by initial use. 23% of the variance in initial perceived benefit is explained by pre-perceived benefit, initial use, and initial self-disclosure. A total of 11% of the variance in initial privacy concern is explained by pre-privacy concern and initial use. Thus, the posited research model demonstrates satisfactory explanatory power to capture the effect of initial use and self-disclosure on the shift from the pre-use privacy trade-off to the initial-use privacy trade-off. Results of the research model and the dimensions of perceived benefit and privacy concern during the pre-use to initial-use phases are shown in Figures 6 and 7.


The PLS path coefficients shown in Figure 6 demonstrate H1, H2, H3b, H4, H5, H6, H7,

H8, H9, and H10 are significant, while H3a and H11 are not. Findings demonstrate that the relationship between pre-perceived benefit and intention to use a SNA is significant (b=0.41, p<0.001), confirming H1. The relationship between pre-privacy concern and intention to use a SNA is significant (b=-0.34, p<0.01), supporting H2. Moreover, the results show personal innovativeness has a negative moderation effect on the relationship between pre-privacy concern and intention to use a SNA (b=-0.42, p<0.001), confirming H3b.
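A moderation effect such as H3b is conventionally tested with a product term between the standardized predictor and the moderator. The sketch below is a hedged stand-in for the PLS product-indicator approach: the data are simulated, OLS replaces the PLS estimator, and the -0.4 interaction weight is an assumption of the example, not the study's estimate.

```python
# Hedged sketch: product-term moderation test, mirroring PIIT moderating
# the pre-privacy concern -> intention path. Simulated data; OLS stand-in.
import numpy as np

rng = np.random.default_rng(7)
n = 239  # matches the study's usable sample size

concern = rng.normal(size=n)   # standardized pre-privacy concern (simulated)
piit = rng.normal(size=n)      # standardized PIIT (simulated)
interaction = concern * piit   # product term carrying the moderation

# Assumed data-generating model with a negative interaction of -0.4
intention = (-0.3 * concern + 0.2 * piit - 0.4 * interaction
             + rng.normal(scale=0.8, size=n))

# Regress intention on main effects plus the product term
X = np.column_stack([np.ones(n), concern, piit, interaction])
beta, *_ = np.linalg.lstsq(X, intention, rcond=None)
b_interaction = float(beta[3])  # recovered moderation coefficient
```

A negative `b_interaction` indicates that the concern-intention slope varies with the moderator, which is the pattern reported for H3b.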

[Figure 6 depicts the structural model: in phase 1, pre-perceived benefit (0.41***) and pre-privacy concern (-0.34**) form the pre-privacy trade-off predicting intention to use (R2=0.55), with PIIT moderating the two paths (0.13 n.s. and -0.42***); in phase 2, intention to use predicts initial use (0.24***, R2=0.20), initial use predicts initial self-disclosure (0.18***, R2=0.14), and initial perceived benefit (R2=0.23) and initial privacy concern (R2=0.11) form the initial privacy trade-off, predicted by their pre-use counterparts (0.26***; 0.29***), initial use (0.35***; -0.12**), and initial self-disclosure (0.07*; 0.04 n.s.). Note: ***: p<.001; **: p<.01; *: p<.05; layered rectangles represent second-order constructs.]

Figure 6 - The privacy trade-off model analysis

The relationship between intention to use a SNA and initial use is supported (b=0.24,

p<0.001), confirming H4. Also, the hypothesized relationship between initial use and initial self-

disclosure is supported (b=0.18, p<0.001), confirming H5. In addition, the relationship between

pre-perceived benefit and initial perceived benefit is significant (b=0.26, p<0.001), supporting H6.

Then, results indicate the relationship between pre-privacy concern and initial privacy concern is significant (b=0.29, p<0.001), supporting H7. The relationship between initial use and initial


perceived benefits was found to be significant (b=0.35, p<0.001), supporting H8. The relationship between initial use and initial privacy concern is significant (b=-0.12, p<0.01), supporting H9.

Also, the relationship between initial self-disclosure and initial perceived benefit is significant

(b=0.07, p<0.05), supporting H10.

On the other hand, results demonstrate the insignificance of the relationship between initial self-disclosure and initial privacy concern (b=0.04, n.s.), rejecting H11. In addition, the moderation effect of PIIT on the relationship between pre-perceived benefit and intention to use is not supported (b=0.13, n.s.), failing to confirm H3a.

[Figure 7 shows the second-order structure of privacy concern (dimensions: collection, secondary use, improper access, error) and perceived benefit (utilitarian dimensions: ubiquity, immediacy; hedonic dimensions: social reward, enjoyment) at the pre-use (phase 1) and initial use (phase 2) stages, with dimension weights ranging from 0.25 to 0.35, all significant at p<.001.]

Figure 7 - Privacy concern and perceived benefit analysis (from pre-use to initial use)

2.6. Discussion

The main purpose of this study was to understand the change in the trade-off between perceived benefit and privacy concern through the initial use of a new SNA. In addition, this study focused on the influence of initial use and initial self-disclosure on the privacy trade-off dynamics.

Finally, this research investigated the moderation effect of PIIT on the relationship between pre-use perceptions and intention to use a new SNA. Building on innovation diffusion theory (IDT), the theory of belief updating, the privacy trade-off model, and the theory of planned behavior (TPB), this study investigated determinants of the privacy trade-off and its change during the initial use of a new SNA.

Prior research identifies the need to examine privacy in a longitudinal fashion (Belanger and Xu, 2015). The results of the analysis indicate that the initial use of a new social

networking app (SNA) and initial self-disclosure both influence the privacy trade-off during the usage of a new SNA. Applying a longitudinal perspective, this study develops a research model to understand the change in privacy trade-off through initial use experiences. This study is focused on two phases of the use decision: pre-use (phase 1) and initial use (phase 2). Privacy concern and perceived benefits are shown to be dynamic and users’ perceptions changed during the first usage experience of a new SNA. In this study, dimensions of privacy concern are collection, improper access, secondary use, and error. Also, dimensions of perceived benefit in this study are immediacy, ubiquity, social reward, and enjoyment.

Among the four dimensions of privacy concern, the path coefficients of collection and improper access increased, while the error and secondary use dimensions had lower path coefficients. Furthermore, path coefficients of hedonic benefits (social reward and enjoyment) decreased from pre-adoption to the initial use experience; however, path coefficients of utilitarian benefits did not change significantly. This interesting finding indicates the importance of hedonic benefits during the initial use of a new SNA. As suggested by the results of previous research, increasing hedonic value for users has a positive effect on their stickiness to the network, and the stickiness of a website encourages users to continue using it (Zhang et al., 2016).

In addition, results of the analysis revealed that the relationship between initial self-disclosure and initial privacy concern is not significant. This is a notable finding, demonstrating that

users’ initial concern for their information privacy is not related with the disclosed information

during the first usage experience. Research confirms the main purpose of using social networks is

to connect with other people and to use social networks as a way to communicate with them (Chang

and Zhu, 2011). Self-disclosure on social networks ensures identity creation as the basis for building relationships (Ellison et al., 2014). One possible explanation for the insignificance of the relationship

between initial self-disclosure and initial privacy concern could be that users do not disclose much personal information during the first usage period. Users of social networks

disclose more personal information over time ranging from non-personal to very sensitive data

(Castillo et al., 2013). During continued usage of a new SNA, users disclose more personal information and, consequently, lose their desired level of control over it (Choi et al., 2015).

Another explanation for the insignificance of the relationship between initial self-disclosure and initial privacy concern could be the context dependency of users' privacy concerns. In the context of mobile apps, users' sharing behaviors and privacy preferences become complex and diverse (Liu et al., 2014b). The social networking app introduced to respondents was designed to protect users' privacy. We asked participants to install and use a privacy-based social networking app. In Sociabile, users' information is protected by not being stored in databases, and users have access to many customizable privacy settings. As a result, when using Sociabile, people cannot search for and find other people without their unique userID or phone number. Also, users cannot see other users' information unless those users have granted access.

Indeed, this confirms the results of Taddicken's (2014) study, in which the relationship between self-disclosure and privacy concern was not significant. The moderation effect of PIIT on the relationship between pre-perceived benefit and intention to use a new SNA is not significant. A plausible explanation for the insignificance of H3a is that individuals' intentions are already high due to their perception of expected value, and PIIT does not make the relationship stronger.

2.6.1. Implications

Due to the popularity of social networks, there has been an extensive amount of research

to explore the antecedents of intentions to use them. Among the various factors, privacy concern

has been the focus of many studies. This study further contributes to the privacy literature in the

Information Systems (IS) field in several ways. Though the issue of privacy concern on social

networks has been investigated in prior research, there is a lack of research examining the change

in privacy concern and perception of benefits during the use period of a new social network.

Specifically, findings of this study enrich the current stream of research on the topic of privacy

trade-off dynamics. This study encourages researchers to distinguish between pre-use perceptions and initial-use perceptions regarding a new SNA.

The current research has two major theoretical contributions. First, integration of

innovation diffusion theory, privacy trade-off model, theory of belief updating, and theory of

planned behavior helped to better understand the use of a new SNA over time. The proposed

research model can be used as a basis for longitudinal research in the other areas of technology

use. Findings of this study answer several calls to fill the gap in the literature regarding the

dynamism of privacy concern. Investigating the trade-off between privacy concern and perceived benefit before and after using a new SNA helped to better understand the effect of initial use and self-disclosure behaviors on the change in the privacy trade-off.

Second, results indicate that initial use behavior has a positive effect on self-disclosure, suggesting people who invest more time when first using a new SNA will eventually disclose more personal information. This study contributes to the body of knowledge by studying

the effect of initial usage experiences on privacy concerns. Theoretically, it is interesting to find that the initial usage period has a significant impact on initial self-disclosure. Indeed, the findings demonstrate there is no significant relationship between initial self-disclosure and initial privacy concern. Results indicate that the privacy concern of people who adopt a new SNA and disclose their personal information is not influenced by the initial disclosure. Instead, initial usage has a negative impact on initial privacy concern. To the best of our knowledge, this is the first study to identify usage experience as a factor that soothes users' privacy concerns.

This research also has implications for practitioners. Based on the results of the analysis, initial self-disclosure is not associated with initial privacy concern, likely because at the beginning stage of using a SNA people do not disclose much of their personal information. Furthermore, this study provides two indicators for measuring self-disclosure: general information and specific information. Providing tools to differentiate between various types of shared content on SNAs is a helpful mechanism to protect the privacy of involved individuals.

Also, results indicate a negative moderation effect of PIIT on the relationship between pre-privacy concern and intention to use, which means people with high PIIT may have a high intention to use a new SNA even though their privacy concern is high. In practice, most social networks offer similar basic features, so for a new SNA it is crucial to provide unique functionalities that users are willing to try. Second, coefficients of hedonic benefits changed considerably while coefficients of utilitarian benefits were almost constant. Finally, the change in hedonic benefits during the first usage experience indicates the importance of pleasant experiences at the beginning of using a new SNA. SNA developers may consider investing more in enjoyable and socially rewarding functions to satisfy users and encourage continued use.


2.6.2. Limitations and Future Research

Similar to most other research, this study has limitations that should be taken into account.

First, the sample size and diversity could limit the generalizability of the findings. The sample was drawn from college students in the US. While college students are among the major users of social networks, their knowledge and experience with SNSs could be a limiting factor. Future research may collect data from a wider range of demographics to enhance the interpretation of results. In addition, including individuals from diverse cultures could help in understanding privacy trade-off dynamics. Second, participants were asked to install and use a new SNA and then respond to the initial-use survey to receive extra credit in a specific course. The researchers tried to minimize coerced responses by offering other extra credit options and removing responses with low completion times. Future research may survey users of a new SNA without offering any incentive to get a better understanding of actual users. We also suggest that future research recruit respondents on a completely voluntary basis to confirm the results of our study. Third, to link the phase 1 and phase 2 surveys, respondents' phone numbers were used as unique IDs. There were instances where respondents felt uneasy disclosing their phone numbers and entered other values (e.g., NA, 0). Future research may consider linking surveys with a different method, such as assigning a unique ID at the beginning of the first survey and using the same code in later steps.

2.7. Conclusion

This study serves as an initial attempt to investigate the dynamics of the privacy trade-off during the use of a new social networking app. The research model is built upon an integration of innovation diffusion theory, the theory of belief updating, the theory of planned behavior, and the privacy trade-off model to investigate the dynamism of the privacy trade-off through the pre-use and initial-use phases. The research model explains that intention to use a new SNA is determined by pre-perceived benefit and pre-privacy concern. Personal innovativeness in IT has a negative influence on the relationship between pre-privacy concern and intention to use. Further, the research model establishes the relationship between initial use and initial self-disclosure. Initial perceived benefit is determined by pre-perceived benefit, initial use, and initial self-disclosure. Initial privacy concern is determined by pre-privacy concern and initial use. Finally, initial self-disclosure is not related to initial privacy concern.


CHAPTER 3

EMOTIONAL ATTACHMENT TO SOCIAL NETWORK APPS

3.1. Introduction

As of 2016, the number of mobile app downloads worldwide was more than 224 billion, and it is predicted to exceed 268 billion in 2017 (Statista, 2016b). Currently, mobile

apps account for more than half of all time people spend on digital media (Nelson, 2015). The

average time spent on mobile apps has increased 21 percent from 2014 to 2015 (Nelson, 2015).

The highest percentage of time spent on mobile apps is 43 percent for games and 26 percent for

social networking (Nelson, 2015). Reports show the average time people spend on mobile apps increased by more than 23 minutes from 2014 to 2015, while the time spent on mobile web browsers remained steady (Soper, 2014). Findings of a study by the Pew Research Center show 42% of

online adults use multiple social network sites (Duggan and Smith 2014).

Adoption and use of social network sites (SNSs) has been examined in many studies (Broeck et al., 2015; Chang and Zhu, 2011; Ku et al., 2013). Among the variables studied, self-disclosure, intention to adopt, and continuance intention have received the most attention. Prior literature is divided into two main categories: the first group of research focuses on the motivations and deterrents behind using SNSs, and the second group investigates the consequences of using SNSs. Research shows the paradoxical influence of using SNSs among different groups of users. For example, in the context of running-related SNSs, the positive influence of using SNSs on activity engagement has been demonstrated (Mahan et al., 2015).

Many users have more than one SNS profile, and they use each SNS for a different reason (Lenhart et al., 2009; Stutzman and Hartzog, 2012). Four factors that drive people to maintain multiple SNS profiles are privacy, identity, utility, and propriety (Stutzman and Hartzog, 2012). One of the motivations behind using a new SNS is to be anonymous to the new network and build a new identity (Gerhart and Koohikamali, 2015; Greenhow and Robelia, 2009). Perception of

anonymity on SNSs can facilitate sharing behavior (Dwyer et al., 2007). Inability to identify the generator of content results in higher safety perceptions (Kern, 2013). Free expression of any idea is one of the positive outcomes of perceived anonymity (Kern, 2013). Other positive

consequences of anonymity are a sense of freedom of speech in communications (Davenport, 2002) and individual privacy protection (Christopherson, 2007). On a specific SNS, protection of personal information privacy is possible by separating one's real identity from one's identity on the SNS (Gross and Acquisti, 2005). Maintaining anonymity on some SNSs is one of the motivations for using them (Gross and Acquisti, 2005).

Engagement with SNSs is lower for people with higher privacy concerns (Staddon et al., 2012). Engagement with SNSs is one of the determinants of their success for businesses and organizations (Chu and Kim, 2011). In the later stages of using SNSs, people build emotional attachment toward them (Valenzuela et al., 2009). When users experience an IT for the first time, their emotion toward the IT is a response to the anticipated experience (Lowry et al., 2015). To keep gaining the anticipated experience, the positive emotional response motivates users to continue using the IT (Lowry et al., 2015). Emotional attachment to IT influences different aspects of individuals' lives, and people want to continue using it if the emotional bond remains strong (Yun, 2013). In addition, users who build emotional attachment to an IT (e.g., mobile phones) suffer from feelings of anxiety and panic if separated from it (Trub et al., 2014; Vincent, 2006). Ultimately, even when an IT no longer functions perfectly, users are hesitant to discontinue using it or to replace it. For example, on blogs, attachment anxiety positively influences users' usage intensity (Trub et al., 2014).


The extent to which people engage with a new social networking app (SNA) is not clear

and previous research has not explored this area. Furthermore, the relationship between

engagement and emotional attachment is less considered in the information systems (IS) literature.

Finally, most previous literature has extensively examined the negative influence of privacy concerns on self-disclosure but has not included the influential role of initial perceived anonymity on the depth of self-disclosed information and post-privacy concerns. These gaps in the literature lead this research to tackle the following research questions: (1) How is ongoing emotional attachment to a new SNA influenced by ongoing engagement and ongoing privacy concern? (2) What is the influence of initial self-disclosure depth and initial perceived anonymity on ongoing privacy concern and ongoing engagement?

To address these questions, this study pursues three objectives: i) we develop a new research model to explain the ongoing emotional attachment to a SNA; ii) we empirically test the proposed model using survey data collected from users of a new SNA; and iii) we provide theoretical and practical implications of this study.

3.2. Literature Review

3.2.1. Mobile Apps

The technology adoption lifecycle introduces five groups of people in relation to new technologies (Moore, 1999): innovators, early adopters, early majority, late majority, and laggards are the important players of the technology adoption lifecycle (Meade and Rabelo, 2004). Meade and Rabelo (2004) explored adoption in the high-tech market; using chaos and complexity theories, they demonstrate that adoption in the high-tech market is non-linear. In another study, the authors applied a modified decomposed theory of planned behavior to anticipate the behavior of early adopters of the mobile internet (Pedersen, 2005). Their findings show perceived user friendliness and perceived usefulness explain attitude toward use. Also, subjective norm is explained by external influence, interpersonal influence, and self-control (Pedersen, 2005). Finally, self-efficacy and facilitating conditions explain behavioral control in the context of mobile internet services (Pedersen, 2005).

The proliferation of smartphones has produced a new industry of applications, or "apps," that increase the functionality of smartphones beyond mere communication (Dinner et al., 2015). It is crucial to examine what drives the adoption and usage of a specific mobile app. Preliminary work has begun in the computer science literature on the determinants of app usage (Taylor and Levin, 2014; Xu et al., 2013b). Findings of the Xu et al. (2013b) study reveal three key factors in everyday use of mobile apps: user preference, context, and community behavior. First, intrinsic user app preferences originate from the user's historical usage patterns. Second, the environment and user activities are observable through sensor-based contextual signals. Third, the aggregation of app usage behaviors appears in user communities.

Findings of the study by Cho et al. (2010) show that the influential factors in the acceptance of smartphones are mobility, interactivity, innovativeness, social influence, and job fitness. The study by Böhmer et al. (2011) shows that while users spend almost an hour each day using apps on their phones, the average session with an app lasts less than a minute. They found that different applications are used at different times of the day; for example, communication apps are used throughout the day, while news apps are used most in the morning and games are used more at night. The lifecycle of mobile apps consists of five events: installing, updating, uninstalling, opening, and closing the app (Böhmer et al., 2011).


3.2.2. Privacy Concern, Anonymity, and Self-Disclosure

Concerns for information privacy have been growing since the 1960s (Dinev et al., 2015). According to Smith et al. (2011), previous privacy literature has contributed most in three major areas: the conceptualization of information privacy, the relationships between the information privacy construct and other constructs, and the context in which information privacy resides. Personal concepts of privacy are interwoven with values, perceptions, beliefs, and experience, such that privacy research requires a taxonomy of privacy (Solove, 2006). Information privacy is the amount of information that an individual chooses to share with others (Westin, 2003). Personal information privacy is the optimal level of control over personal information (Malhotra et al., 2004).

Previous research has shown privacy concern to be an important factor in the usage of SNSs. The privacy-trust model developed by Dwyer et al. (2007) shows that privacy concern can determine information sharing and relationship building on SNSs as two post-adoption behaviors. Further, research shows that privacy concerns on SNSs are cultivated by users' initial use experiences (Boyd and Ellison, 2007). Privacy concerns can decrease the use of SNSs among users (Ku et al., 2013). Boyd and Ellison (2007) argue that the actual behavior of people to protect their privacy is not always the same as their desire.

Based on the definition of privacy by Laufer and Wolfe (1977), the self is reinforced by appropriate privacy expressions and experiences, which shape an individual's identity and behavior. On social network sites, there are inseparable relationships between users' identities, their information, and their perceptions of privacy concern. Representing one's real identity and publicly posting personal information on SNSs can cause serious negative consequences for the user, such as privacy invasion (Gross and Acquisti, 2005). As a result, many individuals seek anonymity and present different levels of identifiability on different SNSs (Gross and Acquisti, 2005). For example, on a professional SNS people may use their real names, whereas on a dating SNS they may use fake names to create a sense of anonymity to other users (Gross and Acquisti, 2005). The relationship between self-disclosure and privacy concern is weak if the provided information is untruthful, fake, or not personally identifiable. Thus, the role of perceived anonymity becomes crucial to better understand self-disclosure and social interactions.

Anonymity on SNSs is known as the inability to identify the content generator of a message (Hayne and Rice, 1997; Pinsonneault and Heppel, 1997). There are two types of anonymity on the internet: technical anonymity and social anonymity (Kiesler et al., 1984). Technical anonymity is the removal of all meaningful identifiable information from a material, while social anonymity is the un-identifiability of a material even when using information from context and other cues (Hayne and Rice, 1997). In general, anonymity is expected to lessen inhibition by reducing fear of social disapproval, censorship, and evaluation (Pinsonneault and Heppel, 1997).

Self-disclosure happens when a user shares information about himself or herself (Green et al., 2016). Many studies have been conducted to understand the motives behind self-disclosure behavior on SNSs (Tow et al., 2010). Previous works have examined the relationship between self-disclosure behavior and its antecedents and consequences. Privacy concern is shown to be a significant predictor of users' decisions to disclose personal information (Acquisti et al., 2015; Zimmer et al., 2010). Recent research also shows that usage experiences positively influence self-disclosure (Trepte and Reinecke, 2013). Self-disclosed information differs in amount, honesty, intent, depth, and valence (Posey et al., 2010).

Self-disclosure depth reflects intimacy in communications (Posey et al., 2010). The depth of disclosed information refers to its level of detail (French and Read, 2013). On SNSs, the depth of self-disclosed information varies due to the heterogeneity of audiences and relationship differences (French and Read, 2013). Depth of information can range from very general (low depth) to very specific (high depth) (French and Read, 2013). According to social penetration theory, the developmental process of interpersonal relationships is primarily based on self-disclosure (Altman and Taylor, 1973; Carpenter and Greene, 2015). People tend to share more information on SNSs with anonymous strangers than with individuals who know them (Gross and Acquisti, 2005).

3.2.3. Engagement with SNAs

The feeling that a technology has caught a user's interest is defined as engagement with the IT (Webster and Ahuja, 2006). Engagement with computer-mediated activities is a desirable human response and an important indicator of a system's success that provides grounds for its re-use (Bano and Zowghi, 2015; Webster and Ahuja, 2006). The study by O'Brien and Toms (2008) provides a conceptual framework for user engagement with technology. They define engagement as a "quality of user experience characterized by attributes of challenge, positive affect, durability, aesthetic and sensory appeal, attention, feedback, variety/novelty, interactivity, and perceived user control" (O'Brien and Toms, 2008, p. 938). Initial mobile app engagement is generated after people use the app and acquire initial experience (Dinner et al., 2015). While in the marketing literature engagement on SNSs is defined as social interaction with other users by liking, commenting, or sharing information (Chu and Kim, 2011), engagement with a social networking app has two aspects: first, engagement with individuals on the social network, and second, engagement with the app and its functionalities. On a new social networking app, people build their engagement over a period of time (O'Brien and Toms, 2010).

People's engagement on SNSs is determined by tie strength, homophily, trust, normative influence, and informational influence (Chu and Kim, 2011). SNSs allow users to engage in social interactions by providing tools such as opinion giving (Chu and Kim, 2011). Similarly, the use of mobile apps increases users' engagement in certain behaviors (Dekhane et al., 2013). For example, a study by Dekhane et al. (2013) shows that use of an educational mobile app relates to greater engagement among users. When mobile users develop a relationship with their devices, the level of engagement with the device and its functionalities is a function of the relationship quality (Mintz, 2014). Smartphone users may control their engagement in certain activities as a response to their needs (Kim et al., 2013). Smartphone users are able to choose how, when, and where to engage with the device and the apps installed on it. Using functionalities provided by mobile apps helps users save time, complete particular tasks, entertain themselves, and connect to other users (Kim et al., 2013).

Engagement is a dynamic construct that matures during a user's experience and use of a technology. Engagement with technology is a four-stage process: point of engagement, period of sustained engagement, disengagement, and reengagement (O'Brien and Toms, 2008). Engagement with a technology pertains to the user, the system, and the interaction between the user and the system (O'Brien and Toms, 2008). Findings of the Imlawi and Gregg (2014) study suggest that engagement with SNSs positively influences satisfaction. In addition, self-disclosure behavior is positively related to engagement with SNSs. Results of the Webster and Ahuja (2006) study indicate that engagement has a positive effect on future use of a system. Continued engagement is a post-adoption behavior (Bhattacherjee, 2001). The ubiquity of smartphones makes them always available, which increases user engagement and ultimately continued use behavior (Kim et al., 2007; Kim et al., 2013). On mobile apps, the aggregation of users' data is important for improving user engagement (Leijdekkers and Gay, 2015). Storing users' personal information on mobile apps can pose risks to information privacy (Xu et al., 2012a).

There are four modes of SNS engagement: information seeking activity, connectivity, bricolage, and participation (Takahashi, 2010; Tian, 2016). First, information seeking activity is the extent to which users look up specific or various types of information via a SNS (Anderson et al., 2013; Takahashi, 2010). For example, a user may seek music-related news in general, or news about a particular musician, when logging in to a SNS. Second, the most important mode of SNS engagement is connectivity, defined as the ability to virtually connect to almost anyone on the SNS (Takahashi, 2010). Connectivity provides opportunities for interaction with people and information (Takahashi, 2010). Third, bricolage allows image creation and impression management (Takahashi, 2010). Through bricolage, SNS users are able to create and re-create their identities (Deuze, 2006; Takahashi, 2010). Bricolage involves practices such as borrowing, mixing, reusing, and reconstructing separate artifacts to produce new insights (Deuze, 2006). For example, on a social network users may post their interpretation of news by linking to other sources for readers. Fourth, the participation mode of SNS engagement is defined as the online activities that people are involved in (Deuze, 2006; Literat, 2016). For example, when users post their ideas in a specific discussion to agree or disagree with someone else's opinion, they engage in a participation activity. This study focuses on the participation and connectivity modes of SNS engagement.

3.2.4. Emotional Attachment to SNAs

Emotional attachment is defined as a strong bond between a person and an object (Choi, 2013; Grisaffe and Nguyen, 2011; Vincent, 2006). The antecedents and consequences of emotional attachment have been studied in the marketing and psychology literature. For example, in the marketing field, emotional attachment to brands is a profound bond that leads to loyalty (Grisaffe and Nguyen, 2011; Thomson et al., 2005). Findings of the study by Grisaffe and Nguyen (2011) show that when customers perceive greater value from a brand, they become emotionally attached to it. Their study demonstrates that user-derived benefits related to the product can drive emotional attachment; notably, it is the attachment-producing benefits that matter, while merely desired benefits do not affect attachment (Grisaffe and Nguyen, 2011). Customer and prospect experiences characterized by sentimental memories can yield emotional attachment (Grisaffe and Nguyen, 2011). Finally, emotional attachment to places can influence how people feel and think, and ultimately creates favorable evaluations of those places (Yuksel et al., 2010).

Prior IS literature acknowledges the importance of emotional attachment toward IT. User-device attachment is defined as the relationship between individuals and their mobile devices (Meschtscherjakov, 2009; Wehmeyer, 2007). Due to emotional attachment to mobile devices, the usage experience of SNSs has moved beyond mere relationship building and enjoyment outcomes; instead, emotional attachment influences every aspect of a user's life (Abouzahra et al., 2014). On online social networks, emotional attachment affects consumers' trust and community commitment (Chen and Shen, 2015). As a result of emotional attachment to an attachment figure, people feel the object is a necessity in their daily lives (Abouzahra et al., 2014). Emotional attachment can result in commitment (Fedorikhin et al., 2008). Emotional attachment also supports relationship building on SNSs (Chen and Shen, 2015).

In one study, the antecedents of attachment to mobile devices are symbolism, fashion, possession, and needs (Choi, 2010). In another study, attachment to IS results in community participation intentions (Choi, 2013). Choi (2013) presents self-connection and prominence as dimensions of IS attachment. Self-connection is the feeling of oneness with the IS, and prominence is the degree to which a user's feelings toward the IS arise in his or her mind automatically (Choi, 2013). Applying innovation diffusion theory, research shows that emotional attachment to mobile devices influences the relative advantage of mobile social networks (Abouzahra et al., 2014). In this study, user-SNA attachment is the bond between a user and the SNA. Even though attachment to mobile apps differs from attachment to mobile devices, the behavioral consequences are fairly similar. Most smartphone usage time is consumed by the apps installed on them (Christensen and Prax, 2012). A summary of the related IS literature focusing on emotional attachment is provided in Appendix F.

3.3. Theoretical Background

Technology adoption has been studied through the lens of different adoption theories, such as the technology acceptance model (TAM) (Davis, 1989), innovation diffusion theory (IDT) (Rogers, 1983), and the unified theory of acceptance and use of technology (UTAUT) (Venkatesh et al., 2003). TAM and UTAUT do not practically explain situations in which users' main focus is privacy and they try to remain almost anonymous on a social network. The adoption decision and continued use of a new SNA are not based only on facilitating conditions, social norms, social influence, expectancies, usefulness, and ease of use. Successful adoption of a technology is based on cognitive, emotional, and contextual concerns (Straub, 2009). Engagement with technologies enables organizations, developers, and businesses to build and maintain relationships with users and customers (Chu and Kim, 2011). SNSs permit users to engage in interactive and dynamic social interactions (Chu and Kim, 2011). Moreover, individuals who are emotionally attached to SNSs not only feel a greater necessity to continuously use SNSs but also evaluate them more favorably. To propose an emotional attachment model of a new privacy-based SNA, this study builds upon attachment theory and the contextual integrity of privacy.

Emotional attachment is a strong and enduring bond between one person and another (Bowlby, 1982a). During the process of building emotional attachment, the relationship between an individual and the attachment figure depends on social, emotional, and cognitive development (Ainsworth et al., 2015). Attachment is not necessarily reciprocal (Bowlby, 1969). In the previous literature, two main theories are used to explain attachment. First, the learning/behaviorist theory of attachment suggests that attachment is a set of learned behaviors (Miller and Dollard, 1950). Second, the evolutionary theory of attachment suggests that attachment formation is a natural response for survival (Bowlby, 1982b). This study adopts the learning/behaviorist theory of attachment to explain emotional attachment to SNAs.

After forming an emotional bond, separation from the main attachment figure results in stranger anxiety and separation anxiety (Schaffer and Emerson, 1964). As a result, the attachment behavioral system motivates individuals to seek proximity to, and support from, significant others (Bowlby, 1982b). Deprivation behavior resulting from separation is applicable to understanding mobile device attachment in IS because individuals feel secure when they are close to their mobile phones. Attachment processes are formed during the usage of SNSs (Hart et al., 2015). Attachment dynamics explain the motivations behind usage types and the content of communications exchanged on SNSs (Hart et al., 2015). Attachment theory postulates that individuals form attachments to manage interpersonal loss (Hart et al., 2015). Also, attachment formation between a person and an attachment figure involves their close engagement (Hazan and Shaver, 1987). Consequently, engaging in different usage behaviors on SNAs influences emotional attachment.

Understanding privacy concerns in a new context, such as a new SNA, requires a thorough understanding of the type and amount of personal information a user chooses to disclose. For example, when a user tries a new SNA for the first time without granting access to his or her personal information or sharing any content, the conceptualization of privacy concern in that context should differ from his or her information privacy concern on an established SNS with a large amount of shared content. In addition, if a person chooses to use a built-up identity to disclose untruthful personal information, the privacy concern is not the same. This study applies the contextual integrity of privacy as the theoretical underpinning to explain the situations in which users disclose their information on a SNA.

Nissenbaum (2004) discusses how the flow of information in a given context is determined by two principles: first, the plurality of realms in which users' activities take place, and second, the norms that govern each realm. Nissenbaum (2004) defines appropriateness and distribution norms to explain informational norms in her privacy scheme. According to appropriateness norms, information is either appropriate or inappropriate to disclose within a context. Distribution norms draw restrictions on the flow of information within and across contexts (Nissenbaum, 2004). A privacy violation occurs when either of these norms is breached (Grodzinsky and Tavani, 2010). In the information age, the central notion of the contextual integrity framework is that information gathering transgresses context-specific informational norms (Nissenbaum, 2010, p. 186). The framework of contextual integrity also explains why many privacy conflicts simply do not occur: the point of departure is a commitment to the flow of information, not to control over the information (Nissenbaum, 2010, p. 187).

Contextual integrity recognizes the importance of social systems in shaping people's reactions, and the framework accounts for social determinants (Nissenbaum, 2010, p. 190). Users of SNSs face two distinct privacy issues: the apparent issue, which concerns the abandon with which users confide personal information to profiles on SNSs, and the insidious issue, which concerns how social media companies handle users' information (Nissenbaum, 2010, p. 221). Nissenbaum continues that social network sites are a medium of interaction, transaction, information exchange, and communication. Nissenbaum argues that the extension of information on SNSs to a diverse variety of social contexts is inevitable. On social networks, there are different spheres for people (such as public vs. private), and users have different norms for disclosing their information to each sphere (Grodzinsky and Tavani, 2010). Within different contexts, individuals may have different roles that drive particular activities (Sar and Al-Saggaf, 2014). Activities within a context are oriented around values that are governed by behavior-guiding norms (Sar and Al-Saggaf, 2014).

Regarding the flow of information in social contexts, Nissenbaum (2010) demonstrates that attending to these complex norms explains how users of SNSs post information under certain context-relative informational norms. On SNSs, individuals establish and manage the boundaries of various spheres (e.g., public vs. private) by applying different mechanisms such as anonymity, deception, and dissimulation (Acquisti et al., 2015). Consequently, this study considers the influence of perceived self-anonymity and self-disclosure depth on how people experience uncertainty about their privacy concerns.

Most prior research has focused on intentional behaviors instead of actual behavior (Cheon et al., 2012; Lee et al., 2014; Rehm et al., 2012). Boyd and Ellison (2007) argue that the actual behavior of people to protect their privacy is not always the same as their desire. Belanger and Xu (2015) demonstrate the importance of measuring the actual amount of disclosed information to reveal users' actual privacy concerns in different contexts. Many individuals do not necessarily behave rationally in privacy situations (Belanger and Xu, 2015). The study by Barnett et al. (2015) demonstrates that the relationships in a research model can differ when self-reported use measures are used instead of actual use measures.


3.3.1. Research Model and Hypotheses

To assess the relationships among perceived self-anonymity, self-disclosure depth, privacy concern, engagement with a SNA, and emotional attachment to a SNA, this study proposes a research model (Figure 8) and presents the following hypotheses.

[Figure 8 depicts the proposed research model. In the initial-use period (Phase 1), the constructs are initial perceived self-anonymity and initial self-disclosure depth; in the ongoing-use period (Phase 2), the constructs are ongoing privacy concern, ongoing perceived engagement, and ongoing emotional attachment. The hypothesized paths are: H1 (-) from initial perceived self-anonymity to ongoing privacy concern; H2 (+) from initial self-disclosure depth to ongoing privacy concern; H3 (+) from initial perceived self-anonymity to ongoing perceived engagement; H4 (+) from initial self-disclosure depth to ongoing perceived engagement; H5 (-) from ongoing privacy concern to ongoing perceived engagement; H6 (-) from ongoing privacy concern to ongoing emotional attachment; and H7 (+) from ongoing perceived engagement to ongoing emotional attachment.]

Figure 8 - Proposed emotional attachment research model

One of the unique characteristics of online communication is perceived anonymity for users (Cho et al., 2012). Pfitzmann and Köhntopp (2001, p. 2) describe anonymity as "the state of being not identifiable within a set of subjects, the anonymity set." Anonymity ensures a user that his or her identity is not observable, identifiable, or linkable by other users (Pfitzmann and Köhntopp, 2001). Research shows that the presence of anonymity in communication influences the type of content being shared. For example, anti-normative expressions are reduced when users are identified on online communities (Cho et al., 2012). Individuals' perceptions of anonymity are likely to result in unregulated behavior and less concern about self-image (Sproull and Kiesler, 1986).

The behavior of anonymous individuals can become more extreme and impulsive (Sproull and Kiesler, 1986), which may lead to privacy violations (Cho et al., 2012). On social networks, individuals who perceive themselves to be anonymous may experience deindividuation (Jiang et al., 2013). As a result of deindividuation, users have less concern for self (Postmes and Spears, 1998). Individuals who perceive themselves as unidentifiable on social networks have a greater sense of protection, a higher feeling of immunity, and less concern for privacy (Jiang et al., 2013). Considering the above discussion, this study suggests:

H1: Initial perceived self-anonymity negatively influences ongoing privacy concern.

Self-disclosure is defined as any information one user shares with others (Krasnova et al., 2010). Self-disclosed information differs in amount, honesty, depth, intent, and valence (Posey et al., 2010). The negative influence of privacy concern on self-disclosure intentions has been discussed broadly in the previous literature (Bansal and Gefen, 2010; Koohikamali et al., 2015). The privacy paradox holds that the actual behavior of users in disclosing their information is not always the same as their intentions (Norberg et al., 2007). Users' sense of personal privacy deteriorates as they actually disclose their information (Norberg et al., 2007). People disclose personal information to gain benefits (Bansal and Gefen, 2010). For example, on SNSs people self-disclose to foster their relationships (Posey et al., 2010).

Self-disclosure depth relates to the sharing of intimate and detailed personal information in communications (French and Read, 2013; Posey et al., 2010). As soon as personal information is disclosed, it may be misused and cause unwanted consequences (Bansal and Gefen, 2010). Disclosing certain types of personal information, such as more intimate personal health information, can cause greater undesirable outcomes (Bansal and Gefen, 2010; Hui et al., 2007). This research argues that when people initially use a new system and disclose their intimate personal information, their concern for privacy may increase due to higher vulnerability to future risks. Thus, this research proposes:

H2: Initial self-disclosure depth is positively related to ongoing privacy concern.

On SNSs, staying relatively anonymous to other users is not difficult because people can use made-up or fake names instead of their real names (Byrne, 2007). The inherent psychological comfort that results from perceived anonymity increases the level of involvement in online environments (Bell, 2001). Using SNSs and creating content has become a means of managing identity (Livingstone, 2008). Online identity might be shaped in relation to other users of the social network, and many may seek confidentiality when disclosing their personal information (Livingstone, 2008). Furthermore, ways of enacting identity on SNSs vary significantly, and many users refrain from self-portrayal and instead render the personal profile as a place-maker (Livingstone, 2008).

A key aspect of user-generated content on online websites such as social networks is anonymity (Scott and Orlikowski, 2014). Anonymity is the lack of identification that decreases inhibition (Posey et al., 2010). The perception of anonymity allows users to feel comfortable when sharing personal information (Lea et al., 2001). Disinhibition is driven by public and private self-awareness as two main subjective states (Pinsonneault and Heppel, 1997). Individuals with higher perceptions of anonymity experience greater disinhibition, and as a result of disinhibition they feel free to perform public behaviors (Bansal and Gefen, 2010). Additionally, Scott and Orlikowski (2014) demonstrate that the elective form of anonymity leads users to engage more with Trip Advisor, a well-known social networking website, due to their perceptions of being unidentifiable. Findings of the study by Ayyagari et al. (2011) suggest that individuals' engagement with a technology is accompanied by strain over unintended consequences. Thus, this study suggests:

H3: Initial perceived self-anonymity positively influences ongoing perceived engagement with SNAs.

SNS users are able to engage in a wide range of activities. Takahashi (2010) investigates the modes of SNS engagement, and the findings suggest information seeking activity, connectivity, bricolage, and participation as the four main modes. Among these, connectivity and participation relate to interaction among users, which is the most important aspect of using SNSs. The connectivity and participation modes of engagement are accessible through content sharing. Recent studies in the area of human-computer interaction (HCI) have moved beyond the use and usefulness of systems, and new efforts aim to understand and enhance actual engagement with a technology (Hassenzahl and Tractinsky, 2006). Engagement is a desirable response to a technology and includes intrinsic interest (Imlawi and Gregg, 2014).

Research shows that disclosure of personal information is a predictor of engagement on social networks (Liau et al., 2005). On social networks, self-disclosure is a necessary element for reducing uncertainties between users in communications (Imlawi and Gregg, 2014). People use information disclosed on SNSs to gain a general understanding of other users, which eventually yields greater engagement with SNSs (Joinson et al., 2011). Self-disclosure stimulates feedback on social networks, which improves engagement. This research argues that when users disclose their personal information in greater depth on a SNA, they show a greater level of interest in engaging with it. Thus, this study hypothesizes:


H4: Initial self-disclosure depth is positively related to ongoing perceived engagement with SNAs.

Effective engagement with a technology results from factors such as users' experiences during usage and users' expectations and values (Vasalou et al., 2015). From the perspective of engagement theory, initial engagement with a technology is based on users' motivations (O'Brien and Toms, 2008). The key feature of engagement with a technology is control (O'Brien and Toms, 2008). Privacy risks usually hinder engagement with a technology (Vasalou et al., 2015). Individuals with high privacy concern usually perceive themselves as unable to exercise sufficient control over their personal information.

A study by Staddon et al. (2012) discusses the effect of privacy concern on engagement. Their findings show that low engagement with social networks results from high privacy concerns. People who perceive higher concerns for their privacy consistently spend less time on social networks (Staddon et al., 2012). This suggests that privacy concern is an important gate to users' engagement with social networks. Privacy concerns influence people's choice to engage with the network and share content (Stutzman et al., 2012). This research proposes:

H5: Ongoing privacy concern is negatively related to ongoing perceived engagement with SNAs.

Emotional attachment to mobile devices is a bond between people and their devices (Meschtscherjakov, 2009; Wehmeyer, 2007). On online social networks, emotional attachment affects consumers' trust and community commitment (Chen and Shen, 2015) and supports relationship building on SNSs (Chen and Shen, 2015). In face-to-face communication, revealing personal information in close relationships poses risks to one's privacy because users have less control over their personal information (Ben-Ze'ev, 2003). In physical settings, privacy regulations foster the level of place attachment (Harris et al., 1996).

Privacy helps people emotionally adjust their interpersonal relationships (Westin, 2003). Emotional release, as a function of privacy, is the release of people from the tensions of social life (Trepte and Reinecke, 2013). Privacy provides an opportunity for emotional release through which individuals can manage losses (Trepte and Reinecke, 2013). In the same direction, in artificial social network settings such as SNAs, control over information privacy can positively influence the level of attachment. Accordingly, this study expects that the greater the privacy concern, the lower the emotional attachment in the context of SNAs. Given the nature of privacy concern, this study posits:

H6: Ongoing privacy concern negatively influences ongoing emotional attachment.

Research shows that when users engage more in social network activities, they have higher levels of emotional attachment to the network (Wisniewski et al., 2015). For example, active Facebook users with more engagement in direct communications show higher levels of bonding (Burke et al., 2010; Trepte and Reinecke, 2013). Users who spend more time on SNAs are more emotionally attached to them than users who spend less time (Ellison et al., 2007). Therefore, this research suggests:

H7: Ongoing perceived engagement with SNAs is positively related to ongoing emotional attachment.

3.4. Research Methodology

Through the lens of the framework of contextual integrity of privacy and attachment theory, this study proposes a research model to explain ongoing emotional attachment to a new SNA. The model examines the effect of initial perceived self-anonymity and self-disclosure depth on ongoing privacy concern and ongoing engagement with the SNA.

3.4.1. Research Design and Procedure

A survey method is used to test the research model in two stages: after using a SNA for the first time, and again after three weeks of ongoing use. To test and verify the proposed research model, Sociabile (social + mobile), a new privacy-based SNA available on iTunes and Google Play, was introduced to respondents, who were asked to install the app and experience it for the first time (phase 1, n1=196). After three weeks, respondents from the first phase were asked to complete another survey about their ongoing usage of Sociabile during that period (phase 2, n2=119).

Pre-validated measures are used to operationalize perceived anonymity (Chen et al., 2008), perceived engagement with the SNA (Ellison et al., 2007; O'Brien and Toms, 2010), privacy concern (Chen et al., 2008), and emotional attachment (Choi, 2013; Ellison et al., 2007). Self-disclosure depth is a self-developed measure defined as the ratio of the number of specific profile fields a user has filled to the total number of such fields. Measurement items are provided in Appendix G. Subject demographics are shown in Table 11.
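Under the definition above, self-disclosure depth is a simple ratio. A minimal sketch of the computation follows; the field names are illustrative (drawn from the kinds of fields listed in Appendix G) and this is not the study's actual scoring script:

```python
# Hypothetical set of "specific" information fields; the study's full field list is in Appendix G.
SPECIFIC_FIELDS = {"email", "physical_appearance", "current_workplace", "past_workplace"}

def disclosure_depth(filled_fields, specific_fields=SPECIFIC_FIELDS):
    """Ratio of specific fields a user filled to the total number of specific fields."""
    filled_specific = set(filled_fields) & specific_fields
    return len(filled_specific) / len(specific_fields)

# A user who filled email and current workplace (plus a non-specific nickname):
print(disclosure_depth({"email", "current_workplace", "nickname"}))  # 0.5
```

A profile with no specific fields filled scores 0, and a fully completed specific-field set scores 1, matching the 0–1 range of the DD construct in Table 12.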


Table 11 - Demographic information

Phase 1 (Initial Use, n1=196):
  Gender: Male (50%), Female (50%)
  Age: Min = 18, Max = 46, Mean = 22
  Disposable income ($/year): <5k (51%), 5k-<10k (20%), 10k-<15k (13%), 15k-<20k (7%), >=20k (9%)
  Academic standing: Freshman (5%), Sophomore (15%), Junior (60%), Senior (20%), Graduate (<1%)

Phase 2 (Ongoing Use, n2=119):
  Gender: Male (55%), Female (45%)
  Age: Min = 18, Max = 41, Mean = 23
  Disposable income ($/year): <5k (48%), 5k-<10k (21%), 10k-<15k (21%), 15k-<20k (1%), >=20k (9%)
  Academic standing: Freshman (4%), Sophomore (14%), Junior (65%), Senior (17%), Graduate (0%)

3.5. Data Analysis and Results

Following the two-step analytical approach suggested by Hair et al. (2006), this study first evaluated the reliability and validity of the measurement model and then assessed the structural model. Partial least squares (PLS) was used to test the research model because PLS employs a component-based approach to estimation that minimizes residual variances (Chin, 1998) and is well suited to testing complex relationships while avoiding inadmissible solutions and factor indeterminacy (Chen et al., 2011).

3.5.1. Measurement Model

Assessment of the measurement model was a three-step process: evaluating the reliability of the measurement model, evaluating convergent validity, and ensuring discriminant validity. Factor loadings of the measurement items are presented in Appendix H. The necessary requirement for measurement item reliability is satisfied because all item loadings are above 0.7, the criterion suggested by Barclay et al. (1995). Table 12 includes the descriptive statistics, composite reliability values, Cronbach's alpha values, and the average variance extracted (AVE) of the principal constructs. Composite reliability is established when composite reliability and Cronbach's alpha values are greater than 0.7 (Fornell and Larcker, 1981; Nunnally et al., 1967). All resulting composite reliability and Cronbach's alpha values exceed 0.7, so the reliability criteria of the measurement model are met. Convergent validity is established if AVE reaches at least 0.50 and if items load much higher on their hypothesized constructs than on other constructs (Chin, 1998). Table 12 shows all AVEs exceed 0.50, establishing convergent validity. To ensure discriminant validity, an indicator's outer loading on its construct should exceed its cross-loadings with other constructs, and the square root of the AVE of each construct should be greater than its correlations with the other constructs (Hair Jr et al., 2013). Based on Appendix H and Table 12, the discriminant validity of the measurement model is verified.
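The Fornell-Larcker step of this procedure can be expressed compactly: the square root of each construct's AVE must exceed that construct's correlations with every other construct. A small sketch of the check, using illustrative values in the spirit of Table 12 (not the exact reported matrix):

```python
import numpy as np

def fornell_larcker_ok(ave, corr):
    """True if sqrt(AVE) of every construct exceeds its correlations with all others."""
    sqrt_ave = np.sqrt(np.asarray(ave, dtype=float))
    corr = np.asarray(corr, dtype=float)
    off = corr - np.diag(np.diag(corr))           # zero out the diagonal
    return bool(np.all(sqrt_ave > np.abs(off).max(axis=1)))

# Illustrative AVEs and construct correlations (order: SA, EA, DD, ENG, PC)
ave = [0.85, 0.85, 1.00, 0.74, 0.76]
corr = [[1.00, 0.12, -0.06, 0.43, -0.30],
        [0.12, 1.00, -0.24, 0.50, 0.18],
        [-0.06, -0.24, 1.00, -0.28, 0.13],
        [0.43, 0.50, -0.28, 1.00, -0.14],
        [-0.30, 0.18, 0.13, -0.14, 1.00]]
print(fornell_larcker_ok(ave, corr))  # True
```

With these values every sqrt(AVE) (0.92, 0.92, 1.00, 0.86, 0.87) exceeds the largest absolute correlation in its row, which is the pattern the diagonal of Table 12 reports.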

Table 12 - Descriptive statistics, correlations, and average variance extracted

Principal Construct                           Mean  SD    CR    CA    AVE   1      2      3      4      5
1. Initial perceived self-anonymity (SA)      4.35  0.09  0.96  0.94  0.85  0.92
2. Ongoing emotional attachment (EA)          2.99  0.24  0.96  0.94  0.85  0.12   0.92
3. Self-disclosure depth (DD)                 0.63  0.26  1.00  1.00  1.00  -0.06  -0.24  1.00
4. Ongoing perceived engagement (ENG)         3.61  0.33  0.96  0.96  0.74  0.43   0.50   -0.28  0.86
5. Ongoing perceived privacy concern (PC)     3.91  0.08  0.97  0.97  0.76  -0.30  0.18   0.13   -0.14  0.87
Note. SD: Standard Deviation; CR: Composite Reliability; CA: Cronbach's alpha. The diagonal elements represent the square root of AVE; off-diagonal elements are the correlations among constructs.

3.5.2. Structural Model

Results of the structural model analysis based on the proposed hypotheses are shown in Figure 9, including the explained variance of the dependent variables, estimated path coefficients, and t-values. The PLS results indicate the structural model explains 10% of the variance in ongoing privacy concern, 25% of the variance in engagement with the SNA, and 31% of the variance in ongoing emotional attachment to the SNA.

[Figure 9 depicts the two-phase structural model (Phase 1: initial use; Phase 2: ongoing use). Path coefficients: initial perceived self-anonymity -> ongoing privacy concern = -0.30***; initial perceived self-anonymity -> ongoing perceived engagement = 0.42***; initial self-disclosure depth -> ongoing privacy concern = 0.11**; initial self-disclosure depth -> ongoing perceived engagement = -0.26***; ongoing privacy concern -> ongoing perceived engagement = 0.02 (n.s.); ongoing privacy concern -> ongoing emotional attachment = 0.25***; ongoing perceived engagement -> ongoing emotional attachment = 0.53***. R-squared = 10% (ongoing privacy concern), 25% (ongoing perceived engagement), 31% (ongoing emotional attachment). ***: p<.001; **: p<.01; *: p<.05. Numbers on arrows represent path coefficients.]

Figure 9 - Structural model analysis
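The model itself was estimated with PLS. Purely as an illustration of how path coefficients and explained variance relate, the sketch below fits the final stage of the model (emotional attachment regressed on engagement and privacy concern) with ordinary least squares on synthetic standardized data. The "true" coefficients and the noise are assumptions for the demo, not the study's data, and OLS is only a rough stand-in for PLS path estimation:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 119  # phase-2 sample size reported in this study

# Synthetic standardized predictor scores (illustrative only)
eng = rng.standard_normal(n)   # ongoing perceived engagement
pc = rng.standard_normal(n)    # ongoing privacy concern
ea = 0.53 * eng + 0.25 * pc + rng.standard_normal(n)  # assumed paths + noise

X = np.column_stack([np.ones(n), eng, pc])            # intercept + predictors
beta, *_ = np.linalg.lstsq(X, ea, rcond=None)
resid = ea - X @ beta
r2 = 1 - resid.var() / ea.var()
print(beta[1:].round(2), round(r2, 2))  # path-style estimates and R-squared
```

Rerunning with a different seed shifts the estimates, which is a reminder that reported coefficients carry sampling error at n = 119.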

3.6. Discussion

3.6.1. Discussion of Results

Analysis of the path coefficients reveals that all paths are statistically significant except the relationship between ongoing perceived privacy concern and ongoing perceived engagement. Results indicate that initial perceived self-anonymity (b = -0.30, p < 0.001) and initial self-disclosure depth (b = 0.11, p < 0.01) together explain 10% of the variance in ongoing privacy concern, confirming H1 and H2. As hypothesized, initial perceived self-anonymity positively influences ongoing perceived engagement (b = 0.42, p < 0.001), confirming H3; together with initial self-disclosure depth (b = -0.11, p < 0.001), these paths explain 25% of the variance in ongoing perceived engagement. Finally, the results show a significant relationship between ongoing perceived engagement and ongoing emotional attachment (b = 0.53, p < 0.001), confirming H7.


However, for both the relationship between initial self-disclosure depth and ongoing perceived engagement (H4) and the relationship between ongoing privacy concern and ongoing emotional attachment (H6), the resulting path coefficients carried signs opposite to those hypothesized, failing to support H4 and H6. Also, the results did not find statistical support for the relationship between ongoing privacy concern and ongoing perceived engagement (b = 0.02, n.s.), rejecting H5.

In sum, the results supported all hypotheses except H4, H5, and H6. Self-disclosure depth was negatively related to perceived engagement, opposite to the hypothesized positive relationship. A plausible explanation is that at the beginning stage of using a SNA, deep self-disclosure may heighten perceptions of future risks and potential losses. Consequently, users who have disclosed more personal information may feel unsafe and refrain from engaging with the SNA, whereas users who disclose only general information do not make themselves vulnerable to such risks and can be expected to engage more with the SNA.

Another interesting finding concerns the positive effect of privacy concern on emotional attachment to a SNA. Although it was hypothesized that users with higher privacy concern would have lower emotional attachment to a SNA, the results indicate the opposite. In virtual worlds (VWs), users' attachment is based on the meaningfulness of their experiences, which motivates them to return and continue using them (Goel et al., 2011). On a SNA, people with greater privacy concerns are stimulated to continuously keep track of their information privacy and subsequently become more emotionally attached to the app. Additionally, users who want to control their information privacy may, arguably to avoid separation anxiety, develop higher levels of emotional attachment.

Finally, previous literature has discussed a negative relationship between users' privacy concerns and engagement with a SNA, as users worry about control over their personal information. The findings did not show a significant relationship between privacy concern and engagement with the SNA. One possible explanation is that because a privacy-based SNA was introduced to users, their perception of the protection of their information privacy was high, so privacy concern did not negatively influence engagement with the SNA.

The findings of this research provide evidence that emotional attachment to a new SNA is determined by perceived engagement and privacy concern. Furthermore, self-disclosure depth increases privacy concern, whereas perceived self-anonymity decreases it.

3.6.2. Implications

The most significant contributions of this study can be divided into theoretical and practical perspectives. Although the relationship between privacy concern and self-disclosure is well studied in the Information Systems (IS) literature, the lack of an integrated model examining the relationships among anonymity, disclosure depth, engagement, and privacy concern, and how they influence emotional attachment, is the main motivation for this study.

Theoretically, this study proposed and tested a research model explaining emotional attachment to a new social networking app during the continued-use stage. This research is one of the few studies in IS to investigate emotional attachment as a dependent variable through the lens of the contextual integrity of privacy and attachment theory. The findings can be extended in future work to better understand the antecedents and consequences of emotional attachment; for example, technology addiction studies could incorporate these findings in future examinations.

Second, the proposed research model is not based merely on usage behavior; instead, this research examined the role of engagement with a new SNA in ongoing emotional attachment. Focusing on engagement perceptions goes beyond technology acceptance and extends the current mainstream body of IS research. This study could be useful for future studies examining the success of a new system by incorporating users' engagement perceptions. Third, the integration of perceived self-anonymity, self-disclosure depth, and privacy concern helps shed light on the perplexities of information management on the internet and specifically on online social networks. The results open up a new stream of research focusing on the different types of information users disclose on various social networks. Future research can also study the influence of personality traits on emotional attachment, and could investigate the differences between adopters, non-adopters, and quitters of a new social network.

Practical contributions of this study can be discussed from the view of three groups. First, developers of new SNAs may consider users' initial perceptions of social network apps to improve functionality in future versions. Second, practitioners may apply the results of this study to increase users' engagement with a technology and emotional attachment to it. Third, users may combine self-anonymity and self-disclosure depth to evaluate the potential risks to their personal information privacy. Nonetheless, this study has at least one limitation that should be noted: data collection was limited to college students in the southern part of the US. Although college students constitute a majority of SNA users, including other segments of the population would yield a better understanding of emotional attachment to SNAs.

3.7. Conclusion

This study has examined the phenomenon of emotional attachment to a new SNA, which has not been studied in the previous IS literature. In addition, engagement with a new SNA as a post-adoption behavior is included in this research to deepen understanding of emotional attachment. Furthermore, privacy concern has rarely been studied in conjunction with perceived self-anonymity and self-disclosure depth. This study proposes a new research model explaining emotional attachment to a new SNA in the ongoing use stage. The model identifies privacy concern and engagement as two separate and important predictors of emotional attachment to a new SNA, and the results demonstrate that privacy concern is influenced by perceived self-anonymity and self-disclosure depth.


EPILOGUE

This dissertation investigates the information privacy of mobile users in the context of mobile apps. The rapid growth in mobile app usage has drawn researchers' attention in many fields. App usage behavior is a complex phenomenon driven by many personal, contextual, and social factors. Most mobile apps have access to personal information, and many users have lost control of their personal information (Herrmann and Lindemann, 2016). This dissertation focuses on the mobile app context and tries to understand how users' intimacy with their smartphone apps influences their control over personal information. Three important and less-studied aspects of information privacy are investigated. First, understanding mobile app providers' perspectives on users' information privacy across different app categories is the focus of essay I. Second, recognizing the dynamics of the trade-off between privacy concern and perceived benefit during the initial usage of a new social networking app (SNA) is the main objective of essay II. Finally, studying the emotional attachment to a new SNA and its related factors is the goal of essay III.

The first essay examines the key privacy dimensions of mobile app privacy policies. It is believed that privacy policies are unsuccessful in conveying a company's perspective on the protection of users' information privacy. Furthermore, it is not clear how mobile app providers differentiate between non-sensitive and sensitive personal information. This research helps reveal the opaqueness of privacy policies by using readability and hit-density measures. Through the lens of prospect theory, essay I identifies four key dimensions of privacy (collection, secondary use, improper access, and error) and calculates the distribution of these dimensions across different app categories. The readability calculations reveal that the privacy policies of mobile apps are very difficult for a basic user to comprehend, and their extensive length discourages users from reading them. Also, from providers' perspectives, there is greater emphasis on the error and secondary use dimensions for mobile apps dealing with more sensitive personal information.
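Readability analyses of this kind rely on standard formulas. As one widely used example, the Flesch Reading Ease score can be sketched as below; the syllable counter is a crude vowel-group heuristic and the sample sentence is invented, so treat this as an illustration rather than essay I's actual instrument:

```python
import re

def count_syllables(word):
    """Crude heuristic: count vowel groups; every word gets at least one syllable."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text):
    """Flesch Reading Ease: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    words = re.findall(r"[A-Za-z']+", text)
    if not words:
        return 0.0
    sentences = max(1, len(re.findall(r"[.!?]+", text)))
    syllables = sum(count_syllables(w) for w in words)
    return 206.835 - 1.015 * len(words) / sentences - 84.6 * syllables / len(words)

# Invented policy-style sentences; dense polysyllabic wording drives the score down
policy = ("We may disclose aggregated demographic information to third parties. "
          "Unauthorized secondary utilization is prohibited.")
print(round(flesch_reading_ease(policy), 1))  # well below typical plain-prose scores
```

Scores above roughly 60 are usually considered plain English, while legalistic policy prose tends to land far lower, which is consistent with the essay's finding that such policies exceed a basic user's comprehension level.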

The second essay studies privacy trade-off dynamism during the initial use of a new social networking app (SNA). The integration of innovation diffusion theory, the theory of planned behavior, the theory of belief updating, and the privacy trade-off model provides the theoretical underpinning of this research. Applying a longitudinal perspective, this research posits a model that shows the role of initial use and initial self-disclosure in updating users' pre-use perceptions of a new SNA into initial-use perceptions. The trade-off between pre-privacy concern and pre-perceived benefit determines the intention to use the SNA. Initial perceived benefit is then explained by pre-perceived benefit, initial use, and initial self-disclosure, while initial privacy concern is explained by pre-privacy concern and initial use. By providing a dynamic model of the privacy trade-off, the findings of this study make significant contributions to the privacy literature in the IS field.
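The belief-updating logic behind essay II can be caricatured as an anchoring-and-adjustment average: the initial-use perception anchors on the pre-use perception and adjusts toward what the first usage experience signals. The weight and scale values below are invented for illustration, not estimated from the study's data:

```python
def update_belief(prior, evidence, weight_prior=0.6):
    """Anchoring-and-adjustment style update: weighted mix of prior belief and new evidence."""
    if not 0 <= weight_prior <= 1:
        raise ValueError("weight_prior must lie in [0, 1]")
    return weight_prior * prior + (1 - weight_prior) * evidence

pre_benefit = 4.0    # pre-use perceived benefit (e.g., on a 5-point scale)
experienced = 3.0    # benefit signaled by initial use and initial disclosure
print(round(update_belief(pre_benefit, experienced), 2))  # 3.6
```

A larger `weight_prior` models users who cling to their pre-use expectations, while a smaller one models perceptions dominated by the first-hand experience, which is the contrast the essay's pre-use versus initial-use phases are designed to capture.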

The third essay focuses on emotional attachment to social networking apps (SNAs). The average time people spend on mobile apps is increasing, but the life cycle of a new mobile app can be very short. This research proposes a model to investigate privacy concern and engagement as two factors influencing emotional attachment to a SNA. In addition, it examines the influence of perceived anonymity and self-disclosure on privacy concern and engagement with the SNA. The study tests the research model with a survey method. Findings reveal that perceived self-anonymity and self-disclosure depth affect privacy concern. Furthermore, the results indicate that both privacy concern and engagement with a new SNA are related to emotional attachment to it. This study offers several theoretical and practical contributions and directions for future IS research.

APPENDICES

APPENDIX A

LIST OF MOBILE APPS


Health apps (Type I, Nos. 1-30):
1. WebMD, 2. Mindbody Connect, 3. Blogilates, 4. UP Coffee, 5. Health4Me, 6. Fertility Friend, 7. Spring, 8. Cine Mama, 9. Breeze, 10. Find Me Gluten Free, 11. ShopWell, 12. Calm-Meditate Sleep, 13. Glow, 14. Fooducate, 15. My Pregnancy Today, 16. iTriage, 17. MyBabyToday, 18. Healthgrades, 19. Headspace, 20. Pregnancy & Baby, 21. Baby Bundle, 22. Baby Bump, 23. askMD, 24. First Aid by American Red Cross, 25. Insight Timer, 26. Brilliant Distinctions, 27. I'm expecting pregnancy app and baby guide, 28. Healthgrades: The right doctors and hospitals, 29. Instant Heart Rate, 30. CVS Caremark

Navigation apps (Type II, Nos. 31-60):
31. GoogleMaps, 32. Waze, 33. MapQuest, 34. Scout GPS, 35. Yellow Pages, 36. AT&T Navigator, 37. INRIX Traffic Map, 38. Geocaching Intro, 39. Transit, 40. Sygic, 41. Vz Navigator, 42. Embark NYC Subway, 43. Transit directions by Moovit, 44. CoPilot GPS, 45. BestParking, 46. Glympse, 47. iTrans, 48. Navmii, 49. Escort Live Radar, 50. MotionX, 51. Speedway Fuel and Speedy Rewards, 52. Drunk Mode, 53. Movies by one tap, 54. SpotHero, 55. OneBusAway, 56. DC Metro and Bus, 57. Map My Hike, 58. Here Maps, 59. onTime, 60. ParkWhiz

Game apps (Type III, Nos. 61-90):
61. Ridge Racer, 62. Skyline Skaters, 63. Hungry Shark, 64. Mad Skills Motorcross, 65. Frozen Free Fall, 66. Boom Beach, 67. Grand Theft Auto, 68. Adventure Beaks, 69. Game of War, 70. CSR Racing, 71. Big Win Basketball, 72. Fun Run, 73. Jungle Doctor X, 74. Sunnyville Baby Pet Animal Salon, 75. Sparkle Unleashed, 76. Dots, 77. SongPop, 78. Subway surfers, 79. Despicable Me, 80. Throne Rush, 81. Guess The Emoji, 82. My Boo, 83. My Singing Monster, 84. My Vegas slots, 85. QuizUp, 86. High School Story, 87. Jetpack joyride, 88. PBA Bowling, 89. Hidden Objects: Mystery Crimes, 90. 8 Ball Pool

Note. Nos. 1-30 are health apps (Type I), Nos. 31-60 are navigation apps (Type II), and Nos. 61-90 are game apps (Type III).


APPENDIX B

SELECTED PREVIOUS RESEARCH ON SNS USE AND PRIVACY CONCERNS


Each entry lists: Authors — Independent variables (IVs); Dependent variables (DV); Theory; Data collection (n = samples); Method.

This study — IVs: perceived benefit, privacy concern, intention to use, actual use, personal innovativeness in IT. DV: self-disclosure. Theory: innovation diffusion theory (IDT), privacy trade-off model. Data: two-phased surveys and real app usage data. Method: partial least squares (PLS).

Chang and Zhu (2011) — IVs: attitude (information, entertainment, connecting with old friends, meeting new people, conformity), subjective norm, perceived behavioral control. DV: adoption intention. Theory: theory of planned behavior (TPB). Data: two independent survey questionnaires (n1=278, n2=328). Method: multi-group analysis.

Ku et al. (2013) — IVs: privacy concern, perceived critical mass, subjective norms, gratification. DV: continuance intention. Theory: uses and gratification theory (UGT). Data: a survey questionnaire in English and Chinese (n1=103, n2=122). Method: PLS.

Baek et al. (2011) — IVs: information sharing, convenience and entertainment, pass time, interpersonal utility, control, work promotion. DV: engaging in link sharing (linking frequency and content). Theory: UGT. Data: a survey questionnaire distributed via snowball sampling (n=217). Method: logistic regression.

Cheung and Lee (2010) — IVs: subjective norm, group norm, social identity. DV: we-intention to use SNSs. Theory: integrated TPB and social influence process. Data: a survey questionnaire (n=389). Method: PLS.

Dwyer et al. (2007) — IVs: internet privacy concern, trust in SNS, trust in other members of SNS. DV: information sharing, development of new relationships. Theory: social exchange theory (SET). Data: a survey questionnaire (n=117). Method: analysis of variance (ANOVA).

Squicciarini et al. (2011) — IVs: privacy concern, ease of use, usefulness, likeability. DV: intention to adopt a collaborative privacy management tool. Theory: technology acceptance model (TAM). Data: two survey questionnaires before and after use of the tool (n=80). Method: PLS.

Xu et al. (2012b) — IVs: individual self-protection, industry self-regulation, government legislation. DV: context-specific concern for information privacy. Theory: control agency theory. Data: a factorial design experiment (n=198). Method: PLS.

Ayalon and Toch (2013) — IVs: information age. DV: sharing intention. Theory: boundary regulation theory of privacy. Data: a survey questionnaire (n=193). Method: non-parametric tests.

Alashoor and Bakerville (2015) — IVs: privacy concern, information sensitivity, cognitive absorption, perceived benefit, perceived risk. DV: self-disclosure. Theory: privacy calculus. Data: N.A. Method: theoretical.

Broeck et al. (2015) — IVs: use, privacy concern, privacy protection, age group (emerging adulthood, young adulthood, middle adulthood). DV: engagement, self-disclosure. Theory: boundary management. Data: a survey questionnaire given to Dutch-speaking users (n=508). Method: ANOVA.

Koohikamal et al. (2015) — IVs: social norm, perceived risk, perceived benefit, opinion leadership, attitude, incentives. DV: location disclosure. Theory: Unified Theory of Acceptance and Use of Technology 2 (UTAUT2). Data: a survey questionnaire (n=303). Method: PLS.

Osatuyi (2015) — IVs: computer anxiety, lurking, concern for information privacy. DV: intention to provide personal information. Theory: social penetration theory. Data: a survey questionnaire (n=250). Method: covariance-based structural equation modeling.


APPENDIX C

CONSTRUCT DEFINITIONS FOR ESSAY 2


Privacy concern (PC): The ideal level of control over personal information shared with other people on a SNA. Dimensions of privacy concern are collection, secondary use, improper access, and error. (Malhotra et al., 2004; Smith et al., 1996)

Perceived benefit (PB): The expected reward a user foresees from using a SNA. Value dimensions of perceived benefit are utilitarian and hedonic benefits. (Chen and Dubinsky, 2003; Overby and Lee, 2006)

Personal innovativeness in IT (PIIT): The willingness to try a new technology. (Agarwal and Prasad, 1998b; Jackson et al., 2013)

Intention to use (IU): The willingness to use a technology. (Lin and Lu, 2000)

Actual usage (AU): The real use of a technology, which can be explained by the total time spent or the number of times it is used. (Lin and Lu, 2000)

Self-disclosure (SD): The intentional sharing of self-information with other people within a social network. (Posey et al., 2010)


APPENDIX D

MEASUREMENT OF PRINCIPAL CONSTRUCTS FOR ESSAY 2


Privacy concern (Junglas et al., 2008a; Junglas et al., 2008b; Smith et al., 1996):
PC1: I am concerned [if/that] I am asked to provide my personal information on this app.
PC2: I am concerned [if/that] this app stores too much of my personal information.
PC3: I am concerned [if/that] there is the possibility of unauthorized access to databases that contain my information on this app.
PC4: I am concerned [if/that] unauthorized people can access my personal information on this app.
PC5: I am concerned [if/that] this app does not have thorough procedures to prevent errors in my personal information.
PC6: I am concerned [if/that] there are not enough features to double-check the accuracy of my personal information on this app.
PC7: I am concerned [if/that] this app uses my personal information for other purposes without getting my authorization.
PC8: I am concerned [if/that] this app shares my information without my consent.

Perceived benefit (Martins et al., 2014; Xu et al., 2009; Jiang et al., 2013; Zhou et al., 2015):
PB1: I can access the relevant services everywhere.
PB2: I am able to access the relevant information at the right place.
PB3: I can get the just-in-time information/services.
PB4: I can get an answer to my questions right away.
PB5: The interaction with other users [will fulfill/fulfills] my social needs in some way.
PB6: The interaction with other users [will help/helps] me cultivate a good relationship with the other party.
PB7: The use of this app [will give/gives] me pleasure.
PB8: Using this app [will make/makes] me feel good.

Intention to use (Venkatesh et al., 2003):
INT1: I intend to use this app in the future.
INT2: Given that I have access to this app, I predict that I would use it.
INT3: Assuming I have access to this app, I intend to use it.
INT4: I plan to use this app frequently.
INT5: I will use this app on a regular basis in the future.
INT6: I will try to use this app in my daily life.

Personal innovativeness in IT (Jackson et al., 2013; Lu et al., 2005):
PIIT1: I like to experiment with it.
PIIT2: Among my peers, I am usually the first to use it.
PIIT3: I would look for ways to experiment with it.
PIIT4: I am not hesitant to use it.

Initial use (Lin and Lu, 2000):
Use1: Total time in minutes spent in the app.
Use2: Number of times the app is used.

Self-disclosure (self-developed):
DISC1: Number of general personal information fields filled (e.g., nickname, hobbies, friends, etc.).
DISC2: Number of specific personal information fields filled (email, physical appearance, current workplace, etc.).

Note. For privacy concern and perceived benefit, different question wordings are used for phase 1 and phase 2.


APPENDIX E

FACTOR LOADINGS


Items    PB     PB_2   DISC   INT    PIIT   PC     PC_2   USE
PB1      0.85   0.19   0.12   0.49   0.32   0.15   -0.03  0.29
PB2      0.82   0.29   0.10   0.53   0.41   0.06   0.02   0.29
PB3      0.73   0.03   0.03   0.37   0.34   0.31   0.15   0.21
PB4      0.70   0.02   0.08   0.34   0.23   0.21   0.23   0.23
PB5      0.78   0.14   0.02   0.43   0.27   0.18   0.08   0.26
PB6      0.72   0.10   -0.03  0.28   0.21   0.17   -0.01  0.26
PB7      0.55   0.07   -0.02  0.16   0.18   0.28   -0.05  0.16
PB8      0.55   0.23   -0.01  0.15   0.21   0.27   -0.01  0.23
PB9      0.73   0.07   0.00   0.41   0.34   0.18   -0.13  0.28
PB10     0.76   0.07   0.01   0.33   0.23   0.10   -0.05  0.24
PB1_2    0.17   0.82   -0.07  0.20   0.18   0.00   0.06   0.33
PB2_2    0.13   0.82   0.01   0.10   0.08   -0.02  -0.01  0.22
PB3_2    0.16   0.88   0.03   0.13   0.10   0.06   0.02   0.35
PB4_2    0.16   0.81   0.05   0.14   0.17   0.04   0.05   0.31
PB5_2    0.21   0.74   0.02   0.21   0.24   0.06   0.14   0.28
PB6_2    0.09   0.84   0.19   0.04   0.05   -0.02  0.04   0.35
PB7_2    0.14   0.83   0.22   0.04   0.06   0.03   0.14   0.37
PB8_2    0.17   0.81   0.19   0.10   0.10   0.11   0.23   0.44
PB9_2    0.12   0.85   0.11   0.03   0.02   -0.06  0.06   0.37
PB10_2   0.10   0.80   0.12   -0.01  0.00   -0.04  0.04   0.36
DISC1    0.21   0.11   0.87   0.06   0.11   0.12   0.03   -0.04
DISC2    0.07   0.12   0.91   -0.03  0.01   -0.19  -0.18  0.22
INT1     0.49   0.08   0.01   0.93   0.50   0.15   0.05   0.24
INT2     0.53   0.15   0.01   0.86   0.47   0.15   0.13   0.32
INT3     0.52   0.03   -0.12  0.86   0.50   0.25   0.17   0.17
INT4     0.36   0.18   -0.06  0.85   0.36   0.09   0.21   0.27
INT5     0.45   0.05   0.02   0.91   0.40   0.19   0.11   0.26
INT6     0.40   0.11   -0.04  0.83   0.42   0.17   0.02   0.33
PIIT1    0.35   0.13   0.03   0.46   0.92   0.15   0.11   0.24
PIIT2    0.35   0.15   0.02   0.51   0.87   0.14   0.04   0.13
PIIT3    0.40   0.08   -0.02  0.47   0.86   0.21   0.11   0.17
PIIT4    0.23   0.02   0.02   0.25   0.77   0.03   0.00   0.03
PC1      0.12   0.04   -0.11  0.04   0.21   0.81   0.22   0.01
PC2      0.11   0.01   -0.14  0.12   0.23   0.86   0.34   -0.02
PC3      0.18   0.17   -0.08  0.10   0.10   0.67   0.22   0.07
PC4      0.11   0.04   -0.16  0.11   0.11   0.78   0.30   0.02
PC5      0.17   0.04   -0.20  0.19   0.17   0.87   0.33   0.08
PC6      0.20   0.02   -0.22  0.24   0.13   0.83   0.27   0.17
PC7      0.13   -0.03  -0.13  0.11   0.12   0.83   0.32   0.07
PC8      0.15   0.02   -0.19  0.10   0.09   0.87   0.29   -0.02
PC1_2    0.06   0.09   -0.15  0.12   0.10   0.43   0.91   0.07
PC2_2    0.07   0.13   -0.13  0.20   0.11   0.26   0.86   0.06
PC3_2    0.16   0.33   -0.04  0.16   0.15   0.25   0.71   0.18
PC4_2    0.08   0.19   -0.14  0.18   0.12   0.40   0.86   0.10
PC5_2    0.07   0.06   -0.13  0.08   0.07   0.41   0.77   -0.01
PC6_2    0.00   0.10   -0.02  0.04   0.03   0.27   0.81   0.09
PC7_2    0.02   0.04   -0.23  0.11   0.07   0.33   0.92   0.02
PC8_2    0.00   -0.01  -0.18  0.14   0.07   0.28   0.91   0.03
Use1     0.25   0.26   0.27   0.16   0.19   -0.13  -0.06  0.78
Use2     0.32   0.43   0.11   0.32   0.12   0.22   0.12   0.88
Note. PB = pre-perceived benefit, PC = pre-privacy concern, PB_2 = initial perceived benefit, PC_2 = initial privacy concern.


APPENDIX F

SELECTED PREVIOUS RESEARCH ON EMOTIONAL ATTACHMENT TO IS


Each entry lists: Authors — Independent variables (IVs); Dependent variables (DV); Theory; Context.

This study — IVs: initial perceived anonymity, initial self-disclosure, ongoing privacy concern, ongoing perceived engagement, actual ongoing engagement. DV: ongoing emotional attachment. Theory: attachment theory, contextual integrity of privacy. Context: social networking apps.

Chen and Shen (2015) — IVs: emotional support (attachment), informational support, community commitment, trust towards community, trust towards members. DV: social shopping intention, social sharing intention. Theory: commitment-trust theory, trust transfer theory. Context: a famous Chinese social commerce website.

Choi (2010) — IVs: mobile-phone attachment (including symbolism, fashion, possession, and needs). DV: intention to use. Theory: social cognitive theory. Context: mobile TV services.

Abouzahra et al. (2014) — IVs: emotional attachment, perceived mobile device benefits, relative advantage, complexity, compatibility, subjective norm, attitude, perceived behavioral control. DV: intention to use. Theory: theory of planned behavior, innovation diffusion theory. Context: mobile-enabled social networks.

Choi (2013) — IVs: relative classic visual aesthetics, relative expressive visual aesthetics, personalization, relative performance. DV: community participation intention. Theory: self-expansion theory. Context: web browsers.

Chopik and Peterson (2014) — IVs: mobile phone subscription rates, number of Facebook users. DV: attachment avoidance, attachment anxiety. Theory: N.A. Context: internet websites.

Fox and Warber (2014) — IVs: romantic attachment style, relationship uncertainty. DV: interpersonal electronic surveillance. Theory: attachment theory. Context: social networking sites.

Hart et al. (2015) — IVs: attachment styles (avoidance and anxiety). DV: patterns of engagement (active and restricted). Theory: attachment theory. Context: social networking sites.

Goel et al. (2011) — IVs: social awareness, location awareness, task awareness. DV: intention to return to the virtual world. Theory: interactionist theory of place attachment. Context: virtual world.

Cheung and Lee (2010) — IVs: subjective norm, group norm, social identity (cognitive, affective, evaluative). DV: we-intention to use social network sites. Theory: social influence theory. Context: social networking sites.

Ren et al. (2012) — IVs: identity-based attachment to the group, bond-based attachment to the individuals, attachment to the large community. DV: willingness to help the group, participation, retention, willingness to help individual members. Theory: social psychological theory. Context: online communities.

Kim et al. (2015) — IVs: attachment, customization, control. DV: enjoyment. Theory: self-determination theory. Context: web-based video game.


APPENDIX G

MEASUREMENT OF PRINCIPAL CONSTRUCTS FOR ESSAY 3


Privacy concern (Junglas et al., 2008; Zhou et al., 2015)
PC1: I am concerned that I am asked to provide my personal information on this app.
PC2: I am concerned that this app stores too much of my personal information.
PC3: I am concerned that there is the possibility of unauthorized access to databases that contain my information on this app.
PC4: I am concerned that unauthorized people can access my personal information on this app.
PC5: I am concerned that this app does not have thorough procedures to prevent errors in my personal information.
PC6: I am concerned that there are not enough features to double-check the accuracy of my personal information on this app.
PC7: I am concerned that this app uses my personal information for other purposes without getting my authorization.
PC8: I am concerned that this app shares my information without my consent.

Self-disclosure depth (self-developed)
DD: Number of specific personal information items disclosed (email, physical appearance, current workplace, etc.) divided by the total number of personal information items possible to disclose. [Fields possible to be filled: first name, last name, email address, nickname, age, gender, relationship status, current school, past schools, hobbies, name of my friends, picture of myself, current workplace, past workplace, life events, current living location, physical appearance, my mind, about me]

Perceived anonymity (Ayyagari et al., 2011)
ANON1: It is easy for me to hide my identity on this app.
ANON2: I can remain anonymous when using this app.
ANON3: It is easy for me to hide my usage of this app.
ANON4: It is difficult for others to identify my use of this app.

Perceived engagement with SNA (Kim et al., 2013; O'Brien and Toms, 2010)
ENG1: I am engaged with it.
ENG2: I recommend my engagement to someone else.
ENG3: I am really drawn into it.
ENG4: During my experience with it, I let myself go.
ENG5: When I use it, I lose track of time.

Emotional attachment (Choi, 2013)
ATC1: I feel emotionally connected to it.
ATC2: I feel I am personally connected to it.
ATC3: My feelings toward it come to my mind so naturally.
ATC4: I have a strong bond with it.
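The self-disclosure depth (DD) measure is a ratio rather than a Likert-scale item. As an illustrative sketch only (this code is not part of the dissertation; the function name `disclosure_depth` and the assumption that the denominator is the full list of 19 fillable fields are mine), the scoring could look like:

```python
# Fields possible to be filled, as listed in the DD measurement item.
PROFILE_FIELDS = [
    "first name", "last name", "email address", "nickname", "age", "gender",
    "relationship status", "current school", "past schools", "hobbies",
    "name of my friends", "picture of myself", "current workplace",
    "past workplace", "life events", "current living location",
    "physical appearance", "my mind", "about me",
]

def disclosure_depth(disclosed_fields):
    """Share of the possible profile fields a respondent actually filled in."""
    disclosed = set(disclosed_fields) & set(PROFILE_FIELDS)
    return len(disclosed) / len(PROFILE_FIELDS)

# Example: a respondent who filled in 4 of the 19 possible fields.
print(round(disclosure_depth(["first name", "age", "gender", "hobbies"]), 3))
```

A respondent who disclosed every listed field would score 1.0, and one who disclosed nothing would score 0.0, so DD is bounded on [0, 1].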


APPENDIX H

FACTOR LOADINGS FOR ESSAY 3


Construct                               Item       SA      EA      DD      ENG     PC
Initial perceived self-anonymity (SA)   ANON1      0.92    0.16   -0.12    0.45   -0.31
                                        ANON2      0.93    0.13   -0.05    0.41   -0.25
                                        ANON3      0.92    0.02    0.00    0.35   -0.36
                                        ANON4      0.92    0.12   -0.06    0.37   -0.22
Ongoing emotional attachment (EA)       ATC1       0.01    0.90   -0.15    0.41    0.21
                                        ATC2       0.12    0.96   -0.25    0.54    0.11
                                        ATC3       0.11    0.88   -0.28    0.44    0.11
                                        ATC4       0.18    0.95   -0.18    0.43    0.19
Initial self-disclosure depth (DD)      DD        -0.06   -0.24    1.00   -0.28    0.13
Ongoing perceived engagement (ENG)      ENG1       0.30    0.45   -0.20    0.82   -0.11
                                        ENG2       0.38    0.44   -0.24    0.81   -0.18
                                        ENG3       0.34    0.45   -0.29    0.89   -0.11
                                        ENG4       0.26    0.41   -0.22    0.87   -0.13
                                        ENG5       0.37    0.52   -0.25    0.90   -0.11
                                        ENG6       0.45    0.42   -0.24    0.89   -0.09
                                        ENG7       0.42    0.35   -0.25    0.87   -0.23
                                        ENG8       0.36    0.47   -0.28    0.87    0.01
                                        ENG9       0.42    0.35   -0.20    0.84   -0.19
Ongoing perceived privacy concern (PC)  PRI_ACC1  -0.31    0.15    0.06   -0.12    0.94
                                        PRI_ACC2  -0.35    0.06    0.16   -0.18    0.92
                                        PRI_ACC3  -0.38    0.07    0.04   -0.20    0.90
                                        PRI_COL1  -0.26    0.13    0.12   -0.14    0.89
                                        PRI_COL2  -0.31    0.19    0.12   -0.12    0.91
                                        PRI_COL3  -0.32    0.08    0.16   -0.20    0.80
                                        PRI_ERR1  -0.20    0.27    0.08   -0.12    0.87
                                        PRI_ERR2  -0.15    0.26    0.16   -0.13    0.84
                                        PRI_ERR3  -0.18    0.22    0.04   -0.08    0.82
                                        PRI_SEC1  -0.19    0.14    0.17   -0.10    0.88
                                        PRI_SEC2  -0.20    0.14    0.15    0.00    0.84
                                        PRI_SEC3  -0.23    0.17    0.06    0.01    0.86
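The criterion these loadings are meant to satisfy is the standard cross-loading check for discriminant validity: each item should load more strongly on its assigned construct than on any other construct. As a hedged illustration (not part of the original analysis; the item-to-construct mapping is inferred from the item prefixes, and the helper names are mine), the check can be sketched as:

```python
# A few rows transcribed from the factor-loading table (columns SA, EA, DD, ENG, PC).
loadings = {
    "ANON1":    {"SA": 0.92,  "EA": 0.16, "DD": -0.12, "ENG": 0.45,  "PC": -0.31},
    "ATC1":     {"SA": 0.01,  "EA": 0.90, "DD": -0.15, "ENG": 0.41,  "PC": 0.21},
    "ENG5":     {"SA": 0.37,  "EA": 0.52, "DD": -0.25, "ENG": 0.90,  "PC": -0.11},
    "PRI_ACC1": {"SA": -0.31, "EA": 0.15, "DD": 0.06,  "ENG": -0.12, "PC": 0.94},
}

# Item-to-construct assignment, inferred from the item prefixes.
own_construct = {"ANON1": "SA", "ATC1": "EA", "ENG5": "ENG", "PRI_ACC1": "PC"}

def passes_cross_loading(item):
    """True if the item's absolute loading on its own construct
    exceeds its absolute loading on every other construct."""
    own = own_construct[item]
    own_loading = abs(loadings[item][own])
    other_loadings = [abs(v) for c, v in loadings[item].items() if c != own]
    return own_loading > max(other_loadings)

for item in loadings:
    print(item, passes_cross_loading(item))
```

Running this over the four transcribed rows, every item passes, which is the pattern the full table above exhibits for all items.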


REFERENCES

Abouzahra, M., Yuan, Y., and Tan, J. 2014. "The Role of Perceived Mobile Device Benefits and Emotional Attachment in Enhancing the Use of Mobile-Enabled Social Networks," Thirteenth Annual Workshop on HCI Research in MIS, Auckland, New Zealand.

Acquisti, A., Brandimarte, L., and Loewenstein, G. 2015. "Privacy and Human Behavior in the Age of Information," Science (347:6221), pp. 509-514.

Acquisti, A., and Grossklags, J. 2005. "Privacy and Rationality in Individual Decision Making," IEEE Security & Privacy (3:1), pp. 26-33.

Acquisti, A., John, L. K., and Loewenstein, G. 2012. "The Impact of Relative Standards on the Propensity to Disclose," Journal of Marketing Research (49:2), pp. 160-174.

Adjerid, I., Acquisti, A., and Loewenstein, G. 2016. "Choice Architecture, Framing, and Layered Privacy Choices," working paper (April 14, 2016).

Agarwal, R., and Prasad, J. 1998a. "The Antecedents and Consequents of User Perceptions in Information Technology Adoption," Decision Support Systems (22:1), pp. 15-29.

Agarwal, R., and Prasad, J. 1998b. "A Conceptual and Operational Definition of Personal Innovativeness in the Domain of Information Technology," Information Systems Research (9:2), pp. 204-215.

Ahn, Y.-Y., Han, S., Kwak, H., Moon, S., and Jeong, H. 2007. "Analysis of Topological Characteristics of Huge Online Social Networking Services," Proceedings of the 16th international conference on World Wide Web: ACM, pp. 835-844.

Ahuja, M. K., and Thatcher, J. B. 2005. "Moving Beyond Intentions and toward the Theory of Trying: Effects of Work Environment and Gender on Post-Adoption Information Technology Use," MIS Quarterly (29:3), pp. 427-459.

Ainsworth, M. D. S., Blehar, M. C., Waters, E., and Wall, S. N. 2015. Patterns of Attachment: A Psychological Study of the Strange Situation, (Classic ed.). New York: Routledge, Taylor and Francis group, Psychology Press.

Ajzen, I. 1991. "The Theory of Planned Behavior," Organizational Behavior and Human Decision Processes (50:2), pp. 179-211.

Ajzen, I. 2011. "Theory of Planned Behavior," in Handbook of Theories of Social Psychology, P.A.M.V. Lange, A.W. Kruglanski and E.T. Higgins (eds.). VU University Amsterdam, Netherlands: p. 568.

Ajzen, I., and Madden, T. J. 1986. "Prediction of Goal-Directed Behavior: Attitudes, Intentions, and Perceived Behavioral Control," Journal of Experimental Social Psychology (22:5), pp. 453-474.

Alashoor, T., and Baskerville, R. 2015. "The Privacy Paradox: The Role of Cognitive Absorption in the Social Networking Activity," International Conference on Information Systems, Fort Worth.


Allison, D. S., El Yamany, H., and Capretz, M. 2009. "Metamodel for Privacy Policies within Soa," Workshop on Software Engineering for Secure Systems: IEEE Computer Society, pp. 40-46.

Aloudat, A., and Michael, K. 2011. "Toward the Regulation of Ubiquitous Mobile Government: A Case Study on Location-Based Emergency Services in Australia," Electronic Commerce Research (11:1), pp. 31-74.

Altman, I., and Taylor, D. 1973. Social Penetration: The Development of Interpersonal Relationships. New York: Holt, Rinehart & Winston.

Anderson, A. A., Delborne, J., and Kleinman, D. L. 2013. "Information Beyond the Forum: Motivations, Strategies, and Impacts of Citizen Participants Seeking Information During a Consensus Conference," Public Understanding of Science (22:8), pp. 955-970.

Anderson, C. L., and Agarwal, R. 2011. "The Digitization of Healthcare: Boundary Risks, Emotion, and Consumer Willingness to Disclose Personal Health Information," Information Systems Research (22:3), pp. 469-490.

Andrade, E. B., Kaltcheva, V., and Weitz, B. 2002. "Self-Disclosure on the Web: The Impact of Privacy Policy, Reward, and Company Reputation," Advances in Consumer Research (29), pp. 350-353.

Angst, C. M., and Agarwal, R. 2009. "Adoption of Electronic Health Records in the Presence of Privacy Concerns: The Elaboration Likelihood Model and Individual Persuasion," MIS Quarterly (33:2), pp. 339-370.

Ayalon, O., and Toch, E. 2013. "Retrospective Privacy: Managing Longitudinal Privacy in Online Social Networks," Proceedings of the Ninth Symposium on Usable Privacy and Security: ACM, p. 4.

Ayyagari, R., Grover, V., and Purvis, R. 2011. "Technostress: Technological Antecedents and Implications," MIS Quarterly (35:4), pp. 831-858.

Badrul, N. A., Williams, S. A., and Lundqvist, K. Ø. 2016. "Online Disclosure of Employment Information: Exploring Malaysian Government Employees' Views in Different Contexts," ACM SIGCAS Computers and Society (45:3), pp. 38-44.

Baek, K., Holton, A., Harp, D., and Yaschur, C. 2011. "The Links That Bind: Uncovering Novel Motivations for Linking on Facebook," Computers in Human Behavior (27:6), pp. 2243-2248.

Bano, M., and Zowghi, D. 2015. "A Systematic Review on the Relationship between User Involvement and System Success," Information and Software Technology (58), pp. 148-169.

Bansal, G., and Gefen, D. 2010. "The Impact of Personal Dispositions on Information Sensitivity, Privacy Concern and Trust in Disclosing Health Information Online," Decision Support Systems (49:2), pp. 138-150.

Bansal, G., Zahedi, F., and Gefen, D. 2010. "The Impact of Personal Dispositions on Information Sensitivity, Privacy Concern and Trust in Disclosing Health Information Online," Decision Support Systems (49:2), pp. 138-150.

Bansal, G., and Zahedi, F. M. 2015. "Trust Violation and Repair: The Information Privacy Perspective," Decision Support Systems (71), pp. 62-77.

Bansal, G., Zahedi, F. M., and Gefen, D. 2015. "The Role of Privacy Assurance Mechanisms in Building Trust and the Moderating Role of Privacy Concern," European Journal of Information Systems.

Barclay, D., Higgins, C., and Thompson, R. 1995. "The Partial Least Squares (Pls) Approach to Causal Modeling: Personal Computer Adoption and Use as an Illustration," Technology studies (2:2), pp. 285-309.

Barkhuus, L., and Dey, A. K. 2003. "Location-Based Services for Mobile Telephony: A Study of Users' Privacy Concerns," INTERACT: Citeseer, pp. 702-712.

Barnett, T., Pearson, A. W., Pearson, R., and Kellermanns, F. W. 2015. "Five-Factor Model Personality Traits as Predictors of Perceived and Actual Usage of Technology," European Journal of Information Systems (24:4), pp. 374-390.

Bateman, P. J., Pike, J. C., and Butler, B. S. 2011. "To Disclose or Not: Publicness in Social Networking Sites," Information Technology & People (24:1), pp. 78-100.

Belanger, F., and Xu, H. 2015. "The Role of Information Systems Research in Shaping the Future of Information Privacy," Information Systems Journal (25:6), pp. 573-578.

Bell, M. 2001. "Online Role-Play: Anonymity, Engagement and Risk," Educational Media International (38:4), pp. 251-260.

Bellman, S., Potter, R. F., Treleaven-Hassard, S., Robinson, J. A., and Varan, D. 2011. "The Effectiveness of Branded Mobile Phone Apps," Journal of Interactive Marketing (25:4), pp. 191-200.

Ben-Ze’ev, A. 2003. "Privacy, Emotional Closeness, and Openness in Cyberspace," Computers in Human Behavior (19:4), pp. 451-467.

Bhattacherjee, A. 2001. "Understanding Information Systems Continuance: An Expectation-Confirmation Model," MIS Quarterly (25:3), pp. 351-370.

Böhmer, M., Hecht, B., Schöning, J., Krüger, A., and Bauer, G. 2011. "Falling Asleep with Angry Birds, Facebook and Kindle: A Large Scale Study on Mobile Application Usage," Proceedings of the 13th international conference on Human computer interaction with mobile devices and services: ACM, pp. 47-56.

Bowlby, J. 1969. Attachment and Loss. New York: Basic books.

Bowlby, J. 1982a. "Attachment and Loss: Retrospect and Prospect," American journal of Orthopsychiatry (52:4), p. 664.

Bowlby, J. 1982b. A Secure Base: Clinical Applications of Attachment Theory. Taylor & Francis.

Boyd, D. M., and Ellison, N. B. 2007. "Social Network Sites: Definition, History, and Scholarship," Journal of Computer‐Mediated Communication (13:1), pp. 210-230.

Breaux, T. D., and Baumer, D. L. 2011. "Legally “Reasonable” Security Requirements: A 10-Year FTC Retrospective," Computers & Security (30:4), pp. 178-193.

Breaux, T. D., and Rao, A. 2013. "Formal Analysis of Privacy Requirements Specifications for Multi-Tier Applications," Requirements Engineering Conference (RE), 2013 21st IEEE International: IEEE, pp. 14-23.

Broeck, E. V. d., Poels, K., and Walrave, M. 2015. "Older and Wiser? Facebook Use, Privacy Concern, and Privacy Protection in the Life Stages of Emerging, Young, and Middle Adulthood," Social Media + Society (1:2), article 2056305115616149.

Bryce, J., and Klang, M. 2009. "Young People, Disclosure of Personal Information and Online Privacy: Control, Choice and Consequences," Information Security Technical Report (14:3), pp. 160-166.

Burke, M., Marlow, C., and Lento, T. 2010. "Social Network Activity and Social Well-Being," Proceedings of the SIGCHI Conference on Human Factors in Computing Systems: ACM, pp. 1909-1912.

Byrne, D. N. 2007. "Public Discourse, Community Concerns, and Civic Engagement: Exploring Black Social Networking Traditions on Blackplanet. Com," Journal of Computer‐Mediated Communication (13:1), pp. 319-340.

Callahan, M. E. 2012. "Handbook for Safeguarding Sensitive Personally Identifiable Information," United States Department of Homeland Security, Washington, DC, p. 30.

Cao, Q., Duan, W., and Gan, Q. 2011. "Exploring Determinants of Voting for the “Helpfulness” of Online User Reviews: A Text Mining Approach," Decision Support Systems (50:2), pp. 511-521.

Carpenter, A., and Greene, K. 2015. "Social Penetration Theory," in: The International Encyclopedia of Interpersonal Communication. pp. 1-4.

Castillo, C., Mendoza, M., and Poblete, B. 2013. "Predicting Information Credibility in Time-Sensitive Social Media," Internet Research (23:5), pp. 560-588.

Chang, Y. P., and Zhu, D. H. 2011. "Understanding Social Networking Sites Adoption in China: A Comparison of Pre-Adoption and Post-Adoption," Computers in Human behavior (27:5), pp. 1840-1848.

Chen, H.-G., Chen, C. C., Lo, L., and Yang, S. C. 2008. "Online Privacy Control Via Anonymity and Pseudonym: Cross-Cultural Implications," Behaviour & Information Technology (27:3), pp. 229-242.

Chen, J., and Shen, X.-L. 2015. "Consumers' Decisions in Social Commerce Context: An Empirical Investigation," Decision Support Systems (79:4), pp. 55-64.

Chen, R., Wang, J., Herath, T., and Rao, H. R. 2011. "An Investigation of Email Processing from a Risky Decision Making Perspective," Decision Support Systems (52:1), pp. 73-81.

Chen, Z., and Dubinsky, A. J. 2003. "A Conceptual Model of Perceived Customer Value in E-Commerce: A Preliminary Investigation," Psychology and Marketing (20), pp. 323-347.


Cheon, J., Lee, S., Crooks, S. M., and Song, J. 2012. "An Investigation of Mobile Learning Readiness in Higher Education Based on the Theory of Planned Behavior," Computers & Education (59:3), pp. 1054-1064.

Cheung, C., Lee, Z. W., and Chan, T. K. 2015. "Self-Disclosure in Social Networking Sites: The Role of Perceived Cost, Perceived Benefits and Social Influence," Internet Research (25:2), pp. 279-299.

Cheung, C. M., and Lee, M. K. 2010. "A Theoretical Model of Intentional Social Action in Online Social Networks," Decision Support Systems (49:1), pp. 24-30.

Chin, W. W. 1998. "Commentary: Issues and Opinion on Structural Equation Modeling," MIS Quarterly (22:1), pp. vii-xvi.

Chitturi, R., Raghunathan, R., and Mahajan, V. 2008. "Delight by Design: The Role of Hedonic Versus Utilitarian Benefits," Journal of Marketing (72:3), pp. 48-63.

Chiu, C. M., Wang, E. T., Fang, Y. H., and Huang, H. Y. 2014. "Understanding Customers' Repeat Purchase Intentions in B2c E‐Commerce: The Roles of Utilitarian Value, Hedonic Value and Perceived Risk," Information Systems Journal (24:1), pp. 85-114.

Chmielewski, D. 2015. "Apple Reports Record Adoption of iOS 9 Mobile Operating System." Re/code. Retrieved 09/21, 2015, from http://recode.net/2015/09/21/apple-reports-record-adoption-of-ios-9-mobile-operating-system/

Cho, D., Kim, S., and Acquisti, A. 2012. "Empirical Analysis of Online Anonymity and User Behaviors: The Impact of Real Name Policy," Hawaii International Conference on System Science (HICSS): IEEE, pp. 3041-3050.

Cho, Y., Jeon, S., and Choi, G. Y. 2010. "A Study on the Acceptance Factors of the Smart Phone," Applied Mechanics and Materials: Trans Tech Publ, pp. 762-767.

Choi, B. C., Jiang, Z., Xiao, B., and Kim, S. S. 2015. "Embarrassing Exposures in Online Social Networks: An Integrated Perspective of Privacy Invasion and Relationship Bonding," Information Systems Research (26:4), pp. 675-694.

Choi, N. 2013. "Information Systems Attachment: An Empirical Exploration of Its Antecedents and Its Impact on Community Participation Intention," Journal of the American Society for Information Science and Technology (64:11), pp. 2354-2365.

Choi, S. 2010. "Exploring Intention to Adopt Mobile Tv Service in the United States: Toward a New Model with Cognitive-Based and Emotional-Based Constructs." University of South Carolina.

Choi, Y. B., Capitan, K. E., Krause, J. S., and Streeper, M. M. 2006. "Challenges Associated with Privacy in Health Care Industry: Implementation of Hipaa and the Security Rules," Journal of medical systems (30:1), pp. 57-64.

Chopik, W. J., and Peterson, C. 2014. "Changes in Technology Use and Adult Attachment Orientation from 2002 to 2012," Computers in Human Behavior (38), pp. 208-212.

Christensen, C., and Prax, P. 2012. "Assemblage, Adaptation and Apps: Smartphones and Mobile Gaming," Continuum: Journal of Media & Cultural Studies (26:5), pp. 731-739.

Chu, S.-C., and Kim, Y. 2011. "Determinants of Consumer Engagement in Electronic Word-of-Mouth (Ewom) in Social Networking Sites," International journal of Advertising (30:1), pp. 47-75.

Contena, B., Loscalzo, Y., and Taddei, S. 2015. "Surfing on Social Network Sites: A Comprehensive Instrument to Evaluate Online Self-Disclosure and Related Attitudes," Computers in Human Behavior (49), pp. 30-37.

Cotton, H., and Bolan, C. 2012. "User Reaction Towards End User License Agreements on Android Smartphones," Proceedings of the International Conference on Security and Management (SAM): The Steering Committee of The World Congress in Computer Science, Computer Engineering and Applied Computing (WorldComp), p. 1.

Cottrill, C. D. 2015. "Location Privacy Preferences: A Survey-Based Analysis of Consumer Awareness, Trade-Off and Decision-Making," Transportation Research Part C: Emerging Technologies (56), pp. 132-148.

Crampton, S. C., and Betke, M. 2003. "Counting Fingers in Real Time: A Webcam-Based Human-Computer Interface with Game Applications," Proceedings of the Conference on Universal Access in Human-Computer Interaction (affiliated with HCI International 2003): Citeseer, pp. 1357-1361.

Cranor, L. F. 2012. "Necessary but Not Sufficient: Standardized Mechanisms for Privacy Notice and Choice," J. on Telecomm. & High Tech. L. (10), p. 273.

Cranor, L. F., Hoke, C., Leon, P. G., and Au, A. 2015. "Are They Worth Reading? An in-Depth Analysis of Online Trackers’ Privacy Policies," Journal of Law and Policy for the Information Society.

Cranor, L. F., Reagle, J., and Ackerman, M. S. 2000. "Beyond Concern: Understanding Net Users' Attitudes About Online Privacy," The Internet Upheaval: Raising Questions, Seeking Answers in Communications Policy, pp. 47-70.

Cresswell, K. M., Bates, D. W., and Sheikh, A. 2013. "Ten Key Considerations for the Successful Implementation and Adoption of Large-Scale Health Information Technology," Journal of the American Medical Informatics Association (20:e1), pp. e9-e13.

Culnan, M. J., and Williams, C. C. 2009. "How Ethics Can Enhance Organizational Privacy: Lessons from the Choicepoint and Tjx Data Breaches," MIS Quarterly (33:4), p. 6.

Davis, F. D. 1989. "Perceived Usefulness, Perceived Ease of Use, and User Acceptance of Information Technology," MIS Quarterly (13:3), pp. 319-340.

Dekhane, S., Xu, X., and Tsoi, M. Y. 2013. "Mobile App Development to Increase Student Engagement and Problem Solving Skills," Journal of Information Systems Education (24:4), p. 299.

Deuze, M. 2006. "Participation, Remediation, Bricolage: Considering Principal Components of a Digital Culture," The information society (22:2), pp. 63-75.

Dinev, T., Goo, J., Hu, Q., and Nam, K. 2009. "User Behaviour Towards Protective Information Technologies: The Role of National Cultural Differences," Information Systems Journal (19:4), pp. 391-412.

Dinev, T., and Hart, P. 2006. "An Extended Privacy Calculus Model for E-Commerce Transactions," Information Systems Research (17:1), pp. 61-80.

Dinev, T., McConnell, A. R., and Smith, H. J. 2015. "Research Commentary—Informing Privacy Research through Information Systems, Psychology, and Behavioral Economics: Thinking Outside the “Apco” Box," Information Systems Research (26:4), pp. 639 - 655.

Dinev, T., Xu, H., Smith, J. H., and Hart, P. 2013. "Information Privacy and Correlates: An Empirical Attempt to Bridge and Distinguish Privacy-Related Concepts," European Journal of Information Systems (22:3), pp. 295-316.

Dinner, I., Van Heerde, H. J., and Neslin, S. 2015. "Creating Customer Engagement Via Mobile Apps: How App Usage Drives Purchase Behavior," Available at SSRN 2669817.

Dwyer, C., Hiltz, S., and Passerini, K. 2007. "Trust and Privacy Concern within Social Networking Sites: A Comparison of Facebook and Myspace," AMCIS, p. 339.

East, M. L., and Havard, B. C. 2015. "Mental Health Mobile Apps: From Infusion to Diffusion in the Mental Health Social System," JMIR Mental Health (2:1), p. e10.

Edwards, A., Elwyn, G., Covey, J., Matthews, E., and Pill, R. 2001. "Presenting Risk Information a Review of the Effects of Framing and Other Manipulations on Patient Outcomes," Journal of health communication (6:1), pp. 61-82.

Ellison, N. B., Steinfield, C., and Lampe, C. 2007. "The Benefits of Facebook “Friends:” Social Capital and College Students’ Use of Online Social Network Sites," Journal of Computer‐Mediated Communication (12:4), pp. 1143-1168.

Ellison, N. B., Vitak, J., Gray, R., and Lampe, C. 2014. "Cultivating Social Resources on Social Network Sites: Facebook Relationship Maintenance Behaviors and Their Role in Social Capital Processes," Journal of Computer‐Mediated Communication (19:4), pp. 855-870.

Fedorikhin, A., Park, C. W., and Thomson, M. 2008. "Beyond Fit and Attitude: The Effect of Emotional Attachment on Consumer Responses to Brand Extensions," Journal of Consumer Psychology (18:4), pp. 281-291.

Felt, A. P., Ha, E., Egelman, S., Haney, A., Chin, E., and Wagner, D. 2012. "Android Permissions: User Attention, Comprehension, and Behavior," Proceedings of the Eighth Symposium on Usable Privacy and Security: ACM, p. 3.

Forest, A. L., and Wood, J. V. 2012. "When Social Networking Is Not Working: Individuals with Low Self-Esteem Recognize but Do Not Reap the Benefits of Self-Disclosure on Facebook," Psychological Science (23:3), pp. 295-302.

Fornell, C., and Larcker, D. F. 1981. "Evaluating Structural Equation Models with Unobservable Variables and Measurement Error," Journal of Marketing Research (JMR) (18:1), pp. 39-50.

Fox, J., and Warber, K. M. 2014. "Social Networking Sites in Romantic Relationships: Attachment, Uncertainty, and Partner Surveillance on Facebook," Cyberpsychology, Behavior, and Social Networking (17:1), pp. 3-7.

French, A. M., and Read, A. 2013. "My Mom's on Facebook: An Evaluation of Information Sharing Depth in Social Networking," Behaviour & Information Technology (32:10), pp. 1049-1059.

Furner, C. P., Racherla, P., and Babb, J. S. 2015. "What We Know and Do Not Know About Mobile App Usage and Stickiness: A Research Agenda," International Journal of E-Services and Mobile Applications (IJESMA) (7:3), pp. 48-69.

Galletta, D., Eargle, D., Janansefat, S., Kunev, D., and Singh, S. P. 2015. "Integrating Social and Economic Models of Responding to Privacy Messages in Mobile Computing: A Research Agenda," Proceedings of the 10th Pre-ICIS Workshop on Information Security and Privacy, Ft. Worth, TX: AIS.

Gerhart, N., and Koohikamali, M. 2015. "Anonymously Social Networking: A Great Migration," Americas Conference On Information Systems (AMCIS).

Gerlach, J., Widjaja, T., and Buxmann, P. 2015. "Handle with Care: How Online Social Network Providers’ Privacy Policies Impact Users’ Information Sharing Behavior," The Journal of Strategic Information Systems (24:1), pp. 33-43.

Gindin, S. E. 2009. "Nobody Reads Your Privacy Policy or Online Contract: Lessons Learned and Questions Raised by the Ftc's Action against Sears," Northwestern Journal of Technology and Intellectual Property (8), p. 37.

Goel, L., Johnson, N. A., Junglas, I., and Ives, B. 2011. "From Space to Place: Predicting Users' Intentions to Return to Virtual Worlds," MIS Quarterly (35:3), pp. 749-772.

Goel, S., and Chengalur-Smith, I. 2010. "Metrics for Characterizing the Form of Security Policies," The Journal of Strategic Information Systems (19:4), pp. 281-295.

Goffman, E. 1974. Frame Analysis: An Essay on the Organization of Experience. Harvard University Press.

Grange, C., and Benbasat, I. 2014. "Explaining Customers’ Utilitarian and Hedonic Perceptions in the Context of Product Search within Social Network-Enabled Shopping Websites," Thirteenth Annual Workshop on HCI Research in MIS (SIGHCI 2014), Auckland, New Zealand.

Green, T., Wilhelmsen, T., Wilmots, E., Dodd, B., and Quinn, S. 2016. "Social Anxiety, Attributes of Online Communication and Self-Disclosure across Private and Public Facebook Communication," Computers in Human Behavior (58), pp. 206-213.

Greenberg, M. A., and Stone, A. A. 1992. "Emotional Disclosure About Traumas and Its Relation to Health: Effects of Previous Disclosure and Trauma Severity," Journal of personality and social psychology (63:1), p. 75.

Greenhow, C., and Robelia, B. 2009. "Old Communication, New Literacies: Social Network Sites as Social Learning Resources," Journal of Computer‐Mediated Communication (14:4), pp. 1130-1161.

Grisaffe, D. B., and Nguyen, H. P. 2011. "Antecedents of Emotional Attachment to Brands," Journal of Business Research (64:10), pp. 1052-1059.

Grodzinsky, F., and Tavani, H. T. 2010. "Applying the “Contextual Integrity” Model of Privacy to Personal Blogs in the Blogosphere."

Gross, R., and Acquisti, A. 2005. "Information Revelation and Privacy in Online Social Networks," Proceedings of the 2005 ACM workshop on Privacy in the electronic society: ACM, pp. 71-80.

Gudelunas, D. 2012. "There’s an App for That: The Uses and Gratifications of Online Social Networks for Gay Men," Sexuality & Culture (16:4), pp. 347-365.

Hair, J. F., Tatham, R. L., Anderson, R. E., and Black, W. 2006. Multivariate Data Analysis. Pearson Prentice Hall Upper Saddle River, NJ.

Hair Jr, J. F., Hult, G. T. M., Ringle, C., and Sarstedt, M. 2013. A Primer on Partial Least Squares Structural Equation Modeling (Pls-Sem). SAGE Publications, Incorporated.

Han, K., Shih, P. C., Rosson, M. B., and Carroll, J. M. 2014. "Understanding Local Community Attachment, Engagement and Social Support Networks Mediated by Mobile Technology," Interacting with Computers, advance online publication (iwu040).

Hann, I.-H., Hui, K.-L., Lee, S.-Y. T., and Png, I. P. 2007. "Overcoming Online Information Privacy Concerns: An Information-Processing Theory Approach," Journal of Management Information Systems (24:2), pp. 13-42.

Hann, I.-H., Hui, K.-L., Lee, T., and Png, I. 2002. "Online Information Privacy: Measuring the Cost-Benefit Trade-Off," ICIS 2002 Proceedings, p. 1.

Harris, P. B., Brown, B. B., and Werner, C. M. 1996. "Privacy Regulation and Place Attachment: Predicting Attachments to a Student Family Housing Facility," Journal of Environmental Psychology (16:4), pp. 287-301.

Hart, J., Nailling, E., Bizer, G. Y., and Collins, C. K. 2015. "Attachment Theory as a Framework for Explaining Engagement with Facebook," Personality and Individual Differences (77), pp. 33-40.

Hassenzahl, M., and Tractinsky, N. 2006. "User Experience-a Research Agenda," Behaviour & information technology (25:2), pp. 91-97.

Hazan, C., and Shaver, P. 1987. "Romantic Love Conceptualized as an Attachment Process," Journal of personality and social psychology (52:3), p. 511.

Henseler, J., Ringle, C. M., and Sinkovics, R. R. 2009. "The Use of Partial Least Squares Path Modeling in International Marketing," Advances in international marketing (20), pp. 277-319.

Herrmann, D., and Lindemann, J. 2016. "Obtaining Personal Data and Asking for Erasure: Do App Vendors and Website Owners Honour Your Privacy Rights?," arXiv preprint arXiv:1602.01804).

Hiller, J. S., and Park, J.-M. J. 2014. "Spectrum Sharing and Privacy: A Research Agenda." Retrieved 04/11/2016, 2016, from http://www.arias.ece.vt.edu/pdfs/spectrumagenda.pdf


Hodge Jr, J. G., Gostin, L. O., and Jacobson, P. D. 1999. "Legal Issues Concerning Electronic Health Information: Privacy, Quality, and Liability," Journal of the American Medical Association (282:15), pp. 1466-1471.

Hogarth, R. M., and Einhorn, H. J. 1992. "Order Effects in Belief Updating: The Belief-Adjustment Model," Cognitive psychology (24:1), pp. 1-55.

Hong, J.-C., Hwang, M.-Y., Hsu, C.-H., Tai, K.-H., and Kuo, Y.-C. 2015. "Belief in Dangerous Virtual Communities as a Predictor of Continuance Intention Mediated by General and Online Social Anxiety: The Facebook Perspective," Computers in Human Behavior (48), pp. 663-670.

Hong, W., and Thong, J. Y. 2013. "Internet Privacy Concerns: An Integrated Conceptualization and Four Empirical Studies," MIS Quarterly (37:1), pp. 275-298.

Hu, N., Bose, I., Koh, N. S., and Liu, L. 2012. "Manipulation of Online Reviews: An Analysis of Ratings, Readability, and Sentiments," Decision Support Systems (52:3), pp. 674-684.

Huang, L.-Y., Hsieh, Y.-J., and Wu, Y.-C. J. 2014. "Gratifications and Social Network Service Usage: The Mediating Role of Online Experience," Information & Management (51:6), pp. 774-782.

Huckvale, K., Prieto, J. T., Tilney, M., Benghozi, P.-J., and Car, J. 2015. "Unaddressed Privacy Risks in Accredited Health and Wellness Apps: A Cross-Sectional Systematic Assessment," BMC medicine (13:1), p. 1.

Hui, K.-L., Teo, H. H., and Lee, S.-Y. T. 2007. "The Value of Privacy Assurance: An Exploratory Field Experiment," MIS Quarterly (31:1), pp. 19-33.

Hulland, J. 1999. "Use of Partial Least Squares (Pls) in Strategic Management Research: A Review of Four Recent Studies," Strategic Management Journal (20:2), pp. 195-204.

Imlawi, J., and Gregg, D. 2014. "Engagement in Online Social Networks: The Impact of Self-Disclosure and Humor," International Journal of Human-Computer Interaction (30:2), pp. 106-125.

Jackson, J. D., Mun, Y. Y., and Park, J. S. 2013. "An Empirical Test of Three Mediation Models for the Relationship between Personal Innovativeness and User Acceptance of Technology," Information & Management (50:4), pp. 154-161.

Jiang, Z., Heng, C. S., and Choi, B. C. 2013. "Research Note—Privacy Concerns and Privacy-Protective Behavior in Synchronous Online Social Interactions," Information Systems Research (24:3), pp. 579-595.

John, L. K., Acquisti, A., and Loewenstein, G. 2011. "Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information," Journal of Consumer Research (37:5), pp. 858-873.

Joinson, A. N., Houghton, D. J., Vasalou, A., and Marder, B. L. 2011. "Digital Crowding: Privacy, Self-Disclosure, and Technology," in Privacy Online. Springer, pp. 33-45.

Joinson, A. N., Reips, U.-D., Buchanan, T., and Schofield, C. B. P. 2010. "Privacy, Trust, and Self-Disclosure Online," Human–Computer Interaction (25:1), pp. 1-24.

Junglas, I., Abraham, C., and Watson, R. T. 2008a. "Task-Technology Fit for Mobile Locatable Information Systems," Decision Support Systems (45:4), pp. 1046-1057.

Junglas, I. A., Johnson, N. A., and Spitzmüller, C. 2008b. "Personality Traits and Concern for Privacy: An Empirical Study in the Context of Location-Based Services," European Journal of Information Systems (17:4), pp. 387-402.

Kahneman, D., and Tversky, A. 1979. "Prospect Theory: An Analysis of Decision under Risk," Econometrica (47:2), pp. 263-291.

Kang, H., Hahn, M., Fortin, D. R., Hyun, Y. J., and Eom, Y. 2006. "Effects of Perceived Behavioral Control on the Consumer Usage Intention of E‐Coupons," Psychology & Marketing (23:10), pp. 841-864.

Karahanna, E., Straub, D. W., and Chervany, N. L. 1999. "Information Technology Adoption across Time: A Cross-Sectional Comparison of Pre-Adoption and Post-Adoption Beliefs," MIS Quarterly (23:2), pp. 183-213.

Keith, M. J., Babb, J., Lowry, P. B., Furner, C., and Abdullat, A. 2013a. "The Roles of Privacy Assurance, Network Effects, and Information Cascades in the Adoption of and Willingness to Pay for Location-Based Services with Mobile Applications," working paper (June 30, 2013).

Keith, M. J., Thompson, S. C., Hale, J., Lowry, P. B., and Greer, C. 2013b. "Information Disclosure on Mobile Devices: Re-Examining Privacy Calculus with Actual User Behavior," International Journal of Human-Computer Studies (71:12), pp. 1163-1173.

Keren, G. 2012. "Framing and Communication: The Role of Frames in Theory and in Practice," Netspar panel paper (32).

Kiesler, S., Siegel, J., and McGuire, T. W. 1984. "Social Psychological Aspects of Computer-Mediated Communication," American psychologist (39:10), p. 1123.

Kim, D. J., Song, Y. I., Braynov, S. B., and Rao, H. R. 2005. "A Multidimensional Trust Formation Model in B-to-C E-Commerce: A Conceptual Framework and Content Analyses of Academia/Practitioner Perspectives," Decision Support Systems (40:2), pp. 143-165.

Kim, H.-W., Chan, H. C., and Chan, Y. P. 2007. "A Balanced Thinking–Feelings Model of Information Systems Continuance," International Journal of Human-Computer Studies (65:6), pp. 511-525.

Kim, K., Schmierbach, M. G., Chung, M.-Y., Fraustino, J. D., Dardis, F., and Ahern, L. 2015. "Is It a Sense of Autonomy, Control, or Attachment? Exploring the Effects of in-Game Customization on Game Enjoyment," Computers in Human Behavior (48), pp. 695-705.

Kim, S. S., and Malhotra, N. K. 2005. "A Longitudinal Model of Continued Is Use: An Integrative View of Four Mechanisms Underlying Postadoption Phenomena," Management science (51:5), pp. 741-755.


Kim, Y. H., Kim, D. J., and Wachter, K. 2013. "A Study of Mobile User Engagement (Moen): Engagement Motivations, Perceived Value, Satisfaction, and Continued Engagement Intention," Decision Support Systems (56), pp. 361-370.

Knijnenburg, B. P., and Kobsa, A. 2013. "Making Decisions About Privacy: Information Disclosure in Context-Aware Recommender Systems," ACM Transactions on Interactive Intelligent Systems (TiiS) (3:3), p. 20.

Koohikamali, M., Gerhart, N., and Mousavizadeh, M. 2015. "Location Disclosure on Lb-Snas: The Role of Incentives on Sharing Behavior," Decision Support Systems (71), pp. 78-87.

Koohikamali, M., and Kim, D. J. 2015. "Does Information Sensitivity Make a Difference? Mobile Applications’ Privacy Statements: A Text Mining Approach," Americas Conference on Information Systems, Puerto Rico: AIS.

Koohikamali, M., and Peak, D. 2015. "Location-Based Mobile Applications Usage Behavior: Beware the Power of the Dark Side," Americas Conference on Information Systems (AMCIS).

Kranz, M., Murmann, L., and Michahelles, F. 2013. "Research in the Large: Challenges for Large-Scale Mobile Application Research: A Case Study About NFC Adoption Using Gamification Via an App Store," International Journal of Mobile Human Computer Interaction (IJMHCI) (5:1), pp. 45-61.

Krasnova, H., Spiekermann, S., Koroleva, K., and Hildebrand, T. 2010. "Online Social Networks: Why We Disclose," Journal of Information Technology (25:2), pp. 109-125.

Kroenung, J., and Eckhardt, A. 2015. "The Attitude Cube: A Three-Dimensional Model of Situational Factors in IS Adoption and Their Impact on the Attitude-Behavior Relationship," Information & Management.

Ku, Y.-C., Chen, R., and Zhang, H. 2013. "Why Do Users Continue Using Social Networking Sites? An Exploratory Study of Members in the United States and Taiwan," Information & Management (50:7), pp. 571-581.

Kumar, R., Nivangune, A., and Joshi, P. 2015. "Challenges in Transition from Web to App," Proceedings of the 3rd International Workshop on Mobile Development Lifecycle: ACM, pp. 9-10.

Kwon, K., Barnett, G. A., and Chen, H. 2009. "Assessing Cultural Differences in Translations: A Semantic Network Analysis of the Universal Declaration of Human Rights," Journal of International and Intercultural Communication (2:2), pp. 107-138.

Laird, S. 2012. "Mobile Site or Mobile App: Which Should You Build First?" Infographic Retrieved 11/29, 2015, from http://mashable.com/2012/06/06/mobile-site-mobile-app-infographic/#6UvAyXk6Sgqt

Lankton, N. K., McKnight, D. H., Wright, R. T., and Thatcher, J. B. 2016. "Research Note—Using Expectation Disconfirmation Theory and Polynomial Modeling to Understand Trust in Technology," Information Systems Research (27:1), pp. 197-213.

Laufer, R., and Wolfe, M. 1977. "Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory," Journal of Social Issues (33:3), pp. 22-42.

Lea, M., Spears, R., and de Groot, D. 2001. "Knowing Me, Knowing You: Anonymity Effects on Social Identity Processes within Groups," Personality and Social Psychology Bulletin (27:5), pp. 526-537.

Lee, C.-K., Mjelde, J. W., Kim, T.-K., and Lee, H.-M. 2014. "Estimating the Intention–Behavior Gap Associated with a Mega Event: The Case of the Expo 2012 Yeosu Korea," Tourism Management (41), pp. 168-177.

Lee, C. S., and Ma, L. 2012. "News Sharing in Social Media: The Effect of Gratifications and Prior Experience," Computers in Human Behavior (28:2), pp. 331-339.

Lee, M.-C. 2009. "Factors Influencing the Adoption of Internet Banking: An Integration of TAM and TPB with Perceived Risk and Perceived Benefit," Electronic Commerce Research and Applications (8:3), pp. 130-141.

Leijdekkers, P., and Gay, V. 2015. "Improving User Engagement by Aggregating and Analysing Health and Fitness Data on a Mobile App," in Inclusive Smart Cities and E-Health. Springer, pp. 325-330.

Lenhart, A., Madden, M., Smith, A., and Macgill, A. 2009. "Teens and Social Media: An Overview," Washington, DC: Pew Internet and American Life Project.

Li, M., Zhu, H., Gao, Z., Chen, S., Ren, K., Yu, L., and Hu, S. 2013. "All Your Location Are Belong to Us: Breaking Mobile Social Networks for Automated User Location Tracking," arXiv preprint arXiv:1310.2547.

Li, Q., and Clark, G. 2013. "Mobile Security: A Look Ahead," IEEE Security & Privacy (11:1), pp. 78-81.

Liau, A. K., Khoo, A., and Hwaang, P. 2005. "Factors Influencing Adolescents Engagement in Risky Internet Behavior," CyberPsychology & Behavior (8:6), pp. 513-520.

Liccardi, I., Bulger, M., Abelson, H., Weitzner, D. J., and Mackay, W. 2014. "Can Apps Play by the COPPA Rules?," Privacy, Security and Trust (PST), 2014 Twelfth Annual International Conference on: IEEE, pp. 1-9.

Licorish, S. A., MacDonell, S. G., and Clear, T. 2015. "Analyzing Confidentiality and Privacy Concerns: Insights from Android Issue Logs," International Conference on Evaluation and Assessment in Software Engineering: ACM, p. 18.

Liebermann, Y., and Stashevsky, S. 2002. "Perceived Risks as Barriers to Internet and E-Commerce Usage," Qualitative Market Research: An International Journal (5:4), pp. 291-300.

Lin, S.-W., and Liu, Y.-C. 2012. "The Effects of Motivations, Trust, and Privacy Concern in Social Networking," Service Business (6:4), pp. 411-424.

Lips, A. M. B., and Eppel, E. A. 2016. "Understanding and Explaining Online Personal Information-Sharing Behaviours of New Zealanders: A New Taxonomy," Information, Communication & Society, pp. 1-16.


Literat, I. 2016. "Interrogating Participation across Disciplinary Boundaries: Lessons from Political Philosophy, Cultural Studies, Art, and Education," New Media & Society, p. 1461444816639036.

Liu, B., Lin, J., and Sadeh, N. 2014a. "Reconciling Mobile App Privacy and Usability on Smartphones: Could User Privacy Profiles Help?," International Conference on World Wide Web: International World Wide Web Conferences Steering Committee, pp. 201-212.

Liu, S., and Kuhn, R. 2010. "Data Loss Prevention," IT Professional (12:2), pp. 10-13.

Liu, Z., Shan, J., Bonazzi, R., and Pigneur, Y. 2014b. "Privacy as a Tradeoff: Introducing the Notion of Privacy Calculus for Context-Aware Mobile Applications," System Sciences (HICSS), 2014 47th Hawaii International Conference on: IEEE, pp. 1063-1072.

Livingstone, S. 2008. "Taking Risky Opportunities in Youthful Content Creation: Teenagers' Use of Social Networking Sites for Intimacy, Privacy and Self-Expression," New Media & Society (10:3), pp. 393-411.

Lossio-Ventura, J. A., Jonquet, C., Roche, M., and Teisseire, M. 2014. "Towards a Mixed Approach to Extract Biomedical Terms from Text Corpus," International Journal of Knowledge Discovery in Bioinformatics (4:1), pp. 1-15.

Lowry, P. B., Cao, J., and Everard, A. 2011. "Privacy Concerns Versus Desire for Interpersonal Awareness in Driving the Use of Self-Disclosure Technologies: The Case of Instant Messaging in Two Cultures," Journal of Management Information Systems (27:4), pp. 163-200.

Lowry, P. B., Gaskin, J., and Moody, G. 2015. "Proposing the Multi-Motive Information Systems Continuance Model (MISC) to Better Explain End-User System Evaluations and Continuance Intentions," Journal of the Association for Information Systems (16:7), pp. 515-579.

Lu, J., Yao, J. E., and Yu, C.-S. 2005. "Personal Innovativeness, Social Influences and Adoption of Wireless Internet Services Via Mobile Technology," The Journal of Strategic Information Systems (14:3), pp. 245-268.

Lwin, M., Wirtz, J., and Williams, J. D. 2007. "Consumer Online Privacy Concerns and Responses: A Power–Responsibility Equilibrium Perspective," Journal of the Academy of Marketing Science (35:4), pp. 572-585.

Lynn, M. R. 1986. "Determination and Quantification of Content Validity," Nursing Research (35:6), pp. 382-386.

Mahan, J. E., Seo, W. J., Jordan, J. S., and Funk, D. 2015. "Exploring the Impact of Social Networking Sites on Running Involvement, Running Behavior, and Social Life Satisfaction," Sport Management Review (18:2), pp. 182-192.

Mai, B., Menon, N. M., and Sarkar, S. 2010. "No Free Lunch: Price Premium for Privacy Seal-Bearing Vendors," Journal of Management Information Systems (27:2), pp. 189-212.


Malhotra, N. K., Kim, S. S., and Agarwal, J. 2004. "Internet Users' Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model," Information Systems Research (15:4), pp. 336-355.

Martínez-Pérez, B., De La Torre-Díez, I., and López-Coronado, M. 2015. "Privacy and Security in Mobile Health Apps: A Review and Recommendations," Journal of Medical Systems (39:1), pp. 1-8.

Massey, A. K., Eisenstein, J., Antón, A. I., and Swire, P. P. 2013. "Automated Text Mining for Requirements Analysis of Policy Documents," Requirements Engineering Conference (RE), 2013 21st IEEE International: IEEE, pp. 4-13.

McDonagh, E. M., Whirl-Carrillo, M., Garten, Y., Altman, R. B., and Klein, T. E. 2011. "From Pharmacogenomic Knowledge Acquisition to Clinical Applications: The PharmGKB as a Clinical Pharmacogenomic Biomarker Resource," Biomarkers in Medicine (5:6), pp. 795-806.

McDonald, A. M., and Cranor, L. F. 2008. "Cost of Reading Privacy Policies," Law and Policy for the Information Society (4), p. 543.

McDonald, A. M., Reeder, R. W., Kelley, P. G., and Cranor, L. F. 2009. "A Comparative Study of Online Privacy Policies and Formats," Privacy Enhancing Technologies: Springer, pp. 37-55.

Meade, P. T., and Rabelo, L. 2004. "The Technology Adoption Life Cycle Attractor: Understanding the Dynamics of High-Tech Markets," Technological Forecasting and Social Change (71:7), pp. 667-684.

Meng, W., Ding, R., Chung, S. P., Han, S., and Lee, W. 2016. "The Price of Free: Privacy Leakage in Personalized Mobile in-App Ads," NDSS (16), pp. 21-24.

Meschtscherjakov, A. 2009. "Mobile Attachment: Emotional Attachment Towards Mobile Devices and Services," Proceedings of the 11th International Conference on Human-Computer Interaction with Mobile Devices and Services: ACM, p. 102.

Miller, C. 2011. "Mobile Attacks and Defense," IEEE Security & Privacy (9:4), pp. 68-70.

Miller, N. E., and Dollard, J. 1950. Personality and Psychotherapy. New York, NY: McGraw-Hill.

Mintz, J. 2014. "The Role of User Emotional Attachment in Driving the Engagement of Children with Autism Spectrum Disorders (Asd) in Using a Smartphone App Designed to Develop Social and Life Skill Functioning," in Computers Helping People with Special Needs. Springer, pp. 486-493.

MobiForge. 2013. "Global Mobile Statistics 2013 Section E: Mobile Apps, App Stores, Pricing and Failure Rates." MobiThinking Retrieved 02/10, 2015, from http://mobiforge.com/research-analysis/global-mobile-statistics-2013-section-e-mobile-apps-app-stores-pricing-and-failure-rates#appusers

Montazemi, A. R., and Qahri-Saremi, H. 2015. "Factors Affecting Adoption of Online Banking: A Meta- Analytic Structural Equation Modeling Study," Information & Management (52:2), pp. 210-226.


Montesdioca, G., Hino, M., and Maçada, A. 2015. "The Information Privacy Concerns at the Organizational Level: An Exploratory Study in the Bank Sector," Americas Conference on Information Systems, Puerto Rico.

Moore, A. D. 2003. "Privacy: Its Meaning and Value," American Philosophical Quarterly (40:3), pp. 215-227.

Moore, G. A. 1999. Crossing the Chasm. Harper Perennial.

Morris, R. 1994. "Computerized Content Analysis in Management Research: A Demonstration of Advantages & Limitations," Journal of Management (20:4), pp. 903-931.

Mothersbaugh, D. L., Foxx, W. K., Beatty, S. E., and Wang, S. 2011. "Disclosure Antecedents in an Online Service Context: The Role of Sensitivity of Information," Journal of Service Research, p. 1094670511424924.

Mugge, R., Schifferstein, H. N., and Schoormans, J. P. 2006. "A Longitudinal Study on Product Attachment and Its Determinants," European Advances in Consumer Research (7), pp. 641-647.

Musthaler, L. 2013. "At Least 80% of Mobile Apps Have Security and Privacy Issues That Put Enterprises at Risk." IT best practices, 2016, from http://www.networkworld.com/article/2163225/infrastructure-management/at-least-80-of-mobile-apps-have-security-and-privacy-issues-that-put-ente.html

Myles, G., Friday, A., and Davies, N. 2003. "Preserving Privacy in Environments with Location-Based Applications," IEEE Pervasive Computing (2:1), pp. 56-64.

Nelson, D. 2015. "Mobile Apps Usage and Trends." Infographic Retrieved 09/14/2015, from http://destinapps.com/blog/2015/6/18/mobile-apps-usage-and-trends-infographic

Nissenbaum, H. 2004. "Privacy as Contextual Integrity," Wash. L. Rev. (79), p. 119.

Nissenbaum, H. 2010. "Privacy Rights in Context: Applying the Framework," in Privacy in Context: Technology, Policy, and the Integrity of Social Life. Stanford, CA: Stanford University Press, p. 288.

Norberg, P. A., Horne, D. R., and Horne, D. A. 2007. "The Privacy Paradox: Personal Information Disclosure Intentions Versus Behaviors," Journal of Consumer Affairs (41:1), pp. 100-126.

Nosko, A., Wood, E., and Molema, S. 2010. "All About Me: Disclosure in Online Social Networking Profiles: The Case of Facebook," Computers in Human Behavior (26:3), pp. 406-418.

Novak, D. 2016. "What Will Be the Biggest Mobile App Design Trends in 2016?" Trends, from https://www.quora.com/What-will-be-the-biggest-mobile-app-design-trends-in-2016

Nunnally, J. C., Bernstein, I. H., and Berge, J. M. t. 1967. Psychometric Theory. McGraw-Hill New York.

O'Brien, H. L., and Toms, E. G. 2008. "What Is User Engagement? A Conceptual Framework for Defining User Engagement with Technology," Journal of the American Society for Information Science and Technology (59:6), pp. 938-955.

O'Brien, H. L., and Toms, E. G. 2010. "The Development and Evaluation of a Survey to Measure User Engagement," Journal of the American Society for Information Science and Technology (61:1), pp. 50-69.

Olmstead, K., and Atkinson, M. 2015. "Apps Permissions in the Google Play Store," Pew Research Center, p. 37.

Osatuyi, B. 2015. "Is Lurking an Anxiety-Masking Strategy on Social Media Sites? The Effects of Lurking and Computer Anxiety on Explaining Information Privacy Concern on Social Media Platforms," Computers in Human Behavior (49), pp. 324-332.

Overby, J. W., and Lee, E.-J. 2006. "The Effects of Utilitarian and Hedonic Online Shopping Value on Consumer Preference and Intentions," Journal of Business Research (59:10), pp. 1160-1166.

Ozdemir, Z., Barron, J., and Bandyopadhyay, S. 2011. "An Analysis of the Adoption of Digital Health Records under Switching Costs," Information Systems Research (22:3), pp. 491-503.

Patil, S., Page, X., and Kobsa, A. 2011. "With a Little Help from My Friends: Can Social Navigation Inform Interpersonal Privacy Preferences?," Proceedings of the ACM 2011 conference on Computer supported cooperative work: ACM, pp. 391-394.

Patra, A., and Singh, D. 2013. "A Survey Report on Text Classification with Different Term Weighing Methods and Comparison between Classification Algorithms," International Journal of Computer Applications (75:7), pp. 14-18.

Pavlou, P. A. 2011. "State of the Information Privacy Literature: Where Are We Now and Where Should We Go?," MIS Quarterly (35:4).

Pearson, S. 2009. "Taking Account of Privacy When Designing Cloud Computing Services," Proceedings of the 2009 ICSE Workshop on Software Engineering Challenges of Cloud Computing: IEEE Computer Society, pp. 44-52.

Pedersen, P. E. 2005. "Adoption of Mobile Internet Services: An Exploratory Study of Mobile Commerce Early Adopters," Journal of Organizational Computing and Electronic Commerce (15:3), pp. 203-222.

Pergler, E., Glatz, D., and Kreuzer, E. 2013. "Privacy Challenges in Mobile Technology Acceptance Research," Central European Conference on Information and Intelligent Systems, Varaždin, Croatia.

Pfitzmann, A., and Köhntopp, M. 2001. "Anonymity, Unobservability, and Pseudonymity—a Proposal for Terminology," Designing Privacy Enhancing Technologies: Springer, pp. 1-9.

Pinsonneault, A., and Heppel, N. 1997. "Anonymity in Group Support Systems Research: A New Conceptualization, Measure, and Contingency Framework," Journal of Management Information Systems (14:3), pp. 89-108.

Posey, C., Lowry, P. B., Roberts, T. L., and Ellis, T. S. 2010. "Proposing the Online Community Self-Disclosure Model: The Case of Working Professionals in France and the UK Who Use Online Communities," European Journal of Information Systems (19:2), pp. 181-195.

Postmes, T., and Spears, R. 1998. "Deindividuation and Antinormative Behavior: A Meta-Analysis," Psychological Bulletin (123:3), p. 238.

Rainie, L., and Duggan, M. 2016. "Privacy and Information Sharing," Pew Research Center.

Rehm, J., Shield, K. D., Joharchi, N., and Shuper, P. A. 2012. "Alcohol Consumption and the Intention to Engage in Unprotected Sex: Systematic Review and Meta‐Analysis of Experimental Studies," Addiction (107:1), pp. 51-59.

Ren, Y., Harper, F. M., Drenner, S., Terveen, L. G., Kiesler, S. B., Riedl, J., and Kraut, R. E. 2012. "Building Member Attachment in Online Communities: Applying Theories of Group Identity and Interpersonal Bonds," MIS Quarterly (36:3), pp. 841-864.

Rogers, E. M. 1983. Diffusion of Innovations, (3rd ed.). New York: Free Press.

Rogers, E. M. 2010. Diffusion of Innovations, (4th ed.). Simon and Schuster.

Rudolph, S. 2015. "Mobile Apps Usage – Statistics and Trends." Digital and Social, 2016, from http://www.go-globe.com/blog/mobile-apps-usage/

Safran, C., Bloomrosen, M., Hammond, W. E., Labkoff, S., Markel-Fox, S., Tang, P. C., and Detmer, D. E. 2007. "Toward a National Framework for the Secondary Use of Health Data: An American Medical Informatics Association White Paper," Journal of the American Medical Informatics Association (14:1), pp. 1-9.

Salton, G., Wong, A., and Yang, C.-S. 1975. "A Vector Space Model for Automatic Indexing," Communications of the ACM (18:11), pp. 613-620.

Sánchez-Franco, M. J., Buitrago-Esquinas, E. M., and Yñiguez-Ovando, R. 2015. "What Drives Social Integration in the Domain of Social Network Sites? Examining the Influences of Relationship Quality and Stable and Dynamic Individual Differences," Online Information Review (39:1), pp. 5-25.

Sar, R. K., and Al-Saggaf, Y. 2014. "Contextual Integrity’s Decision Heuristic and the Tracking by Social Network Sites," Ethics and Information Technology (16:1), pp. 15-26.

Scanfeld, D., Scanfeld, V., and Larson, E. L. 2010. "Dissemination of Health Information through Social Networks: Twitter and Antibiotics," American Journal of Infection Control (38:3), pp. 182-188.

Schaffer, H. R., and Emerson, P. E. 1964. "The Development of Social Attachments in Infancy," Monographs of the Society for Research in Child Development, pp. 1-77.

Scheufele, D. A., and Tewksbury, D. 2007. "Framing, Agenda Setting, and Priming: The Evolution of Three Media Effects Models," Journal of Communication (57:1), pp. 9-20.

Schoeman, F. 1984. "Privacy: Philosophical Dimensions," American Philosophical Quarterly (21:3), pp. 199-213.

Schwartz, S. H. 1994. "Are There Universal Aspects in the Structure and Contents of Human Values?," Journal of Social Issues (50:4), pp. 19-45.

Scott, S. V., and Orlikowski, W. J. 2014. "Entanglements in Practice: Performing Anonymity through Social Media," MIS Quarterly (38:3), p. 22.

Sebastiani, F. 2002. "Machine Learning in Automated Text Categorization," ACM Computing Surveys (CSUR) (34:1), pp. 1-47.

Semple, J. L., Sharpe, S., Murnaghan, M. L., Theodoropoulos, J., and Metcalfe, K. A. 2015. "Using a Mobile App for Monitoring Post-Operative Quality of Recovery of Patients at Home: A Feasibility Study," JMIR mHealth and uHealth (3:1).

Senter, R., and Smith, E. 1967. "Automated Readability Index," DTIC Document.

Sheehan, K. B., and Hoy, M. G. 2000. "Dimensions of Privacy Concern among Online Consumers," Journal of Public Policy & Marketing (19:1), pp. 62-73.

Shi, Z., Rui, H., and Whinston, A. B. 2014. "Content Sharing in a Social Broadcasting Environment: Evidence from Twitter," MIS Quarterly (38:1), pp. 123-142.

Smith, A. 2014. "Half of Online Americans Don’t Know What a Privacy Policy Is." Numbers, facts, and trends shaping your world Retrieved 04/11, 2016, from http://www.pewresearch.org/fact-tank/2014/12/04/half-of-americans-dont-know-what-a-privacy-policy-is/

Smith, H. J., Dinev, T., and Xu, H. 2011. "Information Privacy Research: An Interdisciplinary Review," MIS Quarterly (35:4), pp. 989-1016.

Smith, H. J., Milberg, S. J., and Burke, S. J. 1996. "Information Privacy: Measuring Individuals' Concerns About Organizational Practices," MIS Quarterly (20:2), pp. 167-196.

Snekkenes, E. 2001. "Concepts for Personal Location Privacy Policies," Proceedings of the 3rd ACM conference on Electronic Commerce: ACM, pp. 48-57.

Solove, D. J. 2006. "A Taxonomy of Privacy," University of Pennsylvania Law Review (154:3), pp. 477-564.

Soper, T. 2014. "Study: Americans Spend 162 Minutes on Their Mobile Device Per Day, Mostly with Apps." GeekWire Retrieved 11/29, 2015, from http://www.geekwire.com/2014/flurry-report-mobile-phones-162-minutes/

Sproull, L., and Kiesler, S. 1986. "Reducing Social Context Cues: Electronic Mail in Organizational Communication," Management Science (32:11), pp. 1492-1512.

Squicciarini, A. C., Xu, H., and Zhang, X. L. 2011. "Cope: Enabling Collaborative Privacy Management in Online Social Networks," Journal of the American Society for Information Science and Technology (62:3), pp. 521-534.

Staddon, J., Huffaker, D., Brown, L., and Sedley, A. 2012. "Are Privacy Concerns a Turn-Off?: Engagement and Privacy in Social Networks," Proceedings of the eighth symposium on usable privacy and security: ACM, p. 10.

Statista. 2015. "Statistics and Facts About Mobile App Usage," The Statistics Portal.

Statista. 2016a. "Statistics and Facts About Mobile App Usage." Mobile App Usage, from http://www.statista.com/statistics/269025/worldwide-mobile-app-revenue-forecast/

Statista. 2016b. "Statistics and Facts About Mobile Social Networks." Mobile Social Networks - Statistics & Facts Retrieved 04/11, 2016, from http://www.statista.com/topics/2478/mobile-social-networks/

Stephenson, M. T., Hoyle, R. H., Palmgreen, P., and Slater, M. D. 2003. "Brief Measures of Sensation Seeking for Screening and Large-Scale Surveys," Drug and Alcohol Dependence (72:3), pp. 279-286.

Straub, E. T. 2009. "Understanding Technology Adoption: Theory and Future Directions for Informal Learning," Review of Educational Research (79:2), pp. 625-649.

Straub Jr, D. W., and Collins, R. W. 1990. "Key Information Liability Issues Facing Managers: Software Piracy, Proprietary Databases, and Individual Rights to Privacy," MIS Quarterly (14:2), pp. 143-156.

Stufflebeam, W., Bolchini, D., Earp, J. B., He, Q., and Jensen, C. 2004. "Financial Privacy Policies and the Need for Standardization," IEEE Security & Privacy (2:2), pp. 36-45.

Stutzman, F., and Hartzog, W. 2012. "Boundary Regulation in Social Media," Proceedings of the ACM 2012 conference on computer supported cooperative work: ACM, pp. 769-778.

Stutzman, F., Vitak, J., Ellison, N. B., Gray, R., and Lampe, C. 2012. "Privacy in Interaction: Exploring Disclosure and Social Capital in Facebook," ICWSM.

Sun, H. 2013. "A Longitudinal Study of Herd Behavior in the Adoption and Continued Use of Technology," MIS Quarterly (37:4), pp. 1013-1041.

Sun, Y., Wang, N., and Shen, X.-L. 2014. "Perceived Benefits, Privacy Risks, and Perceived Justice in Location Information Disclosure: A Moderated Mediation Analysis."

Sunyaev, A., Dehling, T., Taylor, P. L., and Mandl, K. D. 2015. "Availability and Quality of Mobile Health App Privacy Policies," Journal of the American Medical Informatics Association (22:e1), pp. e28-e33.

Tabachnick, B. G., and Fidell, L. S. 2001. Using Multivariate Statistics, (Fifth ed.). Pearson Education Limited.

Taddicken, M. 2014. "The ‘Privacy Paradox’ in the Social Web: The Impact of Privacy Concerns, Individual Characteristics, and the Perceived Social Relevance on Different Forms of Self‐Disclosure," Journal of Computer‐Mediated Communication (19:2), pp. 248-273.

Takahashi, T. 2010. "MySpace or Mixi? Japanese Engagement with SNS (Social Networking Sites) in the Global Age," New Media & Society.

Talib, S., Razak, A., Munirah, S., Olowolayemo, A., Salependi, M., Ahmad, N. F., Kunhamoo, S., and Bani, S. K. 2014. "Perception Analysis of Social Networks' Privacy Policy: Instagram as a Case Study," The 5th International Conference on Information and Communication Technology for The Muslim World (ICT4M): IEEE, pp. 1-5.

Taylor, D. G., and Levin, M. 2014. "Predicting Mobile App Usage for Purchasing and Information-Sharing," International Journal of Retail & Distribution Management (42:8), pp. 759-774.

Taylor, S., and Todd, P. A. 1995. "Understanding Information Technology Usage: A Test of Competing Models," Information Systems Research (6:2), pp. 144-176.

Thomson, M., MacInnis, D. J., and Park, C. W. 2005. "The Ties That Bind: Measuring the Strength of Consumers’ Emotional Attachments to Brands," Journal of Consumer Psychology (15:1), pp. 77-91.

Tian, X. 2016. "Network Domains in Social Networking Sites: Expectations, Meanings, and Social Capital," Information, Communication & Society (19:2), pp. 188-202.

Tow, W. N.-F. H., Dell, P., and Venable, J. 2010. "Understanding Information Disclosure Behaviour in Australian Facebook Users," Journal of Information Technology (25:2), pp. 126-136.

Trepte, S., and Reinecke, L. 2013. "The Reciprocal Effects of Social Network Site Use and the Disposition for Self-Disclosure: A Longitudinal Study," Computers in Human Behavior (29:3), pp. 1102-1112.

Trub, L., Revenson, T. A., and Salbod, S. 2014. "Getting Close from Far Away: Mediators of the Association between Attachment and Blogging Behavior," Computers in Human Behavior (41), pp. 245-252.

Tversky, A., and Kahneman, D. 1992. "Advances in Prospect Theory: Cumulative Representation of Uncertainty," Journal of Risk and Uncertainty (5:4), pp. 297-323.

Utz, S. 2015. "The Function of Self-Disclosure on Social Network Sites: Not Only Intimate, but Also Positive and Entertaining Self-Disclosures Increase the Feeling of Connection," Computers in Human Behavior (45), pp. 1-10.

Vail, M. W., Earp, J. B., and Antón, A. I. 2008. "An Empirical Study of Consumer Perceptions and Comprehension of Web Site Privacy Policies," IEEE Transactions on Engineering Management (55:3), pp. 442-454.

Valenzuela, S., Park, N., and Kee, K. F. 2009. "Is There Social Capital in a Social Network Site?: Facebook Use and College Students' Life Satisfaction, Trust, and Participation," Journal of Computer‐Mediated Communication (14:4), pp. 875-901.

Van der Heijden, H. 2003. "Factors Influencing the Usage of Websites: The Case of a Generic Portal in the Netherlands," Information & Management (40:6), pp. 541-549.

Vasalou, A., Joinson, A. N., and Courvoisier, D. 2010. "Cultural Differences, Experience with Social Networks and the Nature of “True Commitment” in Facebook," International Journal of Human-Computer Studies (68:10), pp. 719-728.


Vasalou, A., Oostveen, A. M., Bowers, C., and Beale, R. 2015. "Understanding Engagement with the Privacy Domain through Design Research," Journal of the Association for Information Science and Technology (66:6), pp. 1263-1273.

Venkatesh, V., and Davis, F. D. 2000. "A Theoretical Extension of the Technology Acceptance Model: Four Longitudinal Field Studies," Management Science (46:2), pp. 186-204.

Venkatesh, V., Morris, M. G., Davis, G. B., and Davis, F. D. 2003. "User Acceptance of Information Technology: Toward a Unified View," MIS Quarterly (27:3), pp. 425-478.

Vincent, J. 2006. "Emotional Attachment and Mobile Phones," Knowledge, Technology & Policy (19:1), pp. 39-44.

Viticci, F. 2015. "Apple Promoting “Great Games with No in-App Purchases” on App Store Front Page." MacStories Retrieved 02/23, from http://www.macstories.net/news/apple-promoting-great-games-with-no-in-app-purchases-on-app-store-front-page/

Webster, J., and Ahuja, J. S. 2006. "Enhancing the Design of Web Navigation Systems: The Influence of User Disorientation on Engagement and Performance," MIS Quarterly (30:3), pp. 661-678.

Wehmeyer, K. 2007. "Assessing Users' Attachment to Their Mobile Devices," International Conference on the Management of Mobile Business (ICMB 2007): IEEE, pp. 16-16.

Weible, R. J. 1993. "Privacy and Data: An Empirical Study of the Influence of Types of Data and Situational Context Upon Privacy Perceptions," in: Department of Business and Industry. Mississippi State University, p. 378.

Wellman, B. 2010. "The Reconstruction of Space and Time: Mobile Communication Practices," Contemporary Sociology: A Journal of Reviews (39:2), pp. 179-181.

Westin, A. F. 1968. "Privacy and Freedom," Washington and Lee Law Review (25:1), p. 166.

Westin, A. F. 2003. "Social and Political Dimensions of Privacy," Journal of Social Issues (59:2), pp. 431-453.

Wisniewski, P., Xu, H., Lipford, H., and Bello‐Ogunu, E. 2015. "Facebook Apps and Tagging: The Trade‐Off between Personal Privacy and Engaging with Friends," Journal of the Association for Information Science and Technology (66:9), pp. 1883-1896.

Xu, C., Peak, D., and Prybutok, V. 2015. "A Customer Value, Satisfaction, and Loyalty Perspective of Mobile Application Recommendations," Decision Support Systems (79), pp. 171-183.

Xu, F., Michael, K., and Chen, X. 2013a. "Factors Affecting Privacy Disclosure on Social Network Sites: An Integrated Model," Electronic Commerce Research (13:2), pp. 151-168.

Xu, H., Dinev, T., Smith, H. J., and Hart, P. 2008. "Examining the Formation of Individual's Privacy Concerns: Toward an Integrative View," ICIS 2008 Proceedings: AIS, p. 6.

Xu, H., Gupta, S., Rosson, M. B., and Carroll, J. M. 2012a. "Measuring Mobile Users' Concerns for Information Privacy," International Conference on Information Systems, Orlando: AIS, p. 16.

Xu, H., Luo, X. R., Carroll, J. M., and Rosson, M. B. 2011a. "The Personalization Privacy Paradox: An Exploratory Study of Decision Making Process for Location-Aware Marketing," Decision Support Systems (51:1), pp. 42-52.

Xu, H., Teo, H.-H., and Tan, B. 2005. "Predicting the Adoption of Location-Based Services: The Role of Trust and Perceived Privacy Risk," ICIS 2005 Proceedings, p. 71.

Xu, H., Teo, H.-H., Tan, B. C., and Agarwal, R. 2009. "The Role of Push-Pull Technology in Privacy Calculus: The Case of Location-Based Services," Journal of Management Information Systems (26:3), pp. 135-174.

Xu, H., Teo, H.-H., Tan, B. C., and Agarwal, R. 2012b. "Research Note-Effects of Individual Self-Protection, Industry Self-Regulation, and Government Regulation on Privacy Concerns: A Study of Location-Based Services," Information Systems Research (23:4), pp. 1342-1363.

Xu, Q., Erman, J., Gerber, A., Mao, Z., Pang, J., and Venkataraman, S. 2011b. "Identifying Diverse Usage Behaviors of Smartphone Apps," Proceedings of the 2011 ACM SIGCOMM conference on Internet measurement conference: ACM, pp. 329-344.

Xu, Y., Lin, M., Lu, H., Cardone, G., Lane, N., Chen, Z., Campbell, A., and Choudhury, T. 2013b. "Preference, Context and Communities: A Multi-Faceted Approach to Predicting Smartphone App Usage Patterns," Proceedings of the 2013 International Symposium on Wearable Computers: ACM, pp. 69-76.

Xue, M., Liu, Y., Ross, K. W., and Qian, H. 2015. "I Know Where You Are: Thwarting Privacy Protection in Location-Based Social Discovery Services," Computer Communications Workshops (INFOCOM WKSHPS), 2015 IEEE Conference on: IEEE, pp. 179-184.

Yechiam, E., and Hochman, G. 2013. "Losses as Modulators of Attention: Review and Analysis of the Unique Effects of Losses over Gains," Psychological Bulletin (139:2), p. 497.

Young, D., Beebe, N., and Chang, F. 2012. "Prospect Theory and Information Security Investment Decisions," Americas Conference on Information Systems, Seattle, Washington.

Young, J. D., and Antón, A. I. 2010. "A Method for Identifying Software Requirements Based on Policy Commitments," Requirements Engineering Conference (RE), 2010 18th IEEE International: IEEE, pp. 47-56.

Yu, J., Hu, P. J.-H., and Cheng, T.-H. 2015. "Role of Affect in Self-Disclosure on Social Network Websites: A Test of Two Competing Models," Journal of Management Information Systems (32:2), pp. 239-277.

Yuksel, A., Yuksel, F., and Bilim, Y. 2010. "Destination Attachment: Effects on Customer Satisfaction and Cognitive, Affective and Conative Loyalty," Tourism Management (31:2), pp. 274-284.

Yun, Y. 2013. "Understanding Product Attachment and Expected Product Lifetime by Extending Technology Acceptance Model (TAM) with Product Personalization and Innovation Diffusion Theory (IDT)," in: Media and Information Studies. Michigan State University, p. 133.


Zarsky, T. 2008. "Law and Online Social Networks: Mapping the Challenges and Promises of User-Generated Information Flows," Fordham Intellectual Property, Media & Entertainment Law Journal (18:3).

Zhang, M., Guo, L., Hu, M., and Liu, W. 2016. "Influence of Customer Engagement with Company Social Networks on Stickiness: Mediating Effect of Customer Value Creation," International Journal of Information Management.

Zhang, P. 2013. "The Affective Response Model: A Theoretical Framework of Affective Concepts and Their Relationships in the ICT Context," MIS Quarterly (37:1), pp. 247-274.

Zhou, T., Lu, Y., and Wang, B. 2010. "Integrating TTF and UTAUT to Explain Mobile Banking User Adoption," Computers in Human Behavior (26:4), pp. 760-767.

Zhou, Z., Jin, X.-L., Fang, Y., and Vogel, D. 2015. "Toward a Theory of Perceived Benefits, Affective Commitment, and Continuance Intention in Social Virtual Worlds: Cultural Values (Indulgence and Individualism) Matter," European Journal of Information Systems (24:3), pp. 247-261.

Zimmer, J. C., Arsal, R., Al-Marzouq, M., Moore, D., and Grover, V. 2010. "Knowing Your Customers: Using a Reciprocal Relationship to Enhance Voluntary Information Disclosure," Decision Support Systems (48:2), pp. 395-406.
