<<

This article is part of a Research Dialogue: Krishna (2020): https://doi.org/10.1002/jcpy.1186 Oyserman & Schwarz (2020): https://doi.org/10.1002/jcpy.1189 Mulligan et al. (2020): https://doi.org/10.1002/jcpy.1190 Jagadish (2020): https://doi.org/10.1002/jcpy.1188 Acquisti et al. (2020): https://doi.org/10.1002/jcpy.1187

Secrets and Likes: The Drive for and the Difficulty of Achieving It in the Digital Age

Alessandro Acquisti Laura Brandimarte Carnegie Mellon University University of Arizona

George Loewenstein Carnegie Mellon University

Accepted by Associate Editor, Aradhna Krishna

We review different streams of social science literature on privacy with the goal of understanding consumer pri- vacy decision making and deriving implications for policy. We focus on psychological and economic factors influencing both consumers’ desire and consumers’ ability to protect their privacy, either through individual action or through the implementation of regulations applying to firms. Contrary to depictions of online sharing behaviors as careless, we show how consumers fundamentally care about online privacy, and present evidence of numerous actions they take to protect it. However, we also document how prohibitively difficult it is to attain desired, or even desirable, levels of privacy through individual action alone. The remaining instrument for privacy protection is policy intervention. However, again for both psychological and economic reasons, the collective impetus for adequate intervention is often countervailed by powerful interests that oppose it. Keywords Economic psychology; Public policy issues; Social networks and ; Privacy; Privacy paradox

Introduction targeted ads intended to influence voting in the upcoming US presidential elections. When this In the summer of 2016, a app featuring a scheme was revealed, outrage ensued (Rosenberg personality test funneled millions of consumers’ et al., 2018). Many users who had filled out the test to political operatives for use in had not realized whom their data could be shared with or for what purpose (Chen, 2018). Yet, the Received 2 June 2020; accepted 2 August 2020 threatened mass exodus from, and regulation of, Available online 06 September 2020 social media never came (Wong, 2019). Four years We thank Aradhna Krishna and two anonymous reviewers for later, little seems to have changed. their sharp and insightful guidance. We thank Ross Anderson, Julie Cohen, Jessica Colnago, Lorrie Cranor, Daphne Chang, The Cambridge Analytica scandal encapsulates Judith Donath, Alisa Frik, Jim Graves, Chris Hoofnagle, Wolf- the modern privacy problem: seemingly carefree gang Kerber, Steve Margulis, Kirsten Martin, Eyal Peer, Sandra online sharing behaviors; concerns with misuses of Petronio, Elissa Redmiles, Florian Schaub, Daniel Solove, and — Daniel Smullen for their time and their helpful comments. Some data, quickly forgotten; and staggering and yet so arguments in The Supply Side of Privacy are based on remarks hard to demonstrate or quantify—individual and prepared by one of the authors for the Atlanta FED 2019 Confer- societal consequences. To make sense of this ence on Markets and the Economy. Acquisti gratefully acknowl- edges support via a grant from the Alfred P. Sloan Foundation. Correspondence concerning this article should be addressed to Alessandro Acquisti, Heinz College, Carnegie Mellon University, © 2020 Society for Consumer Psychology Pittsburgh, PA, USA. Electronic mail may be sent to: All rights reserved. 1057-7408/2020/1532-7663/30(4)/736–758 [email protected]. DOI: 10.1002/jcpy.1191 Secrets and Likes 737 problem, we review diverse streams of social such as those in the United States, that rely on “no- science literature and take account of both the psy- tice and consent” strategies and consumer “respon- chological characteristics of individuals as well as sibilization,” have failed to secure either desired, or the economic environment in which they attempt to desirable, levels of privacy, and that a combination navigate privacy issues. of psychological and economic hurdles explains An often-repeated claim is that, under the assault otherwise seemingly paradoxical patterns of privacy of novel information technologies, privacy is dead choice (The Privacy “Paradox?”). —both in the sense that it is no longer achievable Given that people should and do care about pri- and that concern about privacy is fading. In Do vacy, but cannot adequately manage it, what Consumers Care About Privacy?, we provide evi- should be done? The inescapable conclusion, we dence against the latter assertion, including survey argue in What Should Be Done?, is that the restora- responses, field observations, and experimental tion of privacy can only be a matter of public pol- results, all showing that consumers fundamentally icy. And yet, while numerous regulatory efforts care about privacy and often act on that concern. have taken place around the world, economic and Actions to protect privacy are in fact so ubiquitous psychological factors again explain why adequate and second nature that they often go unnoticed. levels of protection remain hard to attain. If people do care about privacy and take actions While these conclusions may appear pessimistic, to protect it, can they do so effectively in an we end by highlighting how the resiliency—and increasingly digitized world? And, should they? In apparent universality—of a human drive for pri- Can Individuals Effectively Manage Privacy vacy provides hope for a future that balances pri- Online? and The Supply Side of Privacy, we discuss vacy with sharing and data utility. We offer how, while the desire for privacy may be wide- suggestions on research directions to investigate spread, consumers’ ability to reach desired or even that balance. desirable levels of privacy is far less so. Can Individuals Effectively Manage Privacy Online? focuses on psychological and behavioral Do Consumers Care About Privacy? factors limiting privacy attainment. Although con- sumers display a concern about privacy in a wide Do consumers care about, and act to protect, their array of behaviors, there are broad realms of life in privacy? Among Americans, the evidence for ele- which such concern is difficult to discern. Some vated privacy concerns has been ample and endur- motivations—such as the powerful desire to share ing. It can be found in Alan Westin’s seminal information with other people—counteract the surveys from the last century, which identified an coexisting desire for protection. And some other overwhelming majority of Americans as troubled features of human psychology—such as the ten- by potential misuses of their information dency to adapt to a gradually evolving situation, (Westin, 2001), up to the most recent polls, showing glean social norms from other people’s behavior, a majority of Americans concerned about data har- and take greater risks when one feels in control— vesting by corporations (Pew, 2019). Yet, a common can reduce or even neutralize privacy concerns. response to the above question is that, although The Supply Side of Privacy focuses on economic people say that they care, they do not actually do— factors that explain the difficulty, or even inability, as proven by seemingly careless online disclosures of consumers to satisfy their desire for privacy—or, (Johnson, 2020; Miller, 2014). Privacy is no longer a when such desire is absent but would be justified, social norm, the narrative goes, or, in fact, privacy in achieving beneficial levels of privacy. We also is dead (Sprenger, 1999). consider how the economic side of the equa- We disagree. We show in this section that con- tion overlaps with the psychological side—that is, sumers do not just say they care about privacy, but the different ways that -age firms exploit in fact often take action to protect it. Surveys, field consumers’ psychological propensities to disclose. studies, and experiments—as well as common sense Furthermore, while some have proposed that the —show that consumers engage in privacy-regulat- benefits that have accompanied the loss of privacy ing behaviors continually and in both online and ultimately justify the costs, we examine whether the offline scenarios, crossing the many diverse dimen- untroubled acceptance of the erosion of privacy sions and definitions of privacy (Solove, 2005), from may ignore real costs and consequences that threa- spatial to informational. The drive for privacy is ten not only individuals, but also society. We con- under-appreciated in part because individual clude that market-based approaches to privacy, actions to protect privacy are so ubiquitous and 738 Acquisti et al. second nature that they go unnoticed, or are not steps were diverse, from less to more sophisticated, construed as privacy behaviors. including clearing cookies, encrypting , avoid- ing using their name, and using virtual private net- works. The Evidence for Privacy-Seeking Behavior Numerous other surveys and interview reports In our everyday lives in the offline world, we provide similar evidence. For instance, a 2012 sur- instinctively and continually engage in privacy vey (Pew, 2012) found that the majority (58%) of behaviors without even thinking: lowering our social network site users had restricted access to voice during intimate conversations; leaving a their profiles. Another Pew (2015) survey found group of people to take a personal call; tilting a that among respondents who had heard of govern- document we are reading so it is protected from ment surveillance, 34% had changed the way they prying eyes; and drawing curtains to ensure pri- communicated online, including avoiding using cer- vacy in our bedrooms. Altman (1975) argued that tain terms or uninstalling apps from their devices. privacy behaviors are so ubiquitous and common In 2019, a survey of over 2,600 adult respondents in that they occur with little conscious awareness. 12 of the world’s largest economies conducted by Writing decades before the Internet age—and so, CISCO found that one third of them had switched focusing on personal space rather than information companies or providers over their data policies or —he noted that protection of personal space is data sharing practices (CISCO, 2019). A survey of instinctive and universal, across time and geogra- 1,500 Americans by Instamotors found, in 2017, that phy: Transgressions of personal space invoke a 89% of respondents had taken at least one step to variety of reactions, including a shifting of gaze, protect data—including changing passwords, not breaking eye contact, turning the body, or adopting displaying personal information on social media, protective postures (Altman, 1977). using password generators, covering laptop cam- Altman’s insights apply also to how people inter- eras when not in use, or using VPNs when brows- act on the Internet (Palen & Dourish, 2003). Online, ing (Lewis, 2017). And a survey of 5,710 US too, we engage continuously in behaviors that deli- participants conducted by DuckDuckGo in 2017 mit the contours of our closeness or openness to found that 46% had used “private browsing” (a others. Multiple times per day, we alternate browser setting to remove certain traces of brows- between different email accounts or online personae ing activities from a computer) at least once. In an to separate personal from professional spheres; pick in-depth interview study, Kang et al. (2015) found privacy settings to manage the visibility of our that 77% of nontechnical participants reported tak- social media posts; reply privately to group mes- ing some action to hide or delete their digital traces, sages, carefully selecting recipients for our including using anonymous search engines and responses; enter (or rely on previously stored) pass- blockers for cookies and other trackers. And recent words to keep information in our online accounts work on reactions to contact tracing suggests that private; set "I am busy!" notices on instant messag- 53% of the US population may be unwilling to ing profiles to tell people not to contact us, right install an app that would provide information to now; and turn on and off camera or audio on con- public health officials to track the spread of ference calls, depending on what we want to (or SARS‑CoV‑2 (KFF, 2020). must) show to others. The precise motivations behind these behaviors are diverse; their common Field studies trait is the individual’s attempt to regulate the boundaries of their interactions with others. Self-reports do not always reflect actual behavior. Observations of online privacy-seeking behaviors Yet, observational field studies of consumer choice go well beyond the anecdotal. In the next subsec- also provide evidence of privacy-seeking behavior. tions, we catalogue nonexhaustive examples of The behaviors encompass the many diverse scenar- research evidence. ios in which privacy (and privacy invasions) plays a role in consumers’ lives: telemarketing annoyance, government surveillance, online tracking, social Surveys media intrusions, and so on. Consider a seminal survey by Pew (2013): It For instance, by 2007, after the Federal Trade found that an overwhelming majority (86%) of sur- Commission had opened the National Do Not Call veyed US adults reported having taken steps online Registry (a database with the telephone numbers of to remove or mask their digital footprints. The individuals who do not want telemarketers to Secrets and Likes 739 contact them), 72% of Americans had registered on anonymous gift card for a $12 trackable one— the list (Bush, 2009). And following the revelations essentially refusing a 20% increase in rewards to made in 2013 regarding secret give away information on their purchasing deci- government surveillance programs, US consumers sions (Acquisti et al., 2013). When information became less likely to read Wikipedia articles per- about sellers’ privacy policies was made salient and ceived as privacy sensitive (Penney, 2016). understandable, subjects in a laboratory were will- Consumers try to elude online tracking in many ing to pay roughly a 4% premium to purchase from different ways. Forty-one percent of 451 partici- more privacy-oriented sellers (Tsai et al., 2011). In a pants in an academic study were found to have survey-based experiment, making privacy policies used a feature called “private browsing” (which salient reduced subjects’ disclosure of personal allows hiding some information about one’s brows- information, regardless of whether those policies ing behaviors) at least once (Habib et al., 2018). A were protective or intrusive of one’s privacy (Mar- recent study of a group of consumers’ consent to reiros et al., 2017). Online shoppers in a field exper- firm targeting following the enactment of the Euro- iment were willing to pay approximately one euro pean General Data Protection Regulation (GDPR) to keep their mobile phone number private found that opt-in consent to the collection of intru- (Jentzsch et al., 2012). In an online experiment in sive data such as location information was as low which subjects were first asked to answer survey as 5.5% (Godinho de Matos & Adjerid, 2019). questions in exchange for monetary rewards, and Social media users employ a variety of other later asked to share their Facebook profiles with the strategies to carve out private spaces in online pub- researchers for an additional bonus, a majority were lic fora, including self-censoring (Vitak & not willing to share their data for $2.50 Ellison, 2013), “social steganography” (the act of (Svirsky, 2019). In a choice experiment, subjects hiding information in plain sight, by encoding mes- were willing to make one-time payments ranging sages for friends in otherwise public channels in from around $1 to over $4 in order to prevent each ways that will allow only the intended recipients to smartphone app from accessing information such as discern their true meaning; Boyd & Marwick, 2011), browsing history, contacts, location, or texts (Sav- and selective sharing (Vitak & Kim, 2014). For age & Waldman, 2015). instance, Facebook users who were members of the Carnegie Mellon University’s network in 2005 pro- A note, about samples gressively transitioned toward less public sharing of personal information over time: While 86% of Most of the examples above are based on those users were publicly sharing their date of birth WEIRD (Western, Educated, Industrialized, Rich, in 2005, the percentage of them doing so decreased and Democratic) samples (Henrich et al., 2010). year by year, down to 22% in 2009 (Stutzman However, the evidence of “privacy”-seeking behav- et al., 2013). Additionally, while only a miniscule iors across cultures and historical periods is ample proportion of Facebook users on that same Face- (Moore, 1984; Murphy, 1964), from the use of secret book network had altered their (highly visible) paths in the woods by some tribes in Brazil, to default search and visibility settings in 2005 (Gross Javanese people hiding their emotions and speaking & Acquisti, 2005), just about 7% of Facebook users softly, and from the rearranging of the huts by the studied by Fiesler et al. (2017) had not changed Pygmies of Zaire in response to the arrival of new their default privacy settings by 2017. While the people to the camp, to the covering of almost their two samples are different, and self-selection cannot entire face by the Tuareg of Northern Africa (Alt- be excluded, the disparity in choices over time is man, 1977). As Altman concluded, privacy is simul- stark. taneously culturally specific and culturally universal. Experiments Acknowledging Complexities, and Reconciling the Evidence of privacy-seeking behavior arises from Evidence laboratory and field experiments as well. Many of them focus on exchanges involving privacy of data Although we have deliberately highlighted evi- and monetary rewards. For instance, a field experi- dence of privacy-seeking behaviors to counter a ment testing for a gap in willingness to pay/will- prevailing narrative of disappearing concerns, ingness to accept for privacy showed that over 50% there also exists ample evidence of people not of participants were not willing to exchange a $10 bothering to protect information, or engaging 740 Acquisti et al. publicly in behaviors only a short time ago con- with strangers between 2005 and 2009, changes sidered highly private. There is evidence of con- in the Facebook interface in late 2009 and early sumers being unwilling to pay for data protection 2010, affecting the visibility of certain profile (Beresford et al., 2012; Preibusch et al., 2013), and fields, abruptly reversed that protective trend, choosing to give up personal data in exchange making public again, for a significant portion of for small convenience and small rewards (Athey users, personal information those users had et al., 2017). Although by now it is well known attempted to keep private (Stutzman that mobile apps collect and share sensitive infor- et al., 2013). Likewise, although a DuckDuckGo mation with third parties (Almuhimedi survey did find that a substantial proportion of et al., 2015), the number of app downloads Internet users had tried to use private browsing, increases every year (Statista, 2016). Major data it also found that two-thirds of those users breaches have become common (Fiegerman, 2017), actually misunderstood (in fact, overestimated) yet most people seem willing to trust companies thedegreeofprotectionthatprivatebrowsing with personal information in exchange for rela- provided.And,whilethe cited Instamotor sur- tively small benefits (Ghose, 2017). And a plethora vey did find that 89% of respondents had taken of widespread, readily observable, everyday at least one step to protect data, the percentage online behaviors seem to bespeak an overall lack of respondents taking all of those steps was of concern. exceedingly small; in fact, some of the more pro- Evidence of disclosure-seeking behavior on its tective steps (such as using VPNs) were adopted own, however, does not contradict the argument by a small minority. Not coincidentally, those that consumers care for privacy. First, the work of steps were also the ones less familiar to average Altman (1975, 1977) serves as an antidote to simple users, and costlier to adopt. notions of privacy as a static condition of with- This more nuanced look at the evidence suggests drawal, protection, or concealment of information that claims concerning privacy being “dead” too (e.g., Posner, 1978). Altman construed privacy as a often confuse wants with opportunities—what people dialectic and dynamic process of boundary regula- want and what they can actually achieve. The tion. Privacy regulation encompasses both the open- desire for privacy may well be virtually universal; ing and the closing of the self to others. By consumers, offline as well as online, continually balancing and alternating the two, individuals man- attempt to regulate boundaries between public and age interpersonal boundaries to achieve desired private. Yet, the opportunities and ability to do so levels of privacy—an optimal amount of social effectively—to achieve desired levels of privacy— interaction, or an optimal balance of both disclosing may be shrinking. As the above examples suggest, and protecting personal information. Privacy regu- and as the next sections demonstrate, the reasons lation is thus dynamic—a process highly dependent are both psychological (Can Individuals Effectively on and responsive to changes in context and a pro- Manage Privacy Online?) and economic (‘The Sup- cess that applies equally to the many diverse ply Side of Privacy’). dimensions of privacy the literature has explored (a diversity this manuscript embraces, as evidenced by the disparate consumer scenarios it covers). Con- Can Individuals Effectively Manage Privacy sistent with this account (see also Petronio, 2002), Online? the seemingly contrasting examples of protection- and disclosure-seeking behaviors illustrate how, Economists use the term “revealed preferences” to while we manage privacy all the time, we do not refer to how consumers’ true valuations can be continuously protect our data. It would be undesir- revealed by their behavior, such as their choices in able (and unfeasible) for any individual to do so. the marketplace. Applied to the privacy domain, a Second, the evidence for seemingly privacy- revealed preference argument would posit that con- neglecting behaviors highlights a deeper issue: sumers protect and share precisely what they desire Privacy is extraordinarily difficult to manage, or —from disclosing personal information on social regulate, in the Internet age. Consider, again, media to covering online footprints using privacy- some of the examples of protective behaviors enhancing technologies. The choices they make, presented earlier in this section; they have a sec- according to this perspective, should be optimal for ond side. Although Facebook users on the Car- them personally and for society as a whole: If pri- negies Mellon University’snetworkdid become vacy behaviors express true preference for privacy, less likely to share their personal information market outcomes based on consumers’’ choices Secrets and Likes 741 would, in the absence of externalities, maximize Consumer Characteristics Affecting Privacy Behavior aggregate utility. On the surface, such reasoning appears consis- The privacy literature has increasingly drawn tent with the Altmanian notion of privacy as a pro- from research in psychology and behavioral eco- cess of boundary regulation, according to which the nomics to provide empirical evidence of numerous individual deliberately chooses when and what to processes affecting, and sometimes impairing, pri- protect or to share. In reality, regulating one’s pri- vacy-related decision making (Margulis, 2003). vacy is aspirational: In Altman’s terms, desired pri- These factors range from privacy “calculus” to emo- vacy may not be matched by achieved privacy, and tions; from asymmetric information to bounded market behaviors may not always necessarily cap- rationality; and from resignation and learned help- ture underlying preferences for privacy. We focus, lessness to cognitive and behavioral biases (Acquisti in this section, on some psychological factors caus- et al., 2015). Some of them are summarized in ing the discrepancy. Table 1 and discussed in the rest of this section.

Table 1 Some Psychological Factors Affecting Privacy Decision Making

Psychological factor Description Representative consequence Firms’ response

Information Consumers are unaware of the Consumers cannot respond to risks Increases firms’ ability to collect and asymmetries diverse ways firms collect and use they are unaware of use consumer information their data Bounded Consumers lack the processing Few read, or even could make sense Writing policies using sophisticated, rationality capacity to make sense of the of, privacy policies legalistic terms that obscure the complexities of the information central issues environment Present bias Overemphasizing immediate, and Consumers will incur long-term costs Offering small benefits in exchange under-weighing delayed, costs and —for example, intrusive profiling and for consumer data sharing benefits advertising—in exchange for small immediate benefits—for example, online discounts Intangibility Putting little weight on outcomes Consequences of privacy violations are Making it difficult for consumers to that are intangible—difficult to often diffuse and difficult to connect draw connections between specific isolate or quantify with specific actions acts of data sharing and specific privacy violations (e.g., price discrimination) Constructed Uncertainty about one’s preferences Sticking with default privacy settings Setting defaults that are advantageous preferences leads people to rely on crude to the firm rather than to the decision heuristics that often run consumer counter to considerations of objective costs and benefits Illusory The feeling (often illusory) that one Consumers share more when given Provide consumers with more control is in control of a situation leads to more granular control over privacy granular privacy controls to increased risk-taking settings encourage disclosure Herding The tendency to imitate the behavior Consumers share more when they see Provide social media feeds that of other people others sharing more on social media convey a maximal sense of others’ sharing Adaptation The tendency to get used to risks Despite ever-increasing violations of Change data usage practices that are unchanged over time or privacy, consumers adapt to them gradually that increase gradually and accept them The drive to The powerful drive to share Sharing of highly private, or even Working behavioral levers that elicit share information, including personal incriminating, information (e.g., on the motive to share (e.g., information social media) recommending photographs to share) 742 Acquisti et al.

Together, they explain when privacy-related behav- (e.g., when it comes to insuring against earth- iors capture actual preferences, and when they may quakes) they simply ignore them. Privacy is virtu- not. ally a “perfect storm” when it comes to all these considerations. The benefits of engaging in privacy- challenging activities (e.g., online discounts offered Informational asymmetries and bounded rationality in exchange of personal information) are typically Perhaps the most obvious reason why consumers immediate and tangible, while the consequences are cannot achieve desired levels of privacy is igno- typically delayed, and have either low likelihood rance about how their information is collected, dis- (e.g., catastrophic , or loss of job due seminated, and used. What one does not know can to social media postings made while in college), or hurt one, and worse, negative outcomes one is una- have very high likelihood but are small and intan- ware of are impossible to avoid. Most Americans gible (delays in the loading time of a website due are unaware of the data practices of firms, and very to trackers and ads). few are capable of making sense of the ubiquitous privacy policies they blithely agree to in order to Constructed preferences access different apps and websites (Pew, 2019). Even those who do have the expertise to make Other research in psychology and behavioral sense of such policies are unlikely to have the time economics shows that when consumers are uncer- to read through them. One amusing study esti- tain about their own preferences, they seize upon mated the annual national opportunity cost of any available cues to aid them in making a deci- Americans actually reading privacy policies at over sion, a process known as “constructed preferences” three quarters of a trillion dollars (McDonald & (Slovic, 1995). In one study documenting the phe- Cranor, 2008). And many Americans believe that nomenon of constructed preferences for privacy, privacy policies provide them with protections, John et al. (2011) asked students to answer sensitive when the reverse is more likely to be true; they pro- questions (e.g., if they had ever peeked at someone vide firms with uninformed consent to use and else’s email without them knowing), or even poten- often sell their information (Hoofnagle & tially incriminating questions (e.g., if they had dri- Urban, 2014). Moreover, people may derive (incor- ven while under the influence), using either an rect) assumptions of protection based on expected official-looking university interface which provided norms or privacy policies (Martin & Nis- assurances of and confidentiality, or an senbaum, 2016). For instance, even privacy-savvy obviously amateur interface. Although an indepen- consumers who use tools such as private browsing dent group of raters evaluated the professional overestimate the degree of protection they obtain website as much more secure, students were con- (Habib et al., 2018). siderably more likely to admit to these behaviors on the unprofessional website. Cues that most eval- uators agreed signaled safety of divulging, in fact Present bias and intangibility led respondents to “clam up”—presumably because A second reason why people may take actions the formal website reminded them of the potential that do not realize their desired levels of privacy is hazards of excessive divulgence. The features of the that the benefits of actions such as using an app or device utilized to self-disclose also matter. People accessing a source of information are typically seem more comfortable with, and end up disclosing immediate, while the privacy costs are often more intimate thoughts on social media, when delayed, uncertain, and intangible (Acquisti, 2004). using smartphones than PCs (Melumad & Work in behavioral economics shows that con- Meyer, 2020). Mental accounting also plays an sumers respond disproportionately to costs and important role when deciding whether to share per- benefits that are immediate, a phenomenon known sonal information with marketers (White as “present-bias.” As a result, minor inconve- et al., 2014). niences, such as the time and effort of completing a form, can have a huge impact on behavior (Bet- Illusory control tinger et al., 2012). People are also very bad at deal- ing with small probabilities of negative outcomes In another line of research, the three of us (see, e.g., Kunreuther et al., 1978). In some cases (Brandimarte et al., 2013) documented a “control (e.g., the threat of terrorism), they overweigh such paradox.” Much as people feel safer driving than small probabilities, but in many other situations flying because they feel more in control when they Secrets and Likes 743 drive, the illusion that they have greater control distribution of answers by other participants. There over their privacy leads consumers to divulge more were three possible answers: They had engaged in freely, even though they rarely actually choose to the behavior, they had not engaged in the behavior, exert that control. In one study from that paper, or they chose not to reveal whether they had. In students were asked sensitive questions, such as each of the three conditions, the feedback they got whether they had ever stolen anything, and were about other people’s responses was manipulated to randomly assigned to four conditions that varied make it look as if most people gave one of these the degree of control they perceived over the publi- three responses. Confirming the influence of other cation of their responses. In one condition, they people, over time individuals’ responses trended were told that if they answered all their answers toward echoing those of other people. When other would be published on a university website. In a people denied engaging in the behaviors, or second, offering a modicum of greater control, they reported unwillingness to answer, then respondents were asked to check a box to allow publication of began, themselves, to gravitate toward greater con- all their answers. A third condition that increased cealment; but when others admitted to the behav- control even further asked them to check a box for iors, respondents were more willing to do so each question to allow publication of that answer. themselves. And a fourth condition was the same as the second (in which a single box led to publication of all Adaptation answers), but respondents were also asked to pro- vide demographic information that would have Consumers’ responses to problems, including allowed us to uncover the identity of the respon- violations of privacy, have another pernicious dent. The striking results from the study were that, property: They are adaptive. Problems that seem although largely illusory, greater control led to acute, and initially draw a lot of attention and greater admission of sensitive or illegal behaviors, alarm, tend to recede into the background, even if but collecting detailed demographic information, they remain constant over time or worsen gradu- which should have given rise to fears of reidentifi- ally. This is a generally beneficial feature of human cation, had no impact. decision making; negative emotional reactions to The flip side of feeling in control is feeling that problems, such as fear and alarm, draw attention one has no control over a situation, and perversely, and motivate the individual to change their situa- this can also lead to a loss of vigilance about pri- tion. However, the human brain seems to interpret vacy issues. Draper and Turow (2019) propose that the persistence of a problem as evidence that the people’s apparent inaction regarding privacy issues problem is intractable, and hence not worthy of is the result of a sense of helplessness they refer to further attention, so it dials down the emotional as “digital resignation.” As much as people wish to response. manage the river of personal information that flows In some situations, and specifically when ongo- from them to companies, they realize that it is out- ing problems do warrant persistent efforts at mitiga- side of their capabilities, so they simply give up. tion, however, adaptation can lead to complacency Barassi (2019) notices that being an active citizen in and tolerance of what should be an intolerable situ- the modern world necessarily implies digital partic- ation. This is arguably the situation with privacy. ipation (even by children, whose parents post pho- In one of several experiments illustrating the impor- tographs, videos, and all kinds of other tance of adaptation (Adjerid et al., 2018), research information, leaving children no choice in the mat- participants were asked demographic questions, ter), which inevitably leads to one’s “datafication.” including their email address, were provided with specific privacy assurances, and then were asked various sensitive questions. Then, in a second Herding round, they were provided with new privacy assur- One of the most common cues that people use to ances and were asked a second set of sensitive decide how much information to reveal is what questions. The experimental manipulations in the others reveal. Documenting this and a variety of study involved whether the privacy assurances in related effects, Acquisti et al. (2012) conducted a the two rounds signaled increasing (low, then study in which a panel of New York Times readers high), decreasing (high, then low), or stable (low, were asked a series of sensitive questions and, after then low; or high, then high) privacy protections. answering each, were given the ostensible The main result of the study was that the contrast 744 Acquisti et al. between assurances mattered a lot. Information rev- power of motives counterpoised against those that elation was similar in the low–low and high–high drive privacy. conditions, suggesting that individuals did not have much of an independent idea of what levels of pro- Implications tection were appropriate. However, information revelation was much greater in the second round, The research we have recounted implies that in the low–high condition than in the high–low non-normative factors (factors that arguably do fi condition. Even in the short course of the experi- not affect true costs and bene ts of sharing) can ment, subjects seemed to adapt to whatever level of easily sway observed privacy choices. In theory, privacy they were initially provided with, and this may lead to outcomes that can either over- became concerned when privacy assurances were shoot or underachieve desired levels of privacy. reduced. In reality, the ample survey evidence of wide- spread concerns about and desires for privacy (Do Consumers Care About Privacy?), as well as The drive to share firms’ deliberate use of those factors to nudge While much of the literature on privacy tends to disclosure behavior (next, The Supply Side of Pri- — — focus on the risks of information leakage, those vacy), indicate that more often than not online concerned about privacy need to contend with the privacy behaviors may fall short of desires. In ’ fact that in most situations privacy-related motiva- Altman s terms, achieved privacy does not meet tions are counterpoised against potentially even desired privacy. stronger motives for socializing, connecting, and sharing information. Individuals have many rea- sons for sharing information, including economic The Supply Side of Privacy benefits that result from strategic revelation or withholding, and an array of psychological motives Psychological factors affect, and to some degree dis- (Carbone & Loewenstein, 2020). One study using tort, observed market demand for privacy (Can subjective measures as well as fMRI (Tamir & Individuals Effectively Manage Privacy Online?). Mitchell, 2012) found that information sharing is Economic factors affect its supply. In this section, inherently pleasurable, particularly when the infor- we show that, due to the interaction between those fi mation relates to one’s own thoughts and feelings. two factors, even if consumers were in nitely fi Further research dating back decades documents savvy, they would still nd desired (Privacy health, psychological, and social benefits resulting Under-supply), as well as desirable (Desired vs. from interpersonal disclosure (e.g., Pen- Desirable Privacy), levels of privacy nearly unattain- nebaker, 1997). able. Although the studies reported in Carbone and Loewenstein (2020) provide a range of insights Privacy Under-supply about the desire to share information, perhaps the most relevant to the current paper is a finding A supply of privacy does exist in the market. regarding reasons for sharing. In that study, Techniques and protocols have been developed to respondents to a survey (n = 552) were asked an protect data in nearly every imaginable online open-ended question about whether they could activity—from email to , and from recall a situation in which they were “dying to online advertising to online social media services. share” information with another person; then, if Some of those tools have been incorporated into they could recall such a situation, whether they products now available to consumers, such as VPN had ended up sharing the information, as well as a services, user-friendly encrypted , nontrack- series of follow-up questions about what informa- ing search engines and maps, anonymous browsers, tion they were dying to share and who they were and secure messaging platforms. Some major tech- dying to share it with. Survey respondents were nology companies have started trying to leverage presented with a long list of possible motives and their private features as a source of competitive were asked to select all that they believe might advantage (Panzarino, 2019). And most online ser- have driven the intense desire to share their experi- vices offer to varying degrees privacy controls such ences. Figure 1 presents the results from this analy- as opt-outs, visibility settings, and so forth. sis. The diversity of reasons that people cite to At the same time, living in the modern world explain their desire to disclose underlines the means being subject to continuous and ubiquitous Secrets and Likes 745

Figure 1. Figure 6 from Carbone and Loewenstein (2020). tracking. Whether we are aware of it or not, both economic: The reduction in the costs of both our online and offline behaviors are constantly surveillance and data storage, as well as the tracked, by surveillance cameras (Satariano, 2019), increasing value of (and success in) monetizing per- face-recognition technologies (Feng, 2019), rental sonal data, increases firms’ demand for tracking, cars activating GPS tracking by default thus driving down the supply of privacy. Such an (Mapon, 2017), multitudes of apps that share with outcome had been predicted by Hirshleifer (1978), an opaque ecosystem of intermediaries personal who showed how private firms faced incentives to data from our phones (Almuhimedi et al., 2015), overinvest in gathering information: The resources and trackers used by companies to learn and pre- used to acquire and disseminate that information dict our browsing behavior (Federal Trade Commis- would be wasteful from a societal point of view. sion, 2016). Even the boundaries between our The rise of surveillance as an economic model online and offline existences are eroding: Statistical (Zuboff, 2015) has implications other than firms’ tools are used to match sets of data belonging to overinvestments. First, markets with significant the same individual across domains and services, information asymmetries (such as, surely, the mar- endangering the very notion of anonymity and per- ket for privacy) can lead to economic inefficiency mitting personal information to be inferred from and even market failures (Akerlof, 1970). Further- nonsensitive public data (Narayanan & more, market forces lose, in part, their ability to Shmatikov, 2008). restrain firms’ data usage practices if network The forces that drive this relentless encroachment effects (especially powerful in two-sided market of surveillance into every facet of our lives are in platforms such as search engines, advertising net- part behavioral (our increasing adaptation and works, and social media) lead to quasi-monopoly habituation to tracking) but in greater part incumbents. While the debate around competition, 746 Acquisti et al. antitrust, and data regulation is nuanced and evolv- influence privacy choice (Hartzog, 2010). Firms, for ing, reduced competition and network effects rein- example, have a deep appreciation of the impact of force each other, and enable incumbents to defaults (Acquisti, et al., 2013), and joining a long accumulate more data and user attention (say, from tradition (e.g., of enrolling consumers in magazine search behavior), then enter markets for other ser- subscriptions that continue unless proactively termi- vices (say, navigation maps), which in turn allows nated), they set privacy defaults in a liberal fashion, more data collection (and more user attention), and banking on consumers’ laziness, and inattention makes it possible for the incumbent to supply even (see Table 1 for a summary of different ways that more services than any entrant or competitor. firms can take advantage of consumer psychology Increased data collection—in terms of both increas- with respect to privacy). ing share of a consumer market and increasingly Even transparency and control can be used to detailed inferences about each consumer—can thus nudge consumers toward higher disclosures lead to better, more valuable services, but also to (Acquisti et al., 2013). Transparency and control are fewer available outside options for privacy-seeking important components of privacy management. For consumers. instance, they reduce customers’ feeling of emo- Data-based network effects also tend to create tional violation and distrust in cases where personal powerful lock-ins into current products—such as data shared with firms are subject to vulnerabilities online social networks, whose value resides pre- (Martin et al., 2017). But, for reasons expounded in cisely in continued engagement of a growing user Can Individuals Effectively Manage Privacy base—strengthening incumbents. This creates what Online?, they are not sufficient conditions for pri- has been referred to as a privacy externality vacy protection. In fact, economic factors (Privacy (Acquisti et al., 2016): Other people’s usage of pri- Under-supply), such as network effects and lock-in, vacy-intrusive services increases the cost for pri- exacerbate behavioral hurdles (Can Individuals vacy-conscious consumers not to use them. At the Effectively Manage Privacy Online?), giving some extreme, consumers’ costs to choose protective firms more data, more user attention, more control, options become impractically large to accept. Once and ultimately more ability to influence consumer no privacy becomes the default, or the social norm, behavior. Ultimately, consumer “responsibilization” privacy options risk disappearing from the mar- (Giesler & Veresiu, 2014) approaches to privacy, ket altogether. predicated around so-called notice and consent Making matters worse, consumers’ marginal costs regime—that is, reliance on consumer “choice” of protection increase rapidly with the degree of (Solove, 2012)—have not made it affordable or even protection sought. Because so much of what we do feasible for consumers to achieve desired levels of is now tracked, there is simply too much to learn privacy. about and to protect. Everyone can, with little Consider the historical evolution of online track- effort, refrain from publishing their social security ing, which started with browser “cookies.” As users number on their Facebook profile. Using a VPN is started adopting cookie managers to prevent cross- more costly—in cognitive, usability, and economic site tracking, the online advertising industry devel- terms. Attempting to hide most of one’s digital oped so-called “flash” cookies to avoid consumers’ footprints from third-party monitoring is nearly deflecting strategies. And, as users started acquiring incalculably demanding—and, in light of the con- tools to defend against this new form of tracking, tinuously evolving technologies for surveillance, the industry switched to even more intrusive—and ultimately futile. harder to hide from—tracking strategies: device fin- gerprinting, deep packet inspection, and so on. The consumer-seeking privacy in the digital age cannot The interaction of economics and psychology rely on Altman’s mutually shared social norms and While some firms may actively promote privacy, intuitive behaviors, which worked in an offline there is an almost surely greater fraction that world. She/he is a modern Sisyphus constantly respond to, and to a great extent exploit, con- forced to learn new strategies, to little avail. sumers’ psychological characteristics (Can Individu- als Effectively Manage Privacy Online?) for their Desired versus Desirable Privacy own ends. Before the term “dark patterns” started being popularized (Gray et al., 2018), behavioral A valid counterpoint to the arguments in Privacy research on privacy had highlighted how platform Under-supply is that desired levels of privacy, providers can leverage interface changes to albeit unachievable, may in fact exceed what would Secrets and Likes 747 actually be optimal for consumers. If, in the age of First, the welfare implications of the collection and analytics, the collection and analysis of data can be use of data are nuanced and context-dependent. A source of great technological and economic substantial body of modeling and empirical work, advancement, then the loss of privacy, far from surveyed in Acquisti et al. (2016), finds that expan- being a threat to societal well-being, may be a nec- sions in data collection and use are not necessarily essary price for increased consumer and societal welfare-increasing, and limitations not necessarily welfare. Thus, desirable amounts of privacy may be welfare-decreasing. less than what consumers claim to want. They may, Second, privacy trade-offs are inherently redis- in fact, be precisely the levels that markets already tributive. Different decisions on data protection produce. affect different stakeholders differently, and the The fact that great utility can be extracted from interests of those stakeholders are rarely aligned data is undeniable and not questioned here. Rather, (recall the consumer trying to hide his/her reserva- we scrutinize from a number of angles the premise tion price, and the seller trying to discern it). The that market outcomes satisfy optimal balances of theoretical goal of achieving desirable societal out- data use and protection. What does privacy eco- comes inevitably forces policymakers to tackle nomics actually tell us about the trade-offs associ- thorny questions regarding whose welfare they want ated with data sharing (Individual and aggregate to prioritize. Either by intervening with regulation, consumer trade-offs)? How are the benefits from or by letting market outcomes determine levels of data allocated (Who benefits from consumer data privacy across domains, they inescapably favor one collection?)? What are the societal costs of data pro- or the other stakeholder. tection, and can they be minimized? And finally, what are the costs of privacy invasions, and what Who benefits from consumer data collection? dimensions of arguable concern for consumers are left out of economic analysis of data privacy (The Market equilibria may also not be advantageous costs of privacy protection)? for consumer welfare in relative terms, if most of the benefits accrued from the collection and usage of their data accrue to other stakeholders, such as Individual and aggregate consumer trade-offs data intermediaries that, thanks to a combination of A review of both theoretical research and empiri- market power and supremacy in surveillance tech- cal economic research on privacy (Acquisti nology, have nearly unchallenged control over the et al., 2016) belies the notion that more data collec- data. tion is monotonically associated with positive Consider, for instance, online targeted advertis- changes in consumer welfare. ing. According to industry insiders, collecting and At the individual level, it is economically rational using consumer data to target ads create economic to want to decide for oneself what and how much win-wins for all parties involved—merchants, con- to protect or reveal. It would be entirely logical sumers, publishers, and intermediaries (and welfare-increasing: Varian, 1996) for a con- (AdExchanger, 2011). In reality, on theoretical sumer to share his/her product interests with mar- grounds, targeting can either increase or decrease keters (to receive potentially beneficial targeted (Bergemann & Bonatti, 2011; De Corniere & De offers) and to hide the reservation price for those Nijs, 2016) the welfare of stakeholders other than products (to avoid price discrimination). In a very the data intermediaries. And available empirical Altmanian sense, the ability to regulate one’s open- research tells us very little about how the benefits ness is consistent with economic arguments of util- of targeted ads are allocated to stakeholders other ity maximization. The problem is that, as noted than merchants and the intermediaries themselves. above, under current market conditions, there is no Data from ad networks suggest that opting out of practical way for consumers to meaningfully regu- behavioral targeting costs publishers and ad late the use of their information. Since tracking is exchanges approximately $8.5 per consumer (John- ubiquitous, once the ability to regulate data flows is son et al., 2020). Yet, a Digiday 2019 poll of pub- taken away from consumers, secondary use of data lisher executives found that for 45% of respondents, is almost entirely beyond consumer control. And behavioral ad targeting had “not produced any secondary use, in turn, can create significant nega- notable benefit,” and 23% claimed it had “actually tive externalities (Noam, 1997). caused their ad revenues to decline” (Weiss, 2019). The same argument applies to the aggregate wel- And what about consumers themselves? Do con- fare effects of privacy protection (or lack thereof). sumer search costs go down because of targeted 748 Acquisti et al. ads? Are the prices they pay for advertised prod- Several of such privacy-enhancing technologies ucts on average higher, or lower, than those they (popularly referred to as PETs: Goldberg, 2003) use would have paid for products found via search? statistical and cryptographic techniques—such as What about the quality of those products? Much differential privacy or homomorphic —to more research is needed in this area. allow extracting utility from data while protecting privacy. They demonstrate that intrusive data prac- tices are not always essential to service effectiveness The costs of privacy protection and that privacy protection is not inherently anti- Two key approaches to privacy protection are thetical to data sharing and data analytics. As they regulation and privacy-enhancing technologies can reduce data granularity, both regulation and (PETs). PETs can certainly bear costs: Even mere privacy There is no shortage of studies showing privacy controls have been shown to reduce net contribu- regulation to have, other than its intended impact tions to crowdfunding campaigns (Burtch of privacy protection itself, also negative effects on et al., 2015). But both computer science and eco- some economic metrics—for instance, loss of adver- nomic research (Abowd & Schmutte, 2019) indicate tising effectiveness (Goldfarb & Tucker, 2011) or that careful design can minimize those costs while impaired technology adoption (Miller & ensuring some data protection. Tucker, 2009). There is also evidence of privacy reg- ulation having positive economic effects—for Noneconomic ramifications: privacy “Dark Matter” instance, reduction in identity theft (Romanosky et al., 2011) or increased technology adoption Our analysis has, so far, only focused on eco- (Adjerid et al., 2016). This seemingly contradictory nomically quantifiable implications of privacy evidence should not be surprising: It is consistent choices in specific contexts—such as targeted adver- with results from the broader economics literature, tising. Two critical considerations arise when we which show how the impact of regulation on inno- try to look at a broader picture. vation can be quite nuanced, and how the direction First, costs of privacy across scenarios are arguably of the impact can vary depending on how interven- impossible to combine into a definitive, aggregate tions are designed, implemented, and enforced estimation of the “loss” caused by a generalized lack (BERR, 2008). For instance, outcome- or perfor- of privacy. And this is not because privacy costs are mance-based interventions may score better than rare and few—but for the opposite reason: They are prescriptive regulations dictating the use of specific very common, but highly diverse in form, heteroge- tools or technologies. neous in likelihood, and varying in magnitude. They Similarly, in the privacy realm, we can reconcile range from identity theft to price discrimination; from contrasting findings by reiterating, first, the contex- attention and time waste to psychological harm; from tual nature of privacy trade-offs; second, by observing discrimination in targeted advertisement to filter bub- that binary metrics (such as absence vs. presence of bles; and from stigma to rare but catastrophic per- regulation) are too coarse to capture the complex sonal consequences (see, e.g., Acquisti et al., 2016; effects of privacy protection. The decisive factor in Calo, 2011; Solove, 2005, 2007). Hence the aggrega- determining the effect of privacy regulation on inno- tion and estimation problem. For instance, to the vation or welfare may not be its existence, but the spe- extent that data surreptitiously collected through pri- cifics of the intervention (Goldfarb & Tucker, 2012). vacy-intrusive apps can have an effect on a country’s Well thought-out interventions can protect privacy election, how can we quantify (or even demonstrate) and stimulate growth (Adjerid et al., 2016); poorly that impact, and its plethora of downstream ramifica- thought-out ones may achieve neither goal. In addi- tions? tion, focusing on the short-term effects of regulatory Second, and relatedly, we have not even consid- intervention (as much empirical privacy work does ered many of the most consequential ramifications for good reason—to identify robust causal links) risks of the loss of privacy. We call this economic “dark missing not just the downstream beneficial effects of matter”: We know it is there, but cannot directly consumer protection, but also the long-term effects on quantify it. The ramifications include the collective competition and innovation arising from firms having value of privacy (Regan, 1995); its role in preserv- to improve their services and systems to be more pri- ing room for individual subjectivity (Cohen, 2010); vacy-aware (for instance, by deploying privacy-en- and its role in protecting freedom (Westin, 1967), hancing technologies). dignity (Schoeman, 1984), and fairness Secrets and Likes 749

(Jagadish, 2016), or the very integrity of social life always arise) is actually common. Early observations (Nissenbaum, 2009). If market outcomes respond focused on a broad version of the dichotomy: a gap primarily to economic incentives, market equilibria between generic attitudes (typically expressing an may not account for these intricate, indirect, less overall desire for or concern over privacy) and tangible, and yet arguably even more critical impli- actual behaviors (often interpreted as privacy-ne- cations of data protection. glecting), such as social media sharing (Barnes, 2006) or disclosure choices in experiments (Spiekermann et al., 2001). But inconsistencies between broad atti- Implications: should consumers care? tudes and specific behaviors are well-recognized in To recap, market forces have driven an expand- the literature predating privacy decision making ing encroachment of surveillance into consumers’ (Ajzen & Fishbein, 1977); thus, considering the lives over the past two decades (Privacy Under- dynamic and dialectic nature of privacy boundary supply). This has been accompanied by undeniable regulation, it would not be surprising to find them economic benefits. And yet, economic analysis also in the privacy realm as well (Dienlin & raises valid doubts over the idea that increasing Trepte, 2015). And yet, evidence of specific, narrow data collection necessarily enhances consumer or dichotomies has also been often uncovered. societal welfare (Individual and aggregate con- For instance, specific attitudes toward privacy in sumer trade-offs), the notion of consumers accrue mobile phone apps were not closely linked to actual most of the value produced from their data (Who app download behavior in an incentivized experi- benefits from consumer data collection?), and the ment (Barth et al., 2019). Although participants idea that data protection through regulation or were “concerned that mobile apps monitor[ed] their technology will be inescapably welfare-decreasing activities and that often too much personal informa- (The costs of privacy protection). Nothing in this tion is collected” and “[t]he level of privacy intru- discussion, moreover, recognizes the importance of sion by mobile apps in general was perceived to be noneconomic dimensions of privacy loss (Noneco- relatively high,” nearly a third of participants nomic ramifications: privacy “Dark Matter”). We downloaded the app that was ranked as the most conclude that consumers have good reasons to intrusive within the set made available to the sub- aspire to higher degrees of privacy than market jects, and 49% downloaded an app that was ana- outcomes are likely to provide. lyzed as intrusive—despite the fact that the participants had been provided extra funds to buy a nonintrusive app. Specific concerns (captured via a questionnaire) The Privacy “Paradox?” did not match corresponding disclosure behaviors The question of whether consumers truly care for pri- (captured from mined network data) for various vacy, the hurdles they face in achieving it, and the members of the Carnegie Mellon Facebook network ramifications of their privacy choices in the mar- in 2006 (Acquisti & Gross, 2006). Nearly 16% of ket also lie at the roots of the so-called “privacy para- respondents who had expressed the highest degree dox”—an apparent gap, or dichotomy, between of concern (on a 1- to 7-point Likert scale) for the people’s self-reported mental states (attitudes, con- scenario in which a stranger knew their schedule of cerns, desires, etc.) regarding privacy and their actual classes and where they lived did in fact provide behaviors. A massive amount of scholarly effort over both pieces of information publicly, to everyone the past two decades has explored this gap, yet failed else on the network (around 7,000 users at the to affirmatively conclude whether a paradox of con- time). In fact, nearly 22% of those high-concern sub- sumers claiming a desire for privacy, but not acting jects provided their address, and nearly 40% pro- like they actually cared, does in fact exist (Norberg vided their schedule of classes. et al., 2007), or is a “myth” (Solove, 2021). Both the Mismatches between individuals’ specific expecta- so-called paradox and the disagreements around it tions of privacy and actual sharing behaviors have have profound implications for policy. Attempting to also been documented. For instance, every single bring some clarity to this debate may help us better participant in a social media study exhibited some understand the policy ramifications of the research mismatch between what they believed their pro- reviewed in the previous sections. files’ privacy settings to be, and what they actually Evidence that some dichotomy (or, more simply, were (Madejski et al., 2012). gaps of varying depth) between privacy mental Also specific behavioral intentions have been states and actual behaviors can arise (but will not shown to not necessarily match behaviors. Norberg 750 Acquisti et al. et al. (2007) first asked individuals their intentions to (many) explanations did exist, concluded (contra disclose specific pieces of information to marketers, Quine’sdefinition) that there was no paradox. In and then, several weeks later, asked the same subjects either case, the fact that misalignments between to actually provide “those exact pieces of information attitudes and behaviors can be explained does not to a market researcher.” Self-reported intentions to imply that the misalignments themselves do not disclose were significantly lower than actual disclo- exist. sures. (Between-subject experiments - including some The debate over the existence of a paradox of of those covered in the previous sections, in which privacy matters from a public policy perspective non-normative factors are shown to significantly (Martin, 2020), as it reflects differing implications sway privacy choice - suggest similar mismatches: one could derive from the body of work on con- See, e.g., Adjerid et al., 2018.) sumer privacy choice and market behaviors that we And yet, as noted, notwithstanding the evidence have reviewed in the previous sections. We high- of gaps between privacy mental states and behav- light in the final part of this section the implications iors, the literature remains split between consider- that seem to us more plausible. ing the paradox real and considering it a misnomer, First, even though there is evidence of situations or even a myth. in which behaviors do not match self-reported pref- We believe these disagreements (and the state of erences, this does not imply that privacy behaviors confusion surrounding the so-called paradox) to be never match preferences, or that self-reported pref- caused by two distinct factors. erences for privacy should never be trusted, or that The first factor is that different scholars have consumers’ persistent demand for privacy in sur- defined the dichotomy underlying the paradox in veys should not be taken seriously. Those mis- different ways (as already noted by Gerber matches do not imply consumer carelessness or et al., 2018 and Kokolakis, 2017): Different studies indifference for privacy; rather, they often have pre- have focused on different privacy scenarios, as well cise psychological and economic explanations, as on different pairwise comparisons of mental explored in previous sections. Second, by the same states and behaviors (e.g., generic attitudes vs. token, the fact that there is evidence of scenarios in specific intentions, or specific concerns vs. behav- which attitudes precisely predict behavior does not iors, or intentions vs. behaviors, and so on). Conse- imply that consumers are always able to match their quently, and considering the contextual nature of privacy desires with behaviors in the marketplace, privacy choice, it should not be surprising that and hence that no correcting public policy interven- scholars found both evidence of correspondence tion is needed. Third, there is no single explanation between mental states and behaviors in some sce- that resolves the paradox. Rather, the explanations narios, and evidence of possible gaps (of varying for the dichotomy – when a dichotomy does arise – depth) between mental states and behaviors in are many, and not mutually exclusive, because other scenarios. Sometimes evidence for and against many and diverse are the factors affecting privacy a dichotomy can arise from within the same study. choice, and the contexts in which the dichotomy For instance, broad attitudes toward privacy were can emerge (see discussions of multiple explana- simultaneously found to be uncorrelated to disclo- tions in Acquisti et al., 2015; Kokolakis, 2017; sure patterns on Facebook, but correlated with the Solove, 2021). In fact, the explanations include likelihood of joining the network (Acquisti & nearly any of the psychological and economic fac- Gross, 2006). tors we reviewed in Can Individuals Effectively The second (and perhaps more consequential) Manage Privacy Online? and The Supply Side of factor is that the very term “paradox” may have Privacy. This, in turn, implies that resolving the been interpreted differently by different scholars. whole of the modern privacy problem amounts to “Veridical” paradoxes are those that include seem- much more than addressing just this or that specific ingly contradictory statements which, upon further concern. observation, can actually be shown to be true Ultimately, the central lesson for policy of the (Quine, 1976). We suspect that some privacy schol- paradox literature is that consumers may well care ars, consistent with this definition, focused on the for privacy and try to regulate opening and closing discovery of a seemingly contradictory gap between of self to others in their everyday lives, but psycho- claimed privacy preferences and actual behaviors, logical and economic hurdles may make the pri- and called that a “paradox” of privacy. Whereas vacy they desire unattainable in absence of a other scholars focused on the explanations that systemic, fundamental change in the way we resolved that apparent dichotomy, and since approach the policy of privacy. Secrets and Likes 751

What Should Be Done? their data than anyone else. Finally, considering the consequential noneconomic dimensions of pri- If market outcomes are unlikely to produce not just the vacy (Noneconomic ramifications: privacy “Dark levels of privacy consumers desire, but also the levels Matter”), some might find abhorrent the notion of that would be desirable for them, what—if anything— putting a price on it, or question the propriety of can be done to correct privacy imbalances? allowing people to sell it, much as many question whether people should be allowed to sell their Privacy Nudges own organs for transplant. Furthermore, market-based data propertization Some of the psychological hurdles we considered schemes suffer from a nearly insurmountable eco- in Can Individuals Effectively Manage Privacy nomic challenge. The most valuable data are con- Online? can be countered, or ameliorated, through textual and dynamic; it is created in the interaction behavioral interventions, to align privacy outcomes between incumbent service providers and con- with ex ante preferences (Acquisti, 2009). Numer- sumers; and—missing regulatory intervention ous privacy nudges have been explored in the liter- establishing baseline protections—those providers ature, from changing social media default visibility are unlikely to relinquish its ownership to others. settings to making the consequences of privacy Hence, data propertization schemes may simply choices more salient to users (Acquisti et al., 2017). add themselves to an ecosystem of widespread Unfortunately, while nudges have been proven to surveillance, rather than replace it. be somewhat effective in experiments and field tri- als (for instance, Zhang & Xu, 2016), it is unclear that localized behavioral interventions alone can Privacy-Enhancing Technologies correct the enormous imbalance consumers encoun- Because they allow both data protection and data ter online between their ability to manage personal analytics, PETs offer significant potential individual data and platform providers’ ability to collect it. By and societal benefits.Yet,manyconsumersareunli- controlling user interfaces, the providers remain in kely to take advantage of them, due to unawareness control of the architecture of choice. of PETs’ existence, distrust, or perceived (and actual) costs. Thus, barriers to the success of PETs are both Data as Property psychological and economic in nature. Pushing the responsibility for their usage to individuals—that is, Data propertization schemes have been pro- expecting them to navigate a universe of disparate, posed in the literature since the mid-1990s. Lau- heterogeneousself-defensesolutionsacrossanever- don (1996) proposed the creation of personal data increasing range of scenarios in which data are markets, where consumers would trade rights tracked—would once again shift exorbitant costs onto with organizations over the collection and usage consumers in usability, cognitive, and economic of their information, thereby “monetizing” their terms (such as the opportunity costs arising from loss data. Over time, technological barriers to Laudon’s of features in services when PETs are deployed). In proposal have vanished. Data monetization star- any case, much as is the case for nudges and data tups have emerged and politicians have incorpo- propertization schemes, deployment of PETs is an rated data propertization or “data dividends” in inherently individualist solution: In the absence of a their platforms (Daniels, 2019). While appealing on wide-ranging regulatory intervention supporting their some levels (Arrieta-Ibarra et al., 2018), data prop- deployment by making privacy the default ertization schemes face hurdles in practice (Cavoukian, 2009), it is hard to see how a patchwork (Acquisti et al., 2016). One issue is whether con- approach of localized solutions—only working under sumers, who under such a scheme will face deci- specific circumstances, on specificappsorsystems,in sions about who to sell their data to and for how specificscenarios—could go far toward addressing much, are able to assign fair, reasonable valua- what is inherently a systemic problem of privacy. tions to their own data, considering the informa- tional and behavioral hurdles we surveyed in Can Individuals Effectively Manage Privacy Online? A Privacy Regulation (and Its Economic and Behavioral second issue is that schemes that monetize privacy Hurdles) run the risk of exacerbating inequality, creating a Psychologically informed interventions, data world in which only the affluent can have pri- propertization schemes, and (especially) privacy-en- vacy, or in which the already rich get more for hancing technologies may be useful tools for 752 Acquisti et al. privacy management, but none is likely to work as reminiscent of the debate surrounding climate intended in the absence of substantive, comprehen- change, another great issue of our times. Certainly sive policies that mandate a framework of baseline one reason is the lack of immediacy, and in some privacy protection, addressing both the consumer- cases, of tangibility of the negative consequences of side hurdles we considered in Can Individuals privacy invasions. Most people have, at best, a Effectively Manage Privacy Online?, and the sup- poorly formed notion of the negative consequences ply-side factors we considered in The Supply Side that can result from unregulated assaults on their of Privacy. While we defer to the vast legal scholar- data. In some cases, this is because they have not ship on privacy for a nuanced discussion on the experienced them, and in other cases because they effectiveness of different intervention models (such experienced them, but were unaware of it (e.g., as legislation, litigation, and so forth), we consider when they pay higher prices for goods as a result here some psychological and economic factors of price discrimination enabled by sellers possessing affecting regulatory efforts. their data). Worldwide, there has been no shortage of pri- Another, closely related reason is adaptation. vacy regulatory efforts. Recently, both in Europe People tend to adapt to—and pay little attention to (with the GDPR) and in the United States (with the —adverse situations that are either gradually California Act), major compre- changing or unchanging. This is especially true of hensive initiatives aimed at addressing the novel situations that people feel powerless to change, so digital privacy challenges have become law. Yet, that the acceptance of pronouncements such as despite the intentions of regulators to promote “pri- “privacy is dead” is likely to make these self-fulfill- vacy by design” (PBD) and “privacy by default” ing prophecies, introducing complacency about principles (which refer to organizations proactively what has been lost and diminishing motivation for considering privacy throughout the entire data life- reform. cycle; see Cavoukian, 2009), a gulf still exists “be- A second set of hurdles is economic in nature. tween PBD in principle and as implemented in “Regulatory capture” refers to the propensity for practice” (Wong & Mulligan, 2019). The jury is still industries to play a central role in crafting the regu- out on the effectiveness of recent regulatory inter- lations that apply to them, resulting in a situation ventions (Bamberger & Mulligan, 2018). For in which those regulations tend to favor the regu- instance, a number of recent studies have suggested lated industries—not the consumers that the regula- that, following the enactment of GDPR, consumer tions are ostensibly enacted to protect. Regulatory tracking remains ubiquitous (Sanchez-Rola capture has been studied by economists et al., 2019). (Downs, 1957), and more recently by privacy schol- Some hurdles impeding the enactment of com- ars (Hirsch, 2013; McGeveran, 2016). According to prehensive regulation are, again, psychological. A this argument, since privacy regulation is in fact wide range of psychological factors can explain likely to come (and, in some cases, has come), com- why there has not been an uprising of citizens panies will inevitably steer it in a direction that to deal with the problem of privacy. Privacy has favors their interests over those of the consumers. not yet become a “hot-button” issue (although By 2018, media were indeed reporting that major some have argued that it could; see Tay- tech companies (including Apple, Google, and Face- lor, 2001). Although in responses to closed-ended book) wanted federal privacy regulation in the Uni- survey questions consumers report being con- ted States (Brandom, 2018). cerned about privacy, this is rarely the case for A related concept from political economy has open-ended questions that assess what problems similar implications. It is the observation of political are “top of mind” for respondents. For example, economist Mancur Olson (1965) that concentrated in the most recent Gallup poll in which voters economic interests tend to trump diffuse, atomistic were asked an open-ended question about the interests. Concentrated industries have incentives to most important problem facing the nation, pri- lobby and influence policy in their favor. Individual vacy (or any item that could be construed as citizens, in contrast, have a much more difficult related to privacy) did not make it to the list of time coordinating lobbying activity. Each individ- the top 11 (the bottom of which garnered only ual, if only by dint of their limited resources, typi- 3% of mentions). cally has only a limited interest in engaging in Why does the loss of privacy not generate, col- legislation-influencing tactics. There are certain lectively, the kind of emotional response that some exceptions to the pattern (e.g., hot-button issues other societal problems do? The situation is such as guns and abortion, which can influence Secrets and Likes 753 multitudes of people to join or form interest to enjoy the benefits of online services, and help groups), but the advantage certainly resides with uncover the best means for securing the best of both commercial, over the individual, interests. worlds. The application of these ideas to privacy is Although our conclusions, as we noted, may straightforward. Several huge firms in the United appear pessimistic, some of the very evidence we States have both monumental resources (thanks to discussed in this article provides a glimmer of their ability to monetize consumer data) and incen- hope. Turning back to where we started in Do Con- tives to lobby and influence public opinion in direc- sumers Care About Privacy?, pronouncements that tions that propel privacy policies in their favor. In privacy is dead, we argued, confuse opportunities contrast, even the most privacy-conscious individ- with wants. People’s opportunities for privacy are ual has limited ability, or likely motivation, to influ- notably shrinking. And yet, across history, individ- ence such policies. If privacy does not have the uals—from Eastern Germans under Stasi properties of hot-button issues that lead people to (Betts, 2010; Sloan & Warner, 2016) to teenagers on rise up for reform, then privacy interest groups social media (Boyd & Marwick, 2011)—have may not be able to contend with the organized lob- revealed a remarkable tenacity in their attempts to bying of powerful firms. carve out private spaces for themselves and their groups in face of all odds—even while in public, and even under surveillance. Technologies, inter- faces, and market forces can all influence human Conclusion behavior. But probably, and hopefully, they cannot The ultimate conclusion of this paper may appear alter human nature. If privacy, as Altman proposed, pessimistic. We showed that people care and act to is both culturally specific and culturally universal, manage their privacy (Do Consumers Care About chances are that people’s quest for privacy will not Privacy?), but face steep psychological (Can Indi- dissipate. viduals Effectively Manage Privacy Online?) and economic (The Supply Side of Privacy) hurdles that make not just desired, but also desirable privacy References nearly unattainable. We conclude that approaches Abowd, J. M., & Schmutte, I. M. (2019). An economic to privacy management that rely purely on market analysis of privacy protection and statistical accuracy forces and consumer responsibilization have failed. as social choices. American Economic Review, 109(1), Comprehensive policy intervention is needed if a 171–202. https://doi.org/10.1257/aer.20170627 society’s goal is to allow its citizens to be in the Acquisti, A. (2004). Privacy in electronic commerce and position to manage privacy effectively and to their the economics of immediate gratification. In Proceedings best advantage. Yet, severe psychological and eco- of the 5th ACM conference on electronic commerce. nomic impediments to the enactment of regulation Acquisti, A. (2009). Nudging privacy: The behavioral eco- also exist. nomics of personal information. IEEE Security & Pri- – More research—combining psychology, eco- vacy, 7(6), 82 85. nomics, computer science, and the law—is needed Acquisti, A., Adjerid, I., Balebako, R., Brandimarte, L., Cranor, L. F., Komanduri, S., . . . Wang, Y. (2017). on those impediments, and how to overcome them. Nudges for privacy and security: Understanding and Research could take a nuanced perspective that ’ fi assisting users choices online. ACM Computing Surveys examines both costs and bene ts of different (CSUR), 50(3), 1–41. https://doi.org/10.1145/3054926 approaches to regulation, and that includes a con- Acquisti, A., Adjerid, I., & Brandimarte, L. (2013). Gone sideration of trade-offs that are difficult to quantify, in 15 seconds: The limits of privacy transparency and including the long-term ramifications of policy inter- control. IEEE Security & Privacy, 11(4), 72–74. https://d ventions. Research could also seek to assess how the oi.org/10.1109/MSP.2013.86 benefits of consumer data collection are allocated to Acquisti, A., Brandimarte, L., & Loewenstein, G. (2015). different stakeholders, and how different interven- Privacy and human behavior in the age of information. – tions can help to skew benefits toward the consumer Science, 347(6221), 509 514. side. Finally, research could address what types of Acquisti, A., & Gross, R. (2006). Imagined communities: Awareness, information sharing, and privacy on the privacy-protecting technologies would be most Facebook. In G. Danezis & P. Golle (Eds.), Privacy effective in enhancing consumer privacy, and most fi enhancing technologies. PET 2006. Lecture notes in com- effective, in terms of producing bene ts at minimum puter science (Vol. 4258). Berlin, Heidelberg: Springer. cost. Ultimately, targeted research could help scruti- https://doi.org/10.1007/11957454_3 nize the premise that loss of privacy is unavoidable 754 Acquisti et al.

Acquisti, A., John, L. K., & Loewenstein, G. (2012). The Barnes, S. B. (2006). A privacy paradox: Social networking impact of relative standards on the propensity to dis- in the United States. First Monday, 11(9). https://doi. close. Journal of Marketing Research, 49(2), 160–174. org/10.5210/fm.v11i9.1394 https://doi.org/10.1509/jmr.09.0215 Barth, S., de Jong, M. D., Junger, M., Hartel, P. H., & Acquisti, A., John, L. K., & Loewenstein, G. (2013). What Roppelt, J. C. (2019). Putting the privacy paradox to the is privacy worth? The Journal of Legal Studies, 42(2), test: Online privacy and security behaviors among 249–274. https://doi.org/10.1086/671754 users with technical knowledge, privacy awareness, Acquisti, A., Taylor, C., & Wagman, L. (2016). The eco- and financial resources. Telematics and Informatics, 41, nomics of privacy. Journal of Economic Literature, 54(2), 55–69. https://doi.org/10.1016/j.tele.2019.03.003 442–492. https://doi.org/10.1257/jel.54.2.442 Beresford, A. R., Kubler,€ D., & Preibusch, S. (2012). AdExchanger (2011, October 28). If a consumer asked you, Unwillingness to pay for privacy: A field experiment. “Why is tracking good?”, what would you say? AdExchanger. Economics Letters, 117(1), 25–27. https://doi.org/10. Retrieved August 3, 2020, from source https://adexcha 1016/j.econlet.2012.04.077 nger.com/online-advertising/why-is-tracking-good/ Bergemann, D., & Bonatti, A. (2011). Targeting in adver- Adjerid, I., Acquisti, A., Telang, R., Padman, R., & Adler- tising markets: Implications for offline versus online Milstein, J. (2016). The impact of privacy regulation media. The RAND Journal of Economics, 42(3), 417–443. and technology incentives: The case of health informa- https://doi.org/10.1111/j.1756-2171.2011.00143.x tion exchanges. Management Science, 62(4), 1042–1063. BERR (Department for Business, Enterprise, and Regula- https://doi.org/10.1287/mnsc.2015.2194 tory Reform) (2008). Regulation and innovation: Evidence Adjerid, I., Pe’er, E., & Acquisti, A. (2018). Beyond the and policy implications. BERR Economics Paper no. 4. privacy paradox: Objective versus relative risk in pri- BERR, UK. Retrieved August 3, 2020, from source vacy decision making. MIS Quarterly, 42(2), 465–488. https://www.yumpu.com/en/document/read/ https://doi.org/10.25300/MISQ/2018/14316 5453669/regulation-and-innovation-evidence-and-polic Ajzen, I., & Fishbein, M. (1977). Attitude-behavior rela- y-implications-pd tions: A theoretical analysis and review of empirical Bettinger, E. P., Long, B. T., Oreopoulos, P., & Sanbon- research. Psychological Bulletin, 84(5), 888–918. https://d matsu, L. (2012). The role of application assistance and oi.org/10.1037/0033-2909.84.5.888 information in college decisions: Results from the H&R Akerlof, G. A. (1970). The market for" Lemons": Quality Block FAFSA experiment. The Quarterly Journal of Eco- uncertainty and the market mechanism. The Quarterly nomics, 127(3), 1205–1242. Journal of Economics, 84(3), 488–500. https://doi.org/10. Betts, P. (2010). Within walls: Private life in the German 2307/1879431 Democratic Republic. Oxford, UK: Oxford University Almuhimedi, H., Schaub, F., Sadeh, N., Adjerid, I., Press. Acquisti, A., Gluck, J., . . . Agarwal, Y. (2015, April). Boyd, D., & Marwick, A. E. (2011). Social privacy in net- Your location has been shared 5,398 times! A field worked publics: Teens’ attitudes, practices, and strate- study on mobile app privacy nudging. In Proceedings of gies (September 22, 2011). A decade in Internet time: the 33rd annual ACM conference on human factors in com- Symposium on the dynamics of the Internet and soci- puting systems (pp. 787–796). ety. Retrieved August 3, 2020, from source https:// Altman, I. (1975). The environment and social behavior. ssrn.com/abstract=1925128 Monterey, CA: Brooks/Cole Publishing Company. Brandimarte, L., Acquisti, A., & Loewenstein, G. (2013). Altman, I. (1977). Privacy regulation: Culturally universal Misplaced confidences: Privacy and the control or culturally specific? Journal of Social Issues, 33(3), 66– paradox. Social Psychological and Personality Science, 4(3), 84. https://doi.org/10.1111/j.1540-4560.1977.tb01883.x 340–347. https://doi.org/10.1177/1948550612455931 Arrieta-Ibarra, I., Goff, L., Jimenez-Hernandez, D., Lanier, Brandom, R. (2018, October 24). Tim Cook wants a federal J., & Weyl, E. G. (2018). Should we treat data as labor? —But so do Facebook and Google. The Verge. Moving beyond "free". AEA Papers and Proceedings, 108, Retrieved August 3, 2020, from source https://www. 38–42. https://doi.org/10.1257/pandp.20181003 theverge.com/2018/10/24/18018686/tim-cook-apple- Athey, S., Catalini, C., & Tucker, C. (2017). The digital pri- privacy-law-facebook-google-gdpr vacy paradox: Small money, small costs, small talk (No. Burtch, G., Ghose, A., & Wattal, S. (2015). The hidden w23488). Cambridge, MA: National Bureau of Eco- cost of accommodating crowdfunder privacy prefer- nomic Research. ences: A randomized field experiment. Management Bamberger, K. A., & Mulligan, D. K. (2018). Privacy law Science, 61(5), 949–962. https://doi.org/10.1287/mnsc. – on the books and on the ground. In B. Van der Sloot 2014.2069 & A. De Groot (Eds.), The handbook of privacy studies: Bush, G. W. (2009). Economic regulation. Chapter 9. An interdisciplinary introduction (pp. 349–354). Amster- White House Archives. Retrieved August 3, 2020, from dam: Amsterdam University Press. source https://georgewbush-whitehouse.archives.gov/ Barassi, V. (2019). Datafied citizens in the age of coerced cea/ERP_2009_Ch9.pdf digital participation. Sociological Research Online, 24(3), Calo, R. (2011). The boundaries of privacy harm. Indiana 414–429. https://doi.org/10.1177/1360780419857734 Law Journal, 86, 1131. Secrets and Likes 755

Carbone, E., & Loewenstein, G. (2020). Dying to divulge: Fiesler, C., Dye, M., Feuston, J. L., Hiruncharoenvate, C., The determinants of, and relationship between, desired Hutto, C. J., Morrison, S., . . . Gilbert, E. (2017). What and actual disclosure. Retrieved August 3, 2020, from (or who) is public? Privacy settings and social media sourcehttps://ssrn.com/abstract=3613232 content sharing. In Proceedings of the 2017 ACM confer- Cavoukian, A. (2009). Privacy by design: The 7 foundational ence on computer supported cooperative work and social principles. Toronto, ON: Information and Privacy Com- computing (pp. 567–580). missioner of Ontario. Gerber, N., Gerber, P., & Volkamer, M. (2018). Explaining Chen, B. X. (2018, March 21). Want to #DeleteFacebook? the privacy paradox: A systematic review of literature You can try. New York: New York Times. Retrieved investigating privacy attitude and behavior. Computers August 3, 2020, from source https://www.nytimes.c & Security, 77, 226–261. https://doi.org/10.1016/j.cose. om/2018/03/21/technology/personaltech/delete-faceb 2018.04.002 ook.html Ghose, A. (2017). Tap: Unlocking the mobile economy. Cam- CISCO (2019). Consumer Privacy Survey: The growing bridge, MA: MIT Press. imperative of getting data privacy right. CISCO Cybersecu- Giesler, M., & Veresiu, E. (2014). Creating the responsible rity Series 2019. San Jose, CA: CISCO. Retrieved consumer: Moralistic governance regimes and con- August 3, 2020, from source https://www.cisco.com/ sumer subjectivity. Journal of Consumer Research, 41(3), c/dam/en/us/products/collateral/security/cybersecu 840–857. https://doi.org/10.1086/677842 rity-series-2019-cps.pdf Godinho de Matos, M., & Adjerid, I. (2019). Consumer Cohen, J. E. (2010). What privacy is for. Harvard Law behavior and firm targeting after GDPR: The case of a Review, 126, 1904. telecom provider in Europe. NBER Summer Institute Daniels, J. (2019, February 12). California governor proposes on IT and Digitization. ’new data dividend’ that could call on Facebook and Google Goldberg, I. (2003). Privacy-enhancing technologies for to pay users. Englewood Cliffs, NJ: CNBC. Retrieved the Internet, II: Five years later. In R. Dingledine & P. August 3, 2020, from source https://www.cnbc.com/ Syverson (Eds.), Privacy enhancing technologies. PET 2019/02/12/california-gov-newsom-calls-for-new-data- 2002. Lecture notes in computer science (Vol. 2482). Berlin, dividend-for-consumers.html Heidelberg: Springer. https://doi.org/10.1007/3-540- De Corniere, A., & De Nijs, R. (2016). Online advertising 36467-6_1 and privacy. The RAND Journal of Economics, 47(1), 48– Goldfarb, A., & Tucker, C. E. (2011). Privacy regulation 72. https://doi.org/10.1111/1756-2171.12118 and online advertising. Management Science, 57(1), 57– Dienlin, T., & Trepte, S. (2015). Is the privacy paradox a 71. https://doi.org/10.1287/mnsc.1100.1246 relic of the past? An in-depth analysis of privacy atti- Goldfarb, A., & Tucker, C. (2012). Privacy and innova- tudes and privacy behaviors. European Journal of Social tion. Innovation Policy and the Economy, 12(1), 65–90. Psychology, 45(3), 285–297. https://doi.org/10.1002/ https://doi.org/10.1086/663156 ejsp.2049 Gray, C. M., Kou, Y., Battles, B., Hoggatt, J., & Toombs, Downs, A. (1957). An economic theory of democracy. Man- A. L. (2018). The dark (patterns) side of UX design. In hattan: Harper & Row. Proceedings of the 2018 CHI conference on human factors in Draper, N. A., & Turow, J. (2019). The corporate culti- computing systems (pp. 1–14). vation of digital resignation. New Media & Society, 21 Gross, R., & Acquisti, A. (2005). Information revelation (8), 1824–1839. https://doi.org/10.1177/1461444819833 and privacy in online social networks (The Facebook 331 case). In Proceedings of the 2005 ACM workshop on pri- DuckDuckGo (2017, January). A study on private browsing: vacy in the electronic society (pp. 71–80). Consumer usage, knowledge, and thoughts. Technical Habib, H., Colnago, J., Gopalakrishnan, V., Pearman, S., report. Paoli, PA: DuckDuckGo. Retrieved August 3, Thomas, J., Acquisti, A., . . . Cranor, L. F. (2018). Away 2020, from source https://duckduckgo.com/down from prying eyes: Analyzing usage and understanding load/Private_Browsing.pdf of private browsing. In Fourteenth symposium on usable Federal Trade Commission (2016, June). Online tracking. privacy and security (SOUPS 2018) (pp. 159–175). Washington, DC: Federal Trade Commission. Retrieved Hartzog, W. (2010). Website design as contract. American August 3, 2020, from source https://www.consumer.f University Law Review, 60, 1635. tc.gov/articles/0042-online-tracking Henrich, J., Heine, S. J., & Norenzayan, A. (2010). The Feng, E. (2019, December 16). How China is using facial weirdest people in the world? Behavioral and Brain recognition technology. Washington, DC: NPR. Retrieved Sciences, 33(2–3), 61–83. https://doi.org/10.1017/ August 3, 2020, from source https://www.npr.org/ S0140525X0999152X 2019/12/16/788597818/how-china-is-using-facial-recog Hirsch, D. D. (2013). Going Dutch: Collaborative Dutch nition-technology privacy regulation and the lessons it holds for US pri- Fiegerman, S. (2017, September 7). The biggest data breaches vacy law. Michigan State Law Review, 83, 84. ever. CNN Business. Retrieved August 3, 2020, from Hirshleifer, J. (1978). The private and social value of infor- source http://money.cnn.com/2017/09/07/technol mation and the reward to inventive activity. In Uncer- ogy/business/biggest-breaches-ever/index.html tainty in economics (pp. 541–556). Academic Press. 756 Acquisti et al.

Retrieved August 3, 2020, from source https://www.sc Mapon (2017, June 9). GPS tracking for rental cars: How to iencedirect.com/science/article/pii/ break from the mold. Mapon. Retrieved August 3, 2020, B9780122148507500383?via%3Dihub from source https://www.mapon.com/us-en/blog/ Hoofnagle, C. J., & Urban, J. M. (2014). Alan Westin’s pri- 2017/06/gps-tracking-for-rental-cars-how-to-break- vacy homo economicus. Wake Forest Law Review, 49, from-the-mold. 261. Margulis, S. T. (2003). Privacy as a social issue and Jagadish, H. V. (2016). The values challenge for Big Data. behavioral concept. Journal of Social Issues, 59(2), 243– In Bulletin of the IEEE computer society technical committee 261. https://doi.org/10.1111/1540-4560.00063 on data engineering (pp. 77–84). Marreiros, H., Tonin, M., Vlassopoulos, M., & Schraefel, Jentzsch, N., Preibusch, S., & Harasser, A. (2012). Study M. C. (2017). “Now that you mention it”: A survey on monetising privacy. An economic model for pricing experiment on information, inattention and online pri- personal information. Heraklion, Greece: European Net- vacy. Journal of Economic Behavior & Organization, 140, work and Agency (ENISA). 1–17. https://doi.org/10.1016/j.jebo.2017.03.024 John, L., Acquisti, A., & Loewenstein, G. (2011). Strangers Martin, K. (2020). Breaking the privacy paradox: The on a plane: Context-dependent willingness to divulge value of privacy and associated duty of firms. Business sensitive information. Journal of Consumer Research, 37 Ethics Quarterly, 30(1), 65–96. https://doi.org/10.1017/ (5), 858–873. https://doi.org/10.1086/656423 beq.2019.24 Johnson, B. (2020, January 11). Privacy no longer a social Martin, K. D., Borah, A., & Palmatier, R. W. (2017). Data norm, says Facebook founder. London: The Guardian. privacy: Effects on customer and firm performance. Retrieved August 3, 2020, from source https://www. Journal of Marketing, 81(1), 36–58. https://doi.org/10. theguardian.com/technology/2010/jan/11/facebook- 1509/jm.15.0497 privacy Martin, K., & Nissenbaum, H. (2016). Measuring privacy: Johnson, G. A., Shriver, S. K., & Du, S. (2020). Consumer An empirical test using context to expose confounding privacy choice in online advertising: Who opts out and variables. Columbia Science and Technology Law Review, at what cost to industry? Marketing Science, 39(1), 33– 18(1), 176–218. 51. https://doi.org/10.1287/mksc.2019.1198 McDonald, A. M., & Cranor, L. F. (2008). The cost of Kang, R., Dabbish, L., Fruchter, N., & Kiesler, S. (2015). reading privacy policies. Information System: A Journal of “My data just goes everywhere”: User mental models Law and Policy for the Information Society, 4, 543. of the Internet and implications for privacy and secu- McGeveran, W. (2016). Friending the privacy regulators. rity. In Eleventh symposium on usable privacy and security Arizona Law Review, 58, 959. (SOUPS 2015) (pp. 39–52). Melumad, S., & Meyer, R. (2020). Full disclosure: How KFF (2020, April). Coronavirus, social distancing, and contact smartphones enhance consumer self-disclosure. Journal tracing. Health tracking poll. San Francisco: KFF. of Marketing, 84(3), 28–45. https://doi.org/10.1177/ Retrieved August 3, 2020, from source https://www. 0022242920912732 kff.org/global-health-policy/issue-brief/kff-health-track Miller, A. R., & Tucker, C. (2009). Privacy protection and ing-poll-late-april-2020/ technology diffusion: The case of electronic medical Kokolakis, S. (2017). Privacy attitudes and privacy beha- records. Management Science, 55(7), 1077–1093. viour: A review of current research on the privacy https://doi.org/10.1287/mnsc.1090.1014 paradox phenomenon. Computers & Security, 64, 122– Miller, C. C. (2014, November 12). Americans say they 134. https://doi.org/10.1016/j.cose.2015.07.002 want privacy, but act as if they don’t.NewYork: Kunreuther, H., Ginsberg, R., Miller, L., Sagi, P., Slovic, New York Times. Retrieved August 3, 2020, from P., Borkan, B., & Katz, N. (1978). Disaster insurance source https://www.nytimes.com/2014/11/13/up protection: Public policy lessons. Retrieved August 3, shot/americans-say-they-want-privacy-but-act-as-if- 2020, from source https://www.worldcat.org/title/dis they-dont.html aster-insurance-protection-public-policylessons/oclc/ Moore, B. Jr (1984). Privacy: Studies in social and cultural 3604263 history. Retrieved August 3, 2020, from source https:// Laudon, K. C. (1996). Markets and privacy. Communica- www.routledge.com/Revival-Privacy-Studies-in-Social- tions of the ACM, 39(9), 92–104. https://doi.org/10. and-Cultural-History-1984-Studies/Moore-Jr/p/book/ 1145/234215.234476 9781138045262 Lewis, B. (2017, November 7). Americans say data privacy Murphy, R. F. (1964). Social distance and the veil. Ameri- is important, but few take steps to protect themselves. Insta- can Anthropologist, 66(6), 1257–1274. https://doi.org/10. motor. Retrieved August 3, 2020, from source https:// 1525/aa.1964.66.6.02a00020 instamotor.com/blog/online-data-privacy-survey Narayanan, A., & Shmatikov, V. (2008). Robust de- Madejski, M., Johnson, M., & Bellovin, S. M. (2012, anonymization of large sparse datasets. In 2008 IEEE March). A study of privacy settings errors in an online symposium on security and privacy. IEEE. social network. In 2012 IEEE international conference on Nissenbaum, H. (2009). Privacy in context: Technology, pol- pervasive computing and communications workshops (pp. icy, and the integrity of social life. Palo Alto, CA: Stanford 340–345). IEEE. University Press. Secrets and Likes 757

Noam, E. M. (1997). Privacy and self-regulation: Markets Regan, P. M. (1995). Legislating privacy: Technology, social for electronic privacy. In W. M. Daley & L. Irving values, and public policy. Chapel Hill, NC: The Univer- (Eds.), privacy and self-regulation in the information age sity of North Carolina Press. (pp. 21–33). Washington, DC: US Department of Com- Romanosky, S., Telang, R., & Acquisti, A. (2011). Do data merce. breach disclosure laws reduce identity theft? Journal of Norberg, P. A., Horne, D. R., & Horne, D. A. (2007). The Policy Analysis and Management, 30(2), 256–286. privacy paradox: Personal information disclosure inten- https://doi.org/10.1002/pam.20567 tions versus behaviors. Journal of Consumer Affairs, 41 Rosenberg, M., Confessore, N., & Cadwalladr, C. (2018, (1), 100–126. https://doi.org/10.1111/j.1745-6606.2006. March 17). How Trump consultants exploited the Facebook 00070.x data of millions. New York: New York Times. Retrieved Olson, M. (1965). The logic of collective action. Cambridge, August 3, 2020, from source https://www.nytimes.c UK: Cambridge University Press. om/2018/03/17/us/politics/cambridge-analytica- Palen, L., & Dourish, P. (2003). Unpacking" privacy" for a trump-campaign.html. networked world. In Proceedings of the SIGCHI confer- Sanchez-Rola, I., Dell’Amico, M., Kotzias, P., Balzarotti, D., ence on human factors in computing systems (pp. 129–136). Bilge, L., Vervier, P. A., & Santos, I. (2019, July). Can I opt Panzarino, M. (2019, March 19). Apple ad focuses on out yet? GDPR and the global illusion of cookie control. iPhone’s most marketable feature: Privacy. Bay Area, CA: In Proceedings of the 2019 ACM Asia conference on computer Techcrunch. Retrieved August 3, 2020, from source and communications security (pp. 340–351). https://techcrunch.com/2019/03/14/apple-ad-focuses- Satariano, A. (2019, September 15). Real-time surveillance on-iphones-most-marketable-feature-privacy/ will test the British tolerance for cameras. New York: New Pennebaker, J. W. (1997). Opening up: The healing power of York Times. Retrieved August 3, 2020, from source emotional expression. New York: Guilford. https://www.nytimes.com/2019/09/15/technology/ Penney, J. W. (2016). Chilling effects: Online surveillance britain-surveillance-privacy.html. and Wikipedia use. Berkeley Technology Law Journal, 31, Savage, S. J., & Waldman, D. M. (2015). Privacy trade- 117. offs in smartphone applications. Economics Letters, Petronio, S. (2002). Boundaries of privacy: Dialectics of disclo- 137, 171–175. https://doi.org/10.1016/j.econlet.2015. sure. Albany, NY: Suny Press. 10.016 Pew Research Center (2012). Privacy management on social Schoeman, F. (1984). Privacy: Philosophical dimensions. media sites. Washington, DC: Pew Research Center. American Philosophical Quarterly, 21(3), 199–213. Retrieved August 3, 2020, from source https://www.pe Sloan, R. H., & Warner, R. (2016). The self, the Stasi, and wresearch.org/internet/2012/02/24/privacy-mana NSA: Privacy, knowledge, and complicity in the gement-on-social-media-sites/ surveillance state. Minnesota Journal of Law, Science and Pew Research Center (2013). Anonymity, privacy and secu- Technology, 17, 347. rity online. Washington, DC: Pew Research Center. Slovic, P. (1995). The construction of preference. American Retrieved August 3, 2020, from source https://www.pe Psychologist, 50(5), 364–371. https://doi.org/10.1037/ wresearch.org/internet/2013/09/05/anonymity-privac 0003-066X.50.5.364 y-and-security-online/ Solove, D. J. (2005). A taxonomy of privacy. University of Pew Research Center (2015). Americans’ privacy strategies Pennsylvania Law Review, 154, 477. post-Snowden. Washington, DC: Pew Research Center. Solove, D. J. (2007). I’ve got nothing to hide and other Retrieved August 3, 2020, from source https://www.pe misunderstandings of privacy. San Diego Law Review, wresearch.org/wp-content/uploads/sites/9/2015/03/ 44, 745. PI_AmericansPrivacyStrategies_0316151.pdf Solove, D. J. (2012). Introduction: Privacy self-manage- Pew Research Center (2019). Americans and privacy: Con- ment and the consent dilemma. Harvard Law Review, cerned, confused and feeling lack of control over their per- 126, 1880. sonal information. Washington, DC: Pew Research Solove, D. (2021, forthcoming). The Myth of the Privacy Center. Retrieved August 3, 2020, from source https:// Paradox. George Washington Law Review, 89. www.pewresearch.org/internet/wp-content/uploads/ Spiekermann, S., Grossklags, J., & Berendt, B. (2001, Octo- sites/9/2019/11/Pew-Research-Center_PI_2019.11.15_ ber). E-privacy in 2nd generation e-commerce: Privacy Privacy_FINAL.pdf preferences versus actual behavior. In Proceedings Posner, R. A. (1978). Economic theory of privacy. Regula- of the 3rd ACM conference on electronic commerce (pp. 38– tion, 2,19–26. 47). Preibusch, S., Kubler,€ D., & Beresford, A. R. (2013). Price Sprenger, P. (1999, January 26). Sun on privacy: “Get over versus privacy: An experiment into the competitive it”. San Francisco, CA: Wired. Retrieved August 3, advantage of collecting less personal information. Elec- 2020, from source https://www.wired.com/1999/ tronic Commerce Research, 13(4), 423–455. https://doi. 01/sun-on-privacy-get-over-it/. org/10.1007/s10660-013-9130-3 Statista (2016, November 23). Number of mobile phone users Quine, W. V. (1976). The ways of paradox. Cambridge, MA: worldwide from 2015 to 2020. Hamburg, Germany: Sta- Harvard University Press. tista. Retrieved August 3, 2020, from source https:// 758 Acquisti et al.

www.statista.com/statistics/274774/forecast-of-mobile- Weiss, M. (2019, June 5). Digiday research: Most publishers phone-users-worldwide/. don’t benefit from behavioral ad targeting. New York: Digi- Stutzman, F. D., Gross, R., & Acquisti, A. (2013). Silent day. Retrieved August 3, 2020, from source https:// listeners: The evolution of privacy and disclosure on digiday.com/media/digiday-research-most-publishers- Facebook. Journal of Privacy and Confidentiality, 4(2), 7– dont-benefit-from-behavioral-ad-targeting/. 41. https://doi.org/10.29012/jpc.v4i2.620 Westin, A. (1967). Privacy and freedom. New York: Athe- Svirsky, D. (2019). Three experiments about human behavior neum. and legal regulation. Doctoral dissertation, Harvard Westin, A. (2001). Testimony on “Opinion surveys: What University, Graduate School of Arts & Sciences. consumers have to say about ”. Tamir, D. I., & Mitchell, J. P. (2012). Disclosing informa- Hearing before the Subcommittee on Commerce, Trade tion about the self is intrinsically rewarding. Proceedings and Consumer Protection. Serial No. 107-35. Retrieved of the National Academy of Sciences of the United States of August 3, 2020, from source https://www.govinf America, 109(21), 8038–8043. https://doi.org/10.1073/ o.gov/content/pkg/CHRG-107hhrg72825/html/ pnas.1202129109 CHRG-107hhrg72825.htm. Taylor, H. (2001). Testimony on “Opinion surveys: What White, T. B., Novak, T. P., & Hoffman, D. L. (2014). No consumers have to say about information privacy”. strings attached: When giving it away versus making Hearing before the Subcommittee on Commerce, Trade them pay reduces consumer information disclosure. and Consumer Protection. Serial No. 107-35. Retrieved Journal of Interactive Marketing, 28(3), 184–195. https://d August 3, 2020, from source https://www.govinf oi.org/10.1016/j.intmar.2014.02.002 o.gov/content/pkg/CHRG-107hhrg72825/html/ Wong, J. C. (2019, March 18). The Cambridge Analytica CHRG-107hhrg72825.htm. scandal changed the world: But it didn’t change Facebook. Tsai, J. Y., Egelman, S., Cranor, L., & Acquisti, A. (2011). London: The Guardian. Retrieved August 3, 2020, from The effect of online privacy information on purchasing source https://www.theguardian.com/technology/ behavior: An experimental study. Information Systems 2019/mar/17/the-cambridge-analytica-scandal-cha Research, 22(2), 254–268. https://doi.org/10.1287/isre. nged-the-world-but-it-didnt-change-facebook. 1090.0260 Wong, R. Y., & Mulligan, D. K. (2019). Bringing design to Varian, H. R. (1996). Economic aspects of personal privacy. the privacy table: Broadening “design” in “privacy by In Privacy and self-regulation in the information age.Wash- design” through the lens of HCI. In Proceedings of the ington, DC: National Telecommunications and Informa- 2019 CHI conference on human factors in computing sys- tion Administration, US Department of Commerce. tems. Vitak, J., & Ellison, N. B. (2013). “There’s a network out there Zhang, B., & Xu, H. (2016). Privacy nudges for mobile you might as well tap”: Exploring the benefits of and bar- applications: Effects on the creepiness emotion and pri- riers to exchanging informational and support-based vacy attitudes. In Proceedings of the 19th ACM conference resources on Facebook. New Media & Society, 15(2), 243– on computer-supported cooperative work & social comput- 259. https://doi.org/10.1177/1461444812451566 ing. Vitak, J., & Kim, J. (2014). "You can’t block people off- Zuboff, S. (2015). Big other: Surveillance capitalism and line": Examining how Facebook’s affordances shape the the prospects of an information civilization. Journal of disclosure process. In Proceedings of the 17th ACM con- , 30(1), 75–89. https://doi.org/10. ference on computer supported cooperative work & social 1057/jit.2015.5 computing (pp. 461–474).