
Journal of Economic Literature 2016, 54(2), 442–492 http://dx.doi.org/10.1257/jel.54.2.442

The Economics of Privacy †

Alessandro Acquisti, Curtis Taylor, and Liad Wagman*

This article summarizes and draws connections among diverse streams of theoretical and empirical research on the economics of privacy. We focus on the economic value and consequences of protecting and disclosing personal information, and on consumers’ understanding and decisions regarding the trade-offs associated with the privacy and the sharing of personal data. We highlight how the economic analysis of privacy evolved over time, as advancements in information technology raised increasingly nuanced and complex issues. We find and highlight three themes that connect diverse insights from the literature. First, characterizing a single unifying economic theory of privacy is hard, because privacy issues of economic relevance arise in widely diverse contexts. Second, there are theoretical and empirical situations where the protection of privacy can both enhance and detract from individual and societal welfare. Third, in digital economies, consumers’ ability to make informed decisions about their privacy is severely hindered because consumers are often in a position of imperfect or asymmetric information regarding when their data is collected, for what purposes, and with what consequences. We conclude the article by highlighting some of the ongoing issues in the privacy debate of interest to economists. ( JEL D82, D83, G20, I10, L13, M31, M37)

1. Why an Economics of Privacy

The value and regulation of information assets have been among the most interesting areas of economic research since Friedrich Hayek’s 1945 treatise on the use of knowledge in society. Contributions to what has become known as the field of information economics have been among the most influential, insightful, and intriguing in the profession.

* Acquisti: Heinz College, Carnegie Mellon University. Taylor: Department of Economics, Duke University. Wagman: Stuart School of Business, Illinois Institute of Technology. We are grateful for useful comments and suggestions provided by the reviewing team, as well as by Omri Ben-Shahar, Rainer Boheme, Alessandro Bonatti, Ryan Calo, James Cooper, Avi Goldfarb, Jens Grossklags, Chris Hoofnagle, Kai-Lung Hui, Nicola Jentzsch, Jin-Hyuk Kim, Brad Malin, Helen Nissenbaum, Andrew Odlyzko, Randal Picker, Ivan Png, Soren Preibusch, Giancarlo Spagnolo, Lior Strahilevitz, Catherine Tucker, Tommaso Valletti, Hal Varian, and Frederik Zuiderveen Borgesius. Acquisti gratefully acknowledges support from the Alfred P. Sloan Foundation and the Carnegie Corporation of New York via an Andrew Carnegie Fellowship. Taylor’s research was supported in part by NSF grant SES-1132187. Wagman gratefully acknowledges support from the Yahoo Faculty Research and Engagement Program. The statements made and views expressed in this manuscript are solely the responsibility of the authors. All errors are the authors’ own.
† Go to http://dx.doi.org/10.1257/jel.54.2.442 to visit the article page and view author disclosure statement(s).


Seminal studies have investigated the informative role of prices in market economies (Stigler 1961); the creation of knowledge and the incentives to innovate (Arrow 1962); the prevalence of asymmetric information and adverse selection (Akerlof 1970); the transmission of private information through signaling activity (Spence 1973); and voluntary disclosures (Grossman 1981; Milgrom 1981). It may be proper, however, to think of information economics not as a single field, but as an amalgam of many related subfields. One such subfield now receiving growing attention by economists is the subject of this article: the study of privacy.

Privacy is difficult to define. It means different things to different people. It has been described as the protection of someone’s personal space and their right to be left alone (Warren and Brandeis 1890); the control over and safeguarding of personal information (Westin 1967); and an aspect of dignity, autonomy, and ultimately human freedom (Schoeman 1992). While seemingly different, these definitions are related, because they pertain to the boundaries between the self and the others, between private and shared, or, in fact, public (Altman 1975). As individuals and as consumers, we constantly navigate those boundaries, and the decisions we make about them determine tangible and intangible benefits and costs, for ourselves and for society. Thus, at its core, the economics of privacy concerns the trade-offs associated with the balancing of public and private spheres between individuals, organizations, and governments. Economists’ interest in privacy has primarily focused on its informational dimension: the trade-offs arising from protecting or sharing personal data.1 Other subfields of information economics therefore relate to the topic of this article, because they pertain to the trade-offs arising from the public or private status of information. For instance, an auction may be structured in such a way that its participants will reveal their true costs or valuations, or a tax mechanism may be designed so that the agents will truthfully reveal their types. However, whereas research on auctions and optimal taxation may pertain to the private information of abstract economic agents (which could be consumers, firms, or other entities), the field of privacy economics, which is our focus, pertains more specifically to personal information of actual individuals. As a consequence, of course, the field is often influenced by research in the other branches of information economics.

1 Some of the economic issues we consider in this article arise when personal information is or becomes no longer private because it is shared with, or accessed by, one or more entities (for instance, an email provider monitoring a user’s messages). Other issues arise when that information is or is made public, and thus possibly accessible by a multitude of entities (for instance, a member of a social networking site publicly sharing personal information on her profile). In this article, we will sometimes use the term “public” to refer to both types of scenarios—that is, to information that is no longer private.

This article reviews the theoretical and empirical economic literature investigating individual and societal trade-offs associated with sharing and protecting personal data. In particular, it focuses on the flow and use of information about individuals by firms. In so doing, the article identifies a number of key themes. One theme is that characterizing a single unifying economic theory of privacy is hard, because privacy issues of economic relevance arise in widely diverse contexts. Nevertheless, we are able, within a given context, to identify a number of robust theoretical insights emerging from the literature. A second key theme is that both economic theory and empirical analysis of privacy expose varying scenarios. In some, privacy protection can decrease individual and societal welfare; in others, privacy protection enhances it. Thus, it is not possible to conclude unambiguously whether privacy protection entails a net “positive” or “negative” change in purely economic terms: its impact is context specific. A third key theme relates to the observation that consumers are rarely (if ever) completely aware of privacy threats and the consequences of sharing and protecting their personal information. Hence, market interactions involving personal data often take place in the absence of individuals’ fully informed consent. Furthermore, specific heuristics may profoundly influence consumers’ privacy decision making.

1.1 The Value of Personal Data and the Value of Privacy

Economists’ interest in informational privacy, generally intended as the control or protection of personal information, can be readily understood: the protection and disclosure of personal data are likely to generate trade-offs with tangible economic dimensions. The transition of modern economies toward the production of knowledge and recent radical advancements in information technology (in particular, the rise of the Internet) have vastly enlarged the amount of individual information that can be collected, stored, analyzed, and repurposed for new uses. The ascent of the so-called Web 2.0 (blogs, social media, online social networks) has rendered individuals no longer mere consumers of information, but public producers of often highly personal data. The spread of mobile computing and sensor technologies has blurred the distinctions between digital and physical, online and offline. All of this has led to services that simultaneously generate and capture digital trails of personal and professional activities—activities that were previously conducted in private and left little or no trace.2 Simultaneously, the Internet has evolved from an architecture of decentralized and possibly anonymous interactions (Berners-Lee 2000), to one where packets of data capturing all types of behaviors (from reading to searching, from relaxing to communicating) are uniquely (Bendrath and Mueller 2011) and sometimes personally (Xie, Yu, and Abadi 2009) identified. In this environment, a few “gatekeeper” firms are in a position to control the tracking and linking of those behaviors across platforms, online services, and sites—for billions of users. As a result, chronicles of people’s actions, desires, interests, and mere intentions are collected by third parties, often without individuals’ knowledge or explicit consent, with a scope, breadth, and detail that are arguably without precedent in human history.

2 For instance, the act of listening to music online using a streaming service (as opposed to buying a CD in a physical store) can be captured by the streaming service. The streaming service thus can know the songs to which the user listened, from where, for how long, or how many times. This data can be combined with other information about the individual, and then used in various manners: to compile a profile of the listener; to infer his or her other interests and preferences; to present him or her with targeted advertising; or to sell his or her information to data aggregators or other parties.

Such vast amounts of collected data have obvious and substantial economic value. Individuals’ traits and attributes (such as a person’s age, address, gender, income, preferences, and reservation prices, but also her clickthroughs, comments posted online, photos uploaded to social media, and so forth) are increasingly regarded as business assets that can be used to target services or offers, provide relevant advertising, or be traded with other parties. In an effort to leverage the value inherent in personal data, new services (such as search engines and recommender systems), new companies (such as social networking sites and blogging platforms), and even new markets have emerged—such as markets for “crowdsourcing” (Schenk and Guittard 2011), or a complex online advertising ecosystem (Evans 2009). Existing services such as travel agencies, record companies, and news media have also been affected and, in some cases, transformed.

The tools and products made possible by the increased availability of personal data have borne benefits for data subjects and data holders alike. Despite those benefits, public concerns over personal privacy have increased. With the advent of the Internet and data analytics, issues surrounding the protection or sharing of personal data have emerged as crucial nexuses of economic and policy debate.3 Over the years, national surveys have consistently found widespread evidence of significant privacy concerns among Internet users.4 From the standpoint of self-interested individual behavior, the economic motive behind concerns for privacy is far from irrational. It is nearly self-evident. If it is true that information is power, then control over personal information can affect the balance of economic power among parties. Thus, privacy can simultaneously be a source of protection from the economic leverage a data holder could otherwise hold over the data subject (if the merchant figures out how little you know about the product you are browsing, he may steer you towards merchandise or prices that serve his interests better than yours), as well as a tool the data subject may strategically use against the nonholder (if the salesperson cannot estimate your reservation price, you may be able to exploit this information asymmetry to cut a nice bargain).

3 Consider, for instance, the 2013 White House report on “Big Data: Seizing Opportunities, Preserving Values,” available at http://www.whitehouse.gov/sites/default/files/docs/big_data_privacy_report_5.1.14_final_print.pdf.

4 For instance, a Pew Research Center survey of 1,002 adult users conducted in 2013 found that 86 percent had taken steps online to remove or mask their digital footprints, and 68 percent believed that current laws were not good enough in protecting online privacy (Rainie et al. 2013).

Privacy is not the opposite of sharing—rather, it is control over sharing. For the individual, therefore, the potential benefits of strategically sharing certain data while protecting other data are quite apparent. So are the potential costs of having too much information disclosed to the wrong parties (from price discrimination to other more odious forms of discrimination; from social stigma to blackmailing; from intangible nuisances to identity theft). Equally apparent, however, are the costs that others may incur when they find themselves in a position of information asymmetry and have less information than the subject. For instance, the security firm that cannot conduct background checks on job applicants may end up hiring the wrong employees. As Posner (1981) points out, privacy is redistributive—as is, of course, the lack of privacy.

Beyond mere questions of redistribution, the trade-offs associated with protecting or sharing personal information are nuanced for both the data subject and for the market as a whole (as well as society). First, individuals can directly benefit from sharing their data. Advantages can be both psychological (Tamir and Mitchell 2012) and economic: for instance, the personalized services and discounts one receives after joining a merchant’s loyalty program, or the reduced search costs and increased accuracy of information retrieval one experiences when a search engine tracks them more closely. Those benefits turn into opportunity costs when the individual chooses not to reveal certain personal data.

Second, both positive and negative externalities arise through the complex interplay of data creation and transmission. In particular, the benefits arising from individuals sharing their information, because of advances in data mining, may be enjoyed by society as a whole. For instance, aggregation of online searches may unveil unexpected interactions between pharmaceutical drugs (White et al. 2013), or possibly provide early alerts for epidemics (Dugas et al. 2012).5

5 For a critique of those very claims, however, see Lazer et al. (2014) and section 4.4.

Conversely, other individuals’ comfort with sharing data (“I have nothing to hide”) may legitimize expansions of intrusive surveillance programs that affect the rest of society. Society may suffer when certain behaviors stay hidden (consider insider trading, or social progress being delayed and social norms failing to evolve because of individuals’ fears of disclosing legitimate but fringe opinions); but society may also benefit when other information is suppressed (around the world, various jurisdictions allow certain juvenile criminal records to be expunged with the belief that unfettered reintegration of minors has positive social value). Similarly, an individual may personally benefit from other people’s sharing (for instance, collaborative filtering of other users’ movie ratings may produce accurate viewing recommendations); conversely, an individual may pay a price when a merchant’s analytical tools permit the latter to accurately predict the reservation price of the former, based on the past behavior of other consumers. In fact, even an individual’s costs (and ability) to protect her information may be a function of the disclosure choices made by others. That “anonymity loves crowds” is a common refrain in the literature on privacy-enhancing technologies, reflecting the observation that, online as offline, it is easier to hide as one among many who look alike. Conversely, protecting one’s data becomes increasingly costly the more others reveal about themselves (for instance, the success of online social networks has encouraged other entities, such as online news sites, to require social media user IDs in order to enjoy some of their services, thus curtailing users who do not want to create social media accounts), or altogether infeasible (even if an individual chooses to protect certain data, that data may still be inferred through the analysis of similar individuals who did not choose to protect theirs; see, e.g., Jernigan and Mistree 2009).

Analyzed as economic goods, privacy and personal information reveal other, peculiar characteristics. First, when shared, personal information can have characteristics of a public good, such as nonrivalry and nonexcludability (a complex online advertising ecosystem engages in trades of Internet users’ personal information; in fact, it is hard to prevent released data from being duplicated and accessed by other parties, or to control its secondary uses). And yet, one of the core tenets of informational privacy is the ability to keep that information protected—that is, to exclude someone from knowing or using certain information. The value of keeping some personal information protected and the value of it being known are almost entirely context-dependent and contingent on essentially uncertain combinations of states of the world. Furthermore, privacy sensitivities and attitudes are subjective and idiosyncratic, because what constitutes sensitive information differs across individuals. Specifically, individuals differ in what they may experience if some private information were to be shared with others or made public, as well as in their beliefs that the information may in fact be released. For instance, the healthy individual who just lost his job may flaunt his active lifestyle on social media, but hide his unemployment status to avoid shame; the reverse may be true for the affluent manager who was just diagnosed with a sexually transmitted disease. Different pieces of information will matter differently to different people (your piano teacher may not be as interested in the schools you attended as your potential employer). The value of information will change over time (an online advertiser may not be as interested in logs of your online activity from five years ago as in your activity right now). In fact, the value and sensitivity of one piece of personal information will change depending on the other pieces of data with which it can be combined (your state of birth and your date of birth, alone, may not uniquely identify you; together, they may allow the prediction of your Social Security number with some accuracy; see, e.g., Acquisti and Gross 2009).

Second, disclosing data often causes a reversal of informational asymmetries: beforehand, the data subject may know something the data holder does not (for instance, a customer’s willingness to pay for a good); afterwards, the data subject may not know what the data holder will do with their data, and with what consequences (for instance, how the merchant will use the customer’s information, including estimates of her reservation price, following a purchase). As a result, privacy trade-offs are also inherently intertemporal: disclosing data often carries an immediate benefit, be it intangible (friends “liking” your online status updates) or tangible (a merchant offering you a discount). The costs of doing so are often uncertain, and are generally incurred at a more distant point in time (a future prospective employer may not like that risque photo you had uploaded from vacation as much as your friends did at the time; a merchant may collect information about you today, and use it for price discrimination the next time you visit its store).

Third, privacy trade-offs often mix the tangible (the discount I will receive from the merchant; the increase in premium I will pay to the insurer), with the intangible (the psychological discomfort I experience when something very personal is exposed without my consent), and the nearly incommensurable (the effect on society of surveillance; the loss of autonomy we endure when others know so much about us).

Fourth, privacy has elements of both a final good (one valued for its own sake), and an intermediate good (one valued for instrumental purposes; see, e.g., Farrell 2012). Attitudes towards privacy mainly capture subjective preferences; that is, people’s valuations of privacy as a good in itself (privacy as a final good). But those valuations are separate from the actual trade-offs that arise following the protection or sharing of personal data (from price discrimination to identity theft; from coupons to personalized services)—that is, from the value of privacy as an intermediate good (for instance, regardless of whether an individual thinks “my life is an open book, I have nothing to hide,” that individual will still suffer tangible harm if she is a victim of identity theft).

Fifth, it is not always obvious how to properly value privacy and personal data. Should the reference point be the price one would accept to give away their data, or the amount they would pay to protect it? Or, should it be the expected cost the data subject may suffer if her data is exposed, or the expected profit the data holder can generate from acquiring her personal information? For most products and services that economists traditionally study, the way to address these questions is generally self-evident: the market captures the accurate price of privacy and personal data, reflecting the reservation prices of different buyers (data holders) and sellers (data subjects). However, there is yet no open, recognized market for personal data in which data subjects themselves can participate. Personal data is continuously bought, sold, and traded among firms (from credit-reporting agencies to advertising companies to so-called “infomediaries,” which buy, sell, and trade personal data), but consumers themselves do not have access to those markets: they cannot yet efficiently buy back their data, or offer their data for sale (although the concept of personal-information markets for consumers, or individuals’ markets for privacy, has been around since the mid-1990s; see, e.g., Laudon 1996, and section 2.1 of this survey). Moreover, issues associated with individuals’ awareness of privacy challenges, solutions, and trade-offs cast doubts over the ability of market outcomes to accurately capture and reveal, by themselves, individuals’ true privacy valuations (Berthold and Böhme 2010). However, individuals do engage daily in transactions involving their personal data. With a query on a search engine, the searcher is implicitly selling information about her current interests in exchange for finding relevant results. By using an online social network, members are implicitly selling information about their interests, demographics, and networks of friends and acquaintances in exchange for a new method of interacting with them. Applying the principle of revealed preference, we could infer people’s valuations for their personal data by observing their usage of those tools. However, for service providers, data trading is the essence of the transaction, whereas from the perspective of the data subject, the trade of personal data is a secondary, mostly inconspicuous, and often altogether invisible aspect of a different, more salient transaction (having a question answered, interacting with peers online, and so forth).

1.2 Focus of the Survey

Information asymmetries regarding the usage and subsequent consequences of shared information, as well as heuristics studied by behavioral decision researchers, raise questions regarding individuals’ abilities, as rational consumers, to optimally navigate privacy trade-offs. They raise questions about the extent to which individual responsibility, market competition, and government regulation can steer the market towards a balance of disclosure and protection of personal data that best serves the interests of the different parties. These observations bring us to even more questions: Are there privacy “equilibria” that benefit both data holders and data subjects? What is the allocation of surplus gained from the usage of individuals’ personal data? How should that surplus be allocated—based on market forces, treating privacy as another economic good, or based on regulation, treating privacy as a fundamental right? And should an allocation favor the data subject as the owner of the data, or the data holder who invested in collecting and analyzing the information?

The studies we review in the remainder of this article investigate these diverse issues. The review focuses on the economic value and consequences of privacy and personal information, and on consumers’ understanding of and decisions about the costs and benefits associated with data protection and data sharing. In investigating these issues, we focus more on microeconomic than macroeconomic analyses. We focus on scholarly work published in economic journals—although, due to the nature of the subject, we also draw from fields such as psychology, marketing, information systems, and computer science. We begin with a survey of the theoretical literature on privacy (section 2). The survey highlights how the economic analysis of privacy evolved over time, as advancements in information technology raised increasingly nuanced and complex issues associated with the protection and sharing of personal information. A key theme emerging from this literature is that there is no unequivocal impact of privacy protection (or sharing of information) on welfare. Depending on context and conditions, privacy can either increase or decrease individual as well as societal welfare. Next, we survey the empirical literature on privacy trade-offs, as well as what is known about consumers’ attitudes and behaviors towards privacy (section 3).
The review of the empirical work on privacy reveals various insights. First, it confirms the principal theme arising from the theoretical literature: empirical evidence exists both for scenarios in which the protection of privacy slows innovation or decreases economic growth and scenarios in which the opposite is the case. A second insight highlights consumers’ inability to make informed decisions about their privacy, due to their being often in a position of imperfect information regarding when their data is collected, for what purposes, and with what consequences. A third insight relates to heuristics that can profoundly influence privacy decision making, since privacy trade-offs are intertemporal in nature and often uncertain. Finally, we highlight current issues in the privacy debate that may be of interest to economists (section 4).

Previous scholarship has distinguished different dimensions of privacy such as seclusion, secrecy, solitude, anonymity, autonomy, freedom, and so forth.6 As noted, this review focuses on informational privacy. Even under such a narrow focus, however, different dimensions and definitions of privacy emerge from the literature, such as privacy as control over usage versus privacy as protection against access of personal information. Thus, this article covers studies ranging from those that aim to capture individuals’ willingness to pay to protect their data, to those that capture the economic consequences of sharing or protecting data. Furthermore, when appropriate, the review touches upon other dimensions of informational privacy, such as the value of anonymity (which is a form of privacy for identity information: it removes the link between a person and data items relating to that person); or the economic dimensions of spam or the do-not-call registry (which relate to intrusions of a person’s cyberspace made possible by knowledge of her information); or the burgeoning literature on the economics of information security (which sometimes relates to privacy, for instance in studies of data breaches or identity theft that involve personal data, but more often relates to the protection of information infrastructures and other types of informational assets).

6 For a taxonomy, see Solove (2007).

The diversity of privacy definitions and scenarios is reflected in the selection of manuscripts in this review. Figure 1 in the online appendix depicts some of the connections among the different areas of privacy that we survey. The reader should not hope to find a unified theory of privacy, nor a single framework incorporating and connecting the diverse scholarly contributions we review. Privacy means different things to different people, and privacy issues with economic relevance arise in the most diverse contexts: from price discrimination to identity theft; from spam to targeted advertising. What connects these diverse definitions and scenarios is that they all involve the negotiation and management of the boundaries between private and public, and that those boundaries determine tangible and intangible trade-offs. Some of those privacy trade-offs may not just be intangible, but in fact immeasurable. The economics of privacy focuses on measurable, or at least assessable, privacy components. Some (perhaps, many) of the consequences of privacy protection, and its erosion, go beyond the economic dimensions—for instance, the intrinsic value of privacy as a human right, or individuals’ innate desires for privacy regardless of the associated economic benefits or lack thereof. Using economics to study privacy does not imply the belief that such other, noneconomic dimensions do not exist or are unimportant. Quite the opposite: we acknowledge those dimensions but do not focus on them. We urge the reader to keep that in mind when considering the broader policy implications of the economics of privacy.
2. The Economic Theory of Privacy

In this section, we discuss three waves of research in the economics of privacy: an early wave dating back to the 1970s and early ’80s; a middle wave active in the 1990s; and a more recent and growing third wave. For illustrative purposes, several simple and parsimonious algebraic examples appear throughout the discussion. Due to the many diverse scenarios in which issues of informational privacy arise, and their many dimensions, the examples we offer are not meant to represent any particular model or class of models, but rather to illustrate the complexity inherent in privacy trade-offs and in any potential regulation.

2.1 The First Wave

While privacy is far from a modern concept (Westin 1967; Schoeman 1984; Ariès 1987),7 the extraordinary advances in information technology that have occurred since the second half of the twentieth century have brought it to the forefront of public debate. A first wave of economic research consists of seminal works produced between the 1970s and early 1980s by Chicago School scholars such as George Stigler and Richard Posner, and competing arguments by scholars such as Hirshleifer (1971, 1980). By and large, this initial wave of work did not consist of formal economic models, but rather general economic arguments around the value or the damage that individuals, and society, may incur when personal information is protected, making potentially useful information unavailable to the marketplace.

7 Evidence of both a need and a desire for privacy, and a need and a desire for socializing and disclosing, can be found throughout history and across diverse societies; see, e.g., Acquisti, Brandimarte, and Loewenstein (2015).

Posner (1978, 1981) argues that the protection of privacy creates inefficiencies in the marketplace, since it conceals potentially relevant information from other economic agents. For instance, if a job seeker misrepresents her background and expertise to a hiring firm, protecting her personal information will negatively affect the firm’s hiring decision. Therefore, the protection of the former’s privacy comes at the cost of the latter’s profitability. Removing an individual’s personal information from the marketplace through privacy regulation ultimately transfers the cost of that person’s possibly negative traits onto other market participants (see, also, Posner 1993).

Similarly, Stigler (1980) argues that regulatory interference in the market for personal information is destined, at best, to remain ineffective. Because individuals have an interest in publicly disclosing favorable personal information and hiding negative traits, those who decide to protect their personal information (for instance, a debtor who does not want to reveal her credit history) are de facto signaling a negative trait. In this case, regulatory interventions blocking the flow of personal information would be redistributive and inefficient: economic resources and productive factors would end up being used inefficiently, or rewarded unfairly, because information about their quality had been removed from the marketplace.

However, Hirshleifer (1971, 1980) asserts that rational economic agents may end up inefficiently overinvesting in collecting personal information about other parties, and that assumptions of rational behavior underlying the Chicago School’s privacy models may fail to capture the complexity inherent in privacy decision making by individuals and organizations. Hirshleifer shows that, given equilibrium prices, the private benefit of information acquisition may outweigh its social benefit (for more recent examples, see Hermalin and Katz 2006; Burke, Taylor, and Wagman 2012; Wagman 2014). In a pure exchange setting, information may have no social value at all, because it results only in a redistribution of wealth from ignorant to informed agents.

While not temporally belonging to the first wave of privacy literature, Murphy (1995) and Daughety and Reinganum (2010) provide rebuttals to the Chicago School view. In particular, Daughety and Reinganum construct a model in which each individual cares about his reputation, but an individual’s actions generate externalities (public good or bad). Under a regime of publicity, individuals distort their actions to enhance or preserve their reputations, whereas under privacy, they choose their individually optimal level of the activity. Thus, for example, both private and public welfare can be increased when information about an individual checking into a drug or alcohol rehab center remains private; otherwise, the stigma associated with doing so could deter him from seeking treatment. Similarly, if a physician were not bound by confidentiality, a patient may not feel comfortable sharing all the relevant details of her condition. On the other hand, when charitable contributions are public, amounts donated may increase, because contributing raises the reputation of the donor.

In a similar vein, but even more fundamentally, Spence (1973) can be viewed through a lens of privacy regulation. From this perspective, signaling activity may—as the Chicago scholars suggest—reveal payoff-relevant private information, but the aggregate cost of the signaling activity may nevertheless outweigh the benefits. Indeed, as is well-known, there are situations in which banning signaling behavior altogether (that is, enforcing a regime of complete privacy) may result in a Pareto improvement. This is illustrated in the following example inspired by Gottlieb and Smetters (2011), who report that nine of the fifteen most selective MBA programs in the United States do not disclose student grades to prospective employers.8

8 See, also, the US Family Educational Rights and Privacy Act (FERPA), in connection to the privacy of student education records, http://www2.ed.gov/policy/gen/guid/fpco/ferpa/index.html, as well as the US Equal Employment Opportunity Commission (EEOC), which governs the federal legalities of information flows in hiring practices, http://www.eeoc.gov/laws/practices/.

Example 1 (Signaling and Privacy): Suppose that an MBA student of privately known ability $\theta \in [0, 1]$ can earn grades of $g \geq 0$ by incurring effort cost $g/\theta$. Upon graduating, her productivity on the job will be $\theta$. Firms make competitive wage offers to each graduate. In particular, if the business school publicly reports its grades, then firm $i$ makes an offer $w_i(g)$ to a graduate with grades $g$. On the other hand, if the business school keeps grades private, then firm $i$ must make the same offer $\bar{w}_i$ to all graduates. The utility of a type $\theta$ student is given by

$$U(w, g; \theta) = w - \frac{g}{\theta}.$$

It is straightforward to verify that if grades are public, then in the unique outcome of a least-cost separating equilibrium, a student of ability $\theta$ will earn grades of $g^*(\theta) = \theta^2/2$ and receive a wage offer $w^*(g^*(\theta)) = \theta$. Her equilibrium payoff, therefore, will be $U^*(\theta) = \theta/2$. On the other hand, if the business school keeps grades private, then each student will earn a wage $\bar{w} = E[\theta]$ and will “waste” no effort on grade seeking. Thus, privacy of grades is a Pareto optimal policy if and only if $U^*(1) \leq E[\theta]$, or $E[\theta] \geq 1/2$.
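Example 1’s welfare comparison is straightforward to verify numerically. The following Python sketch is ours rather than the authors’ (the example leaves the distribution of $\theta$ unspecified); it assumes, purely for illustration, abilities spread uniformly over (0, 1], so that $E[\theta] \approx 1/2$, and checks that grade privacy Pareto dominates disclosure exactly when $U^*(1) \leq E[\theta]$.

```python
import numpy as np

# Minimal numerical check of Example 1 (our illustration, not from the
# article). Assumption: ability theta is uniform on (0, 1], so E[theta] ~ 1/2.
thetas = np.linspace(0.01, 1.0, 100)  # grid of ability types (avoids theta = 0)
E_theta = thetas.mean()               # expected ability, roughly 0.5

# Public grades: least-cost separating equilibrium of the signaling game.
g_star = thetas**2 / 2                    # g*(theta) = theta^2 / 2
payoff_public = thetas - g_star / thetas  # U*(theta) = theta - g*/theta = theta/2

# Private grades: every graduate is paid E[theta] and exerts no signaling effort.
payoff_private = np.full_like(thetas, E_theta)

# Privacy is Pareto optimal iff even the top type weakly prefers it.
print("U*(1)    =", payoff_public[-1])          # 0.5
print("E[theta] =", round(E_theta, 3))          # ~0.505 on this grid
print("privacy Pareto optimal:", payoff_public[-1] <= E_theta)
print("share of types better off under privacy:",
      (payoff_private >= payoff_public).mean())  # 1.0 here
```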

2.2 The Second Wave

By and large, economists did not again exhibit particular interest in the economics of privacy for over a decade following the first wave of research. This changed in the mid-1990s, arguably because of progress in digital information technologies on multiple fronts (the proliferation of electronic databases and personal computers, the advent of the Internet, the diffusion of electronic mail), which led to a new set of economic issues involving the usage of personal data. This second wave is similar to the first in terms of a preference for articulating economic arguments, rather than formal models. However, it is differentiated from the first wave not just temporally, but also in terms of the specificity of the privacy scenarios considered, and the emergent awareness of the role of digital information technologies. Works produced in this wave began focusing on issues such as the role of cryptographic technologies in affecting economic trade-offs of data holders and data subjects, or the establishment of markets for personal data, as well as the economic implications of the secondary uses of personal information.

In particular, Varian (1997) observes that the development of low-cost technologies for data manipulation generates new concerns for personal information processing. Varian nonetheless recognizes that consumers may suffer privacy costs when too little personal information about them is being shared with third parties, rather than too much. The consumer, he notes, may rationally want certain information about herself known to other parties (for instance, a consumer may want her vacation preferences to be known by telemarketers in order to receive offers and deals from them that may actually interest her). The same consumer, however, may rationally not want too much information to be known by others—for instance, information about her willingness to pay for the deals in which she is interested. The line of reasoning in Varian (1997) echoes Stigler’s and Posner’s approaches, but adds to it novel concerns associated with the secondary usage of personal data. A consumer may rationally decide to share personal information with a firm because she expects to receive a net benefit from that transaction; however, she has little knowledge or control over how and by whom that data will later be used. The firm may sell the consumer’s data to third parties, which may lead to spam and adverse price discrimination, among other concerns (Odlyzko 2003). Such negative externalities may not be internalized by the consumer nor by the firm that distributes the information (Swire and Litan 1998).

Whereas Varian points out a possible individual cost from data protection (e.g., receiving irrelevant rather than relevant offers), a possible social cost of privacy is highlighted by Friedman and Resnick (2001). Friedman and Resnick focus on the availability of easy “identity changes” (for instance, cheap pseudonyms). Using a repeated prisoner’s dilemma game, they show that distrust of newcomers is an inherent social cost of cheap pseudonyms—privacy of identity can be a barrier to trust building. However, it does not need to be: the authors also show that there are intermediate forms of identity protection that minimize those social costs, thereby providing both some degree of privacy and some degree of accountability.

Who, then, should hold an economic claim over personal data? The subject to whom the data refers or the organization that invested resources in collecting the data? In accordance with the Coase theorem (Coase 1960), Noam (1997) argues that whether or not a consumer’s data will remain protected does not depend on the initial allocation of rights on personal information protection—that is, it does not depend on the presence or lack of a privacy regulatory regime. Instead, whether data will eventually be disclosed or protected ultimately depends on the relative valuations of the parties interested in the information. What the presence or lack of a regulatory regime will affect, however, is which party—the data subjects, or the data holders—will pay the other for access to, or protection of, personal data. In other words, the allocation of privacy rights may still have allocative and distributional consequences, differentially affecting the surplus of various parties, even when it may not have an effect on aggregate welfare. Coasian arguments in the analysis of privacy are also proposed by Kahn, McAndrews, and Roberds (2000), but they depend on consumers being aware of, and internalizing, the costs and benefits of trading their private information.9

9 Incomplete information and information asymmetries (for instance, a consumer not being even aware that her data is being collected) can limit the applicability of the Coase theorem to the analysis of privacy. For an analysis of the scope of the Coase theorem in the presence of private information, see Farrell (1987).

Laudon (1997) proposes the creation of information markets where individuals own their personal data and can transfer the rights to that data to others in exchange for some type of compensation. Similarly to the view proposed by Chicago School scholars, Laudon argues that the mere legal protection of privacy is outdated, and a system based on property rights over personal information would better satisfy the interests of both consumers and firms.10 Clearly, however, a system of property rights over personal information would require appropriate legislation to define and assign those rights. This observation reveals that market-based and regulatory approaches to privacy are not binary opposites, but rather points on a spectrum. At one end of the spectrum, one would find regimes where no privacy legislation exists. Under those regimes, the protection of data relies entirely on consumers’ marketplace behavior (for instance, strategies such as avoiding interactions with firms that do not provide adequate protection of one’s data, or adopting privacy-enhancing technologies to prevent the leakage of personal data), and on firms’ self-regulated, competition-driven data-handling policies. On the opposite end of the spectrum, privacy regulation would establish strict default protection of personal data and limitations over its usage. Somewhere in between, legislative initiatives may create a framework for property rights over personal data and for means to trade those rights across data subjects and potential data holders. While the assignment of property rights is generally welfare enhancing, granting consumers the right to sell their personal data may actually undermine consumer surplus, as illustrated in the following example.

10 For related work on property rights over personal information, see Litman (2000), Samuelson (2000), and Schwartz (2004).

Example 2 (A Market for Consumer Information): Consider a market for a certain good, composed of a measure 1 of massless consumers. The consumers’ valuations for the good are uniformly distributed on $[0, 1]$. The market is served by a monopolist with production cost normalized to zero. Absent a market for information, the monopolist would set its price at $p^M = 1/2$; it would earn profit of $1/4$; and the top half of the market would earn aggregate consumer surplus of $1/8$.

Now suppose that each consumer possesses verifiable information (e.g., place of residence or employment) that correlates perfectly with her valuation for the good. The monopolist first makes an offer to pay $r \geq 0$ to any consumer who reveals her information. It then uses the information thus obtained to make personalized price offers $\hat{p}(v)$ to those consumers who sold their information, and it posts a common price $p$ to all those who did not.

It is straightforward to verify that in the unique perfect Bayesian equilibrium of this game the following must hold. The monopolist offers $r = 0$ for information. Nevertheless, all consumers reveal their valuations, and the monopolist sets $\hat{p}(v) = v$ and $p = 1$. The intuition here is similar to that in Grossman (1981) and Milgrom (1981). The marginal anonymous consumer makes no surplus and, therefore, is always willing to reveal her valuation for an arbitrarily small payment, but this means that there can be no marginal anonymous consumer in equilibrium. That is, the set of anonymous consumers unravels from the bottom.

This situation raises social surplus by $1/8$ and is allocatively efficient—the monopolist extracts all the social surplus of $1/2$. However, consumers are worse off: the unregulated market for information reduces consumer surplus from $1/8$ to 0, despite the fact that consumers initially owned property rights to their information.

2.3 The Third Wave

Following the commercial success of the Internet and the proliferation of databases containing consumer information, research on the economics of privacy dramatically increased at the start of the twenty-first century. Because so many transactions and activities, once private, are now conducted online, firms, governments, data aggregators, and other interested parties can observe, record, structure, and analyze data about consumer behavior at unprecedented levels of detail and computational speed (Varian 2010). As a result, the digital economy is, to a degree, financed by the organization of large amounts of unstructured data to facilitate the targeting of product offerings by firms to individual consumers. For instance, search engines rely on data from repeat and past searches to improve search results, sellers rely on past purchases and browsing activities to make product recommendations, and social networks rely on giving marketers access to their vast user bases in order to generate revenues. This third wave, while temporally close to the second, is differentiated by the fact that studies are rooted in more formal economic models and in empirical analyses, including lab experiments (we consider empirical analyses separately in section 3). In addition, this third wave is more directly linked to the novel economic issues brought forth by developments in information technology, including search engines, behavioral targeting, and social media. Thus, this third wave is more fragmented than the previous two in terms of the focus of analysis.

While much of the third wave is focused on issues surrounding privacy as the protection of information about a consumer’s preferences or type (hence a significant number of models examine the relationships between privacy and dynamic pricing), different dimensions to privacy (and different dimensions of informational privacy) exist, and economic trade-offs can arise from different angles of the same privacy scenarios. Consequently, other streams of work we consider in this section include the rise of spam, the development of markets for privacy (Rust, Kannan, and Peng 2002), behavioral targeting, the economic analysis of (personal) information security, and the relationship between public goods, social recognition, and privacy.

2.3.1 Privacy, Consumer Identification, and Price Discrimination

Intended as the analysis of the relationships between personal data and dynamic pricing, the economics of privacy is closely connected to the vast stream of studies on intertemporal price discrimination based on consumer recognition. This literature solidifies the notion of consumer tracking and personalized pricing, but does not explicitly consider privacy issues in online environments. Chen (1997) studies discriminatory pricing when different consumers buy different brands, and Fudenberg and Tirole (1998) explore what happens when the ability to identify consumers varies across goods—they consider a model in which consumers may be anonymous or “semi-anonymous,” depending on the good purchased. Villas-Boas (1999) and Fudenberg and Tirole (2000) analyze a duopoly model in which consumers have a choice between remaining loyal to a firm and defecting to a competitor, a phenomenon they refer to as “consumer poaching” (Asplund et al. 2008 demonstrate evidence of this sort of poaching in the Swedish newspaper industry). They show that a firm always has an incentive to offer discounts to a rival firm’s customers who have revealed, through their prior purchases, their preferences for the rival firm’s product. Such discounts initially tend to reduce consumer price sensitivity for a firm’s product, as consumers rationally anticipate them; hence prices rise in later periods, thanks to anticipated customer poaching. Chen and Zhang (2009) study a “price for information” strategy, where firms price less aggressively in order to learn more about their customers. Jeong and Maruyama (2009) and Jing (2011) identify conditions under which a firm should discriminate against its first-time and repeat customers.

More specific to privacy, Taylor (2004) finds that, in the presence of tracking technologies that allow merchants to infer consumers’ preferences and engage in price discrimination, the usefulness of privacy regulatory protection depends on consumers’ level of sophistication. Naïve consumers do not anticipate a seller’s ability to use any and every detail about their past interactions for price discrimination; consequently, in equilibrium, their surplus is captured by firms—unless privacy protection is enforced through regulation. Regulation, however, is not necessary if consumers are aware of how merchants may use their data and buyers can adapt their purchasing decisions accordingly, because it is in a company’s best interest to protect customers’ data (even if there is no specific regulation that forces it to do so). This is an example of how consumers, with their choices, could make a company’s privacy-intrusive strategies counterproductive (section 3 includes references to studies that highlight consumers’ awareness and knowledge of tracking technologies and privacy trade-offs).

Similar conclusions are reached by Acquisti and Varian (2005), who study a two-period model in which merchants have access to “tracking” technologies and consumers have access to “anonymizing” technologies. Internet commerce offers an example: merchants can use cookies11 to track consumer behavior (in particular, past purchases and browsing activities), and consumers can delete cookies, use anonymous browsing or payment tools, and so forth, to hide that behavior. Acquisti and Varian (2005) demonstrate that consumer tracking will raise a merchant’s profits only if the tracking is also used to provide consumers with enhanced personalized services.

11 Cookies refer to files that are stored on a user’s device, which can be subsequently used to help recognize the user across different web pages, websites, and browsing sessions.

Complementary to the above works, Villas-Boas (2004) shows how strategic consumers may make a firm worse off in the context of dynamic targeted pricing. The reason is that once consumers anticipate future prices, they may choose to skip a purchase today to avoid being identified as a past customer tomorrow—and thus have access to lower prices targeted at new consumers. This strategic “waiting” on the part of consumers can hurt a firm both through reducing sales and diminishing the benefit of price discrimination, and may push a firm to voluntarily adopt a privacy-friendly policy.
They find that the transmission of per- adapt their purchasing decisions accordingly, sonal data from one company to another may because it is in a company’s best interest to in some cases reduce information distortions protect customers’ data (even if there is no and enhance social welfare (see also Pavan specific regulation that forces it to do so). and Calzolari 2009; Kim and Choi 2010; Kim This is an example of how consumers, with and Wagman 2015). Information disclosure their choices, could make a company’s pri- is therefore not always harmful to the indi- vacy-intrusive strategies counterproductive vidual and may contribute to improving the (section 3 includes references to studies that welfare of all parties involved. Moreover, in highlight consumers’ awareness and knowl- edge of tracking technologies and privacy trade-offs). 11 Cookies refer to files that are stored on a user’s device, which can be subsequently used to help recognize Similar conclusions are reached by Acquisti the user across different web pages, websites, and brows- and Varian (2005), who study a two-period ing sessions. 456 Journal of Economic Literature, Vol. LIV (June 2016) line with Taylor (2004), companies may be there are three or more firms in the market inclined to develop their own privacy protec- and two of them merge, the postmerger loss tion policies for profit-maximizing purposes, in consumer surplus is substantially lower even without the intervention of a regula- when firms first-degree price discriminate, tory body. Conitzer, Taylor, and Wagman compared to when they cannot. In contrast, (2012) confirm these findings in a model this reduction is absent in a two-to-one where strategic consumers can opt to remain merger, leading to substantial anticompeti- anonymous towards sellers at some cost—a tive effects of the merger. Thus, their study cost modeled as the monetary-equivalent illustrates that the merger effects of access to burden of maintaining privacy. The authors consumer data depend on . show that consumer surplus and social wel- Armstrong and Zhou (2010) study a fare are nonmonotonic­ in this cost, reaching duopoly search model where ­consumers their highest levels at an intermediate level may choose not to purchase a product on of privacy. their first visit—and sellers record this Other studies take intrinsic privacy con- behavior. They show that, in equilibrium, cerns as given (with the source not nec- firms set higher prices for returning con- essarily modeled), and then analyze how sumers, whereby first-time visitors would these concerns affect equilibrium behavior: pay discounted rates, and that such prac- Gradwohl (2014) does so in the context of tices may lead consumers not to return. decision making in committees; Dziuda and These types of pricing strategies can result Gradwohl (2015) in the context of interfirm in consumer backlash—akin to what took communication to achieve cooperation; and place with Amazon in 2001 (Anderson Gradwohl and Smorodinsky (2014) examine and Simester 2010), which may lead firms some of the effects of privacy concerns on to commit upfront not to engage in such pooling behavior, misrepresentation of infor- practices. Indeed, one theme resonating mation, and inefficiency. throughout this line of research is that firms The sharing or protection of consumer with often benefit from com- data can also influence market competition. mitting to privacy policies. This is illustrated Campbell, Goldfarb, and Tucker (2015) in the following simple example. 
Example 3 (Repeat Purchases and Customer Tracking): Suppose a population of n individuals wishes to consume one unit of a good in each of two periods. Half of the individuals are high-valuation consumers who value the good at 1 in both periods, and the other half are low-valuation consumers who value it at λ ∈ (0, 1/2) in both periods. Each consumer's valuation is privately known. The good is sold by a monopolist with production cost normalized to 0. The consumers and the firm are risk neutral and (for simplicity) do not discount the future. Also, it is common knowledge that the monopolist possesses a tracking technology (for instance, cookies or browser fingerprints) with which it can recall whether a consumer purchased the good in the first period and what price he paid for it. Moreover, the monopolist may use this information to make personalized price offers to consumers in the second period.

It can be shown (see, e.g., Taylor 2004; Acquisti and Varian 2005) that on the path of play in any perfect Bayesian equilibrium of this game the following must hold: the monopolist makes first-period price offers p_1 = 1 to all consumers and second-period offers p_2 = 1 to all consumers regardless of their purchase histories. A low-valuation consumer never purchases the good. A high-valuation consumer purchases with probability 1 in the second period but purchases with probability (1 − 2λ)/(1 − λ) < 1 in the first period (leaving the monopolist just indifferent between p_2 = 1 and p_2 = λ following a first-period rejection).

If the monopolist could publicly commit not to use the tracking technology, then the price offers would be the same, p_1 = p_2 = 1, but high-valuation consumers would accept with probability 1 in the first period because rejections could never induce lower second-period prices. Thus, the tracking technology leads to strategic first-period rejections by high-valuation consumers, a Pareto-inferior outcome that reduces welfare (in the form of monopoly profit) by nλ/(2(1 − λ)).
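The equilibrium quantities in Example 3 are easy to verify numerically. The following is a minimal Python sketch (the function name and parameter values are illustrative assumptions, not part of the original example); it checks the monopolist's indifference condition after a first-period rejection and the closed-form welfare loss:

    # Check Example 3's equilibrium formulas (illustrative sketch; lam stands for λ).
    def example_3(lam, n=1000):
        # High types accept the first-period offer p1 = 1 with probability q.
        q = (1 - 2 * lam) / (1 - lam)
        # Posterior that a first-period rejector has a high valuation:
        # (n/2)(1-q) high-type rejectors versus (n/2) low-type rejectors.
        pr_high = (1 - q) / ((1 - q) + 1)
        # Second-period profit per rejector: price 1 (only high types buy)
        # versus price lam (every rejector buys). Indifference requires equality.
        assert abs(pr_high * 1 - lam) < 1e-12
        # Monopoly profit with tracking versus with commitment not to track:
        profit_tracking = (n / 2) * (q + 1)   # high types buy w.p. q, then w.p. 1
        profit_commit = (n / 2) * 2           # high types buy in both periods
        loss = profit_commit - profit_tracking
        assert abs(loss - n * lam / (2 * (1 - lam))) < 1e-9
        return q, loss

    for lam in (0.1, 0.25, 0.4):
        q, loss = example_3(lam)
        print(f"lam={lam}: first-period acceptance q={q:.3f}, welfare loss={loss:.1f}")

As λ approaches 1/2, the first-period acceptance probability q approaches zero, so the welfare cost of the uncommitted tracking technology is largest precisely when low- and high-valuation consumers are hardest to tell apart.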
2.3.2 Data Intermediaries

A number of works have incorporated questions regarding privacy into the study of two-sided markets. Such studies can help us understand the role of large data holders—companies such as Google, Facebook, and Amazon—which in part act as intermediaries, selling advertising space to advertisers on one end and providing services and products to users on the other.

De Cornière (forthcoming) shows that when consumers actively search for products, targeting leads to more intense competition. In a framework in which consumers search sequentially after having entered a query on a search engine, he shows that targeting reduces search costs, improves matches between consumers and firms, and intensifies price competition. However, a profit-maximizing search engine may choose to charge too high an advertising fee, which can negate the benefits of targeting. Hence, the optimal level of accuracy in advertising matching solves a trade-off between consumer participation and the profit of the intermediary.

Hagiu and Jullien (2011) study how intermediaries can use information about consumer characteristics in order to affect matching between firms and consumers. They show that if an intermediary receives a fee each time a consumer visits an affiliated firm, the intermediary has an incentive to direct consumers towards firms that they would not have visited otherwise. In doing so, the intermediary manipulates the elasticity of the demands faced by its affiliated firms.

Bergemann and Bonatti (2015) study the acquisition of user-pertinent information by an advertising platform and its subsequent sale to advertisers. In their model, a data provider sets the price of an information record (e.g., a cookie). Advertisers subsequently acquire information records from the data provider, form posterior beliefs about consumer types, and purchase advertising space. The authors demonstrate situations where improved precision of user information leads to fewer records being purchased. Consequently, a data provider may choose to restrict or cap advertisers' access to information about users (that is, constrain or reduce its precision) in order to sell more records and generate greater profits.

Gehrig and Stenbacka (2007) examine lenders in a repeated-interaction framework and consider the possibility of information sharing among lenders (for instance, via credit bureaus). The authors demonstrate that in the presence of information sharing, switching costs are essentially reduced, which relaxes competition for initial market shares and can end up reducing the welfare of borrowers.
In other instances, firms may be reluctant to use first- or third-degree price discrimination, for fear of a public backlash. De Cornière and Nijs (2014) rule out direct price discrimination based on consumers' personal information by focusing instead on firms' bidding strategies in auctions for more precise targeting of their advertisements. That is, given that consumers' private information provides a finer and finer segmentation of the population, firms can compete to advertise their nondiscriminatory pricing over each of those consumer segments. By disclosing information about consumers, the platform ensures that consumers will see the most relevant advertisements, whereas when no information is disclosed under a complete privacy regime, ads are displayed randomly. They find that targeted advertising can lead to higher prices and, in line with Levin and Milgrom (2010) and Bergemann and Bonatti (2015), that improving match quality by disclosing consumer information to firms might be too costly to an intermediary because of the informational rent that is passed on to firms. Given a relationship between the match quality of advertising and consumer demand, it is then possible to specify conditions under which some privacy, or some limits to disclosure, are optimal for an intermediary (see, also, Cowan 2007).

In a related analysis, Board and Lu (2015) study the interaction between buyers, who search across multiple websites to learn which product best fits their preferences, and merchants, who manage disclosure policies regarding their products (such as advertisements, product trials, or reviews). In particular, the authors study how market outcomes vary as a function of the amount of consumer information accessible by the sellers. When consumers are anonymous and sellers cannot track their searches, there exists an equilibrium in which sellers disclose all of their product information in the limit as search costs vanish. However, when sellers are able to observe buyers (for instance, through tracking their online behavior) and can infer their beliefs, there is often a unique equilibrium, akin to the Diamond paradox (Diamond 1971). In this equilibrium, every seller adopts a monopoly disclosure policy that manipulates consumers into purchasing the most profitable products, rather than the ones most suited to their needs. In other words, the ability to track buyers makes it possible for sellers to implicitly collude, a result that is similar in spirit to the relationship between privacy and market competition discussed in section 2.3.1.

Zhang (2011) also follows an approach that does not require the direct use of an intermediary, and yields similar findings. She studies competitive markets with endogenous product design and demonstrates that, in an effort to avoid more aggressive pricing from competitors, market leaders may choose to introduce mainstream products that appeal to the broader segment of the population. By doing so, rather than pursuing an approach of product differentiation, firms can limit consumers' strategic release of preference information—similar to what an intermediary would do—in order to dampen competition and facilitate product entry (see, also, Wickelgren 2015).

Another approach to limiting the release of information by consumers is explored in the study of intermediary gatekeepers (Baye and Morgan 2001; Wathieu 2002; Pancras and Sudhir 2007)—a third party that provides consumers with access to some degree of anonymity, possibly at a cost. Consistent with the above works, Conitzer, Taylor, and Wagman (2012) show that it can be profit maximizing for both firms and a gatekeeper
to reach agreements for granting users the ability to freely anonymize.12 At the same time, Taylor and Wagman (2014) demonstrate that the effects of firms' ability to target individual consumers on consumer surplus, profits, and overall welfare are context dependent, whereby any conclusions drawn from a given model must be understood within its specific market setting.

12 Kearns et al. (2014) study the design of mechanisms that satisfy the computer science criterion of differential privacy (Dwork 2006)—put simply, the notion of being able to distinguish one agent (a consumer) from another in a dataset of consumer characteristics with only a low probability. They show that mechanisms can be designed to satisfy a variant of this criterion when there are large numbers of agents and any agent's action affects another agent's payoff by at most a small amount. Other related mechanism-design issues have been studied. One issue is limiting "exposure," where agents internalize being exposed to the realized types and chosen actions of a subset of other agents (Gradwohl and Reingold 2010) or to the party responsible for implementing the mechanism (Gradwohl 2015). Another issue is "anonymity," where agents may seek to participate in a mechanism multiple times when anonymizing is too easy (Wagman and Conitzer 2014).

A common lesson arising from this literature is that firms—be they advertisers or data intermediaries—seldom possess socially optimal incentives to match consumers with products. This is illustrated with the following example, suggested to us by Alessandro Bonatti.

Example 4 (Buying and Selling Consumer-Level Information): An advertiser faces a continuum of heterogeneous consumers and a monopolist data provider. The match value v_i between consumer i and the advertiser's product is uniformly distributed on V = [0, 1]. The advertising technology is summarized by the matching function m(x) = cx²/2, which represents the expenditure by the advertiser required to generate a contact of intensity x. The complete-information profits from generating a contact of intensity x with a consumer of value v are

    π(v, x) = vx − cx²/2.

A data provider knows the match value v_i of each consumer i, and sells this data to the advertiser at a constant price per individual p. Hence, if the advertiser acquires information about consumer i, it is able to tailor its choice of contact intensity to the match value, i.e., x*(v_i) = v_i/c, and obtain profits of π*(v_i) = v_i²/(2c). In contrast, the advertiser must choose a constant intensity level x̄ for all other consumers. Because the constant intensity level x̄ depends on the composition of the residual set of consumers, the advertiser's information-acquisition problem can be formulated as the choice of a targeted set of consumers A ⊂ V.

The demand for information about specific consumers can be traced back to two sources of mismatch risk: excessive versus insufficient advertising. Specifically, the profit-maximizing residual set for the advertiser in this case is a nonempty interval, A^C = [1/2 − √(2cp), 1/2 + √(2cp)]. In other words, the advertiser purchases information about both very high- and very low-value consumers. Note that E[v_i | v_i ∈ A^C] = 1/2, which implies x̄ = 1/(2c). This is too much advertising for the consumers in the bottom half of the residual set and too little advertising for the consumers in the top half.

Finally, if the data provider incurs no marginal cost of supplying information, then it chooses p to maximize p(1 − 2√(2cp)); i.e., it sets p* = 1/(18c). The advertiser thus purchases data only on the bottom and top sixths of the market, treating all other consumers as if they had match value 1/2.
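These closed forms can be confirmed by brute force. The following is a minimal sketch (the normalization c = 1 and the grids are arbitrary assumptions of ours); it recomputes the provider's optimal price using the advertiser's gain from tailoring, π*(v) − π(v, x̄), as the rule for which records get bought:

    import numpy as np

    c = 1.0
    v = np.linspace(0.0, 1.0, 200001)   # discretized continuum of match values
    x_bar = 1.0 / (2 * c)               # intensity on the (symmetric) residual set

    def provider_revenue(p):
        # The advertiser buys the record on v iff tailoring gains at least p.
        gain = v ** 2 / (2 * c) - (v * x_bar - c * x_bar ** 2 / 2)
        return p * np.mean(gain >= p)   # price times mass of records sold

    prices = np.linspace(1e-4, 0.2, 4000)
    p_star = prices[int(np.argmax([provider_revenue(p) for p in prices]))]
    print(f"numerical p* = {p_star:.4f}; closed form 1/(18c) = {1 / (18 * c):.4f}")

    # At p*, the residual (non-purchased) set is approximately [1/6, 5/6]:
    gain = v ** 2 / (2 * c) - (v * x_bar - c * x_bar ** 2 / 2)
    residual = v[gain < p_star]
    print(f"residual set = [{residual.min():.3f}, {residual.max():.3f}]")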
2.3.3 Marketing Techniques

Some studies expand the analysis of privacy to include the costs of intrusions into an individual's personal sphere, such as unsolicited mail or spamming, as in Hann et al. (2008), and personal preferences over privacy, as in Tang, Hu, and Smith (2008). Here, the theoretical study of privacy connects with the marketing literature on couponing, market segmentation, and consumer addressability (Blattberg and Deighton 1991). Works by Shaffer and Zhang (1995, 2002), Chen, Narasimhan, and Zhang (2001), Chen and Iyer (2002), Conitzer, Taylor, and Wagman (2012), and Shy and Stenbacka (forthcoming) obtain complementary results. These authors show that when a firm has control over consumers' privacy, it chooses to segment the population optimally for pricing purposes. Their findings demonstrate that price discrimination can lead to intensified price competition, whereby firms may have incentives to (1) decrease the level of accuracy of targeted promotions, (2) differentially invest in customer addressability, and (3) seek commitment mechanisms not to price discriminate.

Hann et al. (2008) study a competitive market with heterogeneous consumers, some of whom draw no benefit from unsolicited marketing and some of whom are interested in receiving information about new products. They show that consumers' attempts to use technologies that prevent unsolicited marketing on one side, and sellers' efforts to use direct marketing on the other, constitute strategic complements: the greater consumers' attempts to protect themselves from unsolicited marketing, the greater sellers' use of direct marketing. Similarly, Hui and Png (2006) consider the use of private information for unsolicited marketing—in person, via telephone, mail, or email—which competes with the marketing efforts of other companies and may inconvenience individuals. In related work, Chellappa and Shivendu (2010) examine the trade-offs vendors and consumers face between privacy concerns and the personalization of services and products that the sharing of data may make possible.

Anderson and de Palma (2012) look at spamming as a problem of competition among senders of messages for the receivers' attention, which is a limited resource. Their model considers the costs that both parties have to incur in order to arrive at a transaction. These costs endogenously determine the number of messages sent by senders and the number of messages read by receivers. If the cost of sending messages is too low, there will be a congestion problem, meaning that receivers will only read some of the messages sent (see, also, Van Alstyne 2007). In this case, a welfare-enhancing solution may be to add a small tax on the transmission of a message. Such a tax may increase surplus, because senders of low-quality messages will be crowded out (it would be too costly for them to send a message), fewer messages will be sent, and more will be read.

Spiegel (2013) identifies conditions under which firms may choose to bundle new software with advertisements and distribute it for free as adware. While adware is more affordable to consumers and may contain advertisements that help improve their purchasing decisions, it also entails a loss of privacy.

Linking the above works to the study of privacy, Van Zandt (2004), Armstrong, Vickers, and Zhou (2009), Anderson and de Palma (2012), and Johnson (2013) also investigate the topic of congestion due to consumers having limited attention. In their models, consumers can choose to "opt out" from receiving sellers' marketing. The result is a form of prisoner's dilemma: while each consumer has a private incentive to opt out of intrusive marketing, when all consumers do so, price competition is relaxed and consumers are harmed.
Targeted ads, however, can also be counterproductive if they trigger the recipient's privacy concerns or her worries regarding the level of control over her private information (Tucker 2014). In this sense, targeted advertising is a form of unsolicited marketing. While spamming involves the indiscriminate sending of advertisements, targeted advertising (or behavioral targeting), as the name suggests, consists of contacting a select group of recipients who, according to the information available to the sender about their previous behaviors or preferences, may be particularly interested in the advertised product or service.

Hoffmann, Inderst, and Ottaviani (2014) study targeted communications, a practice they refer to as hypertargeting, in the context of marketing and political campaigns. In a departure from the earlier literature on strategic disclosures (Grossman and Shapiro 1984; Milgrom 1981; Milgrom and Roberts 1986), they assume that firms must be selective when choosing the amount of information they communicate to consumers (e.g., due to space or time constraints). Since consumers differ in their preferences, firms may wish to market different product attributes to different consumers. They model hypertargeting as the selective disclosure of information to a specific audience, and characterize the private incentives and welfare impact of hypertargeting. They demonstrate that a privacy policy that hinders hypertargeting by, for instance, banning the collection of personally identifiable data is beneficial when consumers are naïve, competition is limited, and firms are able to segment the market to price discriminate. Otherwise, privacy regulation may backfire, because a policy that, for instance, requires consumer consent can allow firms to commit to abstain from selective targeting—even when such targeting would benefit consumers.

The following example demonstrates that the common wisdom that imposing a tax on messages will fall more heavily on spammers, and thereby improve the average quality of contacts, need not necessarily be true.

Example 5 (Marketing and Spam): Suppose there are two firms, a spammer (firm 0) and a retailer (firm 1). There is a consumer who has time to open exactly one email message. If she opens a message from the retail firm, she receives a payoff of v > 0 and the retailer receives gross profit of b_1. If she opens a message from the spammer, she receives expected payoff −k < 0 (a small fraction of consumers may receive a positive payoff from opening spam, but the majority receive a negative payoff) and the spammer receives b_0 ∈ (0, b_1). Also assume b_1 v > b_0 k. The expected profit to firm i from sending the consumer m_i messages when its rival sends her m_j is

    Π_i = [m_i/(m_i + m_j)] b_i − c m_i,   i ∈ {0, 1}, i ≠ j,

where c is the marginal cost of sending a message. The expected payoff to the consumer from opening a single message at random is

    U = (m_1 v − m_0 k)/(m_1 + m_0).

It is straightforward to verify that in the unique perfect Bayesian equilibrium of this game the following must hold: firm i sends

    m_i* = b_j b_i²/(c(b_1 + b_0)²)

messages and receives expected profit

    Π_i* = b_i³/(b_1 + b_0)²,

and the consumer receives expected payoff

    U* = (b_1 v − b_0 k)/(b_1 + b_0).

Observe that charging the firms a tax of t per message (resulting in a marginal cost of c + t) reduces the number of messages sent by each firm, but has no impact on equilibrium payoffs. The firms would respond to a tax by sending proportionally fewer messages, reducing the absolute number received by the consumer, but not their composition. In contrast, a filter that correctly identifies a fraction φ of the messages sent by firm 0 as spam both reduces the number of messages sent by each firm in equilibrium and raises the consumer's expected payoff to

    U** = (b_1 v − (1 − φ) b_0 k)/(b_1 + (1 − φ) b_0).
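The equilibrium, the neutrality of a per-message tax, and the benefit of a filter can all be checked numerically. The sketch below is a minimal illustration of ours (the parameter values are arbitrary, chosen only to satisfy b_0 ∈ (0, b_1) and b_1 v > b_0 k); the filter is modeled, as in the text, by discounting the fraction φ of firm 0's messages that are blocked:

    # Best-response iteration for Example 5 (illustrative parameters).
    b1, b0, v, k, c = 1.0, 0.4, 0.5, 0.3, 0.05

    def equilibrium(cost, phi=0.0):
        m1, m0 = 1.0, 1.0
        for _ in range(500):
            d0 = (1 - phi) * m0                       # firm 0's delivered messages
            # FOC of firm 1: b1*d0/(m1 + d0)^2 = cost
            m1 = max((b1 * d0 / cost) ** 0.5 - d0, 1e-9)
            # FOC of firm 0 (in sent messages): b0*(1-phi)*m1/(m1 + d0)^2 = cost
            m0 = max(((b0 * (1 - phi) * m1 / cost) ** 0.5 - m1) / (1 - phi), 1e-9)
        d0 = (1 - phi) * m0
        U = (m1 * v - d0 * k) / (m1 + d0)             # consumer opens one at random
        return m1, m0, U

    for label, cost, phi in [("baseline", c, 0.0),
                             ("tax t = c", 2 * c, 0.0),
                             ("filter phi = 0.5", c, 0.5)]:
        m1, m0, U = equilibrium(cost, phi)
        print(f"{label:>16}: m1 = {m1:5.2f}, m0 = {m0:5.2f}, U = {U:.4f}")

    # Closed forms from the text, for comparison:
    print("U*  =", (b1 * v - b0 * k) / (b1 + b0))
    print("U** =", (b1 * v - 0.5 * b0 * k) / (b1 + 0.5 * b0))

Doubling the marginal cost halves both message volumes but leaves U unchanged, whereas the filter changes the composition of the consumer's inbox and raises her payoff.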
3. The Empirical Analysis of Privacy

If our perusal of the theoretical economic literature on privacy has revealed one robust lesson, it is that the economic consequences of less privacy and more information sharing for the parties involved (the data subject and the actual or potential data holder) can in some cases be welfare enhancing and, in others, welfare diminishing. The various streams of research we covered highlighted that, in choosing the balance between sharing or hiding personal information (and in choosing the balance between exploiting or protecting individuals' data), both individuals and organizations face complex, often ambiguous, and sometimes intangible trade-offs. Individuals can benefit from protecting the security of their data to avoid the misuse of information they share with other entities. However, they also benefit from the sharing of information with peers and third parties when it results in mutually satisfactory interactions. Organizations can increase their revenues by knowing more about the parties they interact with and tracking them across transactions. Yet they can also bear costs by alienating those parties with policies that may be deemed too invasive. Intermediaries can increase their revenues by collecting more information about users, yet offering overly precise information to advertisers can backfire by reducing competition among sellers.

Those nuanced trade-offs are reflected in the literature we examine in this section. We survey the empirical literature on the economics of privacy to highlight some of the costs and benefits of privacy protection and information sharing. The market for personal data and the market for privacy are two sides of the same coin, wherein protected data may carry benefits and costs that mirror, or are dual to, the costs and benefits associated with disclosed data, for both data subjects and data holders. For instance, disclosed personal information (or lack of data protection) can result in economic benefits for both data holders (savings, efficiency gains, surplus extraction, increased revenues through consumer tracking) and data subjects (personalization, targeted offers and promotions, and so forth). At the same time, such disclosures (or the lack of protection of personal data) can be costly for both firms (costs incurred when data is breached or misused, or collected in ways that consumers deem too intrusive) and consumers (from tangible costs such as identity theft or (price) discrimination, to less tangible ones such as stigma or psychological discomfort; see, e.g., Stone and Stone 1990; Feri, Giannetti, and Jentzsch 2016). Furthermore, the act of collecting data can itself be costly for data holders (such as the investments necessary to establish customer relationship management systems).

Similarly, protected data (or lack of data disclosure) can be associated with both benefits and costs for data subjects and potential data holders; such benefits and costs are often the dual (that is, the inverse) of the benefits and costs highlighted above. For instance, data subjects and data holders may incur opportunity costs when useful data is not disclosed (they may miss out on opportunities for increased efficiency or increased convenience), although both parties may also benefit in various ways (consumers, for example, by reducing the expected costs associated with identity theft; firms, for example, by exploiting privacy-friendly stances for competitive advantage). Furthermore, there are costs associated with the act of protecting data (investments necessary to encrypt data on the data holders' side to prevent further disclosures, costs of using privacy-enhancing technologies on the data subjects' side, etc.).

In short, there can be many dimensions to privacy harms (Calo 2011) and to the benefits arising from personal information. The rest of this section does not attempt to provide a comprehensive enumeration of those dimensions, but surveys the areas that have attracted more empirical economic analysis.
3.1 Privacy, Advertising, and Electronic Commerce

Online advertising is perhaps the most common example of how firms use the large amounts of data that they collect about users. The greater availability of personally identifiable data on the Internet—in terms of scope, quantity, and the precision with which firms can target specific users—challenges the traditional distinction between personal selling and remote communication. As a result, the way advertising is targeted affects marketing strategies and competition between online and offline media (see, for instance, Athey and Gans 2010; Bergemann and Bonatti 2011; and Athey, Calvano, and Gans 2013). Already in 2008, fifty-six of the top one hundred websites (based on page views), accounting for 86 percent of page views for that group, presented some form of advertising, and likely derived most of their revenues from doing so (Evans 2009). By 2012, $36.6 billion was spent on digital ads, ahead of cable TV ($32.5 billion) and slightly below broadcast TV ($39.6 billion), with a rate of growth outpacing all other formats.13 By 2015, digital ad revenues had reached $52.8 billion, accounting for just under a third of overall advertising.14 Meanwhile, the market capitalization of the major publicly traded newspaper businesses in the United States declined by 42 percent between January 2004 and August 2008, compared to a 15.6 percent gain for the Dow Jones Industrial Average over that period.

13 See http://www.iab.net/media/file/iab_internet_Advertising_Revenue_Report_FY_2012_rev.pdf.

14 See http://tcrn.ch/1ymh9pB/.

Key to the online collection of consumer information are the aspects of "targetability," the collection of data for the purpose of showing ads to specific subsets of users, and "measurability," the collection of data for the purpose of evaluating the efficacy of targeted ads. Data aggregators, advertising networks, and website operators establish relationships that enable them to track and target users across different websites and over time. Advertisers take advantage of the enhanced performance measurability of online advertising to experiment with different marketing messages before proceeding with a specific marketing campaign (see Lewis and Reiley 2014). Advertisers and website operators can track user behavior using several techniques—from web bugs (also known as beacons),15 to cookies, to browser and device fingerprinting.16

15 Web beacons are small pieces of code placed on websites, videos, and in emails that can communicate information about a user's browser and device to a server. Beacons can be used, among other things, for website analytics or to deliver a cookie to a user's device.

16 Fingerprinting refers to technologies that use details about a user's browser and device in order to identify the user's browser or device over time. Fingerprinting can be used for the same purposes as cookies, but does not require files to be stored on a user's device, and is harder to both notice and evade.
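To make the mechanics of footnote 16 concrete, the following sketch—a schematic illustration of ours, not an actual tracking implementation—shows how attributes a browser routinely exposes can be hashed into a stable pseudo-identifier without storing anything on the user's device:

    import hashlib

    def fingerprint(user_agent, screen, timezone, fonts):
        # Hash routinely exposed browser/device attributes into a pseudo-identifier.
        raw = "|".join([user_agent, screen, timezone, ",".join(sorted(fonts))])
        return hashlib.sha256(raw.encode()).hexdigest()[:16]

    # The same device yields the same identifier on every visit—no cookie needed.
    visit_1 = fingerprint("Mozilla/5.0 (X11; Linux x86_64)", "1920x1080",
                          "UTC-05:00", ["Arial", "DejaVu Sans"])
    visit_2 = fingerprint("Mozilla/5.0 (X11; Linux x86_64)", "1920x1080",
                          "UTC-05:00", ["DejaVu Sans", "Arial"])
    print(visit_1 == visit_2)   # True: the browser is re-identified across sessions

Because no file is deposited on the device, clearing cookies does not reset such an identifier—which is why fingerprinting is harder to notice and to evade than cookie-based tracking.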
In fact, various and constantly evolving technologies (such as the aforementioned web bugs, flash cookies, etc.) allow advertisers to track consumers' browsing activities and gain insight into their interests. For instance, web bugs differ from cookies in that they are designed to be invisible to the user and are not stored on a user's computer: without inspecting a web page's underlying code, a customer does not know that she is being tracked. Compared to traditional surveillance methods, collecting data about individuals online is cheaper and faster (Wilson and Brownstein 2009). Retention policies for such data (for example, the length of time search engine queries or clickstream information can be stored and used by data holders) vary across organizations and jurisdictions, and can impact both welfare and market outcomes (Bottero and Spagnolo 2013). Search engine data, for instance, is collected about individual users through cookies, IP addresses, and other methods. Associated with this profiling are the search queries and subsequent clicks made by each user. In the past, Google was said to keep this information for nine months (eighteen months in the case of cookies) and anonymize it afterwards; Microsoft was said to keep this information for six months. But in practice, there is no real verification of whether data holders in fact delete, or at least anonymize, user information.
Despite the large sums of money spent on targeted advertising, however, its effectiveness is unclear. Farahat and Bailey (2012) estimate that targeted advertising in 2012 generated, on average, twice the revenue per ad as nontargeted advertising. However, some of these estimates have been challenged (Mayer and Mitchell 2012), and more recent empirical work has found evidence indicating that personalized advertising may be ineffective (Lambrecht and Tucker 2013). Blake et al. (2015) reinforce the latter findings. They measure the effectiveness of paid search by running a series of large-scale field experiments on eBay, and find evidence that returns from paid search are a fraction of conventional nonexperimental estimates (and can, in some cases, be negative).

Targeted advertising, in principle, could provide consumers with information about products they want or are interested in, thereby reducing search costs and improving welfare. However, as the theoretical literature examined in section 2.3.3 suggests, the effects of targeting can be rather complex and nuanced, and not necessarily always positive for consumers. Consumers may even be offered products inferior to the ones they would have found otherwise, or even potentially damaging ones. For instance, data brokers sell lists of consumers in order to target individuals suffering from addictions such as alcoholism or gambling.17 Additionally, concerns exist over the fact that tracking technologies are often made invisible to end users (Smith 1999), whereby a significant lack of awareness and misconception exists among consumers regarding the extent, nature, and depth of targeting techniques (McDonald and Cranor 2010). Even sophisticated consumers may not be able to avoid being tracked, as the advertising and data industry has often found new ways of tracking and identifying users after consumers had learned about and adopted measures to counter existing forms of tracking (Hoofnagle et al. 2012).

17 See, e.g., http://www.dmnews.com/media-one-gamblers-database/article/164172/.

On the other hand, concerns exist that the introduction of strict privacy regimes may inhibit either the tracking or the targeting of potential consumers and, in doing so, may dampen the development of electronic commerce (Swire and Litan 1998). European countries have raised barriers to the collection and use of personally identifiable data, including a requirement that firms seek explicit consent from consumers to collect information about their past purchases and recent browsing behavior. The European ePrivacy Directive (2002/58/EC) predominantly addresses the telecommunications sector, but also regulates the use of cookies and similar tracking methods.18 The EU ePrivacy Directive (Recital 24) explicitly states that "[s]o-called spyware, web bugs, hidden identifiers, and other similar devices can enter the user's terminal without their knowledge in order to gain access to information, store hidden information, or trace the activities of the user and may seriously intrude upon the privacy of these users. The use of such devices should be allowed only for legitimate purposes, with the knowledge of the users concerned." With regard to tracking cookies, the directive permits their use on the condition that users provide their consent. The 2012 Do-Not-Track proposal by the Federal Trade Commission (FTC) suggests a set of guidelines for the United States somewhat similar to the EU ePrivacy Directive—giving consumers a way to opt out of having their data collected. However, as the FTC's proposal follows an opt-out rather than an opt-in approach, it may be less costly for US firms to continue collecting data and using targeted ads.19

18 See the Data Protection Directive (1995/46/EC) and the Privacy and Electronic Communications Directive (2002/58/EC, last amended by Directive 2009/136/EC, sometimes called the EU Cookie Directive), also known as the ePrivacy Directive, which regulates cookies and other similar devices. The current prescription is, in short, that certain types of cookies or similar methods must not be used unless the relevant Internet user (1) is provided with clear and comprehensive information about the purposes of the storage of, or access to, those cookies; and (2) has given his or her consent.

19 For instance, Johnson and Goldstein (2004) show that consumers tend to go with the default option chosen for them in the case of organ donation, despite heavy lobbying by organizations. That is, if the default approach is for consumers to be subscribed to targeted ads, then more consumers are likely to remain subscribed than under an opt-in system where consumers are by default unsubscribed.
Goldfarb and Tucker (2011b) examine the effects of the implementation of the EU ePrivacy Directive on purchase intentions, and find evidence that after the ePrivacy Directive was passed, hypothetical advertising effectiveness decreased significantly. They use the responses of 3.3 million survey takers who had been randomly exposed to 9,596 online display (banner) advertising campaigns to explore how strong privacy regulation in the European Union influenced the effectiveness of advertising. For each of the 9,596 campaigns, their data contain a treatment group exposed to the ads and a control group exposed to a public service ad. To measure ad effectiveness, they use a short survey conducted with both groups of users about their purchase intent towards an advertised product. They find that, following the ePrivacy Directive, banner ads experienced a reduction in effectiveness of over 65 percent in terms of changing consumers' purchase intents. They see no similar change in ad effectiveness in non-European countries during a similar time frame.

These findings raise a number of stimulating questions. One interpretation of the results is that privacy regulation can have a detrimental effect on the advertising industry. Moreover, as online advertising has become a primary source of revenue for many web-based businesses, the types of content and services provided on the Internet may shift as a result of privacy regulation. However, the decrease in hypothetical advertising effectiveness was found only within a subset of ads (static, content-specific, and small ones), whereas other types of ads (larger ads, dynamic ads, and ads consistent with the content of a website) were not affected at all. This suggests a possible way forward for organizations. For instance, general-interest websites may fine-tune ads based on the content of a specific web page to make it easier to monetize, providing ads that are more contextually appropriate. Another question raised by the results is whether any actual economic damage will ultimately be incurred by consumers (or by merchants as a whole) after the legislation discourages marketers from using tracking cookies. This may depend on whether the effect of behavioral advertising is more persuasive or more informational—that is, whether the main effect of behavioral targeting was simply a switching effect (nudging and persuading consumers to buy from a certain merchant rather than another one), as opposed to an informational effect (informing consumers about products or services of which they would not otherwise have been aware).
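Estimates of this kind rest on a difference-in-differences design: the change in the treatment–control gap in purchase intent within the European Union, before and after the directive, net of the corresponding change outside the European Union. The toy computation below, with entirely invented numbers chosen only to mimic a drop of that rough magnitude, illustrates the logic:

    # Difference-in-differences logic (illustrative, made-up numbers).
    # "lift" = purchase intent of users shown the ad minus that of the controls.
    lift = {("EU", "before"): 0.060, ("EU", "after"): 0.020,
            ("non-EU", "before"): 0.058, ("non-EU", "after"): 0.057}

    change_eu = lift[("EU", "after")] - lift[("EU", "before")]
    change_non_eu = lift[("non-EU", "after")] - lift[("non-EU", "before")]
    did = change_eu - change_non_eu
    print(f"DiD estimate: {did:+.3f}, i.e., {did / lift[('EU', 'before')]:.0%} of baseline lift")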
Related work has addressed the question of which formats of ads advertisers should and should not use. White et al. (2008) find that consumers may experience "personalization reactance," reacting negatively to highly personalized messages when the fit between the targeted offer and consumers' personal characteristics is not explicitly justified. Goldfarb and Tucker (2011a) find that obtrusive targeted ads—targeted in the sense that they are matched to the content of a website, and obtrusive in terms of visibility—are more likely to trigger privacy concerns among users than obtrusive but untargeted ads, or targeted but less obtrusive ads. These findings can help explain the enormous success of Google AdSense, a service that provides contextually targeted, unobtrusive ads (AdSense accounts for about a third of Google's ad revenue, with the other two-thirds coming from search advertising).20 Moreover, they can help explain the apparent divide in online advertising between banner ads and unobtrusive targeted ads.

20 See https://investor.google.com/earnings.html. It is also worth noting that contextual targeting is common in the offline world as well (e.g., magazines about fishing contain ads for fishing equipment).

3.2 Privacy and Price Discrimination

Along with being targeted with personalized offers, consumers may also face price discrimination. Tracking and measurability, in addition to websites' ability to dynamically update and personalize prices for each visitor, are bringing online markets closer to the theoretical scenario of first-degree price discrimination. Indeed, much of the literature surveyed in section 2.3.1 focuses on merchants' ability to engage in forms of targeting that increasingly approach the textbook "ideal" of first-degree discrimination. Relative to the volume of theoretical analyses, however, empirical efforts to find evidence of Internet-based price discrimination have lagged behind. Valentino-Devries, Singer-Vine, and Soltani (2012) suggest that certain online retailers may be engaging in dynamic pricing based on their ability to estimate visitors' locations—specifically, the (online) visitor's physical distance from a rival brick-and-mortar store. Mikians et al. (2012, 2013) find suggestive evidence of price discrimination based on information collected online about consumers, as well as evidence of "search discrimination" (steering consumers towards different sets of products, with different prices, following their searches for a certain product category). In particular, Mikians et al. (2013) suggest price differences of 10 to 30 percent for identical products based on the location and the characteristics (for instance, browser configurations) of different online visitors. On the other hand, Vissers et al. (2014) find price variation, but no experimental evidence of consumer-based price discrimination, in online airline tickets. In short, the evidence of systematic and diffuse individual online price discrimination is currently scarce. It is possible that firms consider online price discrimination not just challenging, but potentially risky.21 And yet, anecdotal cases of firms selectively offering price discounts are ubiquitous (i.e., instead of raising prices to some consumers, firms may simply reframe their behavior by offering price discounts to others). It is also possible that the infrastructure for accurate price discrimination (and its detection) is underdeveloped and still evolving—similarly to the case of behavioral ads (which, anecdotally, seem as likely to present consumers with offers of products they have already searched for or even bought as undiscovered products in which they may be interested).

21 Consider, again, the backlash following Amazon's purported attempts at price discrimination (Anderson and Simester 2010).

3.3 Other Forms of Discrimination

Price discrimination is probably the least odious form of discrimination involving the use of personal information. In many other markets, significant trade-offs can arise as a function of the amount of personal information available to other parties, including scenarios where privacy protection will cause, or in fact hinder, discrimination.

Consider, for instance, hiring. Economists have long been interested in the role of information (Stigler 1962) and signaling (Spence 1973) in job market matching. And of course, there exists a vast economic literature on discrimination in hiring or wages. Experimental work has highlighted that employers may infer candidates' personal traits from information available on their resumés (such as a candidate's race from his or her name) and use that information to discriminate among prospective employees (Bertrand and Mullainathan 2004). In fact, fairer job market outcomes may sometimes be achieved by removing information from the marketplace. Goldin and Rouse (2000), for instance, find evidence that blind auditions (in which screens conceal the identities of candidates, such as orchestra performers, from the jury) foster impartiality in hiring and increase the probability that women will be hired. On the other hand, Bushway (2004) and Strahilevitz (2008) point out a different dynamic: when employers are not able to retrieve pertinent information about a job applicant (for instance, their criminal records) due to privacy regulation, employers may become increasingly reliant on statistical discrimination strategies. Thus, an employer's personal animus or bias may end up negatively and disproportionately affecting certain minorities. Under this scenario, expanding the information available to employers may generally lead to fairer and possibly more efficient outcomes.

Similarly contrasting dynamics arise on online dating platforms. By facilitating abundant signaling of personal traits and interests, dating platforms can facilitate matching and sorting (Hitsch, Hortacsu, and Ariely 2010). However, because most platforms allow members to screen and filter their populations on the basis of personalized criteria such as racial background, these platforms can reinforce racial dynamics already existing in face-to-face interactions (in a study of a popular online dating site, Lewis (2013) finds that users "disproportionately message other users from the same racial background"). On the other hand, choosing not to share certain information may be counterproductive for a site's members: the veil of anonymity (specifically, a member's ability to visit other members' profiles without leaving an identifiable trace of that visit) actually reduces the probability of that member finding matches on the site (Bapna et al. forthcoming). The variety of these outcomes exemplifies one of the major themes of this article: the consequences and implications of data sharing or data protection vary greatly with context—such as what specific type of data is being shared, how, and when.
tions (in which screens conceal the identities Consider, also, online platforms that allow of candidates such as orchestra performers tenants to find landlords and vice versa; or from the jury) foster impartiality in hiring platforms that enable property owners to and increase the probability that women “share” their houses with short-term rent- will be hired. On the other hand, Bushway ers; or platforms that enable car owners to (2004) and Strahilevitz (2008) point out share their vehicles with other drivers or a different dynamic: when employers are passengers. These examples of IT-enabled not able to retrieve pertinent information “sharing economies” may increase efficiency about a job applicant (for instance, their by improving how resources such as housing­ criminal records) due to privacy regulation, or vehicles are used. However, when these 468 Journal of Economic Literature, Vol. LIV (June 2016) platforms expose members’ personal infor- States, several states authorize courts to mation, they may inadvertently foster dis- expunge or seal certain criminal records— crimination. Using data from Airbnb (an but only for certain types of arrests and online dwelling rental marketplace, on convictions. Similarly, in most of the United which the race of a landlord can often be States, an employer who asked about the inferred from profile photos on landlords’ religion of a job candidate would risk being Airbnb accounts), Edelman and Luca (2014) sued under Equal Employment Opportunity find that New York City landlords who are laws; however, different types of information not African American charge approximately enjoy different protections (for instance, 12 percent more than their African American information regarding religious affiliation counterparts for an equivalent rental. cannot even be inquired about in interviews, Another example where expanding the whereas other types of personal information amount of information that is available to the may, theoretically, be inquired about, but marketplace may influence discrimination should not actually be used in decisions con- concerns employment opportunities for indi- cerning hiring or wages). viduals with criminal records. Evidence sug- Information technology, however, has cre- gests that employers do use criminal records ated new challenges in this context: many to screen candidates (e.g., Bushway 2004). job candidates nowadays publicly provide Because of the stigmatizing effect associated personal information through social-network with a criminal history, individuals with crim- profiles, including information such as sex- inal records are more likely to experience job ual orientation or religious affiliation, which instability and wage decline (see, for instance, may actually be protected under state or Waldfogel 1994; Nagin and Waldfogel 1995). federal laws. Employers are not supposed to Information ­technology has exacerbated the ask about such information during the hiring problem: large numbers of criminal histo- process—but searching for it online signifi- ries are now computerized in state repos- cantly reduces the risk of detection. Acquisti itories and commercial databases. Thus, ex and Fong (2013) have investigated the role offenders may be trailed by their crime his- of social media in the hiring behavior of US tories wherever they may apply for jobs. This firms. 
In the authors' experiment, they create online social media profiles for job candidates and then submit job applications on behalf of those candidates to a sample of over 4,000 US employers. If an employer were to search online for the name found in the resumé and application it received, it would find the social media profile of the candidate and be exposed to the experimental manipulation. Acquisti and Fong estimate that only a (sizable) minority of US employers likely searched online for the candidates' information, and that the overall effect of the experimental manipulations was small. However, they do find evidence of both search and discrimination among a self-selected set of employers. In this, as in other scenarios, it is still unclear to what extent novel information channels will reduce market frictions and increase efficiency, or in fact promote new forms of discrimination.

3.4 Privacy and Health Economics

Privacy protection may affect the extent and direction of data-based technological progress. Particularly significant privacy trade-offs arise in the context of technology adoption in the medical industry, as many new health-care technologies depend on information exchange (Schwartz 1997). Innovations in digitizing health information can lead to quality improvements by making patient information easy to access and share. For instance, electronic medical records (EMRs) allow medical providers to store and exchange patient information using computers, rather than paper records. Hillestad et al. (2005) suggest that EMRs could reduce annual US health-care costs by $34 billion through greater efficiency and safety, assuming a fifteen-year adoption period and 90 percent EMR adoption. However, privacy regulation may affect the rate and manner in which hospitals and health-care providers adopt EMRs. In the European Union, personal data recorded in EMRs must be collected, held, and processed in accordance with the Data Protection Directive. In the United States, the 1996 Health Insurance Portability and Accountability Act (HIPAA) established some rules for privacy in health care, and the 2009 Health Information Technology for Economic and Clinical Health (HITECH) Act, part of the American Recovery and Reinvestment Act, devoted $19.2 billion to increasing the use of EMRs by health-care providers. US hospitals may hesitate to adopt EMR systems if (1) they are concerned about their patients' responses, or (2) regulation intended to protect patient privacy ends up hindering the adoption of such systems because hospitals cannot properly utilize them by, for instance, exchanging patient information with other hospitals. (Although EMRs were invented in the 1970s, by 2005 only 41 percent of US hospitals had adopted a basic EMR system (Goldfarb and Tucker 2012a).)

Using variations in medical privacy laws across US states and across time, Miller and Tucker (2007, 2009, 2011a, 2014a) provide evidence quantifying the effect of state privacy protection on the diffusion of EMRs. They find that privacy regulations restricting a hospital's release of patient information significantly reduced the adoption of electronic medical records, primarily due to diminished network effects in adoption. Their analysis suggests that state privacy regulation restricting the release of health information reduces aggregate EMR adoption by more than 24 percent. They further estimate that a 10 percent increase in the adoption of such systems could reduce infant mortality by 16 deaths per 100,000 births.

Miller and Tucker identify two schools of thought about the interplay of privacy concerns, regulation, and technological innovation. Although their discussion focuses on the health-care sector, their arguments apply more generally to the interaction between privacy laws and innovation. The first school of thought holds that regulatory protection inhibits technology diffusion by imposing costs on the exchange of information. In addition to these trade-offs, hospitals are faced with complexities concerning state-specific regulation and information exchange across state lines. The second school of thought argues, instead, that explicit privacy protection promotes the use of information technology by reassuring potential adopters that their data will be safe.

Consider a possible example of the latter dynamic in the context of Health Information Exchanges (HIEs).
HIEs are information technology solutions that facilitate the sharing of patients' electronic medical records. They are expected to enhance information-sharing capabilities among health-care entities, with the aim of improving the quality of care. Their adoption, however, is said to have been hindered by privacy concerns, and it is unclear how privacy laws, such as legislation restricting the disclosure of health records, impact their adoption. In the United States, state laws may incentivize HIE efforts, include specific privacy requirements for sharing health-care data, or both. Adjerid et al. (forthcoming) investigate the impact that different state laws have had on the emergence and success of HIEs. They compare the adoption and success of HIEs in states with laws that limit information disclosure and in states without such laws. The authors find that the combination of adoption subsidies and stronger privacy protection (that is, legislation that includes strict requirements for patients' consent in order to use their medical data) is associated with greater HIE adoption than either privacy protection alone or, importantly, subsidies alone. Their results suggest that there can be policy complementarities between privacy laws and other types of interventions (such as financial subsidies and technical assistance in the case of HIEs). Their findings also highlight that different degrees of privacy regulation can have different effects on technological innovation and on economic welfare. Regulators may thus find room for balancing meaningful privacy protection with incentives for the adoption of information technology.

Genetic research is another field where complex trade-offs may arise from the interplay of technological innovation and privacy regulation. Oster et al. (2010) use data from a prospective cohort study of approximately 1,000 individuals at risk for Huntington's disease (HD), a degenerative neurological disorder with significant effects on morbidity, to estimate adverse selection in long-term care insurance. They find evidence of adverse selection: individuals who carry the HD genetic mutation are up to five times more likely than the general population to own long-term care insurance. Other genes, such as those associated with increased risks of breast cancer, colon cancer, Parkinson's, and Alzheimer's diseases, have also been identified, and testing for these genes is becoming more common and more precise (Burton et al. 2007). This testing, in turn, is likely to increase the amount of private information stored about individuals. On the one hand, this information may be useful in developing treatments, vaccines, and immunizations. On the other, while US laws limit an insurer's ability to observe an individual's specific genetic information, marketers (e.g., for certain drugs and treatments) and advertising platforms may certainly be interested in it. A number of companies in fact aim at offering genetic testing to individuals at affordable rates.22

22 For instance, 23andme (https://www.23andme.com/) did so before the US Food and Drug Administration (FDA) directed them to cease while undergoing a regulatory review process. Prior to their primary operations being halted, they would store individuals' DNA and offer updates on potential health issues as testing procedures advanced. Currently, they do so for a subset of potential health conditions—those for which they have received FDA approval.

Relatedly, using data on genetic testing for cancer risks, Miller and Tucker (2014b) examine how state genetic privacy laws affect the diffusion of personalized medicine. They identify three approaches taken by states to protect patients' genetic privacy: requiring informed consent; restricting discriminatory usage by employers, health-care providers, or insurance companies; and limiting redisclosure without consent. They show evidence that the redisclosure approach encourages the spread of genetic testing, in contrast to the informed consent approach, which deters it.

The results in both Miller and Tucker (2014b) and Adjerid et al.
(forthcoming) illustrate how privacy laws need to be tailored to take into account, and balance, specific and continually evolving trade-offs. Rather than viewing privacy regulation in a binary, monotonic fashion, they show that the effect of regulation on technology efforts can be heterogeneous, depending on the specific requirements included in the legislation. Consider, again, genetic data: genomic analyses may reveal information not only about an individual's current health, but also about future health risks, and this potential to reveal information is likely to expand. These analyses are useful for patients and health-care providers because they facilitate the delivery of personalized medicine. At the same time, as personal genetic and genomic information becomes increasingly available, consumers face new privacy risks—for instance, if such information reaches the hands of advertising platforms and data aggregators, the latter may use it to construct risk profiles for individuals and their biological relatives, such as children and parents, combine it with other data, and improve their targeting of product offerings.
Adding another angle to this discussion is Miller and Tucker's finding that genetic privacy laws have distinct effects beyond standard health data privacy laws—in particular, different laws may alter individual behavior.23 Yet another trade-off in this legislative balancing act arises from the observation that wider access to genetic and genomic analyses can lead to broader improvements in overall health care. However, and importantly, medical privacy and medical analytics (including genetic research) do not have to be antithetical. In 2015, the Nuffield Bioethics Council in the United Kingdom produced a report highlighting a series of recommendations for achieving two seemingly contrasting requirements: generating, using, and extending access to data (because doing so "is expected to advance research and make public services more efficient"), while, at the same time, protecting privacy ("as this is a similarly strong imperative, and a requirement of human rights law") (Nuffield Council on Bioethics 2015).

23 This finding is reminiscent of Johnson and Goldstein (2004) who, as previously mentioned, show that consumers tend to go with the default option chosen for them in the case of organ donation, despite heavy lobbying by organizations. That is, if the default approach is for consumers to disclose genetic information to their immediate health-care provider along the lines of the "redisclosure" approach, then more consumers are likely to accept the service than under the opt-in system of the "informed consent" approach.

Another example of balancing health-related informational privacy with public benefits is the identification of infectious disease outbreaks—the reporting of these cases to state authorities is usually exempt from privacy restrictions. Rapidly identifying such outbreaks is critical for the effective initiation of public-health intervention measures, the preparation and readjustment of vaccines, and the timely alerting of governmental agencies and the general public. Google Flu Trends, for instance, takes advantage of users' searches for influenza-related terms to provide both public-health professionals and the general population with a real-time, geographically specific view of influenza search activity in the United States.24 Other monitoring services include HealthMap and the International Society for Infectious Diseases' Program for Monitoring Emerging Diseases.

24 Also see, however, section 4.4 on the problem of drawing proper conclusions from the data.

3.5 Privacy and Credit Markets

In the United States, credit reporting was not regulated at the federal level until 1970, when the Fair Credit Reporting Act (FCRA) was legislated. The Act was subsequently amended several times. Currently, the credit-reporting industry is among the most regulated in terms of data protection. The FCRA established permissible purposes of credit information disclosure, codified information flows along the lines that had naturally developed in the market, introduced dispute settlement mechanisms and data correction procedures, and assigned expiration dates to negative information such as bankruptcies and payment defaults. Several information flows, such as those among nonaffiliates, were left unregulated at the federal level, although some states enacted their own regulations (Jentzsch 2006). The 1990s brought major reforms in the United States that were intended to strengthen financial privacy laws, in light of an intensifying public debate about privacy erosion given advancements in information technology. The Consumer Credit Reporting Reform Act (CCRRA) of 1996 introduced, for the first time, duties for financial information providers. In order to correct inaccuracies in consumers' records, the CCRRA mandated a two-sided information flow to and from credit bureaus and providers, and formalized some information flows among affiliates.

The Gramm–Leach–Bliley (GLB) Act of 1999 extended the CCRRA by formally and legally allowing a variety of financial institutions to sell, trade, share, or give out nonpublic personal information about their customers to nonaffiliates, unless their customers direct that such information not be disclosed by opting out. The GLB Act, while granting consumers the option to opt out, restricts it to nonaffiliates. An affiliate is defined as any company that controls, is controlled by, or is under common control with another company. Consumers have limited (if any) power to restrict this kind of "corporate family" trading of personal information. There are also several other exemptions under the GLB Act that can permit information sharing despite a consumer's objection. For instance, if a financial institution wishes to engage the services of a separate company, it can transfer personal information to that company by arguing that the information is necessary to the services that the company will perform. A financial institution can also transfer information to a marketing or sales company to sell new products or jointly offered products. Once this unaffiliated third party has a consumer's personal information, it can share the information within its own "corporate family," although it cannot likewise transfer the information to further companies through this exemption. In addition, financial institutions can disclose users' information to credit reporting agencies to comply with any other laws or regulations.
For lenders, the extension of credit to borrowers depends on the acquisition, and possibly the exchange, of personal information among market participants. Jentzsch (2006) develops a financial privacy index to quantify the extent of information protection across different regimes, demonstrating that the United States grants less data protection than EU members. The primary concern is that more stringent data protection regulations may lead to reduced access to credit, thus creating a trade-off with consumer privacy. In line with this work, Pagano and Jappelli predict that if banks share information about their customers, they will increase lending to safe borrowers, thereby decreasing default rates (Pagano and Jappelli 1993; Jappelli and Pagano 2002). Relatedly, Einav, Jenkins, and Levin (2013) find that credit scoring provides the ability to target more generous loans to lower-risk borrowers among individuals with lower income. Other empirical studies tend to focus on the effects of credit bureaus and creditor rights using data from a cross section of countries (see, e.g., Djankov, McLiesh, and Shleifer 2007; Qian and Strahan 2007).

States and local municipalities may enact legislation and local ordinances that exceed the protections in the GLB Act.25 They may require, for instance, opt-in consent, as is the case in a subset of the Bay Area counties examined in Kim and Wagman (2015). In a study that directly bridges the theoretical and empirical analyses of privacy, Kim and Wagman incorporate information acquisition and privacy regulation—through restrictions on information trade—into a model of consumer screening. In their model, firms, such as mortgage lenders, compete in prices. Lower prices, however, entail more stringent screening of applicants. The authors show that, in equilibrium, consumers apply to obtain loans from firms posting the lowest prices, despite anticipating more stringent loan approval processes. They then demonstrate that enabling firms to sell applicant data to interested downstream parties, such as insurers, can lead to even lower prices, higher screening intensities, and higher rejection rates of applicants; however, social welfare, overall, increases.

25 However, in ABA v. Brown, banks were partially successful in preempting state restrictions on the sharing of affiliate information and credit-reporting information. See, e.g., https://epic.org/privacy/preemption/ABABrown-SG-Brief.pdf.
One of the main criticisms of the GLB Act's privacy provisions has been that most consumers do not (and likely will not) take advantage of the opt-out option to request that a firm cease trading in their information. In 2002, three out of five counties in the San Francisco–Oakland–Fremont, California, Metropolitan Statistical Area enacted a local ordinance (effective January 1, 2003) that is more protective than then-current practices in that it pursues an opt-in approach. Specifically, the local ordinance would require financial institutions to seek a written waiver before sharing consumer information with both affiliates and nonaffiliates. The variation in the adoption of the ordinance—adopted in three of the five counties—led to simple policy differences in local financial-privacy statutes. Exploiting this variation, and using Census tract-level and individual loan-level data on mortgage and refinancing applications, Kim and Wagman demonstrate that the opt-in ordinance had a statistically significant negative effect on loan denial rates (that is, approval rates increased), consistent with their theoretical model's predictions. They further provide some suggestive evidence that foreclosure start rates during the financial crisis of 2007–08 were higher in the counties that adopted the privacy ordinance, possibly indicative of looser underwriting standards following the ordinance, also in line with their model's predictions.
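To fix ideas, a county-level difference-in-differences design of the kind this comparison suggests could be written as follows (the notation and variable names are ours, offered as an illustrative sketch rather than as Kim and Wagman's exact estimating equation):

\[
\mathit{Denial}_{ict} = \alpha_c + \tau_t + \beta\,(\mathit{OptIn}_c \times \mathit{Post}_t) + X_{ict}'\gamma + \varepsilon_{ict},
\]

where \(\mathit{Denial}_{ict}\) indicates whether loan application \(i\) in county \(c\) at time \(t\) was denied, \(\mathit{OptIn}_c\) equals one for the three adopting counties, \(\mathit{Post}_t\) equals one after January 1, 2003, and \(X_{ict}\) collects application-level and Census tract-level controls. Under this reading, the reported decline in denial rates corresponds to an estimate of \(\beta < 0\).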

3.6 Markets for Privacy and Personal Data

Databases of consumer data or consumer reports have existed throughout the twentieth century (Smith 2000). The progress of information technology and the advent of the Internet have, however, vastly increased the scope and reach of those databases, ultimately giving rise to a market ecosystem of organizations that gather, merge, clean, analyze, buy, and sell consumer data. This ecosystem, although dominated by a decreasing number of players (Krishnamurthy and Wills 2009), is still rather complex and decentralized (Olejnik, Minh-Dung, and Castelluccia 2014). There is no single, unified market for personal data. Rather, there are multiple markets in which data is traded, and multiple markets in which privacy is sought or purchased (cf. Lane et al. 2014). These include markets where data aggregators buy and sell data to other organizations (data subjects generally do not participate directly in these markets, and are in fact often unaware that reports on their names may even exist); markets in which consumers exchange personal information for "free" products or services (for instance, search engines and social media); markets where consumers actively attempt to purchase protection for their data and/or against the negative consequences of privacy intrusions (for instance, identity theft insurance services); and markets where consumers attempt to explicitly trade their data in exchange for money (such as the services provided by "personal data vault" firms, akin to the proposal for privacy markets in Laudon 1996).

One particular type of database that has attracted the attention of economists is the National "Do-Not-Call" Registry, a database established by the Do-Not-Call Implementation Act of 2003. It allows US residents, by registering, to bar telemarketers from calling their phone numbers with promotional offers. Within twenty-four hours of its opening on June 27, 2003, over 10 million telephone numbers were registered; by February 2007, registrations exceeded 139 million. Some studies have delineated the demographic characteristics of those likely to opt out using Do-Not-Call, and estimated consumers' benefit from doing so. Varian, Wallenberg, and Woroch (2005) calculate consumers' value for telemarketing privacy to range from $0.55 to $33.21 per household per year, while Png (2010), using state-level registries, estimates it to be between $13.19 and $98.33 per household per year. Goh, Hui, and Png (2015) use data from the federal registry (and previously established state-level registries) to investigate externalities arising from consumers opting out via the Do-Not-Call registry. Consumers who opt out prefer privacy to the benefits associated with targeted advertisements. Their decision reduces the pool of consumers available for sellers to solicit. In response, sellers redirect some of their marketing efforts to those consumers who are still available for solicitation. As these consumers experience an increase in solicitations, some of them respond by opting out as well. As more consumers opt out, sellers continue to adjust their solicitation. In a sense, sellers face a form of prisoner's dilemma: individually, sellers wish to intensify their targeting of the pool of consumers who did not opt out; collectively, they would be better off holding back to keep this pool of consumers from shrinking further. Consumers, on the one hand, benefit from having the option of not being targeted with advertisements; on the other hand, consumers who opt out lose the benefits of targeted advertising, and those who do not opt out are likely to be excessively targeted with advertisements. The conclusions of these works may extend to the ongoing debate over a "do-not-track" policy for online markets.
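The sellers' dilemma just described can be illustrated with a stylized two-seller game; the payoffs below are hypothetical and chosen only to exhibit the structure of the argument:

                         Seller 2 holds back    Seller 2 targets heavily
  Seller 1 holds back          3, 3                    1, 4
  Seller 1 targets heavily     4, 1                    2, 2

Targeting heavily is each seller's dominant strategy (4 > 3 if the rival holds back, and 2 > 1 if the rival targets), yet joint restraint (3, 3) would leave both sellers better off than the equilibrium outcome (2, 2). This is precisely the structure of a prisoner's dilemma, here driven by the shrinking pool of contactable consumers.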
3.7 Privacy and Information Security

While privacy and information security are distinct concepts, they can overlap. By "information security" we refer to the processes designed to protect data assets. Poor information security can lead to what Solove (2006) refers to as "insecurity," or carelessness in protecting (personal) information from leaks and improper access. While the economics of information security has become a field of research in its own right, covering subjects as diverse as the optimal timing for patching operating systems or markets for software vulnerabilities,26 a number of topics are of interest to both privacy and security researchers, such as spam, identity theft, and data breaches.

26 For a review of the literature on the economics of information security, see Anderson and Moore (2006).

Although spam messages are not entirely random (Hann et al. 2006), the term "spam" refers to the indiscriminate use of electronic messaging systems to send unsolicited advertisements to consumers. A study by Ferris Research estimates that in 2009 the cost of spam, accounting for decreased user productivity, was about $130 billion, with $42 billion in the United States alone. These estimates, however, should be taken with caution: Rao and Reiley (2012) estimate a much lower overall societal cost of spam, $20 billion.

While every Internet user receives spam, the cost per user is low, primarily due to users' reliance on filtering technologies. Identity theft, on the other hand, may affect fewer individuals, but at larger individual costs.

The 1998 US Identity Theft and Assumption Deterrence Act (ITADA) defines identity theft as the knowing transfer, possession, or usage of any name or number that identifies another person with the intent of committing, aiding, or abetting a crime. Advances in information technology have allowed identity thieves to combine information taken from a variety of sources to open accounts in others' names (Cheney 2005; Coggeshall 2007). Anderson, Durbin, and Salinger (2008) report 30 mentions of "identity theft" in US newspapers in 1995; 2,000 in 2000; and 12,000 in 2005. In 2006, identity theft resulted in corporate and consumer losses of $61 billion, with 30 percent of known identity thefts caused by corporate data breaches. By 2012, the Bureau of Justice estimated that 16.6 million US residents ages sixteen and older (or about 7 percent of the population in that age group) had been victims of at least one incident of identity theft. By 2014, the number of US victims was estimated at 17.6 million (Harrell 2015). It is further estimated that 75 percent of recorded breaches between 2002 and 2007 were caused by hackers or external sources, with over 77 percent involving the theft of Social Security numbers (Anderson, Durbin, and Salinger 2008; Romanosky, Telang, and Acquisti 2011; Harrell 2015).

Miller and Tucker (2011b) study the impact of data encryption laws on data breach incidences. Their study highlights the risks associated with internal security threats (e.g., those posed by employees authorized to access a corporate database). In particular, policies that focus on outside threats may paradoxically redirect efforts away from protecting against internal risks.
However, as noted by Mann (2015), "information lost may not be information abused." That is, the probability distributions of data breach occurrences, and of actual abuse of stolen information conditional on a breach, are, if not unknown, extremely uncertain. This makes it harder to devise (or agree upon) a sound framework for data security policy.

For instance, Roberds and Schreft (2009) argue that the loss of privacy due to identity theft is outweighed by gains from the relative ease of gaining access to available credit. Kahn and Roberds (2008) model the incidence of identity theft as a trade-off between the desire to avoid costly or invasive monitoring of individuals on the one hand, and the need to control transaction fraud on the other. They suggest that this trade-off will prevail despite any technological advances. Kahn, McAndrews, and Roberds (2005) examine the role of money in providing privacy and transactional anonymity, wherein a credit purchase may identify the purchaser. In a simple trading economy with moral hazard, the authors compare the efficiency of money and credit, and find that money may indeed be useful as a means of preserving anonymity toward sellers. More recently, the emergence of Bitcoin has provided a vehicle for doing just that—facilitating increased anonymity when transacting online (see, e.g., Böhme et al. 2015).

In response to increasing concerns regarding identity theft, many US states have adopted data-breach disclosure laws. By and large, these laws require firms to notify consumers if their personal information has been lost or stolen. Romanosky, Telang, and Acquisti (2011) use FTC data to estimate the impact of data-breach disclosure laws on identity theft over the years 2002 to 2007. They find that the adoption of data-breach disclosure laws has a marginal effect on the incidence of identity theft, reducing its average rate by under 2 percent. At the same time, state disclosure laws may have other benefits, such as reducing an average victim's loss and improving firms' security and operational practices (Schwartz and Janger 2007).

In some sense, whether or not state laws require firms to disclose information about data breaches could be interpreted as a matter of firms' own level of privacy, which may pose its own set of trade-offs. Ideally, firms should be induced by strict disclosure laws to secure their customers' data. However, several studies that examine the financial impact of such disclosures on firms have come up with mixed and primarily mild results. Campbell et al. (2003), for instance, find only limited evidence of a negative reaction by the stock market to news of security breaches, although they do find a significant and negative effect on stock price for breaches caused by unauthorized access of confidential information. Using an event-study methodology, and considering a time window of one day before and one day after the announcement of a breach, they calculate a cumulative effect of −5.4 percent. Cavusoglu, Mishra, and Raghunathan (2004) find that the disclosure of a security breach results in the loss of 2.1 percent of a firm's market valuation over two days (the day of the announcement and the day after). Telang and Wattal (2007) find that software vendors' stock prices suffer when vulnerability information about their products is announced. Acquisti, Friedman, and Telang (2006) focus on the announcements of privacy breaches. They find a negative and significant, but temporary, reduction of 0.6 percent in the stock market price of affected firms on the day of the breach. Ko and Dorantes (2006) find that, while a firm's overall performance is lower in the four quarters following a breach, the breached firm's sales increase significantly relative to firms that incurred no breach. These findings suggest that a strict disclosure policy may not be, by itself, the solution to aligning the interests of firms, in terms of data security, with those of their customers.
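For reference, the event-study magnitudes reported above are cumulative abnormal returns. In standard notation (ours; the cited studies differ in implementation details such as the estimation window),

\[
AR_{it} = R_{it} - (\hat{\alpha}_i + \hat{\beta}_i R_{mt}), \qquad
CAR_i(t_1, t_2) = \sum_{t=t_1}^{t_2} AR_{it},
\]

where \(R_{it}\) is firm \(i\)'s stock return on day \(t\), \(R_{mt}\) is the market return, and \(\hat{\alpha}_i\) and \(\hat{\beta}_i\) are market-model coefficients estimated over a pre-event window. The −5.4 percent figure, for example, is a cumulative abnormal return of this kind computed over the three-day window from one day before to one day after the announcement.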
overall performance is lower in the four quar- The apparent dichotomy between pri- ters following a breach, the breached firm’s vacy attitudes, privacy intentions, and actual sales increase significantly relative to firms privacy behaviors has caught the attention that incurred no breach. These findings sug- of scholars (e.g., Berendt, Gunther, and gest that a strict disclosure policy alone may Spiekermann 2005), leading to a debate not be, by itself, the solution to aligning the over the so-called privacy paradox (Norberg, interests of firms, in terms of data security, Horne, and Horne 2007) and the value peo- with those of their customers. ple assign to their privacy. Is the dichotomy real or imaginary? Do people actually care 3.8 Consumer Attitudes and Behaviors Although privacy concerns seem to vary decidedly with context as well as per- 27 For analyses of privacy complaints submitted by consumers to the FTC, see https://ashkansoltani.files. sonal traits (Acquisti, Brandimarte, and wordpress.com/2013/01/knowprivacy_final_report.pdf. Loewenstein 2015), surveys of US respon- 28 See www.torproject.org. Acquisti et al.: The Economics of Privacy 477 about privacy? If they do, how much exactly evidence of privacy concerns increasing over do they value the protection of their personal an eight-year period; Stutzman, Gross, and data? Acquisti (2012) find evidence of increasing Perhaps anticlimactically, a first possible privacy-seeking behavior among a sample resolution to the paradox is that it may not of over 4,000 early Facebook members; actually exist. Attitudes are often expressed Kang, Brown, and Kiesler (2013) document generically (for instance, the seminal cat- Internet users’ attempts to maintain anonym- egorization by Harris and Westin (1992) of ity online; and Boyd and Marwick (2011) dis- individuals into privacy pragmatists, funda- cuss various alternative strategies teenagers mentalists, and unconcerned relied on broad adopt to protect their privacy while engaging and general survey questions), whereas in online sharing. behaviors (or behavioral intentions) are spe- That noted, evidence of dichotomies cific and contextual. Thus, it should not be between specific attitudes or preferences and surprising that the former may not correlate actual behaviors have also been uncovered. with or predict the latter (Fishbein and Consider, for instance, Acquisti and Gross Ajzen 1975). (2006), in the context of social networking A second resolution is that people routinely sites; or consider Turow et al. (2009), who find and expertly engage in mental ­trade-offs that 66 percent of Americans do not wish for of privacy concerns and privacy benefits marketers to tailor advertisements to their (Milberg et al. 1995), or a so-called privacy interests—while the vast majority of them calculus (Laufer and Wolfe 1977; Culnan use search engines and ­social-networking and Armstrong 1999; Dinev and Hart 2006). sites, which operate based on enabling This context-dependent calculus naturally advertisers to target advertisements. leads to situations in which consumers will Thus, it is more likely that the purported choose to protect their data, and other situa- dichotomy between privacy attitudes and tions in which protection will be seen as too privacy behaviors is actually the result of costly or ineffective and sharing is preferred. many, coexisting, and not mutually exclu- Privacy is, after all, a process of negotiation sive different factors. 
Among them, a role between public and private, a modulation of is likely played by various decision-making what a person wants to protect and what she hurdles consumers face when dealing with wants to share at any given moment and in privacy challenges, especially online, such any given context. Therefore, neither does as asymmetric information, bounded ratio- the sharing of certain information with oth- nality, and various heuristics. For instance, ers imply, per se, a loss of privacy, nor is the some individuals may not be aware of the complete hiding of data necessary for the extent to which their personal information protection of privacy. In fact, the observation is collected and identified online (many that people seem not to protect their privacy Internet users are substantially unaware of online very aggressively does not justify the the extent of behavioral targeting, and many conclusion that they never do so. Tsai et al. believe that there is an implied duty of con- (2011) find that consumers are, sometimes, fidentiality and law that protects their data willing to pay a price premium to purchase despite disclosure; see, e.g., McDonald and goods from more privacy-protective mer- Cranor 2010; Hoofnagle and Urban 2014). chants; Goldfarb and Tucker (2012b) use Or, some individuals may not be aware of surveys to measure respondents’ implied possible alternative solutions to their privacy concern for privacy by their willingness to concerns (such as privacy-enhancing tech- disclose information about income, and find nologies). Furthermore, some individuals’ 478 Journal of Economic Literature, Vol. LIV (June 2016)

Among them, a role is likely played by the various decision-making hurdles consumers face when dealing with privacy challenges, especially online, such as asymmetric information, bounded rationality, and various heuristics. For instance, some individuals may not be aware of the extent to which their personal information is collected and identified online (many Internet users are substantially unaware of the extent of behavioral targeting, and many believe that there is an implied duty of confidentiality and law that protects their data despite disclosure; see, e.g., McDonald and Cranor 2010; Hoofnagle and Urban 2014). Or, some individuals may not be aware of possible alternative solutions to their privacy concerns (such as privacy-enhancing technologies). Furthermore, some individuals' privacy-sensitive decision making—even that of well-informed and privacy-sensitive subjects—may be affected by cognitive and behavioral biases, such as immediate gratification or status quo bias (Acquisti 2004; John, Acquisti, and Loewenstein 2011).

Because of those hurdles, it is difficult to pinpoint reliably the valuations that consumers assign to their privacy or to their personal data. Certainly, there is no shortage of studies that attempt to quantify the value of data for both organizations and end users. For instance, Olejnik, Minh-Dung, and Castelluccia (2014) find that elements of users' browsing histories are traded among Internet advertising companies for amounts lower than $0.0005 per person. Hann et al. (2007) quantify the value that US subjects assign to protection against errors, improper access, and secondary uses of personal information online at an amount between $30.49 and $44.62. Similarly, Savage and Waldman (2013) find that consumers may be willing to make a one-time payment of $2.28 to conceal their browser history, $4.05 to conceal their contacts list, $1.19 to conceal their location, $1.75 to conceal their phone's identification number, $3.58 to conceal the contents of their text messages, and $2.12 to eliminate advertising. However, privacy outcomes are uncertain (Knight 1921), and privacy concerns and expectations are remarkably context dependent (Nissenbaum 2004). Thus, small changes in contexts and scenarios can lead to widely differing conclusions regarding consumers' willingness to pay to protect their data. For instance, in a lab experiment, Tsai et al. (2011) find that a substantial proportion of participants were willing to pay a premium (roughly half a dollar, for products costing about $15) to purchase goods from merchants with more protective privacy policies; Jentzsch, Preibusch, and Harasser (2012) find that (only) a third of participants were willing to pay a similar premium to purchase cinema tickets from a merchant that requests less personal information than a competing, but cheaper, merchant; and Preibusch, Kubler, and Beresford (2013) find that a vast majority of participants chose to buy a DVD from a cheaper but more privacy-invasive merchant, rather than from a costlier (1 euro more) but less invasive merchant. In fact, behavioral and cognitive heuristics may also play a significant role in affecting privacy valuations.29

29 For instance, applying the endowment effect to the study of privacy, Acquisti, John, and Loewenstein (2013) identify a discrepancy of up to five times in the value individuals assign to the protection of personal information, depending merely on the framing of the trade-offs, as opposed to actual changes in the trade-offs.

4. The Evolving Privacy Debate

Both the theoretical and the empirical studies we have examined in the previous sections of this article suggest that the path towards optimally balancing privacy protection and benefits from disclosure is, at the very least, uncertain. And yet, those studies also make the following clear: (1) different stakeholders—including businesses, consumers, and governments—each have different, multilayered, and often conflicting objectives; (2) information technologies, privacy concerns, and the economics of privacy evolve constantly, with no single study or policy intervention being able to fully account for future (and even some present) concerns; and (3) rather than a uniform piece of regulation to address contemporary privacy issues, a nuanced approach—dynamic and individualized to specific markets, contexts, and scenarios—may be necessary. In this section, we point out a number of privacy issues that have started and continue to attract a lively debate involving economics, technology, and policy. We also propose a number of directions for future research.

4.1 Regulation versus Self-Regulation

Empirical studies of privacy trade-offs have contributed to a debate over how to best protect privacy without harming the beneficial effects of information sharing. Much of this debate has juxtaposed the relative benefits of regulation and self-regulation (Bennett and Raab 2006; Koops et al. 2006). On one side of the debate, Gellman (2002) estimates that in 2001, $18 billion were lost by companies in Internet retail sales due to buyers' privacy concerns, and appraises at even larger amounts the costs that consumers bear when their privacy is not protected (these costs include losses associated with identity theft, higher prices, spam, and investments aimed at protecting data). This side of the debate advocates regulatory solutions to privacy problems (Solove 2003). One of the highlighted advantages of such solutions would be sparing data subjects the complexity of interacting with different entities, each with different privacy policies (cf. Milberg, Smith, and Burke 2000).

On the opposite side of the debate, Rubin and Lenard (2001) suggest that the costs of privacy protection are much higher, for firms and consumers alike, than the costs that may arise from privacy violations. For instance, according to the authors, targeted advertising gives consumers useful information, advertising revenues support new Internet services, and reducing the use of online information would ultimately be costly to consumers. This side of the debate advocates self-regulation. Self-regulatory solutions may work, for instance, when concerns over adverse consumer response limit advertisers' usage of invasive targeting of ads (Lohr 2010); website operators choose to comply with their published policies rather than engage in spam (Jamal, Maier, and Sunder 2003); and firms refrain from engaging in certain forms of price discrimination so as not to antagonize consumers (Anderson and Simester 2010).

The United States and the European Union have taken different positions in this debate. The European Union has focused on regulatory solutions, establishing principles that govern the use of data across multiple sectors, including the need for individuals' consent for certain data-processing activities. By contrast, the United States has taken a more limited, sectorial, and ad hoc regulatory approach, often opting for providing guidelines rather than enforcing principles. For instance, recommendations to the US Congress by the Federal Trade Commission (Federal Trade Commission 2012), motivated by the delays with which companies have adopted appropriate privacy rules, included the introduction of a do-not-track mechanism, similar to the Do-Not-Call list that became law in 2003. Such a mechanism would be built into websites and web browsers, and would allow people to signal to websites (and their commercial partners) that they do not want to be tracked. Limitations with respect to verification and enforcement would undoubtedly exist. Currently available services that allow consumers to opt out of advertising networks (such as the Self-Regulatory Program for Online Behavioral Advertising, and Google's opt-out settings) prevent users from receiving certain types of targeted ads, but they do not stop advertisers or sites from collecting data.

Self-regulatory solutions often rely on transparency and control (also known as "notice and consent"), and are therefore predicated on individuals' ability to be informed about, and properly manage, privacy settings and privacy concerns. However, numerous empirical studies have highlighted the limitations of transparency mechanisms.

These include the failure of privacy policies to properly inform consumers about how their data will be used (Jensen and Potts 2004); the large opportunity costs associated with frameworks that rely on consumers reading privacy policies (McDonald and Cranor 2008); and the fact that the same policy can nudge individuals to disclose varying amounts of personal data simply by manipulating the format in which the policy itself is presented to users (Adjerid et al. 2013). Control mechanisms have also been critiqued. For instance, while research in the information systems literature has suggested that providing users with control over their information can reduce privacy concerns (Culnan and Armstrong 1999; Tucker 2014; Malhotra, Kim, and Agarwal 2004), the protection afforded by that control may be illusory. To that effect, Brandimarte, Acquisti, and Loewenstein (2013) highlight how the mere provision of more perceived control over personal information can paradoxically lead users to take more risks with their personal information, increasing their willingness to share sensitive data with other parties. As a result, doubts have been expressed about the viability of self-regulated transparency and control mechanisms in adequately protecting consumers' privacy (Acquisti, John, and Loewenstein 2013; Solove 2013; Zuiderveen Borgesius 2015). The limitations of notice and consent mechanisms as viable instruments of privacy policy were also acknowledged in a 2014 report by the US President's Council of Advisors on Science and Technology (Executive Office of the President—President's Council of Advisors on Science and Technology 2014).

An alternative approach to privacy protection relies on the propertization (Laudon 1996; Varian 1997; Schwartz 2004) or licensing (Samuelson 2000) of personal information. As noted in sections 2.2 and 3.6, various authors have proposed the establishment of markets where individuals can trade (rights over) their personal information. With the advent of social media, a number of startups began offering services along those lines. However, it is not clear that such markets for personal data could ever be successful. First, when interacting with services that offer trade in and protection of their data, consumers face hurdles similar to those that arise when dealing with transparency and consent in the presence of traditional privacy policies—including the hurdle of estimating the fair value of their personal information. Second, in the absence of regulatory frameworks that enforce protection of traded data, the possibility of secondary usage of personal information (after the subject has traded it to another party) may run counter to the very idea of protecting consumer data. Third, much of the consumer data that is of value to advertisers is nonstatic information that is dynamically generated through the individual's interaction with other online services, such as search engines or online social networks. These services would be unlikely to relinquish control over the personal information that their technologies help generate. An additional alternative proposed in the literature is the application of soft paternalistic solutions (designed by governments, organizations, or data subjects themselves as self-control mechanisms) to "nudge" individuals towards the personal information practices they have claimed to prefer (Wang et al. 2013).

As noted in section 2.2, market-based solutions and regulatory approaches to privacy protection are not polar opposites. They are better perceived as points on a spectrum of solutions—from regimes that rely entirely on firms' self-regulation and consumers' responsibility (even in the absence of clearly defined and assigned property rights over personal data), to regimes with strict regulatory protection of data.
Similarly, as pointed out in section 3.4, an understanding is emerging that the economic impact of privacy regulation is heterogeneous and context dependent: privacy regulation may have both positive and negative effects on economic growth and efficiency, depending on the specific attributes of the laws. Thus, a potentially worthwhile direction of future research will focus on the specific features of regulation (and their differential effects on economic outcomes), rather than on simpler binary models contrasting regulation with its absence.

4.2 From Big Data to Privacy-Enhancing Technologies

As the amount of personal information produced and gathered about individuals continues to increase, so does the ability to use data mining to infer more sensitive information about them. For instance, Sweeney (1997) has highlighted the possibility of reidentifying supposedly anonymous health data, and Acquisti and Gross (2009) have shown how seemingly innocuous self-revelations made on the Internet—such as making available one's date and state of birth in a social network profile—may have serious consequences in terms of privacy intrusion. With evolving data mining and analytics tools being applied to expanding sets of personal data (the so-called "big data" phenomenon), and with new technologies—from facial recognition to smart thermostats, from activity and health trackers to the Internet of Things—the portions of our personal and professional lives that are not monitored and quantified are further reduced. On the one hand, granular personal data may be used to provide even more precisely targeted services and to ensure that advertising is shown only to those consumers who stand to gain most from it (Tucker 2012). On the other, opportunities for abuse may abound. For instance, algorithmic discrimination may take subtle forms (Sweeney 2013 documents cases in which advertising technologies employed by a search engine can expose racial bias). And personal data may be used to influence individual decision making in subtle, targeted, and hidden manners (Calo 2014), raising questions over the limits of a person's autonomy and self-determination in a world where so much personal data can be gathered and used to influence the individual.30

30 For instance, Kramer, Guillory, and Hancock (2014) show that it is possible to influence the emotional states of users of a social networking site in the form of an emotional "contagion" by suppressing information containing positive or, alternatively, negative emotions.

The trade-offs arising from the intersection of big data and privacy suggest several fruitful directions for research. For instance: to what extent will the combination of sophisticated analytics and massive amounts of consumer data lead to an increase in aggregate welfare, and to what extent will it lead to mere changes in the allocation of wealth? A related open question concerns the role of privacy-enhancing technologies (Goldberg 2003) in affecting how personal information will be used, and with what economic consequences. Privacy-enhancing technologies, or PETs, can allow the protection of sensitive data without entirely disrupting commercially valuable flows of consumer information. They are, however, computationally intensive, and they reduce the granularity of individual information available to others (consider, for instance, differential privacy, as in Dwork 2006). Thus, they may diminish its economic value. Therefore, how costly will those privacy-enhancing technologies ultimately be? Will their implementation costs, as well as the opportunity costs they may cause, be offset by gains in privacy protection? And, importantly, who will bear those costs—the data subjects or the data holders?
Finally, the contrast between the potential value of (big) data and its privacy costs raises questions about optimal retention policies. For instance, do larger quantities of historical consumer data provide competitive advantages to Internet search firms (Chiou and Tucker 2014)? And could a so-called "right to be forgotten," suggested by the European Commission,31 support individuals' privacy rights without hampering the societal benefits of data sharing?

31 European Commission, Proposal for a Regulation of the European Parliament and of the Council on the Protection of Individuals with regard to the Processing of Personal Data and on the Free Movement of Such Data (General Data Protection Regulation) COM(2012) 11 final, 2012/0011 (COD), January 25, 2012.

4.3 Open Data, Government Records, and Surveillance

The topic of this survey is the economics of privacy, and we have, therefore, naturally focused on the commercial acquisition and exploitation of personal data. It would be remiss of us, however, not to mention a no less important facet of the privacy debate, one with potentially even greater impact on individuals and societies—namely, the role and value of open access to data and governmental records, as well as the covert governmental collection of personal information.

Access to personal data (from governmental administrative records, to researchers' results arising from experiments and surveys, to firms' collections of consumers' data) is of great importance to empirical economists and social scientists. And once again, trade-offs arise between the utility of sharing personal records and files publicly (or with other researchers) and the privacy risks associated with granting access to third parties. Essentially, data utility and risks of disclosure are correlated (Duncan and Stokes 2004). Even statistical techniques meant to protect data (such as the technique of differential privacy, which attempts to minimize the risks of reidentification of records in a statistical database while maximizing the accuracy of queries from such a database) still face risk/utility trade-offs (Fienberg, Rinaldo, and Yang 2010), not just for firms but also for researchers (Komarova, Nekipelov, and Yakovlev 2015). Furthermore, even protected (or anonymized, or deidentified) data may still be exposed (Ohm 2010; Heffetz and Ligett 2013). For instance, a portion of the anonymized movie ratings data made available by Netflix as part of a competition to improve its ratings algorithms could be reidentified using Internet Movie Database (IMDB) data (Narayanan and Shmatikov 2008). These trade-offs apply both to government databases (the Census uses a variety of mechanisms and procedures to balance researchers' needs to access Census data with considerations for Census respondents' privacy) and to the private sector.32 How to best balance researchers' and society's needs to access granular data with the need to protect individuals' records is a question that simultaneously involves economists and scholars in other disciplines, such as statisticians and computer scientists.

32 Similarly, the problems of data breaches and identity theft discussed in section 3.7 do not arise only in the context of firms' databases, but also governmental ones: in the last few years, a number of large-scale data breaches involved governmental data, such as the loss of 26.5 million records of veterans, their spouses, and active-duty military personnel in 2006, or the recent IRS data breach that has put as many as 724,000 taxpayers at risk of identity theft (http://www.cbsnews.com/news/irs-identity-theft-online-hackers-social-security-number-get-transcript/).
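To state the guarantee just described more precisely: under the standard definition (Dwork 2006; the notation here is ours), a randomized mechanism \(M\) is \(\varepsilon\)-differentially private if, for every pair of databases \(D\) and \(D'\) differing in a single record, and for every set \(S\) of possible outputs,

\[
\Pr[M(D) \in S] \;\leq\; e^{\varepsilon}\, \Pr[M(D') \in S].
\]

A smaller \(\varepsilon\) yields a stronger privacy guarantee but noisier answers; for instance, a common mechanism answers a numeric query \(f\) by releasing \(f(D)\) plus Laplace noise with scale \(\Delta f / \varepsilon\), where \(\Delta f\) is the query's sensitivity. This is one formal expression of the risk/utility trade-off discussed above.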
lection of more information, from a greater Essentially, data utility and risks of disclosure number of sources, than had previously are correlated (Duncan and Stokes 2004). been authorized in criminal or foreign intel- Even statistical techniques meant to pro- ligence investigations. The PATRIOT Act tect data (such as the technique of differen- enables greater access to records showing an tial privacy, which attempts to minimize the individual’s spending and communications, risks of reidentification of records in a sta- including e-mail and telephone conversa- tistical database while maximizing the accu- tions.33 Beginning in June 2013, a series of racy of queries from such database) still face disclosures by former CIA employee and risk/utility trade-offs (Fienberg, Rinaldo, and Yang 2010), not just for firms but also for researchers (Komarova, Nekipelov, and 32 Similarly, the problems of data breaches and iden- tity theft discussed in section 3.7 do not arise only in the Yakovlev 2015). Furthermore, even pro- context of firms’ databases, but also governmental ones: in tected (or anonymized, or deidentified) data the last few years, a number of large-scale data breaches involved governmental data, such as the loss of 26.5 million records of veterans, their spouses, and active-duty military 31 European Commission, Proposal for a Regulation personnel in 2006, or the recent IRS data breach that has of the European Parliament and of the Council on the put as many as 724,000 taxpayers at risk of identity theft Protection of Individuals with regard to the Processing of (http://www.cbsnews.com/news/irs-identity-theft-online- Personal Data and on the Free Movement of Such Data hackers-social-security-number-get-transcript/). (General Data Protection Regulation) COM(2012) 11 33 See Congress CRS Report, 2011, at http://www.fas. final, 2012/0011 (COD), January 25, 2012. org/sgp/crs/intel/R40980.pdf. Acquisti et al.: The Economics of Privacy 483 contractor Edward Snowden of thousands materially affecting US businesses. How to of classified documents involving data col- devise strategies that balance privacy pro- lection by the National Security Agency tection, benefits from information sharing, triggered a massive wave of public con- business interests, and national security will cern about privacy and governmental over- likely remain a thorny and yet vital subject of reach (Marthews and Tucker 2015). The research for years to come. report of the US Administration’s Review 4.4 Conclusions Group on Intelligence and Communication Technologies in December 2013 states One of the themes emerging from this that “excessive surveillance and unjustified review is that both the sharing and the pro- secrecy can threaten civil liberties, pub- tecting of personal data can have positive lic trust, and the core processes of demo- and negative consequences at both the indi- cratic self-government,” whereas in “an era vidual and societal levels. On the one hand, increasingly dominated by technological personal information has both private and advances in communication technologies, commercial value, and the sharing of data— the United States must continue to col- as highlighted by both theoretical models lect signals intelligence globally in order to and empirical studies—may reduce frictions assure the safety of our citizens.” 34 Together, in the market and facilitate transactions. 
On the report states, the US government “must the other hand, the claimed societal benefits protect, at once, two different forms of secu- of data sharing have not always been vet- rity: national security and personal privacy.” ted and confirmed. For instance, the ability The report concludes that the “govern- of Google Flu Trends to correctly estimate ment should base its decisions on a careful influenza activity—which has often been analysis of consequences, including both heralded as an example of the power of big benefits and costs (to the extent feasible).” data (Goel et al. 2010), and which we cited Surveillance does not only have implications in the introduction of this review—was later with respect to civil liberties, but also with challenged (Butler 2013). Researchers have respect to economic interests, from those of mentioned “Big Data Hubris” (Lazer et al. firms to those of other nations. These impli- 2014) in reference to the “implicit assump- cations can give rise to disputes, including a tion that big data are a substitute for, rather 2016 legal dispute between Apple Inc. and than a supplement to, traditional data col- the FBI regarding bypassing security mech- lection and analysis.” In fact, exploiting the anisms to gain access to a terrorist’s cell- commercial value of data can often entail a phone, and the European Union’s Court of reduction in private utility, and sometimes Justice ruling in 2015 that the Safe Harbor even in social welfare overall. Thus, consum- agreements are invalid in response, at least ers have good reasons to be concerned about in part, to the Snowden revelations (Finley unauthorized commercial application of 2015). These Safe Harbor agreements had their private information. Use of individual thus far enabled US companies to comply data may subject an individual to a variety of with privacy laws protecting EU citizens personally costly practices, including price by regulating the way US companies would discrimination in retail markets, quantity dis- handle their data. The loss of confidence in crimination in insurance and credit markets, US firms from EU consumers may end up spam, and risk of identity theft, in addition to the disutility inherent in just not knowing 34 See https://www.whitehouse.gov/sites/default/files/ who knows what or how they will use it in the docs/2013-12-12_rg_final_report.pdf. future. Personal data—like all information— 484 Journal of Economic Literature, Vol. LIV (June 2016) is easily stored, replicated, and transferred, history, and political opinions. While, overall, and regulating its acquisition and dissemina- these technologies seemingly leave privacy tion is a challenging undertaking for individ- choices in the hands of consumers, many uals and governments alike. (if not most) consumers, in practice, lack Given the fundamentally sensitive nature the awareness and technical sophistication of personal data, it is not surprising that required to protect and regulate the multiple advancements in information technology dimensions of their personal information. and increased globalization of trade, invest- Privacy-invasive technological services have ment, information flows, and security threats become integral to every-day communica- have brought concerns over the erosion of tions, job searches, and general consump- personal privacy to the forefront of public tion. At the same time, ­privacy-protecting debate. 
Numerous Internet firms have collected large amounts of data from their users and either sell this data or use it to enable advertisers to target and personalize ads. While consumers can and do benefit from targeted product recommendations (Anand and Shachar 2009), they also can and do incur substantial monetary costs and disutilities from violations of their privacy (Stone 2010). Such concerns have led to new regulations across world governments, some protecting privacy (e.g., the EU Data Protection Directive, the US Children's Online Privacy Protection Act), some legalizing its erosion (for instance, by allowing trade in personal information under certain circumstances; see, e.g., the US Gramm–Leach–Bliley Act of 1999), and some suggesting the implementation of additional opt-in and opt-out controls for users (e.g., the US Federal Trade Commission's 2012 online privacy guidelines35).

35 See http://www.ftc.gov/os/2012/03/120326privacyreport.pdf.

With regulations struggling to keep pace, industry competition has been behind both new privacy-enhancing and privacy-invasive technologies. New search engines, social networks, ecommerce websites, web browsers, and individualized controls for privacy-conscious consumers have emerged. Concurrently, social media services have facilitated a culture of disclosure: disclosure of one's activities, location, emotions, work history, and political opinions. While, overall, these technologies seemingly leave privacy choices in the hands of consumers, many (if not most) consumers, in practice, lack the awareness and technical sophistication required to protect and regulate the multiple dimensions of their personal information. Privacy-invasive technological services have become integral to everyday communications, job searches, and general consumption. At the same time, privacy-protecting services require additional levels of user effort and know-how, which limits their efficacy, especially within some of the most vulnerable segments of the population.

As noted in section 3.4, extracting economic value from data and protecting privacy do not need to be antithetical goals. The economic literature we have examined clearly suggests that the extent to which personal information should be protected or shared to maximize individual or societal welfare is not a one-size-fits-all problem: the optimal balancing of privacy and disclosure is very much context dependent, and it changes from scenario to scenario. In fact, privacy guarantees may be most needed precisely when the goal is to extract benefits from the data. In the health-care realm, for instance, if privacy risks are not addressed, public concern might end up outweighing public support for initiatives that rely on extensive collection of patients' medical records (Kohane 2015). Thus, it stands to reason that, case by case, diverse combinations of regulatory interventions, technological solutions, and economic incentives could ensure the balancing of protection and sharing that increases individual and societal welfare.

We have, in this article, attempted to survey and rationalize the extant research on the economics of privacy. Because privacy is a multifaceted concept, our survey has delved into numerous literatures across a variety of disciplines and fields, from marketing to economics to computer science. While this study is certainly not exhaustive, we believe it highlights some of the most relevant historical and current research on the topic. It is, however, abundantly evident that the protection of personal privacy is rapidly emerging
as one of the most significant public policy Anderson, Eric T., and Duncan I. Simester. 2010. “Price issues, and research on the economics of pri- Stickiness and Customer Antagonism.” Quarterly vacy will, therefore, continue to expand and Journal of Economics 125 (2): 729–65. Anderson, Keith B., Erik Durbin, and Michael A. Salin- evolve in coming years. ger. 2008. “Identity Theft.” Journal of Economic Per- spectives 22 (2): 171–92. References Anderson, Ross, and Tyler Moore. 2006. “The Eco- nomics of Information Security.” Science 314 (5799): Acquisti, Alessandro. 2004. “Privacy in Electronic 610–13. Commerce and the Economics of Immediate Grati- Anderson, Simon P., and Andre de Palma. 2012. “Com- fication.” In EC ’04: Proceedings of the 5th ACM Con- petition for Attention in the Information (Overload) ference on Electronic Commerce, 21–29. New York: Age.” RAND Journal of Economics 43 (1): 1–25. Association for Computing Machinery. Ariès, P. 1987. A History of Private Life. Cambridge, Acquisti, Alessandro, Laura Brandimarte, and George MA: Harvard University Press. Loewenstein. 2015. “Privacy and Human Behavior in Armstrong, Mark, John Vickers, and Jidong Zhou. 2009. the Age of Information.” Science 347 (6221): 509–14. “Consumer Protection and the Incentive to Become Acquisti, Alessandro, and Christina M. Fong. 2013. Informed.” Journal of the European Economic Associ- “An Experiment in Hiring Discrimination via Online ation 7 (2–3): 399–410. Social Networks.” Paper presented at the Twelfth Armstrong, Mark, and Jidong Zhou. 2010. “Conditioning Workshop on the Economics of Information Security, Prices on Search Behaviour.” Munich Personal RePEc Washington, DC, June 12. Archive Paper 19985. Acquisti, Alessandro, Allan Friedman, and Rahul Tel- Arrow, Kenneth J. 1962. “The Economic Implications of ang. 2006. “Is There a Cost to Privacy Breaches? Learning By Doing.” Review of Economic Studies 29 An Event Study.” Presented at the Twenty-Seventh (3): 155–73. International Conference on Information Systems, Asplund, Marcus, Rickard Eriksson, and Niklas Strand. Milwaukee 2006 and Workshop on the Economics of 2008. “Price Discrimination in Oligopoly: Evidence Information Security 2006. from Regional Newspapers.” Journal of Industrial Acquisti, Alessandro, and Ralph Gross. 2006. “Imagined Economics 56 (2): 333–46. Communities: Awareness, Information Sharing, and Athey, Susan, Emilio Calvano, and Joshua Gans. 2013. Privacy on the Facebook.” Privacy Enhancing Tech- “The Impact of the Internet on Advertising Markets niques 4258: 36–58. for News Media.” University of Toronto Rotman Acquisti, Alessandro, and Ralph Gross. 2009. “Predicting School of Management Working Paper 2180851. Social Security Numbers from Public Data.” Proceed- Athey, Susan, and Joshua S. Gans. 2010. “The Impact ings of the National Academy of Sciences 106 (27): of Targeting Technology on Advertising Markets and 10975–80. Media Competition.” American Economic Review 100 Acquisti, Alessandro, Leslie K. John, and George Loe- (2): 608–13. wenstein. 2013. “What Is Privacy Worth?” Journal of Bapna, Ravi, Jui Ramaprasad, Galit Shmueli, and Legal Studies 42 (2): 249–74. Akhmed Umyarov. Forthcoming. “One-Way Mirrors Acquisti, Alessandro, and Hal R. Varian. 2005. “Con- in Online Dating: A Randomized Field Experiment.” ditioning Prices on Purchase History.” Marketing Management Science. ­Science 24 (3): 367–81. Baumer, David L., Julia B. Earp, and J. C. Poindexter. Adjerid, Idris, Alessandro Acquisti, Laura Brandimarte, 2004. 
“Internet Privacy Law: A Comparison between and . 2013. “Sleights of Privacy: the United States and the European Union.” Comput- Framing, Disclosures, and the Limits of Transpar- ers and Security 23 (5): 400–412. ency.” In Proceedings of the Ninth Symposium on Baye, Michael R., and John Morgan. 2001. “Information Usable Privacy and Security, 1–11. New York: Associ- Gatekeepers on the Internet and the Competitiveness ation for Computing Machinery. of Homogeneous Product Markets.” American Eco- Adjerid, Idris, Alessandro Acquisti, Rahul Telang, Rema nomic Review 91 (3): 454–74. Padman, and Julia Adler-Milstein. Forthcoming. “The Bendrath, Ralf, and Milton Mueller. 2011. “The End of Impact of Privacy Regulation and Technology Incen- the Net as We Know It? Deep Packet Inspection and tives: The Case of Health Information Exchanges.” Internet Governance.” News Media & Society 13 (7): Management Science. 1142–60. Akerlof, George A. 1970. “The Market for ‘Lemons’: Bennett, Colin J., and Charles D. Raab. 2006. The Quality Uncertainty and the Market Mechanism.” ­Governance of Privacy: Policy Instruments in Global 486 Journal of Economic Literature, Vol. LIV (June 2016)

Berendt, Bettina, Oliver Gunther, and Sarah Spiekermann. 2005. “Privacy in e-Commerce: Stated Preferences vs. Actual Behavior.” Communications of the ACM 48 (4): 101–06.
Bergemann, Dirk, and Alessandro Bonatti. 2011. “Targeting in Advertising Markets: Implications for Offline versus Online Media.” RAND Journal of Economics 42 (3): 417–43.
Bergemann, Dirk, and Alessandro Bonatti. 2015. “Selling Cookies.” American Economic Journal: Microeconomics 7 (3): 259–94.
Berners-Lee, Tim. 2000. Weaving the Web: The Original Design and Ultimate Destiny of the World Wide Web. New York: HarperCollins Publishers.
Berthold, Stefan, and Rainer Böhme. 2010. “Valuating Privacy with Option Pricing Theory.” In Economics of Information Security and Privacy, edited by Tyler Moore, David Pym, and Christos Ioannidis, 187–209. New York: Springer.
Bertrand, Marianne, and Sendhil Mullainathan. 2004. “Are Emily and Greg More Employable than Lakisha and Jamal? A Field Experiment on Labor Market Discrimination.” American Economic Review 94 (4): 991–1013.
Blake, Thomas, Chris Nosko, and Steven Tadelis. 2015. “Consumer Heterogeneity and Paid Search Effectiveness: A Large-Scale Field Experiment.” Econometrica 83 (1): 155–74.
Blanchette, Jean-Francois, and Deborah G. Johnson. 2002. “Data Retention and the Panoptic Society: The Social Benefits of Forgetfulness.” Information Society 18 (1): 33–45.
Blattberg, Robert C., and John Deighton. 1991. “Interactive Marketing: Exploiting the Age of Addressability.” MIT Sloan Management Review 33 (1): 5–14.
Blumstein, Alfred, and Kiminori Nakamura. 2009. “Redemption in the Presence of Widespread Criminal Background Checks.” Criminology 47 (2): 327–59.
Board, Simon, and Jay Lu. 2015. “Competitive Information Disclosure in Search Markets.” Unpublished.
Böhme, Rainer, Nicolas Christin, Benjamin Edelman, and Tyler Moore. 2015. “Bitcoin: Economics, Technology, and Governance.” Journal of Economic Perspectives 29 (2): 213–38.
Bottero, Margherita, and Giancarlo Spagnolo. 2013. “Limited Credit Records and Market Outcomes.” Bank of Italy Working Paper 903.
Boyd, Danah, and Alice Marwick. 2011. “Social Steganography: Privacy in Networked Publics.” Paper presented at the International Communication Association Conference, Boston, MA, May 28.
Brandimarte, Laura, Alessandro Acquisti, and George Loewenstein. 2013. “Misplaced Confidences: Privacy and the Control Paradox.” Social Psychological and Personality Science 4 (3): 340–47.
Burke, Jeremy M., Curtis R. Taylor, and Liad Wagman. 2012. “Information Acquisition in Competitive Markets: An Application to the US Mortgage Market.” American Economic Journal: Microeconomics 4 (4): 65–106.
Burton, Paul R., et al. 2007. “Genome-Wide Association Study of 14,000 Cases of Seven Common Diseases and 3,000 Shared Controls.” Nature 447 (7145): 661–78.
Bushway, Shawn D. 2004. “Labor Market Effects of Permitting Employer Access to Criminal History Records.” Journal of Contemporary Criminal Justice 20 (3): 276–91.
Butler, Declan. 2013. “When Google Got Flu Wrong.” Nature 494 (7436): 155.
Calo, Ryan. 2011. “The Boundaries of Privacy Harm.” Indiana Law Journal 86 (3): 1131–62.
Calo, Ryan. 2014. “Digital Market Manipulation.” George Washington Law Review 82 (4): 995–1051.
Calzolari, Giacomo, and Alessandro Pavan. 2006. “On the Optimality of Privacy in Sequential Contracting.” Journal of Economic Theory 130 (1): 168–204.
Campbell, James, Avi Goldfarb, and Catherine Tucker. 2015. “Privacy Regulation and Market Structure.” Journal of Economics and Management Strategy 24 (1): 47–73.
Campbell, Katherine, Lawrence A. Gordon, Martin P. Loeb, and Lei Zhou. 2003. “The Economic Cost of Publicly Announced Information Security Breaches: Empirical Evidence from the Stock Market.” Journal of Computer Security 11 (3): 431–48.
Cavusoglu, Huseyin, Birendra Mishra, and Srinivasan Raghunathan. 2004. “The Effect of Internet Security Breach Announcements on Market Value: Capital Market Reactions for Breached Firms and Internet Security Developers.” International Journal of Electronic Commerce 9 (1): 69–104.
Chellappa, Ramnath K., and Shivendu Shivendu. 2010. “Mechanism Design for ‘Free’ but ‘No Free Disposal’ Services: The Economics of Personalization under Privacy Concerns.” Management Science 56 (10): 1766–80.
Chen, Yongmin. 1997. “Paying Customers to Switch.” Journal of Economics and Management Strategy 6 (4): 877–97.
Chen, Yuxin, and Ganesh Iyer. 2002. “Consumer Addressability and Customized Pricing.” Marketing Science 21 (2): 197–208.
Chen, Yuxin, Chakravarthi Narasimhan, and Z. John Zhang. 2001. “Individual Marketing with Imperfect Targetability.” Marketing Science 20 (1): 23–41.
Chen, Yuxin, and Z. John Zhang. 2009. “Dynamic Targeted Pricing with Strategic Consumers.” International Journal of Industrial Organization 27 (1): 43–50.
Cheney, Julia S. 2005. “Identity Theft: Do Definitions Still Matter?” Federal Reserve Bank of Philadelphia Payments Cards Center Discussion Paper 05-10.
Chiou, Lesley, and Catherine Tucker. 2014. “Search Engines and Data Retention: Implications for Privacy and Antitrust.” Massachusetts Institute of Technology Sloan School Working Paper 5094-14.
Coase, Ronald H. 1960. “The Problem of Social Cost.” Journal of Law and Economics 3: 1–44.
Coggeshall, Stephen. 2007. “ID Theft Knows No Boundaries.” http://www.technewsworld.com/story/56864.html.
Conitzer, Vincent, Curtis R. Taylor, and Liad Wagman. 2012. “Hide and Seek: Costly Consumer Privacy in a Market with Repeat Purchases.” Marketing Science 31 (2): 277–92.
Cowan, Simon. 2007. “The Welfare Effects of Third-Degree Price Discrimination with Nonlinear Demand Functions.” RAND Journal of Economics 38 (2): 419–28.
Culnan, Mary J., and Pamela K. Armstrong. 1999. “Information Privacy Concerns, Procedural Fairness, and Impersonal Trust: An Empirical Investigation.” Organization Science 10 (1): 104–15.
Daughety, Andrew F., and Jennifer F. Reinganum. 2010. “Public Goods, Social Pressure, and the Choice between Privacy and Publicity.” American Economic Journal: Microeconomics 2 (2): 191–221.
de Corniere, Alexandre. Forthcoming. “Search Advertising.” American Economic Journal: Microeconomics.
de Corniere, Alexandre, and Romain De Nijs. 2014. “Online Advertising and Privacy.” http://papers.ssrn.com/sol3/papers.cfm?abstract_id=2191124.
Diamond, Peter. 1971. “A Model of Price Adjustment.” Journal of Economic Theory 3 (2): 156–68.
Dinev, Tamara, and Paul Hart. 2006. “An Extended Privacy Calculus Model for e-Commerce Transactions.” Information Systems Research 17 (1): 61–80.
Djankov, Simeon, Caralee McLiesh, and Andrei Shleifer. 2007. “Private Credit in 129 Countries.” Journal of Financial Economics 84 (2): 299–329.
Dugas, Andrea Freyer, et al. 2012. “Google Flu Trends: Correlation with Emergency Department Influenza Rates and Crowding Metrics.” Clinical Infectious Diseases 54 (4): 463–69.
Duncan, George T., and S. Lynne Stokes. 2004. “Disclosure Risk vs. Data Utility: The R–U Confidentiality Map as Applied to Topcoding.” Chance 17 (3): 16–20.
Dwork, Cynthia. 2006. “Differential Privacy.” In Automata, Languages and Programming, edited by Michele Bugliesi et al., 1–12. Berlin: Springer.
Dziuda, Wioletta, and Ronen Gradwohl. 2015. “Achieving Cooperation under Privacy Concerns.” American Economic Journal: Microeconomics 7 (3): 142–73.
Edelman, Benjamin, and Michael Luca. 2014. “Digital Discrimination: The Case of Airbnb.com.” Harvard Business School Negotiation, Organizations & Markets Unit Working Paper 14-054.
Einav, Liran, Mark Jenkins, and Jonathan Levin. 2013. “The Impact of Credit Scoring on Consumer Lending.” RAND Journal of Economics 44 (2): 249–74.
Evans, David S. 2009. “The Online Advertising Industry: Economics, Evolution, and Privacy.” Journal of Economic Perspectives 23 (3): 37–60.
Executive Office of the President, President’s Council of Advisors on Science and Technology. 2014. “Report to the President: Big Data and Privacy: A Technological Perspective.” Washington, DC: Executive Office of the President.
Farahat, Ayman, and Michael Bailey. 2012. “How Effective Is Targeted Advertising?” In Proceedings of the 21st International Conference on World Wide Web, 111–20. New York: Association for Computing Machinery.
Farrell, Joseph. 1987. “Information and the Coase Theorem.” Journal of Economic Perspectives 1 (2): 113–29.
Farrell, Joseph. 2012. “Can Privacy Be Just Another Good?” Journal on Telecommunications and High Technology Law 10 (2): 251–64.
Federal Trade Commission. 2012. “Protecting Consumer Privacy in an Era of Rapid Change: Recommendations for Businesses and Policymakers.” https://www.ftc.gov/reports/protecting-consumer-privacy-era-rapid-change-recommendations-businesses-policymakers.
Feri, Francesco, Caterina Giannetti, and Nicola Jentzsch. 2016. “Disclosure of Personal Information under Risk of Privacy Shocks.” Journal of Economic Behavior and Organization 123: 138–48.
Fienberg, Stephen E., Alessandro Rinaldo, and Xiaolin Yang. 2010. “Differential Privacy and the Risk-Utility Tradeoff for Multi-dimensional Contingency Tables.” Privacy in Statistical Databases 6344: 187–99.
Finley, Klint. 2015. “Thank (or Blame) Snowden for Europe’s Big Privacy Ruling.” Wired. http://www.wired.com/2015/10/tech-companies-can-blame-snowden-data-privacy-decision/.
Fishbein, Martin, and Icek Ajzen. 1975. Belief, Attitude, Intention, and Behavior: An Introduction to Theory and Research. Reading, MA and London: Addison–Wesley Publishing Company.
Friedman, Eric J., and Paul Resnick. 2001. “The Social Cost of Cheap Pseudonyms.” Journal of Economics and Management Strategy 10 (2): 173–99.
Fudenberg, Drew, and Jean Tirole. 1998. “Upgrades, Tradeins, and Buybacks.” RAND Journal of Economics 29 (2): 235–58.
Fudenberg, Drew, and Jean Tirole. 2000. “Customer Poaching and Brand Switching.” RAND Journal of Economics 31 (4): 634–57.
Garrie, Daniel B., and Rebecca Wong. 2006. “Demystifying Clickstream Data: A European and U.S. Perspective.” Emory International Law Review 20 (2): 563–89.
Gehrig, Thomas, and Rune Stenbacka. 2007. “Information Sharing and Lending Market Competition with Switching Costs and Poaching.” European Economic Review 51 (1): 77–99.
Gellman, Robert. 2002. “Privacy, Consumers, and Costs—How the Lack of Privacy Costs Consumers and Why Business Studies of Privacy Costs are Biased and Incomplete.” https://epic.org/reports/dmfprivacy.html.
Goel, Sharad, Jake M. Hofman, Sebastien Lahaie, David M. Pennock, and Duncan J. Watts. 2010. “Predicting Consumer Behavior with Web Search.” Proceedings of the National Academy of Sciences 107 (41): 17486–90.
Goh, Khim-Yong, Kai-Lung Hui, and Ivan P. L. Png. 2015. “Privacy and Marketing Externalities: Evidence from Do Not Call.” Management Science 61 (12): 2982–3000.
Goldberg, Ian. 2003. “Privacy-Enhancing Technologies for the Internet, II: Five Years Later.” In Privacy Enhancing Technologies, edited by Roger Dingledine and Paul Syverson, 1–12. New York: Springer.
Goldfarb, Avi, and Catherine E. Tucker. 2011a. “Online Display Advertising: Targeting and Obtrusiveness.” Marketing Science 30 (3): 389–404.
Goldfarb, Avi, and Catherine E. Tucker. 2011b. “Privacy Regulation and Online Advertising.” Management Science 57 (1): 57–71.
Goldfarb, Avi, and Catherine E. Tucker. 2012a. “Privacy and Innovation.” Innovation Policy and the Economy 12 (1): 65–90.
Goldfarb, Avi, and Catherine E. Tucker. 2012b. “Shifts in Privacy Concerns.” American Economic Review 102 (3): 349–53.
Goldin, Claudia, and Cecilia Rouse. 2000. “Orchestrating Impartiality: The Impact of ‘Blind’ Auditions on Female Musicians.” American Economic Review 90 (4): 715–41.
Gottlieb, Daniel, and Kent Smetters. 2011. “Grade Non-disclosure.” National Bureau of Economic Research Working Paper 17465.
Gradwohl, Ronen. 2014. “Voting in the Limelight.” Unpublished.
Gradwohl, Ronen. 2015. “Privacy in Implementation.” Unpublished.
Gradwohl, Ronen, and Omer Reingold. 2010. “Partial Exposure in Large Games.” Games and Economic Behavior 68 (2): 602–13.
Gradwohl, Ronen, and Rann Smorodinsky. 2014. “Perception Games and Privacy.” http://arxiv.org/abs/1409.1487.
Grossman, Gene M., and Carl Shapiro. 1984. “Informative Advertising with Differentiated Products.” Review of Economic Studies 51 (1): 63–81.
Grossman, Sanford J. 1981. “The Informational Role of Warranties and Private Disclosure about Product Quality.” Journal of Law and Economics 24 (3): 461–83.
Hagiu, Andrei, and Bruno Jullien. 2011. “Why Do Intermediaries Divert Search?” RAND Journal of Economics 42 (2): 337–62.
Hann, Il-Horn, Kai Lung Hui, Yee-Lin Lai, Sang-Yong Tom Lee, and Ivan P. L. Png. 2006. “Who Gets Spammed?” Communications of the ACM 49 (10): 83–87.
Hann, Il-Horn, Kai Lung Hui, Sang-Yong Tom Lee, and Ivan P. L. Png. 2007. “Overcoming Online Information Privacy Concerns: An Information-Processing Theory Approach.” Journal of Management Information Systems 24 (2): 13–42.
Hann, Il-Horn, Kai Lung Hui, Sang-Yong Tom Lee, and Ivan P. L. Png. 2008. “Consumer Privacy and Marketing Avoidance: A Static Model.” Management Science 54 (6): 1094–1103.
Harrell, Erika. 2015. “Victims of Identity Theft, 2014.” Washington, DC: US Department of Justice Bureau of Justice Statistics.
Harris, Louis, and Alan F. Westin. 1992. “The Equifax Canada Report on Consumers and Privacy in the Information Age.” Anjou, Quebec: Equifax Canada.
Hayek, Friedrich A. 1945. “The Use of Knowledge in Society.” American Economic Review 35 (4): 519–30.
Heffetz, Ori, and Katrina Ligett. 2013. “Privacy and Data-Based Research.” National Bureau of Economic Research Working Paper 19433.
Hermalin, Benjamin E., and Michael L. Katz. 2006. “Privacy, Property Rights and Efficiency: The Economics of Privacy as Secrecy.” Quantitative Marketing and Economics 4 (3): 209–39.
Hillestad, Richard, et al. 2005. “Can Electronic Medical Record Systems Transform Health Care? Potential Health Benefits, Savings, and Costs.” Health Affairs 24 (5): 1103–17.
Hirshleifer, Jack. 1971. “The Private and Social Value of Information and the Reward to Inventive Activity.” American Economic Review 61 (4): 561–74.
Hirshleifer, Jack. 1980. “Privacy: Its Origin, Function, and Future.” Journal of Legal Studies 9 (4): 649–64.
Hitsch, Gunter J., Ali Hortaçsu, and Dan Ariely. 2010. “Matching and Sorting in Online Dating.” American Economic Review 100 (1): 130–63.
Hoffmann, Florian, Roman Inderst, and Marco Ottaviani. 2014. “Persuasion through Selective Disclosure: Implications for Marketing, Campaigning, and Privacy Regulation.” Unpublished.
Hoofnagle, Chris Jay, Ashkan Soltani, Nathaniel Good, and Dietrich J. Wambach. 2012. “Behavioral Advertising: The Offer You Cannot Refuse.” Harvard Law & Policy Review 6: 273.
Hoofnagle, Chris Jay, and Jennifer M. Urban. 2014. “Alan Westin’s Privacy Homo Economicus.” Wake Forest Law Review 49: 261.
Hui, Kai-Lung, and Ivan P. L. Png. 2006. “The Economics of Privacy.” In Handbooks in Information Systems: Volume 1: Economics and Information Systems, edited by Terrence Hendershott, 471–98. Bingley, UK: Emerald.
Jamal, Karim, Michael Maier, and Shyam Sunder. 2003. “Privacy in e-Commerce: Development of Reporting Standards, Disclosure, and Assurance Services in an Unregulated Market.” Journal of Accounting Research 41 (2): 285–309.
Jappelli, Tullio, and Marco Pagano. 2002. “Information Sharing, Lending and Defaults: Cross-Country Evidence.” Journal of Banking and Finance 26 (10): 2017–45.
Jensen, Carlos, and Colin Potts. 2004. “Privacy Policies as Decision-Making Tools: An Evaluation of Online Privacy Notices.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 471–78. New York: Association for Computing Machinery.
Jentzsch, Nicola. 2006. The Economics and Regulation of Financial Privacy. New York: Springer.
Jentzsch, Nicola, Soren Preibusch, and Andreas Harasser. 2012. “Study on Monetising Privacy: An Economic Model for Pricing Personal Information.” Berlin: European Network and Information Security Agency.
Jeong, Yuncheol, and Masayoshi Maruyama. 2009. “Commitment to a Strategy of Uniform Pricing in a Two-Period Duopoly with Switching Costs.” Journal of Economics 98 (1): 45–66.
Jernigan, Carter, and Behram F. T. Mistree. 2009. “Gaydar: Facebook Friendships Expose Sexual Orientation.” First Monday 14 (10).
Jing, Bing. 2011. “Pricing Experience Goods: The Effects of Customer Recognition and Commitment.” Journal of Economics and Management Strategy 20 (2): 451–73.
John, Leslie K., Alessandro Acquisti, and George Loewenstein. 2011. “Strangers on a Plane: Context-Dependent Willingness to Divulge Sensitive Information.” Journal of Consumer Research 37 (5): 858–73.
Johnson, Eric J., and Daniel G. Goldstein. 2004. “Defaults and Donation Decisions.” Transplantation 78 (12): 1713–16.
Johnson, Justin P. 2013. “Targeted Advertising and Advertising Avoidance.” RAND Journal of Economics 44 (1): 128–44.
Kahn, Charles M., James McAndrews, and William Roberds. 2000. “A Theory of Transactions Privacy.” Federal Reserve Bank of Atlanta Working Paper 2000-22.
Kahn, Charles M., James McAndrews, and William Roberds. 2005. “Money Is Privacy.” International Economic Review 46 (2): 377–99.
Kahn, Charles M., and William Roberds. 2008. “Credit and Identity Theft.” Journal of Monetary Economics 55 (2): 251–64.
Kang, Ruogu, Stephanie Brown, and Sara Kiesler. 2013. “Why Do People Seek Anonymity on the Internet? Informing Policy and Design.” In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, 2657–66. New York: Association for Computing Machinery.
Kearns, Michael, Mallesh M. Pai, Aaron Roth, and Jonathan Ullman. 2014. “Mechanism Design in Large Games: Incentives and Privacy.” American Economic Review 104 (5): 431–35.
Kim, Byung-Cheol, and Jay Pil Choi. 2010. “Customer Information Sharing: Strategic Incentives and New Implications.” Journal of Economics and Management Strategy 19 (2): 403–33.
Kim, Jin-Hyuk, and Liad Wagman. 2015. “Screening Incentives and Privacy Protection in Financial Markets: A Theoretical and Empirical Analysis.” RAND Journal of Economics 46 (1): 1–22.
Kim, Jin-Hyuk, Liad Wagman, and Abraham Wickelgren. 2016. “The Impact of Access to Consumer Data on the Competitive Effects of Horizontal Mergers.” http://ssrn.com/abstract=2728378.
Knight, Frank H. 1921. Risk, Uncertainty, and Profit. New York: Hart, Schaffner and Marx.
Ko, Myung, and Carlos Dorantes. 2006. “The Impact of Information Security Breaches on Financial Performance of the Breached Firms: An Empirical Investigation.” Journal of Information Technology Management 17 (2): 13–22.
Kohane, Isaac S. 2015. “Ten Things We Have to Do to Achieve Precision Medicine.” Science 349 (6243): 37–38.
Komarova, Tatiana, Denis Nekipelov, and Evgeny Yakovlev. 2015. “Estimation of Treatment Effects from Combined Data: Identification versus Data Security.” In Economic Analysis of the Digital Economy, edited by Avi Goldfarb, Shane Greenstein, and Catherine Tucker, 279–308. Chicago and London: University of Chicago Press.
Koops, B.-J., et al. 2006. “Should Self-Regulation Be the Starting Point?” In Starting Points for ICT Regulation: Deconstructing Prevalent Policy One-Liners, edited by B.-J. Koops et al., 121. The Hague: TMC Asser Press.
Kramer, Adam D. I., Jamie E. Guillory, and Jeffrey T. Hancock. 2014. “Experimental Evidence of Massive-Scale Emotional Contagion through Social Networks.” Proceedings of the National Academy of Sciences 111 (24): 8788–90.
Krishnamurthy, Balachander, and Craig Wills. 2009. “Privacy Diffusion on the Web: A Longitudinal Perspective.” In Proceedings of the 18th International Conference on World Wide Web, 541–50. New York: Association for Computing Machinery.
Kroll, Joshua A., Ian C. Davey, and Edward W. Felten. 2013. “The Economics of Bitcoin Mining, or Bitcoin in the Presence of Adversaries.” Presented at the Twelfth Workshop on the Economics of Information Security, Washington, DC, June 11.
Lambrecht, Anja, and Catherine Tucker. 2013. “When Does Retargeting Work? Information Specificity in Online Advertising.” Journal of Marketing Research 50 (5): 561–76.
Lane, Julia, Victoria Stodden, Stefan Bender, and Helen Nissenbaum, eds. 2014. Privacy, Big Data, and the Public Good: Frameworks for Engagement. Cambridge and New York: Cambridge University Press.
Laudon, Kenneth C. 1996. “Markets and Privacy.” Communications of the ACM 39 (9): 92–104.
Laudon, Kenneth C. 1997. “Extensions to the Theory of Markets and Privacy: Mechanics of Pricing Information.” New York University Stern School of Business Working Paper IS-97-4.
Laufer, Robert S., and Maxine Wolfe. 1977. “Privacy as a Concept and a Social Issue: A Multidimensional Developmental Theory.” Journal of Social Issues 33 (3): 22–42.
Lazer, David, Ryan Kennedy, Gary King, and Alessandro Vespignani. 2014. “The Parable of Google Flu: Traps in Big Data Analysis.” Science 343 (6176): 1203–05.
Levin, Jonathan, and Paul Milgrom. 2010. “Online Advertising: Heterogeneity and Conflation in Market Design.” American Economic Review 100 (2): 603–07.
Lewis, Kevin. 2013. “The Limits of Racial Prejudice.” Proceedings of the National Academy of Sciences 110 (47): 18814–19.
Lewis, Randall A., and David H. Reiley. 2014. “Online Ads and Offline Sales: Measuring the Effect of Retail Advertising via a Controlled Experiment on Yahoo!” Quantitative Marketing and Economics 12 (3): 235–66.
Litman, Jessica. 2000. “Information Privacy/Information Property.” Stanford Law Review 52.
Lohr, Steve. 2010. “Privacy Concerns Limit Online Ads, Study Says.” http://bits.blogs.nytimes.com/2010/04/30/privacy-concerns-limit-online-ads-study-says/.
Madden, Mary, and Lee Rainie. 2015. “Americans’ Attitudes about Privacy, Security and Surveillance.” http://www.pewinternet.org/2015/05/20/americans-attitudes-about-privacy-security-and-surveillance/.
Malhotra, Naresh K., Sung S. Kim, and James Agarwal. 2004. “Internet Users’ Information Privacy Concerns (IUIPC): The Construct, the Scale, and a Causal Model.” Information Systems Research 15 (4): 336–55.
Mann, Catherine L. 2015. “Information Lost: Will the ‘Paradise’ That Information Promises, to Both Consumer and Firm, Be ‘Lost’ on Account of Data Breaches? The Epic Is Playing Out.” In Economic Analysis of the Digital Economy, edited by Avi Goldfarb, Shane Greenstein, and Catherine Tucker, 309–51. Chicago and London: University of Chicago Press.
Marthews, Alex, and Catherine Tucker. 2015. “Government Surveillance and Internet Search Behavior.” http://dx.doi.org/10.2139/ssrn.2412564.
Mayer, Jonathan R., and John C. Mitchell. 2012. “Third-Party Web Tracking: Policy and Technology.” In Proceedings of the 2012 IEEE Symposium on Security and Privacy, 413–27. Piscataway, NJ: Institute of Electrical and Electronics Engineers.
McDonald, Aleecia, and Lorrie Faith Cranor. 2008. “The Cost of Reading Privacy Policies.” I/S: A Journal of Law and Policy for the Information Society 4 (3): 540–65.
McDonald, Aleecia, and Lorrie Faith Cranor. 2010. “Beliefs and Behaviors: Internet Users’ Understanding of Behavioral Advertising.” Paper presented at the 2010 TPRC—43rd Research Conference on Communication, Information and Internet Policy, Arlington, VA, October 2.
Mikians, Jakub, Laszlo Gyarmati, Vijay Erramilli, and Nikolaos Laoutaris. 2012. “Detecting Price and Search Discrimination on the Internet.” In Proceedings of the 11th ACM Workshop on Hot Topics in Networks, 79–84. New York: Association for Computing Machinery.
Mikians, Jakub, Laszlo Gyarmati, Vijay Erramilli, and Nikolaos Laoutaris. 2013. “Crowd-Assisted Search for Price Discrimination in e-Commerce: First Results.” In Proceedings of the Ninth ACM Conference on Emerging Networking Experiments and Technologies, 1–6. New York: Association for Computing Machinery.
Milberg, Sandra J., Sandra J. Burke, H. Jeff Smith, and Ernest A. Kallman. 1995. “Values, Personal Information Privacy, and Regulatory Approaches.” Communications of the ACM 38 (12): 65–74.
Milberg, Sandra J., H. Jeff Smith, and Sandra J. Burke. 2000. “Information Privacy: Corporate Management and National Regulation.” Organization Science 11 (1): 35–57.
Milgrom, Paul R. 1981. “Good News and Bad News: Representation Theorems and Applications.” Bell Journal of Economics 12 (2): 380–91.
Milgrom, Paul R., and John Roberts. 1986. “Relying on the Information of Interested Parties.” RAND Journal of Economics 17 (1): 18–32.
Miller, Amalia R., and Catherine E. Tucker. 2007. “Privacy Protection and Technology Diffusion: The Case of Electronic Medical Records.” Paper presented at the 2007 Workshop on the Economics of Information Security, Pittsburgh, PA, June 7.
Miller, Amalia R., and Catherine E. Tucker. 2009. “Privacy Protection and Technology Diffusion: The Case of Electronic Medical Records.” Management Science 55 (7): 1077–93.
Miller, Amalia R., and Catherine E. Tucker. 2011a. “Can Health Care Information Technology Save Babies?” Journal of Political Economy 119 (2): 289–324.
Miller, Amalia R., and Catherine E. Tucker. 2011b. “Encryption and the Loss of Patient Data.” Journal of Policy Analysis and Management 30 (3): 534–56.
Miller, Amalia R., and Catherine E. Tucker. 2014a. “Electronic Discovery and the Adoption of Information Technology.” Journal of Law, Economics, and Organization 30 (2): 217–43.
Miller, Amalia R., and Catherine E. Tucker. 2014b. “Privacy Protection, Personalized Medicine and Genetic Testing.” http://ssrn.com/abstract=2411230.
Murphy, Richard S. 1995. “Property Rights in Personal Information: An Economic Defense of Privacy.” Georgetown Law Journal 84: 2381.
Nagin, Daniel, and Joel Waldfogel. 1995. “The Effects of Criminality and Conviction on the Labor Market Status of Young British Offenders.” International Review of Law and Economics 15 (1): 109–26.
Narayanan, Arvind, and Vitaly Shmatikov. 2008. “Robust De-anonymization of Large Sparse Datasets.” In Proceedings of the 2008 IEEE Symposium on Security and Privacy, 111–25. Los Alamitos, CA: IEEE Computer Society.
Nissenbaum, Helen. 2004. “Privacy as Contextual Integrity.” Washington Law Review 79: 101–39.
Noam, Eli M. 1997. “Privacy and Self-Regulation: Markets for Electronic Privacy.” In Privacy and Self-Regulation in the Information Age. Washington, DC: US Department of Commerce, National Telecommunications and Information Administration.
Norberg, Patricia A., Daniel R. Horne, and David A. Horne. 2007. “The Privacy Paradox: Personal Information Disclosure Intentions versus Behaviors.” Journal of Consumer Affairs 41 (1): 100–126.
Nuffield Council on Bioethics. 2015. “The Collection, Linking and Use of Data in Biomedical Research and Health Care: Ethical Issues.” Nuffield: Nuffield Council on Bioethics.
Odlyzko, Andrew. 2003. “Privacy, Economics, and Price Discrimination on the Internet.” In Economics of Information Security, edited by L. Jean Camp and Stephen Lewis, 187–212. Norwell, MA: Kluwer Academic Publishers.
Ohm, Paul. 2010. “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization.” UCLA Law Review 57: 1701–77.
Olejnik, Lukasz, Tran Minh-Dung, and Claude Castelluccia. 2014. “Selling Off Privacy at Auction.” Paper presented at the ISOC Network and Distributed System Security Symposium, San Diego, CA, February 23–26.
Oster, Emily, Ira Shoulson, Kimberly Quaid, and E. Ray Dorsey. 2010. “Genetic Adverse Selection: Evidence from Long-Term Care Insurance and Huntington Disease.” Journal of Public Economics 94 (11–12): 1041–50.
Pagano, Marco, and Tullio Jappelli. 1993. “Information Sharing in Credit Markets.” Journal of Finance 48 (5): 1693–1718.
Pancras, Joseph, and K. Sudhir. 2007. “Optimal Marketing Strategies for a Customer Data Intermediary.” Journal of Marketing Research 44 (4): 560–78.
Pavan, Alessandro, and Giacomo Calzolari. 2009. “Sequential Contracting with Multiple Principals.” Journal of Economic Theory 144 (2): 503–31.
Png, Ivan P. L. 2010. “On the Value of Privacy from Telemarketing: Evidence from the ‘Do Not Call’ Registry.” Unpublished.
Posner, Richard A. 1978. “The Right of Privacy.” Georgia Law Review 12 (3): 393–422.
Posner, Richard A. 1981. “The Economics of Privacy.” American Economic Review 71 (2): 405–09.
Posner, Richard A. 1993. “Blackmail, Privacy, and Freedom of Contract.” University of Pennsylvania Law Review 141 (5): 1817–47.
Preibusch, Soren, Dorothea Kubler, and Alastair R. Beresford. 2013. “Price versus Privacy: An Experiment into the Competitive Advantage of Collecting Less Personal Information.” Electronic Commerce Research 13 (4): 423–55.
Qian, Jun, and Philip E. Strahan. 2007. “How Laws and Institutions Shape Financial Contracts: The Case of Bank Loans.” Journal of Finance 62 (6): 2803–34.
Rainie, Lee, Sara Kiesler, Ruogu Kang, and Mary Madden. 2013. “Anonymity, Privacy, and Security Online.” Washington, DC: Pew Research Center.
Rao, Justin M., and David H. Reiley. 2012. “The Economics of Spam.” Journal of Economic Perspectives 26 (3): 87–110.
Roberds, William, and Stacey L. Schreft. 2009. “Data Security, Privacy, and Identity Theft: The Economics behind the Policy Debates.” Federal Reserve Bank of Chicago Economic Perspectives First Quarter: 22–30.
Romanosky, Sasha, Rahul Telang, and Alessandro Acquisti. 2011. “Do Data Breach Disclosure Laws Reduce Identity Theft?” Journal of Policy Analysis and Management 30 (2): 256–86.
Rosen, Jeffrey. 2012. “The Right to Be Forgotten.” Stanford Law Review Online 64: 88.
Rubin, Paul H., and Thomas M. Lenard. 2001. Privacy and the Commercial Use of Personal Information. New York: Springer, Kluwer Academic.
Rust, Roland T., P. K. Kannan, and Na Peng. 2002. “The Customer Economics of Internet Privacy.” Journal of the Academy of Marketing Science 30 (4): 455–64.
Samuelson, Pamela. 2000. “Privacy as Intellectual Property.” Stanford Law Review 52: 1125–73.
Savage, Scott J., and Donald M. Waldman. 2013. “The Value of Online Privacy.” Unpublished.
Schenk, Eric, and Claude Guittard. 2011. “Toward a Characterization of Crowdsourcing Practices.” Journal of Innovation Economics and Management 7: 93–107.
Schoeman, Ferdinand David, ed. 1984. Philosophical Dimensions of Privacy: An Anthology. Cambridge and New York: Cambridge University Press.
Schoeman, Ferdinand David. 1992. Privacy and Social Freedom. Cambridge and New York: Cambridge University Press.
Schwartz, Paul M. 1997. “Privacy and the Economics of Personal Health Care Information.” Texas Law Review 76 (1): 1–75.
Schwartz, Paul M. 2004. “Property, Privacy, and Personal Data.” Harvard Law Review 117: 2056–2128.
Schwartz, Paul M., and Edward J. Janger. 2007. “Notification of Data Security Breaches.” Michigan Law Review 105: 913–84.
Shaffer, Greg, and Z. John Zhang. 1995. “Competitive Coupon Targeting.” Marketing Science 14 (4): 395–416.
Shaffer, Greg, and Z. John Zhang. 2002. “Competitive One-to-One Promotions.” Management Science 48 (9): 1143–60.
Shy, Oz, and Rune Stenbacka. Forthcoming. “Customer Privacy and Competition.” Journal of Economics and Management Strategy.
Smith, Richard M. 1999. “The Web Bug FAQ.” https://w2.eff.org/Privacy/Marketing/web_bug.html.
Smith, Robert Ellis. 2000. Ben Franklin’s Web Site: Privacy and Curiosity from Plymouth Rock to the Internet. Privacy Journal.
Solove, Daniel J. 2003. “Identity Theft, Privacy, and the Architecture of Vulnerability.” Hastings Law Journal 54: 1227–75.
Solove, Daniel J. 2006. “A Taxonomy of Privacy.” University of Pennsylvania Law Review 154 (3): 477–560.
Solove, Daniel J. 2007. The Future of Reputation: Gossip, Rumor, and Privacy on the Internet. New Haven and London: Yale University Press.
Solove, Daniel J. 2013. “Introduction: Privacy Self-management and the Consent Dilemma.” Harvard Law Review 126 (7): 1880–903.
Spence, Michael. 1973. “Job Market Signaling.” Quarterly Journal of Economics 87 (3): 355–74.
Spiegel, Yossi. 2013. “Commercial Software, Adware, and Consumer Privacy.” International Journal of Industrial Organization 31 (6): 702–13.
Stigler, George J. 1961. “The Economics of Information.” Journal of Political Economy 69 (3): 213–25.
Stigler, George J. 1962. “Information in the Labor Market.” Journal of Political Economy 70 (5): 94–105.
Stigler, George J. 1980. “An Introduction to Privacy in Economics and Politics.” Journal of Legal Studies 9 (4): 623–44.
Stone, Brad. 2010. “Ads Posted on Facebook Strike Some as Off-Key.” New York Times, March 4, B1.
Stone, E. F., and D. L. Stone. 1990. “Privacy in Organizations: Theoretical Issues, Research Findings, and Protection Mechanisms.” Research in Personnel and Human Resources Management 8 (3): 349–411.
Strahilevitz, Lior Jacob. 2008. “Privacy versus Antidiscrimination.” University of Chicago Law Review 75 (1): 363–81.
Stutzman, Fred, Ralph Gross, and Alessandro Acquisti. 2012. “Silent Listeners: The Evolution of Privacy and Disclosure on Facebook.” Journal of Privacy and Confidentiality 4 (2): 7–41.
Sweeney, Latanya. 1997. “Weaving Technology and Policy Together to Maintain Confidentiality.” Journal of Law, Medicine and Ethics 25 (2–3): 98–110.
Sweeney, Latanya. 2013. “Discrimination in Online Ad Delivery: Google Ads, Black Names and White Names, Racial Discrimination, and Click Advertising.” ACM Queue 11 (3): 10.
Swire, Peter P., and Robert E. Litan. 1998. None of Your Business: World Data Flows, Electronic Commerce, and the European Privacy Directive. Washington, DC: Brookings Institution Press.
Tamir, Diana I., and Jason P. Mitchell. 2012. “Disclosing Information about the Self Is Intrinsically Rewarding.” Proceedings of the National Academy of Sciences 109 (21): 8038–43.
Tang, Zhulei, Yu Hu, and Michael D. Smith. 2008. “Gaining Trust through Online Privacy Protection: Self-Regulation, Mandatory Standards, or Caveat Emptor.” Journal of Management Information Systems 24 (4): 153–73.
Taylor, Curtis R. 2004. “Consumer Privacy and the Market for Customer Information.” RAND Journal of Economics 35 (4): 631–50.
Taylor, Curtis R., and Liad Wagman. 2014. “Consumer Privacy in Oligopolistic Markets: Winners, Losers, and Welfare.” International Journal of Industrial Organization 34: 80–84.
Telang, Rahul, and Sunil Wattal. 2007. “An Empirical Analysis of the Impact of Software Vulnerability Announcements on Firm Stock Price.” IEEE Transactions on Software Engineering 33 (8): 544–57.
Tsai, Janice Y., Serge Egelman, Lorrie Cranor, and Alessandro Acquisti. 2011. “The Effect of Online Privacy Information on Purchasing Behavior: An Experimental Study.” Information Systems Research 22 (2): 254–68.
Tucker, Catherine E. 2012. “The Economics of Advertising and Privacy.” International Journal of Industrial Organization 30 (3): 326–29.
Tucker, Catherine E. 2014. “Social Networks, Personalized Advertising, and Privacy Controls.” Journal of Marketing Research 51 (5): 546–62.
Turow, Joseph, Jennifer King, Chris Jay Hoofnagle, Amy Bleakley, and Michael Hennessy. 2009. “Americans Reject Tailored Advertising and Three Activities That Enable It.” http://repository.upenn.edu/cgi/viewcontent.cgi?article=1138&context=asc_papers.
Valentino-DeVries, Jennifer, Jeremy Singer-Vine, and Ashkan Soltani. 2012. “Websites Vary Prices, Deals Based on Users’ Information.” Wall Street Journal, December 24.
Van Alstyne, Marshall W. 2007. “Curing Spam: Rights, Signals & Screens.” The Economists’ Voice 4 (2).
Van Zandt, Timothy. 2004. “Information Overload in a Network of Targeted Communication.” RAND Journal of Economics 35 (3): 542–60.
Varian, Hal R. 1997. “Economic Aspects of Personal Privacy.” In Privacy and Self-Regulation in the Information Age. Washington, DC: US Department of Commerce, National Telecommunications and Information Administration.
Varian, Hal R. 2010. “Computer Mediated Transactions.” American Economic Review 100 (2): 1–10.
Varian, Hal R., Fredrik Wallenberg, and Glenn Woroch. 2005. “The Demographics of the Do-Not-Call List.” IEEE Security and Privacy 3 (1): 34–39.
Villas-Boas, J. Miguel. 1999. “Dynamic Competition with Customer Recognition.” RAND Journal of Economics 30 (4): 604–31.
Villas-Boas, J. Miguel. 2004. “Price Cycles in Markets with Customer Recognition.” RAND Journal of Economics 35 (3): 486–501.
Vissers, Thomas, Nick Nikiforakis, Nataliia Bielova, and Wouter Joosen. 2014. “Crying Wolf? On the Price Discrimination of Online Airline Tickets.” Paper presented at the 7th Workshop on Hot Topics in Privacy Enhancing Technologies, Amsterdam, Netherlands, July 18.
Wagman, Liad. 2014. “Good News or Bad News? Information Acquisition and Applicant Screening in Competitive Labor Markets.” http://ssrn.com/abstract=2377849.
Wagman, Liad, and Vincent Conitzer. 2014. “False-Name-Proof Voting with Costs over Two Alternatives.” International Journal of Game Theory 43 (3): 599–618.
Waldfogel, Joel. 1994. “The Effect of Criminal Conviction on Income and the Trust ‘Reposed in the Workmen’.” Journal of Human Resources 29 (1): 62–81.
Wang, Yang, et al. 2013. “The Second Wave of Global Privacy Protection: From Facebook Regrets to Facebook Privacy Nudges.” Ohio State Law Journal 74 (6): 1307–34.
Warren, Samuel D., and Louis D. Brandeis. 1890. “The Right to Privacy.” Harvard Law Review 4 (5): 193–203.
Wathieu, Luc. 2002. “Privacy, Exposure and Price Discrimination.” http://neumann.hec.ca/cref/sem/documents/031111.pdf.
Westin, Alan. 1967. Privacy and Freedom. New York: Atheneum Publishers.
White, Ryen W., Nicholas P. Tatonetti, Nigam H. Shah, Russ B. Altman, and Eric Horvitz. 2013. “Web-Scale Pharmacovigilance: Listening to Signals from the Crowd.” Journal of the American Medical Informatics Association 20: 404–08.
White, Tiffany Barnett, Debra L. Zahay, Helge Thorbjørnsen, and Sharon Shavitt. 2008. “Getting Too Personal: Reactance to Highly Personalized Email Solicitations.” Marketing Letters 19 (1): 39–50.
Wickelgren, Abraham L. 2015. “An Economic Analysis of Internet Privacy Regulation.” Unpublished.
Wilson, Kumanan, and John S. Brownstein. 2009. “Early Detection of Disease Outbreaks Using the Internet.” Canadian Medical Association Journal 180 (8): 829–31.
Xie, Yinglian, Fang Yu, and Martin Abadi. 2009. “De-anonymizing the Internet Using Unreliable IDs.” Computer Communication Review 39: 75–86.
Zhang, Juanjuan. 2011. “The Perils of Behavior-Based Personalization.” Marketing Science 30 (1): 170–86.
Zuiderveen Borgesius, Frederik J. 2015. Improving Privacy Protection in the Area of Behavioural Targeting. New York: Wolter Kluwer Law and Business.
2018. Tailored Cheap Talk: The Effects of Privacy Policy on Ad Content and Market Outcomes. Marketing Science 37:5, 733-752. [Crossref] 151. Jannick Kirk Sørensen, Hilde Van den Bulck. 2018. Public service media online, advertising and the third-party user data business. Convergence: The International Journal of Research into New Media Technologies 30, 135485651879020. [Crossref] 152. Steven Schmeiser. 2018. Online advertising networks and consumer perceptions of privacy. Applied Economics Letters 25:11, 776-780. [Crossref] 153. Volker Benndorf. 2018. Voluntary Disclosure of Private Information and Unraveling in the Market for Lemons: An Experiment. Games 9:2, 23. [Crossref] 154. Joachim Plesch, Irenaeus Wolff. 2018. Personal-Data Disclosure in a Field Experiment: Evidence on Explicit Prices, Political Attitudes, and Privacy Preferences. Games 9:2, 24. [Crossref] 155. Caleb S. Fuller. 2018. Privacy law as price control. European Journal of Law and Economics 45:2, 225-250. [Crossref] 156. Michael Hüther. 2018. Digitalisierung: Motor im Strukturwandel – Herausforderung für die Wirtschaftspolitik. ORDO 68:1, 179-216. [Crossref] 157. Ronen Gradwohl. 2018. Privacy in implementation. Social Choice and Welfare 50:3, 547-580. [Crossref] 158. Chong Huang, Lalitha Sankar. Impact of Privacy on Free Online Service Markets 1-21. [Crossref] 159. Marc van Lieshout, Sophie Emmert. RESPECT4U – Privacy as Innovation Opportunity 43-60. [Crossref] 160. Chantal Lidynia, Ralf Philipsen, Martina Ziefle. The Sky’s (Not) the Limit - Influence of Expertise and Privacy Disposition on the Use of Multicopters 270-281. [Crossref] 161. Bernadette Kamleitner, Vincent-Wayne Mitchell. Can Consumers Experience Ownership for Their Personal Data?From Issues of Scope and Invisibility to Agents Handling Our Digital Blueprints 91-118. [Crossref] 162. Lei Xu, Chunxiao Jiang, Yi Qian, Yong Ren. Contract-Based Private Data Collecting 59-88. [Crossref] 163. Danielle Lee, Peter Brusilovsky. Recommendations Based on Social Links 391-440. [Crossref] 164. Mark Schelker. 4 On the Economics of Remembering and Forgetting in the Digital Age 148-160. [Crossref] 165. Christine Benesch. 5 A Political Economic Analysis of Transparency in a Digital World 161-168. [Crossref] 166. Yefim Shulman. Towards a Broadening of Privacy Decision-Making Models: The Use of Cognitive Architectures 187-204. [Crossref] 167. Majid Hatamian, Agnieszka Kitkowska, Jana Korunovska, Sabrina Kirrane. “It’s Shocking!": Analysing the Impact and Reactions to the A3: Android Apps Behaviour Analyser 198-215. [Crossref] 168. Tina Morlok, Christian Matt, Thomas Hess. Perspektiven der Privatheitsforschung in den Wirtschaftswissenschaften 179-220. [Crossref] 169. Max Braun, Thilo von Pape, Lara Wolfers, Doris Teutsch, Sabine Trepte. Perspektiven und Trends der Privatheit 221-250. [Crossref] 170. Heiko Richter. The Power Paradigm in Private Law 527-577. [Crossref] 171. Inge Graef. Blurring Boundaries of Consumer Welfare 121-151. [Crossref] 172. Daniel G. Arce, Rainer Böhme. Pricing Anonymity 349-368. [Crossref] 173. Eugenia Politou, Efthimios Alepis, Constantinos Patsakis. 2018. Forgetting personal data and revoking consent under the GDPR: Challenges and proposed solutions. Journal of Cybersecurity 4:1. . [Crossref] 174. Martin J. Kraemer. Preserving Privacy in Smart Homes 1-4. [Crossref] 175. Reuben Binns, Ulrik Lyngs, Max Van Kleek, Jun Zhao, Timothy Libert, Nigel Shadbolt. Third Party Tracking in the Mobile Ecosystem 23-31. [Crossref] 176. Esma Aïmeur. 
Personalisation and Privacy Issues in the Age of Exposure 375-376. [Crossref] 177. Aaron Ceross, Andrew Simpson. Rethinking the Proposition of Privacy Engineering 89-102. [Crossref] 178. Florian Morath, Johannes Münster. 2018. Online Shopping and Platform Design with Ex Ante Registration Requirements. Management Science 64:1, 360-380. [Crossref] 179. Benedict G. C. mname Dellaert. 2018. The Consumer Production Journey: Marketing to Consumers as Co-Producers in the Sharing Economy. SSRN Electronic Journal . [Crossref] 180. Jay Pil Choi, Doh-Shin Jeon, Byung-Cheol Kim. 2018. Privacy and Personal Data Collection with Information Externalities. SSRN Electronic Journal . [Crossref] 181. Marco Bertini, Stefan Buehler, Daniel Halbheer. 2018. Consumer Resistance. SSRN Electronic Journal . [Crossref] 182. P. Arun. 2018. What is Data in this Brave New World?. SSRN Electronic Journal . [Crossref] 183. Mircea Zloteanu, Nigel Harvey, David Tuckett, Giacomo Livan. 2018. Digital Identity: The Effect of Trust and Reputation Information on User Judgement in the Sharing Economy. SSRN Electronic Journal . [Crossref] 184. Jens Prufer, Patricia Prufer. 2018. Data Science for Institutional and Organizational Economics. SSRN Electronic Journal . [Crossref] 185. Sebastian Dengler, Jens Prufer. 2018. Consumers' Privacy Choices in the Era of Big Data. SSRN Electronic Journal . [Crossref] 186. Grazia Cecere, Clara Jean, Fabrice Le Guel, Matthieu Manant. 2018. STEM and Teens: An Algorithm Bias on a Social Media. SSRN Electronic Journal . [Crossref] 187. Alisa Frik, Alexia Gaudeul. 2018. An Experimental Method for the Elicitation of Implicit Attitudes to Privacy Risk. SSRN Electronic Journal . [Crossref] 188. Oliver Budzinski, Annika Stöhr. 2018. Competition Policy Reform in Europe and Germany – Institutional Change in the Light of Digitization. SSRN Electronic Journal . [Crossref] 189. Johan N. M. Lagerlof. 2018. Surfing Incognito: Welfare Effects of Anonymous Shopping. SSRN Electronic Journal . [Crossref] 190. Marzena J. Rostek, Ji Hee Yoon. 2018. Dynamic Imperfectly Competitive Markets with Private Information. SSRN Electronic Journal . [Crossref] 191. Zoe Cullen, Ricardo Perez-Truglia. 2018. The Salary Taboo: Privacy Norms and the Diffusion of Information. SSRN Electronic Journal . [Crossref] 192. Lefrere Vincent. 2018. 7. Quel est le coût des applications gratuites ?. Regards croisés sur l'économie n°23:2, 94. [Crossref] 193. Bo Cowgill, Cosmina L. Dorobantu. 2018. Competition and Specificity in Market Design: Evidence from Geotargeted Advertising. SSRN Electronic Journal . [Crossref] 194. Priyanka Sharma, Liad Wagman. 2018. Advertising and Voter Data in Biased Political Contests. SSRN Electronic Journal . [Crossref] 195. Jian Jia, Ginger Zhe Jin, Liad Wagman. 2018. The Short-Run Effects of GDPR on Technology Venture Investment. SSRN Electronic Journal . [Crossref] 196. Omid Rafieian, Hema Yoganarasimhan. 2018. Targeting and Privacy in Mobile Advertising. SSRN Electronic Journal . [Crossref] 197. Matthew O. Jackson, Yiqing Xing. 2018. The Interaction of Communities, Religion, Governments, and Corruption in the Enforcement of Contracts and Social Norms. SSRN Electronic Journal . [Crossref] 198. Oliver Budzinski, Victoriia Noskova, zhang xijie. 2018. The Brave New World of Digital Personal Assistants: Benefits and Challenges from an Economic Perspec-tive. SSRN Electronic Journal . [Crossref] 199. Grazia Cecere, Fabrice Le Guel, Matthieu Manant, Nicolas Soulié. Privacy, The Economics of 10754-10763. [Crossref] 200. 
Oliver Budzinski. 2017. Wettbewerbsregeln für das Digitale Zeitalter?. List Forum für Wirtschafts- und Finanzpolitik 43:3, 221-249. [Crossref] 201. Leyla Dogruel, Sven Joeckel, Jessica Vitak. 2017. The valuation of privacy premium features for smartphone apps: The influence of defaults and expert recommendations. Computers in Human Behavior 77, 230-239. [Crossref] 202. Chao Li, Daniel Yang Li, Gerome Miklau, Dan Suciu. 2017. A theory of pricing private data. Communications of the ACM 60:12, 79-86. [Crossref] 203. Alessandro Acquisti, Idris Adjerid, Rebecca Balebako, Laura Brandimarte, Lorrie Faith Cranor, Saranga Komanduri, Pedro Giovanni Leon, Norman Sadeh, Florian Schaub, Manya Sleeper, Yang Wang, Shomir Wilson. 2017. Nudges for Privacy and Security. ACM Computing Surveys 50:3, 1-41. [Crossref] 204. Alexandre de Cornière, Rodrigo Montes. 2017. Consumer Privacy and the Incentives to Price- Discriminate in Online Markets. Review of Network Economics 16:3, 291-305. [Crossref] 205. Marc Bourreau, Bernard Caillaud, Romain de Nijs. 2017. The Value of Consumer Data in Online Advertising. Review of Network Economics 16:3, 269-289. [Crossref] 206. Alexia Gaudeul, Caterina Giannetti. 2017. The effect of privacy concerns on social network formation. Journal of Economic Behavior & Organization 141, 233-253. [Crossref] 207. , Ali Makhdoumi, Azarakhsh Malekian, Asuman Ozdaglar. 2017. Privacy- constrained network formation. Games and Economic Behavior 105, 255-275. [Crossref] 208. Helia Marreiros, Mirco Tonin, Michael Vlassopoulos, M.C. Schraefel. 2017. “Now that you mention it”: A survey experiment on information, inattention and online privacy. Journal of Economic Behavior & Organization 140, 1-17. [Crossref] 209. Samant Khajuria, Knud Erik Skouby. 2017. Privacy and Economics in a 5G Environment. Wireless Personal Communications 95:1, 145-154. [Crossref] 210. Ronen Gradwohl, Rann Smorodinsky. 2017. Perception games and privacy. Games and Economic Behavior 104, 293-308. [Crossref] 211. 2017. Book Reviews. Journal of Economic Literature 55:2, 644-663. [Citation] [View PDF article] [PDF with links] 212. 2017. Book Reviews. Journal of Economic Literature 55:2, 662-663. [Abstract] [View PDF article] [PDF with links] 213. 2017. Book Reviews. Journal of Economic Literature 55:2, 660-662. [Abstract] [View PDF article] [PDF with links] 214. Oliver Budzinski, Sonja Schneider. 2017. Smart Fitness: Ökonomische Effekte einer Digitalisierung der Selbstvermessung. List Forum für Wirtschafts- und Finanzpolitik 43:2, 89-124. [Crossref] 215. Evert Van den Broeck, Karolien Poels, Michel Walrave. 2017. A Factorial Survey Study on the Influence of Advertising Place and the Use of Personal Data on User Acceptance of Facebook Ads. American Behavioral Scientist 61:7, 653-671. [Crossref] 216. Byung-Cheol Kim, Jin Yeub Kim. 2017. The Economics of the Right to Be Forgotten. The Journal of Law and Economics 60:2, 335-360. [Crossref] 217. Sophia Kununka, Nikolay Mehandjiev, Pedro Sampaio, Konstantina Vassilopoulou. End User Comprehension of Privacy Policy Representations 135-149. [Crossref] 218. Malte Dold, Tim Krieger. Informationelle Selbstbestimmung aus ordnungsökonomischer Sicht 181-198. [Crossref] 219. Grazia Cecere, Fabrice Le Guel, Matthieu Manant, Nicolas Soulié. The Economics of Privacy 1-11. [Crossref] 220. Grazia Cecere, Fabrice Le Guel, Matthieu Manant, Nicolas Soulié. The Economics of Privacy 1-11. [Crossref] 221. Muhammad Rizwan Asghar, Gyorgy Dan, Daniele Miorandi, Imrich Chlamtac. 2017. 
Smart Meter Data Privacy: A Survey. IEEE Communications Surveys & Tutorials 19:4, 2820-2835. [Crossref] 222. Ronen Gradwohl. Information Sharing and Privacy in Networks 349-350. [Crossref] 223. Ronen Gradwohl. 2017. Information Sharing and Privacy in Networks. SSRN Electronic Journal . [Crossref] 224. NNstor Duch-Brown, Bertin Martens, Frank Mueller-Langer. 2017. The Economics of Ownership, Access and Trade in Digital Data. SSRN Electronic Journal . [Crossref] 225. Susan Athey, Christian Catalini, Catherine E. Tucker. 2017. The Digital Privacy Paradox: Small Money, Small Costs, Small Talk. SSRN Electronic Journal . [Crossref] 226. Jens Prufer, Christoph Schottmmller. 2017. Competing with Big Data. SSRN Electronic Journal . [Crossref] 227. Veronica Marotta, Kaifu Zhang, Alessandro Acquisti. 2017. Not All Privacy Is Created Equal: The Welfare Impact of Targeted Advertising. SSRN Electronic Journal . [Crossref] 228. Paolo Vanini. 2017. Digital Disruption. SSRN Electronic Journal . [Crossref] 229. Carin van der Cruijsen. 2017. Payments Data: Do Consumers Want to Keep Them in a Safe or Turn Them into Gold?. SSRN Electronic Journal . [Crossref] 230. Ginger Zhe Jin, Andrew Stivers. 2017. Protecting Consumers in Privacy and Data Security: A Perspective of Information Economics. SSRN Electronic Journal . [Crossref] 231. Garrett A. Johnson, Scott Shriver, Shaoyin Du. 2017. Consumer Privacy Choice in Online Advertising: Who Opts Out and at What Cost to Industry?. SSRN Electronic Journal . [Crossref] 232. Oliver Budzinski. 2017. Wettbewerbsregeln FFr Das Digitale Zeitalter Die konomik Personalisierter Daten, Verbraucherschutz Und Die 9.Gwb-Novelle (Competition Rules for the Digital Age - The Economics of Personalized Data, Consumer Protection, and the 9th Amendment of German Competition Law). SSRN Electronic Journal . [Crossref] 233. Adrian Hillenbrand, Svenja Hippel. 2017. Strategic Inattention in Product Search. SSRN Electronic Journal . [Crossref] 234. Wolfgang Kerber, Jonas Severin Frank. 2017. Data Governance Regimes in the Digital Economy: The Example of Connected Cars. SSRN Electronic Journal . [Crossref] 235. Mariann Ollar, Marzena J. Rostek, Ji Hee Yoon. 2017. Privacy in Markets. SSRN Electronic Journal . [Crossref] 236. David Birget. 2017. Big Data and Price Discrimination. SSRN Electronic Journal . [Crossref] 237. Jian Jia, Liad Wagman. 2017. Competition and Enforcement of Scofflaw Actors in the Sharing Economy: The Case of Airbnb. SSRN Electronic Journal . [Crossref] 238. Kallia Anastasopoulou, Spyros Kokolakis, Panagiotis Andriotis. Privacy Decision-Making in the Digital Era: A Game Theoretic Review 589-603. [Crossref] 239. Paul Belleflamme, Wouter Vergote. 2016. Monopoly price discrimination and privacy: The hidden cost of hiding. Economics Letters 149, 141-144. [Crossref] 240. George Norman, Lynne Pepall, Dan Richards, Liang Tan. 2016. Competition and consumer data: The good, the bad, and the ugly. Research in Economics 70:4, 752-765. [Crossref] 241. Christopher Griffin, Sarah Rajtmajer, Anna Squicciarini. Invited Paper: A Model of Paradoxical Privacy Behavior in Online Users 206-211. [Crossref] 242. Jin-Hyuk Kim, Liad Wagman, Abraham L. Wickelgren. 2016. The Impact of Access to Consumer Data on the Competitive Effects of Horizontal Mergers. SSRN Electronic Journal . [Crossref] 243. Tommaso M. Valletti, Jiahua Wu. 2016. 'Big Data' vs 'Small Data': Consumer Profiling with Data Requirements. SSRN Electronic Journal . [Crossref] 244. Balgobin Yann, Quinn Martin, Patrick Waelbroeck. 2016. 
Payment Instruments, Financial Privacy and Online Purchases. SSRN Electronic Journal . [Crossref] 245. Oliver Budzinski. 2016. Aktuelle Herausforderungen Der Wettbewerbspolitik Durch Marktplltze Im Internet (Competition Policy Challenges of Internet Marketplaces). SSRN Electronic Journal . [Crossref] 246. Ole Jann, Christoph Schottmmller. 2016. An Informational Theory of Privacy. SSRN Electronic Journal . [Crossref] 247. Alessandro Acquisti, Idris Adjerid, Rebecca Hunt Balebako, Laura Brandimarte, Lorrie Faith Cranor, Saranga Komanduri, Pedro Giovanni Leon, Norman Sadeh, Florian Schaub, Manya Sleeper, Yang Wang, Shomir Wilson. 2016. Nudges for Privacy and Security: Understanding and Assisting Userss Choices Online. SSRN Electronic Journal . [Crossref] 248. Paolo Vanini. 2016. Asset Management. SSRN Electronic Journal . [Crossref] 249. Rodrigo Montes, Wilfried SanddZantman, Tommaso M. Valletti. 2015. The Value of Personal Information in Markets With Endogenous Privacy. SSRN Electronic Journal . [Crossref] 250. Daron Acemoglu, Ali Makhdoumi, Azarakhsh Malekian, Asuman E. Ozdaglar. 2015. Privacy- Constrained Network Formation. SSRN Electronic Journal . [Crossref] 251. Byung-Cheol Kim, Jin Yeub Kim. 2015. The Economics of the Right to Be Forgotten. SSRN Electronic Journal . [Crossref] 252. Doh-Shin Jeon, Byung-Cheol Kim, Domenico Menicucci. 2015. Price Discrimination by a Two- Sided Platform: With Applications to Advertising and Privacy Design. SSRN Electronic Journal . [Crossref] 253. Paul Belleflamme. 2015. Monopoly Price Discrimination and Privacy: The Hidden Cost of Hiding. SSRN Electronic Journal . [Crossref] 254. Florian Morath, Johannes MMnster. 2014. Online Shopping and Platform Design with Ex Ante Registration Requirements. SSRN Electronic Journal . [Crossref] 255. Alessandro Acquisti, Christina M. Fong. 2012. An Experiment in Hiring Discrimination Via Online Social Networks. SSRN Electronic Journal . [Crossref] 256. Grazia Cecere, Fabrice Le Guel, Vincent Lefrere. 2007. Economics of Free Mobile Applications: Personal Data. SSRN Electronic Journal . [Crossref]