Identity Management, Privacy, and Price Discrimination

In economics, privacy is usually discussed in the context of consumer preferences and price discrimination. But what forms of privacy are compatible with merchants' interests in knowing more about their consumers, and how can identity management systems protect privacy while enabling personalization and price discrimination?

Alessandro Acquisti, Carnegie Mellon University

In the economics literature, privacy is usually discussed in the context of consumer preferences and reservation prices: merchants are interested in finding out a consumer's preferences because from those they can infer the consumer's maximum willingness to pay for a good (his reservation price). The ability to identify consumers and track their purchase histories, therefore, lets merchants charge prices that extract as much surplus as possible from the sale, which is what economists call price discrimination.1–3 In this context, consumer privacy concerns reduce to individuals' issues with their personal preferences being known to merchants and exploited for profit.

However, economists also acknowledge that consumer privacy is not just about hiding the price paid for a good.4,5 During any economic transaction, a consumer might rationally wish to share with a merchant certain types of personal data while keeping others private.6 Consumers can incur several costs (and, also, gain several benefits) when revealing personal data during a purchase: costs associated with spam, profiling, or financial fraud when addresses, names, or financial data are revealed; and benefits associated with targeted offers or personalized recommendations when information about tastes and interests is shared. The personal data shared during a transaction does not need to be personally identifiable for those costs or benefits to occur. For instance, a merchant can infer a consumer's preferences without knowing the consumer's name or other public identifiers (such as his or her credit-card number); or, a consumer's status (as a student, a senior citizen, or a member of the military) can be shared with a merchant without also disclosing his or her purchasing history.

Identity management systems can support such selective information revelation strategies by giving consumers greater control over which identities are established, which attributes are associated with them, and under what circumstances they're revealed to others. Therefore, such systems allow for transactions in which some level of information sharing is accompanied by some level of information hiding. At the same time, economic views of privacy that are more granular than the one formal micro models ordinarily focus on—that is, privacy as the protection of a consumer's set of preferences—show that there are both costs and benefits when information other than a consumer's preferences and reservation prices is protected or revealed. The issue becomes what dimensions of personal data privacy are compatible with merchants' interests in knowing more about their consumers and their valuations for goods or services. In this article, I'll examine several ways in which identity management systems can protect certain types of information privacy while simultaneously supporting various forms of personalization and price discrimination.

Identity management and privacy

Identity management systems make it possible for individuals and organizations to engage in selective information revelation strategies.7–9 By offering consumers some control over how their identities and associated attributes are established and revealed to others, they become tools for privacy protection and for an efficient economic balancing of information hiding and sharing.
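As an illustration of such selective revelation, here is a minimal sketch in Python; the Identity class, the attribute names, and the reveal method are invented for this example and don't correspond to any particular identity management system cited in this article. A consumer holds a full set of attributes but discloses only the subset a given transaction requires.

```python
# Minimal sketch of selective attribute disclosure (illustrative only).
# The attribute names and this dataclass are hypothetical, not part of
# any identity management standard discussed in the article.
from dataclasses import dataclass, field

@dataclass
class Identity:
    # Full set of attributes the consumer holds.
    attributes: dict = field(default_factory=dict)

    def reveal(self, requested: set) -> dict:
        """Return a partial identity: only the requested attributes
        the consumer agrees to share; everything else stays hidden."""
        return {k: v for k, v in self.attributes.items() if k in requested}

consumer = Identity(attributes={
    "name": "Alice Example",     # public identifier: kept private here
    "credit_card": "4111-....",  # financial data: kept private here
    "student": True,             # group attribute: shared
    "favorite_genre": "jazz",    # preference: shared for personalization
})

# For this transaction, the consumer shares tastes and student status,
# but neither her name nor her payment details.
print(consumer.reveal({"student", "favorite_genre"}))
# {'student': True, 'favorite_genre': 'jazz'}
```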


Different types of personal information raise different privacy concerns. Merchants can use certain data (such as a consumer's preferences) for price discrimination. Other data (such as the consumer's credit-card number or personal address) can lead to financial fraud or spam. Accordingly, different types of privacy-enhancing identity management strategies protect different types of information. In the context I'm considering here, it's worthwhile to differentiate between privacy-enhancing strategies that aim to provide anonymity and those that aim to provide pseudonymity.

Pseudonymizing technologies can link various transactions (payments, emails, HTTP requests) by the same agent to the same pseudonymous identity, although they aren't traceable to her permanent public identifiers (such as her name). Anonymizing technologies not only make any transaction from a certain agent untraceable to that agent's permanent, public identity, but also ensure that adversaries can't link together various transactions by the same agent. In the realm of privacy-enhancing electronic payments, for instance, David Chaum's eCash10 is an example of an anonymizing technology, whereas Steven Low, Nicholas Maxemchuk, and Sanjoy Paul's credit-card approach11 is better defined as a pseudonymizing technology.

A pseudonymizing technology can protect a purchaser's financial identity during a transaction. But an anonymizing technology, in addition to that, might also protect the purchaser from having her purchase history tracked or might offer the additional psychological comfort of complete anonymity. This technological distinction is important from an economic perspective because different combinations of transaction linkability and traceability allow different types of information to be shared and different types of price discrimination to be implemented.
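To make the linkability and traceability distinction concrete, consider the following toy model (all names and tokens are invented; real payment systems achieve these properties cryptographically, for example through blind signatures,10 rather than through random tokens). A pseudonymous wallet tags every payment with the same persistent pseudonym, so a merchant can link payments to one another but not to the buyer's public identity; an anonymous wallet tags each payment with a fresh one-time token, preventing both linking and tracing.

```python
import secrets

# Toy model of the distinction, not a cryptographic implementation.

class PseudonymousWallet:
    """All transactions share one persistent pseudonym: linkable,
    but not traceable to the owner's real-world identity."""
    def __init__(self):
        self.pseudonym = secrets.token_hex(8)  # stands in for a persistent ID
    def pay(self, amount):
        return {"payer": self.pseudonym, "amount": amount}

class AnonymousWallet:
    """Each transaction carries a fresh one-time token: neither
    linkable to other transactions nor traceable to the owner."""
    def pay(self, amount):
        return {"payer": secrets.token_hex(8), "amount": amount}

alice_p, alice_a = PseudonymousWallet(), AnonymousWallet()
t1, t2 = alice_p.pay(10), alice_p.pay(25)
t3, t4 = alice_a.pay(10), alice_a.pay(25)

print(t1["payer"] == t2["payer"])  # True: merchant can link, build a profile
print(t3["payer"] == t4["payer"])  # False: no linking, hence no profile
```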
Price discrimination, identity, and tracking

As noted earlier, price discrimination refers to a seller's ability to provide the same commodity or service at different prices to different consumers. This price is based on the seller's estimation of the price a buyer might be willing to pay for that good.

Price discrimination is very common in all types of markets: at the cinema, in airline booking systems, and, in fact, online, where increasingly sophisticated tracking technologies let merchants adjust prices based on the visitor's location (as revealed by his or her IP address), time spent on the site, cookie information, history of previous purchases, and so on.

Economists distinguish between three types of price discrimination, which they call "degrees" for technical reasons beyond this article's scope. In first-degree price discrimination, prices are based on individual preferences (in the extreme case, individual buyers could receive a customized price matching their maximum willingness to pay, or reservation price, for the good). In second-degree price discrimination, customers self-select into buying different versions or quantities of the good; in other words, the seller offers a menu of options for a product (for instance, standard and premium versions), and consumers freely choose the option they desire (and the associated price). In third-degree price discrimination, differential prices are assigned to different consumer segments based on some observable group characteristics, such as age, student status, or geographical location.

Each degree of price discrimination relies on different types of personal information being available to the merchant, and therefore raises different privacy issues. It's generally believed that consumers don't accept price discrimination (the notorious "randomized" price experiment that Amazon.com attempted a few years ago provoked angry consumer reactions12). However, customers don't mind price discrimination when it implies lower prices than those charged to other customers—that is, when they benefit from it. Economists usually regard price discrimination favorably because it can be "welfare enhancing":1,3,13,14 under certain conditions, it can increase aggregate economic welfare—for instance, when a good wouldn't even be produced unless its producer could target a segment of consumers willing to pay high prices for it. Price discrimination can also increase the welfare of consumers with lower valuations for a certain good, who otherwise might not have been offered the good at prices matching their willingness to pay.
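As a rough summary of the three degrees, the following sketch (with invented prices and function names) shows what kind of information each pricing rule consumes: an individual-level estimate, a self-selected menu choice, or a provable group attribute.

```python
# Hypothetical illustration of the three degrees of price discrimination.
# The numbers and names are invented for the example.

def first_degree(estimated_reservation_price: float) -> float:
    # Price each individual at (an estimate of) their willingness to pay.
    return estimated_reservation_price

def second_degree(choice: str) -> float:
    # The seller posts a menu; consumers self-select a version and price.
    menu = {"standard": 9.99, "premium": 19.99}
    return menu[choice]

def third_degree(segment: str) -> float:
    # Prices differ by observable group characteristics.
    segment_prices = {"student": 7.99, "senior": 8.99, "general": 12.99}
    return segment_prices.get(segment, segment_prices["general"])

print(first_degree(14.50))       # needs an individual-level estimate
print(second_degree("premium"))  # needs no personal information at all
print(third_degree("student"))   # needs only a (provable) group attribute
```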


Given that price discrimination often relies on consumer identification, it might seem incompatible with privacy protection. This, in turn, would imply that adopting privacy-enhancing technologies would come at the cost of the welfare enhancements that price-discriminative strategies otherwise provide.1 Andrew Odlyzko14 observed that the current privacy debate in e-commerce is fueled by the clash between consumers and merchants around the use of personal information for price discrimination, and that the movement to reduce privacy online might be motivated by the incentives to price discriminate. Consumers want privacy, which implies freedom from being tracked, yet merchants want to track consumers, which implies the ability to charge prices that improve their profits (for a more recent view on this theme by the same author, see Odlyzko's other work15,16).

Although such opposing interests are readily observed in many transactions, consumers can use identity management systems for selective disclosure of identity and attribute information. Previous works have hinted at using privacy technologies in ways that protect certain types of consumer information but allow other individual data to be exchanged;1,4,17–19 and others have analyzed some of the economic implications of such technologies.1,20,21 Whether privacy-enhancing identity management systems are compatible with price-discriminative strategies depends on what type of information they're designed to shield, and therefore on the scope of their privacy protection.

Price discrimination with identity management

Different degrees of price discrimination are compatible with different privacy-preserving identity management strategies. I'll look at some specific examples of pseudonymity or anonymity in electronic purchases. I should note that using an anonymous payment technology isn't sufficient for protecting an online buyer's identity when other personal information could be revealed during the transaction: a shipping address, the name of the sender in a gift card, an IP address, or unique tastes, selections, or purchasing patterns can lead to the buyer's re-identification, even when he or she uses anonymizing technologies. In what follows, I'll assume a scenario in which the suite of technologies buyers use (from privacy-preserving payment technologies to anonymous email and anonymous browsing) hides all data that could otherwise reveal his or her public identity.

First degree

As noted earlier, first-degree price discrimination consists of differential prices being offered to individual consumers based on their willingness to pay for a good. Although this observation might suggest that, in order to implement first-degree price discrimination, a seller must know the consumer's identity, this isn't necessarily the case.

First-degree price discrimination relies more on the seller's ability to estimate individual reservation prices than on its ability to recognize actual individual identities. Such reservation prices depend on consumers' preferences and tastes, and are predictable—for example, by observing consumers' behavior or purchase histories. Therefore, a merchant might estimate them even in the presence of pseudonymizing technologies or, under certain conditions, anonymizing technologies as well.

First, the merchant can link consumer purchase histories or customer profiles that include a customer's preferences or tastes to individual pseudonymous identities, and even trace them to persistent identifiers (such as an email address used repeatedly to log in to an account with a merchant, or a loyalty card used for multiple purchases from a grocery-store chain). Such identifiers can be pseudonymous: in other words, they can separate information about the online persona (for instance, the purchase history in a certain account) from the public identity of that account's owner (name, credit-card number, and so on). Such separation can be enforced through pseudonymous payment technologies that link individual transactions by the same subject to each other through persistent pseudonymous identities, but that aren't traceable back to the purchaser's offline identity. An example of such technologies is Low and colleagues' credit-card protocol,11 which a consumer would use in coordination with other anonymous browsing and messaging technologies. The linkages between transactions help merchants recognize a consumer and offer that person targeted services and prices, although the offline identity isn't revealed.4 Of course, as noted earlier, given enough complementary information, traffic and trail analysis could let a determined adversary break pseudonymous and anonymous shields and re-identify the buyer. A realistic goal of such protective technologies is simply to ensure that such attacks aren't cost effective.
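A minimal sketch of this first, pseudonym-based approach follows; the pricing heuristic is invented for the example and only illustrates that linkability alone, without traceability, is enough for targeted pricing.

```python
from collections import defaultdict

# Toy merchant-side view: purchase histories keyed by pseudonym, never by
# real-world identity. The pricing heuristic is invented for the example.

histories: dict[str, list[float]] = defaultdict(list)

def record_purchase(pseudonym: str, amount: float) -> None:
    histories[pseudonym].append(amount)

def targeted_price(pseudonym: str, list_price: float) -> float:
    past = histories[pseudonym]
    if not past:
        return list_price                      # unknown pseudonym: list price
    avg = sum(past) / len(past)
    # Crude heuristic: big past spenders are quoted the list price;
    # small spenders get a discount to close the sale.
    return list_price if avg >= list_price else round(0.9 * list_price, 2)

record_purchase("pseud-7f3a", 8.0)
record_purchase("pseud-7f3a", 9.5)
print(targeted_price("pseud-7f3a", 12.0))  # 10.8: history known, discounted
print(targeted_price("pseud-0000", 12.0))  # 12.0: no history, no targeting
```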
Separating offline and online identities offers additional advantages—for example, it hides financial information about the consumer from the seller (as well as from third parties); hence, it can help the consumer (as well as, possibly, the merchant) avoid additional (and possibly even significant) financial fraud and identity theft costs. It also protects against, or decreases the risk of, non-monetary forms of discrimination or the creation of an individual's digital dossier, shared with third parties over which the consumer has no control. In spite of these protections, consumers still receive targeted or customized offers and differential prices. However, for merchants, the risk is that consumers will engage in economic arbitrage, subtracting revenues from them—particularly if pseudonymous accounts are cheap to create.

Second, a form of behavior-based first-degree price discrimination can also be compatible with anonymizing technologies, as long as the individual reveals certain information: the online merchant observes the (anonymous, rather than pseudonymous) individual who arrives at the site after a certain search query, browses through different pages on the site, and spends time looking at particular products. Similarly to how brick-and-mortar sellers might try to assess a consumer's willingness to pay for a commodity in face-to-face transactions, based on traits or information other than that consumer's name or social security number, an online merchant might, under certain conditions, estimate an anonymous visitor's willingness to pay for a certain item and, accordingly, set prices dynamically.
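The following sketch illustrates this second, behavior-based approach under assumed session signals and weights, which are invented for the example rather than drawn from any real pricing system.

```python
# Sketch of behavior-based dynamic pricing for an *anonymous* visitor:
# no pseudonym, no history, only signals from the current session.
# The signals, weights, and price bounds are invented for the example.

def estimate_willingness(session: dict, base_price: float) -> float:
    score = 0.0
    score += 0.10 if session.get("arrived_from_search") else 0.0
    score += 0.05 * min(session.get("product_pages_viewed", 0), 4)
    score += 0.10 if session.get("minutes_on_product", 0) > 5 else 0.0
    # Quote within -20%/+20% of the base price, based on observed interest.
    return round(base_price * (0.8 + min(score, 0.4)), 2)

window_shopper = {"product_pages_viewed": 1}
keen_visitor = {"arrived_from_search": True,
                "product_pages_viewed": 6,
                "minutes_on_product": 9}

print(estimate_willingness(window_shopper, 100.0))  # 85.0
print(estimate_willingness(keen_visitor, 100.0))    # 120.0
```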


There's obviously a third, intermediate case, in which a consumer can adopt an anonymous payment technology (which in theory makes transactions both untraceable to the originating public, permanent identity and unlinkable to other transactions by the same consumer); however, the consumer actually renders that payment pseudonymous (by design or oversight) by providing information that lets the seller track his or her transactions. For instance, the use of persistent email addresses or PO boxes, online accounts, or cookies in combination with anonymizing technologies might shield the buyer's public identity (for instance, her actual name) while allowing repeated interactions with the seller.

Second degree

Second-degree price discrimination consists of customers voluntarily choosing differential prices for different versions or quantities of a good or service. This form of pricing therefore isn't based on personal information and doesn't rely on individual recognition. Neither linkability nor traceability across different transactions or identities needs to be exploited for this form of price discrimination to be enforced. In other words, a merchant can implement second-degree price discrimination even when customers adopt pseudonymous and anonymous payment strategies that shield personal information (including online information, such as email accounts or IP addresses). Price discrimination of this form is compatible with anonymous transactions—although, as mentioned earlier, the consumer's actions could provide additional information adversaries could use for re-identification.

Third degree

In third-degree price discrimination, merchants assign differential prices to different consumer segments based on some observable group characteristics. The combination of privacy technologies and prices targeted to customer segments is discussed by other researchers, such as Partha Das Chowdhury,17 who suggested the use of ring signature protocols for anonymous purchases. However, even the combination of existing anonymous payment protocols and anonymous credentials (which can prove the ownership of certain group attributes, such as age, employment status, and so on8) meets the requirements of this form of price discrimination. The anonymous payment protocol makes the transaction untraceable to the consumer's permanent and public identity, while the anonymous credentials allow her and the merchant to converge on a price for a particular segment of the consumer population.
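The sketch below illustrates that information flow under a deliberately simplified stand-in: an issuer-signed attribute token plays the role of the credential. Note that this HMAC construction is only a placeholder; actual anonymous credential systems such as Brands's8 make showings unlinkable and don't require the merchant to hold the issuer's secret key.

```python
import hmac, hashlib

# Toy stand-in for an anonymous credential proving a group attribute.
# It only illustrates the information flow: the merchant learns
# "student" (if the token verifies), and nothing else about the buyer.

ISSUER_KEY = b"demo-issuer-key"  # hypothetical; held by the credential issuer

def issue_credential(attribute: str) -> bytes:
    # The issuer attests to an attribute, not to an identity.
    return hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).digest()

def verify_credential(attribute: str, token: bytes) -> bool:
    expected = hmac.new(ISSUER_KEY, attribute.encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, token)

def segment_price(attribute: str, token: bytes) -> float:
    prices = {"student": 7.99, "senior": 8.99}
    if verify_credential(attribute, token) and attribute in prices:
        return prices[attribute]
    return 12.99  # general price when no valid group credential is shown

cred = issue_credential("student")
print(segment_price("student", cred))       # 7.99: segment price, no identity
print(segment_price("student", b"forged"))  # 12.99: invalid credential
```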
If both merchants and consumers adopted them, identity management systems could help them fine-tune personal information revelation to strike a balance between consumers' privacy protection and merchants' price discrimination. In this article, I didn't aim to establish which exact proportion of price discrimination and privacy protection may be optimal or simply desirable for society, or whether price discrimination might or might not be inherently privacy intrusive. I pointed out that many such proportions may be possible thanks to technology, with different types of privacy-preserving identity management strategies being compatible with various forms of differential pricing. The ensuing combination of price discrimination and privacy protection could be a desirable middle path whenever unlinkable transactions would end up decreasing social welfare, or when traceable transactions would expose consumers to potential costs by revealing their identities.

Customer reaction to price discrimination isn't always adversarial: Sarah Spiekermann22 reports on survey results that confirm how consumer reactions to price discrimination depend on which degree of discrimination is imposed on them (the first degree being the least liked, and the second being the most accepted). For consumers, the advantages of adopting privacy technologies that allow for some level of individual tracking lie in the combination of sensitive information protection and the ability to receive personalized and targeted services (as well as, for some consumers, lower prices than other purchasers). For high-value customers, the disadvantages lie in adverse price discrimination.

For merchants, combining privacy-enhancing technologies and price discrimination strategies via identity management systems lets them attract and satisfy the needs of privacy-sensitive consumers without harming their ability to implement pricing and marketing strategies. With identity management systems, technologists have given us tools that let some information be shared while other data is protected; proper use of those tools can meet both merchants' and consumers' needs.


Acknowledgments
I gratefully acknowledge research support from US National Science Foundation grant number 0713361 ("IPS: Evaluating and Enhancing Privacy and Information Sharing in Online Social Networks"), from the Carnegie Mellon Berkman Fund, and from the US Army Research Office under contract number DAAD190210389 ("Personal Information Security and Online Social Networks"). I also thank Ramnath K. Chellappa, Alfred Kobsa, Susan Landau, Deirdre Mulligan, Andrew Odlyzko, Sarah Spiekermann, Hal Varian, and three anonymous referees for their helpful comments.

References
1. A. Acquisti and H.R. Varian, "Conditioning Prices on Purchase History," Marketing Science, vol. 24, no. 3, 2005, pp. 1–15.
2. C.R. Taylor, "Consumer Privacy and the Market for Customer Information," RAND J. Economics, vol. 35, no. 4, 2004, pp. 631–651.
3. G. Calzolari and A. Pavan, "On the Optimality of Privacy in Sequential Contracting," J. Economic Theory, vol. 130, no. 1, 2006.
4. A. Acquisti, "Privacy and Security of Personal Information: Economic Incentives and Technological Solutions," The Economics of Information Security, J. Camp and S. Lewis, eds., Kluwer, 2004.
5. B. Hermalin and M. Katz, "Privacy, Property Rights and Efficiency: The Economics of Privacy as Secrecy," Quantitative Marketing and Economics, vol. 4, no. 3, 2006, pp. 209–239.
6. H. Varian, "Economic Aspects of Personal Privacy," Privacy and Self-Regulation in the Information Age, NTIA report, 1996; www.sims.berkeley.edu/hal/people/hal/papers.html.
7. D. Chaum, "Security without Identification: Transaction Systems to Make Big Brother Obsolete," Comm. ACM, vol. 28, no. 10, 1985, pp. 1030–1044.
8. S. Brands, Rethinking Public Key Infrastructures and Digital Certificates: Building in Privacy, MIT Press, 2000.
9. M. Hansen et al., "Privacy-Enhancing Identity Management," Information Security Tech. Report, vol. 11, no. 3, 2006, pp. 119–128.
10. D. Chaum, "Blind Signatures for Untraceable Payments," Advances in Cryptology (CRYPTO 82), Plenum Press, 1983, pp. 199–203.
11. S. Low, N.F. Maxemchuk, and S. Paul, "Anonymous Credit Cards," Proc. 2nd ACM Conf. Computer and Communications Security, ACM Press, 1994, pp. 108–117.
12. "Amazon, The Software Company," Economist, 18 Dec. 2001; www.economist.com/displayStory.cfm?Story_ID=393096.
13. H.R. Varian, "Price Discrimination and Social Welfare," Am. Economic Rev., vol. 75, no. 4, 1985, pp. 870–875.
14. A. Odlyzko, "Privacy, Economics, and Price Discrimination on the Internet," Proc. 5th Int'l Conf. Electronic Commerce, ACM Press, 2003, pp. 355–366.
15. A. Odlyzko, "The Evolution of Price Discrimination in Transportation and its Implications for the Internet," Rev. Network Economics, vol. 3, no. 3, 2004, pp. 323–346.
16. A. Odlyzko, "Privacy and the Clandestine Evolution of E-Commerce," Proc. 9th Int'l Conf. Electronic Commerce (ICEC 07), ACM Press, 2007, pp. 3–6.
17. P. Das Chowdhury, B. Christianson, and J. Malcolm, "Privacy Systems with Incentives," Proc. First Int'l Workshop Information Systems, 2006.
18. P. Das Chowdhury, Anonymity and Trust in the Electronic World, PhD thesis, Computer Science Dept., Univ. of Hertfordshire, 2005.
19. A. Acquisti, "Personalized Pricing, Privacy Technologies, and Consumer Acceptance," CHI Workshop on Personalization and Privacy, 2006; www.isr.uci.edu/pep06/papers/Proceedings_PEP06.pdf.
20. S. Koble and R. Böhme, "Economics of Identity Management: A Supply-side Perspective," Privacy Enhancing Technologies Workshop (PET 05), G. Danezis and D. Martin, eds., 2006, pp. 259–272; www.petworkshop.org/2005/workshop/call.html.
21. R. Böhme and S. Koble, "On the Viability of Privacy-Enhancing Technologies in a Self-regulated Business-to-Consumer Market: Will Privacy Remain a Luxury Good?," Proc. Workshop on Economics of Information Security (WEIS 07), 2007.
22. S. Spiekermann, "Individual Price Discrimination: An Impossibility?," CHI Workshop on Personalization and Privacy, 2006; www.isr.uci.edu/pep06/papers/Proceedings_PEP06.pdf.

Alessandro Acquisti is an assistant professor of information technology and public policy at the H. John Heinz III School of Public Policy and Management, Carnegie Mellon University. His research interests include the economics of privacy and privacy in online social networks. Acquisti has a PhD in information systems from the University of California, Berkeley. He is a member of Carnegie Mellon CyLab and the Carnegie Mellon Usable Privacy and Security Lab (CUPS). Contact him at [email protected].
