
Our Data, Episode 6 — Data Markets: Human-Centric Data Protection with Elizabeth Renieris

June 24, 2020

Our Data is a podcast from the Stanford CodeX Center for Legal Informatics, in conjunction with the Stanford CodeX Blockchain Group and Tech4Good initiatives.

In this episode, policy expert Elizabeth Renieris discusses data privacy and ownership. Join us as we explore the intersections between digital data, privacy, and human rights, as well as how we can actualize better privacy protections, who should bear the burden of protecting individuals’ data, and whether self-sovereign data ownership is truly the best way forward in a human-centric digital world.

Elizabeth Renieris is a law and policy expert with the Harvard Carr Center for Human Rights Policy, as well as the founder and CEO of HackyLawyER. Her work focuses on law and policy concerns at the intersection of data protection, digital identity, and emerging technologies. In addition to her experience advising governments, academic journals, and technology startups (to name just a few), Elizabeth also researches data governance frameworks as a fellow at the Berkman Klein Center for Internet and Society at Harvard University.

[Opening]

Mike Schmitz: Welcome to Our Data. Reuben, today we have a really important conversation with one of the leading thought leaders having to do with data and privacy. Elizabeth Renieris is one of the, like I said, globally recognized leaders in this space, [the] founder and CEO of HackyLawyER, and also a fellow at the Harvard Carr Center for Human Rights Policy. We're going to talk about a lot of different things, but just to kick it off, I wanted to—and you get to direct this however you want to direct it, Elizabeth—but really talk about this whole concept of owning data and privacy in particular. You've really jumped out there and crafted what I've seen very few others do [as far as] making a new way of thinking about this, and obviously with an eye towards creating a new paradigm for doing it. So Elizabeth, it's great to have you on the podcast. I'll let you open, but let's—if you wouldn't mind, we'd love to get into some of your latest pieces and your latest thinking on this.

Elizabeth Renieris [1:56]: Yeah, sure. Well, thanks for having me. I'm really excited about this podcast. I think it's timely, and it's an important conversation. And obviously, you have a great vehicle for talking about these issues here. So thanks for having me on. Yeah, this is a question I've been spending the last 12 years thinking about in various contexts, in various countries, and in different roles as well. And so I think as a result, I tend to have a slightly different perspective than, certainly, lawyers in the space who have been focused on one particular area of the law or on one jurisdiction, more kind of coming at it from one perspective. And I think what I've been noticing is that we get really stuck on data. So one of the things I wrote about (I think last year) was this idea that we're a little bit distracted by data. There's just—there's so much data right now that it feels like the issue, the kind of primary issue to grapple with. And my perspective on that is, it's actually a bit of a red herring, and that what we really need to be focusing on are people and their rights, and people and safeguards, and fundamentally how we protect people. Because when we start with the data, I think sometimes we lose sight of what the impact is on people. And so my whole thing now—I'm actually working on a book on this (forthcoming).

Mike: All right.

Elizabeth: Yeah. So we can talk about that a bit. But it's about the future of data governance, and my whole idea is really that we need to move beyond these ideas of data protection or data security or data privacy and think about people protection and, you know, people, and really reframe the conversation. And I think we're gonna end up in a much better place. But you know, that's a really broad start. So I'm happy to dive into specific questions.

Reuben Youngblom: I think that's a great segue into something you wrote relatively recently that deals with, I think, both sides of the people protection equation, which is something like blockchain passports for COVID. So I'm not sure when people are going to be listening to this [but] right now we're in the middle of the COVID crisis. Nobody really knows much about how it's being transmitted, nobody knows when a vaccine is coming, there are a lot of unknowns. But one thing that's been proposed is this idea of using blockchain to let people cryptographically prove that they have had COVID or that they've had a recent test or things like that. And there are two sides here, right? There's the side of, we want to protect the public. And there's also the side of, we want to protect the individual. And maybe there are privacy implications here.

Elizabeth: Yeah, and as you mentioned, I've been writing and thinking about this very publicly. And the way that I'm thinking about it is: I generally—[and] this is whether blockchain or any other technology is implicated—I generally don't think that crises are the right time to introduce highly experimental, untested-at-scale technology of any kind. And with blockchain, certainly, we've had more than 10 years of an industry kind of experimenting and building out use cases and grappling with some of the legal issues. I've been working on blockchain issues for about five or six years now. But we haven't had any real at-scale deployments of the technology beyond certain limited use cases, cryptocurrencies obviously being the preeminent use case there. And the context for cryptocurrencies and a payments infrastructure is not easy to transpose into a context around really fundamental human rights of people: rights around autonomy, around freedom of movement, around freedom of assembly and association, around privacy, around data protection, around the right to earn a living… all of these fundamental rights that are implicated by the potential introduction of something like a blockchain-enabled immunity passport. And so my concerns are really at that meta level. And I think what I saw happening in the conversation, and why I felt the need to weigh in, was that I saw a very micro conversation happening around tweaking specific implementations, around adjusting what kind of proofs are being used, or looking at whether a DID [decentralized ID] format is used, or some other type of identifier. And I just wanted to elevate the conversation, to take a step back and say, let's contextualize this. Is this a good idea, as a first question? And then, what are the concerns? And [I wanted to] just offer that broader perspective, because I think what happens is, we get really excited about the tech, we start doing the building, we work around these very micro questions, [and then] we totally lose sight of the big picture. And I just think, as somebody who's worked in a lot of different countries and a lot of different fields, I tend to have a broader perspective. And given how close I've been to the tech, I really felt the need to speak up about it.

Mike: Yeah, it feels, actually, very reminiscent of just post-9/11 and the Patriot [Act], and essentially, to your point, broad, deep action in a moment of crisis [had] better be really well thought through, because the implications are enormous.

Elizabeth: Yeah, and I think that’s true—

Mike: So we're still working through the ramifications of all this stuff that we went through with the Patriot Act in terms of rights, human rights, fundamental kinds of things.

Elizabeth [7:50]: Absolutely. I think that's [where] some of my perspective [comes from]. So actually, my first job out of law school was at DHS.

Mike: Interesting, wow.

Elizabeth: And I was in the GC's office at DHS. And this was long after 9/11, and yet all of that infrastructure was still in place. And I think that was really eye-opening—it really just opened my eyes to the fact that these temporary measures are rarely undone. And so I hold that—yeah, in this context, I mean, certainly in any disaster/emergency, the urgency of the short term always overtakes the long-term perspective of, “what's the implication of this in the long run?”

Mike: If you will, let's get into the context of the human rights framework, [and] kind of explicate it for the listeners, because I think it's [a new] conversation in the blockchain world for those who are not privy to it. That conversation tends, like you said, to be very focused around what the tech was originally envisioned to solve, and it tends to flow from there, even at the theoretical level. And when I heard your [framing] around human rights, having trained and worked as a human rights attorney… in that context it's like, “...well, yeah, that makes perfect sense. Why didn't I think of that?” Well, because we've been in this conversation talking about how we use other arguments to be able to advance this for the common good. And it's like, well, let's first actually examine the approach we're taking.

Elizabeth: Yeah, I think what happens—I mean, this is pretty universal. You get into these silos and these echo chambers, and it can be hard to zoom out. There's also a cultural difference here where we certainly have a much more market-based approach to things like data governance in the US than, certainly, other countries do. And I think the blockchain ethos in particular, because the genesis really was around cryptocurrency, because the genesis really was around transactions, assets, propertization… it's not surprising that the data governance applications and thinking around data would also take on the same properties and have this sort of market-based ethos. I don't think that was necessarily thought out as the best outcome. I think that was a product of the origins of the technology. But as I said before, I don't think that what's suitable for one use case is necessarily universally suitable. And from my perspective, particularly being… so, I lived in London from 2012 to 20.... just before the referendum, so beginning of 2016.

Mike: You've done a lot of interesting things, Elizabeth!

Elizabeth: For sure! Well, I think that's where my perspective comes from. But at the time that I moved to London, it was around the first draft of the GDPR. And it was something that I worked on from the first draft to the last draft, and really saw the evolution there, and the conversations happening in Europe at the time around data protection and data governance. And what was really important in that context was that Europe already had all this foundational architecture around human rights in the European Convention on Human Rights, but also building on international human rights law. And so the conversation was already kind of framed in that context, building upon the directive before it and other laws in Europe, like the ePrivacy Directive. So I think the cultural germination of data governance really plays out in what the ultimate framework looks like. And what I see happening in the US is, there's still this hyper-orientation around the market, insomuch as even our most progressive data privacy proposals, whether it's the CCPA or others, are still very consumer-oriented. So it's always this notion of the consumer, it's always in the context of a commercial transaction. It's never brought into the scope of, like, a citizen, or a person, or a human. And I think that's where we just continue to fall short, and are never going to capture the full array of what's happening.

Mike: Yeah, I know, and I plead guilty to falling into that discussion. Sometimes I think about even the proposals around regulating blockchain broadly, [and] the key things I always point to are the protection of consumers and protection of investors. But that's because the institutions that are set up with regulatory authority in general cover those two use cases, if you will—we [effectively] don't have a human rights enforcement or human rights-based... our institutions, at least at the federal level, are not set up like that. And I think that's a really important perspective to bring into it, Elizabeth. I think we continue to see that in a lot of other arenas, and [we see] the difference having profound effects, frankly, between what the US often is pushing and what you'll see, mainly from Europe at this point. Because, well, that's a whole [different] conversation. But yes, I'd love to continue on this. What I really want is [for] you to tease your book a little bit more, and get into, kind of, what the structure of a human rights framework in this context would look like.

Elizabeth [13:32]: Well, what you're describing is actually really relevant, because the history of data governance is really relevant to examining, or to really projecting, what the future might be. And if you look at the history, the US is actually at the forefront of this, both in the development of international human rights law, as far back as the Universal Declaration [of Human Rights], and also in terms of the emergence of ARPANET and the origins of the internet and the prominence of the US and the UK in that context. If you look at the earliest data protection laws at the national level, we're talking primarily about laws relating to or governing public databases. So, databases of government agencies that had data on their constituents. So the context actually wasn't one of private companies or the commercial sector; it was very much this constituent relationship between government and citizen. And that's just because the use case at the time (and the earliest laws were [in the] 1970s and 1980s) was this context of the government having data on their citizens. And so you already had this social contract in place. And just given the nature of that relationship, that is very different from the context that we often think about this in now. And when you look at something like the 1974 Privacy Act in the US, that was also governing the public sector; [there] was that relationship between government and constituent. And we kind of lose sight of that now, when we're really focused on this consumer commercial relationship. And this, I think, has led to a lot of holes in the way we think about... I think people are still very concerned, obviously, about government surveillance, and they're concerned about corporate surveillance or surveillance capitalism. But we haven't kind of looked at—I think we need to look closer at the relationships between people and the parties implicated in this data governance conversation, and what the nature of those relationships tells you about things like power dynamics, about rights, civil rights, human rights, about other actors who might be involved, and about the potential implications.

Reuben: Even looking at some of the more progressive data governance regulations out there, so something like the GDPR... I think you've written about how it mirrors the Fair Information Practice Principles in some subtle but definitely very present ways. So, what can we be doing better? Where are the places where you see these holdovers from an ancient, archaic database-based schema? And what should it look like?

Elizabeth: Yeah, I think that's exactly it. So we still think about it in this one-to-one relationship where both parties are known, and there is a preexisting relationship there, and there's a preexisting context, and there's a predetermined use. And I think you're right that the fundamental principles of the GDPR are very similar to the FIPPs. Now, I think the two core innovations of the GDPR, which I've written about, are... the expanded notion of data portability is certainly one innovation, even above the directive before it, from 1995. And also, this notion of data protection by design and default is another core innovation that's really trying to shift the burden away from the individual and onto the organization or the entity. But I think you're right that those principles are similar. And I think where we need to start going is, we need to recognize the complexity of the data ecosystem now; the complexity of the digital ecosystem, where we're not in these one-to-one relationships, [where] we certainly don't all have transparency on who's on the other end of [the] data. And the data flows are a lot more complex. And they're also not in the context of a database. They're in the context of the ambient environment, the built environment. They're in the context of IoT, they're in the context of really everything: smart cities. They're increasingly going to be in this very ambient, very animated context that goes well beyond the graphical user interface. It's going to be a voice interface and a brain-machine interface, and neuro-hacking, and all these technologies that are emerging, where if we keep thinking about this as a one-to-one database structure, we're just not capturing anything in terms of what's really going on in the universe. And so that brings me back to this idea that we have to really start from the human and the person, and think about the impact on people rather than focusing so much on the data itself.

Reuben: So then maybe let me ask you a tougher follow up question: So take the GDPR, and go back 50-60 years when [we] really were dealing with one-to-one database relationships, and all the actors are known. Do you think the GDPR would have been a really useful governing structure in the 70s and 80s?

Elizabeth [18:26]: I think it's still useful. I think what's happening now is—it's still useful because it's predicated on rights and those rights, theoretically, can apply even in more complex situations, it’s just we have to be more imaginative about how we apply them, and about enforcement. But I think the reason that I'm not ready to give up on it is because we have massively underleveraged the GDPR. I think we utilize a much smaller percentage of what is available to us. And this is not just the GDPR. This is true in terms of FTC enforcement, this is true in terms of—even with the FTC, look at the unfair and deceptive standards, we lean very heavily into the deceptive side of things, and we haven't enforced the unfair side of things. And so I'm very reluctant to keep layering on new laws and new laws when we haven't leveraged what we have. And with the GDPR, go back to these two core innovations. We haven't actually enforced data protection by design and default. We have not enforced core principles like data minimization, we have not enforced true data portability. There's so much that—and this is [a] limitation of enforcement—there is so much untapped potential in the law that what I'm worried about is: by trying to introduce a new law, we're kind of deferring and delaying the conversation, and we're not addressing any of the existing problems. I hope that makes sense.

Mike: Yeah, that part of your [answer] really struck me: the whole question of enforcement is one that... I think it's not by [accident] that it's not discussed. I think it has everything to do with, like you alluded to, power relationships. I think about this in a slightly different context: environmental laws (particularly in California) have relied on a couple different things to make them not just great on the books, but in real life. Citizen enforcement, that is, private attorney general [provisions], built into those, and things like the unfair business code, the unfair competition laws, which would allow for actions to be taken on behalf of the public where there's unfair competition, which was broadly defined. I think those kinds of things have allowed clean air, clean water, [and] toxics laws to be enforced where you'd [otherwise] have to rely on just a public nuisance or some other common law basis. And it's led to… frankly, the outcomes have been nation-leading, if you will. This question of enforcement is always something which good laws need, and [it's] thought about much less than the policy itself, but to me, it's like: unless it's enforced, it's essentially not even worth the paper it's printed on. I'd love to hear your thoughts about what that means, because we often [default] to looking at budgetary authority for enforcement. But it's really a broader question. It's public education, it's private attorney general provisions, and it is enforcement, as well, from relevant government agencies. So on the enforcement part of it, [would] you want to talk about what your vision is for how we get a good-to-great law in real life, making the world a better place? How do we actualize it?

Elizabeth: Yeah. And so we need to really test it. We need to start market-testing our laws in some way. It's only when we have this strategic litigation, whether it's Max Schrems, or whether it's Ravi Naik and the Cambridge Analytica folks, it's only when the theory of the law is really tested in the field [that] we get any progress. And so I do think it's really important that, in the imperfect laws that we might have here, for example, we have private rights of action, and that we have enforcement by the AGs, and that we have these different mechanisms for doing that kind of stress testing and for starting to bring clarity to some of the tremendously vague areas that are in these laws. But I think it's also recognizing that no single law or legal framework or legal domain can do all of the work. Like, for me, it's never going to be about just a data protection law or privacy law; it's going to be about competition and antitrust, it's going to be about consumer protection, it's going to be about human rights. It's going to be about this array, this composite approach, because the privacy laws and data protection laws are only as effective as the context of the market in which these companies operate. And all of those macro factors, I think, are really important in terms of the actual efficacy of people's own individual rights and their ability to actually enforce them. So since the GDPR took effect, we've seen [that] some of the most meaningful, sort of, what we call “data protection” actions have actually come from antitrust authorities and regulators in Germany and other places. So I don't think that we should expect privacy law to fix a lot of these issues.

Mike: Yeah, I completely agree. I think one thing it also triggers is our current context, globally, on dealing with COVID-19. And there are a lot of places where there are public health professionals, with all the right credentials and all the right experience, giving all the right guidance, and there are laws to back it up. And yet, that only gets us so far. And then we have to rely on people actually following—and not just following, but leading—into a new way of interacting. So it's the cultural context, frankly. And so there is an element that even the best law, the best implementation of it, and even the enforcement... you can't rely on that only. And I think it's striking, both the need for government in this current crisis... the need for government, the need for law, the need for rule of law, but also the limitations of it. And it sets the framework and the direction, and yet it can't be done without all these other things.

Elizabeth [24:55]: I completely agree. I mean, I think we're seeing this [play] out so well in... certainly in the COVID response, in terms of [how much] everyone expected the EU to be ready. And okay, look, they've got the gold standard of data protection law, surely they're ready. And it hasn't really played out that way. You've actually seen this controversy over decentralized versus centralized contact tracing apps, you've seen Apple and Google come in and set the terms of their API and tell countries what they can and can't do and collect. You've seen a huge lack of trust in the public. Every time these surveys and polls are done, people are very reluctant to download or use these apps because they don't trust what's going to be done with the data. [Contrast this with] Taiwan and South Korea and Singapore, where it's not necessarily about the robustness of a law or regulation; it's about trust in the government, and in the fact that there is civic infrastructure and public infrastructure, certainly in Taiwan, where there's a very robust culture of civic tech and civic infrastructure that's not owned by the public sector. And I think those things definitely played a huge role. And that does speak to the fact that the cultural attitudes around data, and the relative trust in, whether it's government or large companies, is a really critical piece of this that is very much overlooked. I don't know that you can engineer a law around that idea to cultivate [it] at the civic level.

Mike: Yeah, and it just speaks, once again, to your point that data, if it's real, is a way of describing something else that's real, like a person. But it's not the same. We get lost in the “is the data actually captured” [question]. And once again, I find myself [falling into] this trap. It's like, well, if it's captured by a sensor, then it's real data, right? It's not model data. It's not projected data. That's real data. That is a sensor capturing some kind of activity, [and] then there was a human associated, or some other kind of thing—CO2 emissions or whatever else.

Elizabeth: Yeah. And how is the sensor configured, and what are the other constraints? And I think we see this with the decentralized contact tracing apps where they don't know that there's a wall between people—they’re three feet apart, but on different sides of the wall. There are so many limitations there that speak to how useful that data is and what the implications are.

Reuben: I think that when you get into things like sensors and these different ways that you can collect data, it sort of leads into a conversation about the other part of data. So we're talking a lot about governance, but then there’s this very related idea of ownership (which I think you've also written quite a lot about). So maybe just as a way to kickstart that... there's been a trend lately. I say a trend, [although] it’s sort of a minor trend, towards individual data ownership. Towards this idea of building data marketplaces, and giving the user the ability to control where their data goes and who sees it. You've had some thoughts on that. I think you're on the fence about how smart that is, although I think you also keep in mind the public interest. So I'd love to hear a more fleshed-out version of how you're thinking about these tensions.

Elizabeth [28:19]: Sure. I think a lot of the [rhetoric] around “own your data” and “data sovereignty” is coming from a well-intentioned place, and [it is] aligned with, certainly, some people's interests and sometimes even fundamental rights and things like that. I just worry that when you start taking that logic a little bit deeper, it starts to fall apart really quickly. And my perspective is that… as you've mentioned, I've written about this extensively. I am really concerned about the second-, third-, and fourth-order effects of this idea of owning your data. I think, certainly, there are the very obvious logistical challenges around questions of whether data even has the properties of something that can be owned in any meaningful way, from the perspective of property law or [on a] property basis. But I think my much bigger concerns are around this question of, what are the limits of markets? I think about this question a lot. And one of the most impactful courses I ever took as an undergraduate was Michael Sandel's Justice course, where he asked, “what are the moral limits of markets?” And certainly, I'm inspired by Shoshana Zuboff's work, and others who have asked this question of whether we want everything to be subject to the market, and [whether] we want everything to be subject to propertization, or commodification, or trade. And from my perspective, there is a danger to removing all limits from markets. And I think there are good public policy reasons, human rights reasons, ethical and dignity-based reasons, to actually impose some limits on markets, and to view some things as sitting outside of what is propertizable. And I would put... not all data, but I would certainly put personal data in that category. And I can expand on that, which is to say that I think it's extremely naive to think that anyone can control their data. And I think it's also a bit reckless to put that burden on anyone, to have to manage their data. Not only do we not have a full picture of all the data implicated in just being alive today, and going about your day, but I don't think people should have to be in that position. Because some people are going to be in a much more privileged position in respect of managing [their data] than others. And I think it's going to put an undue burden on the most vulnerable, [on] the already-marginalized, [on] minority groups, and others. And I just think it's a very… frankly, a naive position to take.

Mike: Would you be advocating for kind of a policy differentiation, like an accredited investor approach? Is that what you're [suggesting], or is it more just a... ?

Elizabeth: [Laughter] No, certainly not. I don't think that fixes the problems of this propertization, this market orientation around data. I think there is fundamentally—there is a relationship between human dignity, and how that sometimes is captured or expressed in personal data. And I also think that, as I said, there are asymmetrical relationships at this stage, in terms of power, in terms of capacity, in terms of resources, in terms of capacity to engineer behavior. The asymmetries are so vast that for anyone to actually—I just don't think it's possible for anyone to effectively manage all of their data. What's “their” data? If we take a group photo, is it your data? Is it my data? I don't know. I don't find that a useful exercise. I find that a really good exercise, potentially, for lawyers who could make a lot of money off of adjudicating these claims. And I also find it a good reason for Facebook to say, “we can't give you your data, because it's not your data.” And I find it creates a lot more logistical hurdles than it solves [around] any of the meaningful problems that people have. I'll take a breath there.

Reuben: I was going to ask for just a quick clarification. So one of the things that I appreciate about the way that you write is that you tend to implicitly put everything on a spectrum, rather than saying, this is good, this is bad. It's a much more nuanced take. So [take] this idea of saying that it's naive to think that people can control their own data. I'm assuming that you're thinking about, or at least I have an intuition, that the idea of data control is also on a spectrum. Even now, you can control some of your data, while there's some data that I've just sort of accepted is gone. Where are you pinpointing this idea of people controlling their data and how are you thinking about what the limitations of that are?

Elizabeth [33:35]: Yeah, this is why I'm very skeptical of a lot of recent proposals for privacy legislation, because I think they're still in this sort of individual control paradigm, which I don't think is realistic—nor should we have to. So the way that I think about this is, when you or I walk into a building, we don't need to know that it's structurally sound, or that it's properly licensed or permitted. I mean, those things are displayed, but it's on someone else to make sure it's sufficiently safe [and] sufficiently stable. I think we're in a similar situation now with data and digital, where it is the built environment. And so I shouldn't have to vet every second of my life to make sure that it's not putting me at undue risk. And that's how it feels right now: that it's this excessive burden on the individual to do something that is just beyond [reasonable], for anyone. We're deeply in this, and I cannot tell you—I don't have an accurate picture of what's actually happening. I mean, it may be more sophisticated than some, but it's not anywhere near accurate [enough] to put that back on me. And then ultimately, by shifting that burden to the individual, I think it's also eroding their rights, because it's really taking away the responsibility from the people who have the asymmetrical power and are in a position to architect a much safer environment. So I kind of think about it from that design standpoint.

Mike: Yeah, just in terms of getting people to understand the framework using the built environment, not just [as a metaphor], but—this is just an extension, the next dimension of the built environment, if you will. I think it's so clear when you think about it that way. Then you think about public and private—there are rights, and there are responsibilities, but they accrue to (for instance) the city with the sidewalks and the paving, or the private… the LEED-certified building that's owned by Shorenstein. Whatever it is, you actually appropriately place the responsibilities and the rights, and the individual doesn't have to, like you said, worry about it. But also, it's not up to the individual to make sure the city works as it should, or as the collective feels like it should. So, I think it's really a great way to get people to understand the implications of this.

Elizabeth [36:01]: Yeah, I think one of the major concerns I've had... as you both know, I've done some work in the self-sovereign identity area. And I think one of the growing concerns I had working in that space was that I saw companies loving this notion. Why wouldn't they? Why wouldn't they love the notion of shifting the burden to the individual and offloading some of the responsibility and the liability? And to me, that's such a dystopian endgame. And I think, here we are, we were told that the robots would free up our time for better things—

Mike: So, managing your data.

Elizabeth: Right? Exactly. Not something I want to be spending my time on in particular. Nevertheless, it's really important. And so I want to shift that burden back. And there have been very few, but some—I think Senator Sherrod Brown’s recent privacy bill that basically limits data collection to what's strictly necessary and is trying to shift the burden back to the organization is potentially one of the good ideas out there. Of course, it's going to get tremendous blowback from the lobbies and so we'll see if it goes anywhere.

Mike: Generally speaking, the mark of a good bill.

Elizabeth: Right, exactly. But there's such a seductive rhetoric around, like, “own your data,” “control your data,” “self-sovereign,” but again, if you just take it a couple levels deeper, for me, it just falls apart so quickly.

Mike: Well, let [me] just, if I could… and I know we may be, for this session, running towards the end of it. But I wanted to get your thoughts on something I've been thinking about in terms of these types of data, and whether there are… we've been talking about personal data. But [whether] there are, in the context of other types of data, different ways to look at it. I'm thinking in particular of climate emissions, toxics, and other pollution, etc. The externalities, as they've been defined by economists and by, frankly, big business, of how we consume as humans and how we waste as humans. How that is treated now, and how the data associated [with it] is treated. And one of the things we've been thinking about and looking at advocating is the idea of a public right to certain types of data. That the public should have a right to know about carbon emissions. The public should have a right to know about the existence and the use of toxics. The public should have a right to know about these because those are not private. That data should not be, by law, private and held (e.g., behind data servers or any other way) from the public, because of the implications for the public. And to me, it's almost like that's a different type of data, and it requires also a different way of looking at it. Have you thought about that? Not trying to drag you into this whole other conversation. But you do think about data like few others, and I'd love to get your thoughts on how to treat these other types of data.

Elizabeth [39:15]: Yeah, I do think about that. This is one of my concerns about the own-your-data, hyper-individualistic approach: there are a lot of collective implications of data. And I think there are a lot of instances where the collective should, frankly, supersede the individual's interest in owning or withholding or controlling a piece of data, or a data set. And so again, that's another reason why I'm not a big proponent of those proposals. I think in what you're describing, Europe's taking an interesting approach there in its new strategy on data: this idea of the common European data spaces. One of the areas is the [European] Green Deal around climate and environment and energy. And I think there's certainly utility to opening up. I mean, that is one of the problems we have right now: the vast majority of that information is controlled by private interests, and it is overtaking the public interest. And we're seeing how… what's happening in New York is terrifying to me. I grew up in New York. And I think this idea of the Shock Doctrine, and of Eric Schmidt (and others) coming in to run New York in a way that is controlled by private interests and overtakes the public infrastructure, is deeply concerning. And so yeah, I would love to see some of that made available to the public and to governments. And I think there are cities that are doing some interesting work around this. L.A. is doing some interesting work around this in terms of urban planning and transportation. And I'd like to see more cities [do the same]. And then this goes back to the point earlier about places like Taiwan that have really robust civic tech, and ways of actually making some of this accessible to the public—which then allows people to actually be more involved as citizens and to make much more informed decisions and actually participate in a democracy, where right now we just don't have the good information to do that.

Mike: Yeah, and to your point, by extension: data and information are critical to democracy. And we haven't even talked about that implication, but I think it's… hopefully our listeners are locked into that one. But I do think it merits a whole other conversation. Elizabeth, this has been spectacularly interesting. But more importantly, it's a conversation that people need to engage in right now. We're super excited that you came on the show, and we're really thankful for the work you're doing. I don't know if you have any closing thoughts, but this would, I would hope, be one of a series of conversations. We, of course, look forward to your book, and we will promote that. I'm sure the kind of insights—

Elizabeth: It’s a little ways off, but I’ll try to hasten things up. One of many side projects! As I said, thanks for kicking off this series and this podcast. I think it’s an important conversation. I’d love to see a broader conversation around this, because we get stuck in the same loops. To the extent I can help broaden that conversation, I’d love to continue to participate.

Mike: You’re definitely doing that. Thanks, Elizabeth! Really great talk.

Reuben: Yeah, this was fascinating.

Elizabeth: Yeah, thanks guys, appreciate it!

Mike: Well, take care. We’ll talk soon, offline and in other contexts, but it’s been great.

Elizabeth: Sounds good. Thanks guys, take care!

Mike: Thank you!

[Closing]

Transcribed by https://otter.ai

Listen to the full episode here.

Contact:

Elizabeth Renieris (Twitter): @hackylawyER

Mike Schmitz: [email protected]

Reuben Youngblom: [email protected]