‘Inside UVA’: This Peerless Expert Is Leading the Charge for Online Privacy
December 14, 2023 • By Jane Kelly, jak4g@virginia.edu
President Jim Ryan hosts fellow lawyer and online privacy expert Danielle Citron on his podcast, "Inside UVA." (From left, photos by Dan Addison, University Communications, and Julia Davis)
Audio: ‘Inside UVA’: This Peerless Expert is Leading the Charge for Online Privacy (31:44)
President Ryan’s final guest of 2023 is renowned online privacy expert Danielle Citron of UVA Law.
Danielle Citron, Jefferson Scholars Foundation Schenck Distinguished Professor of Law: So it’s not just the creepy – though I actually love online ads. You know what I mean? Like, they know me, man. They’re like, “OK, I see you, lady, I know what you’re buying.” Right? I’m OK with that, actually. But it is the trafficking of data and the intimacy of the data, and it’s the way that impacts our life opportunities that requires protection.
Jim Ryan, president of the University of Virginia: OK, so now I’m anxious and scared about it.
Citron: Sorry about that.
Ryan: No, it’s OK.
Hello, everyone. I’m Jim Ryan, president of the University of Virginia, and I’d like to welcome all of you to another episode of “Inside UVA.” This podcast is a chance for me to speak with some of the amazing people at the University and to learn more about what they do and who they are. My hope is that listeners will ultimately have a better understanding of how UVA works and a deeper appreciation of the remarkably talented and dedicated people who make UVA the institution it is.
I’m joined today by Danielle Citron, the Jefferson Scholars Foundation Schenck Distinguished Professor at the University of Virginia School of Law. In addition to her teaching and scholarship, Danielle is also the director of UVA’s Law Tech Center, which is working to address pressing contemporary questions in law and technology. She also serves as the vice president of the Cyber Civil Rights Initiative and sits on the board of the Electronic Privacy Information Center, among other affiliations. She’s a 2019 MacArthur Genius Grant recipient, and this year, she was elected as a member of the American Academy of Arts and Sciences, recognizing her as one of the world’s leading scholars in privacy law. Her latest book, “The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age,” is a call for the protection of intimate privacy as a human and civil right.
Today, we are fortunate to have her on the podcast and professor Citron, thanks for being here.
Citron: Thank you so much for having me. I feel so lucky to be at UVA and to be talking to you.
Ryan: Well, thanks. And do you mind if I call you Danielle?
Citron: Please! Well, you insist I call you Jim, so you have to call me Danielle.
Ryan: Fair enough. So, can you tell me what “intimate privacy” is and how it differs from regular privacy?
Citron: Intimate privacy is the way that we manage the boundaries around our intimate lives. So it’s information about, and access to, our bodies, our health, our innermost thoughts.
So we share our innermost thoughts, without realizing it, all day long. As we click, we share, we read, we browse, we text, right? It is the privacy that we afford our sexual orientation, our sexual activities, sex and gender, and our closest relationships. And in my work, I’ve made the case that – there are lots of different types of privacy, and intimate privacy isn’t the only privacy that deserves protection – but intimate privacy warrants special protection, and we should understand it as a civil and human right, because it’s so key to human flourishing.
We need intimate privacy so we can figure out who we are. We need intimate privacy so we can form relationships and love one another. And privacy isn’t me, it’s we, right? It’s drawing people close. It’s bringing companies in. But we can only do that if we trust others to be discreet, right, both with access to our physical selves, as well as to our information.
Ryan: And why is it at risk today?
Citron: You know, it’s interesting – 1890 is when Samuel Warren and Louis Brandeis wrote the famous article, “The Right to Privacy,” and they were worried about cameras and recording devices that were being installed or misused. They talked about how “what’s whispered in the closets shall not be shouted from the rooftops” – I’m just using their language. And they worried that the penny press was describing the intimate details of people’s sex lives. And they worried about domestic life – and they quite literally meant the home, right, and domestic relationships – that those relationships, including letters from, let’s say, father to son, which weren’t that interesting, needed a sense that they were just theirs. That is, we would be free to become ourselves, free to develop relationships, because we had some surety.
They called for law to protect privacy in that way. What they were writing about in 1890 was not only imperiled then; it’s even more imperiled now. And so I take Warren and Brandeis, in many respects, to be talking about intimate privacy. Sam Warren, I think, nudges his then-law partner, Louis Brandeis, to write that article because Sam’s brother, Ned, was gay. Ned was a big letter writer, so a lot of those letters are now stored in the Massachusetts Historical Society. But then, he wasn’t out. He was living in Oxford with a group of men. And he wrote a lot of letters about his interest in art and sexuality. But his sexuality wasn’t public, and it was a crime in England. I’m not sure it was a crime in Massachusetts in 1890, but it was certainly, like, socially disastrous. And there’s no question that sexual privacy, or intimate privacy, was on the mind of Sam Warren.
And so you asked, you know, how is intimate privacy imperiled today? It was, I have to say, imperiled in 1890 in ways people were really worried about – the snap camera and the emergence of mass media. But now, details of our intimate lives – who people are having relationships with, their health conditions, the prescriptions they take, their fantasies, their desires, their innermost thoughts – all of that we share all day long. It’s not just in our heads. It is now on our phones. It’s in our apps, it’s in our searching and our browsing. And in many respects – people have gotten all the information that’s collected about them from their dating apps, thanks to European law. A journalist got a file of something like 800 pages of information that her dating app, Tinder, held about her, and she said her app knew her fantasies better than she knew herself. So we are sharing so much about ourselves that we just don’t realize it – the information, what do they say, flies out of our heads and hands and we don’t even notice. Right?
When you share information with someone face to face, you know that you’re disclosing it. But because our personal data – the most sensitive and intimate personal data – is being collected and used and shared so seamlessly all day long, we need to protect it. And we deal with it woefully right now. There are just too few protections for intimate privacy.
Ryan: My guess is that this information that is being shared, too, is being shared unwittingly.
Citron: Yes. Like when we use apps, and we browse, and we search Google, we’re not saying to Google, “Go ahead, share it with the whole world.” Right? We are assuming that we have a relationship with these providers. And all these companies – the apps, the browsers, the search engines, Alexa on your Amazon device, Siri on your phone – are telling us that their products are good for us. They will make our lives better, right? Easier, cheaper, more fun, more exciting. What they’re not really telling us is that it’s truly a surveillance business. Right? We don’t realize that we – our data – are the product. Why are so many of these apps free? They’re not free. Not free at all. Right? It’s our data that’s the price. And the unfortunate thing is, even if we understood that was the bargain, we wouldn’t understand the downstream costs. And so the bargain is never good for us.
Ryan: So the data that’s being shared – tell me if this is right – is usually used for commercial purposes. So, you search for something and then suddenly ads are popping up. I admit that that’s slightly creepy and annoying, but is there a real risk that the information will be tied to you as an individual and then divulged publicly? Because I think when people think about the biggest risk to privacy, it’s that something you meant to be private is going to be told to the entire world – going back to the quote you used from the right-to-privacy paper. Is that a real risk, too?
Citron: Yes, yes. And in fact, companies would love us to think that the only risk is online behavioral advertising, right, that we’re just going to see better ads. “It’s so good for you. Come on, who doesn’t love a good shoe?” And I actually love online behavioral ads.
Ryan: How did you guess I love shoe ads?
Citron: Exactly. Right? We knew, right, we have a shared love of shoes, right Jim?
But that isn’t the real risk, right? All of this data that’s being collected about us and shared with online behavioral advertisers and marketers is then being repurposed and shared with data brokers. And data brokers are ranking and rating and scoring us, and that information is being used to make decisions about the jobs that we get interviews for. The jobs we never know we didn’t get interviews for – that’s thanks to a data broker. Right? It’s the licenses, the insurance premiums – there are so many ways in which our data is being sold by data brokers, unprotected by, let’s say, the Fair Credit Reporting Act, which only covers you if you’re understood to be a credit reporting agency. Data brokers will say, “We’re not a credit reporting agency. We’re selling 10,000 data points” – on each and every one of us, there are 10,000 data points.
You might say, “How is that interesting, right, that there are 10,000 data points on Jim Ryan, Danielle Citron, someone else?” It is because they’re scoring and rating us. So that is: I’m likely to have Type 1 diabetes, for example, or I’m likely to have thyroid disease, right? They will rank and rate and score people. “Sexually curious” – those are their categories, categories that are really sensitive, that have to do with health, sexual orientation, sexual activity, religious affiliation, political affiliation. Sometimes it’s factual – you know, there’s information that I’m of a certain religious affiliation. But many other times it’s speculative. They score us and rate us, right? Categorize us. And you might think, “OK, if it’s just for advertising, I’m not worried. Right?”
But it’s not just for advertising. And then you ask the question: Is there a peril that that information could be released online, to the public, in a way similar to what Sam Warren and Louis Brandeis were worried about? And the answer is yes, because you can go to any data broker – it’s like a people-search service. For $23.99 a month, you can subscribe to a lot of these data brokers; at least I did in writing my book. What you can find out is an enormous amount of personal information about people, right? Who they’re dating, what’s on their dating app profiles, which may not be public, their proclivities. And that information is and can be used to “dox” individuals – to post their personal information online.
So, as often is true in situations of online abuse, what will then happen is that someone will – like often cyber mobs – will then target someone, and then post their home address, their cell phone number, sometimes their Social Security number.
You might ask, “Well, Danielle, how the heck did they get that?” $29.99 buys you a lot from a people-search data broker.
And so all of this is to say that it’s not just creepy, right? It imperils our crucial life opportunities. And that may be the jobs that we never, ever get notice of – we were just not interviewed for them, right? It’s not that you don’t get the job; it’s that you never get the interview. And there are third-party hiring services that don’t view themselves as bound by law, and they use all this data. So the practical implications are profound.
There are hundreds of health data brokers. And you might say, “How is that a thing?” Right? Like we have HIPAA. Right, we protect health data. But there are data brokers that traffic in predicting our risk of disease based on our purchases at the pharmacy. So I buy alcohol swabs, right? I buy certain kinds of vitamins. I’m likely to be pregnant or have diabetes. I’m making this up, but you’re getting my drift, right?
All sorts of data is collected about us. I search, “What might be the disease if I have dizziness, high blood pressure,” like, think of what you can infer about people, right? And health data brokers sell that information to health insurance companies to figure out our premiums.
Let’s pause. That has material effect on our life, right? So it’s not just the creepy – though, I actually love online ads. You know what I mean? Like, they know me, man. They’re like, “OK, I see you, lady. I know what you’re buying.” Right? I’m OK with that, actually. But it is the trafficking of data, and the intimacy of the data, the way that impacts our life opportunities, that troubles me, that requires protection.
Ryan: So now I am anxious and scared.
Citron: Sorry about that.
Ryan: No, it’s OK. So what can be done about it?
Citron: So we should think of intimate privacy as a right owed each and every one of us – as a civil right – but one that also requires structural protections against discrimination and inequality. Because so often, given the way that gender norms and racial and gender stereotypes work, data about someone’s sex life – for a woman, and let’s say for someone who is a racial minority and female – is going to count against them. Right? So I think we should view intimate privacy as a civil right, something that isn’t just protection against discrimination – though that’s critical to it – but a right owed everybody. And come to think of it, I rely on my dean Risa Goluboff’s work on the history of civil rights – that is, we once thought of a civil right as something that we all enjoyed.
But think about what that would do for us if we understood intimate privacy as a civil right – because you asked how we protect it. In the modern vision of civil rights laws, the entities that have power over our civil rights are understood as guardians of those rights. And they have guardianship duties.
So the argument of the book is that if we think of data collectors and data hosts – whether it’s the hosting of data, or the collection, use, and sharing of our intimate information – they should consider themselves data guardians, with substantive obligations to protect it: duties of confidentiality, of loyalty, of anti-discrimination, and duties not to collect. You know, we often live in a universe where – and it’s true in the United States – the presumption is that companies can collect. We have a consumer protection model, which says, “Go ahead and collect, and unless you cause consumer harm or you lie to people, that’s when you’re gonna get in trouble.” And I think we have to flip the presumption.
If we call something a civil right, what that means is, you need a really good reason to interfere with it, right? Like Fred Schauer would say, our colleague at the Law School, that when we flip that presumption, that really good reason would mean you need a legitimate and compelling business reason or other reason to collect the data that doesn’t have a significant risk to intimate privacy. So you can’t leak, misuse, hack information you don’t have. So I think we need – the project of information privacy protection must include hard limits on collection.
And you know, you always think of Europe as having such strong privacy laws. You know what? They’re pretty thin themselves. Their laws are all built around what we call “fair information practices,” which are procedural protections – and we were actually the forerunners of that, right? In 1973, our Health, Education and Welfare secretary laid out the Fair Information Practice Principles: people should know what data is collected about them, they should have a say in what’s collected, they should be able to fix mistakes. Right? So accountability mechanisms, transparency. So the General Data Protection Regulation in the EU is a FIPPs statute. It’s a strong version of the FIPPs – it’s procedural protections – but it doesn’t have the sorts of substantive protections that we need. So I think we can do better than the Europeans, yeah.
Ryan: And is there much support for this? Or is this, I guess, something relatively few people even know about?
Citron: You know, it’s interesting. We came so close last summer. In the House Energy and Commerce Committee, there was bipartisan support for the American Data Privacy and Protection Act, which had substantive protections – limits on collection, allowing it only if legitimately needed for an important business purpose, right? There was a duty of anti-discrimination that didn’t require scienter, so disparate-impact kinds of discrimination would be protected against. Duties of loyalty.
And what happened to the bill? It had the support of 52 members of the House Energy and Commerce Committee. But unfortunately, the two California Democrats didn’t like the trade-off that comes with any very strong privacy protection, which was preemption of state law. They stood up and said, “Nope, California still has to have its power to adopt strong laws.” And their law is among the strongest state laws, but it’s not as strong as ADPPA. So Pelosi refused to bring it to the floor.
And so at the time, my colleague Allison Gocke and I wrote an op-ed for Slate, pointing out that in environmental law, there’s often a carve-out for California, the strongest state, and then preemption for every other state. So we said, “Let’s borrow from environmental law. We can placate California. We can have our cake and eat it, too, when it comes to privacy.”
But then there was a new Congress. The bill is still kicking around, but it’s sad. We had a very cool moment where there was bipartisan House support, and there was support on the Senate side – Wicker and Cantwell backed the bill. And so it’s a little bit sad. But all this is to say, Jim, that what I’m imagining isn’t fantasy. So often you think, “You’re an academic, it’s pie in the sky.” But I actually work with committee staff, you know what I’m saying? I work closely with EPIC, who had the ear of committee staff on ADPPA, and that’s how close we came. So, what do they say, “theory and practice”? I’m actually not suggesting something that is insane. We came close to it.
Ryan: So you will have another chance, I would think.
Citron: Absolutely, and the folks I work with at EPIC, the Electronic Privacy Information Center, they’re working on it on the inside. So I have not given up on this project of a much more substantive commitment and commitments to intimate privacy.
Ryan: So another topic that you’re interested in is deepfakes, and what you call “the liar’s dividend.” So talk a little bit about what constitutes a deepfake, how you got interested in it, and what the liar’s dividend is.
Citron: Deepfakery is a set of processes – generative adversarial networks, technology that enables us to synthesize video and audio, either out of whole cloth or, in a less sophisticated way, by morphing people’s faces into existing video. Right?
And you asked, like, “How’d you get interested in this?” In February 2018 – you know, I often write about the targeting of women and girls and minorities online – a subreddit pops up called “Mr. Deep Fakes.” And he’s like a programmer by day, creating fake videos by night. And the videos he creates are deepfake sex videos of celebrities – in particular, female celebrities being morphed into porn. And what then happens on the subreddit, for a whole month, is that he’s offering the technology to people, and other people are then creating more deepfakes. Some are asking, “Can I do this with my girlfriend’s face?” The answer is yes, you can. If you have a number of photos, you can totally synthesize her face into porn, or synthesize video from whole cloth using this technology.
And at the time, I was talking to Bobby Chesney, who is now the dean of the University of Texas School of Law, but then was a faculty member at the University of Texas at Austin. He’s a national security guru, and we’d always wanted to write together. So Bobby writes to me and I write to him: “I think we have our project, Bobby.” Because deepfakery right now is a problem of intimate privacy, right? It is morphing women’s faces into porn, stealing their identities, coercing sexual expression, and giving them an identity that they did not choose. People look at these deepfakes and think it’s their bodies, which it’s not. That it’s them in sexual activity. It’s not.
And as Bobby said, “Oh, this is going to be an election problem, this is going to be a problem for domestic politics and, you know, diplomatic relations.” This is a problem for democracy. I always say women are the canaries in the coal mine – we’re gesturing at larger problems.
And so we wrote a piece about what we then called, in 2018, the “looming challenges” of deepfakes. That is, the well-timed deepfake the night before an election, showing candidates doing and saying something they never did or said, which changes the election results because it depresses, let’s say, turnout, right? Or the night before an IPO, the CEO doing and saying something about the product which is not true, but very believable, tanking the stock, right? On the battlefield, deepfakes showing things happening that didn’t happen, changing how soldiers behave, how lawmakers and state actors behave, right?
So there are all these ways in which deepfakery can change the course of events, right, dictated by somebody else in really destructive ways.
And so, of course, on the one hand, you think, “All right, it’s video and audio. Why make such a big deal of deepfakes? We’ve had lies forever, right?” That was the initial objection to Bobby and me: “Why are you making a big deal about this? We’ve lied since the inception of mankind, right?” But video and audio – when we see it, it hits us viscerally. And it’s very hard to shake, much as is true in the human rights conversation. We know that photographs of human rights violations allow us to bear witness, right? It’s an incredibly important visual medium. And the same is true for video and audio: if real, it’s incredibly important for us to see what our eyes and ears are having us believe. It sits with us. That is, it’s very hard to shake an image, and you hear voices, right? Once you’ve seen something, you think, “I’ve borne witness to that. I know that’s true.”
There are two risks in that, right, with deepfakery. Really sophisticated deepfakes – synthetic audio and video – are very hard to detect, even for technologists like Hany Farid, who focuses on fake photos. They’re so sophisticated that it’s almost impossible for experts on the ground to say “this is false.” You would need contextual information – good gumshoe journalism – to figure out whether something is real or false, which takes time. So on the one hand, deepfakery makes us distrust: if we all think everything is fake, then we might just say we can’t believe anything. We already live in an era of serious truth decay. And that serves some folks, right? If it’s all fake, you can’t believe anything but what I tell you. But it also has another feature, which Bobby and I coined “the liar’s dividend”: wrongdoers with real video and audio of them doing and saying something destructive, illegal, you name it –
Ryan: They’ll just claim it’s fake.
Citron: Yeah, they’re like, “That’s not me! That’s a deepfake!” Right?
And the fascinating thing is – so we write this paper, and it comes out in 2019, and we talk about the liar’s dividend. And we had already seen a slight version of it at the time. Remember the “Access Hollywood” tape? At first, President Trump says he’s sorry, he feels badly, he shouldn’t have said it. But about six months later, when he’s in office, he starts talking about the “Access Hollywood” tape and says it wasn’t real. And that’s a perfect illustration, I think, and we’ve seen it elsewhere.
That’s not the only example where people point to real audio and video and say, “You can’t believe your eyes and ears anymore.” And that’s what we call the liar’s dividend, right? That’s the boon, the bonus, for the person who wants to say you can’t believe anything – so don’t believe what’s real and damning for me.
Ryan: And is there anything to be done to combat deepfakes?
Citron: So this is so interesting – there are some new studies, and I was very skeptical about this idea myself at first, but there are studies showing that labeling things as deepfakery can actually register in your brain as “it’s not real.” Now, I’m resistant. You know, these are new studies out of MIT that I’ve just gotten wind of –
Ryan: Can I ask you a quick question, just to clarify? My guess is someone who produces a deepfake video is not going to label it?
Citron: Of course, exactly. No, that’s the most awesome common sense I have ever heard, Jim. No, it’s true. Like you’re creating the mischievous deepfake, and you’re not smacking a label on it, right? No, no, thank you so much. That’s such important wisdom right there.
Ryan: But you’re saying even if someone comes along and labels it, it still might not work?
Citron: I think so, because of the way in which video and audio hit us, right? Because, you know, just as in our lives, Jim, like, when you recount your own memories, your memories so often come from photographs.
Ryan: Absolutely.
Citron: How we reconstruct our own lives is based on our photos and the videos we’ve seen. So they never leave us.
So you know, I’ve been working on some legislative possibilities to address digital forgeries. I’m working with Representative Auchincloss on a bill, which is coming close, that would eliminate the Section 230 immunity for platforms – a really narrow carve-out for entities that host deepfakery, intimate privacy violations and cyberstalking. I think law needs to be reintroduced into the calculus, because right now the “internet,” and I’m using air quotes, is often viewed as the Wild West – like it doesn’t deserve laws. We need to bring law into the picture. And I do think we need to address digital forgeries, or deepfakes, in a narrow, careful way.
Ryan: Right. And that is, in a sense, putting some responsibility and accountability on the providers or the host of the content.
Citron: Yep, exactly. Because right now they get to, what do they say, enjoy all the benefits and none of the costs? The costs they externalize; they don’t have to internalize them, right?
Ryan: So I have a million more questions for you and could talk about this for hours. But I promised you I would only keep you a half an hour. So I’m just going to ask you one last question, which is: How does it feel to be a certified genius?
Citron: We’re not supposed to say that, just so you know. The MacArthur Foundation hates that.
Ryan: Oh, is that right? I didn’t know that.
Citron: Or they don’t – I mean, I would say they prefer calling us “MacArthur Fellows.” But it is an honor. It’s the honor of a lifetime, and I, to this day, don’t believe it. When John Palfrey, the president, called me, he was in a room full of people. And you don’t know – obviously they fake you out to get you on the phone. They say they have to ask you about a nominee, and you’re thinking, “What is this MacArthur thing?” And I’m thinking of all my friends who might be proposed.
And so John – I said, “Oh, hello. Oh, my goodness, I have the president on the phone! That’s crazy. Nice to talk to you! Who are we going to talk about?” He said, “Actually, you.” And I was like, “Listen, this is a deepfake.” Because it’s 2019, right? And I immediately think of my –
Ryan: Oh, is that right? Was it labeled as a deepfake?
Citron: It wasn’t labeled. But I said, “This is Neil and Dan.” Like, “Neil, ha ha.” And the whole room erupts in laughter. So then you know something’s up. So I didn’t believe it at first.
And it’s really cool. It gave me a lot of confidence. They always ask, what did it do for you? Besides, of course, the money, which comes with no strings. But I think what it does is meaningfully give you the confidence to keep doing your work – like, “Hey, you’re doing good, kiddo.”
Ryan: Right. Well, congratulations on that. And given the problems that you’re tackling, it’s good that we have a genius working on them on behalf of all of us.
Danielle, I want to thank you so much for your time and for all the work that you’re doing. It’s incredibly interesting, slightly frightening, but also unbelievably important.
Citron: Thank you. It’s an honor to be on your broader faculty. I appreciate it.
Aaryan Balu, co-producer of “Inside UVA”: “Inside UVA” is a production of WTJU 91.1 FM, and the Office of the President at the University of Virginia. “Inside UVA” is produced by Jaden Evans, Aaryan Balu, Mary Garner McGehee and Matt Weber. Special thanks to Maria Jones.
Find “Inside UVA” on Apple Podcasts, Spotify or wherever you get your podcasts. We’ll be back soon with another conversation about the life of the University.
If you are not acquainted with Danielle Citron’s groundbreaking scholarship, here is your chance to learn more.
Citron, a 2019 MacArthur Fellow – aka “Genius Grant” winner – and an adored teacher at the University of Virginia’s School of Law, is so prominent that you’d be hard-pressed to read an article in any major news outlet about deepfakes and online privacy that does not quote her. She’s been interviewed by The Guardian, NPR and Slate, to name a few.
Appearing on “Inside UVA,” Citron is UVA President Jim Ryan’s final podcast guest of 2023.
In her newest book, “The Fight for Privacy: Protecting Dignity, Identity and Love in the Digital Age,” the Jefferson Scholars Foundation Schenck Distinguished Professor of Law makes the case that intimate privacy is a human right. The issue cuts straight to one of society’s growing, invasive phenomena: revenge porn.
“Deepfakery right now is a problem of intimate privacy,” she told Ryan. “It is morphing women’s faces into porn, stealing their identities, coercing sexual expression, and giving them an identity that they did not choose.”
She told Ryan, “We share our innermost thoughts, we don’t realize, all day long, as we click, we share, we read, we browse, we text.”
In so doing, that private information now resides in our phones, our apps and our search and browsing histories, where companies can harvest it. Not only are companies using that data to target consumers with personalized advertisements, Citron said, there are data brokers who collect and sell users’ information to hiring agencies and insurance companies, for example.
She offered Ryan a hypothetical. If she were to search for diseases that cause dizziness or high blood pressure, “health data brokers sell that information to health insurance companies to figure out our premiums.”
The possibilities, she said, are seemingly endless.
“It is the trafficking of data, and the intimacy of the data, the way that impacts our life opportunities, that troubles me, that requires protection,” she said.
To hear Citron’s legal solutions to this growing dilemma, listen to “Inside UVA,” which is streamed on most podcast apps, including Apple Podcasts, Spotify or YouTube Music.