Deepfakes are taking over the Internet, distorting our perception of what's real. Law professor Danielle Citron explains how deception online harms not only individuals but also our democracy.

Transcript

MANOUSH ZOMORODI, HOST:

Hey. It's Manoush. A quick note - this episode makes a couple references to sexual violence, which may be hard to hear. Just want to give you a heads-up before we get started.

(SOUNDBITE OF MUSIC)

ZOMORODI: It's the TED Radio Hour from NPR. I'm Manoush Zomorodi, and on the show today, technology and deception.

(SOUNDBITE OF MUSIC)

DANIELLE CITRON: The word deception has particular meaning. Deception is the intentional falsehood. That is, there is something done to manipulate how people feel in the world. It's designed to change people's behavior and mislead.

ZOMORODI: This is Danielle Citron.

CITRON: I'm a law professor at Boston University School of Law, where I teach privacy and free speech.

ZOMORODI: Danielle also researches technology and cyber harassment. And one of the best examples of this, she says, is the story of a woman named Rana Ayyub.

(SOUNDBITE OF MUSIC)

RANA AYYUB: It was the 22nd of April 2018. You know, I just...

CITRON: Rana Ayyub is an investigative journalist in India who has exposed human rights abuses and government corruption.

AYYUB: You know, I am somebody who sent one of the most important ministers in the Modi government behind bars in 2010. And that man now happens to be the second most powerful man in India.

CITRON: And in April of 2018, she received an email from a source inside the Modi government, and the person said, heads-up - a video is going around about you.

AYYUB: It was, like, a 2-minute, 30-second porn video with my image morphed on it.

CITRON: It was a fake sex video. And I mean, she's got big brown eyes. It looked like her. That was Rana, you know, no questions about it.

AYYUB: When I got that video, I felt like I was humiliated. I was shamed by the people who wanted to discredit me.

CITRON: And it went viral.

(SOUNDBITE OF PING)

AYYUB: Screenshots of that video were all over my social media, on Instagram, Facebook, Twitter, WhatsApp, messages and forwards all over India.

CITRON: Within 48 hours, it's been reported that it was on, like, half of the phones in India.

AYYUB: Before I knew, it was on my father's phone, my brother's phone.

(SOUNDBITE OF MUSIC)

CITRON: Within a day after that, her home address, her cellphone number - all over the Internet. There were fake ads on adult, like, finder sites, saying that she was available for sex and this is where she lives. She was inundated with death and rape threats.

AYYUB: I think I was as good as dead for the next five days since I received that video.

CITRON: And she pretty much didn't leave her house for, like, six months.

AYYUB: And I kept asking my friend - I said, what have I done to deserve this?

CITRON: She became like a shell of herself.

ZOMORODI: And so how was that video made possible? If it wasn't her, what was it?

CITRON: It was a deepfake. Her face was inserted into a porn clip. So, you know, when I first worked on it, you know, what we knew about it was that you could insert faces into videos and use sophisticated neural networks to do that. You know, it's called generative adversarial networks that sort of insert a video and then find mistakes and then keep iterating so that it becomes pretty perfected.

But even then - so two years ago, you could sort of tell, though. You know, like, if you stared at it enough, it wasn't as good as Pixar; it wasn't as good as, you know, Lucasfilm's. And over time, what we've seen is that now we can create, from whole digital cloth, videos showing you doing and saying things that you never did or said. And with the human eye, it's really hard to tell that they're manufactured, right?
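For readers curious about the mechanism Citron is sketching here, below is a minimal, illustrative adversarial training loop - the core idea behind a generative adversarial network - written in Python with PyTorch. Everything in it is a simplifying assumption: it learns to mimic toy one-dimensional Gaussian samples rather than video frames, and the network sizes, learning rates, and step counts are arbitrary choices for illustration, not anything described in the episode.

    # Minimal GAN sketch (illustrative only): a generator proposes fakes, a
    # discriminator tries to spot them, and the two keep iterating - the
    # dynamic Citron describes. Toy 1-D Gaussian data stands in for video.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    # Generator: maps random noise to a fake sample.
    generator = nn.Sequential(nn.Linear(8, 32), nn.LeakyReLU(0.2), nn.Linear(32, 1))

    # Discriminator: scores how likely a sample is to be real (0 to 1).
    discriminator = nn.Sequential(nn.Linear(1, 32), nn.LeakyReLU(0.2),
                                  nn.Linear(32, 1), nn.Sigmoid())

    opt_g = torch.optim.Adam(generator.parameters(), lr=1e-3)
    opt_d = torch.optim.Adam(discriminator.parameters(), lr=1e-3)
    loss_fn = nn.BCELoss()

    for step in range(2000):
        # Train the discriminator: real samples labeled 1, generated fakes labeled 0.
        real = torch.randn(64, 1) * 0.5 + 3.0            # stand-in for "real" data
        fake = generator(torch.randn(64, 8)).detach()    # fakes, frozen for this step
        d_loss = (loss_fn(discriminator(real), torch.ones(64, 1))
                  + loss_fn(discriminator(fake), torch.zeros(64, 1)))
        opt_d.zero_grad()
        d_loss.backward()
        opt_d.step()

        # Train the generator: try to make the discriminator label its fakes as real.
        fake = generator(torch.randn(64, 8))
        g_loss = loss_fn(discriminator(fake), torch.ones(64, 1))
        opt_g.zero_grad()
        g_loss.backward()
        opt_g.step()

    # After many rounds, the generator's samples cluster near the real data's
    # mean (about 3.0) - each side "finding the mistakes" of the other until
    # the fakes are hard to tell apart from the real thing.
    print(generator(torch.randn(5, 8)).detach().squeeze())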

(SOUNDBITE OF MUSIC)

CITRON: And so Rana was a perfect example of - and the first one I had heard of - a deepfake sex video being used to basically drive someone out of the marketplace of ideas.

ZOMORODI: So if I go on one of these platforms right now, like, what's the likelihood that I will come across a deepfake? Or are we talking about a future that we're careening towards?

CITRON: I'm largely imagining a terrible future, but it's pretty bad here and now. Let me explain. A group called Sensity - they found that a year ago, there were 15,000 deepfake videos online. And of those 15,000, 96% were deepfake sex videos, and 99% of those 96% were of women's faces inserted into porn.

ZOMORODI: Wow.

CITRON: Fast-forward to just a year later - 50,000 deepfake videos. Again, same lineup - right? - like, mostly, over 90% deepfake sex videos and, again, same lineup, mostly all of women whose faces are being inserted into porn without their permission. And it's not just U.S. women. You know, they found that it was women from all over the world.

ZOMORODI: And, like, I guess, women's images have been altered and airbrushed for so long. And in some sense, we're already surrounded by fake images everywhere. But this is clearly taking it to a whole nother disturbing level.

CITRON: Yeah, yeah. Lies are absolutely nothing new to the human condition. But what makes this phenomenon different is sort of two things coming together. And the first is that we have this human frailty where audio and video have this power over us, especially, you know, what we see - so that if we see something, we're going to believe it. What's new is that we're in an online environment in which online platforms - their business model, their incentives are to accelerate, share and ensure that we make things go viral 'cause then we're liking, clicking and sharing, and they're making money off of online advertising. And so their business model is aligned with our worst instincts.

(SOUNDBITE OF MUSIC)

ZOMORODI: Information travels faster and farther than ever, and it does much more than just spark titillation or outrage. It changes what we believe. Conspiracy theories, new kinds of fake audio and video and algorithms working behind the scenes make knowing what's true or false harder and harder. Our sense of reality is warping. And we can see the consequences - a deep distrust in each other and our fundamental institutions like democracy. And so today on the show, technology, deception and ideas about what we can do to bring ourselves back to reality because, as Danielle Citron says, it takes just a trick of the human eye to upend someone's deeply held beliefs.

(SOUNDBITE OF TED TALK)

CITRON: Deepfakes appear authentic and realistic, but they're not. They're total falsehoods.

ZOMORODI: Danielle continues from the TED stage.

(SOUNDBITE OF TED TALK)

CITRON: Now, it's the interaction of some of our most basic human frailties and network tools that can turn deepfakes into weapons. So let me explain. As human beings, we have a visceral reaction to audio and video. We believe they're true on the notion that, of course, you can believe what your eyes and ears are telling you. And it's that mechanism that might undermine our shared sense of reality. Although we believe deepfakes to be true, they're not.

And we're attracted to the salacious, the provocative. We tend to believe and to share information that's negative and novel. And researchers have found that online hoaxes spread 10 times faster than accurate stories. We're also drawn to information that aligns with our viewpoints. Psychologists call that tendency confirmation bias. And social media platforms supercharge that tendency by allowing us to instantly and widely share information that accords with our viewpoints.

ZOMORODI: OK, so all that information leads us to believe things, whether they are indeed facts or lies, but what about the people who say that they have the right to produce deepfakes or spread other misinformation because of the First Amendment - free speech?

CITRON: It's a misunderstanding both of First Amendment doctrine and free speech theory - right? - 'cause not all ones and zeros and words are protected speech as a matter of First Amendment law and a matter of free speech values, right? Why do we protect free speech? Because it helps us figure out how to govern ourselves, because truths sort themselves out in the marketplace of ideas, because it helps us engage in self-expression, because it's a safety valve. You know, there are so many reasons why we protect free speech.

ZOMORODI: All those reasons, yup.

CITRON: All those reasons. And we could add more. You know, we got a few more. But when it comes to defamatory falsehoods - let's take the deepfake video showing someone doing and saying something they never did. As a matter of First Amendment doctrine, we chill that kind of speech. If you, with actual malice, spread a fake video of a public official doing and saying something they never did - that is, you know it's false - you can be sued for that, right? And I've been writing about this stuff for a long time, and in my book, "Hate Crimes In Cyberspace," I kind of explore how, you know, these online tools - it isn't all just the public square.

You know, the Supreme Court has a series of silly kind of - you know, this understanding of the Internet is, like, as if it's still 1996, right? Like, it's all the town square, and we're all town criers. That's foolish. What we're doing online is we're working. We're hustling for clients. We're spreading ideas. We are finding loved ones, right? We are exploring ideas. We are doing - everything that we do offline, we do it online because phones are wherever we are. And so the idea that everything that happens online is protected free speech is wrong. And it's not good for free speech values.

So the deepfake sex video of Rana - guess what? It ended up with her offline and silenced. You know, your nude photo appears in a search of your name. You are offline. You take down - this is just my experience working with victims. You literally take down all of your presence online.

ZOMORODI: You're basically canceled for something that you didn't do.

CITRON: That's right. Your private persona becomes your public persona in an unwilling way that destroys your public persona. And so it's so easy for people to say it's free speech. And, you know, I will often get the pushback, often from people who are privileged - so white men - love you - but they will say to me...

ZOMORODI: (Laughter).

CITRON: Like, you know, Danielle, like, you're a prude. Why are you making such a big deal about nude photos? We should all just put our nudes online. And I just then take a beat. I'm calm, right? I don't get mad. But I say, I'm so glad you're going to make that choice. But I'm not going to make that choice - right? - 'cause it's going to cost me and other women, women of color, trans women, gay men, trans men, you know, bi folks, like, queer folks - it's just going to cost them more.

(SOUNDBITE OF MUSIC)

ZOMORODI: In a minute, Danielle Citron on why deepfakes have the potential to undermine our democracy. On the show today, technology, deception and our changing sense of reality. I'm Manoush Zomorodi, and you're listening to the TED Radio Hour from NPR. Stay with us.

(SOUNDBITE OF MUSIC)

ZOMORODI: It's the TED Radio Hour from NPR. I'm Manoush Zomorodi, and we were just hearing lawyer and privacy expert Danielle Citron describe a recent Internet phenomenon, videos called deepfakes.

(SOUNDBITE OF MUSIC)

ZOMORODI: Walk with me into the future, where we are in a place where we don't know whether to believe anything we see. What is that like?

CITRON: So that - we call that the liar's dividend. In a world in which we can't tell the difference between what's fake and what's real, that's a real boon to the mischief-makers and the liars because they get to point to real evidence of their wrongdoing and say it's not true and get to walk away from responsibility and accountability for bad things that they've done. And we've seen illustrations of this.

(SOUNDBITE OF ARCHIVED RECORDING)

PRESIDENT DONALD TRUMP: You know, I'm automatically attracted to beautiful. I just start kissing them. It's like a magnet.

CITRON: After the Access Hollywood tape came out, President Trump said, you know, hey; I said that, and I'm sorry.

(SOUNDBITE OF ARCHIVED RECORDING)

TRUMP: This was locker room talk. I'm not proud of it. I apologize to my family. I apologize to the American people.

CITRON: A year later, he shared with a reporter, it actually wasn't me on the Access Hollywood tape. He was sort of throwing out the liar's dividend. Maybe it'll work, right? Now, for the most part, that didn't really have great traction. And it's kind of part of his brand - all of it - so that, you know, the liar's dividend - like, he tried it. It didn't work, and maybe that didn't hurt him and didn't matter.

But in an environment in which we are sort of post-truth - so that we're going to believe falsehoods if they accord with what we believe and we're going to disbelieve truths if they don't accord with what we think - you know, confirmation bias - then we're in this kind of post-truth environment. And I never thought I would say this about our own country, but political discourse feels fragile in a way that makes me feel like we're much more like a Myanmar than we are, you know, Canada. We don't feel like we're on solid ground in terms of discourse. And so in this environment, it feels so fragile, our democracy.

ZOMORODI: As we see what happens in terms of the platforms taking responsibility or laws passed or whatever sort of systemic change may or may not happen, I mean, what responsibility do each of us have - as individuals who go online a lot? Maybe - I don't know. Sometimes I say to people, like, you know, you are up against massive corporations...

CITRON: Yeah.

ZOMORODI: ...When you like and share and all that stuff. You are being manipulated. But maybe you see it differently. Maybe you think that we each have to do a better job as well.

CITRON: We do. You know, in the here and the now where there aren't laws - right? - there are very few state laws around deep - you know, digital forgeries. We are the guardian at the gate. Platforms aren't going to do it for us. Do you know what I'm saying? Like, we can't expect platforms whose incentives are to share because that's where their money is. It's on us, each and every one of us. We need to protect ourselves and our democracy. It's ours. It's ours to lose.

So I do think we have a huge role. What I'm asking is so modest. Think before you click and share. Ask yourself, is this likely? And if it's really crazy, don't you think that it's fake? You know, that's why it's there, right? It's there because it's negative and novel. It's there to feed on our salacious curiosity. Don't do it.

(SOUNDBITE OF MUSIC)

ZOMORODI: That's Danielle Citron. She's a professor of law at Boston University School of Law, where she teaches and writes about privacy and free speech. You can see her full talk at ted.com. Transcript provided by NPR, Copyright NPR.