Reporters Reveal 'Ugly Truth' Of How Facebook Enables Hate Groups And Disinformation
In a new book, Cecilia Kang and Sheera Frenkel say Facebook failed in its effort to combat disinformation. "Facebook knew the potential for explosive violence was very real [on Jan 6]," Kang says.
TERRY GROSS, HOST:
This is FRESH AIR. I'm Terry Gross. A new book investigates Facebook's failure to protect against spreading hate speech, disinformation, conspiracy theories and calls to violence. The book also shows how Facebook became an advertising company, monetizing its users and their data. The book is called "An Ugly Truth: Inside Facebook's Battle For Domination." My guests are the authors, Sheera Frenkel and Cecilia Kang, who are reporters for The New York Times. Frenkel covers cybersecurity and is based in San Francisco; Kang covers technology and regulatory policy and is based in Washington, D.C.
The book focuses on the period between the 2016 presidential campaign and the January 6 insurrection, a period in which Trump became one of Facebook's most profitable users and his campaign became one of the platform's most profitable advertisers. The authors say it was also the period in which it became clear Facebook was unprepared to deal with a political leader like Trump, who used the platform to spread misleading and false information. We recorded our interview yesterday morning. Sheera Frenkel, Cecilia Kang, welcome to FRESH AIR. And congratulations on the book.
CECILIA KANG: Thank you.
SHEERA FRENKEL: Thank you, Terry.
GROSS: So Trump and his followers created many headaches and nightmares for Facebook, which was unprepared for an American president spreading these falsehoods. But Trump was also a gold mine. So how did Facebook profit from Trump's falsehoods spread on Facebook?
FRENKEL: Well, Trump had over 30 million followers by the time he was kicked off Facebook. He was a major draw for people all over the world to come to Facebook and hear what he had to say about the day's news. He not only brought audience and relevancy to Facebook, he created this constant, churning stream of information that people couldn't take their eyes off of. And ultimately, that's what Facebook needs to stay relevant.
GROSS: Even if you repost one of Trump's messages and add a critical comment, you're still amplifying his post.
KANG: Absolutely. What you're doing is exactly what Facebook wants: You're engaging with his content, and you're engaging with the website. And that's really the core of the business - to get people's attention and get them to engage and be active.
GROSS: So Facebook profited enormously from Trump's campaign spending. How did the campaign use Facebook tools to maximize its reach?
FRENKEL: This is Sheera. The Trump campaign really used Facebook in unusual and unprecedented ways for a political campaign. They did something that had previously not been done by politicians in using Facebook's incredibly targeted advertising, both to reach people and to do a kind of A/B testing of which messages worked best. So, for instance, they would send out two, three, four versions of the same message, and then whichever one they saw in real time reaching people and being amplified by people, that's the one they would double down on and put more money into. Once Trump did this, I would note that politicians all over the world followed suit. And they discovered that Facebook was an incredibly effective way of telling people exactly what they wanted to hear.
GROSS: Now, you write that Facebook employees were embedded in Trump's New York City campaign headquarters to help riff on Hillary's daily speeches and target negative ads to specific audiences. Facebook employees were embedded in the Trump campaign?
FRENKEL: So interestingly, this is something Facebook actually offered both campaigns. The Hillary Clinton campaign turned them down and said that they didn't want that kind of assistance. And so, yes, the Trump campaign ended up with Facebook employees in their offices advising them on how to best use Facebook tools, much as Facebook would for a major advertiser like Pepsi-Cola. In real time, they could tell the campaign: This tool is working for you, this one is not; this messaging is working for you, and this one is not.
And because Facebook's algorithms are so sensitive and because Facebook's tools are so precise, they could even tell them things like, well, we think that this is playing well in this demographic or in that part of the country. And that trove of data was so important to the Trump campaign in understanding who their voters were.
GROSS: So do you think Facebook employees helped the Trump campaign amplify misleading or outright false information in Trump campaign ads?
FRENKEL: You know, Facebook is always really careful in saying that they're a neutral platform, as are their own employees. And so whatever the content was that Trump was amplifying - whether it was misinformation, conspiracies or just, you know, outright false information - Facebook employees helped it regardless of what the content was. And often they didn't even look at the content themselves. They just gave the campaign the data: This messaging is working, and that messaging is not.
GROSS: Tell us more about the rationale behind that policy of not fact-checking political ads when you know that some of that information is outright false. And what was the debate within Facebook about fact-checking political ads?
KANG: I think you have to actually go back to the creation of Facebook to understand their policies on speech and the belief that the CEO and co-founder, Mark Zuckerberg, had from the founding of the company that more expression - that freedom of expression - was going to be the bedrock policy for the company. And he also understood that engagement was really important. From the earliest days, when he was at Harvard, he told people he really wanted to create a site where people just sort of mindlessly scroll and are online, constantly posting and reading each other's posts. He understood the power of attention and engagement.
And Mark Zuckerberg also does truly believe in the idea of freedom of expression. What we've seen and what we reveal in this book is that that policy and his philosophy toward speech have really evolved, and that Donald Trump really tested him. Donald Trump was, in many ways, the person who surfaced all of the things embedded and core to the platform and the business that were problematic.
FRENKEL: I just wanted to add one thing, which is that if anyone listening remembers MTV and how popular that was in the 1990s, that's what Zuckerberg was really basing this idea on. The same way that you would sit and for hours with your friends watch MTV, he wanted people on Facebook. Only here, Facebook was collecting data about them in real time. And as they mindlessly scrolled, Facebook was amassing more and more information about them.
GROSS: So has the policy about political advertising changed? Is there any fact-checking on Facebook now?
FRENKEL: Facebook still allowed politicians to post ads without being fact-checked. And in fact, politicians could say things in advertisements that the average Facebook user could not. They did change other things on the platform. For instance, they created an ad library where you could search for ads and see what politicians were posting. And that was a level of transparency they hadn't previously had. However, they doubled down and remained firm in their belief that politicians could say things in ads without the benefit of a fact-check.
GROSS: Sheryl Sandberg was first brought on to help grow and monetize Facebook. What had she done when she was at Google to maximize advertising growth there?
KANG: Sheryl Sandberg was an incredible success at Google. She created what was really one of the earliest big behavioral advertising business models, which became known as AdWords and AdSense at Google. She created a multibillion-dollar business for Google. So she was well-known by the time Mark Zuckerberg was looking for a partner to really build the business and expand it and refine it.
So when Mark Zuckerberg met Sheryl Sandberg in December 2007, there was a real meeting of minds. They both looked at each other with a lot of interest because they knew that each could offer the other something different: Mark Zuckerberg was the technology visionary, and Sheryl Sandberg was absolutely the business mind for the company.
Sheryl Sandberg came in, and within her first weeks, she had called a couple meetings with some of the biggest executives at Facebook at the time. And they refined the business. And they realized from that moment forward that they would pursue a behavioral advertising business.
GROSS: Let me reintroduce you here. If you're just joining us, my guests are Sheera Frenkel and Cecilia Kang, authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." They're both reporters for The New York Times. We'll be right back. This is FRESH AIR.
(SOUNDBITE OF JAKE MASON TRIO'S "THE STRANGER IN THE MIRROR")
GROSS: This is FRESH AIR. Let's get back to my interview with the authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." Cecilia Kang and Sheera Frenkel are reporters for The New York Times. Kang covers technology and regulatory policy. Frenkel covers tech and cybersecurity.
So Donald Trump, as both candidate and president, challenged Facebook's standards of speech because there were so many mistruths and out and out falsehoods or lies that Trump posted. So what were the standards before Trump for taking down a post?
FRENKEL: Before Trump, Facebook tried to implement fairly universal standards about what it would take down. It created rules in its content moderation policy that applied to everyone evenly, or so it declared. And we saw Trump, at the very beginning of his candidacy, when he was first running for president, run up against those rules when he posted on Facebook a call for a ban that, he said, would make sure that no Muslims entered the United States, a post that within Facebook was seen as possible hate speech. Facebook's own employees went to Mark Zuckerberg and Sheryl Sandberg and said, you know, we think this violates our policies. Shouldn't we be taking this down? And at that moment, when Facebook decided to leave up his post and start to carve out a new policy for Trump, they were essentially creating a separate class of user on Facebook. Now, it would take years for them to come around and kind of declare this officially and to formalize it. But that first step showed that Facebook was really carving out something for important people, for VIPs on the platform, that the average user didn't get.
GROSS: Once that policy was formalized, what was it, and what was the justification they offered for it?
KANG: So they called it the newsworthiness exemption. The way that Facebook and Mark Zuckerberg described it is that political figures deserved this special class of, really, an exemption from the hate speech and other speech policies they had, because political speech was of public interest and importance for the world to know. And Mark Zuckerberg said in a speech at Georgetown in 2019 that he believed political speech was the most scrutinized speech. Mark Zuckerberg, in his view of free expression, has a belief that more speech will actually drown out bad speech. So his view was that even if there were lies, lies from a politician such as Donald Trump, the public would respond with their own fact checks of the president, and those fact checks would rise to the top. And that would, in a sense, neutralize any of the problems with political figures.
GROSS: But I'm sure the people at Facebook saw it wasn't working out that way. So how did they respond to the fact that lies were often winning out?
FRENKEL: That was so difficult for Facebook and specifically Mark Zuckerberg to contend with. And that's something that was really interesting for us in writing this book: showing, over and over again, their ideas being disproven. You know, this idea that people would reject falsehoods, reject misinformation, reject conspiracies. Despite Facebook's own algorithms showing that that wasn't happening, they continued to persist in this idea. And really, up until the end of Trump's presidency, Mark Zuckerberg and Sheryl Sandberg were still defending the idea that Trump and, really, political leaders all over the world could and should say things on the platform as they wished. And people could and should respond as they wished. They failed to see what their own employees were telling them. And for us, one of the most fascinating things in the book was talking to employees within Facebook who were raising the alarm again and again and again and saying, this is a problem. You know, we are spreading misinformation. We are letting the president spread misinformation. And it's being amplified by our own algorithms. Our systems aren't working the way we predicted, and we should do something. And yet, you know, Mark Zuckerberg and Sheryl Sandberg stayed the course.
GROSS: Well, you have a lot of insights into this, but I'm not sure you can actually answer it. But I'll ask: Do you think that Sandberg and Zuckerberg were defending their ideal of what free speech should mean on Facebook? Or do you think they were trying to protect Facebook's profits?
FRENKEL: You know, our sources admit it's a little bit of both. I think publicly Mark Zuckerberg and Sheryl Sandberg really hold tight to this idea of defending basic, you know, free expression, freedom of speech. That's a really strong public position for them to take, especially here in the United States, where that's core to our identity as Americans. But when you talk to people within the company that are part of that business arm and part of that policy arm, they say there was also a political calculus and, really, a monetary calculus of this just being good business for them.
GROSS: What was the Facebook policy about hate speech, and how was that tested during the Trump years?
FRENKEL: You know, what's interesting is that Trump brought home the problems of hate speech that Facebook had been facing all over the world. Here in America, we might forget, but in India, in Myanmar, in the Philippines, in Sri Lanka, people have been dying because of hate speech on Facebook for years. It has led to real-life consequences and real deaths. Here in the United States, we only began to see how that hate speech could lead to a growth in extremist movements and fringe groups with the Trump presidency because the president himself was amplifying hate speech. He was pointing to militia movements. He was pointing to theories by QAnon, the conspiracy group. And it was being amplified on Facebook. And within Facebook's own security team, the experts who study extremism were saying over and over again, we are seeing extraordinary growth of these movements. We are ourselves frightened by the way the far right has grown on Facebook during these years.
GROSS: Were the algorithms failing to detect hate speech? Was that the problem?
KANG: The algorithms are catching up. We have to remember that the scale of Facebook is nearly 3 billion users around the world. The amount of content that courses through the platform every day is just so enormous. Facebook has put in AI, artificial intelligence, as well as hired thousands of content moderators to try to detect this. But they're really far behind, and they've only really started taking this seriously since public scrutiny has shone a spotlight on the company and there is demand for change within the company. So our reporting shows, and the people inside say, that they really do feel like they are racing to catch up.
FRENKEL: And I would just add that a lot of this hate speech is happening in private groups. This is something Facebook launched just a few years ago - this push toward privacy, this push toward private groups. The idea being that people wanted to be in small, intimate groups with like-minded people. But what happens when those small, intimate groups are QAnon or when they're militias? Everyone is like-minded, and so no one is reporting the content. In some cases, it's not a matter of Facebook's algorithms not finding things. It's a matter of Facebook creating these kinds of secluded, private walled gardens where this kind of talk can happen, where hate speech can happen, and it's not being found.
GROSS: But were the tech people aware of what was happening with hate speech?
FRENKEL: They were. I mean, Facebook has a really fantastic security team. These are experts. They hire from the NSA. They hire from the FBI. They hire people who are really at the forefront of their fields. And in reporting the book, so many of the people I spoke to said, you know, in government intelligence, we only wish we had the kind of insight that Facebook has. We collect more intelligence and more data at Facebook than any government official could possibly hope for. So it wasn't that Facebook's engineers, their security team didn't see the growth of these movements. It's just that, really, Facebook's own policies kind of tied their hands behind their backs in terms of what they could do.
GROSS: What were the policies that tied the hands?
FRENKEL: So when it comes to hate speech, there's not a firm line in the sand about what hate speech is. It's a very nebulous and ever-changing thing. One person might say something and it might be seen as a joke, and another person might say it and it's hate speech - it's meant to inspire hatred. It's something that really needs to be looked at by human beings to understand. And Facebook overwhelmingly relies on algorithms and AI to find things. And when you're training your systems on AI, you're building in an inherent sort of flaw, in that AI is just not going to find everything. And Facebook itself acknowledges that its AI isn't effective with hate speech.
KANG: The other thing I would add, Terry, is that these policies are oftentimes being created on the fly. Facebook has not looked around the corner at things such as doctored videos and deepfakes. They are creating policies in real time. And one example that we really spool out in the book is when House Speaker Nancy Pelosi spoke at a conference and somebody posted a doctored video of her in which she appeared intoxicated. And within Facebook - we take people inside the room where there is much debate within the policy team, the executive ranks, and with engineers as well over what the policy should be for a doctored video that is obviously false but that, according to the broad umbrella definition of free expression the company abides by, could be permitted.
I mean, there were discussions about whether this looks like an "SNL" parody, whether this is actually going to lead to disinformation related to politics in the election. And ultimately - and I think this is a really important pattern that we discovered and document in our book - Mark Zuckerberg is often the one who makes the final call on these important policy decisions.
GROSS: So the decision that was made by Zuckerberg, you say, is to leave up doctored videos. Is that still the policy?
KANG: That's a great question, because it's unclear. They make these very ad hoc decisions. And in the case of Pelosi, Mark Zuckerberg did decide to leave up that doctored video, but he made this kind of strange distinction between deepfakes and doctored videos, which his own team is still struggling to answer. Where is the line between something that has been altered enough that it is not reality versus something that has just been doctored? They've essentially set the stage for them to have to make these one-off decisions over and over and over again.
GROSS: So after Pelosi and her team asked Facebook to take down this doctored video and Facebook declined to do that, that's when Pelosi stopped talking to Sheryl Sandberg.
KANG: Yes. From what I understand, there is a moratorium on talking to Facebook at all within the speaker's office. There is a lot of deep resentment about these policies, which don't have a clear throughline. They don't have consistency from the point of view of many political leaders.
GROSS: So what was Facebook's final resolution about how to deal with the Pelosi doctored video?
KANG: After 48 hours, Facebook decided to slow the spread of the video. That was their resolution, their remedy. They decided to essentially suppress the ranking of the video so that it wouldn't spread too quickly across the internet.
GROSS: Let me reintroduce you both. If you're just joining us, my guests are Sheera Frenkel and Cecilia Kang, authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." They're both reporters for The New York Times. We'll be back after we take a short break. I'm Terry Gross, and this is FRESH AIR.
(SOUNDBITE OF MUSIC)
GROSS: This is FRESH AIR. I'm Terry Gross. Let's get back to my interview with New York Times reporters Cecilia Kang and Sheera Frenkel, authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." It investigates Facebook's failure to protect against becoming a platform spreading hate speech, disinformation, conspiracy theories and calls to violence. The book also shows how Facebook became an advertising company, monetizing its users and their data.
Less than three months before the election, one of Facebook's cybersecurity experts, Ned Moran, discovered DCLeaks, a page on Facebook created by Russians. Describe the page and Moran's reaction when he found it.
KANG: Ned Moran was someone who came from government intelligence, and he was someone who was trained to look specifically for Russian operations. And yet, when he found the DCLeaks page, he was surprised. He had thought that Russia might be interested in trying to do something during the 2016 presidential elections, but he was surprised it was so blatant. And in the weeks that followed, he watched in real time as Russian agents tried to feed emails from the Clinton campaign to American reporters and went even a step further to try to influence their coverage. In the book, he describes watching as what he knew to be a Russian agent tried to shape that coverage, saying, well, you know, Clinton's going to give a rally on this day. If you drop the story right before she goes onstage, it might affect her. It might affect her polling numbers. Reporters might have to ask her about it. So you really saw an incredibly strategic and aggressive campaign by the Russians on Facebook, using Facebook.
GROSS: And he also found that Fancy Bear hackers from Russia had stolen 2,500 documents by hacking into the Soros Foundation, and they were trying to get journalists to publish those. And then, in the summer of 2017, the cybersecurity team at Facebook discovered the Russian Internet Research Agency, which had, like, thousands of bots - right? - and fake pages.
KANG: Yes. I mean, I think people forget that there are really two separate things that happened during those elections. One was the Russian hackers who were working for the government and who were stealing those Clinton emails and getting American journalists to write about them. Separately, the IRA, the Internet Research Agency, was running a number of bots and buying advertisements on Facebook's own platform to create really divisive emotional content meant to divide Americans. Facebook had been hearing for nearly a year at that point that Russia was buying ads, but they had not been able to find them. And it took until that summer, the summer of 2017, for Facebook to finally find those ads. I think in one of the more memorable scenes in our book, we have a Facebook PR person telling journalists that there were no Russian ads bought on the platform during the elections at the same time that, just down the hallway, the security team was starting to find those ads.
GROSS: So what was Facebook's reaction? What was the executive leadership at Facebook's reaction when the cybersecurity team reported what they were finding about Russian hacks and Russian bots and Russian disinformation campaigns?
KANG: At this point, you know, I think people know that Facebook took a long time to inform the U.S. public about what they knew. They took almost a year to tell the American public what they knew about Russian election interference. We were shocked, to be honest, when we were reporting, at how often they delayed going public and how often they went back to their security team and said, well, let's dig around more. Let's find more. Let's wait a little bit longer. They did not reveal the extent of what Russia had done until September 2017, even though, six or seven months earlier, their own security team was urging them to go public and tell people what had happened.
GROSS: Well, you report that Facebook removed the Russian section from a security report. What did they remove, and why?
KANG: This was personally an interesting reporting point for me because I had written about the white paper that Facebook published in the winter of 2017 for The New York Times. At that point, I had heard from sources within Facebook that there had once been an entire section of that report which touched on Russia and which revealed that Russia had, in fact, interfered in the elections. And I went to Facebook. I asked for comment. And I said, you know, I'm hearing these things; did you take out a Russia section? And I was emphatically told that that was not the case, that there was never anything about Russia in the white paper. And it was only in reporting this book that we discovered there were multiple drafts of the paper that had very long sections on Russia. And in fact, there was a great amount of debate within the company about whether or not to include it.
GROSS: So what was taken out, and what was the rationale for it?
KANG: They took out the paragraphs that dealt with Russia. They took out the implications that Russia had been involved in election interference. And the rationale was, well, we don't know enough yet, and this isn't the right form in which we should go out with what we do know, and we should brief members of Congress first, we should perhaps brief intelligence officials first. And so it was, again, a case of Facebook really kicking the can down the road and telling its security team, why don't you go back and find more first?
FRENKEL: I would just add that Facebook today also emphatically says that they did include Russia because they note there is a footnote in the white paper where they link to a DNI report which has a reference to Russian interference. But nowhere in the white paper is the word Russia mentioned.
GROSS: At some point, Zuckerberg said he'd work with Congress in its investigation into Russia and turn over Russian ads that were on Facebook, saying he didn't want Facebook tools used to undermine democracy - that's not what Facebook stands for. What did he actually hand over? And was that everything?
KANG: So Facebook does eventually hand over more than 3,000 ads that were purchased by IRA-connected entities on Facebook. And they are all meant to sort of cause chaos and discord around the election. They give these ads and these images to a committee that's investigating election interference. What's really notable in the book is that the lobbyists who hand over this information initially really try to show political neutrality among the ads. They try to actually curate the ads and give the impression that the Russians who bought these ads were not really favoring one candidate over another but were neutral in this, which the committee investigators found ludicrous. It was that kind of controlling of the message that I think has really quite angered members of Congress.
GROSS: At some point, Twitter started fact-checking Trump tweets and putting warning labels on false messages or, you know, totally misleading messages. And this was around the time of the election. How did that affect Facebook 'cause, you know, Twitter is a competitor of Facebook?
FRENKEL: Twitter doing that somewhat forced Facebook's hand to become more aggressive in their labeling. For a little while, Facebook had been experimenting with these labels that often directed people to an information center where they were trying to provide more accurate information about the elections or about COVID. But the labels themselves were often confusing. People didn't know what to make of a label that said, for more information about the election, please visit our - you know, and then a link. It didn't say something was false. It didn't really clearly state that something was misleading the way that Twitter's labels did. And so after Twitter really became more aggressive in labeling the Trump posts, we saw Facebook start to change their labels as well. And the language of those labels started to really shift to say this is actually misleading content. There's information provided here that isn't accurate.
GROSS: Let's take another break here, and then we'll talk some more. If you're just joining us, my guests are Sheera Frenkel and Cecilia Kang, authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." They're both reporters for The New York Times. We'll be right back. This is FRESH AIR.
(SOUNDBITE OF AMY RIGBY'S "PLAYING PITTSBURGH")
GROSS: This is FRESH AIR. Let's get back to my interview with the authors of the new book "An Ugly Truth: Inside Facebook's Battle For Domination." Cecilia Kang and Sheera Frenkel are reporters for The New York Times. Kang covers technology and regulatory policy. Frenkel covers technology and cybersecurity.
January 6 was a very violent day in the Capitol. And so many people could see what was leading up to that, and they could see it on Facebook and on other social media. How was Facebook used by the people who planned January 6 and by those who joined in or led the riot and broke into the Capitol building?
FRENKEL: The seeds for what happened on January 6 were sown very early on - really, I would say, the day after the elections. There were people forming Facebook groups called Stop the Steal. We, as reporters, were watching those groups, and we were astounded. I had never seen a Facebook group grow so quickly, adding thousands of users within hours to this group in which they were sharing all sorts of falsified videos and documents about election fraud and really, really churning up anger around this idea that the election had been somehow stolen from Donald Trump.
While Facebook took action on some of those groups, some of the stop-the-steal groups, they allowed others to persist. And, of course, Donald Trump was still on the platform using that moniker, saying stop the steal and claiming the election had been stolen from him. And so within Facebook, they were seeing that they were really not that effective in stopping that idea from spreading and that in the lead-up to January 6, people were getting more and more and more angry. And they were organizing themselves to come to Washington and to march.
The day before, Facebook security officials were warned by reporters that there were Facebook groups in which people were posting photos of assault rifles and openly discussing how they were going to bring guns to Washington for this march. And they knew the potential for violence was very, very real, which is why, on that day, Facebook officials gathered to watch what was happening in Washington and to monitor those very groups. They even discussed at one point whether Zuckerberg should call Trump ahead of time. Ultimately, they decided not to because they were worried that it was going to leak to the press that they might do so. But it's very clear from our reporting that Facebook knew the potential for explosive violence was very real that day.
GROSS: Was there a debate within Facebook about whether to do anything to stop this kind of potentially violent organizing on Facebook?
FRENKEL: There was. The security team was constantly debating with other parts of the company about what should be done. And I would note that with their QAnon policy, for instance - I think that's a very interesting one to look at, QAnon obviously being a conspiracy group here in the United States, which really took off during the Trump presidency and has millions of people who believe in this idea of a vast sort of cabal of global elites that are really controlling the world. And while Facebook started to make moves towards banning them - they took down some of the groups; they took down some of the accounts - it took several months of seeing that the group was still spreading before Facebook actually took action to ban the group entirely.
And even then, things slipped through. And Facebook's security team was telling its own officials, well, we're not being effective. We're letting them continue to spread and recruit new members. And in these months that are going by, they're organizing on other platforms. They're telling their own Facebook groups, hey, if we get taken down here, come follow us over here on YouTube, or come follow us over here on a messaging app like Telegram. And so they were organizing ahead of time for the planned removal.
GROSS: After January 6, when executives at Facebook saw what happened at the Capitol and saw that Capitol Police were killed by the mob and that the mob breached the Capitol, that they were saying, hang Mike Pence, that they were going after Nancy Pelosi and others - what was the conversation like inside Facebook, and what action did Facebook take?
FRENKEL: There was immediate sort of understanding that this was a watershed moment and that they were going to have to have the discussion they'd dreaded having for a very long time, which is, do they remove Donald Trump? And we see them debate that. We see them go back and forth. And really, it's not until Twitter takes action to ban Trump that Facebook sort of makes its announcement, at first that it's a temporary suspension. It's very unclear and muddled. Their messaging is, well, we're removing him for now, but we're going to reevaluate. And ultimately, it's finally announced that they're going to suspend the account, but they're going to refer it to the Facebook oversight board. They were essentially really, again, kicking the can to someone else and saying, we've created this outside body; we're going to allow them to rule on whether or not we should have removed Donald Trump.
GROSS: And at first, the ban was, like, for a couple of weeks. Right? And then that was extended.
KANG: That's right. The ban was for a couple weeks. The language was quite interesting - indefinite and temporary is the way they described it. They referred it to this body that they describe as a Supreme Court, a third-party body that makes decisions on content. Interestingly, months later, that body, the Facebook oversight board, kicked the decision on Trump back to Facebook. And they said, Facebook, you don't have policies that are clear enough on this kind of political speech and taking down an account like Trump's; you have to write those policies. It was actually a pretty smart move by the Facebook oversight board. So currently, the final decision on Trump is in the hands of Facebook. They have said that Trump will be banned for at least two years, and that two years expires, essentially, ahead of his ability to campaign again in 2024.
GROSS: Has Facebook clarified its policy on political speech that is not true, that is inflammatory, that is hate speech, and also its policy on people who amplify that and who threaten to show up with guns and, you know, breach the Capitol building? I mean, Trump might have, you know, started the fire, but people were stoking it.
FRENKEL: They have not clarified that policy. And really, Trump stepping down from office has helped them avoid discussing it. The daily questions that they used to get from reporters are no longer being received by Facebook executives, but that's really just here in the United States. We have to remember that in countries all over the world, this is still a huge problem. There are elections coming up in India. There are elections coming up in a number of countries where the current head of state is very active on Facebook and uses Facebook much in the way that was modeled by Donald Trump. And so by avoiding answering this, by avoiding coming up with a cohesive policy, they've - you know, they've extended the problem. And millions of people all over the world are being affected in democracies that are being threatened by populist leaders using Facebook.
GROSS: What are the changes to Facebook policy that have happened since the Trump administration?
KANG: Facebook now is coordinating much more with governments and with intelligence officials within governments. Every month, they report publicly about disinformation and what they've found. They're trying to be more transparent, and they're also trying to coordinate with other technology companies on what those companies are seeing across the internet as well.
FRENKEL: While Facebook has made huge strides in how it reports publicly and transparently about disinformation and has hired, as they say, more than 30,000 people to work in their security apparatus, they still struggle with misinformation, which - the difference there is really interesting, right? One is spread intentionally by a government or by another body to try and influence people. Misinformation is really just bad information shared among people, Americans telling other Americans that the vote has been stolen. And on that, they still don't know what to do, and that's really what's becoming prevalent not just here in the United States but in countries all over the world.
GROSS: Is there a precedent from another social media company about how to deal with that?
KANG: The social media companies are all struggling, and they're creating policies as they go. I will say that Twitter - though it's much smaller, we do have to remember, compared to Facebook, especially when you put Facebook together with its other apps, WhatsApp and Instagram - is willing to be more experimental. It's quite public in its approach and in the writing of its policies. I'm not saying that they've got it completely right. YouTube is still very far behind. These social media companies are all struggling with how to handle misinformation and disinformation. And along the lines of misinformation, it is a very current and present danger. Just recently, the chief of staff of the White House, Ron Klain, was saying that when the White House reaches out to Americans and asks why they aren't getting vaccinated, they hear misinformation about dangers with the vaccine. And he said that the No. 1 place where they find that misinformation is on Facebook.
GROSS: Is Facebook trying to do anything about that?
FRENKEL: Facebook and Mark Zuckerberg himself have said that they will not tolerate misinformation about COVID. However, I will note that just today I was curious, and I went to Facebook, and I checked a couple of different groups which I am a part of and which I track for misinformation, and I saw quite a few conspiracies shared about vaccines causing all sorts of problems, whether fertility or otherwise. I will note that scientists say that none of those problems are being documented. And they were sharing videos which had obviously been doctored. They were sharing very experimental information about what could cure COVID. And so just today I saw that the very type of misinformation that Mark Zuckerberg and Sheryl Sandberg said they wouldn't tolerate about COVID is still online and very active on Facebook.
GROSS: Let me reintroduce you both. If you're just joining us, my guests are Sheera Frenkel and Cecilia Kang, authors of the new book, "An Ugly Truth: Inside Facebook's Battle For Domination." They're both reporters for The New York Times. We'll be right back. This is FRESH AIR.
(SOUNDBITE OF HIOR CHRONIK'S "WE ARE ALL SNOWFLAKES")
GROSS: This is FRESH AIR. Let's get back to my interview with the authors of the new book, "An Ugly Truth: Inside Facebook's Battle For Domination." Cecilia Kang and Sheera Frenkel are reporters for The New York Times. Kang covers technology and regulatory policy; Frenkel covers technology and cybersecurity.
So as you were wrapping up your investigation for this book, there were several suits filed against Facebook, one from the Federal Trade Commission. There was a group of over 40 state attorneys general which filed suit against Facebook. What were these suits about?
KANG: Just recently, a federal court threw out the lawsuits brought by the Federal Trade Commission and by more than 40 states and jurisdictions, and those lawsuits were seeking to break up Facebook. They were competition lawsuits. The feeling - I mean, there are, Terry, very few things right now in Washington that unite Democrats and Republicans more than the idea that Facebook is too big and too powerful.
So these lawsuits were attempting to address the size and the dominance of Facebook. The Federal Trade Commission does have the ability to refile; the judge in this case said, come back to us and essentially do a better job of writing your lawsuit. But it was a big step back for any sort of regulatory pressure on the company. Right after the announcement, the company's stock soared so much that the company was valued at over $1 trillion.
GROSS: You've been reporting on Facebook for years, so this book is kind of like the culmination of your Facebook reporting. Facebook was not always happy with your reports. What did you hear from Facebook when you reported something that made the leadership unhappy and that they wanted to criticize?
FRENKEL: Facebook is very controlling of its message, and they're always concerned about what journalists uncover that is not sanctioned by the company. And, of course, as journalists, that's what we're most interested in. We're most interested in hearing the unfiltered ideas and the raw discussions, what's happening behind the scenes - not the polished sort of formal thought that they present to the public, but how they got there. And that's what we want to do with this book: show people how Facebook got here. How did we arrive at our present moment? We went through a very thorough fact-checking process with Facebook for this book. It took several months. We went through every scene. We went through really every detail and gave them a chance to respond and correct anything that they might find inaccurate, because we want this to be a very thorough understanding of the company and the decisions made by its top leadership.
GROSS: What difficulty did you have getting people in Facebook to talk with you? What were they risking? Had they signed nondisclosure agreements about what happens inside Facebook?
KANG: Of the more than 400 people we interviewed for this book, many are still at Facebook. Many are former employees, and many did sign NDAs. So they spoke to us at risk. We are grateful that they spoke to us. I think that speaks to the fact that they wanted their story to be told as they understood it from inside. It was not easy. This is a project that took over two years as a book, and our reporting has extended even further back than that. But we just dug and dug and dug because we knew that there was more than just the sort of curated and scripted talking points that the company espouses. We wanted to take people behind the scenes. And the people who did speak to us put their trust in us. And we are so grateful.
FRENKEL: I would add that there's sometimes a notion that the people we spoke to were somehow disillusioned, disenfranchised or were coming to us as reporters because they were mad at Facebook. And that's not what we found. We found the vast majority of people still work there. And they actually love the company. And they care about the company, and they want it to do better. Their motivation for speaking to us was often wanting things to come to light publicly so that the company could change.
GROSS: Two people from Facebook who did not give you interviews were Mark Zuckerberg and Sheryl Sandberg.
FRENKEL: Yes. At the start of this book, we asked Mark Zuckerberg and Sheryl Sandberg to sit with us for an interview, and they declined. We repeated our request multiple times, and they continued to decline.
GROSS: Well, I thank you so much for your reporting and for joining us today. Sheera Frenkel, Cecilia Kang, thank you.
KANG: Thank you, Terry.
FRENKEL: Thank you so much for having us.
GROSS: Sheera Frenkel and Cecilia Kang are reporters for The New York Times. Their new book is called "An Ugly Truth: Inside Facebook's Battle For Domination." Facebook contacted us with this statement in response to the book. Quote, "every day we make difficult decisions on where to draw the line between free expression and harmful speech, on privacy, security and other issues. And we have expert leaders who engage outside stakeholders as we craft our policies. But we should not be making these decisions on our own and have for years advocated for updated regulations where democratic governments set industry standards to which we can all adhere," unquote.
Tomorrow on FRESH AIR, how long can an athlete perform at a high level while being an active alcoholic? Our guest will be big-league pitcher CC Sabathia. They'll talk about drinking heavily through 15 seasons, including his most dominating years on the mound, and about getting sober. He's written a new memoir called "Till The End." I hope you'll join us.
(SOUNDBITE OF MUSIC)
GROSS: FRESH AIR's executive producer is Danny Miller. Our technical director and engineer is Audrey Bentham. Our interviews and reviews are produced and edited by Amy Salit, Phyllis Myers, Sam Briger, Lauren Krenzel, Heidi Saman, Therese Madden, Ann Marie Baldonado, Thea Chaloner, Seth Kelley and Kayla Lattimore. Our associate producer of digital media is Molly Seavy-Nesper. Roberta Shorrock directs the show. I'm Terry Gross. Transcript provided by NPR, Copyright NPR.