A large video monitor on the campus of Meta, Facebook's parent company, in Menlo Park, Calif. in February. New research about Facebook shows its impact on political polarization. / Getty Images

Is Facebook exacerbating America's political divide? Are viral posts, algorithmically ranked feeds, and partisan echo chambers driving us apart? Do conservatives and liberals exist in ideological bubbles online?

New research published Thursday attempts to shed light on these questions. Four peer-reviewed studies, appearing in the journals Science and Nature, are the first results of a long-awaited, repeatedly delayed collaboration between Facebook and Instagram parent Meta and 17 outside researchers.

They investigated social media's role in the 2020 election by examining Facebook and Instagram before, during, and after Election Day. While the researchers were able to tap large swaths of Facebook's tightly held user data, they had little direct insight into the inner workings of its algorithms.

The design of the social media giant's algorithms — a complex set of systems that determine whether you're shown your friend's vacation snapshots or a reshared political meme — has come under increasing scrutiny in recent years, driven by the popular belief that those systems send users down rabbit holes and enable radicalization. Those fears crystallized in the aftermath of the 2020 election, when "Stop the Steal" groups on Facebook helped facilitate the Jan. 6 Capitol insurrection.

The studies published on Thursday offer no easy answers to the debate that has raged since the 2016 election over the relationship between social media platforms and political polarization.

Still, the research sheds light on how Facebook's algorithm works. The studies found that liberals and conservatives live in their own political news bubbles on Facebook more than they do elsewhere online. They also show that changing the platform's algorithm substantially changes what people see and how they behave on the site — even though those changes did not affect users' beliefs during the three-month period researchers studied.

"The insights from these papers provide critical insight into the black box of algorithms, giving us new information about what sort of content is prioritized and what happens if it is altered," said Talia Stroud of the University of Texas at Austin, who is co-leading the research project.

One Facebook, two political worlds

In one study looking at 208 million adults on Facebook in the U.S., researchers tried to answer a long-standing question: Do liberals and conservatives consume different political news?

The answer seems to be yes. After analyzing popular political news links posted on the platform between September 2020 and February 2021, the researchers found that there's not much overlap between political news consumption within the two camps. Segregation also increases as a news link moves from being selected by the algorithm, to being seen by a user, to being interacted with.

That ideological gap was larger than what other research has shown for overall news consumption online and in traditional media.

"This borders on an indictment of Facebook's algorithm," said Laura Edelson, a computer scientist and postdoctoral researcher at NYU. She was not involved with the project but has done similar research and reviewed the studies' findings. (In 2021, Edelson and her team were blocked from accessing Facebook after a clash over the data they were collecting.)

The bubbles are sometimes punctured. The researchers measured segregation levels by day and found they dropped dramatically on Oct. 2, 2020, when the White House announced President Donald Trump had been diagnosed with COVID-19.

"It's not a grand scientific observation," said David Lazar of Northeastern University, a co-author of the study. "We'd need more data to see and we'd need more crises."

The gap goes beyond the difference in what posts people see. Conservatives engaged more with political news, meaning they clicked, liked, commented on, and re-shared the political news they saw more often than liberals did. The bubbles were asymmetric: there were more political news links seen exclusively by conservatives than by liberals. Political news links posted by pages and in groups — not by friends — had even higher levels of audience segregation.

Conservatives are also the main consumers of websites that Facebook flagged as untrustworthy and links that third-party fact checkers flagged as inaccurate. That said, both amount to a very small fraction of overall political news, which itself makes up just 3% of what people share on Facebook. (Facebook began showing users less news in 2018 and less political content in early 2021.)

What happens when you tweak the news feed

Another study also examined ideological separation, using an internal Facebook measure to classify the ideological leanings of all content sources seen by active users in the U.S. The researchers found that, on average, about half the posts users see come from like-minded sources. One in five users experiences an echo chamber on the platform, where at least three-quarters of the posts they see come from ideologically aligned sources.

After establishing that baseline, the researchers ran an experiment, recruiting roughly 23,000 users who agreed to take part. About 30% of those users were shown less content from like-minded sources; researchers then checked whether that reduction changed their political attitudes.

That was not the case. Users did, however, see more content from sources with different political leanings, as well as fewer posts from sources that repeatedly post misinformation.

Two other experiments published on Thursday also tested changes to the algorithm that critics of Meta and policymakers have proposed.

In one experiment, the researchers replaced Facebook's algorithmic feed with one showing posts in reverse chronological order, without any algorithmic ranking. In the other, they reduced the number of reshared posts (the kind of content that goes viral).

All of the changes to the algorithms had significant impacts on what users saw in their Facebook feeds. For example, compared with a chronological feed, the algorithmically driven feed served less political content, less moderate content, more politically aligned sources, and less content from sources Facebook deemed untrustworthy, the study found.

Edelson, the NYU researcher not involved in the project, noted that the comparison sheds light on how Facebook's ranking algorithm works — something that has been hard for outsiders to do, given how closely the company holds its data.

"This is interesting, strong evidence that when it comes to politics, the algorithm is biased towards the extremes," Edelson said. "This is genuinely new."

Moving users to a chronological feed also affected how they used the platform: they posted less about politics, liked political content less, and were less likely to share that they voted or mention politicians and candidates for office. Getting rid of the algorithmically driven feed also curtailed the amount of time people spent on the platform, sending them to Instagram.

"When I read these papers, I see some really promising lines of study. I see things that we could do to build on this work to move toward a safer algorithm, to move toward interventions where we find ways to show people less harmful content," Edelson said.

Changing Facebook's algorithm to reduce engagement would have significant business implications. The systems serve up content they predict will keep users clicking, liking, commenting, and sharing — creating an audience for the advertising that generates nearly all of Meta's $116.6 billion in annual revenue.

Big questions remain unanswered

However, none of the three experiments showed an impact on users' political attitudes over the three months the study ran. That suggests addressing political polarization is not as simple as tweaking a social media algorithm.

Meta described the findings as validating its position that its platforms are not to blame for rising rates of polarization and partisanship.

Nick Clegg, Meta's President of Global Affairs in 2022. In a blog post, Clegg says the studies' findings show there's little evidence that social media "has any meaningful impact on key political attitudes, beliefs or behaviors." / AFP via Getty Images

"These studies add to a growing body of research showing there is little evidence that social media causes harmful 'affective' polarization, or has any meaningful impact on key political attitudes, beliefs or behaviors," Nick Clegg, Meta's president of global affairs, wrote in a blog post shared with NPR ahead of the release of the studies.

That language was updated in Meta's published post to say: "The experimental studies add to a growing body of research showing there is little evidence that key features of Meta's platforms alone cause harmful 'affective' polarization, or have meaningful effects on key political attitudes, beliefs or behaviors."

But the outside academics who conducted the studies and other researchers who reviewed the findings cautioned against drawing broad conclusions from these studies about social media's role in polarization.

"This finding can not tell us what the world would have been like if we hadn't had social media around for the last 10 to 15 years or 15 or 20 years, however long it's been at this point," said Joshua Tucker of New York University, who co-led the research project with Stroud..

The study's three-month window, set ahead of a highly contentious national election, may have been too short to show an impact on beliefs, he added.

The research published on Thursday is the first batch of more than a dozen studies the project took on; further papers are in the works about the impact of political advertising, the spread of disinformation, and other topics.

Ultimately, these studies raise more questions than they answer, said Chris Bail, director of Duke University's Polarization Lab, who was not involved in the research but reviewed the findings.

"We need many, many more studies before we can come up with these types of sweeping statements about Facebook's impact on democracy, polarization, the spread of misinformation, and all of the other very important topics that these studies are beginning to shed light on," he said.

"We all want this to be a referendum on, is Facebook good or bad," he said. "But it's not."

Copyright 2023 NPR. To see more, visit https://www.npr.org.