Facebook is defending itself from accusations from the White House and other critics that it's not doing enough to curb health misinformation. / AFP via Getty Images

A news story suggesting the COVID-19 vaccine may have been involved in a doctor's death was the most viewed link on Facebook in the U.S. in the first three months of the year.

But Facebook held back from publishing a report with that information, the company acknowledged on Saturday.

The social media giant prepared the report about the most widely viewed posts on its platform from January through March of 2021, but decided not to publish it "because there were key fixes to the system we wanted to make," spokesperson Andy Stone tweeted on Saturday.

The New York Times first reported the existence of the shelved report on Friday, two days after Facebook published a similar report about top posts from the second quarter. Facebook executives debated publishing the earlier report but decided to withhold it over concerns it would make the company look bad, the Times reported.

Facebook has come under pressure from the Biden administration and other critics who argue it hasn't done enough to curb the spread of misinformation about the pandemic and vaccines.

"We're guilty of cleaning up our house a bit before we invited company. We've been criticized for that; and again, that's not unfair," Stone wrote on Saturday. He said the company had decided to release the previously unpublished first-quarter report because of the interest it had sparked.

But Stone also emphasized that the article raising questions about a possible connection between the vaccine and the doctor's death illustrated "just how difficult it is to define misinformation."

While Facebook bars posts that contain false information about COVID and vaccines or that discourage people from getting vaccinated, it takes the position that it's more effective to let people discuss potential health risks and questions than to ban such content.

The article, written by the South Florida Sun Sentinel and republished by the Chicago Tribune, was headlined "A 'healthy' doctor died two weeks after getting a COVID-19 vaccine; CDC is investigating why." The article was factual. When it was originally published in January, it noted that no link had been found between the shot and the Miami doctor's death. (The page now carries an update from April saying the medical examiner said there wasn't enough evidence to conclude whether the vaccine played a role in the doctor's death.)

Many news outlets covered the story, but the Tribune link gained the most traction on Facebook: it was viewed by nearly 54 million U.S. users between January and March, according to the company's report.

Experts who study online platforms say these kinds of stories present challenges for social media companies, because while they do not break the platforms' rules against posting false information about COVID and vaccines, they are often used by anti-vaccination advocates to advance misleading narratives and fuel doubt in vaccines.

The Tribune link was shared on the social network by several accounts that regularly raise doubts about vaccination, according to CrowdTangle, a research tool owned by Facebook.

In March, NPR found that on almost half of all the days so far in 2021, a story about someone dying after receiving a vaccine shot was among the most popular vaccine-related articles on social media, according to data from the media intelligence company NewsWhip. The Tribune link about the Florida doctor topped that list.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.