On Thursday, the Facebook Oversight Board found that the social network's "cross-check" program for high-profile users lacks transparency. / AP

Updated October 21, 2021 at 11:34 AM ET

Facebook's Oversight Board said in a report on Thursday that the social network "has not been fully forthcoming" about how it lets millions of prominent users escape the content moderation rules it applies to everyone else, a practice known inside the company as "cross-check."

"The fact that Facebook provided such an ambiguous, undetailed response to a call for greater transparency is not acceptable," the board wrote in its report. "Facebook's answer provides no meaningful transparency on the criteria for accounts or pages being selected for inclusion in cross-check."

Facebook's VIP list of celebrities, politicians and others was highlighted by The Wall Street Journal, which reported that the social network initiated the program as a "quality-control measure" for actions taken against high-profile accounts. In practice, the paper found, some users are "whitelisted," or practically immune from enforcement actions.

"The amount of information that is publicly available about cross-check is too limited," said Thomas Hughes, director of the Oversight Board, in an interview with NPR. "Users need to know what is being done and when and why. If it's all being done in an opaque, unseen manner, it fuels the belief that something untoward is happening."

The Oversight Board, which Facebook created and funded through an independent trust, includes a group of experts from around the world. It issues rulings that are binding on the company and makes policy suggestions that are not.

In its report, the board says Facebook should publicly explain its rationale for including accounts in its cross-check program. The board, at the request of Facebook, will review cross-check and issue guidance on how the system should change.

Facebook admitted, according to the board's report, that it should not have said that cross-check applied only to "a small number of decisions."

"Facebook noted that for teams operating at the scale of millions of content decisions a day, the numbers involved with cross-check seem relatively small, but recognized its phrasing could come across as misleading," the report said.

Hughes told NPR that describing the program as "small" "was not appropriate," noting that it reportedly covers nearly 6 million users. "That is not small," he said.

When the board reviewed Facebook's decision to ban former President Donald Trump, the company was not upfront about the program, referencing it only when asked what kind of content rules applied to Trump's account.

"The Board notes that, in the Trump decision, Facebook refused to answer one of the Board's questions about whether the company had been contacted by political officeholders or their staff about the suspension of Mr. Trump's accounts," according to the report on Thursday.

The board also noted its precarious role, in which it can fully evaluate Facebook's actions only if the company cooperates.

"The credibility of the Oversight Board, our working relationship with Facebook, and our ability to render sound judgments on cases all depend on being able to trust that information provided to us by Facebook is accurate, comprehensive, and paints a full picture of the topic at hand," the report found.

A Facebook spokesperson said in a statement that the company asked the board to review the cross-check program because it strives "to be clearer in our explanations to them going forward."

In its review of the program, the board said it intends to "ensure fairness and objectivity" in how Facebook implements cross-check. The company agreed to provide the board with documentation about how the system works.

"This should give a fuller understanding of the work Facebook has already done on a given topic. We will include analysis on whether Facebook is fulfilling this commitment in our future transparency reporting," the report said.

The board said it plans to issue recommendations to Facebook on how the system can be changed.

Editor's note: Facebook is among NPR's recent financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.