Facebook's Oversight Board says the company, led by CEO Mark Zuckerberg, must take responsibility for its decisions. / AFP via Getty Images

Facebook has almost 2 billion daily users, annual revenue that rivals some countries' gross domestic product, and even its own version of a Supreme Court: the Oversight Board, which the company created to review its toughest decisions on what people can post on its platforms.

This week, the board faced its biggest test to date when it ruled on whether Facebook should let former President Donald Trump back on its social network.

The board upheld the company's decision to remove Trump after the Jan. 6 insurrection at the U.S. Capitol — finding he had broken Facebook's rules about praising violence — but it criticized the indefinite suspension and kicked the case back to the company either to ban Trump permanently or set a time frame for when he can return.

Former Danish Prime Minister Helle Thorning-Schmidt, a board co-chair, even called the company "a bit lazy" for failing to set a specific penalty in the first place.

Facebook said it's now considering the ruling and will determine a "clear and proportionate" action.

The board's response in this case may have been more than Facebook was counting on when it set up the advisory body. But the decision — and the public response to it this week — reveals just how big a challenge Facebook's scale and power present to anyone who wants to hold the company to account.

"They can't invent penalties as they go along"

In many respects, the decision the board handed down is more about Facebook than it is about Trump.

The board zeroed in on something critics have said for a long time: The way Facebook enforces its rules can seem arbitrary. It's often unclear what rules are being applied and why.

When it came to Trump, the board said that an indefinite suspension appeared nowhere in its rule book and violates principles of freedom of expression.

"What we are telling Facebook is that they can't invent penalties as they go along. They have to stick to their own rules," Thorning-Schmidt said in an interview with Axios.

She said that kind of arbitrary decision, made on the fly, has helped fuel claims that Facebook is biased.

"We will only get rid of this talk that Facebook is leaning towards certain political opinions when we get to a stage when all decisions on Facebook and Instagram are taken with transparency and clarity and where all users are judged by the same standard," she said.

Casting doubt on Facebook's "newsworthiness" policy

The board also pushed Facebook to be more transparent about how it treats political leaders and other high-profile accounts in a set of broader recommendations.

The board said the company should generally apply its rules equally, no matter whether the user is the president or an average citizen.

But it acknowledged that people with big audiences, such as politicians or celebrities, can cause outsize harm — and said Facebook should act more quickly when those users break the rules.

That's different from how Facebook and Twitter currently treat politicians and other public figures. Both companies have carve-outs from their rules in matters of public interest, and Facebook's CEO has said the company should err on the side of allowing more political speech. In practice, that meant Trump appeared able to get away with posting things that might have gotten the average Facebook user banned.

The board said Facebook should do a better job explaining its "newsworthiness" policy and how it applies to "influential accounts." Under that policy, Facebook doesn't take down posts that break its rules if the company thinks they are "newsworthy and in the public interest." (Facebook said it never applied this policy to any of Trump's posts.)

The board said the opaqueness of the newsworthiness policy makes it seem like Facebook "may be unduly influenced by political or commercial considerations" — in other words, that it's dodging criticism from Republicans or looking out for the bottom line.

"The board's job is to make sure that Facebook is doing its job"

The board's criticism didn't stop at Facebook's imposing what it called a "vague, standardless penalty." It slammed the company for trying to outsource its final verdict on Trump.

"Facebook has a responsibility to its users and to its community and to the broader public to make its own decisions," Jamal Greene, another board co-chair and constitutional law professor at Columbia, said Thursday during an Aspen Institute event.

"The board's job is to make sure that Facebook is doing its job," he said.

Tensions between the board's view of the scope of its role and Facebook's were also evident in the board's revelation that the company wouldn't answer seven of the 46 questions it asked about the Trump case.

The questions Facebook refused to answer included how its own design and algorithms might have amplified the reach of Trump's posts and contributed to the Capitol assault.

"The ones that the company refused to answer to are precisely related to what happened before Jan. 6," Julie Owono, an oversight board member and executive director of the digital rights group Internet Sans Frontières, said at the Aspen Institute event.

"Our decision says that you cannot make such an important decision, such a serious decision for freedom of expression, freedom of speech, without the adequate context."

"They're acting like they're bigger than government"

Critics have seized on these shortcomings — such as the board's inability to force Facebook to answer questions it doesn't want to, and its lack of any legal or enforcement authority — to make the case that the board is little more than a fig leaf for Facebook's lack of accountability.

For many people across the political spectrum, the decision this week confirmed whatever opinions they already held.

Lawmakers seized on the opportunity. House Minority Leader Kevin McCarthy, R-Calif., promised to "rein in big tech power over our speech" if Republicans regain control of the chamber.

Sen. Elizabeth Warren, D-Mass., said she was glad Trump would not return to Facebook but renewed her call to break up Silicon Valley giants. "I don't think that Facebook ought to have this kind of power," she told Cheddar News. "We need to break up these giant tech companies, and Facebook is one of them. They are crushing competition and in cases like Facebook, they're acting like they're bigger than government."

Rashad Robinson, president of the civil rights group Color Of Change, told NPR the board is a "distraction" from what needs to be done to force change at Facebook: congressional regulation of tech giants and their powerful leaders, such as Zuckerberg.

"The question will be, will our elected officials step up and stop allowing this unaccountable single billionaire person to have this type of outsized power in our democracy and our economy and our media?" he said.

But as unhappy as critics are with executives such as Zuckerberg and Twitter's Jack Dorsey making hard calls about online speech, there is resistance to the idea the government should get involved.

Oversight Board co-chair Thorning-Schmidt said she was concerned about autocratic governments stifling free expression online.

"This [Oversight Board] might not be the perfect solution, but it is much better than Facebook doing it themselves or a government taking these decisions," she told Axios. "It might not be a perfect setup, but I challenge anyone to come up with a setup that is better."

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.