Civil Rights Groups Say If Facebook Won't Act On Election Misinformation, They Will
The civil rights groups behind this summer's Facebook advertiser boycott are joining other critics to pressure the social network to do more to counter hate speech, falsehoods about the election and efforts to delegitimize mail-in voting.
The coalition is launching a weekly livestreamed meeting, starting next week and running through the November presidential election, to discuss "the most urgent issues including voter suppression, election security and misinformation" on Facebook.
Participants include the NAACP, Color of Change and the Anti-Defamation League, which together organized the Stop Hate for Profit campaign. It saw over 1,000 companies pause advertising on Facebook to protest its handling of harmful content.
They are joined by others, including Roger McNamee, an early Facebook investor who has become one of the company's loudest critics, and Yael Eisenstat, who ran Facebook's election integrity operations for political ads in 2018.
"We are deeply concerned that Facebook is now being weaponized, will be weaponized in the coming weeks and possibly even after November 3rd ... to drive anti-democratic dynamics and undermine the results of the vote," Shoshana Zuboff, professor emerita at Harvard Business School and a member of the group, told NPR.
"Facebook is already being used to suppress the Black vote in 2020 and we've seen all sorts of attacks on Black voters across the country," said Rashad Robinson, Color of Change president, in a statement. "We're seeing an unprecedented amount of disinformation and misinformation traveling across the platform."
The group is a project of the U.K.-based The Citizens, an activist organization founded by journalist Carole Cadwalladr, who broke the story of the Cambridge Analytica scandal. She said billing the group the "Real Facebook Oversight Board" is a "deliberate troll" of the actual independent oversight board the company created to review decisions over what content it allows on its platform and make policy recommendations.
After a lengthy delay, Facebook's board said on Thursday that it would launch in "mid to late October." The board will review cases submitted by users challenging content that has been removed, as well as decisions referred to it by Facebook. Its review process can take up to 90 days, meaning it is unlikely the board will weigh in on urgent election-related issues.
Critics say the constraints on the board, the fact that it will not review ads or content in groups, and the small number of cases it expects to review each year mean it will have little impact on issues of misinformation, hate speech and urgent harm, which have become flashpoints for Facebook. The company has often been slow to remove content even when it breaks its rules, which has had real-world consequences.
"We ran a year-long global consultation to set up the Oversight Board as a long-lasting institution that will provide binding, independent oversight over some of our hardest content decisions. The members were selected for their deep experience in a diverse range of issues," Jeffrey Gelman, a Facebook spokesperson, said. "This new effort is mostly longtime critics creating a new channel for existing criticisms. We look forward to seeing the Facebook Oversight Board in action in mid to late October."
John Taylor, a spokesman for the Oversight Board, said: "Many groups have strong opinions on how Facebook should moderate content, and we welcome new efforts to encourage debate. The members of the Oversight Board are focused on building an institution that will make binding decisions on Facebook's most significant content issues."
U.S. intelligence agencies, security researchers and social media companies including Facebook have warned that foreign and domestic actors are seeking to spread misinformation and doubt about the voting process on social media before, during and after Election Day.
Facebook has come under particular pressure to curb malicious influence on its platform since Russian intelligence agents and Kremlin-backed trolls used the platform in attempts to meddle in the 2016 presidential election. In recent weeks the company has removed accounts linked to Russian state actors it said were involved in previous influence operations.
In the run-up to this year's vote, Facebook has tightened its policies about election misinformation, including deleting claims that people will get COVID-19 if they vote and labeling posts attempting to "delegitimize" the election outcome. It has also said it will not accept any new political ads in the week before the election and will reject political ads "that claim victory before the results of the 2020 election have been declared."
This year, some experts say the biggest election disinformation threat comes from sources in the U.S. — including in the White House, given President Trump's repeated false claims that voting by mail is rife with fraud.
Facebook and Twitter have both labeled some of his posts about voting as breaking their rules, but have for the most part allowed his unfounded claims to remain on their platforms, saying it is in the public's interest to see what world leaders have to say.
Editor's note: Facebook is among NPR's financial supporters.
Copyright 2020 NPR. To see more, visit https://www.npr.org.