Facebook has allowed conservative news outlets and personalities to repeatedly spread false information without facing any of the company’s stated penalties, according to leaked materials reviewed by NBC News.
According to internal discussions from the last six months, Facebook relaxed its rules so that conservative pages, including those run by Breitbart, the former Fox News personalities Diamond and Silk, the nonprofit media outlet PragerU and the pundit Charlie Kirk, were not penalized for violations of the company’s misinformation policies.
Facebook’s fact-checking rules dictate that pages can have their reach and advertising limited on the platform if they repeatedly spread information deemed inaccurate by its fact-checking partners. The company operates on a “strike” basis: a page that posts inaccurate information receives a strike as a warning before the platform takes action. Two strikes in 90 days place an account in “repeat offender” status, which can lead to reduced distribution of the account’s content and a temporary block on advertising on the platform.
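The strike mechanics described above can be modeled as a simple rolling-window counter. The following is an illustrative sketch only, not Facebook’s actual implementation; all class and method names are hypothetical:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Per the reported policy, strikes count toward "repeat offender"
# status only if they fall within a 90-day window.
STRIKE_WINDOW = timedelta(days=90)

@dataclass
class Page:
    name: str
    strikes: list = field(default_factory=list)  # timestamps of strikes

    def add_strike(self, when: datetime) -> None:
        """Record a misinformation strike issued at the given time."""
        self.strikes.append(when)

    def active_strikes(self, now: datetime) -> int:
        """Count strikes issued within the last 90 days."""
        return sum(1 for t in self.strikes if now - t <= STRIKE_WINDOW)

    def is_repeat_offender(self, now: datetime) -> bool:
        # Two strikes within 90 days -> "repeat offender": reduced
        # distribution and a temporary advertising block.
        return self.active_strikes(now) >= 2
```

Under this model, deleting a strike from the list (as the leaked materials describe Facebook employees doing) would directly remove a page from repeat-offender status.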
Facebook has a process that allows its employees, or representatives of Facebook’s partners, including news organizations, politicians, influencers and others with a significant presence on the platform, to flag misinformation-related problems. Facebook applies fact-checking labels to posts when third-party fact-checkers determine they contain misinformation. A news organization or politician can appeal the decision to attach a label to one of its posts.
Facebook employees who work with content partners then decide whether an appeal is a high-priority issue or a PR risk, in which case they log it in an internal task management system as a misinformation “escalation.” Marking something as an escalation means that senior leadership is notified so it can review the situation and quickly, often within 24 hours, decide how to proceed.
Facebook receives many queries about misinformation from its partners, but only a small subset is deemed to require input from senior leadership. Since February, more than 30 of these misinformation queries have been tagged as “escalations” in the company’s task management system, which employees use to track and assign work projects.
The list and descriptions of the escalations, leaked to NBC News, showed that Facebook employees on the misinformation escalations team, with direct oversight from company leadership, deleted strikes that had been issued to some conservative partners for posting misinformation over the last six months. The discussions of the reviews showed that Facebook employees worried that complaints about Facebook’s fact-checking could go public and fuel allegations that the social network was biased against conservatives.
The removal of the strikes has deepened concerns among some current and former employees that the company routinely relaxes its rules for conservative pages for fear of accusations of bias.
Two current Facebook employees and two former employees, who spoke anonymously out of fear of professional repercussions, said they believed the company had become hypersensitive to conservative complaints, in some cases making special allowances for conservative pages to avoid negative publicity.
“This supposed goal of this process is to prevent embarrassing false positives against respectable content partners, but the data shows that this is instead being used primarily to shield conservative fake news from the consequences,” said one former employee.
About two-thirds of the “escalations” included in the leaked list relate to misinformation issues linked to conservative pages, including those of Breitbart, Donald Trump Jr., Eric Trump and Gateway Pundit. There was one escalation related to a progressive advocacy group and one each for CNN, CBS, Yahoo and the World Health Organization.
There were also escalations related to left-leaning entities, including one about an ad from Democratic super PAC Priorities USA that the Trump campaign and fact checkers have labeled as misleading. Those matters focused on preventing misleading videos that were already being shared widely on other media platforms from spreading on Facebook and were not linked to complaints or concerns about strikes.
Facebook and other tech companies including Twitter and Google have faced repeated accusations of bias against conservatives in their content moderation decisions, though there is little clear evidence that this bias exists. The issue was reignited this week when Facebook removed a video posted to Trump’s personal Facebook page in which he falsely claimed that children are “almost immune” to COVID-19. The Trump campaign accused Facebook of “flagrant bias.”
Facebook spokesperson Andy Stone did not dispute the authenticity of the leaked materials but said they did not provide the full context of the situation.
In recent years, Facebook has developed a lengthy set of rules that govern how the platform moderates false or misleading information. But how those rules are applied can vary and is up to the discretion of Facebook’s executives.
In late March, a Facebook employee raised concerns on an internal message board about a “false” fact-checking label that had been added to a post by the conservative bloggers Diamond and Silk in which they expressed outrage over the false allegation that Democrats were trying to give members of Congress a $25 million raise as part of a COVID-19 stimulus package.
Diamond and Silk had not yet complained to Facebook about the fact check, but the employee was sounding the alarm because the “partner is extremely sensitive and has not hesitated going public about their concerns around alleged conservative bias on Facebook.”
Since it was the account’s second misinformation strike in 90 days, according to the leaked internal posts, the page was placed into “repeat offender” status.
Diamond and Silk appealed the “false” rating, which had been applied by the third-party fact-checker Lead Stories, on the grounds that they were expressing an opinion, not stating a fact. Lead Stories downgraded the rating to “partly false,” and the page was taken out of “repeat offender” status. Even so, someone at Facebook described as “Policy/Leadership” intervened and instructed the team to remove both strikes from the account, according to the leaked material.
In another case, in late May, a Facebook employee filed a misinformation escalation for PragerU after fact-checking labels were applied to several similar posts suggesting that polar bear populations had not been decimated by climate change and that a photo of a starving animal was a “deliberate lie to advance the climate change agenda.” Climate Feedback, one of Facebook’s independent fact-checking partners, rated the claim false, which placed the PragerU page in “repeat offender” status and meant it could be banned from advertising.
A Facebook employee escalated the issue because of “partner sensitivity” and noted that the repeat offender status was “especially worrisome due to PragerU having 500 active ads on our platform,” according to the discussion in the task management system leaked to NBC News.
After some back-and-forth between employees, the fact-check label was left on the posts, but the strikes that could have jeopardized the advertising campaign were removed from PragerU’s pages.
Stone, the Facebook spokesperson, said that the company defers to third-party fact-checkers on the ratings given to posts, but that the company is responsible for “how we manage our internal systems for repeat offenders.”
“We apply additional system-wide penalties for multiple false ratings, including demonetization and the inability to advertise, unless we determine that one or more of those ratings does not warrant additional consequences,” he said in an emailed statement.
He added that Facebook works with more than 70 fact-checking partners who apply fact-checks to “millions of pieces of content.”
Facebook announced Thursday that it had banned a Republican PAC, the Committee to Defend the President, from advertising on the platform after it repeatedly shared misinformation.
But the ongoing sensitivity to conservative complaints about fact-checking continues to trigger heated debates inside Facebook, according to leaked posts from Facebook’s internal message board and interviews with current and former employees.
“The research has shown no evidence of bias against conservatives on Facebook,” another employee said. “So why are we trying to appease them?”
Those concerns have also spilled out onto the company’s internal message boards.
One employee wrote a post on July 19, first reported by BuzzFeed News on Thursday, summarizing the list of misinformation escalations found in the task management system and arguing that the company was pandering to conservative politicians.
The post, a copy of which NBC News has reviewed, also compared Mark Zuckerberg to President Donald Trump and Russian President Vladimir Putin.
“Just like all the robber barons and slavers and plunderers who came before you, you are spending a fortune you didn’t build. No amount of charity can ever balance out the poverty, war and environmental damage enabled by your support of Donald Trump,” the employee wrote.
The post was removed for violating Facebook’s “respectful communications” policy and the list of escalations, previously accessible to all employees, was made private. The employee who wrote the post was later fired.
“We recognize that transparency and openness are important company values,” wrote a Facebook employee involved in handling misinformation escalations in response to questions about the list of escalations. “Unfortunately, because information from these Tasks were leaked, we’ve made them private for only subscribers and are considering how best to move forward.”