Report: Facebook Still Struggles to Contain Misinformation

Despite the company’s public commitments, pages pushing post-election violence have grown.

In 2016, Facebook was caught flat-footed. Throughout that year’s election, Americans using the platform confronted a range of misinformation as Russian state-sponsored trolls, Eastern European teenagers, and others manipulated the company’s weak moderation practices. Since then, the company has claimed that it takes disinformation seriously. But a new report from Avaaz, a global nonprofit activist group, suggests that Facebook still has a lot of work to do to meaningfully tamp down misinformation. 

While Facebook appears to have made significant strides in stopping foreign governments from using its platform to interfere in U.S. elections, Avaaz’s analysis suggests the company is having a hard time replicating that success when it comes to disinformation originating inside the United States. 

Avaaz found that Facebook could have stopped an estimated 10 billion views on top-performing pages that repeatedly shared misinformation in the eight months leading up to the 2020 elections. The lack of moderation enforcement on such pages—which Avaaz described as having shared “false or misleading information with the potential to cause public harm” by undermining democracy, public health, or encouraging discrimination—allowed them to flourish. The report finds the pages nearly tripled their monthly interactions from 97 million in October 2019 to 277.9 million a year later. 

Pages backing the QAnon and Stop The Steal movements—two groups with violent histories that Facebook has repeatedly committed to excise from its platform—specifically benefited from this growth. Avaaz found 267 groups with a combined following of 32 million that spread content championing violence throughout the 2020 election. Sixty-nine percent of these groups had Boogaloo, QAnon, or militia-themed names and shared related content. Facebook has also promised to crack down on both militias and Boogaloos, an extremist movement pushing for a second civil war.

Facebook spokesperson Andy Stone disputed Avaaz’s findings, claiming that the “report distorts the serious work we’ve been doing to fight violent extremism and misinformation on our platform” and accusing the researchers of using “a flawed methodology to make people think that just because a Page shares a piece of fact-checked content, all the content on that Page is problematic.”

At the time Avaaz completed its report, the organization found that 118 of the 267 groups were still active and continued promoting violence to a combined following of nearly 27 million. Following Avaaz’s work, Facebook reviewed these groups and pages and removed 18 more, claiming it had already removed four of the groups prior to Avaaz’s report.

The report also documents how misinformation about the 2020 election circulated on Facebook long before voting closed, before results were known, and before conspiracy theories about a stolen election took off. By October 7, Avaaz had documented 50 posts making misleading election fraud claims in key swing states. The posts had 1.5 million interactions and reached an estimated 24 million viewers. 

While Facebook might label misinformation coming from an account that first generated it, the company’s AI-powered tools frequently fail to detect accounts copying the misinformation. Ahead of the election, Avaaz found that copycat accounts avoided fact-checking labels applied to the original posts and continued to spread debunked information. Such posts reached an estimated 124 million people.

These problems persisted after the November elections. In advance of Georgia’s runoff Senate elections on January 5, Avaaz found that Facebook failed to label spreaders of misinformation, some of whom garnered more interactions than traditional news sites covering the race. “The top 20 Facebook and Instagram accounts we found spreading false claims that were aimed at swaying voters in Georgia accounted for more interactions than mainstream media outlets,” the report states.

Avaaz noted that Facebook’s failings on election misinformation go beyond AI detection problems. The company’s own policies also allowed misinformation to spread. In Georgia, for example, the group noted that political ads on the platform spread information that had been debunked by independent fact-checkers. But because Facebook policy expressly allows politicians to lie in political ads, the candidates were allowed to spread this misinformation to millions of voters.

To counter viral misinformation, Avaaz has several recommendations for Facebook. Among them, the group urges the company to stem the influence of accounts spreading disinformation well before an election and not wait to downrank them until just weeks before, as it did in 2020. The report also suggests Facebook take more steps to proactively show corrections to users exposed to misinformation and improve its AI detection systems, which failed to keep a significant amount of false content off the platform in 2020.

Avaaz also has ideas for how Washington should begin to regulate social media platforms. The report’s first recommendation on that front—and the one that would pack the most punch at Facebook—is to create a legal framework that would force the company to alter the algorithms that push disinformation and hateful, toxic content to the top of users’ news feeds. Critics have long pointed out that Facebook’s algorithm prefers sensational content that drives engagement, even if it is otherwise harmful—and that Facebook has resisted changing that because it would directly impact the company’s growth and revenue. 

In other recommendations for the federal government, Avaaz suggests mandating public reports from social media companies that would disclose details about their algorithms and their struggles against misinformation. The group also suggests that Section 230 of the Communications Act, which provides platforms with liability protection for user-created content they help spread, somehow be rewritten to allow regulation of misinformation. Another envisioned reform, which Avaaz says would cut belief in false information by half, would require platforms to create retroactive correction tools that would alert users they had been exposed to misinformation, even if it hadn’t yet been determined to be false at the time.

On Thursday, Facebook CEO Mark Zuckerberg is set to testify alongside other tech CEOs at a congressional hearing on social media and disinformation. “The most worrying finding in our analysis is that Facebook had the tools and capacity to better protect voters from being targets of this content, but the platform only used them at the very last moment, after significant harm was done,” said Fadi Quran, a campaign director at Avaaz, in a statement. “How many more times will lawmakers have to hear [Zuckerberg] say ‘I’m sorry’ before they decide to protect Americans from the harms caused by his company?”

This post has been updated with comment from Facebook.


