Facebook Let Trump Lie America Into an Insurrection. Will It Stop Other Leaders From Doing Worse?

The company’s advisers weigh rolling back political figures’ freedom to sway debate with disinformation and hate.



In the fall of 2019, Facebook made a fateful decision, announcing that posts and ads from politicians would henceforth be considered “newsworthy” and allowed to remain on the platform even if independent fact-checkers debunked their claims. Put simply, politicians were now allowed to lie on Facebook.

The presumption behind this policy was that people should have access to what their leaders are saying, even when it is untrue, hateful, or even a little dangerous, in order to be able to assess them. The policy also provided an immediate benefit to the company: it would not have to censor or sanction Donald Trump, the sitting president of the United States and one of its most popular users. But on January 6, 2021, when rioters egged on by Trump stormed the US Capitol, Facebook decided that the balance between newsworthiness and danger had tilted too far.

In the coming weeks, Facebook’s new Oversight Board will decide whether to restore Trump to the platform. But Facebook has also asked the board to advise it on a possibly thornier issue: What should it do about other political leaders? Should their posting privileges continue even if they lie, stoke violence, or cause other harms? By tasking the board with this question, Facebook has set up Trump as a test case whose outcome will establish a precedent the company could apply across the globe.

“As the Facebook Oversight Board moves forward with looking at not only this case but other cases in the future, they do need to take into consideration the dangerousness of the speech,” says Adonne Washington, a legal fellow at the Lawyers’ Committee for Civil Rights Under Law, who argues that Trump’s status as a political leader should be part of the equation that weighs against restoring his access—and against keeping the permissive policy in place.

Washington helped craft a public comment submitted to the board by the Lawyers’ Committee urging it to permanently ban Trump from both Facebook and Instagram, which the company also owns. Their argument was, in essence, the opposite of Facebook’s original logic: rather than give deference to politicians, Facebook should hold them to a higher standard because they have more ability to cause harm. “The traditional free expression principles enshrined in civil and human rights laws are meant to empower individuals,” the submission states. “Free speech is vital for regular individuals to speak truth to power, give voice to the marginalized, spread new ideas, and pursue justice. The purpose of free expression is to protect the people against the state, not to protect the state against accountability.”

Instead, the newsworthiness exception has allowed Trump and other political figures to design ads and write posts pushing misinformation and hateful rhetoric that, if shared by an ordinary user, would be taken down. In that way, Facebook has allowed some of the most powerful people on the platform, including the most powerful person in the world, to post the worst content.

Janai Nelson, associate director-counsel at the NAACP Legal Defense and Educational Fund, agrees that prominent speakers should be subject to greater, not lesser, scrutiny. “The calculus is not only the content, but the potential reach and the degree of authority the speaker appears to have,” she says. “That is going to determine how this misinformation or disinformation is spread.”

It’s hard to quantify the power that any one speaker may have, or the harm that could result from them abusing the platform. Take the surge in anti-Asian hate crimes. Trump didn’t invent violence toward Asians and Asian Americans, but he did inflame anti-Asian sentiment—including on social media. When the coronavirus pandemic began, Trump started calling it the “Chinese virus” and other similarly offensive terms. Throughout 2020, Trump used the phrase and its variant “China virus” at least 50 times on Facebook in posts that collectively garnered around 11.7 million interactions, according to a Media Matters report. As Trump continued to use these phrases, violence against Asian Americans grew. A year after Trump began using this phrase—which was parroted by right-wing media—a gunman killed eight people, including six Asian women, in and around Atlanta. 

It’s impossible to quantify the direct impact of Trump’s social media posts on this violence, but logic suggests it was significant, especially because those posts were amplified, as Trump’s were, both by Facebook’s own systems and by a right-wing media ecosystem on the platform. To grapple with this, the Oversight Board would need to consider the role leaders like Trump play in shaping the conversation among their supporters and allied media figures, as well as Facebook’s role in amplifying certain messages and sealing users in ideological bubbles that screen out information that might counter misinformation or hate speech.

“The President of the United States, whether speaking as a candidate or from his official office, has power that is just beyond measure,” says Nelson. “And it’s being amplified on this platform [Facebook] in ways that are absolutely unprecedented.”

Trump spent the time between the November election and the January 6 Capitol insurrection spreading his big lie, continually posting false claims on Facebook and Twitter that the election was riddled with voter fraud and that victory had been stolen from him. Both platforms kept the content up while affixing fact-checking labels to many of the posts. But despite the labels, Trump and his allies built a powerful false narrative that a majority of Republicans still believe today, even after it culminated in the deadly attack on the US Capitol. Trump had the power to incite a violent insurrection. Surely leaders in other countries have the power to unleash even more destabilizing carnage.

“I’ve never really liked the idea that Facebook has promoted, or that Mark Zuckerberg promoted, that somehow a politician’s speech gets more protection than anybody else’s,” says David Kaye, a professor at the University of California-Irvine School of Law and a former United Nations special rapporteur on free speech. “That’s not how the First Amendment works. It’s also not how the human rights law around freedom of expression works.” Under human rights law, he explains, citizens’ political speech should be especially protected because of the value of expressing dissent—if anything, it should be more protected than a political leader’s speech. “It doesn’t really matter that Trump was a politician, it matters that Trump had the ability to incite the kind of violence that we saw on January 6.”

But the possibility of a Facebook that more regularly bans or suspends political leaders gives many pause. “People who are committed to free speech have different views on this question,” says Jameel Jaffer, executive director of the Knight First Amendment Institute at Columbia University. The problem, he explains, is two competing principles, both of which matter to free speech advocates. One is that no one power—in this case Facebook—should have total control over what is said and discussed. The other is for a public discourse that is constructive and respectful, rather than filled with conspiracies and hate speech. With Facebook, the principles are in conflict. “You gotta figure out how to reconcile these two things that seem irreconcilable,” he says.

When it comes to Trump, Jaffer would say Facebook has gotten the balance right. “I think [the platforms] should be especially hesitant to intervene, because the public needs access to political leaders’ speech in order to understand government policy, to understand the motivations of that particular political leader, to evaluate whether that political leader is doing his or her job,” he says. “That said, I think there are limits, and my view is that on or around January 6, President Trump crossed an important line… He was using his social media accounts to encourage imminent violence. At that point, I think the platforms really had no choice.”

If the Oversight Board restores Trump’s posting privileges, it will suggest that even dangerous political leaders have a place on Facebook. If the board also recommends against suspending other political leaders, it will empower demagogues more dangerous than Trump to stoke violence through the platform. That is an awesome power for the newly formed board to wield.

But unsaid in Trump’s case is Facebook’s own active role in designing a platform that amplifies the reach of dangerous content. In putting these questions to the Oversight Board, which has no power over algorithmic changes or amplification policies, Facebook has reduced the issue to a red light or a green light. There are other, more nuanced steps the company could take, including changes to its core algorithms, that could limit the reach and damage of dangerous or false speech, both from political leaders and more generally.

If the board advises against suspending other political leaders and some then use the platform to push violence, Jaffer has a prediction. “Facebook will then point to the board and say, ‘We took this question very seriously. We asked these free speech experts, and they told us to err on the side of keeping political leaders’ speech up,’” says Jaffer, sketching an easy-to-imagine scenario. “In fact, the decisions that actually matter much more are the design decisions that Facebook has now kind of swept under the rug.”
