Facebook, in the name of free speech, has long allowed politicians to lie on its platforms.
But after the January 6 attack on the US Capitol, the company appeared to issue somewhat of a mea culpa. Perhaps, as my colleague Pema Levy wrote at the time, Meta executives were finally willing to recognize that “the balance between a newsworthy politician and a dangerous one had tilted too far.” Thus began a series of policy changes aimed at reducing disinformation and polarization.
But the results of the 2022 midterm elections, when election-denying candidates generally got crushed, prompted a change in attitude. Meta, according to new reporting from the Wall Street Journal, apparently perceived those losses at the ballot box as permission to loosen up. Today, the newspaper explained, the company quietly decided to once again allow political ads falsely claiming that the 2020 election was stolen, on free-speech grounds. The updated policy applies only to past elections; advertisers are still prohibited from questioning ongoing or future ones. But it’s a strange caveat considering a rematch between Donald Trump and Joe Biden, one that will surely resurrect the greatest hits of 2020 fiction, is all but certain.
“Meta’s lax policy on political ads—a policy which has sadly been in effect for many months—allows for weaponization and heightened disinformation on Meta’s products,” Nora Benavidez, senior counsel at the media watchdog organization Free Press, told Mother Jones. “Now is an urgent moment for Meta and other platforms to do more, not less, for better and safer user experiences, namely by investing in greater trust and safety and content moderation staff to robustly enforce lies on their platforms.”
The decision comes as tech shifts away from political content in the aftermath of January 6. At the time, Meta faced intense criticism that it had not done enough to stop the flow of disinformation leading up to the violence at the Capitol. Thousands of pages of internal documents, provided by a whistleblower on the company’s civic misinformation team and reviewed by the Securities and Exchange Commission, confirmed that executives repeatedly declined to adopt recommendations aimed at reducing political polarization. In June, YouTube made a similar decision to lift its ban on election lies.
Now, as the country careens into 2024, tech’s biggest platforms appear to have backed away from the fight against political disinformation. It’s hard not to see such surrender as an invitation to advertise lies without consequence.
“In an environment where the online world impacts real people’s attitudes and voting preferences, lies contained in political ads pose a unique and dangerous threat to democracies,” Benavidez said.