Facebook Is Finally Cracking Down on Fake News

Conservatives are already freaking out.

Facing backlash over claims that it played a significant role in spreading viral fake news before the election, Facebook has released several test features aimed at halting the spread of misinformation in users' News Feeds. The changes, unveiled on Thursday, will first appear for a small portion of English-speaking users before gradually rolling out to a wider population, Facebook said in a corporate blog post.

"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Facebook's News Feed VP Adam Mosseri wrote. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations."

The new features are a departure from Mark Zuckerberg's initial dismissal of the idea that Facebook helped shape the outcome of the presidential election.

The strategy starts by letting users identify and report stories they believe are fake news:

[Image: Facebook]

After a story is flagged, Facebook's partners at four prominent fact-checking organizations—Snopes, PolitiFact, FactCheck.org, and ABC News—will then help determine whether the story in question is in fact fake. If it is, Facebook will attach a "disputed" message to any future posts that include the story's link:

[Image: Facebook]

Facebook will also attempt to block users who masquerade as authentic news outlets. In the weeks since the presidential election, several fake news writers have admitted to exploiting anti-Hillary Clinton fervor and people's distrust of the media, saying the gig was simply too lucrative to quit.

Shortly after Facebook announced the changes on Thursday, some conservatives denounced the effort as a "disaster" and a leftist ploy.