Why Reddit Is Losing Its Battle with Online Hate

New research shows how the message board keeps giving bigotry a home.



Reddit’s dark corners can seem like a dangerous and seedy mess. With over 330 million users, the message board platform is vast and filled with posters obsessed with sports, hobbies, and hyper-specific public transit memes. But its far-right communities have peddled false-flag conspiracy theories, spread Islamophobic and anti-Semitic content, and encouraged violence. While the company has banned some of its worst message boards, it has often ignored or been slow to take action on other hateful communities.

Over the past several years, Reddit has moved to ban toxic, racist boards like r/EuropeanNationalism and r/MillionDollarExtreme, while allowing other communities, like r/CringeAnarchy and r/The_Donald, to stay online even as their users posted racist comments in the wake of mass shootings and defended such violence. And according to the authors of a June academic paper on the company’s moderation practices, unless Reddit makes significant changes to how it enforces its policies, the company will always be at least one step behind in its battle against hateful content.

The study was conducted by computer science researchers from the University of Iowa and the Lahore University of Management Sciences, who analyzed more than 3,000 popular Reddit communities, or subreddits, to examine how Reddit polices the worst content on its platform. Their comprehensive dataset, which included hundreds of hateful, banned subreddits, accounted for 6 percent of all posts and 11 percent of all comments made on Reddit between 2015 and 2018.

The researchers report two major findings. The first is that users’ affiliations can be used to predict, often months in advance, which Reddit communities will turn so hateful and dangerous that they will earn a ban. While that indicates Reddit could proactively stop subreddits before they spiral into cesspools, the researchers’ second conclusion is that the company often ignores such signals and instead enforces its platform rules only haphazardly, often in the wake of media attention on particularly problematic subreddits.

The result is that users of banned subreddits can migrate over to other dangerous communities that have been allowed to stay online, where hate and harassment can continue unchecked. For example, when Reddit banned the fascist and violence-advocating r/Physical_Removal and the sexist and racist r/MillionDollarExtreme, many users likely moved to similar subreddits like r/KotakuInAction and r/CringeAnarchy (which wasn’t banned until April), where they could post the same kinds of content that got their previous subreddits banned.

In each of these cases, the communities had developed strong reputations for being home to hateful, policy-violating content, garnering attention from mainstream news sites as Reddit let them stay up for months before taking action. One, r/KotakuInAction, a hub of the misogynist GamerGate movement, remains online thanks to Reddit’s intervention. After the subreddit’s creator decided to shut it down for, in his words, becoming “infested with racism and sexism,” a company representative stepped in to prevent it from closing, according to The Verge.

“Reddit isn’t very consistent with the way that it enforces its policy,” says Rishab Nithyanand, an author of the paper, pointing to the team’s research suggesting that Reddit, rather than impose clear and consistent moderation standards, usually only takes action after media stories highlight a particularly egregious subreddit.

“While community A and B are discussing the same terrible things, A gets banned because it gets picked up by the press, but B carries on,” says Nithyanand, a computer science professor at the University of Iowa. Community B, he explained, then becomes a safe haven for displaced members of A, allowing the spread of hate speech and bigotry to continue almost unchecked.

There have been recent high-profile examples of such a cycle. After the Christchurch mosque shooting in New Zealand, users of two communities, r/CringeAnarchy and r/The_Donald, posted bigoted content justifying the killings, almost certainly in violation of Reddit’s terms of service. But the company did not ban the communities at the time, despite the abhorrent content and both communities’ histories of hosting bigotry. It finally banned r/CringeAnarchy a month later, and in June it “quarantined” r/The_Donald, its term for putting a subreddit on probation and making it harder for users to find its content. The step is often followed by a complete ban.

The authors found that a subreddit’s descent into hate can be predicted fairly reliably by the number of its members who also belong to already banned communities or to other hateful but not-yet-banned ones. Using a model based on data recording users’ participation across multiple subreddits, the group says it was able to use early data to predict which subreddits would eventually earn a ban with 70 to 90 percent accuracy.

“This suggests that administrator and community moderation tools which rely on measuring the connectivity of subreddits to known hate or banned subreddits can be used to pre-emptively identify which subreddits require careful monitoring or even administrator/moderator interventions,” the authors wrote.
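As a rough illustration of the kind of signal the authors describe, the sketch below scores hypothetical subreddits by how much their membership overlaps with already banned communities, then feeds those scores to a simple classifier. This is not the paper’s actual model: the data, the two overlap features, and the use of scikit-learn’s logistic regression are all illustrative assumptions.

```python
# A minimal sketch (not the researchers' pipeline) of predicting which
# subreddits may earn a ban from their members' overlap with banned
# communities. All data below is made up for illustration.

from sklearn.linear_model import LogisticRegression

# Hypothetical membership data: subreddit name -> set of user IDs.
subreddit_members = {
    "sub_a": {"u1", "u2", "u3", "u4"},
    "sub_b": {"u3", "u4", "u5"},
    "sub_c": {"u6", "u7"},
    "sub_d": {"u1", "u5", "u8"},
}
banned = {"sub_a"}                              # already banned communities
labels = {"sub_b": 1, "sub_c": 0, "sub_d": 1}   # 1 = later banned

def overlap_features(name):
    """Return two features: the fraction and the raw count of a
    subreddit's members who also post in banned communities."""
    members = subreddit_members[name]
    banned_users = set().union(*(subreddit_members[b] for b in banned))
    shared = members & banned_users
    return [len(shared) / len(members), len(shared)]

# Train a simple classifier on the overlap features.
X = [overlap_features(s) for s in labels]
y = list(labels.values())
model = LogisticRegression().fit(X, y)

# Estimate the ban probability for one community.
print(model.predict_proba([overlap_features("sub_b")]))
```

In practice the researchers worked with years of posting data across thousands of subreddits, but the core intuition, that heavy overlap with known hateful or banned communities is predictive of a future ban, is the one the sketch captures.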

Nithyanand thinks that his and his colleagues’ findings strongly indicate that Reddit needs to radically reform its moderation of hateful content, from the way it applies its policies to the tools it uses, including more machine learning and AI to help human moderators spot and handle content that potentially violates the platform’s rules. While Reddit has grappled with how to handle hate speech, the company has boasted of using such “proactive” systems in its work to detect, even before users complain, attempts to manipulate the popularity of content on the platform.

Nithyanand believes that if Reddit consistently banned all communities violating its policies, the company would make it harder for the site’s worst users to find new homes where they could keep spreading bigoted, homophobic, and sexist messages, promoting violence, and otherwise breaking Reddit’s rules. This would be a significant change from Reddit’s usual practice of taking action, if at all, at inconsistent points in communities’ evolution.

“The power and influence they have when they have to go and create a whole new community is significantly lower. The number of users that carry on posting in a new community is far lower than the original,” Nithyanand said. “But if they’re already in two well-established homes, but one got banned, that doesn’t change their level of participation.”

Reddit declined to comment on the record about the paper or related critiques of the company’s moderation policies.

Nithyanand’s research stands in contrast to past work suggesting that banning individual subreddits reduces users’ hateful behavior, including a 2017 paper by academics at the Georgia Institute of Technology, Emory University, and the University of Michigan. Those researchers found that after several communities were banned, members of those boards who went on to post in other subreddits used 80 percent less hate speech in their new posts. While that paper was based on data from only two banned subreddits, r/FatPeopleHate and r/CoonTown, Nithyanand’s more recent analysis included over 3,000 subreddits, including hundreds of offensive, banned, or quarantined ones, and more closely examined users’ behavioral changes, both when they first join a community and when it is banned.

Despite its role in hosting and nurturing hateful content, Reddit has largely avoided the public controversies that have dogged larger and more widely known platforms like YouTube and Facebook. But the company has played a pivotal role in online hate, acting as a bridge between right-wing communities on those mainstream platforms and hateful, far-right communities on 8chan and elsewhere. By stepping up its enforcement efforts, Reddit could interrupt a key pathway to bigotry.
