Is There Any Way to Clean Up Facebook and Twitter?


Tyler Cowen argues today that no form of internet speech moderation will satisfy everyone:

I’d like to suggest a simple trilemma. When it comes to private platforms and speech regulation, you can choose two of three: scalability, effectiveness and consistency. You cannot have all three. Furthermore, this trilemma suggests that we — whether as users, citizens or indeed managers of the platforms themselves — won’t ever be happy with how speech is regulated on the internet.

Note that Cowen uses “effective” to mean “doesn’t require so much time that the platform company can’t do its core job anymore.” Back when blogs were new and comment moderation was the big issue we were all trying to resolve, I ran into the same trilemma:

  • If the blog was small, I could easily moderate comments and do it consistently.
  • If I was willing to spend lots of time on moderation, I could manage a large blog with consistent comment policies.
  • If I decided not to worry about consistency, I could manage a large blog without putting a lot of time into comment moderation.

I never came anywhere close to finding a solution to this, and I generally chose option #3. I’d do a bit of moderation here and there, and it would necessarily be pretty inconsistent. However, that left me enough time to actually write a blog even as my audience grew. There’s just no way to spend hours moderating comments and still have hours left over to write a blog of decent quality. The same trilemma affects huge social media platforms:

  • A system that’s big and effective (i.e., lightly moderated by the platform company) will inherently be inconsistent.
  • A system that’s big and consistent will inherently require huge resources from the platform company and therefore won’t be effective.
  • A system that’s effective and consistent requires too much human intervention to ever become very big.

Most people don’t get this, and therefore expect too much from platform companies like Twitter and Facebook. These companies can use automation to do a lot of the job, but automation isn’t even close to perfect yet. So what do you do? If the automation is too tight, it will eliminate innocent comments and everyone will scream. If the automation is too loose, it will let lots of hate speech through and everyone will scream. If you ditch the automation and use humans, you’ll go bankrupt—and anyway, human moderation is far from perfect too.

I didn’t have an answer to this back when I was a lone blogger (these days I get help from MoJo moderators), and I don’t have one today. However, my own personal view is that we should think of internet moderation much the same way we think of real-life moderation. This leads me in the direction of (a) light moderation that lets people say whatever they want, even if it’s gruesome, and (b) giving users the tools to do their own moderation. I’m far more in favor of the latter than of having Twitter or Facebook make centralized decisions about what to allow and what to ban.

This is not a perfect solution, but that’s because there are no perfect solutions. And there’s no question that different people benefit from different levels of moderation. It’s one thing for a white man like me to prefer light moderation, but quite another for a black woman who gets far more abuse. Nonetheless, I don’t really see a good solution other than giving us all more and more tools to set our own preferred moderation levels while we wait for automated systems to get better. That’s going to be a while.


