In January, following the deadly insurrection at the US Capitol, Facebook halted Donald Trump’s ability to post to Facebook and Instagram, and asked the company’s new Oversight Board to determine whether Trump’s ban should be made permanent. The board’s ruling came Wednesday—but a final decision on Trump’s status did not. Instead, the board pushed the question back onto Facebook: The ban will remain in place while Facebook deliberates.
The platform’s ostracizing of the former president brought its Oversight Board enormous attention. Facebook created the board to take on some of its trickiest content moderation decisions, wrote the board’s rules, and funded it through a trust. But it also limited the panel’s authority: Board members only consider individual content decisions, such as whether a given post should be removed, but cannot make broader changes to how Facebook operates.
For this reason, Facebook critics see the board as a clever distraction from the true problems plaguing the world’s largest social media company. The longer the public debates the board’s merits, they say, the less we pay attention to Facebook’s structural problems, including a business model based on outrage, surveillance capitalism, and anticompetitive practices. Even Trump is part of this elaborate side show, according to the company’s foes—a distraction from Facebook’s myriad issues.
I spoke with Roger McNamee, an early Facebook investor who in recent years has become one of the company’s most prominent critics and the author of Zucked: Waking Up to the Facebook Catastrophe, about the board’s decision, and what people should be minding instead. This conversation was edited for length and clarity.
What’s your quick take on the Oversight Board’s decision today?
I’m mostly focused on the fact that, at a point in time when Facebook is the subject of a Texas antitrust case related to price fixing with Google, when an insurrection was actually organized on its platform, when Facebook’s own research and internal disclosures have suggested that people were radicalized into QAnon by the platform, and when disinformation about COVID amplified on Facebook has undermined the nation’s response to a pandemic, we’re sitting here talking about the Oversight Board instead of talking about those issues. That is a huge win for Facebook.
I think of the Oversight Board as an elaborate public relations stunt. The controversies are essentially created to make the distraction more effective. As much as I’m sure they would rather have had the Oversight Board take the hit for this, better to buy another six months than have to put the Trump thing behind them, and then face whatever scrutiny they were going to get on those other issues.
So Facebook doesn’t mind that we get to have this same Trump conversation over and over for another six months?
Facebook is in the business of attention. They know more about the manipulation of human attention than just about any business on Earth. And once you recognize that the Oversight Board—in fact, frankly, all of their communication strategy—is really about attention, then everything makes sense, because suddenly you realize they’re like a magician. And they know how to draw your attention to the left hand, so you don’t see what the right hand is doing. And they’ve done that here.
To me, the Oversight Board’s decision to punt back to Facebook today reinforced that critique. Because it just kept us in a loop, talking about the same thing.
Rather than asking the question of whether Donald Trump should be reinstated to Facebook, a more useful question would be: Is there any way that Facebook can be made safe for democracy, public health, and self-determination? And what would that look like?
Do you have an answer?
If we put this in the frame of how US policy and law work, there are three areas you have to look at. You have to look at safety. You have to look at privacy, which is really code for self-determination. And then you have to look at competition, which is antitrust.
Under safety, the issue is that the software industry has no standards. There’s no equivalent to the Hippocratic oath. There’s no requirement that engineers anticipate, much less mitigate, harm before shipping a product. And there’s no accountability for when they do, in fact, ship a harmful product. And so we have to fix that.
[On privacy] the place you start is by recognizing that all humans need a sanctuary, that constant surveillance is a terrible thing. And that it would be best if we agree that there were certain classes of data that should not be shared, things that are incredibly intimate. I think you start by banning that, and I think you then maybe have an opt-in rule for everything else.
And then, lastly, you have the competition stuff. That’s the place where our system of government is further along. We have now a bipartisan consensus that this industry needs to be regulated with antitrust laws.
To bring this back to Trump a bit, you’re saying that the Trump decision is just one little patch, while these broader problems persist?
I think of Trump as one of the black holes of internet platforms. He just absorbed all the energy. But he was also more or less enabled by them.
Say Facebook decides to let Trump back on. What do you think are the consequences of that?
Our [political] system is in a very dangerous place right now, where the forces of right-wing extremism have political power greatly in excess of their numbers. And internet platforms have been central to enabling that. It all became really obvious with Trump’s election in 2016. I think it’s better that he’s not on. But how big a difference it makes I have no idea.
You can’t really ever know the counterfactual.
Forget counterfactual for a minute. The Republican Party’s refusal to acknowledge the outcome of the election is as extreme as it could possibly be. How different would things be if Trump were still on internet platforms? I mean, obviously, the quality of the platforms is slightly less toxic without him on there. I’m not sure that our politics could be any worse than they are right now.