The members of the Northwest Front have been called the “worst racists” in America, amid stiff competition. On Wednesday, Facebook told Mother Jones that it had used a new enforcement strategy to take down a network of Northwest Front accounts that tried to circumvent the platform’s ban on white nationalists.
“This organization has been banned on Facebook for years now. We recently identified that they had tried to reestablish a presence on the platform. What we decided to do is identify as many of their assets as we could, identities of their supporters and members, groups, pages, and then take them all down at once,” Brian Fishman, Facebook’s director of counterterrorism and dangerous organizations policy, explained in a phone interview.
The action against the Northwest Front that Fishman outlined reflects a strategy Facebook occasionally uses to stop the most egregious and dangerous groups from using its tools. Until Wednesday, the company had not publicly shared any details of that strategy.
The neo-Nazi group had been organizing on Facebook and attempting to use the platform to recruit potential sympathizers.
Typically, Fishman explained, content that violates Facebook’s terms of service is reviewed and taken down as quickly as the company can manage, and the takedowns are accompanied by messages telling violators how they’d breached platform rules. In the recent move against the Northwest Front, though, the company instead opted to monitor accounts of individuals associated with the group, in the hopes of mapping a larger chunk of the network and taking it all down at once, with no explanations given.
“This is different than our normal operations. The reason for doing things like this is to make it more difficult to rebuild a network,” Fishman said. “If you take out one piece or consistently take down pieces, then they still can try to rebuild.”
Fishman emphasized that Facebook was not ignoring normal policy violations while monitoring the group. “If something came into standard enforcement flows, it would come down,” he said.
“Also if we recognize something that was an imminent threat, we wouldn’t delay. If we identify something that indicated that there’s potential for harm, we’ll take enforcement action,” Sarah Pollack, a spokesperson for Facebook, added.
Facebook would contact law enforcement if the group posed an imminent threat, Fishman said. It would also work with its lawyers to loop in law enforcement, the military, or other government agencies if its investigators came across any white nationalists working within those institutions’ ranks.
Facebook has been deliberately opaque about this strategy in the past, in order to make it harder for groups like the Northwest Front to understand what was happening and figure out how to dodge detection in the future. Fishman offered few details about what Facebook looks for when monitoring these types of groups. Often, he said, Facebook won’t say that it took action on a set of accounts, leaving users puzzled as their accounts, pages, and groups are deleted without explanation.
“Sometimes these groups are confused about actions. That’s a positive effect from my perspective,” he said. “When we’re talking about terror groups and hate groups, ambiguity can be a useful tool.”
The overall scope of the takedown was small: 36 Facebook accounts, 10 Instagram accounts, nine groups, and nine pages.
Fishman said this isn’t the first time Facebook has taken action of this kind. He and Pollack explained that the company had used the same strategy against other American right-wing hate organizations, including The Right Stuff and Identity Dixie, two alt-right podcasts, as well as the Revolutionary Armed Forces of Colombia (FARC), a leftist rebel group.
“We’ve always wanted to explain that we’re using this approach,” Pollack said, “but we wanted to do it now to highlight [that] our work on dangerous organizations continues even in this moment where Facebook and the world is focused on COVID-19.”