Political Extremists Are Using YouTube to Monetize Their Toxic Ideas

A new report says the site has become a breeding ground for conspiracy theorists and white supremacists.

YouTube has become a breeding ground for political extremism. Anadolu Agency/Getty Images


If you search for “Federal Reserve” on YouTube, one of the first videos to surface is titled “Century of Enslavement.” Using archival footage and the kind of authoritative male voice heard in countless historical documentaries, the 90-minute video espouses the idea that the Federal Reserve was formed in secret by powerful, often Jewish, banking families in the early 20th century, causing America to spiral into debt.

With over 1.6 million views, the video is categorized as “News and Politics.” It was created by a channel called the Corbett Report, which also boasts documentaries touting conspiracy theories, including that 9/11 was staged by the US government and that global warming is a hoax. Watching the video quickly leads users down a rabbit hole of “recommended videos” that detail Illuminati conspiracy theories and blame Israel for 9/11. 

The incendiary Federal Reserve video, flagged by MSNBC host Chris Hayes earlier this month, is just one of many examples of how political extremists have mastered YouTube’s algorithms and monetization structure to spread toxic ideas ranging from conspiracy theories to white supremacy. The video “Why Social Justice is CANCER,” for instance, appears after searching for “social justice.” 

In a new report entitled “Alternative Influence: Broadcasting the Reactionary Right on YouTube,” researcher Becca Lewis details how the site has also become home to live-streamed, difficult-to-moderate debates on topics such as “scientific racism,” in which the two sides typically reach agreement and no alternative position is discussed.

Over 10,000 active viewers watched a four-hour January debate between several YouTube creators, including white nationalist Richard Spencer and his opponent, Carl Benjamin, a commentator who criticizes identity politics and goes by the name Sargon of Akkad. “Spencer has had years of experience arguing his racial theories and spoke with more confidence than Benjamin,” Lewis writes in the new report characterizing YouTube as a largely uncritical platform for racist ideology. Benjamin asserts that his position in the debate did not advance any white nationalist or racist theories.*

YouTube’s parent company, Google, had a limited presence during July’s congressional hearings on social media’s role in propagating misinformation and political bias; Facebook and Twitter were the primary targets, and YouTube as an individual company has managed to avoid much scrutiny. With nearly 2 billion unique users a month, the platform is used regularly by 94 percent of 18- to 24-year-olds, and one in five members of this group go to YouTube first for their news, according to the Pew Research Center. With such a large audience and so little oversight, many political extremists have turned to YouTube to spread their views and make money doing it.

In the report, released today by the tech institute Data & Society, Lewis examines how political extremists have created a deeply connected “alternative influencer network” on YouTube. They have used collaborations, like staged debates, to tie together users who promote a range of political positions, some of which are extremist.

YouTube posters like “Roaming Millennial” co-opt language from the left to try to radicalize viewers. YouTube

Lewis, who is a doctoral candidate in communications at Stanford University, spent a year analyzing more than 80 YouTube channels that connected users with positions ranging from mainstream liberalism to white supremacy. She found that YouTube hasn’t just provided a platform for ideas formerly relegated to anonymous internet forums like 4chan—it’s helped monetize them. 

Through YouTube’s network, content from so-called “dark web intellectuals”—like anti-feminist psychology professor Jordan Peterson—quickly leads users to more radical ideologues, like white nationalist Richard Spencer, who broadcast what Lewis describes as “extremely harmful—in many cases, racist and sexist—content.” Content creators adopt the position of marginalized cultural underdogs while reaching audiences of millions. By appearing on each other’s channels and collaborating on staged debates, these alternative content creators have built an intricate network that makes political extremism easily accessible on YouTube.

Mother Jones spoke to Lewis about her report and how YouTube has become a breeding ground for political extremism.

Mother Jones: It seems YouTube gets much less attention than Facebook and Twitter when we talk about political extremism and misinformation. Why is that?

Becca Lewis: We’ve gotten a really clear picture of the type of fake news that disseminates on Facebook [when it’s] created by Macedonian teenagers. We’ve gotten a really clear idea of what can happen when Donald Trump retweets a tweet that originated from an anti-Semitic meme on an anonymous forum. We don’t have as clear a picture of what’s happening on YouTube and Google. It is important to bring to the fore some illustrations of the problems that do exist on these platforms. I’m trying to show there are fundamental issues we need to be addressing with YouTube in the same way we have recognized fundamental issues with Facebook and Twitter.

MJ: YouTube is also one of the only platforms that offers financial incentives to creators through its Partner Program, which allows them to make money off ads on their videos. Can you explain how this incentivizes more extreme behavior?

BL: One of the troubling implications of the report is that these issues can’t be fixed with a simple tweak here or there because they are built into the monetization structure of YouTube. One thing that makes YouTube so appealing to influencers and viewers alike is the fact that viewers can interact directly with the people making content and have a more intimate relationship than viewers or readers have with mainstream news outlets. 

At the same time, an influencer who is making content, in most cases, is also trying to make money off it. So when they have viewers who are telling them to keep making more and more extremist content, they have a direct financial incentive to do so. It speaks to the larger culture of metrics in newsrooms and the emergence of clickbait, but it’s particularly pronounced in a very specific way on YouTube. And I think you see people going down these [ideological] paths they might not otherwise because they’re financially incentivized to do so.

MJ: One way these posters define themselves is by claiming to be underdogs under attack by mainstream society. Do you have any thoughts on how to de-platform or de-monetize these creators when they can simply turn around and point to those efforts as proof of the very persecution they claim?

BL: That’s a fundamental question that has been plaguing academics and tech firms alike. My interpretation is that the underdog narrative thrives when content moderation and de-platforming happen inconsistently and without clear explanation. And the fact is that if [extremists] were consistently de-platformed, they wouldn’t be able to make content about it.

Someone who has talked a lot about censorship is Paul Joseph Watson, an influencer on YouTube who is affiliated with Infowars. Most recently, after Alex Jones was removed from YouTube, Watson made a video called “(((Censored))),” which signals anti-Semitic themes while discussing alleged conservative bias on social media platforms. But at the same time, he has over 1 million subscribers on YouTube. And as part of the YouTube Partner Program, he has received the plaque YouTube gives influencers who pass 1 million subscribers. So here he is able to push that narrative of censorship while getting influencer treatment from the platform.

MJ: There’s been a lot of discussion in Congress about how to best regulate social media platforms. Do you have any thoughts on what actions should be taken by either tech companies or lawmakers to rein in this problem?

BL: Up until now these platforms have largely been given carte blanche; they have evaded regulation to a large extent. So even the shifting nature of the conversation, the fact these platforms are now facing pressure externally, is a promising sign. Even though you could debate how much actually came out of the congressional hearings, I think it’s a promising sign that they have started. In terms of talking about solutions, we need to be approaching these problems from multiple tracks. I absolutely think reassessing the algorithms [that surface extremist content] is one step that needs to be taken. Assessing what government regulation options are available is absolutely worthwhile, and then thinking about how YouTube monetization structures incentivize certain behaviors is something that needs to be done. It needs to be a multi-pronged solution.

This interview has been edited for clarity. 

Correction: This story has been updated to clarify Carl Benjamin’s position in the Richard Spencer debate.
