In the week leading up to the 2018 midterm elections, two massive stories dominated the news: the migrant caravan approaching the southern US border and pipe bombs mailed to major liberal figures. As the stories spread online, so did a constellation of hoaxes and conspiracies about them. Some conservatives baselessly claimed that George Soros, the liberal mega-donor and hedge fund billionaire, was somehow fueling both events. Others called the bomb scares a false-flag operation to bring sympathy and votes to Democrats in the upcoming election.
The events recalled the lead-up to the 2016 election, when outright false stories and misleading memes went viral. Except this time, the misinformation looked like it was being spread by Americans, not a coordinated group of Russian trolls.
In 2017, as Americans began to come to grips with social media manipulation, US intelligence agencies, Congress, and the media quickly identified Russia as the disseminator of hoax news stories in its attempts to sow divisions among Americans. The Kremlin’s Internet Research Agency pumped out memes and built pages designed to divide the American public. Others, like Iran, glommed onto the seeming success of the IRA’s operation and launched their own digital influence efforts.
Digital misinformation then seemed like it was mostly being spread from outside the borders of the United States. Now the misinformation is coming from inside the house.
Dipayan Ghosh, a former Facebook policy adviser and current fellow at New America and the Shorenstein Center at the Harvard Kennedy School, says in an email that “instead of only looking outward, we should be looking in. The disinformation problem is becoming increasingly domestic.” Ghosh explains that at Facebook, for example, most of the disinformation campaigns the company observes originate domestically. He thinks this trend is only accelerating.
In the case of the hoax linking the migrant caravan to Soros, Jonathan Albright, a social media researcher at Columbia University’s Tow Center for Digital Journalism, found that the earliest mentions of migrant caravans in conjunction with the liberal billionaire came within Facebook groups that didn’t appear to be organized by a foreign country trying to spread false information.
Domestic disinformation and misinformation aren’t new. Outlandish conspiracy theories like Pizzagate went viral before 2018. Infowars, a pivotal progenitor of internet misinformation, has been online since 1999. “What’s new is the technology that makes it easy to set up websites, write dubious stories, modify videos, and spread them through social media, which gives them exposure that’s unprecedented,” explains Alan Rosenblatt, director of digital research at Lake Research Partners and a professor at George Washington University.
Before social media could amplify information in the way it does now, Infowars’ outrageous claims, including one that the 2010 film Machete was a part of a government plot to incite a race war, went relatively unnoticed. Now, hoaxes rip across the internet, leaving wakes big enough to cascade into national discourse.
In 2018, major technology giants finally deplatformed prominent hoax peddlers like Infowars creator Alex Jones, but his impact couldn’t be reversed. Even though he’s no longer on Facebook and Twitter, Jones’ domestic misinformation playbook is still etched into far-right internet circles.
After a horrible school shooting, the victims are portrayed as crisis actors in an orchestrated plot to boost gun-control legislation. When protests break out against conservative policies, the protesters are painted as paid participants in a liberal plot to influence policy. The financier of all these made-up schemes? The right’s favorite boogeyman, George Soros. And if the hoax is completely unverifiable, all the better. It’s hard to disprove a claim that was never based in reality to begin with.
This framework for dreaming up hoaxes to rebut news events, which Jones popularized, is no longer solely an internet affair. Prominent mainstream Republican pundits like Rush Limbaugh and Fox News hosts have pushed such conspiracies too, including the false-flag hoax about the bomb scares. Even Republican Rep. Matt Gaetz of Florida promoted the idea that Soros funded the migrant caravan on Twitter, though he later wrote a correction in a separate tweet.
During the week of the midterm elections, well-publicized hoaxes didn’t emerge from shady, potentially foreign accounts, but from known American figures, sometimes with blue verified check marks on Twitter.
The bestselling conservative author Larry Schweikart tweeted out a hoax from his verified account that “illegals” were being bused to polling places and paid to vote for Democratic Texas Senate candidate Beto O’Rourke. Republican Georgia gubernatorial candidate Brian Kemp sent out a robocall to voters falsely claiming that his opponent, Democrat Stacey Abrams, was trying to “steal” the election using undocumented voters.
Domestic misinformation didn’t stop after the elections on November 6. One week later, the White House used doctored video to justify suspending CNN reporter Jim Acosta’s White House press pass. And during the harrowing fires in Paradise and Malibu, California, far-right internet figure and verified Twitter user Mike Tokes shared baseless conspiracy memes about the fire being the result of “lasers, or high frequency direct energy beams” potentially shot from drones. The hoax picked up steam and persisted in gullible corners of the internet.
As more and more people in the United States push their own fake information, social media manipulation from groups abroad isn’t going away either. Facebook, Twitter, and Google have disclosed several manipulation attempts by Russia and Iran that they’ve caught this year. The problem isn’t shifting but is instead compounding.
Internet companies are trying to crack down on these hoaxes. Facebook launched a “war room” to weed out misinformation and largely kept fake news from spreading on Election Day. Twitter and Google say they’re trying to get ahead of misinformation peddlers too. But hoax peddlers have gotten more sophisticated as well, and the spread of false information continues.
Zeynep Tufekci, a social media researcher and professor at the University of North Carolina, doesn’t think that major tech companies can innovate their way out of enabling the spread of hoaxes. There’s not a technological solution, she says, because facilitating the spread of salacious and often false information is an inherent feature of the platforms’ current business model, not a glitch. If that doesn’t change, misinformation of all types may get even worse in 2019.