One day after Facebook announced that it had uncovered a foreign influence campaign reaching over 300,000 users, a panel of tech experts told the Senate Intelligence Committee on Wednesday that Congress needs to set new rules for how social media companies combat disinformation.
“The time for industry self-regulation has probably passed,” said Philip Howard, director of the Oxford Internet Institute at the University of Oxford. But while Democrats, Republicans, and the experts they’d summoned to the Senate all seemed to agree that tech giants such as Facebook shouldn’t be trusted to regulate themselves, Congress doesn’t appear to be close to any sort of consensus on what that regulation should look like. While Democrats pushed for specific policy actions to address the continuing spread of disinformation by foreign intelligence services, Republicans expressed more interest in researchers’ conclusions that Russia’s attacks were not targeted solely against liberals, nor have those attacks been limited to influencing electoral politics.
“I want to summarize for the American people…[Russia] didn’t do it because they had political leanings to the right or left, or because they care about our elections,” said Sen. Richard Burr (R-N.C.), the chairman of the committee. “They did it because a weak America is good for Russia.”
Wednesday’s hearing had an unusually bipartisan flavor for this year’s Congress, thanks to the fact that foreign trolls have sown disinformation against both Democrats and Republicans. John Kelly, founder and CEO of Graphika, a social media analytics company, told the senators that bot accounts on both the far left and far right tweet on average 20 to 30 times as much as more moderate, real accounts. Researchers also pointed to attacks on Sen. Ted Cruz (R-Texas) and Sen. Marco Rubio (R-Fla.) during the 2016 election to indicate that the attacks were not specifically targeted against one party. “Russian efforts are not made against one campaign, one party, or one country,” Kelly said.
But while neither party is spared, the experts at the hearing said that foreign influence campaigns have continued to primarily focus on exploiting divisions among Democrats. Renee DiResta, director of research at New Knowledge, a social science think tank, said that there were continuing “efforts to push intra-party divisions on the left” and suppress voter turnout.
Researchers at the hearing also emphasized that foreign influence online has gone beyond electoral politics, an issue that seemed of particular concern to Republicans. DiResta cited ongoing campaigns to stir up racial divisions, discord over the Flint water crisis, and even posts from both sides of the debate over NFL players kneeling during the national anthem.
But Democrats and Republicans seemed to split when it came to the specifics of how to tackle the problem. Several of the researchers at the hearing suggested they were in favor of a recent proposal from Sen. Mark Warner (D-Va.) that social media companies be required to label bot accounts. “Even after 18 months of study,” Warner said, “we’re still only scratching the surface when it comes to Russia’s information warfare.”
Even after the identification and removal of many accounts run by the Internet Research Agency, the Russian company that launched a political influence campaign during the 2016 election, nearly one-third of disinformation accounts detected in recent months have been linked to the IRA. And with recent news that the IRA has changed tactics and begun paying in US dollars rather than Russian rubles, it could become even more difficult for social media companies to identify IRA purchases.
Sen. Ron Wyden (D-Ore.) pushed for dismantling Section 230 of the Communications Decency Act, a law he co-wrote in 1996, which prevents internet companies from being sued over user-generated content. “I just want to be clear, as the author of Section 230, the days when these pipelines are considered neutral are over,” Wyden said. Wyden also announced that he would soon introduce legislation to give the Federal Trade Commission more power to “be a tougher cop on the beat” when it came to social media regulation.
Meanwhile, many Republicans expressed concerns over regulation’s impact on free speech. Sen. James Risch (R-Idaho) pointed out that treating the issue like a cybersecurity threat contradicted the point that “platforms are supposed to be accessible to everyone.”
The hearing’s panelists said that tech companies themselves can go further to curb disinformation in the absence of new government regulation, and that companies like Twitter and Facebook can do far more to reveal the extent of foreign influence peddling on their platforms. Giving researchers greater access to platform data and encouraging cross-platform collaboration could allow for a more comprehensive picture, said Laura Rosenberger, director of the Alliance for Securing Democracy at the German Marshall Fund of the United States, a bipartisan, transatlantic think tank.
“I hope, if there’s a takeaway from today’s hearing, it is that this is the last time that we’re going to associate the propaganda that we see with an election cycle,” said Burr. “Some feel that we as a society are sitting in a burning room calmly drinking a cup of coffee, telling ourselves this is fine. That’s not fine. And that’s not the case.”