SAN FRANCISCO — As many as 126 million people — or one-third of the U.S. population — may have seen material posted by a Russian troll farm under fake Facebook identities between 2015 and 2017, according to testimony presented by Facebook's general counsel at a hearing before the Senate on Tuesday.
The figure is the largest estimate yet of the reach Russian operatives had on the giant social platform in the run-up to last year's presidential election and afterward, and reflects Facebook's new disclosure that a Kremlin-linked misinformation agency placed original content in users' feeds as well as paid ads. Previously, Facebook said 10 million people saw Russia-linked advertising that sought to sway U.S. voters.
The figures come as Capitol Hill readies itself for two more hearings, scheduled for Wednesday, on the ways Russia used social media to influence the 2016 U.S. election.
Social media companies are under pressure from lawmakers to follow the same rules on political ads that advertisers in newspapers and on radio and television already do, including disclosing who paid for the ads and barring foreign entities from running election-related ads. Facebook, Google and Twitter have all said they would begin doing so, though lawmakers have pushed for additional concessions.
Twitter, which originally said it found 201 accounts linked to Russia that were sending out automated, election-related content, also increased its estimates of the reach these operatives had on its platform. It has now found 36,746 such accounts, according to testimony to be presented by the company’s acting general counsel Sean Edgett.
The companies' testimony before the Senate Judiciary Subcommittee on Crime and Terrorism shows that Russian attempts to influence U.S. voters — by using the power of social media platforms and an understanding of hot-button social issues — were much broader than originally thought.
The goal, said Facebook General Counsel Colin Stretch, was "to try to sow division and discord — and to try to undermine our election process."
The Russian propaganda machine used a mix of ads and original posts. Facebook says an estimated 11.4 million people in the United States saw at least one ad paid for by the Russian troll farm known as the Internet Research Agency between January 2015 and August 2017.
But the number of U.S. residents who saw ads was dwarfed by the estimated 29 million people who received content the agency generated and shared in their Facebook News Feeds.
And because people frequently share, like and forward material, Facebook estimates that approximately 126 million people might have seen this divisive material during the two years in which the group was using Facebook to place ads and distribute postings.
"Many of the ads and posts we've seen so far are deeply disturbing — seemingly intended to amplify societal divisions and pit groups of people against each other. They would be controversial even if they came from authentic accounts in the United States. But coming from foreign actors using fake accounts they are simply unacceptable," Stretch's testimony said.
Ads focused on issues across the ideological spectrum, "from LGBT matters to race issues to immigration to gun rights," Stretch said.
The Internet Research Agency is a St. Petersburg-based organization that posts Russian-government-approved propaganda online under fake identities, according to U.S. intelligence officials.
The postings were still only a small fraction of the material that Facebook users see every day, he cautioned: about 0.004%, or one in 23,000 pieces of content in users' News Feeds.
Twitter says the Russia-related accounts sent "approximately 1.4 million automated, election-related Tweets," which received approximately 288 million impressions.
Edgett emphasized that these accounts made up just 0.012% of all accounts on Twitter at the time, and that tweets sent by the Russia-linked automated accounts constituted less than 0.74% of all election-related tweets.
Google found two accounts linked to the Internet Research Agency which spent a total of $4,700 on its platforms during the 2016 election cycle, it said in a blog post Monday.
The company also found 18 YouTube channels it believes were associated with the campaign. The channels uploaded political videos totaling 43 hours of content, which drew 309,000 U.S. views from June 2015 to November 2016. Because a single user might have viewed a video multiple times, the company noted, the number of people who actually saw them is likely lower.
It also found evidence that Gmail accounts “associated with the campaign were used to open accounts on other platforms” and has shared information about those accounts with the other platforms.
Facebook also revealed that in late summer it began to see fake personas created on the platform by a group that has been publicly linked to Russian military intelligence services.
Those fake personas were being used to "seed stolen information to journalists" and were organized under the banner of a group that called itself DC Leaks, according to the Facebook testimony. The stolen material included files obtained by hacking into the email accounts of former White House chief of staff John Podesta and the Democratic National Committee.