Facebook Still Won't Own Up To Its Role Spreading Russian Propaganda

A little under a year ago, Facebook CEO Mark Zuckerberg dismissed as “pretty crazy” the idea that a disinformation campaign could have used his site to influence the outcome of the election.

With regulators now breathing down his neck for exactly that reason, Zuck might wish he could take those words back.

On Tuesday, Facebook general counsel Colin Stretch, along with his counterparts at Twitter and Google, testified before the Senate Judiciary Subcommittee on Crime and Terrorism, where he disclosed that Facebook facilitated the spread of Russian-backed content to an estimated 126 million Americans during the 2016 election.

That figure is based on Facebook data indicating that 120 Russian-backed Facebook pages published 80,000 posts between January 2015 and August 2017 (with an influence campaign potentially stretching back as early as 2014). Those posts were in turn seen directly by 29 million Americans, who, by interacting with them, distributed the falsities throughout their own personal networks.

That means more than half of all eligible voters in 2016 saw deliberately misleading political content pushed by a foreign adversary on Facebook. 

[Photo caption: Facebook CEO Mark Zuckerberg stands in front of a slide boasting about user engagement during his address at the company's 2011 developers conference.]

And yet, Facebook continues to downplay its own influence. In testimony, Stretch emphasized that Russian propaganda constituted just 1 out of every 23,000 posts on the site, and therefore couldn’t have significantly affected the election’s outcome.

Sen. Chris Coons (D-Del.) printed out several examples of the content in question, including one from the now-defunct “Heart of Texas” page, which had around 225,000 followers during the campaign last summer. The post in question ― later identified as Russian ― claimed to represent American military veterans’ disapproval of Hillary Clinton.

Another of Coons’ examples was more than a mere ad, he said ― it represented “a national security issue.” The post in question, claiming to represent a “Miners for Trump” group, called on its followers to rally for a “Unity Day” in Pennsylvania on Oct. 2.

“[The page] duped Americans into coming to an event that was nothing but a fake,” said Coons.

“You’ve said these things are vile and you take responsibility for changing, [yet] we’re nearly a year after the election,” Coons remarked, then pointedly asked Stretch: “Why has it taken Facebook 11 months to understand the scope of the problem and begin to address it, when former President Obama cautioned your CEO literally nine days after the election that this was a big problem?”

Coons was referring to reports that Obama pulled Zuckerberg aside last November and personally appealed for him to take the threat seriously.

Stretch deflected, noting that Facebook has taken action since then and that Obama’s comments to his boss were more “about fake news generally.”

Sen. Mark Warner, the top Democrat on the Senate Intelligence Committee, says he has reason to believe Facebook (along with Twitter and Google, which saw similar disinformation campaigns) has yet to uncover the full extent of the meddling.

This cycle ― wherein Facebook denies, then begrudgingly admits, that there could have been a small problem, then drastically revises its numbers weeks later, all while still downplaying its own role ― has been the norm since Zuckerberg famously dismissed the possibility that fake news could’ve influenced the election in any way.

Here’s a brief timeline of Facebook’s year of denial:

― In July, a Facebook spokesman told CNN: “We have seen no evidence that Russian actors bought ads on Facebook in connection with the election.”

― Two months later, Facebook announced Kremlin-linked accounts had in fact spent around $100,000 promoting 3,000 political ads “amplifying divisive social and political messages across the ideological spectrum.”

― Amid the fallout of the ad buy news, Zuckerberg minimized the ads’ impact, noting the campaigns themselves spent “hundreds of millions” to advertise on Facebook. “That’s 1000x more than any problematic ads we’ve found,” he wrote.

― In early October, Facebook revealed those 3,000 ads may have had more of an impact than the company first acknowledged, as they were seen by around 10 million people in the U.S., a population roughly equivalent to the entire state of North Carolina.

― Which brings us to this week’s hearings, where the company again amended its data, disclosing that it exposed 126 million Americans to a disinformation campaign, all while downplaying its own influence.

This story has been updated to include comments from the hearing.
