Russian disinformation operations placed their content on more than 100 online outlets as part of a two-pronged approach spanning both social media and ginned-up websites, according to a report commissioned by the Senate Intelligence Committee.
The report, conducted by researchers at the Stanford Internet Observatory, used data provided by Facebook to the committee to analyze activity between 2014 and 2019 from two groups: Russia's military intelligence agency (commonly known as the GRU) and the Internet Research Agency (IRA), described as “a nominally independent social media firm owned by Yevgeny Prigozhin, an oligarch with close ties to Russian President Vladimir Putin.”
“If appropriately coordinated, the combined capability potential between the GRU (hacking; narrative laundering) and the IRA (subversive social influence; memetic propaganda) could result in significantly impactful information operations,” the report stated.
It was the GRU, the study said, that “aimed to achieve influence” by creating phony think tanks, blogs, or fake accounts posing as journalists in order to distribute content. Among them was “Alice Donovan,” an online persona credited with posts on various sites pushing GRU narratives. The Donovan account was named as a Russian asset in the investigation carried out by special counsel Robert Mueller into that country’s purported interference in the 2016 presidential election, which led to indictments against 12 GRU officers in 2018.
“This report helps us better understand how the GRU conducts its information warfare operations,” committee chair Sen. Richard Burr said of the Stanford probe. “It’s clear that the foreign influence threat is persistent and evolving, and we cannot flag in our collective effort to combat it.”
To illustrate the reach that fake personas like Donovan, “Jonivan Jones,” and “Sonia Mangal” amassed online, the report included a graphic showing the multitude of sites where their content was posted:
While the GRU’s tactics failed to gain much traction on social media, the report said, they did prove valuable in placing propaganda:
Articles achieved placement in at least 142 alternative outlets and were occasionally amplified by large state media entities as well. While social network engagement (Likes, Shares) is one of the most quantifiable measures of reach, securing article placement delivers the attention of those publications’ audiences; a piece of content with minimal engagement on Facebook that nevertheless ends up quoted on RT (formerly Russia Today, the Russian international media network) has the potential to reach an audience of millions. A narrative that is repeated, on multiple sites, in a subsection of a media ecosystem with heavy audience overlap is more likely to achieve a measure of influence within that segment.
However, the report stated, the GRU’s efforts to release thousands of stolen Democratic National Committee documents online stalled until it provided them to WikiLeaks via another fake persona, “Guccifer 2.0.”
“That content, disseminated via WikiLeaks and then dissected and speculated upon by every major newspaper and television station in America, arguably had more of an impact on the US election than any social influence operation,” said the report.
Meanwhile, the IRA deployed its resources in October 2016 to spread WikiLeaks content drawn from thousands of stolen emails belonging to John Podesta, campaign chairman for then-Democratic Party presidential nominee Hillary Clinton. According to the Washington Post:
The IRA used its Twitter accounts to help push information from those documents and the ones WikiLeaks published in July, according to previous research by Clemson University professors Darren Linvill and Patrick Warren. They found a dramatic surge of tweets — 18,000 over a single 24-hour period — from IRA accounts on Oct. 6, 2016, the day before WikiLeaks published the Podesta emails.
Overall, the study said, the IRA “inflected their content with precision, using audience segmentation techniques regularly deployed by social media agencies,” with some of its accounts amassing more than 100,000 followers. Because the IRA is not technically owned by Putin’s government, it also enjoyed the added boost of plausible deniability.
Sen. Mark Warner (D-Virginia), the committee’s vice-chair and ranking member, told the Washington Post that the Stanford study showed that the effects of social media posts are not solely confined to the online world.
“Platforms like Facebook can also serve as the launching pad for narratives that spread throughout the information ecosystem,” Warner told the Post. “These big platforms need to do a better job of making sure they don’t become tools for Russian manipulation of American voters, and responsible actors need to take serious stock of how they interact with, rely on and amplify the information found on those platforms.”