A New York Times column on Facebook’s actions against the data analysis tool CrowdTangle prompted concern from both independent researchers and at least one group collaborating with the platform.
According to Kevin Roose, a tech columnist for the newspaper, Facebook’s decision to move CrowdTangle employees under the jurisdiction of the platform’s “integrity” department in April 2021 followed an extended clash between Facebook executives over data transparency. Roose wrote:
They argued that journalists and researchers were using CrowdTangle, a kind of turbocharged search engine that allows users to analyze Facebook trends and measure post performance, to dig up information they considered unhelpful — showing, for example, that right-wing commentators like Ben Shapiro and Dan Bongino were getting much more engagement on their Facebook pages than mainstream news outlets.
These executives argued that Facebook should selectively disclose its own data in the form of carefully curated reports, rather than handing outsiders the tools to discover it themselves.
Team Selective Disclosure won, and CrowdTangle and its supporters lost.
Brian Boland, who was identified in the column as a Facebook vice president “in charge of partnerships strategy,” left the company in November 2020. He told Roose that executives were “enthusiastic” about CrowdTangle — right up to the point that the information it yielded led to stories critical of the platform.
“One of the main reasons that I left Facebook is that the most senior leadership in the company does not want to invest in understanding the impact of its core products,” Boland said. “And it doesn’t want to make the data available for others to do the hard work and hold them accountable.”
Roose did not mention that his employer had been partnering with Facebook on the use of augmented reality (“AR”) technology on its Instagram account. (Instagram, like CrowdTangle, is owned by Facebook.) The Times did not respond to our request for comment on that relationship. Axios reported that the newspaper recently hired a former Facebook engineering manager, Jason Sobel, as its new chief technology officer.
But Roose did note Facebook’s displeasure with the data he provided on the @FacebooksTop10 account on Twitter, which showed how right-wing content dominated the platform:
Roose said that after verifying the information he posted through the account, Facebook executives considered creating a separate account using internal data that would provide “more balanced lists.”
“They never did that,” Roose wrote, adding:
But several executives — including John Hegeman, the head of Facebook’s news feed — were dispatched to argue with me on Twitter. These executives argued that my Top 10 lists were misleading. They said CrowdTangle measured only ‘engagement,’ while the true measure of Facebook popularity would be based on ‘reach,’ or the number of people who actually see a given post. (With the exception of video views, reach data isn’t public, and only Facebook employees and page owners have access to it.)
The platform has claimed that it is not moving to stifle CrowdTangle.
“CrowdTangle is part of a growing suite of transparency resources we’ve made available for people, including academics and journalists,” a spokesperson told Roose. “With CrowdTangle moving into our integrity team, we’re developing a more comprehensive strategy for how we build on some of these transparency efforts moving forward.”
But the decision has raised red flags with researchers who have used CrowdTangle independently. Moira Whelan, director for democracy and technology for the non-profit National Democratic Institute, said on Twitter that her group uses CrowdTangle data in its own disinformation-related analyses and that “a reduction of effort” around it would hurt democracies worldwide.
“Crowdtangle in the US political context is one thing. But what is disturbing is how much this drives decision-making in what is a global company,” she wrote. “Starving it of resources is the absolute opposite of what should happen if we have any hope of addressing disinformation. The tool has helped civil society identify patterns, find out who’s leading attacks, and empower them to identify changes Facebook needs to make.”
Roose’s column came out less than a year after Facebook threatened “enforcement action” in October 2020 against researchers at New York University who were studying its practices concerning political advertising. The company backed down on that threat that December.
Jared Holt, a resident fellow at the Digital Forensic Research Lab researching domestic extremism, called Roose’s story part of a pattern within Facebook and other platforms.
“When researchers/journalists use the transparency tools and find bad trends or tidbits, the company restricts the transparency rather than grapple with the core issues,” he wrote.
Reached for additional comment, Holt told us:
Major tech companies have incorporated themselves deeply into American life at a scale that is hard to overstate. Transparency into their operations and what is occurring on their platforms is of immense public interest and vital to understanding platforms’ influence domestically, as well as internationally. Any moves to lessen transparency deserve hard scrutiny, due to the immense value inherent in its availability.
Following the publication of Roose’s column, International Fact-Checking Network (IFCN) director Baybars Orsek expressed concern on his own Twitter account about whether the service would remain available to member news outlets. Reached for follow-up comment, Orsek told us that the group had already confirmed with Facebook that it would continue to have access to CrowdTangle.
Like many other journalism-related organizations, the IFCN has partnered with Facebook or apps under its corporate umbrella for various initiatives. When we contacted Orsek asking if the IFCN was still comfortable doing so in light of both Roose’s column and Facebook’s threat against the NYU researchers, he sent us a statement saying:
Dozens of organizations within the IFCN verified signatories community are working with Facebook to counter misinformation on the platform and we are committed to assisting those organizations by providing the company with our honest and constructive feedback and input as well as partnering for programs to make more resources and tools available for the fact-checking community.
Update July 16 2021, 12:37 p.m. PST: Updated with additional comment from Baybars Orsek.