In a Friday News Dump, YouTube Announces Election Disinformation Policy Changes

On June 2 2023, a Reddit user shared a post claiming that YouTube altered a policy and was no longer restricting false claims about the 2020 election in the United States:

A day later, a similar post appeared on r/technews, indicating YouTube (a subsidiary of Google) would “no longer take down false claims” about “U.S. elections”:

Fact Check

Claim: In June 2023, YouTube announced it would “stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.”

Description: As accurately described on Reddit and Twitter, Google’s video-sharing platform YouTube issued a statement on June 2 2023 confirming changes to its election disinformation policy. YouTube announced that it would stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections.

Rating: True

Rating Explanation: The claim is true. The announcement was shared by various outlets and is confirmed by YouTube and Google’s own direct statement.

On Twitter, CNN tweeted about the change, and right-wing disinformation purveyor Charlie Kirk lauded the policy (while lamenting the platform’s remaining restrictions):

Late in the day on Friday June 2 2023, a post titled “An update on our approach to US election misinformation” was published to the YouTube Official Blog. A subheading referenced YouTube “providing a home for open discussion and debate during the ongoing election season,” and the announcement explained in part:

We first instituted a provision of our elections misinformation policy focused on the integrity of past US Presidential elections in December 2020, once the states’ safe harbor date for certification had passed. Two years, tens of thousands of video removals, and one election cycle later [in June 2023], we recognized it was time to reevaluate the effects of this policy in today’s changed landscape. In the current environment, we find that while removing this content does curb some misinformation, it could also have the unintended effect of curtailing political speech without meaningfully reducing the risk of violence or other real-world harm. With that in mind, and with 2024 campaigns well underway, we will stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections. This goes into effect today, Friday, June 2 [2023]. As with any update to our policies, we carefully deliberated this change.

This specific aspect of our elections misinformation policy represents just one piece of a broad, holistic approach towards supporting elections on YouTube. Here’s what isn’t changing:

  • We are ensuring that when people come to YouTube looking for news and information about elections, they see content from authoritative sources prominently in search and recommendations. For example, following the 2020 US election, we found that videos from authoritative sources like news outlets represented the most viewed and most recommended election videos on YouTube. And our 2020 election information panels, with relevant context from voting locations to live election results, were collectively shown over 4.5 billion times.
  • All of our election misinformation policies remain in place, including those that disallow content aiming to mislead voters about the time, place, means, or eligibility requirements for voting; false claims that could materially discourage voting, including those disputing the validity of voting by mail; and content that encourages others to interfere with democratic processes.

The rest of our policies continue to apply to everyone on YouTube, for all types of content, including elections. This includes policies against hate speech, harassment, and incitement to violence.

Google/YouTube issued the statement late in the day on a Friday — an action typically viewed as a “Friday news dump.” In a May 1 2023 fact check about an unrelated Friday news dump, we excerpted political commentator Taegan Goddard’s The Political Dictionary:

Releasing bad news or documents on a Friday afternoon in an attempt to avoid media scrutiny is often called a “Friday news dump” by members of the media.

NPR: “Often, the White House sets the release of bad news and unflattering documents to late Friday afternoon. The Pentagon and other agencies also use the practice, a legacy of earlier administrations.”

The television show The West Wing had an episode on the technique called, “Take Out the Trash Day” … “Any stories we have to give the press that we’re not wild about, we give all in a lump on Friday.”

Discourse on social platforms demonstrated that the announcement was not well received and that it instantly aroused suspicion. Commenters on r/technews speculated that Google’s decision was a cost-cutting measure:

More like not policing content is cheaper than any (even shitty automatic) policing. They will only do it when absolutely unavoidable, in the least engaging way. That’s why copyright take-downs are the travesty we know.

The Reddit post linked to an article by NPR about Google’s YouTube policy change. It reported that other platforms had also started loosening restrictions on the spread of disinformation, adding:

“YouTube was one of the last major social media platforms to keep in place a policy attempting to curb 2020 election misinformation. Now, it’s decided to take the easy way out by giving people like Donald Trump and his enablers free rein to continue to lie without consequence about the 2020 elections,” said Julie Millican, vice president of liberal watchdog Media Matters for America. “YouTube and the other platforms that preceded it in weakening their election misinformation policies, like Facebook, have made it clear that one attempted insurrection wasn’t enough. They’re setting the stage for an encore.”

YouTube’s policy went further than Facebook and Twitter, which said they would label but not take down false election claims.

Twitter stopped labeling false claims about the 2020 election early last year [2022], saying it had been more than a year since the election was certified and Biden took office.

A June 2 2023 Associated Press piece was structured very similarly, reporting:

[YouTube’s existing rules against election misinformation] could prove difficult to enforce, said John Wihbey, an associate professor at Northeastern University who studies social media and misinformation.

“It doesn’t take a genius if you’re on the disinformation ‘we were wronged in 2020’ side to say, ‘wait a minute, let’s just claim that voting just generally is not worth it. And 2020 is our example,’” [Wihbey] said. “I don’t know how you disentangle rhetoric that both refers to past wrongs and to forward possibilities. The content moderation team, which is going to try to do this, is going to tie themselves in knots trying to figure out exactly where that line is.”

The announcement comes after YouTube and other major social media companies, including Twitter and the Meta-owned Facebook and Instagram, have come under fire in recent years for not doing more to combat the firehose of election misinformation and disinformation that spreads on their platforms.

Broadly, coverage of Google and YouTube’s policy change referenced policies enacted on platforms like Twitter and Facebook in response to the 2020 election, and subsequent changes to those policies.

Google and YouTube were not the only parent company and platform in the news in early June 2023 for amending policies enacted to curtail the spread of dangerous misinformation. On June 5 2023, CNN reported that the Meta-owned platform Instagram had reversed a decision to “ban” Robert F. Kennedy, Jr. On February 11 2021, the Associated Press published “RFK Jr. kicked off Instagram for vaccine misinformation,” reporting:

Instagram on [February 10 2021] banned Robert F. Kennedy Jr., son of former presidential candidate Robert F. Kennedy, for repeatedly posting misinformation about vaccine safety and COVID-19.

Kennedy Jr. has amassed a huge following on social media, where he frequently posts debunked or unproven claims about vaccines. He also uses his social media pages to post about large pharmaceutical firms and environmental health concerns.

“We removed this account for repeatedly sharing debunked claims about the coronavirus or vaccines,” a spokesperson for Facebook, which owns Instagram, said Thursday [February 11 2021].

CNN’s June 5 2023 article provided some context about why Instagram had restricted Kennedy to begin with, along with Meta’s stated reasoning for restoring his account. In addition, CNN reported that Meta had removed “Facebook and Instagram accounts belonging to Children’s Health Defense, Kennedy’s anti-vaccine group” in August 2022, after the accounts “repeatedly violated” its health disinformation policies:

“As he is now an active candidate for president of the United States, we have restored access to Robert F. Kennedy, Jr.’s, Instagram account,” Andy Stone, a spokesperson for Instagram’s parent company Meta, said in a statement.

Kennedy, who has a long history of spreading vaccine misinformation, was banned from Instagram in February 2021 … While Kennedy’s Instagram account was banned, his Facebook account remained active. Both platforms are owned by Meta [Facebook].

Kennedy was a leading anti-vaccination voice during the Covid-19 pandemic, using his social media platforms to sow doubt and misinformation about the [vaccine].

He has promoted false claims about vaccine links to autism and in 2022 compared vaccine mandates to Nazi Germany.

On June 2 2023, Google’s video-sharing platform YouTube issued a statement confirming the platform would “stop removing content that advances false claims that widespread fraud, errors, or glitches occurred in the 2020 and other past US Presidential elections,” effective “Friday, June 2” 2023. The claim was true and was accurately described on Reddit and Twitter. On June 5 2023, CNN reported that Facebook/Meta had confirmed in a statement that it had reversed a “ban” on Robert F. Kennedy Jr.’s account; Kennedy had been banned from Instagram in February 2021 for spreading disinformation about vaccines during an ongoing pandemic.