The media watchdog site Right Wing Watch (RWW) was reinstated on YouTube after the platform removed its channel on June 28, 2021, for unspecified reasons.
“We are glad that by reinstating our account, YouTube recognizes our position that there is a world of difference between reporting on offensive activities and committing them,” site director Adele Stan said in a statement. “Without the ability to accurately portray dangerous behavior, meaningful journalism and public education about that behavior would cease to exist. We hope this is the end of a years-long struggle with YouTube to understand the nature of our work.”
The platform told CNN, “Right Wing Watch’s YouTube channel was mistakenly suspended, but upon further review, has now been reinstated.”
Hours earlier, the site, which reposts and reports on excerpts from various right-wing extremist channels, had alerted followers on Twitter that its channel was pulled down.
“Our efforts to expose the bigoted view and dangerous conspiracy theories spread by right-wing activists has now resulted in [YouTube] banning our channel and removing thousands of our videos,” the site said at the time. “We attempted to appeal this decision, and YouTube rejected it.”
The post was accompanied by a screenshot of a message YouTube sent the site saying that RWW had violated its “community guidelines,” without actually explaining how it did so.
Kyle Mantyla, a senior fellow with the site, told The Daily Beast that YouTube had hit RWW with two “strikes” against its channel in April 2021, forcing RWW to limit its posting on YouTube, before flagging it over “some video from eight years ago” and removing the channel. Meanwhile, the disinformation sources RWW was covering were seemingly allowed to keep posting without consequences.
“The number of times our video has gotten flagged and removed and the video from which we took it is still up on YouTube, you’re just like, ‘Well, something is wrong with your system here,'” Mantyla said.
While it is unclear which video originally led YouTube to pull the site’s channel, screenshots circulated online of right-wing extremists purportedly claiming credit for fraudulently reporting RWW en masse.
For example, RWW highlighted one video on Twitter of a right-wing commentator spouting a conspiracy theory accusing tech financier Bill Gates of investing in farmland as part of a campaign to deprive people of meat and “make you weak.”
The original video, which begins by regurgitating conspiracy theories about both COVID-19 and the 2020 presidential election, is still visible on YouTube. The platform did not respond to our message asking whether the video violated its policy on misinformation, which states:
We have careful systems in place to help us determine what is harmful misinformation across the wide variety of videos on YouTube. As part of this, we ask external human evaluators and experts to assess whether content is promoting unsubstantiated conspiracy theories, or inaccurate information. These evaluators are trained using public guidelines and provide critical input on the quality of a video. Based on the consensus input from the evaluators, we use well-tested machine learning systems to build models that generate recommendations. These models help review hundreds of thousands of hours of videos every day in order to find and limit the spread of harmful misinformation.
YouTube has also refused to comment on why RWW’s channel was removed in the first place, or how its “external human evaluators and experts” are trained.
In January 2021, the platform announced that it had removed more than 500,000 videos pushing disinformation about the COVID-19 pandemic since February 2020. But questions remained regarding its role in the spread of video disinformation; a study published that same month by researchers from New York University found that of the first 150 videos returned in a YouTube search on bladder cancer (out of around 242,000 results), 67 percent were of “poor to moderate” quality and 21 percent contained “moderate to high” amounts of disinformation.
In April 2021, disinformation researcher Kate Starbird called YouTube “the dominant domain” in research covering disinformation on several topics.
“What YouTube does is it creates these content resources that get mobilized on other platforms. And so it’s not just a problem within YouTube,” she told NPR. “It’s actually a problem for the whole information ecosystem – the fact that YouTube hosts and allows those videos to be resources that are repeatedly mobilized in these other platforms at opportunistic times to spread mis- and disinformation.”
- YouTube Permanently Bans Right Wing Watch, a Media Watchdog Devoted to Exposing Right-Wing Conspiracies
- How Does YouTube Combat Misinformation?
- YouTube Has Removed More Than 500,000 COVID-19 Misinformation Videos Since February
- Many Popular YouTube Videos Spread Misinformation on Bladder & Prostate Cancer, Studies Find
- Exploring YouTube And The Spread Of Disinformation