Actress Rachel Zegler and other users criticized the video app TikTok in May 2020, after some reported that the hashtags #BlackLivesMatter and #GeorgeFloyd had been wiped from search results on the platform.
As the entertainment site AceShowbiz.com first reported, Zegler — who is set to star in a new film adaptation of the stage musical West Side Story — deleted the app after another user posted screenshots on Twitter showing that the two tags produced no results. Both hashtags had surged in visibility on various platforms since Floyd’s extrajudicial killing at the hands of Minneapolis police on May 25 2020:
Zegler later wrote:
The racism we’ve been calling out about your app, esp. recently, is extremely apparent now. If this obvious racism is not addressed properly out of genuine care for the situation (and not fear of losing business) I will be deleting my profile, deleting the app altogether, + encouraging my followers to do the same. You have the opportunity to spread awareness right now, as an extremely popular social media app. But instead you perpetuate oppression by not promoting amazing black creators and now this? Pretty clear where your loyalty lies. Racism is a serious problem. It isn’t new, and we’re all in this together, as we are the ones responsible. Do better, @tiktok_us. Or we’re gone.
The platform responded on Twitter, saying: “We are aware of an issue that is impacting the hashtag view counts displayed at the upload stage. This appears to affect words at random, including terms like #cat and #hello. Our team is investigating and working quickly to address the issue.”
A TikTok spokesperson sent us a separate statement:
We have identified and resolved an issue that had widely affected the view count displayed on hashtags in the upload stage. This bug had temporarily affected view count displays on hashtags in the Compose screen only; it did not affect tags, videos, or discovery of uploaded content. We apologize for the confusion this caused for our community.
When we searched for the tags on the platform, both #blacklivesmatter and #GeorgeFloyd were visible:
“Seems like everyone has had a different experience but it’s been collectively negative,” Zegler wrote after hearing from more of her Twitter followers. “My original statement, however, still stands.”
The criticism over the alleged glitch was the latest round of accusations against the China-based platform. In September 2019, The Guardian reported that content criticizing that country’s government was penalized:
The bulk of the guidelines covering China are contained in a section governing “hate speech and religion”.
In every case, they are placed in a context designed to make the rules seem general purpose, rather than specific exceptions. A ban on criticism of China’s socialist system, for instance, comes under a general ban of “criticism/attack towards policies, social rules of any country, such as constitutional monarchy, monarchy, parliamentary system, separation of powers, socialism system, etc”.
Another ban covers “demonisation or distortion of local or other countries’ history such as May 1998 riots of Indonesia, Cambodian genocide, Tiananmen Square incidents”.
The company responded by saying that the guidelines covered in the leak were “outdated” and had been retired. That was also TikTok’s response that December, when a separate leak revealed that moderators were tasked with suppressing the spread of content posted by users who were disabled — as well as those living with autism or Down syndrome, or with “facial problems” such as birthmarks or disfigurements. The company claimed at the time that such users’ content was limited to the country where it was posted because they were “susceptible to bullying or harassment based on their physical or mental condition.”
On May 19 2020, several TikTok users held a one-day protest called the #ImBlackMovement to challenge suppression of Black creators’ content on the platform.
“I did this because black creators are being silenced on TikTok and other social media platforms and I am fed up. Our videos are taken down and our accounts are banned when we speak against racism,” said Black Lives Matter Utah’s Lex Scott, who initiated the digital protest. “I want TikTok to change their policies when it comes to black and brown creators. We should not be punished for speaking against racism. The accounts of actual racists should be taken down.”