TikTok disabled QAnon hashtags and related conspiracy theory terms from its search feature today, following sweeping changes at other social media outlets targeting the conspiracies head on. As the election heats up, so does QAnon content on TikTok. The company won’t let users search for the content directly, but its algorithm is designed so that they don’t need to: the app surfaces it for them automatically. The short-form video platform made headlines recently for its role in the 2020 election season, particularly after hundreds of thousands of Generation Z TikTok users allegedly mass-RSVP’d to President Trump’s campaign rally in Tulsa, Oklahoma last month.
The scheme, which was widely praised by Millennial and Generation Z leftists on social media, highlighted the app’s potential as a way for people to mass organize and spread political information—and disinformation. In that instance, TikTok users effectively destroyed weeks of data collection by the Trump campaign by turning the TikTok algorithm to their advantage, spreading their plan to millions of TikTok users in a matter of hours. But what happens when the algorithm doesn’t work in their favor? Or when that strategy is used against them?
TikTok, like every other social media site, uses an algorithm to promote new content to its users.
TikTok’s fame grew exponentially once people realized how easy the company makes it to create viral content on the platform—more so than virtually any other social networking app. This makes it a hotbed for political news and data manipulation, which is partly why the United States is considering banning the app altogether.
Where apps like Instagram and Twitter depend on individual users to like and share content, TikTok depends less on interactivity and boosts content that users actually watch. For example, if a 15-second video is viewed from beginning to end 80% of the time a user comes across it, TikTok boosts that video by sharing it to more users via the For You page, a stream of videos curated by TikTok’s algorithm for each individual user. If a one-minute video is viewed from beginning to end 20% more often than other one-minute videos but has a similar number of likes and shares, TikTok’s algorithm will still share the video to a few more For You pages as a reward for being watched, even though it isn’t outperforming on likes and shares.
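To make that mechanic concrete, here is a minimal sketch of how completion-rate boosting could work, written in Python; the scoring formula, weights and function names are illustrative assumptions, not TikTok’s actual system.

```python
from dataclasses import dataclass

@dataclass
class Video:
    video_id: str
    impressions: int    # times the video was served to a For You page
    full_watches: int   # impressions watched from beginning to end
    likes: int
    shares: int

def completion_rate(v: Video) -> float:
    """Fraction of impressions watched all the way through."""
    return v.full_watches / v.impressions if v.impressions else 0.0

def boost_score(v: Video, peer_avg_completion: float) -> float:
    """Hypothetical ranking score: out-watching peer videos earns extra
    For You page distribution even when likes and shares are average."""
    engagement = v.likes + 2 * v.shares                   # assumed weighting
    watch_bonus = max(0.0, completion_rate(v) - peer_avg_completion)
    return engagement * (1.0 + 5.0 * watch_bonus)         # watch time dominates

# A 15-second clip finished 80% of the time, against a 60% peer average,
# outranks peers that have identical likes and shares.
clip = Video("demo", impressions=1_000, full_watches=800, likes=50, shares=10)
print(boost_score(clip, peer_avg_completion=0.60))        # 70 * 2.0 = 140.0
```

The key assumption mirrors the description above: watch-through rate, not likes or shares, is what earns the extra distribution.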
Each user’s For You page is tailored specifically to their viewing preferences. The more a user watches, interacts with and shares content, the more closely their For You page tracks their interests. This is partly what makes the app so fun. A running gag on the app is to joke about which “side” of TikTok you’ve ended up on, referring to the niche the algorithm has sorted you into based on the content you watch and interact with the most.
If you’re a politically quiet Millennial interested in cooking tutorials and pet videos, the odds of TikTok showing you content that instructs you to mess with President Trump’s re-election campaign are virtually nil—because what do cute dogs and pasta recipes have to do with overthrowing the government? But if you’re a politically vocal Millennial interested in news about Ghislaine Maxwell’s arrest and trial, TikTok might start sending you videos about those topics. If you interact with them—and you likely will, because these stories are ever-changing as they unfold in real time—that might land you on conspiracy theory TikTok, where QAnon thrives even after being banned from Reddit, YouTube and now Twitter. Even the FBI has issued warnings about conspiracy theories like QAnon.
QAnon is a widely popular, politically charged conspiracy theory finding a new home on TikTok after being banned on many other social media sites.
The theory alleges that members of Hollywood and the liberal elite, collectively called the Deep State, are part of a ring of Satan-worshipping pedophiles that drink the blood of children after kidnapping and trafficking them around the world. Many QAnon supporters believe that President Donald Trump is spearheading a war against the Deep State, a suspicion fueled by several vague comments from the President.
In 2016, the #pizzagate conspiracy theory alleged that Hillary Clinton’s deleted emails exposed a child trafficking ring run, in part, by Clinton herself out of a Washington, D.C. pizza restaurant. Today #pizzagate is finding new life as TikTok users make hundreds of viral videos on the subject and use the platform to spread their message.
Trump uses anti-trafficking rhetoric and policy changes as ammunition in his demand for tighter immigration policies, and to wage war on mainstream media outlets or high-profile celebrities who dare to speak up against him. News of Ghislaine Maxwell’s arrest and the explosive virality of the Wayfair child trafficking conspiracy theory have only re-ignited the QAnon flame and spread it into the mainstream corners of the web—like your Facebook news feed, where right-wing suburban housewives are advocating for thousands of missing children who may not actually be missing.
Amid ongoing financial, public health and human rights crises in the United States, devout trust in Donald Trump’s unfounded goodness is exactly what his campaign needed to wage war on Democrats, even as anonymous militant forces occupy U.S. cities on his orders.
As other social media sites have opted to ban QAnon and #pizzagate content, TikTok’s QAnon community has only thrived more.
Conspiracy theorists have an advantage on a short-form video platform like TikTok because they can make their point with just a few references as evidence, without much context. That makes it far easier for right-wing conspiracy theorists to spread distrust and political disinformation to people who would otherwise discount their theories if presented in full. One video plants the seed, and TikTok keeps sending the user QAnon content because the For You page algorithm recorded it as a topic of interest.
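That feedback loop can be illustrated with a simplified sketch; the topic labels, update rule and feed size here are assumptions made for the example, not a description of TikTok’s internal system.

```python
from collections import defaultdict

# Hypothetical interest profile: each watch raises a topic's weight,
# and higher-weight topics fill more of the next For You page batch.
interest: defaultdict[str, float] = defaultdict(float)

def record_watch(topic: str, watched_fraction: float) -> None:
    """Reinforce a topic in proportion to how much of the video was watched."""
    interest[topic] += watched_fraction

def next_feed(candidates: dict[str, list[str]], size: int = 5) -> list[str]:
    """Fill the next batch from the highest-weighted topics first."""
    ranked = sorted(candidates, key=lambda t: interest[t], reverse=True)
    feed: list[str] = []
    for topic in ranked:
        feed.extend(candidates[topic][: size - len(feed)])
    return feed

# One fully watched conspiracy video outweighs a half-watched pet clip,
# so the next batch leads with more of the same.
record_watch("pets", 0.5)
record_watch("qanon", 1.0)    # the seed video, watched to the end
print(next_feed({"pets": ["dog_1", "cat_2"], "qanon": ["q_1", "q_2", "q_3"]}))
# ['q_1', 'q_2', 'q_3', 'dog_1', 'cat_2']
```

The point of the sketch is the asymmetry: a single completed watch shifts the ranking, and nothing in the loop distinguishes a recipe video from a conspiracy theory.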
Short, one-minute videos spreading false information and propaganda have proliferated like wildfire, but TikTok’s community guidelines did not otherwise ban users from creating and sharing this content.
The company removed the ability to search for content using the #QAnon and #Pizzagate hashtags, but the app still indexes those videos and compiles them together. And while its guidelines explicitly say that the platform does not allow “misinformation that could cause harm to our community or the larger public,” it still lets videos carrying both the #QAnon and #Trump2020 hashtags be promoted by its algorithm.
One popular video with over 2.3 million likes purports to teach users how to check whether their phone has been wiretapped, suggesting that sex traffickers are to blame. A quick search online debunks the video, which actually shows a call-forwarding feature built into most smartphones. But with no option to report misinformation that isn’t directly related to COVID-19, TikTok videos like these are left up for weeks, fueling distrust and opening the floodgates for other conspiracy theories.
TikTok updated its community guidelines earlier this year to add conspiracy theories like Holocaust denial and other content that denies “widely documented historical events” to its list of banned topics. Later, it updated its guidelines once again to cover misinformation associated with COVID-19, but did not ban politically charged conspiracy theory videos like #QAnon or #Pizzagate.
While TikTok disabled QAnon hashtags, the app’s QAnon community is thriving like never before thanks to the very algorithm that makes it so easy to go viral in the first place. Users don’t have to search for the content for the company’s algorithm to promote it—something other video platforms like YouTube put a stop to last fall, when that company announced it would remove conspiracy theory videos from its suggested-video algorithm entirely. Later, it banned them outright.
TikTok’s ban on searching for #QAnon content likely spares it some negative attention by buying the company time. Grit Daily first reached out to TikTok earlier this week to ask whether the company was considering a ban on QAnon content. The company responded by sending its Community Guidelines, but did not clarify why it still allowed QAnon content on its platform despite the guidelines’ clear stance on misleading information. Later, after Twitter announced it had removed thousands of accounts in an ongoing effort to ban the topic altogether, TikTok responded once again with the news that the hashtag is no longer searchable on the platform.
Users who search for either the #QAnon or #Pizzagate hashtag are now met with a blank page and a warning that reads, “This phrase may be associated with hateful behavior. TikTok is committed to keeping our community safe and working to prevent the spread of hate. For more information, we invite you to review our Community Guidelines.” It is unclear whether the company will change its community guidelines now that TikTok has disabled QAnon hashtags.
But tapping the #QAnon hashtag on any video boosted by the For You page algorithm takes users to a page compiling QAnon videos with over 84 million views. So while TikTok disabled QAnon hashtags in its search, its algorithm has no problem boosting the content to millions of users based on their interests, leaving those users exposed to political propaganda they never searched for.