TikTok is banning conversion therapy and white nationalist content

The video sharing platform is further cracking down on harmful ideologies, including ‘coded language and symbols’ that normalise hate

TikTok has announced that it will issue a further crackdown on hate speech, just days after toughening its guidelines in relation to the QAnon conspiracy theory.

Although explicitly hateful ideologies, including neo-Nazism and white supremacy, are already banned on the platform, TikTok has revealed that it will update its moderation policy in order to block “neighbouring ideologies”, including white nationalism, white genocide theory, and male supremacy.

In a statement, the app said it will stop the spread of “coded language and symbols that can normalise hateful speech and behaviour”. As part of this, TikTok says it will take further action to remove misinformation and hurtful stereotypes about the Jewish and Muslim communities. It will also remove content that harms the LGBTQ+ community, including conversion therapy videos, and clips that suggest no one is born LGBTQ+.

LGBTQ+ non-profit group, The Trevor Project, said in a statement that it was “thankful to see TikTok ban content that promotes the dangerous and discredited practice of conversion therapy”. It added: “We know from our research that conversion therapy is strongly associated with greater rates of attempting suicide, so it is imperative to protect LGBTQ+ youth from this misleading, harmful, and outright fraudulent content on social media.”

TikTok is also working to “incorporate the evolution of expression into our policies” and is “training our enforcement teams to better understand more nuanced content like cultural appropriation and slurs”. The platform says it wants to ensure users from disenfranchised groups, including the LGBTQ+, Black, Jewish, Roma, and minority ethnic communities, are not censored when using reclaimed language as a term of empowerment.

“Credit where credit is due,” tweeted social media activist organisation Sleeping Giants. “This is a good, necessary move by @tiktok_us. The real test, as always, will be enforcement. Compare this to @facebook and @YouTube, which only took steps like this when pushed by advertisers or when it was politically expedient.”

On Sunday (October 18), TikTok announced that it was increasing its crackdown on QAnon, deleting users who share QAnon-related content, as opposed to simply targeting hashtags used by the conspiracy theory’s supporters.

Speaking to NPR, Angelo Carusone, the president of the non-profit watchdog, Media Matters for America, said: “There should be recognition of a thing that is good and significant, even if it’s long overdue. TikTok is recognising that by the nature of the QAnon movement, you can’t just get rid of their communities, the content itself is the problem.”

Despite upping its moderation efforts in recent months, TikTok has had its fair share of controversies. The app has admitted to censoring LGBTQ+ content (more than once) and to blocking posts by users deemed “ugly”, poor, or disabled, and it has been accused of unfairly censoring content by Black creators. Its algorithm has also previously promoted antisemitic memes and targeted weight loss ads at its young users.