QAnon supporter
via Wikimedia Commons

Facebook bans all QAnon accounts on its platforms

The ban follows previous crackdowns on the conspiracy theory, under which accounts were only removed if they contained violent content

Facebook has announced a ban on all QAnon accounts across its platforms. The ban will affect Facebook pages and groups, and Instagram accounts whose names or descriptions associate them with the conspiracy theory, in an attempt to crack down on the misinformation and harmful content spread by its supporters. 

The new ban follows previous attempts by the social media company to curb the conspiracy theory, which began on the 4chan and 8chan messageboards but has exploded on more mainstream platforms, including Facebook and Instagram, in recent months. (In short, the theory revolves around a cryptic figure named Q, who shares supposedly leaked information that places Donald Trump in opposition to a global child-trafficking ring run by Democratic politicians.)

On August 19, Facebook introduced restrictions on QAnon accounts, but at the time it only targeted those that discussed or promoted violence. While over 1,500 groups and pages discussing potential violence were removed in the first month of the policy, the company said in an October 6 update: “we believe these efforts need to be strengthened when addressing QAnon.”

“Starting today, we will remove Facebook Pages, Groups and Instagram accounts for representing QAnon, even if they contain no violent content,” Facebook adds. “We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks.”

In the same statement, the company expands on the reasons for its strengthening of the ban, explaining: “We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update.”

“For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm.” Examples include a theory alleging that the wildfires on North America’s west coast were started intentionally by certain groups, which distracted from efforts to fight the fires and provide aid to those affected.

“We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary,” Facebook’s update concludes.

Recently, QAnon accounts have contributed to the spread of misinformation about coronavirus, suggesting it’s a hoax or perpetuating the idea that China created the virus as a form of population control. The online conspiracy has been bleeding into the mainstream and appearing IRL for some time, though: it has been represented on placards at Trump rallies, and has seen school shooting survivors accused of being crisis actors.