Members of the conspiracy theorist group QAnon demonstrate in Los Angeles. Photograph: Kyle Grillot/AFP/Getty Images

Facebook to ban QAnon-themed groups, pages and accounts in crackdown

Policy update comes after the company’s initial attempt failed to stem misinformation and harm from the conspiracy theory

Facebook will ban any groups, pages or Instagram accounts that “represent” QAnon, the company announced Tuesday, in a sharp escalation of its attempt to crack down on the antisemitic conspiracy theory that has thrived on its platform.

The policy will apply to groups, pages or Instagram accounts whose names or descriptions suggest that they are dedicated to the QAnon movement, a Facebook spokesperson explained. It will not apply to individual content, nor to individual Instagram users who post frequently about QAnon but do not explicitly identify themselves as representing the QAnon movement.

The new, broader ban represents the second update to Facebook’s policy against QAnon in less than two months, and it signals that the company’s initial efforts were insufficient to curb the spread of a movement that has been identified as a potential domestic terror threat by the FBI.

Just two months ago, Facebook had no policy on QAnon, a baseless internet conspiracy theory whose followers believe, without evidence, that Donald Trump is waging a secret battle against an elite global cabal of child-traffickers.

The conspiracy theory began on the niche image boards 4chan and 8chan but exploded in popularity on Facebook and Instagram in recent months. Facebook’s recommendation algorithms encouraged cross-pollination between QAnon communities and groups dedicated to anti-vaccine or anti-mask activism, Donald Trump, New Age spirituality and wellness, among other topics.

The growing movement, which has its roots in antisemitic conspiracy theories such as the Protocols of the Elders of Zion, is increasingly influential among Republican voters and politicians. Trump has praised QAnon followers and refused to debunk its false claims. The movement has co-opted the hashtags #SaveTheChildren and #SaveOurChildren in a highly successful rebranding effort that helped introduce QAnon ideas to new audiences.

Facebook’s first attempt at a crackdown came on 19 August, when the company announced a host of restrictions on QAnon-promoting accounts that stopped short of an outright ban. Under those rules, in addition to removing QAnon groups from recommendation algorithms, Facebook said it would ban QAnon-themed groups, pages or Instagram accounts if they discussed potential violence. This led to the removal of more than 1,500 Facebook groups and pages, the company said on Tuesday.

“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” the company said in a blogpost. “For example, while we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public.”

The new rules will be enforced by Facebook’s Dangerous Organizations Operations team, the group that enforces Facebook’s bans on terrorist and hate groups. The team will “proactively detect content for removal instead of relying on user reports”, study the movement, and adjust to changes in terminology or tactics, the company said.

“QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another,” the company said. “We expect renewed attempts to evade our detection, both in behavior and content shared on our platform, so we will continue to study the impact of our efforts and be ready to update our policy and enforcement as necessary.”

The new policy against QAnon is not as strict as Facebook’s rules against the terror and hate groups it designates as “dangerous organizations”. The company is not banning individual users from posting in support of QAnon, for example, and the requirement for Instagram accounts to “represent” themselves as QAnon in order to be banned also leaves a large loophole for influencers who promote QAnon under their own identities. Many popular wellness influencers on Instagram have become major promoters of QAnon conspiracy theories.

A Facebook spokesperson acknowledged the loophole and said the company was still addressing how to deal with such accounts.

The QAnon movement has for months been preparing for a crackdown from social media platforms. Many Facebook groups began using codes – for instance, using “17” as a substitute for “Q” – as early as April. On 17 September, the anonymous internet poster who goes by “Q” and whose messages QAnon followers believe include clues to the vast supposed conspiracy, warned followers: “Deploy camouflage. Drop all references re: ‘Q’ ‘Qanon’ etc. to avoid ban/termination.”
