The new policy is Facebook’s most drastic step yet to beat back the conspiracy theory, which the FBI warned last year could be a domestic terror threat. The company has previously taken smaller steps to limit the conspiracy theory’s spread, but up until now had stopped well short of banning it entirely. In August, the company said it would ban QAnon pages and groups when they discuss violence, and last week said it would ban QAnon ads.
Those earlier crackdowns had resulted in the removal of thousands of pages and groups, but didn’t address all the ways that the conspiracy theory causes “real world harm,” according to Facebook. “While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public,” the company says. “Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another.”
The company notes that it will take some time to ramp up its enforcement of the new policy. Another challenge is that supporters of QAnon have been adept at evolving their message in order to lure more followers. For example, QAnon believers have latched onto anti-vaccine and COVID-19 conspiracy theories. More recently, the group has seized on the issue of child trafficking as a recruitment tactic.
With the ban, QAnon followers will have even fewer large platforms on which to expand their reach and grow a following. Reddit banned the conspiracy theory in 2018, and Twitter banned thousands of QAnon accounts earlier this summer. TikTok has also taken steps to limit its spread by banning hashtags associated with the movement.