Facebook says it has removed several groups, accounts and pages associated with QAnon, taking action for the first time against the far-right U.S. conspiracy theory circulated among supporters of President Donald Trump.
The social media giant made the announcement Tuesday as part of its monthly briefing on “coordinated inauthentic behavior” on its platforms. That’s Facebook’s term for fake accounts run with the intent of disrupting politics, elections and society.
As well as the QAnon accounts, Facebook also removed accounts associated with VDARE, a U.S. website known for posting anti-immigration content, in addition to accounts in Russia, Iran, Mauritania, Myanmar and the nation of Georgia.
QAnon is a right-wing conspiracy theory centered on the baseless belief that Trump is waging a secret campaign against enemies in the “deep state” and a child sex trafficking ring run by satanic pedophiles and cannibals. For more than two years, followers have pored over a tangled set of clues purportedly posted online by a high-ranking government official known only as “Q.”
The conspiracy theory first emerged in a dark corner of the internet but has been creeping into the mainstream political arena. Trump has retweeted QAnon-promoting accounts, and its followers flock to the president’s rallies wearing clothes and hats with QAnon symbols and slogans.
Facebook says it found the QAnon activity as part of its investigations into suspected coordinated inauthentic behavior prior to the 2020 presidential election.
“We are making progress rooting out this abuse, but as we’ve said before, it’s an ongoing effort,” the company said in its April report on coordinated activity. “That means building better technology, hiring more people and working more closely with law enforcement, security experts and others.”
Social media research firm Graphika, which receives funding from Facebook, said in a concurrent report Tuesday that the QAnon network promoted conspiracy theories and tried to sell merchandise, such as T-shirts, using Facebook. The network, Graphika said, appeared to be run by a small group of users who had both real and fake accounts.
The network focused mainly on the “Q” conspiracy theory but dabbled in others, including conspiracy theories around the 5G wireless network, the U.S. presidential elections, Bill Gates and the coronavirus, Graphika said.
The research firm found related activity on Twitter as well, but noted that on its own, such activity may not have violated Twitter’s rules. Twitter allows users to post under fake names. A representative for Twitter did not immediately respond to a message seeking comment.