An internal investigation by Facebook has uncovered thousands of groups and pages, with millions of members and followers, that support the QAnon conspiracy theory, according to internal company documents reviewed by NBC News…
The top 10 groups identified in the investigation collectively contain more than 1 million members, with totals from more top groups and pages pushing the number of members and followers past 3 million. It is not clear how much overlap there is among the groups.
And what do they plan to do about it?
The investigation will likely inform what, if any, action Facebook decides to take against its QAnon community, according to the documents and two current Facebook employees who spoke on the condition of anonymity because they were not authorized to speak publicly on the matter.
A Facebook spokesperson said the company has routinely enforced its rules on QAnon groups.
“Enforcing against QAnon on Facebook is not new: we consistently take action against accounts, Groups, and Pages tied to QAnon that break our rules. Just last week, we removed a large Group with QAnon affiliations for violating our content policies, and removed a network of accounts for violating our policies against coordinated inauthentic behavior,” the spokesperson, who asked not to be named for fear of harassment from the QAnon community, wrote in an emailed statement. “We have teams assessing our policies against QAnon and are currently exploring additional actions we can take.”
Reuters has more on that large group that Facebook removed last week:
SAN FRANCISCO (Reuters) – Facebook said on Thursday it had removed one of the largest public groups devoted to the QAnon web of conspiracy theories after repeated violations of the company’s policies.
The group, called Official Q/Qanon and with nearly 200,000 members, had been deleted after multiple individual postings were removed for crossing the lines on bullying and harassment, hate speech, and false information that could lead to harm, a Facebook spokeswoman said.
The spokeswoman, who declined to be identified, said Facebook took the action on Tuesday and was monitoring other QAnon groups as it strengthens enforcement in the area.
I love that the spokesperson (and I wonder why Reuters gendered this person? what did that add to the story?) declined to be identified. Maybe this person is worried about retribution? So thanks a lot, Reuters. But I digress. Back to NBC:
“Facebook is definitely the largest piece of the QAnon infrastructure,” said Joan Donovan, research director of Harvard’s Shorenstein Center on Media, Politics and Public Policy. “While people who have bought into these disinformation campaigns are already affected, preventing it from spreading to new groups and new audiences is one intervention, among many, that are needed. Unless there is some kind of coordination between platform companies to get rid of the main QAnon influencers, it will continuously pop back up.”
Good on Facebook for taking action. But why is it such a hotbed of conspiracy?
Facebook has been key to QAnon’s growth, in large part due to the platform’s Groups feature, which has also seen a significant uptick in use since the social network began emphasizing it in 2017.
There are tens of millions of active groups, a Facebook spokesperson told NBC News in 2019, a number that has probably grown since the company began serving up group posts in users’ main feeds. While most groups are dedicated to innocuous content, extremists, from QAnon conspiracy theorists to anti-vaccination activists, have also used the groups feature to grow their audiences and spread misinformation. Facebook aided that growth with its recommendations feature, powered by a secret algorithm that suggests groups to users seemingly based on interests and existing group membership.
So Facebook Groups is uniquely designed to help conspiracy theorists find each other. I would think that would also make it uniquely easy for Facebook to find them, and then get rid of them.
The company is considering an option similar to its handling of anti-vaccination content, which is to reject advertising and exclude QAnon groups and pages from search results and recommendations, an action that would reduce the community’s visibility.
Considering. That’s a fail right there.