‘Oops,’ Said Faceberg (Part Infinity)

H/T Scissorhead BDR

Tiger Beat on the Potomac (thanks Charlie!) morning email thingy leads with:

— A whole bunch of news about Facebook: It’s odd for several news outlets to release major stories on a Friday evening absent an urgent reason. But that’s what happened with a glut of stories drawn from previously unreported internal documents at the social media behemoth. The highlights:

  • Dissecting the company’s role in spreading election misinformation, NYT’s Ryan Mac and Sheera Frenkel go deep on a trove of documents that reveal the network’s staffers “repeatedly raised red flags about the spread of misinformation and conspiracies before and after the contested November vote,” but that “the company failed or struggled to address the issues.”
  • Political movements that Facebook deems dangerous are increasingly being suppressed on the platform, WSJ’s Jeff Horwitz and Justin Scheck report. But that approach doesn’t sit well with some employees, who see the social media giant “in essence playing whack-a-mole.”
  • The company has long known that its algorithms push some users to extremist views, reports NBC’s Brandy Zadrozny. In a particularly vivid illustration, a Facebook researcher created an account for a fake person who described herself as a conservative, Christian mother from North Carolina interested in Fox News and DONALD TRUMP. Without expressing any interest in conspiracy theories, within two days of her account’s creation, Facebook’s algorithm recommended she join groups dedicated to QAnon. Within one week, her “feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.”

And over at their competition, Axios’ morning email:

A gusher of Facebook stories hit the web last night and will cascade into next week, as a consortium of at least 17 news organizations unfurl a series branded “The Facebook Papers,” based on documents from whistleblower Frances Haugen.

  • The consortium’s plan was for the stories to begin Monday. But one outlet after another jumped the gun last evening. The latest:

⚡ As supporters of Donald Trump stormed the U.S. Capitol on Jan. 6, Facebook engineers in California were racing to tweak internal controls to slow the spread of misinformation, AP reports.

  • Emergency actions included banning Trump, freezing comments in groups with a record for hate speech and filtering out the “Stop the Steal” rallying cry.

Inside Facebook, some employees “took to an internal discussion board to express shock and outrage,” Bloomberg reports:

  • “Many of the posts were imbued with a dawning sense that they and their employer — whose platforms for weeks had spread content questioning the legitimacy of the election — bore part of the blame.”

🥊 Facebook responds: “We are a significant social media platform so it’s only natural for content about major events like that to show up on Facebook. But responsibility for the insurrection itself falls squarely on the insurrectionists … and those who incited them.” Keep reading.

What’s next: Facebook reports earnings Monday, and is reportedly planning a rebrand with a new corporate name.

And NBC gives us a case study:

In summer 2019, a new Facebook user named Carol Smith signed up for the platform, describing herself as a politically conservative mother from Wilmington, North Carolina. Smith’s account indicated an interest in politics, parenting and Christianity and followed a few of her favorite brands, including Fox News and then-President Donald Trump.

Though Smith had never expressed interest in conspiracy theories, in just two days Facebook was recommending she join groups dedicated to QAnon, a sprawling and baseless conspiracy theory and movement that claimed Trump was secretly saving the world from a cabal of pedophiles and Satanists.

Smith didn’t follow the recommended QAnon groups, but whatever algorithm Facebook was using to determine how she should engage with the platform pushed ahead just the same. Within one week, Smith’s feed was full of groups and pages that had violated Facebook’s own rules, including those against hate speech and disinformation.

Smith wasn’t a real person. A researcher employed by Facebook invented the account, along with those of other fictitious “test users” in 2019 and 2020, as part of an experiment in studying the platform’s role in misinforming and polarizing users through its recommendations systems.


There is nothing good at Facebook or about Facebook.

If you want to keep in touch with the grandkids or whatevs: call them. Video chat with them. You can do everything you did before Facebook, and you can do it better, without it leading to insurrection.

Delete your accounts. You’ll be glad you did.
