The Associated Press (AP) also discovered that while Facebook had put measures in place before the 2020 election to keep dangerous and violent content from spreading on its platform, as many as 22 of those measures were discarded after the election and before the January 6th attack on the Capitol. Facebook says it removed those measures after analyzing signals from its own platform and from law enforcement.
The documents revealed that while the Capitol was under attack on January 6th, Facebook employees were frustrated. According to the AP, one wrote on an internal bulletin board, “Haven’t we had enough time to figure out how to manage discourse without enabling violence? We’ve been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control.”
The documents also show that Facebook was concerned it was losing young users and planned to take steps to reverse this decline. Such steps would include asking young users to update their connections while Facebook would "tweak" its algorithms to show users posts from outside their networks. The company discovered that teenage usage of the platform had dropped 13% since 2019, with a further 45% decline expected over the next two years.
The AP also found that Facebook has trouble moderating content written in languages other than English. The Facebook Papers indicate that Arabic is a particularly difficult language for Facebook to moderate: users who communicate in Arabic employ symbols or add extra spaces when writing about militant groups, a tactic that has raised red flags inside the company.
The Verge reports that during a presentation made last March inside Facebook, a team of data scientists stated, “Most young adults perceive Facebook as a place for people in their 40s and 50s. Young adults perceive content as boring, misleading, and negative. They often have to get past irrelevant content to get to what matters.” Facebook responded by saying that it is no different from other social media firms that want teens to use their platform.
Some users have discovered ways to evade Facebook's moderation systems and hate-speech filters. Facebook wrote in an internal document, "We were incorrectly enforcing counterterrorism content in Arabic," which "limits users from participating in political speech, impeding their right to freedom of expression." The AP says Facebook told it that it is looking to add more local dialect and topic experts while also exploring ways to improve its systems.
The internal documents also revealed that Facebook was "under-enforcing on confirmed abusive activity" after Filipina maids complained about being abused on the platform. Apple threatened to pull Facebook and Instagram from the App Store because the platform was being used to buy, sell, and trade maids in the Middle East.
The AP states that even today, two years later, Facebook users searching for "maids" in Arabic ("khadima") will see photographs of Africans and South Asians with their ages and prices listed. And yet, after Apple gave in, Facebook and Instagram remain available for iOS users to install from the App Store. Facebook issued a statement today that says, "At the heart of these stories is a premise which is false. Yes, we're a business and we make profit, but the idea that we do so at the expense of people's safety or wellbeing misunderstands where our own commercial interests lie. The truth is we've invested $13 billion and have over 40,000 people to do one job: keep people safe on Facebook."