What Are The Facebook Papers?
On Monday (October 25), a consortium of American and European news organizations began publishing stories that analyzed and exposed thousands of pages of Facebook's internal company documents. The documents were obtained by former Facebook employee and whistleblower Frances Haugen.
Some of the stories and analyses draw on the redacted versions of disclosures presented to the US Congress in the ongoing proceedings, which the consortium had accessed. Journalists from an array of big and small newsrooms worked together to obtain and then analyze the documents used for the exposé.
The series of articles published since Monday has exposed how Facebook either neglected or willingly turned a blind eye to fake, violent, extremist, and anti-democratic content on its site. The articles have also analyzed how the social media platform became a vehicle for the rise of right-wing extremism and anti-democracy movements, culminating in events like the January 6 Capitol riot.
What Do The Facebook Papers Reveal?
The documents reveal the social media giant’s inadequate and inconsistent responses to extremism, misinformation, and violent content on its site, both in the US and across the world. While in some cases it seems to have been caught off guard, as in the Capitol riots, in most other cases Facebook clearly engaged in calculated inaction.
The documents also reveal the views of many of Facebook’s own employees. Many current and former employees have expressed concern, and even anger, over the company’s “piecemeal” approach to dealing with dangerous content.
“We have been fueling this fire for a long time and we shouldn’t be surprised it’s now out of control,” wrote one employee on an internal message board. Another employee, who left the company in 2020, left a long note alleging that “promising new tools backed by strong research were constrained by Facebook for fears of public and policy stakeholder responses.” The unnamed employee added, “Similarly (though even more concerning), I’ve seen already built and functioning safeguards being rolled back for the same reasons.”
Confirming Haugen's claims, the documents reveal that Facebook has consistently put the preservation of its profits and growth above the preservation of democracy and peace, a tendency that appears even stronger outside the US.
What About Facebook India?
One of the articles on AP News focused on Facebook’s role in the rise and mobilization of right-wing extremism, anti-Muslim propaganda, and fake news in India. A December 2020 internal document that was part of this tranche highlighted the influence of powerful political actors on Facebook’s policy decisions. The same document quotes a former Facebook CSO admitting that “local policy heads are generally pulled from the ruling political party and are rarely drawn from disadvantaged ethnic groups, religious creeds or cast(e)s.” Another Indian employee wrote that “barbarism” was being allowed to “flourish on our network.”
India, with 340 million Facebook users and 400 million WhatsApp users, is the company’s largest market. With the company’s leadership dithering over adopting a strong stance against fake or inflammatory content, both platforms came to be used extensively by purveyors of hate speech, fake news, and anti-minority propaganda.
Apart from market pressures and political manipulation, Facebook India faced another obstacle that seriously hampered its ability to control and curb dangerous content. Internal memos and research dating as far back as 2019 reveal that Facebook was aware of abusive content in India; in fact, India was regarded as one of the most “at-risk countries” in the world, and the company had identified Hindi and Bengali as priority languages for “automation on violating hostile speech.” But the shortage of local-language moderators and content flagging became a major stumbling block in its struggle to quash abusive content. The platform also lacked local-language fact-checkers, with the result that a lot of content remained unverified.
How Has Facebook Responded To The Allegations?
Speaking on the allegations based on these leaked papers, Facebook spokesperson Dani Lever said that Facebook “carefully calibrated its controls to react quickly to spikes in hateful and violent content.” She was referring to the company’s response to the Capitol Hill riots of January 6, ahead of which Facebook had put in place a toolkit of emergency measures, called “Breaking the Glass,” designed to stem the spread of dangerous content. Lever said that some of those measures are still in place. However, an internal spreadsheet reveals that as many as 22 of them were rolled back sometime after the 2020 election.
On the India situation, Facebook said in a statement to AP News that it has ‘invested significantly in technology to find hate speech in various languages, including Hindi and Bengali,’ which has ‘reduced the amount of hate speech that people see by half’ in 2021. The spokesperson added, “Separately, our work on curbing hate speech continues and we have further strengthened our hate classifiers to include 4 Indian languages.”