
Social Media's Worst Nightmare: A Facebook Whistleblower.

New York (CNN)—Former Facebook employee Frances Haugen released internal reports to Congress earlier this month, exposing the company’s neglect of human trafficking on its network and opening the door to further federal regulation of social media platforms.

Domestic servitude, the term Facebook uses for this type of human trafficking, occurs when people use social media to coerce, force, defraud, or deceive others into working inside private homes.

The Securities and Exchange Commission (SEC) granted 17 news organizations across the United States, including CNN, access to redacted versions of Haugen’s disclosures.

After an extensive review of the documentation, the consensus is that Facebook has known since at least 2018 that human traffickers use its platform for recruitment, facilitation, and exploitation, yet has removed only a fraction of those users and their malicious content.

Facebook spokesman Andy Stone denied the company’s culpability, telling CNN reporters, “We prohibit human exploitation in no uncertain terms.”

Stone argues that the social network utilizes artificial intelligence to proactively locate and remove content and users related to domestic servitude.

CNN cross-checked Stone’s claims, finding them to be consistent with the human exploitation policy on Facebook’s website; whether the company follows its policy cannot be confirmed.

The spokesman attests that between January 2020 and October 2021, the company’s artificial intelligence removed more than 4,000 pieces of harmful content and accounts in violation of its policy.

During that period, the app had 3 billion users, and CNBC estimates there were 64,000 reports of human trafficking on the platform.

Building upon Stone’s rationale, Facebook’s Vice President of Communications John Pinette tweeted that, “A curated selection out of millions of documents at Facebook can in no way be used to draw fair conclusions about us.”

What can be used to draw fair conclusions about Facebook’s role as an enabler of human trafficking, however, is current research that supports Haugen’s argument.

According to a study conducted by the University of Toledo’s Human Trafficking and Social Justice Institute, the human trafficking lifecycle (recruitment, facilitation, and exploitation) begins on profile home pages.

Traffickers analyze what their potential victims post, thus learning about their tendencies, desires, interests, and more.

The traffickers use this information to establish trust with their victims, then they transition to reaching out through direct messaging functions, such as Facebook Messenger.

Often, these traffickers hide behind fake, or copied, images and information on their profiles to maintain anonymity. This catfishing makes them difficult to flag online and identify in real life.

The youngest and oldest users are most susceptible to such victimization because of their gullible online tendencies, according to a recent study published in the journal Science Advances.

Knowing this, Facebook continues its attempts to discredit the whistleblower’s testimony.

Haugen, an experienced debater, anticipated this response when she came out against her former employer.

In her testimony, she called on Congress to look past Facebook’s attempts to deflect attention from its human trafficking surveillance failures and to focus on the law that allows those failures to persist.

She further implores lawmakers to decide quickly the fate of social media companies’ protections from legal liability under Section 230 of the Communications Decency Act (CDA).

This 1996 law shields companies from liability when they fail to remove users and content on their platforms that may be perceived as dangerous, offensive, or inappropriate.

In other words, Section 230 allows tech giants like Facebook to escape liability in court when human traffickers, among other bad actors, are caught using their platforms for heinous activity.

Lawmakers on both sides of the aisle admit that regulation of social media must be tougher. They also acknowledge that these companies should be held accountable for enabling harmful users and content. However, they remain divided on how to amend Section 230.

This failure in bipartisan agreement is a reminder of why legislative proposals to regulate social media companies are not likely to progress any time soon.

Despite this reality, Frances Haugen strongly believes that “Congress can change the rules Facebook plays by and stop the harm it is causing if it acts now,” as mentioned in her testimony.

