By Rohan D. ’25
If you use Facebook, Instagram, WhatsApp, or any of Facebook’s other subsidiaries, you may have experienced the outage on October 4, 2021. Approximately 3.5 billion people lost access to these apps for five to six hours, disrupting daily life, halting work, and underscoring how central Facebook has become to the average person’s routine. Small businesses suffered, and some nearly went under; many owners are now looking for a plan B to keep this from happening again.
However, this unprecedented incident was not the only Facebook mishap that week; whistleblower Frances Haugen leaked many of Facebook’s internal documents to the Wall Street Journal.
Facebook is currently the most popular social media platform, with almost half of the world using it. Facebook is worth about $1 trillion, and its users account for 60% of all internet-connected people on Earth. So, you would think, a platform this powerful would incorporate ethics into its success, right? This very topic, the deleterious impact of social media, was highlighted in the 2020 Netflix documentary “The Social Dilemma.”
Wrong. According to Ms. Haugen, Facebook achieved this growth by “buying their profits with our safety.” She also said, “Almost nobody outside of Facebook knows what happens inside Facebook.” This is extremely concerning, because powerful social media platforms like Facebook should maintain a level of transparency with their users. Ms. Haugen touched on the fact that the most engaging content is often the most divisive and harmful.
Before being recruited by Facebook, Ms. Frances Haugen earned a Master of Business Administration from Harvard Business School in 2011 and worked as a product manager at Google. In “The Facebook Files,” as the Wall Street Journal called them, Ms. Haugen revealed documents that outlined Facebook’s internal concerns about products that could “harm teenagers, amplify extremism, and lead to violence.” Ms. Haugen obtained the files as an employee of Facebook. One of the internal memos, written in 2018, stated, “misinformation, toxicity, and violent content are inordinately prevalent among reshares.”
One quote from the Facebook Files read, “Our algorithms exploit the human brain’s attraction to divisiveness. If left unchecked, it will feed more and more divisive content.” Here, Facebook’s greatest problem of all is revealed: the algorithm. When it comes to teens, Facebook and its subsidiaries, especially Instagram, contribute to body image issues, anxiety, eating disorders, and depression, among other conditions. In one internal survey revealed by Ms. Haugen’s documents, 13.5% of teen girls in the UK said Instagram worsens suicidal thoughts. Another survey found that 17% of girls said Instagram contributes to their eating disorders. This is a direct result of Facebook prioritizing its profits over its users and their mental health.
Neil Potts, the Vice President for Trust and Safety Policy at Facebook, responded to Haugen’s comments, saying, “We’re not designing anything for the sensational or clickbaity or engagement-baity ways that polarization may be seen.” He also stated that Facebook is investing enormous amounts of money to fight misinformation and harmful content on its platforms, and he pointed to the decision to pause all political advertising during the 2020 Presidential Election. In response to the Instagram statistics, Potts said that many teens have had positive experiences with Instagram, and Facebook is investing in research to minimize effects on young women.
“The Facebook Files” stated that Mark Zuckerberg wanted to push for users to have more ‘meaningful social interactions,’ or MSI. Under this algorithm, users see content that their families and friends engaged with rather than content from credible news sources. This can be harmful because even if a friend’s post is false, the algorithm will still prioritize it over accurate reporting. Ms. Haugen said, “Content that elicits a more extreme reaction from you is more likely to get a click, comment, or reshare,” amplifying misinformation.
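The dynamic Ms. Haugen describes, where interactions that spread content outweigh passive ones and friends’ posts are boosted over publishers’, can be sketched as a toy scoring function. All the weights and field names below are hypothetical illustrations, not Facebook’s actual MSI formula:

```python
# Toy sketch of engagement-weighted feed ranking.
# Weights and field names are hypothetical, not Facebook's real algorithm.

def score_post(post):
    """Score a post by predicted 'meaningful social interactions'."""
    # Interactions that spread content (comments, reshares) count far more
    # than passive ones (likes) -- which is why posts that provoke strong
    # reactions tend to rise to the top of the feed.
    engagement = (
        1 * post["predicted_likes"]
        + 5 * post["predicted_comments"]
        + 30 * post["predicted_reshares"]
    )
    # Content from friends and family is boosted over publisher content.
    return engagement * (2.0 if post["from_friend"] else 1.0)

def rank_feed(posts):
    # Highest-scoring posts appear first in the user's feed.
    return sorted(posts, key=score_post, reverse=True)

feed = rank_feed([
    {"id": "news_article", "predicted_likes": 50,
     "predicted_comments": 2, "predicted_reshares": 1,
     "from_friend": False},
    {"id": "divisive_friend_post", "predicted_likes": 10,
     "predicted_comments": 8, "predicted_reshares": 6,
     "from_friend": True},
])
print([p["id"] for p in feed])  # the friend's divisive post ranks first
```

In this sketch, the news article draws more likes overall, but the divisive friend’s post wins on reshares and the friend boost, so it is shown first. That is exactly the incentive problem the documents describe: nothing in the score measures whether a post is true.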
Facebook may be conducting additional research right now, but in the recent past it has disregarded its own findings. Facebook’s research illustrated that it was harming multiple vulnerable populations. According to Ms. Haugen, “There is a pattern of behavior of Facebook choosing to prioritize profits over people.” Executives were given many statistics showing the detriments of their algorithm, and they chose to keep their money. This both harms teens and spreads misinformation.
This is akin to other public health crises, such as those involving Purdue Pharma, Big Tobacco, and Juul. Executives were given information about the harm their products were causing; nevertheless, they chose to increase their profits. These effects included addiction, overdoses, and even death. In Purdue Pharma’s case, this mindset illustrated the concept of “blood money,” a term that refers to the way the Sackler family and Purdue Pharma profited from the exploitation of their customers’ addictions.
Some people ask, ‘if Facebook is harming teens by showing them posts about how their bodies should look, shouldn’t other media, like books, magazines, movies, and television, be regulated as well?’ Books and magazines do show images that may trigger eating disorders, but that harm is significantly outweighed by the addictive pull of Facebook’s algorithm. By serving these posts one after another, it causes teens to “fall down the rabbit hole” and keep viewing them, worsening their disorders and negative body image.
Others may ask, ‘if Facebook’s algorithm causes physical or emotional harm, shouldn’t all social media platforms be regulated?’ Arguably, yes, they should. As of now, no social media outlet is transparent enough to definitively be called ‘safe’ after this incident. The KIDS Act, sponsored by Sen. Edward Markey (D-MA), was written to “keep children safe and protect their interests on the internet, and for other purposes.” Unfortunately, the bill died in Congress. But after this discovery, people around the globe are pushing for new technology reforms.
In summary, Facebook must fix its algorithms to prevent the spread of misinformation, especially during a pandemic, and social media platforms should be regulated to stop more harm to teenagers and other vulnerable populations. Hopefully, national and even international reforms for social media will be enacted. As Ms. Haugen said, “The only way Facebook can get reconciliation is by declaring moral bankruptcy.” One can only hope that this is just a speed bump for Facebook and that Ms. Haugen’s efforts are not in vain.