An internal presentation by Meta, the parent company of Facebook and Instagram, estimates that more than 100,000 minors receive explicit content depicting sexual abuse on its platforms every day. The disconcerting information emerged during a recent court hearing in a lawsuit filed by New Mexico, shedding light on the company's alleged negligence in protecting underage users.
According to redacted documents presented in court, Meta's "People You May Know" algorithm, abbreviated as PYMK, was identified by employees as a key contributor to connecting child users with potential predators. The lawsuit asserts that these concerns were raised with Meta executives several years ago, but the company purportedly rejected the notion of adjusting the algorithm.
Employees Alarmed: Calls for Action Ignored
Meta employees reportedly expressed deep concern over the algorithm's impact, with one internal comment suggesting that PYMK contributed up to 75 percent of inappropriate adult-minor contact. One employee's question about disabling PYMK recommendations between adults and children reportedly went unanswered, drawing further scrutiny of Meta's commitment to user safety.
Instagram's Disturbing Disparity: Continued Sexual Exploitation
Another internal email from 2020 highlighted that the prevalence of "sex talk" to minors on Instagram was a staggering 38 times greater than on Facebook Messenger in the United States. The email urged the company to implement additional safeguards on the Instagram platform, pointing to its perceived lack of attention to child safety.
Meta's Defense: Denial and Accusations of Mischaracterization
Meta has not directly addressed the revelations from the newly unsealed documents but refuted New Mexico's claims, stating that they "mischaracterize our work using selective quotes and cherry-picked documents." The company emphasized its commitment to child safety, describing child predators as "determined criminals" and highlighting its ongoing investment in safety-focused tools for young users and their parents.
Nationwide Scrutiny: More Than 40 States Sue Meta
The lawsuit filed by New Mexico is not an isolated case: more than 40 other US states sued Meta the previous October. These legal actions allege that Meta misled the public about the dangers its platforms pose to young people, intensifying the legal challenges facing the tech giant.
In response to growing concerns and legal actions, Meta announced that it will automatically restrict teen Instagram and Facebook accounts from harmful content, including videos and posts related to self-harm, graphic violence, and eating disorders. The move amounts to a reactive measure against mounting criticism, while the revelations from internal documents and ongoing lawsuits underscore the pressing need for tech companies to prioritize user safety, especially when it comes to protecting minors from explicit and harmful content.
The allegations against Meta raise questions about the effectiveness of its existing algorithms and its commitment to proactively addressing these critical issues. As the legal battles unfold, their outcomes may shape the landscape of online safety regulation and the accountability demanded of major social media platforms.