
TikTok May Face Lawsuit Over 10-Year-Old Girl's 'Blackout Challenge' Death

The "blackout challenge," which involves people choking themselves with objects until they lose consciousness, claimed the life of 10-year-old Nylah Anderson in December 2021

Ishika Thanvi

(Image: Getty Images & US District Court for the Eastern District of Pennsylvania)

In a landmark legal development, TikTok, one of the world's most popular social media platforms, is facing serious legal challenges after a 10-year-old girl tragically died while attempting a viral "blackout challenge" on the app. The case has reignited debates over the responsibility of social media companies in safeguarding their users, especially vulnerable children, from dangerous content.


The Tragic Death of Nylah Anderson

The "blackout challenge," which involves people choking themselves with objects until they lose consciousness, claimed the life of 10-year-old Nylah Anderson in December 2021. Nylah, described as an "active, happy, healthy, and incredibly intelligent" child, was found unresponsive in her bedroom by her mother, Tawainna Anderson. Despite being rushed to the hospital, Nylah passed away five days later.

Nylah's death was not an isolated incident. Between 2021 and 2022, the blackout challenge reportedly led to the deaths of 20 children, 15 of whom were under the age of 12. It is just one of several dangerous challenges that have gone viral on TikTok; others, like the "Skullbreaker Challenge" and the "Fire Challenge," have caused severe injuries, including paralysis and burns, and have even led to criminal charges.

TikTok's Alleged Role and Legal Immunity

In May 2022, Tawainna Anderson filed a lawsuit against TikTok and its parent company, ByteDance, accusing them of knowingly promoting the blackout challenge to young users. The lawsuit claims that TikTok's algorithm specifically recommended these harmful videos to children, including Nylah.

TikTok, however, argued that it was protected under Section 230 of the Communications Decency Act (CDA) of 1996, a law that shields online platforms from liability for content created and uploaded by third parties. In October 2022, a federal judge dismissed Anderson's lawsuit on these grounds, ruling that TikTok could not be held accountable as a publisher of third-party content.


A Landmark Appeal and the Potential Shift in Big Tech Accountability

The legal battle took a significant turn when the United States Court of Appeals for the Third Circuit allowed Anderson's lawsuit to proceed. The appeals court's decision, handed down by US Circuit Judge Paul Matey, acknowledged that while Section 230 protects TikTok from being sued for hosting third-party videos, it does not shield the company from being held accountable for its alleged "knowing distribution and targeted recommendation" of harmful content.

This ruling represents a potential shift in how courts may interpret Section 230, particularly in cases involving the protection of children. Jeffrey Goodman, the attorney representing the Anderson family, stated, "Big Tech just lost its 'get-out-of-jail-free' card." If it stands, the ruling would hold social media companies to the same standards as other corporations when their actions cause harm, especially to children.

TikTok's Defense and the Ongoing Legal Struggle

TikTok has maintained that user safety is its top priority and that it has implemented measures to protect minors from harmful content. The company has also denied that the blackout challenge circulated on its platform, despite multiple reports and incidents. The court's decision to allow the lawsuit to proceed, however, means TikTok may be forced to defend its practices and algorithms in court.

Anderson's attorneys argue that TikTok's algorithm is designed to maximise engagement and profits, even if it means exposing young users to dangerous content. The case raises critical questions about the ethical responsibilities of social media companies and the need for more stringent regulations to protect vulnerable users.


The Broader Implications for Social Media Regulation

The outcome of this lawsuit could have far-reaching implications for the tech industry. If TikTok is found liable, it could set a precedent for holding social media platforms accountable for the content they promote through their algorithms. This case may also prompt lawmakers to revisit and potentially revise Section 230, a law that has been the subject of intense debate in recent years.

For Tawainna Anderson, the lawsuit is about more than just seeking justice for her daughter. "Nothing will bring back our beautiful baby girl," she said in a statement. "But we are comforted knowing that — by holding TikTok accountable — our tragedy may help other families avoid future, unimaginable suffering."

A Call for Greater Accountability

As social media platforms continue to wield immense influence over the lives of millions, especially children, the responsibilities of these companies must be reexamined. The outcome of this case could mark a pivotal moment in the regulation of Big Tech, with far-reaching consequences for the future of online content and user safety.
