A US federal appeals court, the Third Circuit, has revived a lawsuit against TikTok, ruling that the social media platform can be held responsible for recommending a dangerous “Blackout Challenge” to a 10-year-old girl who died after attempting it.
The lawsuit, filed by the mother of Nylah Anderson, alleges that TikTok’s algorithm played a pivotal role in suggesting the challenge to her daughter. While internet companies typically enjoy legal protection from liability for user-generated content under Section 230 of the Communications Decency Act, the court ruled that this protection doesn’t extend to the platform’s own algorithmic recommendations.
The court’s decision marks a significant departure from previous rulings, which had generally shielded online platforms from liability for failing to prevent users from sharing harmful content. However, a recent Supreme Court ruling on social media platforms’ content moderation practices, Moody v. NetChoice, which treated platforms’ curation of content as their own expressive activity, influenced this new interpretation.
The court found that TikTok’s algorithm reflects the company’s editorial judgments and therefore constitutes its own speech, which is not protected by the existing legal shield. This means the platform can be held accountable for the content it recommends to users.
TikTok has yet to respond to the court’s decision. The case now heads back to a lower court for further proceedings.
The lawsuit’s outcome could have far-reaching implications for social media platforms, potentially forcing them to take greater responsibility for the content they recommend to users. It also highlights the urgent need for stronger safeguards to protect young people from online dangers.